Oct 03 12:49:50 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 03 12:49:50 crc restorecon[4735]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 12:49:50 crc restorecon[4735]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 12:49:50 crc restorecon[4735]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 12:49:50 crc restorecon[4735]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 12:49:50 crc restorecon[4735]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 12:49:50 crc restorecon[4735]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 12:49:50 crc restorecon[4735]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 12:49:50 crc restorecon[4735]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 12:49:50 crc restorecon[4735]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 12:49:50 crc restorecon[4735]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 12:49:50 crc restorecon[4735]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 12:49:50 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 
12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc 
restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 12:49:51 crc restorecon[4735]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 12:49:51 crc restorecon[4735]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 12:49:51 crc restorecon[4735]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 03 12:49:52 crc kubenswrapper[4962]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 03 12:49:52 crc kubenswrapper[4962]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 03 12:49:52 crc kubenswrapper[4962]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 03 12:49:52 crc kubenswrapper[4962]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 03 12:49:52 crc kubenswrapper[4962]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Oct 03 12:49:52 crc kubenswrapper[4962]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.012623 4962 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015456 4962 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015477 4962 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015483 4962 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015488 4962 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015492 4962 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015496 4962 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015509 4962 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015513 4962 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015518 4962 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015523 4962 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015527 4962 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015532 4962 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015535 4962 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015539 4962 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015543 4962 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015546 4962 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015550 4962 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015553 4962 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015557 4962 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015560 4962 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015564 4962 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015568 4962 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015572 4962 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015575 4962 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015580 4962 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015584 4962 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015588 4962 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015592 4962 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015595 4962 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015599 4962 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015602 4962 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015606 4962 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015610 4962 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015614 4962 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015617 4962 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015622 4962 feature_gate.go:330] unrecognized feature gate: Example
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015626 4962 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015645 4962 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015650 4962 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015654 4962 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015658 4962 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015669 4962 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015674 4962 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015678 4962 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015681 4962 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015685 4962 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015689 4962 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015692 4962 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015696 4962 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015699 4962 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015703 4962 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015706 4962 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015710 4962 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015714 4962 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015718 4962 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015721 4962 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015724 4962 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015728 4962 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015732 4962 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015735 4962 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015739 4962 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015742 4962 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015745 4962 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015749 4962 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015752 4962 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015755 4962 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015759 4962 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015763 4962 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015766 4962 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015772 4962 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.015777 4962 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017373 4962 flags.go:64] FLAG: --address="0.0.0.0"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017396 4962 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017409 4962 flags.go:64] FLAG: --anonymous-auth="true"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017417 4962 flags.go:64] FLAG: --application-metrics-count-limit="100"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017425 4962 flags.go:64] FLAG: --authentication-token-webhook="false"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017430 4962 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017438 4962 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017444 4962 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017450 4962 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017455 4962 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017460 4962 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017465 4962 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017472 4962 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017478 4962 flags.go:64] FLAG: --cgroup-root=""
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017483 4962 flags.go:64] FLAG: --cgroups-per-qos="true"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017488 4962 flags.go:64] FLAG: --client-ca-file=""
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017493 4962 flags.go:64] FLAG: --cloud-config=""
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017499 4962 flags.go:64] FLAG: --cloud-provider=""
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017504 4962 flags.go:64] FLAG: --cluster-dns="[]"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017511 4962 flags.go:64] FLAG: --cluster-domain=""
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017516 4962 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017521 4962 flags.go:64] FLAG: --config-dir=""
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017527 4962 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017532 4962 flags.go:64] FLAG: --container-log-max-files="5"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017541 4962 flags.go:64] FLAG: --container-log-max-size="10Mi"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017546 4962 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017551 4962 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017557 4962 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017563 4962 flags.go:64] FLAG: --contention-profiling="false"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017569 4962 flags.go:64] FLAG: --cpu-cfs-quota="true"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017574 4962 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017580 4962 flags.go:64] FLAG: --cpu-manager-policy="none"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017587 4962 flags.go:64] FLAG: --cpu-manager-policy-options=""
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017593 4962 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017598 4962 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017602 4962 flags.go:64] FLAG: --enable-debugging-handlers="true"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017606 4962 flags.go:64] FLAG: --enable-load-reader="false"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017610 4962 flags.go:64] FLAG: --enable-server="true"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017615 4962 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017620 4962 flags.go:64] FLAG: --event-burst="100"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017625 4962 flags.go:64] FLAG: --event-qps="50"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017630 4962 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017653 4962 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017659 4962 flags.go:64] FLAG: --eviction-hard=""
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017665 4962 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017670 4962 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017674 4962 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017679 4962 flags.go:64] FLAG: --eviction-soft=""
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017683 4962 flags.go:64] FLAG: --eviction-soft-grace-period=""
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017687 4962 flags.go:64] FLAG: --exit-on-lock-contention="false"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017691 4962 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017695 4962 flags.go:64] FLAG: --experimental-mounter-path=""
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017700 4962 flags.go:64] FLAG: --fail-cgroupv1="false"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017704 4962 flags.go:64] FLAG: --fail-swap-on="true"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017709 4962 flags.go:64] FLAG: --feature-gates=""
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017714 4962 flags.go:64] FLAG: --file-check-frequency="20s"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017719 4962 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017723 4962 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017728 4962 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017732 4962 flags.go:64] FLAG: --healthz-port="10248"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017737 4962 flags.go:64] FLAG: --help="false"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017741 4962 flags.go:64] FLAG: --hostname-override=""
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017745 4962 flags.go:64] FLAG: --housekeeping-interval="10s"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017749 4962 flags.go:64] FLAG: --http-check-frequency="20s"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017753 4962 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017757 4962 flags.go:64] FLAG: --image-credential-provider-config=""
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017761 4962 flags.go:64] FLAG: --image-gc-high-threshold="85"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017765 4962 flags.go:64] FLAG: --image-gc-low-threshold="80"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017770 4962 flags.go:64] FLAG: --image-service-endpoint=""
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017774 4962 flags.go:64] FLAG: --kernel-memcg-notification="false"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017778 4962 flags.go:64] FLAG: --kube-api-burst="100"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017782 4962 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017786 4962 flags.go:64] FLAG: --kube-api-qps="50"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017790 4962 flags.go:64] FLAG: --kube-reserved=""
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017795 4962 flags.go:64] FLAG: --kube-reserved-cgroup=""
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017799 4962 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017803 4962 flags.go:64] FLAG: --kubelet-cgroups=""
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017807 4962 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017811 4962 flags.go:64] FLAG: --lock-file=""
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017815 4962 flags.go:64] FLAG: --log-cadvisor-usage="false"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017820 4962 flags.go:64] FLAG: --log-flush-frequency="5s"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017824 4962 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017830 4962 flags.go:64] FLAG: --log-json-split-stream="false"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017834 4962 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017838 4962 flags.go:64] FLAG: --log-text-split-stream="false"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017842 4962 flags.go:64] FLAG: --logging-format="text"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017846 4962 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017850 4962 flags.go:64] FLAG: --make-iptables-util-chains="true"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017854 4962 flags.go:64] FLAG: --manifest-url=""
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017858 4962 flags.go:64] FLAG: --manifest-url-header=""
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017864 4962 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017868 4962 flags.go:64] FLAG: --max-open-files="1000000"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017873 4962 flags.go:64] FLAG: --max-pods="110"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017878 4962 flags.go:64] FLAG: --maximum-dead-containers="-1"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017884 4962 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017888 4962 flags.go:64] FLAG: --memory-manager-policy="None"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017895 4962 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017901 4962 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017907 4962 flags.go:64] FLAG: --node-ip="192.168.126.11"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017913 4962 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017930 4962 flags.go:64] FLAG: --node-status-max-images="50"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017935 4962 flags.go:64] FLAG: --node-status-update-frequency="10s"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017941 4962 flags.go:64] FLAG: --oom-score-adj="-999"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017947 4962 flags.go:64] FLAG: --pod-cidr=""
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017952 4962 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017959 4962 flags.go:64] FLAG: --pod-manifest-path=""
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017963 4962 flags.go:64] FLAG: --pod-max-pids="-1"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017968 4962 flags.go:64] FLAG: --pods-per-core="0"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017972 4962 flags.go:64] FLAG: --port="10250"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017976 4962 flags.go:64] FLAG: --protect-kernel-defaults="false"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017981 4962 flags.go:64] FLAG: --provider-id=""
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017985 4962 flags.go:64] FLAG: --qos-reserved=""
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017989 4962 flags.go:64] FLAG: --read-only-port="10255"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017993 4962 flags.go:64] FLAG: --register-node="true"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017997 4962 flags.go:64] FLAG: --register-schedulable="true"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.018001 4962 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.018015 4962 flags.go:64] FLAG: --registry-burst="10"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.018019 4962 flags.go:64] FLAG: --registry-qps="5"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.018024 4962 flags.go:64] FLAG: --reserved-cpus=""
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.018028 4962 flags.go:64] FLAG: --reserved-memory=""
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.018033 4962 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.018037 4962 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.018041 4962 flags.go:64] FLAG: --rotate-certificates="false"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.018045 4962 flags.go:64] FLAG: --rotate-server-certificates="false"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.018050 4962 flags.go:64] FLAG: --runonce="false"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.018054 4962 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.018058 4962 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.018062 4962 flags.go:64] FLAG: --seccomp-default="false"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.018066 4962 flags.go:64] FLAG: --serialize-image-pulls="true"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.018070 4962 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.018074 4962 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.018079 4962 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.018083 4962 flags.go:64] FLAG: --storage-driver-password="root"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.018087 4962 flags.go:64] FLAG: --storage-driver-secure="false"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.018091 4962 flags.go:64] FLAG: --storage-driver-table="stats"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.018095 4962 flags.go:64] FLAG: --storage-driver-user="root"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.018099 4962 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.018104 4962 flags.go:64] FLAG: --sync-frequency="1m0s"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.018109 4962 flags.go:64] FLAG: --system-cgroups=""
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.018113 4962 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.018120 4962 flags.go:64] FLAG: --system-reserved-cgroup=""
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.018124 4962 flags.go:64] FLAG: --tls-cert-file=""
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.018165 4962 flags.go:64] FLAG: --tls-cipher-suites="[]"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.018171 4962 flags.go:64] FLAG: --tls-min-version=""
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.018175 4962 flags.go:64] FLAG: --tls-private-key-file=""
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.018179 4962 flags.go:64] FLAG: --topology-manager-policy="none"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.018183 4962 flags.go:64] FLAG: --topology-manager-policy-options=""
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.018187 4962 flags.go:64] FLAG: --topology-manager-scope="container"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.018191 4962 flags.go:64] FLAG: --v="2"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.018197 4962 flags.go:64] FLAG: --version="false"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.018203 4962 flags.go:64] FLAG: --vmodule=""
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.018208 4962 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.018213 4962 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
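The FLAG dump above records the effective value of every command-line flag after parsing, before the config file from --config (/etc/kubernetes/kubelet.conf here) is merged on top. When auditing a node it can be handy to turn that dump into a dictionary for diffing against the config file. A minimal sketch, assuming the journal excerpt has been saved as plain text; parse_flags and FLAG_RE are illustrative names, not kubelet code:

```python
# Sketch: recover {flag: value} from kubelet "FLAG:" journal entries.
# The 'flags.go:NN] FLAG: --name="value"' shape is taken from the lines above.
import re

FLAG_RE = re.compile(r'flags\.go:\d+\] FLAG: (--[\w.-]+)="(.*)"\s*$')

def parse_flags(journal_lines):
    """Return a dict of flag name -> logged value for every FLAG: entry."""
    flags = {}
    for line in journal_lines:
        m = FLAG_RE.search(line)
        if m:
            flags[m.group(1)] = m.group(2)
    return flags

sample = [
    'Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017516 4962 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"',
    'Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.017907 4962 flags.go:64] FLAG: --node-ip="192.168.126.11"',
]
print(parse_flags(sample))
# {'--config': '/etc/kubernetes/kubelet.conf', '--node-ip': '192.168.126.11'}
```

Note that values such as --cgroup-driver="cgroupfs" above are flag-level defaults only; the server.go:1437 line further down shows the kubelet ultimately taking cgroupDriver="systemd" from the CRI runtime.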
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.019312 4962 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.026738 4962 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.026763 4962 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
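The feature_gate.go:386 line is the kubelet's final merged gate map. The long run of "unrecognized feature gate" warnings above it appears to be OpenShift-level gates (NewOLM, GatewayAPI, InsightsConfig, and so on) passed through from the cluster's feature-gate configuration; the embedded kubelet only knows the upstream Kubernetes gates, so it warns about the rest and ignores them, and the same block can be emitted more than once as the gate list is re-parsed during startup. A small sketch for reducing those runs to one count per gate name; unrecognized_gates is an illustrative helper, and the regex mirrors the feature_gate.go:330 lines shown above:

```python
# Sketch: dedupe and count "unrecognized feature gate" warnings in a saved
# journal excerpt, so repeated startup parses collapse to one row per gate.
import re
from collections import Counter

GATE_RE = re.compile(r'feature_gate\.go:\d+\] unrecognized feature gate: (\w+)')

def unrecognized_gates(journal_text):
    """Return Counter mapping gate name -> number of warnings seen."""
    return Counter(GATE_RE.findall(journal_text))

excerpt = (
    'W1003 12:49:52.015456 4962 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS\n'
    'W1003 12:49:52.015496 4962 feature_gate.go:330] unrecognized feature gate: NewOLM\n'
    'W1003 12:49:52.015456 4962 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS\n'
)
print(unrecognized_gates(excerpt))
# Counter({'ManagedBootImagesAWS': 2, 'NewOLM': 1})
```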
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.027117 4962 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027255 4962 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027266 4962 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027271 4962 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027275 4962 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027280 4962 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027285 4962 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027289 4962 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027295 4962 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027302 4962 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027306 4962 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027311 4962 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027315 4962 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027320 4962 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027324 4962 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027328 4962 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027332 4962 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027336 4962 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027340 4962 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027344 4962 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027347 4962 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027351 4962 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027354 4962 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027359 4962 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027363 4962 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027367 4962 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027370 4962 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027374 4962 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027378 4962 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027381 4962 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027385 4962 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027388 4962 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027392 4962 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027395 4962 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027400 4962 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027405 4962 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027408 4962 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027412 4962 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027415 4962 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027419 4962 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027423 4962 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027426 4962 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027430 4962 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027433 4962 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027437 4962 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027441 4962 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027444 4962 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027448 4962 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027451 4962 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027455 4962 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027458 4962 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027462 4962 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027465 4962 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027469 4962 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027473 4962 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027477 4962 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027481 4962 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027486 4962 feature_gate.go:330] unrecognized feature gate: Example
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027490 4962 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027494 4962 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027498 4962 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027501 4962 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027506 4962 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027511 4962 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027515 4962 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027520 4962 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027524 4962 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027528 4962 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027531 4962 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027536 4962 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027539 4962 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.027544 4962 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.027550 4962 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.028445 4962 server.go:940] "Client rotation is on, will bootstrap in background"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.031710 4962 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.031785 4962 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
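The long run of feature_gate.go:330 warnings above is expected on OpenShift: the cluster-level FeatureGate list carries OpenShift-specific gate names that the upstream kubelet's gate registry does not know, so each unknown name is logged and skipped, and only the recognized gates land in the effective map printed at feature_gate.go:386. A minimal Go sketch of that merge behavior (illustrative only; applyGates and its maps are hypothetical stand-ins, not the kubelet's actual implementation):

    package main

    import "log"

    // applyGates starts from compiled-in defaults, applies only the gate
    // names it recognizes, and warns on the rest -- the same shape as the
    // feature_gate.go:330/386 lines above.
    func applyGates(known, requested map[string]bool) map[string]bool {
        effective := make(map[string]bool, len(known))
        for name, def := range known {
            effective[name] = def
        }
        for name, enabled := range requested {
            if _, ok := known[name]; !ok {
                log.Printf("unrecognized feature gate: %s", name)
                continue
            }
            effective[name] = enabled
        }
        return effective
    }

    func main() {
        known := map[string]bool{"CloudDualStackNodeIPs": true, "NodeSwap": false}
        requested := map[string]bool{"CloudDualStackNodeIPs": true, "GatewayAPI": true}
        log.Printf("feature gates: %v", applyGates(known, requested))
    }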
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.033011 4962 server.go:997] "Starting client certificate rotation"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.033045 4962 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.033223 4962 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-28 07:37:06.520317455 +0000 UTC
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.033301 4962 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 2058h47m14.487020554s for next certificate rotation
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.067393 4962 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.069153 4962 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.090565 4962 log.go:25] "Validated CRI v1 runtime API"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.124294 4962 log.go:25] "Validated CRI v1 image API"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.125925 4962 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.132459 4962 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-03-12-45-07-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.132508 4962 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.152156 4962 manager.go:217] Machine: {Timestamp:2025-10-03 12:49:52.149823081 +0000 UTC m=+0.553720926 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:16e8121c-ac81-46e8-9c72-10e496aaa780 BootID:8f831723-ac1f-49ed-8733-e30832d406d9 Filesystems:[{Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:a4:75:e0 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:a4:75:e0 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:61:04:33 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:d8:1f:7c Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:bb:30:76 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:13:2b:3e Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:8a:b5:9d Speed:-1 Mtu:1496} {Name:eth10 MacAddress:92:be:e1:7f:f1:64 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:46:2b:80:a7:df:8e Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.152403 4962 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.152550 4962 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.153693 4962 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.153900 4962 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.153941 4962 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.155226 4962 topology_manager.go:138] "Creating topology manager with none policy"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.155249 4962 container_manager_linux.go:303] "Creating device plugin manager"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.155630 4962 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.155674 4962 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.155911 4962 state_mem.go:36] "Initialized new in-memory state store"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.155991 4962 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.161491 4962 kubelet.go:418] "Attempting to sync node with API server"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.161512 4962 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.161527 4962 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.161539 4962 kubelet.go:324] "Adding apiserver pod source"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.161549 4962 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.167041 4962 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.168309 4962 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.169268 4962 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.172:6443: connect: connection refused
Oct 03 12:49:52 crc kubenswrapper[4962]: E1003 12:49:52.169355 4962 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.172:6443: connect: connection refused" logger="UnhandledError"
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.169268 4962 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.172:6443: connect: connection refused
Oct 03 12:49:52 crc kubenswrapper[4962]: E1003 12:49:52.169396 4962 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.172:6443: connect: connection refused" logger="UnhandledError"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.169556 4962 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.172872 4962 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.172907 4962 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.172917 4962 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.172928 4962 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.172943 4962 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.172951 4962 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.172962 4962 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.172979 4962 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.172989 4962 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.172998 4962 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.173010 4962 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.173019 4962 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.174807 4962 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.175308 4962 server.go:1280] "Started kubelet"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.175550 4962 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.175767 4962 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.175818 4962 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.172:6443: connect: connection refused
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.176187 4962 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Oct 03 12:49:52 crc systemd[1]: Started Kubernetes Kubelet.
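The certificate_manager.go:356 lines above show how rotation is scheduled: the kubelet does not wait for the client certificate (expiring 2026-02-24) to lapse, it picks a renewal deadline at a jittered fraction of the validity window and sleeps until then (the "Waiting 2058h47m..." line). A minimal Go sketch of that computation, assuming the [0.7, 0.9) jitter window client-go's certificate manager uses; both the jitter constants and the NotBefore date below are assumptions, not values taken from this log:

    package main

    import (
        "fmt"
        "math/rand"
        "time"
    )

    // rotationDeadline picks a renewal time at a random point between 70%
    // and 90% of the certificate's validity window (assumed jitter rule).
    func rotationDeadline(notBefore, notAfter time.Time) time.Time {
        validity := notAfter.Sub(notBefore)
        jitter := 0.7 + 0.2*rand.Float64()
        return notBefore.Add(time.Duration(float64(validity) * jitter))
    }

    func main() {
        // Hypothetical one-year client certificate; the log shows only the
        // expiration (2026-02-24 05:52:08 UTC), not the issue time.
        notBefore := time.Date(2025, 2, 24, 5, 52, 8, 0, time.UTC)
        notAfter := time.Date(2026, 2, 24, 5, 52, 8, 0, time.UTC)
        deadline := rotationDeadline(notBefore, notAfter)
        fmt.Println("rotation deadline:", deadline)
        fmt.Println("waiting:", time.Until(deadline)) // cf. the "Waiting 2058h47m..." record
    }

Under that assumed one-year window, the logged deadline of 2025-12-28 falls roughly 84% of the way through the certificate's validity, which is consistent with this rule.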
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.178109 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.178137 4962 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.178304 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 16:52:18.658241694 +0000 UTC
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.178329 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 2236h2m26.479913828s for next certificate rotation
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.178698 4962 server.go:460] "Adding debug handlers to kubelet server"
Oct 03 12:49:52 crc kubenswrapper[4962]: E1003 12:49:52.183533 4962 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.183740 4962 volume_manager.go:287] "The desired_state_of_world populator starts"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.185401 4962 volume_manager.go:289] "Starting Kubelet Volume Manager"
Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.184466 4962 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.172:6443: connect: connection refused
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.183757 4962 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Oct 03 12:49:52 crc kubenswrapper[4962]: E1003 12:49:52.185653 4962 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.172:6443: connect: connection refused" logger="UnhandledError"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.185351 4962 factory.go:55] Registering systemd factory
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.185693 4962 factory.go:221] Registration of the systemd container factory successfully
Oct 03 12:49:52 crc kubenswrapper[4962]: E1003 12:49:52.185712 4962 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.172:6443: connect: connection refused" interval="200ms"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.185933 4962 factory.go:153] Registering CRI-O factory
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.185952 4962 factory.go:221] Registration of the crio container factory successfully
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.186151 4962 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.186182 4962 factory.go:103] Registering Raw factory
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.186197 4962 manager.go:1196] Started watching for new ooms in manager
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.190469 4962 manager.go:319] Starting recovery of all containers
Oct 03 12:49:52 crc kubenswrapper[4962]: E1003 12:49:52.189814 4962 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.172:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186afc19cc66e3e6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-03 12:49:52.17527703 +0000 UTC m=+0.579174885,LastTimestamp:2025-10-03 12:49:52.17527703 +0000 UTC m=+0.579174885,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.198785 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.198831 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.198846 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.198858 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.198873 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.198884 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.198894 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.198902 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Oct 03 12:49:52 crc kubenswrapper[4962]:
I1003 12:49:52.198924 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.198943 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.198957 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.198969 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.198980 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.198993 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.199009 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.199027 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.199038 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.199048 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.199058 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.199092 4962 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.199101 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.199111 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.199118 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.199128 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.201664 4962 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.201796 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.203371 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.203517 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.203612 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.203785 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.203874 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.204061 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.204148 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.204232 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.204334 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.204873 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.205720 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.205839 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.205930 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.206007 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.206102 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.206189 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.206278 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.206409 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.206496 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.206620 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.206744 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.206828 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.206897 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.206991 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.207085 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.207169 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.207255 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.207341 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.207416 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.207488 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.207566 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.207629 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.207772 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.207847 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.207922 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.208017 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.208100 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.208195 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.210452 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.210778 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.211713 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.211738 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.211749 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.211761 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.211774 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.211786 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.211798 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.211811 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.211824 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.211859 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.211871 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.211886 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.211899 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.211911 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.211924 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.211935 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.211948 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.211969 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.211982 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.211995 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212008 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212022 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212035 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212049 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212063 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212076 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212091 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212108 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212122 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212137 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212150 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212164 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212180 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212193 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212208 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212224 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212231 4962 manager.go:324] Recovery completed Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212237 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212254 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212269 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212290 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212303 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212315 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212327 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212337 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212350 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212366 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212381 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212394 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212406 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212416 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212426 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212436 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212449 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212461 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212475 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212487 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212500 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212511 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212522 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212535 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212549 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212561 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212572 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212586 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212598 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212608 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212667 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212684 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212696 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212708 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212719 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212732 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212743 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212756 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212766 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212786 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212799 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212811 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212823 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212837 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212849 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212862 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212875 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212888 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212899 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212911 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212923 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212935 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212947 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212963 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212974 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212987 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.212999 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213011 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213024 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213036 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213048 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213060 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213073 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213085 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213097 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213110 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213122 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213135 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213147 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213179 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213192 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213206 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213218 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213230 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213242 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213255 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213268 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213281 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213292 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213304 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213317 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213331 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213343 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213354 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213365 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213378 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213395 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213407 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213419 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213430 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213440 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213451 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213461 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213473 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213484 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213492 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213500 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213509 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213517 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213525 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213534 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213543 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213551 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213560 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" 
volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213584 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213597 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213607 4962 reconstruct.go:97] "Volume reconstruction finished" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.213613 4962 reconciler.go:26] "Reconciler: start to sync state" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.221225 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.222440 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.222536 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.222591 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.223298 4962 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.223310 4962 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.223325 4962 state_mem.go:36] "Initialized new in-memory state store" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.223428 4962 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.225769 4962 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.225844 4962 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.225876 4962 kubelet.go:2335] "Starting kubelet main sync loop" Oct 03 12:49:52 crc kubenswrapper[4962]: E1003 12:49:52.225938 4962 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.227477 4962 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.172:6443: connect: connection refused Oct 03 12:49:52 crc kubenswrapper[4962]: E1003 12:49:52.227528 4962 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.172:6443: connect: connection refused" logger="UnhandledError" Oct 03 12:49:52 crc kubenswrapper[4962]: E1003 12:49:52.239920 4962 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.172:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186afc19cc66e3e6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-03 12:49:52.17527703 +0000 UTC m=+0.579174885,LastTimestamp:2025-10-03 12:49:52.17527703 +0000 UTC m=+0.579174885,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.251186 4962 policy_none.go:49] "None policy: Start" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.252345 4962 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.252375 4962 state_mem.go:35] "Initializing new in-memory state store" Oct 03 12:49:52 crc kubenswrapper[4962]: E1003 12:49:52.284819 4962 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.307814 4962 manager.go:334] "Starting Device Plugin manager" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.308042 4962 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.308076 4962 server.go:79] "Starting device plugin registration server" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.308451 4962 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.308467 4962 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.308860 4962 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 03 12:49:52 crc 
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.310480 4962 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.310493 4962 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Oct 03 12:49:52 crc kubenswrapper[4962]: E1003 12:49:52.314201 4962 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.326882 4962 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"]
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.327006 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.328166 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.328217 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.328227 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.328387 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.328564 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.328619 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.329224 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.329249 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.329258 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.329385 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.329473 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.329504 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.329509 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
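[Editor's note] "SyncLoop ADD" with source="file" means these five control-plane pods come from static manifests on disk, not from the apiserver; the kubelet then finds no existing sandbox for each and creates one. A sketch that lists what the file source would pick up, assuming the conventional /etc/kubernetes/manifests directory (the actual location is whatever staticPodPath is set to in the kubelet config):

```go
// manifests.go: list static pod manifest files the kubelet's file
// source would watch (path is an assumption, see staticPodPath).
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	dir := "/etc/kubernetes/manifests"
	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	for _, e := range entries {
		// Print every regular file; the kubelet itself only accepts
		// entries it can parse as pod manifests.
		if !e.IsDir() {
			fmt.Println(filepath.Join(dir, e.Name()))
		}
	}
}
```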
Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.329543 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.329516 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.329994 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.330015 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.330028 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.330161 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.330306 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.330327 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.330336 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.330584 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.330604 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.331016 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.331035 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.331045 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.331171 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.331237 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.331251 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.331260 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.331767 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.331807 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.331954 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.331972 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.331982 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.332118 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.332139 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.332537 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.332561 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.332571 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.332839 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.332856 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.332864 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:49:52 crc kubenswrapper[4962]: E1003 12:49:52.387011 4962 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.172:6443: connect: connection refused" interval="400ms" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.408630 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.409866 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.409900 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.409912 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.409972 4962 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 12:49:52 crc kubenswrapper[4962]: E1003 12:49:52.410406 4962 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.172:6443: connect: 
connection refused" node="crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.417333 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.417364 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.417382 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.417402 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.417422 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.417440 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.417460 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.417475 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.417489 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.417503 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.417546 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.417606 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.417655 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.417704 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.417737 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.518704 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.518748 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.518763 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.518777 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod 
\"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.518792 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.518809 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.518830 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.518847 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.518870 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.518879 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.518897 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.518917 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.518936 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.518953 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.518955 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.518953 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.519183 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.519189 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.518986 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.519221 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.518985 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.518954 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.519222 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 
12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.518983 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.519238 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.519191 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.519235 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.519268 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.519324 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.519368 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.610715 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.611813 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.611891 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.611945 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.612036 4962 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 12:49:52 crc kubenswrapper[4962]: E1003 12:49:52.612430 4962 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.172:6443: connect: connection refused" 
node="crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.670764 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.679366 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.693104 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.708302 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.713365 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-04bf6a20c05efe1a6b0d2f5d34e4d6450c6cab2a84456f0375bd9e2b80f9ff98 WatchSource:0}: Error finding container 04bf6a20c05efe1a6b0d2f5d34e4d6450c6cab2a84456f0375bd9e2b80f9ff98: Status 404 returned error can't find the container with id 04bf6a20c05efe1a6b0d2f5d34e4d6450c6cab2a84456f0375bd9e2b80f9ff98 Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.714182 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-84255f8b076e2f291343139ff3762320be19fa7f7bf8a51077f39ff49a8a470a WatchSource:0}: Error finding container 84255f8b076e2f291343139ff3762320be19fa7f7bf8a51077f39ff49a8a470a: Status 404 returned error can't find the container with id 84255f8b076e2f291343139ff3762320be19fa7f7bf8a51077f39ff49a8a470a Oct 03 12:49:52 crc kubenswrapper[4962]: I1003 12:49:52.715627 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.720145 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-86f4aa47355638d420ae0e187f5bed77781505e593536cdb5db1900b32a3b425 WatchSource:0}: Error finding container 86f4aa47355638d420ae0e187f5bed77781505e593536cdb5db1900b32a3b425: Status 404 returned error can't find the container with id 86f4aa47355638d420ae0e187f5bed77781505e593536cdb5db1900b32a3b425 Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.724563 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-f115ec908d114371852fd8f989d58d5525fa02d1f6fe96b5e65b8927c1982b36 WatchSource:0}: Error finding container f115ec908d114371852fd8f989d58d5525fa02d1f6fe96b5e65b8927c1982b36: Status 404 returned error can't find the container with id f115ec908d114371852fd8f989d58d5525fa02d1f6fe96b5e65b8927c1982b36 Oct 03 12:49:52 crc kubenswrapper[4962]: W1003 12:49:52.726174 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-32e8f917b95995f743fdb1372093c942bac9f18f3265c356bff1097c295844ff WatchSource:0}: Error finding container 32e8f917b95995f743fdb1372093c942bac9f18f3265c356bff1097c295844ff: Status 404 returned error can't find the container with id 32e8f917b95995f743fdb1372093c942bac9f18f3265c356bff1097c295844ff Oct 03 12:49:52 crc kubenswrapper[4962]: E1003 12:49:52.787847 4962 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.172:6443: connect: connection refused" interval="800ms" Oct 03 12:49:53 crc kubenswrapper[4962]: I1003 12:49:53.013201 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 12:49:53 crc kubenswrapper[4962]: I1003 12:49:53.014875 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:49:53 crc kubenswrapper[4962]: I1003 12:49:53.014911 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:49:53 crc kubenswrapper[4962]: I1003 12:49:53.014924 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:49:53 crc kubenswrapper[4962]: I1003 12:49:53.014948 4962 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 12:49:53 crc kubenswrapper[4962]: E1003 12:49:53.015417 4962 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.172:6443: connect: connection refused" node="crc" Oct 03 12:49:53 crc kubenswrapper[4962]: I1003 12:49:53.177650 4962 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.172:6443: connect: connection refused Oct 03 12:49:53 crc kubenswrapper[4962]: I1003 12:49:53.229461 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"84255f8b076e2f291343139ff3762320be19fa7f7bf8a51077f39ff49a8a470a"} Oct 03 12:49:53 crc kubenswrapper[4962]: I1003 12:49:53.230253 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"04bf6a20c05efe1a6b0d2f5d34e4d6450c6cab2a84456f0375bd9e2b80f9ff98"} Oct 03 12:49:53 crc kubenswrapper[4962]: I1003 12:49:53.231362 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"32e8f917b95995f743fdb1372093c942bac9f18f3265c356bff1097c295844ff"} Oct 03 12:49:53 crc kubenswrapper[4962]: I1003 12:49:53.235206 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f115ec908d114371852fd8f989d58d5525fa02d1f6fe96b5e65b8927c1982b36"} Oct 03 12:49:53 crc kubenswrapper[4962]: I1003 12:49:53.241877 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"86f4aa47355638d420ae0e187f5bed77781505e593536cdb5db1900b32a3b425"} Oct 03 12:49:53 crc kubenswrapper[4962]: W1003 12:49:53.381094 4962 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.172:6443: connect: connection refused Oct 03 12:49:53 crc kubenswrapper[4962]: E1003 12:49:53.381157 4962 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.172:6443: connect: connection refused" logger="UnhandledError" Oct 03 12:49:53 crc kubenswrapper[4962]: W1003 12:49:53.437984 4962 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.172:6443: connect: connection refused Oct 03 12:49:53 crc kubenswrapper[4962]: E1003 12:49:53.438261 4962 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.172:6443: connect: connection refused" logger="UnhandledError" Oct 03 12:49:53 crc kubenswrapper[4962]: W1003 12:49:53.563920 4962 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.172:6443: connect: connection refused Oct 03 12:49:53 crc kubenswrapper[4962]: E1003 12:49:53.564001 4962 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get 
\"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.172:6443: connect: connection refused" logger="UnhandledError" Oct 03 12:49:53 crc kubenswrapper[4962]: E1003 12:49:53.588952 4962 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.172:6443: connect: connection refused" interval="1.6s" Oct 03 12:49:53 crc kubenswrapper[4962]: W1003 12:49:53.617814 4962 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.172:6443: connect: connection refused Oct 03 12:49:53 crc kubenswrapper[4962]: E1003 12:49:53.617892 4962 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.172:6443: connect: connection refused" logger="UnhandledError" Oct 03 12:49:53 crc kubenswrapper[4962]: I1003 12:49:53.815503 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 12:49:53 crc kubenswrapper[4962]: I1003 12:49:53.817191 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:49:53 crc kubenswrapper[4962]: I1003 12:49:53.817227 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:49:53 crc kubenswrapper[4962]: I1003 12:49:53.817239 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:49:53 crc kubenswrapper[4962]: I1003 12:49:53.817261 4962 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 12:49:53 crc kubenswrapper[4962]: E1003 12:49:53.817739 4962 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.172:6443: connect: connection refused" node="crc" Oct 03 12:49:54 crc kubenswrapper[4962]: I1003 12:49:54.177389 4962 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.172:6443: connect: connection refused Oct 03 12:49:54 crc kubenswrapper[4962]: I1003 12:49:54.245421 4962 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="e104d4f9cfb2e600dba4f462bdf9ee4aa5453afbb5174857fc61fc85d90f9642" exitCode=0 Oct 03 12:49:54 crc kubenswrapper[4962]: I1003 12:49:54.245495 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"e104d4f9cfb2e600dba4f462bdf9ee4aa5453afbb5174857fc61fc85d90f9642"} Oct 03 12:49:54 crc kubenswrapper[4962]: I1003 12:49:54.245568 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 12:49:54 crc kubenswrapper[4962]: I1003 12:49:54.246879 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 03 12:49:54 crc kubenswrapper[4962]: I1003 12:49:54.246921 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:49:54 crc kubenswrapper[4962]: I1003 12:49:54.246938 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:49:54 crc kubenswrapper[4962]: I1003 12:49:54.248587 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7249baacfeb01d1f25f6d9ce291033941226c4be4cba0376a47872b8b10f6ae6"} Oct 03 12:49:54 crc kubenswrapper[4962]: I1003 12:49:54.248632 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"86846072db8a357f62428197a22e8b8f829b013c9292a53091819ceb6eb7aeb9"} Oct 03 12:49:54 crc kubenswrapper[4962]: I1003 12:49:54.248679 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f1f016344ea9d266374fcbfbe80061313994a45b595ee3432a95b1bbd332b320"} Oct 03 12:49:54 crc kubenswrapper[4962]: I1003 12:49:54.250025 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d" exitCode=0 Oct 03 12:49:54 crc kubenswrapper[4962]: I1003 12:49:54.250078 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d"} Oct 03 12:49:54 crc kubenswrapper[4962]: I1003 12:49:54.250114 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 12:49:54 crc kubenswrapper[4962]: I1003 12:49:54.250914 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:49:54 crc kubenswrapper[4962]: I1003 12:49:54.250941 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:49:54 crc kubenswrapper[4962]: I1003 12:49:54.250955 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:49:54 crc kubenswrapper[4962]: I1003 12:49:54.251906 4962 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e" exitCode=0 Oct 03 12:49:54 crc kubenswrapper[4962]: I1003 12:49:54.252067 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 12:49:54 crc kubenswrapper[4962]: I1003 12:49:54.252222 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 12:49:54 crc kubenswrapper[4962]: I1003 12:49:54.252274 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e"} Oct 03 12:49:54 crc kubenswrapper[4962]: I1003 
12:49:54.252889 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:49:54 crc kubenswrapper[4962]: I1003 12:49:54.252919 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:49:54 crc kubenswrapper[4962]: I1003 12:49:54.252935 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:49:54 crc kubenswrapper[4962]: I1003 12:49:54.253023 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:49:54 crc kubenswrapper[4962]: I1003 12:49:54.253043 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:49:54 crc kubenswrapper[4962]: I1003 12:49:54.253052 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:49:54 crc kubenswrapper[4962]: I1003 12:49:54.253409 4962 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="a0f0e7ddaee0852f8955a31ea974c460c5af2dae4ea15f6fa68d65fe9f0d63e1" exitCode=0 Oct 03 12:49:54 crc kubenswrapper[4962]: I1003 12:49:54.253443 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"a0f0e7ddaee0852f8955a31ea974c460c5af2dae4ea15f6fa68d65fe9f0d63e1"} Oct 03 12:49:54 crc kubenswrapper[4962]: I1003 12:49:54.253499 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 12:49:54 crc kubenswrapper[4962]: I1003 12:49:54.254590 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:49:54 crc kubenswrapper[4962]: I1003 12:49:54.254658 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:49:54 crc kubenswrapper[4962]: I1003 12:49:54.254682 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:49:55 crc kubenswrapper[4962]: I1003 12:49:55.177081 4962 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.172:6443: connect: connection refused Oct 03 12:49:55 crc kubenswrapper[4962]: E1003 12:49:55.190453 4962 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.172:6443: connect: connection refused" interval="3.2s" Oct 03 12:49:55 crc kubenswrapper[4962]: I1003 12:49:55.258181 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e5f8877bc7bae269cbbb5a1452d42917b7a32af9bc45a303b25be8b1058ef937"} Oct 03 12:49:55 crc kubenswrapper[4962]: I1003 12:49:55.258222 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"bf6ff72af1d85ecca71976d9d16b8ca0ee91d035662920a8c58d739e560c12ea"} Oct 03 12:49:55 crc kubenswrapper[4962]: 
I1003 12:49:55.260259 4962 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9" exitCode=0 Oct 03 12:49:55 crc kubenswrapper[4962]: I1003 12:49:55.260342 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9"} Oct 03 12:49:55 crc kubenswrapper[4962]: I1003 12:49:55.260369 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 12:49:55 crc kubenswrapper[4962]: I1003 12:49:55.261253 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:49:55 crc kubenswrapper[4962]: I1003 12:49:55.261284 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:49:55 crc kubenswrapper[4962]: I1003 12:49:55.261295 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:49:55 crc kubenswrapper[4962]: I1003 12:49:55.261756 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"8ad2b9fc4c53924d160f9bcbf329977550ed3f9e5724a7930bc19b137f412208"} Oct 03 12:49:55 crc kubenswrapper[4962]: I1003 12:49:55.261794 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 12:49:55 crc kubenswrapper[4962]: I1003 12:49:55.262587 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:49:55 crc kubenswrapper[4962]: I1003 12:49:55.262612 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:49:55 crc kubenswrapper[4962]: I1003 12:49:55.262621 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:49:55 crc kubenswrapper[4962]: I1003 12:49:55.271383 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"b9e84d29d6915b3a6b184d69f18945d3fd277719a6cc6d503b2899df586882a9"} Oct 03 12:49:55 crc kubenswrapper[4962]: I1003 12:49:55.271426 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a9a16c565e235b0d93e373cc23d6d8ee7d889a7caec61ef9ff50c3e904cd3893"} Oct 03 12:49:55 crc kubenswrapper[4962]: I1003 12:49:55.287028 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9d6e876d0d7a29e1ceb505283a70c9b8fd0b4eee1dec1a2f04f3eeec4b09f4c8"} Oct 03 12:49:55 crc kubenswrapper[4962]: I1003 12:49:55.287133 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 12:49:55 crc kubenswrapper[4962]: I1003 12:49:55.288352 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:49:55 crc 
kubenswrapper[4962]: I1003 12:49:55.288390 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:49:55 crc kubenswrapper[4962]: I1003 12:49:55.288402 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:49:55 crc kubenswrapper[4962]: W1003 12:49:55.329338 4962 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.172:6443: connect: connection refused Oct 03 12:49:55 crc kubenswrapper[4962]: E1003 12:49:55.329408 4962 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.172:6443: connect: connection refused" logger="UnhandledError" Oct 03 12:49:55 crc kubenswrapper[4962]: W1003 12:49:55.362827 4962 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.172:6443: connect: connection refused Oct 03 12:49:55 crc kubenswrapper[4962]: E1003 12:49:55.362947 4962 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.172:6443: connect: connection refused" logger="UnhandledError" Oct 03 12:49:55 crc kubenswrapper[4962]: I1003 12:49:55.418602 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 12:49:55 crc kubenswrapper[4962]: I1003 12:49:55.420402 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:49:55 crc kubenswrapper[4962]: I1003 12:49:55.420440 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:49:55 crc kubenswrapper[4962]: I1003 12:49:55.420452 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:49:55 crc kubenswrapper[4962]: I1003 12:49:55.420484 4962 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 12:49:55 crc kubenswrapper[4962]: E1003 12:49:55.421852 4962 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.172:6443: connect: connection refused" node="crc" Oct 03 12:49:56 crc kubenswrapper[4962]: W1003 12:49:56.174214 4962 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.172:6443: connect: connection refused Oct 03 12:49:56 crc kubenswrapper[4962]: E1003 12:49:56.176304 4962 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.172:6443: 
connect: connection refused" logger="UnhandledError" Oct 03 12:49:56 crc kubenswrapper[4962]: I1003 12:49:56.177194 4962 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.172:6443: connect: connection refused Oct 03 12:49:56 crc kubenswrapper[4962]: I1003 12:49:56.291499 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"372bc3d3078a6ab39767510fe9090adb082ed3de331d851831b710df064bef5d"} Oct 03 12:49:56 crc kubenswrapper[4962]: I1003 12:49:56.291625 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 12:49:56 crc kubenswrapper[4962]: I1003 12:49:56.292659 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:49:56 crc kubenswrapper[4962]: I1003 12:49:56.292704 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:49:56 crc kubenswrapper[4962]: I1003 12:49:56.292722 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:49:56 crc kubenswrapper[4962]: I1003 12:49:56.294934 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"22239bfd4bc82de5ba41662ff973e8c1e0fa462857cae705d4940afccc550782"} Oct 03 12:49:56 crc kubenswrapper[4962]: I1003 12:49:56.294965 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8dafe1e9291df9776980832fd7d43a1d5e4caf4cd6824740f9e1f2282a6c4bf9"} Oct 03 12:49:56 crc kubenswrapper[4962]: I1003 12:49:56.294978 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a8eb3048afe2307009fd3ccfbc314a1b60389c0ba17bad1b898578cb1b151df8"} Oct 03 12:49:56 crc kubenswrapper[4962]: I1003 12:49:56.294989 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 12:49:56 crc kubenswrapper[4962]: I1003 12:49:56.295710 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:49:56 crc kubenswrapper[4962]: I1003 12:49:56.295737 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:49:56 crc kubenswrapper[4962]: I1003 12:49:56.295746 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:49:56 crc kubenswrapper[4962]: I1003 12:49:56.297191 4962 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114" exitCode=0 Oct 03 12:49:56 crc kubenswrapper[4962]: I1003 12:49:56.297248 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114"} Oct 03 12:49:56 
crc kubenswrapper[4962]: I1003 12:49:56.297266 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 12:49:56 crc kubenswrapper[4962]: I1003 12:49:56.297334 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 12:49:56 crc kubenswrapper[4962]: I1003 12:49:56.297675 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 12:49:56 crc kubenswrapper[4962]: I1003 12:49:56.301753 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:49:56 crc kubenswrapper[4962]: I1003 12:49:56.301783 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:49:56 crc kubenswrapper[4962]: I1003 12:49:56.301794 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:49:56 crc kubenswrapper[4962]: I1003 12:49:56.302413 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:49:56 crc kubenswrapper[4962]: I1003 12:49:56.302436 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:49:56 crc kubenswrapper[4962]: I1003 12:49:56.302455 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:49:56 crc kubenswrapper[4962]: I1003 12:49:56.302456 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:49:56 crc kubenswrapper[4962]: I1003 12:49:56.302464 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:49:56 crc kubenswrapper[4962]: I1003 12:49:56.302473 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:49:57 crc kubenswrapper[4962]: I1003 12:49:57.229803 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 12:49:57 crc kubenswrapper[4962]: I1003 12:49:57.303131 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d334e30066203e6856129958923dcb174b5541a9e41e11a4e574ac59ec86c861"} Oct 03 12:49:57 crc kubenswrapper[4962]: I1003 12:49:57.303166 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 12:49:57 crc kubenswrapper[4962]: I1003 12:49:57.303187 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 12:49:57 crc kubenswrapper[4962]: I1003 12:49:57.303185 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"dcb83ba4c1a1eb1a63fce31b68210e9af2c5e722097b457b4642b4c32f60c2e6"} Oct 03 12:49:57 crc kubenswrapper[4962]: I1003 12:49:57.303166 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 12:49:57 crc kubenswrapper[4962]: I1003 12:49:57.303287 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"19b1520abe4d95d8e776c2f589c8f7ada74c46eef43f1bcaf441272ae75241a8"} Oct 03 12:49:57 crc kubenswrapper[4962]: I1003 12:49:57.303422 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 12:49:57 crc kubenswrapper[4962]: I1003 12:49:57.303441 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"cb41be48a3ebc6ba936c2ec3b2a483349e8d597d0483a618a0d452a2ee13538c"} Oct 03 12:49:57 crc kubenswrapper[4962]: I1003 12:49:57.303461 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 12:49:57 crc kubenswrapper[4962]: I1003 12:49:57.303473 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"dbdc43e3dcb77ff3ee44368fa74c53d2f360a508c04525f15879db5c6b2209b3"} Oct 03 12:49:57 crc kubenswrapper[4962]: I1003 12:49:57.303190 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 12:49:57 crc kubenswrapper[4962]: I1003 12:49:57.307740 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:49:57 crc kubenswrapper[4962]: I1003 12:49:57.307803 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:49:57 crc kubenswrapper[4962]: I1003 12:49:57.308247 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:49:57 crc kubenswrapper[4962]: I1003 12:49:57.308226 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:49:57 crc kubenswrapper[4962]: I1003 12:49:57.308286 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:49:57 crc kubenswrapper[4962]: I1003 12:49:57.308314 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:49:57 crc kubenswrapper[4962]: I1003 12:49:57.308322 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:49:57 crc kubenswrapper[4962]: I1003 12:49:57.308326 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:49:57 crc kubenswrapper[4962]: I1003 12:49:57.308333 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:49:57 crc kubenswrapper[4962]: I1003 12:49:57.308940 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:49:57 crc kubenswrapper[4962]: I1003 12:49:57.308977 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:49:57 crc kubenswrapper[4962]: I1003 12:49:57.308989 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:49:57 crc kubenswrapper[4962]: I1003 12:49:57.381231 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 03 12:49:58 crc kubenswrapper[4962]: I1003 
12:49:58.305593 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 12:49:58 crc kubenswrapper[4962]: I1003 12:49:58.305624 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 12:49:58 crc kubenswrapper[4962]: I1003 12:49:58.305773 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 12:49:58 crc kubenswrapper[4962]: I1003 12:49:58.306588 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:49:58 crc kubenswrapper[4962]: I1003 12:49:58.306625 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:49:58 crc kubenswrapper[4962]: I1003 12:49:58.306675 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:49:58 crc kubenswrapper[4962]: I1003 12:49:58.306789 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:49:58 crc kubenswrapper[4962]: I1003 12:49:58.306824 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:49:58 crc kubenswrapper[4962]: I1003 12:49:58.306849 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:49:58 crc kubenswrapper[4962]: I1003 12:49:58.306856 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:49:58 crc kubenswrapper[4962]: I1003 12:49:58.306879 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:49:58 crc kubenswrapper[4962]: I1003 12:49:58.306906 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:49:58 crc kubenswrapper[4962]: I1003 12:49:58.622003 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 12:49:58 crc kubenswrapper[4962]: I1003 12:49:58.623256 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:49:58 crc kubenswrapper[4962]: I1003 12:49:58.623292 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:49:58 crc kubenswrapper[4962]: I1003 12:49:58.623305 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:49:58 crc kubenswrapper[4962]: I1003 12:49:58.623330 4962 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 12:49:58 crc kubenswrapper[4962]: I1003 12:49:58.746425 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 12:49:58 crc kubenswrapper[4962]: I1003 12:49:58.746575 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 12:49:58 crc kubenswrapper[4962]: I1003 12:49:58.747839 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:49:58 crc kubenswrapper[4962]: I1003 12:49:58.747880 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:49:58 crc 
kubenswrapper[4962]: I1003 12:49:58.747893 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:49:58 crc kubenswrapper[4962]: I1003 12:49:58.879899 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 12:49:59 crc kubenswrapper[4962]: I1003 12:49:59.307302 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 12:49:59 crc kubenswrapper[4962]: I1003 12:49:59.307302 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 12:49:59 crc kubenswrapper[4962]: I1003 12:49:59.308204 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:49:59 crc kubenswrapper[4962]: I1003 12:49:59.308244 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:49:59 crc kubenswrapper[4962]: I1003 12:49:59.308261 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:49:59 crc kubenswrapper[4962]: I1003 12:49:59.309133 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:49:59 crc kubenswrapper[4962]: I1003 12:49:59.309168 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:49:59 crc kubenswrapper[4962]: I1003 12:49:59.309185 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:00 crc kubenswrapper[4962]: I1003 12:50:00.230397 4962 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 03 12:50:00 crc kubenswrapper[4962]: I1003 12:50:00.230486 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 12:50:00 crc kubenswrapper[4962]: I1003 12:50:00.807568 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 12:50:00 crc kubenswrapper[4962]: I1003 12:50:00.807741 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 12:50:00 crc kubenswrapper[4962]: I1003 12:50:00.808749 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:00 crc kubenswrapper[4962]: I1003 12:50:00.808780 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:00 crc kubenswrapper[4962]: I1003 12:50:00.808793 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:01 crc kubenswrapper[4962]: I1003 12:50:01.862227 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 12:50:01 crc kubenswrapper[4962]: I1003 12:50:01.862371 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 12:50:01 crc kubenswrapper[4962]: I1003 12:50:01.863244 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:01 crc kubenswrapper[4962]: I1003 12:50:01.863277 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:01 crc kubenswrapper[4962]: I1003 12:50:01.863286 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:02 crc kubenswrapper[4962]: E1003 12:50:02.314287 4962 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 03 12:50:02 crc kubenswrapper[4962]: I1003 12:50:02.842417 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 03 12:50:02 crc kubenswrapper[4962]: I1003 12:50:02.842591 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 12:50:02 crc kubenswrapper[4962]: I1003 12:50:02.843541 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:02 crc kubenswrapper[4962]: I1003 12:50:02.843570 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:02 crc kubenswrapper[4962]: I1003 12:50:02.843582 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:03 crc kubenswrapper[4962]: I1003 12:50:03.287691 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 12:50:03 crc kubenswrapper[4962]: I1003 12:50:03.287830 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 12:50:03 crc kubenswrapper[4962]: I1003 12:50:03.289461 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:03 crc kubenswrapper[4962]: I1003 12:50:03.289490 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:03 crc kubenswrapper[4962]: I1003 12:50:03.289500 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:03 crc kubenswrapper[4962]: I1003 12:50:03.292651 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 12:50:03 crc kubenswrapper[4962]: I1003 12:50:03.316523 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 12:50:03 crc kubenswrapper[4962]: I1003 12:50:03.317658 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:03 crc kubenswrapper[4962]: I1003 12:50:03.317697 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:03 crc kubenswrapper[4962]: I1003 12:50:03.317707 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 03 12:50:03 crc kubenswrapper[4962]: I1003 12:50:03.319589 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 12:50:04 crc kubenswrapper[4962]: I1003 12:50:04.318442 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 12:50:04 crc kubenswrapper[4962]: I1003 12:50:04.320113 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:04 crc kubenswrapper[4962]: I1003 12:50:04.320243 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:04 crc kubenswrapper[4962]: I1003 12:50:04.320337 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:06 crc kubenswrapper[4962]: W1003 12:50:06.439333 4962 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 03 12:50:06 crc kubenswrapper[4962]: I1003 12:50:06.439418 4962 trace.go:236] Trace[276311036]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Oct-2025 12:49:56.437) (total time: 10001ms): Oct 03 12:50:06 crc kubenswrapper[4962]: Trace[276311036]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (12:50:06.439) Oct 03 12:50:06 crc kubenswrapper[4962]: Trace[276311036]: [10.001514381s] [10.001514381s] END Oct 03 12:50:06 crc kubenswrapper[4962]: E1003 12:50:06.439441 4962 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 03 12:50:07 crc kubenswrapper[4962]: I1003 12:50:07.096858 4962 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 03 12:50:07 crc kubenswrapper[4962]: I1003 12:50:07.096923 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 03 12:50:07 crc kubenswrapper[4962]: I1003 12:50:07.104324 4962 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 03 12:50:07 crc kubenswrapper[4962]: I1003 12:50:07.104412 4962 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 03 12:50:07 crc kubenswrapper[4962]: I1003 12:50:07.330614 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 03 12:50:07 crc kubenswrapper[4962]: I1003 12:50:07.332798 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="22239bfd4bc82de5ba41662ff973e8c1e0fa462857cae705d4940afccc550782" exitCode=255 Oct 03 12:50:07 crc kubenswrapper[4962]: I1003 12:50:07.332840 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"22239bfd4bc82de5ba41662ff973e8c1e0fa462857cae705d4940afccc550782"} Oct 03 12:50:07 crc kubenswrapper[4962]: I1003 12:50:07.332995 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 12:50:07 crc kubenswrapper[4962]: I1003 12:50:07.333784 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:07 crc kubenswrapper[4962]: I1003 12:50:07.333813 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:07 crc kubenswrapper[4962]: I1003 12:50:07.333822 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:07 crc kubenswrapper[4962]: I1003 12:50:07.334230 4962 scope.go:117] "RemoveContainer" containerID="22239bfd4bc82de5ba41662ff973e8c1e0fa462857cae705d4940afccc550782" Oct 03 12:50:08 crc kubenswrapper[4962]: I1003 12:50:08.336444 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 03 12:50:08 crc kubenswrapper[4962]: I1003 12:50:08.338026 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"329c763b93db8ac9a4cd93231684840cdfa4c6f9d1318fcfc641f4e86b4fc39e"} Oct 03 12:50:08 crc kubenswrapper[4962]: I1003 12:50:08.338135 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 12:50:08 crc kubenswrapper[4962]: I1003 12:50:08.339042 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:08 crc kubenswrapper[4962]: I1003 12:50:08.339088 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:08 crc kubenswrapper[4962]: I1003 12:50:08.339100 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:10 crc kubenswrapper[4962]: I1003 12:50:10.231235 4962 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 03 12:50:10 crc 
kubenswrapper[4962]: I1003 12:50:10.231318 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 03 12:50:10 crc kubenswrapper[4962]: I1003 12:50:10.814717 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 12:50:10 crc kubenswrapper[4962]: I1003 12:50:10.814832 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 12:50:10 crc kubenswrapper[4962]: I1003 12:50:10.815254 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 12:50:10 crc kubenswrapper[4962]: I1003 12:50:10.815753 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:10 crc kubenswrapper[4962]: I1003 12:50:10.815799 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:10 crc kubenswrapper[4962]: I1003 12:50:10.815817 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:10 crc kubenswrapper[4962]: I1003 12:50:10.820797 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 12:50:11 crc kubenswrapper[4962]: I1003 12:50:11.346058 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 12:50:11 crc kubenswrapper[4962]: I1003 12:50:11.347952 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:11 crc kubenswrapper[4962]: I1003 12:50:11.348003 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:11 crc kubenswrapper[4962]: I1003 12:50:11.348021 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:11 crc kubenswrapper[4962]: I1003 12:50:11.937515 4962 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Oct 03 12:50:12 crc kubenswrapper[4962]: E1003 12:50:12.075061 4962 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.075980 4962 trace.go:236] Trace[1684289204]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Oct-2025 12:49:59.198) (total time: 12877ms): Oct 03 12:50:12 crc kubenswrapper[4962]: Trace[1684289204]: ---"Objects listed" error: 12877ms (12:50:12.075) Oct 03 12:50:12 crc kubenswrapper[4962]: Trace[1684289204]: [12.877096485s] [12.877096485s] END Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.076016 4962 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.094564 4962 trace.go:236] Trace[36852985]: "Reflector ListAndWatch" 
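"Failed to ensure lease exists, will retry" above is the kubelet heartbeat: the kubelet creates and renews a coordination.k8s.io Lease named after the node in the kube-node-lease namespace, and backs off (here to an interval of 6.4s) while the API server is unreachable. A minimal sketch reading that Lease once the API answers; KUBECONFIG as the config source is again an assumption:

```go
// node_lease.go — reads the kubelet heartbeat Lease that the retry above is
// trying to create/renew (the same object behind
// .../namespaces/kube-node-lease/leases/crc in the logged URL).
package main

import (
	"context"
	"fmt"
	"log"
	"os"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", os.Getenv("KUBECONFIG"))
	if err != nil {
		log.Fatal(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		log.Fatal(err)
	}
	lease, err := cs.CoordinationV1().Leases("kube-node-lease").Get(context.TODO(), "crc", metav1.GetOptions{})
	if err != nil {
		log.Fatal(err) // absent until the kubelet's lease controller succeeds
	}
	holder := ""
	if lease.Spec.HolderIdentity != nil {
		holder = *lease.Spec.HolderIdentity
	}
	fmt.Printf("holder=%s renewed=%v\n", holder, lease.Spec.RenewTime)
}
```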
name:k8s.io/client-go/informers/factory.go:160 (03-Oct-2025 12:49:58.709) (total time: 13384ms): Oct 03 12:50:12 crc kubenswrapper[4962]: Trace[36852985]: ---"Objects listed" error: 13384ms (12:50:12.094) Oct 03 12:50:12 crc kubenswrapper[4962]: Trace[36852985]: [13.384772775s] [13.384772775s] END Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.094614 4962 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Oct 03 12:50:12 crc kubenswrapper[4962]: E1003 12:50:12.095336 4962 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.096344 4962 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.096742 4962 trace.go:236] Trace[614486104]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Oct-2025 12:49:59.621) (total time: 12475ms): Oct 03 12:50:12 crc kubenswrapper[4962]: Trace[614486104]: ---"Objects listed" error: 12475ms (12:50:12.096) Oct 03 12:50:12 crc kubenswrapper[4962]: Trace[614486104]: [12.475401519s] [12.475401519s] END Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.096756 4962 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.173988 4962 apiserver.go:52] "Watching apiserver" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.177523 4962 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.178430 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"] Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.178893 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:50:12 crc kubenswrapper[4962]: E1003 12:50:12.178956 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.179205 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.179919 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:50:12 crc kubenswrapper[4962]: E1003 12:50:12.180408 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.180529 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.181071 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.181088 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:50:12 crc kubenswrapper[4962]: E1003 12:50:12.181149 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.183165 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.184004 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.184256 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.184390 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.184596 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.184686 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.185077 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.193364 4962 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.196767 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.196826 4962 
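Every "NetworkPluginNotReady" error above points at the same root cause: no CNI configuration file in /etc/kubernetes/cni/net.d/, so no pod sandbox can get networking until the network plugin writes its config there. A quick check to run on the node itself, with the path copied verbatim from the kubelet error:

```go
// cni_check.go — lists the CNI config dir named in the NetworkPluginNotReady
// errors; run on the node. Empty or missing means sandbox creation keeps failing.
package main

import (
	"fmt"
	"log"
	"os"
	"path/filepath"
)

func main() {
	dir := "/etc/kubernetes/cni/net.d" // path taken verbatim from the kubelet error
	entries, err := os.ReadDir(dir)
	if err != nil {
		log.Fatalf("CNI config dir unreadable: %v", err)
	}
	if len(entries) == 0 {
		fmt.Println("no CNI configuration files; sandbox creation will keep failing")
		return
	}
	for _, e := range entries {
		fmt.Println(filepath.Join(dir, e.Name()))
	}
}
```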
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.196853 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.196880 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.196908 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.196930 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.196955 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.196979 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.197003 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.197012 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.197202 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
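Each "operationExecutor.UnmountVolume started" line above names a volume by UniqueName: the plugin name plus "<podUID>-<volumeName>". On disk those mounts live under /var/lib/kubelet/pods/<podUID>/volumes/<plugin>/<volumeName>, with the "/" in the plugin name flattened to "~" (kubernetes.io~projected, kubernetes.io~configmap, and so on). A sketch that walks whatever is currently mounted there, to cross-check against this unmount storm:

```go
// pod_volumes.go — walks the on-disk layout behind the UnmountVolume lines:
// /var/lib/kubelet/pods/<podUID>/volumes/<plugin>/<volumeName>.
package main

import (
	"fmt"
	"log"
	"os"
	"path/filepath"
)

func main() {
	root := "/var/lib/kubelet/pods"
	pods, err := os.ReadDir(root)
	if err != nil {
		log.Fatal(err)
	}
	for _, pod := range pods {
		volRoot := filepath.Join(root, pod.Name(), "volumes")
		plugins, err := os.ReadDir(volRoot)
		if err != nil {
			continue // pod with no volumes, or already cleaned up
		}
		for _, plugin := range plugins {
			vols, _ := os.ReadDir(filepath.Join(volRoot, plugin.Name()))
			for _, v := range vols {
				// e.g. <podUID> kubernetes.io~configmap trusted-ca
				fmt.Println(pod.Name(), plugin.Name(), v.Name())
			}
		}
	}
}
```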
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.197182 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.197027 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.197303 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.197307 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.197333 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.197361 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.197380 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.197382 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.197431 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.197451 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.197468 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.197482 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.197498 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.197514 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.197518 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.197529 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.197830 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.197851 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.197867 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.197883 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.197904 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.197921 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.197936 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.197952 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.197968 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod 
\"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.197985 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.198002 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.198020 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.198038 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.198092 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.198143 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.198167 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.198187 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.198220 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.198246 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.198298 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.198321 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.198344 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.198364 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.198384 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.198403 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.198425 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.198446 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.198467 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.198489 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod 
\"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.198511 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.198533 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.198556 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.198581 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.198672 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.198704 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.198757 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.198867 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.198897 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.198921 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.198975 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199000 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199025 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199050 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199071 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199096 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199119 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199143 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199165 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199188 4962 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199214 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199238 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199292 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199322 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199349 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199367 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199384 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.197585 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199440 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.197586 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199461 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199479 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199496 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199513 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199529 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199545 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199561 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199583 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199605 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199659 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199681 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199699 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199714 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199730 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199748 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199766 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199784 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199802 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199831 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199848 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199866 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199882 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199901 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199918 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199934 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199948 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199964 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 
12:50:12.199980 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199996 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.200013 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.200028 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.200044 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.200061 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.200077 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.200094 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.200110 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.200125 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 03 12:50:12 crc 
kubenswrapper[4962]: I1003 12:50:12.200144 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.200161 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.200178 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.200195 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.200210 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.200227 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.200242 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.200259 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.200274 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.200290 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.200306 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.200324 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.200340 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.200355 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.200371 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.200387 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.200404 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.200421 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.200439 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.200462 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod 
\"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.200486 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.200503 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.200520 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.200538 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.200561 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.200584 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.200601 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.200619 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.202775 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.202821 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.202851 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.202873 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.202894 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.202917 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.202940 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.202973 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.202996 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.203019 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.203042 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.203063 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.203086 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.203110 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.203134 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.203157 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.203177 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.203197 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.203219 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.203241 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.203261 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.203283 4962 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.203304 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.203325 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.203348 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.203370 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.203390 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.203414 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.203435 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.203459 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.203479 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.203499 4962 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.203521 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.203544 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.203565 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.203585 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.203606 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.203628 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.203671 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.203694 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.203716 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 03 12:50:12 crc kubenswrapper[4962]: 
I1003 12:50:12.203737 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.203759 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.203780 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.203803 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.203825 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.203851 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.203873 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.203896 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.203919 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.203940 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.203962 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.204016 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.204045 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.204071 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.204098 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.204125 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.204151 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.204173 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.204196 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.204219 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.204243 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.204265 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.204289 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.204315 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.204338 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.204405 4962 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.204422 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.204436 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" 
DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.204449 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.204462 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.204481 4962 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.204494 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.221018 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.221239 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.197656 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.197743 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.197747 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.197952 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.198001 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.198209 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.198432 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.198463 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.198544 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.198611 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.198709 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.198784 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.198795 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.198818 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.198990 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199012 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199026 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199111 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199221 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). 
InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199233 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199308 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199375 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199423 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199517 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199763 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199780 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.199979 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.200067 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.200615 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.201062 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.201302 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.201380 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.204397 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.235620 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.237782 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: E1003 12:50:12.238320 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 12:50:12 crc kubenswrapper[4962]: E1003 12:50:12.238339 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 12:50:12 crc kubenswrapper[4962]: E1003 12:50:12.238351 4962 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.238459 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.239151 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). 
InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: E1003 12:50:12.241499 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 12:50:12 crc kubenswrapper[4962]: E1003 12:50:12.241523 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 12:50:12 crc kubenswrapper[4962]: E1003 12:50:12.241533 4962 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.244746 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.244807 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.245268 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.245331 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.245362 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.245614 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.245649 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.226427 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 12:50:12 crc kubenswrapper[4962]: E1003 12:50:12.205839 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:50:12.705815229 +0000 UTC m=+21.109713064 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:50:12 crc kubenswrapper[4962]: E1003 12:50:12.245823 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 12:50:12.745796458 +0000 UTC m=+21.149694293 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 12:50:12 crc kubenswrapper[4962]: E1003 12:50:12.245842 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 12:50:12.745834549 +0000 UTC m=+21.149732384 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.206262 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.217177 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.217190 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.217230 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.217446 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.217752 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.217791 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). 
InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.218029 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.218730 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.218781 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.218836 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.218912 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.218989 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.219011 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.219143 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.219201 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.219263 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.219674 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: E1003 12:50:12.219917 4962 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 12:50:12 crc kubenswrapper[4962]: E1003 12:50:12.246073 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 12:50:12.746060795 +0000 UTC m=+21.149958630 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.221158 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.221623 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.222202 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.222384 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.222436 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.222447 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.222682 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.222868 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.222928 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: E1003 12:50:12.223309 4962 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 12:50:12 crc kubenswrapper[4962]: E1003 12:50:12.246190 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 12:50:12.746183229 +0000 UTC m=+21.150081064 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.224145 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.224417 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.224436 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.224518 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.225104 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.225203 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.225434 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.225269 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.225972 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.226076 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.226267 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.226474 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.226674 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.226766 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.226892 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.226913 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.226934 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.227400 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.227425 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.227602 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.230042 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). 
InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.230396 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.230535 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.230697 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.230756 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.231131 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.232450 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.232608 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.232970 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.233044 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.233105 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.233210 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.233342 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.233420 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.223008 4962 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.247829 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.248072 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.248314 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.248627 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.245381 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.248766 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.249137 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.249213 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.249294 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.250067 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.252169 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.252427 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.252956 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.253012 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.253040 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.253182 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.253270 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.253323 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.253314 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.253811 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.256666 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.256966 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.257116 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.257134 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.259526 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.260230 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.260264 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.259860 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.259952 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.260340 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.260400 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.260943 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.260957 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). 
InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.261145 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.261276 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.261626 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.262035 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.262179 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.262237 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.262949 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.264239 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.265516 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.266260 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.266341 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.266570 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.266658 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.266704 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.266987 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.267004 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.267169 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.267223 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.267318 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.267428 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.267552 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.267600 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.267570 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.267705 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.267864 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.268036 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.268287 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.268529 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.268623 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.268913 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.269051 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.270039 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.270125 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.270288 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.270312 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.270378 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.272022 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.272034 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.272187 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.272182 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.272264 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.272312 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.272369 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.272459 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.272823 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.273253 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.273307 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.273592 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.273618 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.273857 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.273880 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.274277 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.274285 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.275294 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.275959 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.275981 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.276089 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.275984 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.276998 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.278545 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.280845 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.281700 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.283214 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.284214 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.286424 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.286905 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.288327 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.288865 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.289197 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.289254 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.290812 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.291144 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.293821 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.295311 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.295814 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.300662 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.305956 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.306095 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.306182 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.306197 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.306210 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.306220 4962 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.306230 4962 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.306239 4962 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.306248 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.306257 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.306266 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.306274 4962 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.306309 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.306324 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.306335 4962 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.306345 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.306355 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.306385 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.306397 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.306407 4962 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.306416 4962 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.306424 4962 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.306434 4962 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.306443 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.306471 4962 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.306480 4962 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.306490 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.306498 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.306506 4962 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.306514 4962 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.306522 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.306530 4962 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.306538 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" 
(UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.306546 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.306555 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.306564 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.306575 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.306586 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.306597 4962 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.306212 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.306670 4962 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.306719 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.306748 4962 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.306955 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.306986 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: 
\"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307050 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307064 4962 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307078 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307091 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307102 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307112 4962 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307122 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307131 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307143 4962 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307157 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307169 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307183 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307194 
4962 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307204 4962 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307213 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307222 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307233 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307247 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307260 4962 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307272 4962 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307283 4962 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307297 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307309 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307322 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307333 4962 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307345 4962 reconciler_common.go:293] "Volume 
detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307364 4962 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307377 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307391 4962 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307405 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307418 4962 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307431 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307444 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307456 4962 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307468 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307480 4962 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307492 4962 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307504 4962 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307517 4962 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307528 4962 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307540 4962 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307552 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307565 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307576 4962 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307589 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307600 4962 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307612 4962 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307624 4962 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307652 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307664 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307677 4962 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307700 4962 reconciler_common.go:293] "Volume 
detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307710 4962 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307719 4962 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307728 4962 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307736 4962 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307746 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307755 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307764 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307773 4962 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307782 4962 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307791 4962 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307802 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307811 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307820 4962 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307830 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307839 4962 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307848 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307857 4962 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307867 4962 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307876 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307884 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307894 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307904 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307916 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307925 4962 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307934 4962 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307943 4962 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307951 4962 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307959 4962 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307969 4962 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307977 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307987 4962 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.307996 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308004 4962 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308013 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308022 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308030 4962 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308038 4962 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308047 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308055 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" 
(UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308064 4962 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308073 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308081 4962 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308091 4962 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308100 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308109 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308117 4962 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308126 4962 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308135 4962 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308143 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308152 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308160 4962 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308168 4962 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308178 4962 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308188 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308198 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308207 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308216 4962 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308225 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308234 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308242 4962 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308250 4962 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308259 4962 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308267 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308276 4962 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308285 4962 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308295 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308304 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308312 4962 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308354 4962 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308362 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308370 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308381 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308391 4962 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308399 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308408 4962 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308416 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308425 4962 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308433 4962 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308441 4962 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308450 4962 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308459 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308467 4962 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308476 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308485 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308495 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308504 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308514 4962 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308524 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308532 4962 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308540 4962 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308549 4962 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308557 4962 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308566 4962 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.308574 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.310387 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.320896 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.333059 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.342773 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.354031 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.363161 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.367492 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.378846 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.517762 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.533460 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 12:50:12 crc kubenswrapper[4962]: W1003 12:50:12.533921 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-79d67ad2368bc47e9f82c1b8c02f17dc86816047757ac8b6ae7858fbab1a158e WatchSource:0}: Error finding container 79d67ad2368bc47e9f82c1b8c02f17dc86816047757ac8b6ae7858fbab1a158e: Status 404 returned error can't find the container with id 79d67ad2368bc47e9f82c1b8c02f17dc86816047757ac8b6ae7858fbab1a158e Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.552824 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 12:50:12 crc kubenswrapper[4962]: W1003 12:50:12.567169 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-31a03527ce02389a7b4733bbd5bb91aeddbb0709348ccc21e6dd83307af19437 WatchSource:0}: Error finding container 31a03527ce02389a7b4733bbd5bb91aeddbb0709348ccc21e6dd83307af19437: Status 404 returned error can't find the container with id 31a03527ce02389a7b4733bbd5bb91aeddbb0709348ccc21e6dd83307af19437 Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.712237 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:50:12 crc kubenswrapper[4962]: E1003 12:50:12.712414 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:50:13.712387588 +0000 UTC m=+22.116285423 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.813187 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.813233 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.813252 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.813271 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:50:12 crc kubenswrapper[4962]: E1003 12:50:12.813371 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 12:50:12 crc kubenswrapper[4962]: E1003 12:50:12.813404 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 12:50:12 crc kubenswrapper[4962]: E1003 12:50:12.813412 4962 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 12:50:12 crc kubenswrapper[4962]: E1003 12:50:12.813386 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 12:50:12 crc kubenswrapper[4962]: E1003 12:50:12.813430 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 12:50:12 crc kubenswrapper[4962]: E1003 12:50:12.813437 4962 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 12:50:12 crc kubenswrapper[4962]: E1003 12:50:12.813464 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 12:50:13.81344755 +0000 UTC m=+22.217345385 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 12:50:12 crc kubenswrapper[4962]: E1003 12:50:12.813492 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 12:50:13.813471681 +0000 UTC m=+22.217369506 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 12:50:12 crc kubenswrapper[4962]: E1003 12:50:12.813417 4962 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 12:50:12 crc kubenswrapper[4962]: E1003 12:50:12.813510 4962 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 12:50:12 crc kubenswrapper[4962]: E1003 12:50:12.813520 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 12:50:13.813514892 +0000 UTC m=+22.217412727 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 12:50:12 crc kubenswrapper[4962]: E1003 12:50:12.813533 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 12:50:13.813526852 +0000 UTC m=+22.217424687 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.871034 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.882030 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f337dda-821a-494b-98d6-775f0c1beb7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6ff72af1d85ecca71976d9d16b8ca0ee91d035662920a8c58d739e560c12ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8eb3048afe2307009fd3ccfbc314a1b60389c0ba17bad1b898578cb1b151df8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\
\"cri-o://e5f8877bc7bae269cbbb5a1452d42917b7a32af9bc45a303b25be8b1058ef937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://329c763b93db8ac9a4cd93231684840cdfa4c6f9d1318fcfc641f4e86b4fc39e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22239bfd4bc82de5ba41662ff973e8c1e0fa462857cae705d4940afccc550782\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T12:50:06Z\\\",\\\"message\\\":\\\"W1003 12:49:55.901194 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1003 12:49:55.901561 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759495795 cert, and key in /tmp/serving-cert-3691789359/serving-signer.crt, /tmp/serving-cert-3691789359/serving-signer.key\\\\nI1003 12:49:56.093911 1 observer_polling.go:159] Starting file observer\\\\nW1003 12:49:56.097720 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1003 12:49:56.097855 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 12:49:56.098660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3691789359/tls.crt::/tmp/serving-cert-3691789359/tls.key\\\\\\\"\\\\nF1003 12:50:06.505868 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dafe1e9291df9776980832fd7d43a1d5e4caf4cd6824740f9e1f2282a6c4bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.883483 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.888603 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.895835 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.909655 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.921488 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.933282 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.943862 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.961672 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 12:50:12 crc kubenswrapper[4962]: I1003 12:50:12.978235 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.012124 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.023268 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.035262 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.045292 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.057197 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.073588 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6263076-d49e-4519-b516-d50ec2fb9d88\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb41be48a3ebc6ba936c2ec3b2a483349e8d597d0483a618a0d452a2ee13538c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b1520abe4d95d8e776c2f589c8f7ada74c46eef43f1bcaf441272ae75241a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcb83ba4c1a1eb1a63fce31b68210e9af2c5e722097b457b4642b4c32f60c2e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d334e30066203e6856129958923dcb174b5541a9e41e11a4e574ac59ec86c861\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbdc43e3dcb77ff3ee44368fa74c53d2f360a508c04525f15879db5c6b2209b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7
6bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.084394 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f337dda-821a-494b-98d6-775f0c1beb7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6ff72af1d85ecca71976d9d16b8ca0ee91d035662920a8c58d739e560c12ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8eb3048afe2307009fd3ccfbc314a1b60389c0ba17bad1b898578cb1b151df8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8877bc7bae269cbbb5a1452d42917b7a32af9bc45a303b25be8b1058ef937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://329c763b93db8ac9a4cd93231684840cdfa4c6f9d1318fcfc641f4e86b4fc39e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://22239bfd4bc82de5ba41662ff973e8c1e0fa462857cae705d4940afccc550782\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T12:50:06Z\\\",\\\"message\\\":\\\"W1003 12:49:55.901194 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1003 12:49:55.901561 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759495795 cert, and key in /tmp/serving-cert-3691789359/serving-signer.crt, /tmp/serving-cert-3691789359/serving-signer.key\\\\nI1003 12:49:56.093911 1 observer_polling.go:159] Starting file observer\\\\nW1003 12:49:56.097720 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1003 12:49:56.097855 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 12:49:56.098660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3691789359/tls.crt::/tmp/serving-cert-3691789359/tls.key\\\\\\\"\\\\nF1003 12:50:06.505868 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dafe1e9291df9776980832fd7d43a1d5e4caf4cd6824740f9e1f2282a6c4bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPa
th\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.350725 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"31a03527ce02389a7b4733bbd5bb91aeddbb0709348ccc21e6dd83307af19437"} Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.352352 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"3b6c913a7a10a94baddfc4f58df9f5bc15cb9e7dd4b91dd313eca9f1a683259b"} Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.352415 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"31d69e23034588e75e4716b87b31f129f088940d29983f634a3d1b1e15289d5b"} Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.352432 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"13f238cc50114ecb8eb751135834ac2b5f84babcc5a1bd02947554c694ceeb62"} Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.353818 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"b82494e0a2ec6e8ba7a2e713a8faf9e20bbacde0875e31f6db9962ece1a1a60f"} Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.353893 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"79d67ad2368bc47e9f82c1b8c02f17dc86816047757ac8b6ae7858fbab1a158e"} Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.360690 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-pnhsr"] Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.361223 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-pnhsr" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.365886 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.365928 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.365957 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.365959 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.365897 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-g64qv"] Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.366284 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-g64qv" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.370891 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.370973 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.370994 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.382982 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:13Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.407974 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6c913a7a10a94baddfc4f58df9f5bc15cb9e7dd4b91dd313eca9f1a683259b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d69e23034588e75e4716b87b31f129f088940d29983f634a3d1b1e15289d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:13Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.418287 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a46fb9dd-57b2-4300-b9ab-2d40bcc4cd49-hosts-file\") pod \"node-resolver-g64qv\" (UID: \"a46fb9dd-57b2-4300-b9ab-2d40bcc4cd49\") " pod="openshift-dns/node-resolver-g64qv" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.418329 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7rc5\" (UniqueName: \"kubernetes.io/projected/a46fb9dd-57b2-4300-b9ab-2d40bcc4cd49-kube-api-access-b7rc5\") pod \"node-resolver-g64qv\" (UID: \"a46fb9dd-57b2-4300-b9ab-2d40bcc4cd49\") " pod="openshift-dns/node-resolver-g64qv" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.418381 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctqjd\" (UniqueName: \"kubernetes.io/projected/edada57d-7295-4f56-a850-caf58ebe77a9-kube-api-access-ctqjd\") pod \"node-ca-pnhsr\" (UID: \"edada57d-7295-4f56-a850-caf58ebe77a9\") " pod="openshift-image-registry/node-ca-pnhsr" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.418405 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/edada57d-7295-4f56-a850-caf58ebe77a9-host\") pod \"node-ca-pnhsr\" (UID: \"edada57d-7295-4f56-a850-caf58ebe77a9\") " pod="openshift-image-registry/node-ca-pnhsr" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.418585 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/edada57d-7295-4f56-a850-caf58ebe77a9-serviceca\") pod \"node-ca-pnhsr\" (UID: \"edada57d-7295-4f56-a850-caf58ebe77a9\") " pod="openshift-image-registry/node-ca-pnhsr" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.426434 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers 
with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:13Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.464175 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:13Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.487785 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:13Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.518798 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:13Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.519126 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctqjd\" (UniqueName: \"kubernetes.io/projected/edada57d-7295-4f56-a850-caf58ebe77a9-kube-api-access-ctqjd\") pod \"node-ca-pnhsr\" (UID: \"edada57d-7295-4f56-a850-caf58ebe77a9\") " pod="openshift-image-registry/node-ca-pnhsr" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.519164 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/edada57d-7295-4f56-a850-caf58ebe77a9-host\") pod \"node-ca-pnhsr\" (UID: \"edada57d-7295-4f56-a850-caf58ebe77a9\") " pod="openshift-image-registry/node-ca-pnhsr" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.519190 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/edada57d-7295-4f56-a850-caf58ebe77a9-serviceca\") pod \"node-ca-pnhsr\" (UID: \"edada57d-7295-4f56-a850-caf58ebe77a9\") " pod="openshift-image-registry/node-ca-pnhsr" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.519256 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a46fb9dd-57b2-4300-b9ab-2d40bcc4cd49-hosts-file\") pod \"node-resolver-g64qv\" (UID: \"a46fb9dd-57b2-4300-b9ab-2d40bcc4cd49\") " pod="openshift-dns/node-resolver-g64qv" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.519290 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7rc5\" (UniqueName: \"kubernetes.io/projected/a46fb9dd-57b2-4300-b9ab-2d40bcc4cd49-kube-api-access-b7rc5\") pod \"node-resolver-g64qv\" (UID: \"a46fb9dd-57b2-4300-b9ab-2d40bcc4cd49\") " pod="openshift-dns/node-resolver-g64qv" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.519258 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/edada57d-7295-4f56-a850-caf58ebe77a9-host\") pod \"node-ca-pnhsr\" (UID: \"edada57d-7295-4f56-a850-caf58ebe77a9\") " pod="openshift-image-registry/node-ca-pnhsr" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.519348 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/a46fb9dd-57b2-4300-b9ab-2d40bcc4cd49-hosts-file\") pod \"node-resolver-g64qv\" (UID: \"a46fb9dd-57b2-4300-b9ab-2d40bcc4cd49\") " pod="openshift-dns/node-resolver-g64qv" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.521361 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/edada57d-7295-4f56-a850-caf58ebe77a9-serviceca\") pod \"node-ca-pnhsr\" (UID: \"edada57d-7295-4f56-a850-caf58ebe77a9\") " pod="openshift-image-registry/node-ca-pnhsr" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.562802 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6263076-d49e-4519-b516-d50ec2fb9d88\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb41be48a3ebc6ba936c2ec3b2a483349e8d597d0483a618a0d452a2ee13538c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b1520abe4d95d8e776c2f589c8f7ada74c46eef43f1bcaf441272ae75241a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcb83ba4c1a1eb1a63fce31b68210e9af2c5e722097b457b4642b4c32f60c2e6\\\",\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d334e30066203e6856129958923dcb174b5541a9e41e11a4e574ac59ec86c861\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbdc43e3dcb77ff3ee44368fa74c53d2f360a508c04525f15879db5c6b2209b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:13Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.567673 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7rc5\" (UniqueName: \"kubernetes.io/projected/a46fb9dd-57b2-4300-b9ab-2d40bcc4cd49-kube-api-access-b7rc5\") pod \"node-resolver-g64qv\" (UID: \"a46fb9dd-57b2-4300-b9ab-2d40bcc4cd49\") " pod="openshift-dns/node-resolver-g64qv" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.567720 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctqjd\" (UniqueName: \"kubernetes.io/projected/edada57d-7295-4f56-a850-caf58ebe77a9-kube-api-access-ctqjd\") pod \"node-ca-pnhsr\" (UID: \"edada57d-7295-4f56-a850-caf58ebe77a9\") " pod="openshift-image-registry/node-ca-pnhsr" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.576296 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f337dda-821a-494b-98d6-775f0c1beb7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6ff72af1d85ecca71976d9d16b8ca0ee91d035662920a8c58d739e560c12ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8eb3048afe2307009fd3ccfbc314a1b60389c0ba17bad1b898578cb1b151df8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8877bc7bae269cbbb5a1452d42917b7a32af9bc45a303b25be8b1058ef937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://329c763b93db8ac9a4cd93231684840cdfa4c6f9d1318fcfc641f4e86b4fc39e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22239bfd4bc82de5ba41662ff973e8c1e0fa462857cae705d4940afccc550782\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T12:50:06Z\\\",\\\"message\\\":\\\"W1003 12:49:55.901194 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1003 12:49:55.901561 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759495795 cert, and key in /tmp/serving-cert-3691789359/serving-signer.crt, /tmp/serving-cert-3691789359/serving-signer.key\\\\nI1003 12:49:56.093911 1 observer_polling.go:159] Starting file observer\\\\nW1003 12:49:56.097720 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1003 12:49:56.097855 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 12:49:56.098660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3691789359/tls.crt::/tmp/serving-cert-3691789359/tls.key\\\\\\\"\\\\nF1003 12:50:06.505868 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dafe1e9291df9776980832fd7d43a1d5e4caf4cd6824740f9e1f2282a6c4bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:13Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.588854 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:13Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.603786 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6c913a7a10a94baddfc4f58df9f5bc15cb9e7dd4b91dd313eca9f1a683259b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d69e23034588e75e4716b87b31f129f088940d29983f634a3d1b1e15289d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:13Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.617554 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:13Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.634804 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:13Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.647198 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pnhsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edada57d-7295-4f56-a850-caf58ebe77a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctqjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pnhsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:13Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.662076 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:13Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.674191 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g64qv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a46fb9dd-57b2-4300-b9ab-2d40bcc4cd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7rc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g64qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:13Z is 
after 2025-08-24T17:21:41Z" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.676047 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-pnhsr" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.683923 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-g64qv" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.690742 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82494e0a2ec6e8ba7a2e713a8faf9e20bbacde0875e31f6db9962ece1a1a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:13Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:13 crc kubenswrapper[4962]: W1003 12:50:13.701699 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda46fb9dd_57b2_4300_b9ab_2d40bcc4cd49.slice/crio-13f1c749752b545f173ff26b48459378bcfbc5b71df9cd9b0cab4f755fd5baa2 WatchSource:0}: Error finding container 13f1c749752b545f173ff26b48459378bcfbc5b71df9cd9b0cab4f755fd5baa2: Status 404 returned error can't find the container with id 13f1c749752b545f173ff26b48459378bcfbc5b71df9cd9b0cab4f755fd5baa2 Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.722738 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:50:13 crc kubenswrapper[4962]: E1003 12:50:13.722843 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:50:15.722815036 +0000 UTC m=+24.126712911 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.722700 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6263076-d49e-4519-b516-d50ec2fb9d88\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb41be48a3ebc6ba936c2ec3b2a483349e8d597d0483a618a0d452a2ee13538c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b1520abe4d95d8e776c2f589c8f7ada74c46eef43f1bcaf441272ae75241a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\"
:true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcb83ba4c1a1eb1a63fce31b68210e9af2c5e722097b457b4642b4c32f60c2e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d334e30066203e6856129958923dcb174b5541a9e41e11a4e574ac59ec86c861\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbdc43e3dcb77ff3ee44368fa74c53d2f360a508c04525f15879db5c6b2209b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-
o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:13Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.738112 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f337dda-821a-494b-98d6-775f0c1beb7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6ff72af1d85ecca71976d9d16b8ca0ee91d035662920a8c58d739e560c12ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8eb3048afe2307009fd3ccfbc314a1b60389c0ba17bad1b898578cb1b151df8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8877bc7bae269cbbb5a1452d42917b7a32af9bc45a303b25be8b1058ef937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://329c763b93db8ac9a4cd93231684840cdfa4c6f9d1318fcfc641f4e86b4fc39e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://22239bfd4bc82de5ba41662ff973e8c1e0fa462857cae705d4940afccc550782\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T12:50:06Z\\\",\\\"message\\\":\\\"W1003 12:49:55.901194 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1003 12:49:55.901561 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759495795 cert, and key in /tmp/serving-cert-3691789359/serving-signer.crt, /tmp/serving-cert-3691789359/serving-signer.key\\\\nI1003 12:49:56.093911 1 observer_polling.go:159] Starting file observer\\\\nW1003 12:49:56.097720 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1003 12:49:56.097855 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 12:49:56.098660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3691789359/tls.crt::/tmp/serving-cert-3691789359/tls.key\\\\\\\"\\\\nF1003 12:50:06.505868 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dafe1e9291df9776980832fd7d43a1d5e4caf4cd6824740f9e1f2282a6c4bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPa
th\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:13Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.823407 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.823457 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.823494 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:50:13 crc kubenswrapper[4962]: I1003 12:50:13.823515 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:50:13 crc kubenswrapper[4962]: E1003 12:50:13.823592 4962 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 12:50:13 crc kubenswrapper[4962]: E1003 12:50:13.823667 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 12:50:15.823649392 +0000 UTC m=+24.227547227 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 12:50:13 crc kubenswrapper[4962]: E1003 12:50:13.824053 4962 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 12:50:13 crc kubenswrapper[4962]: E1003 12:50:13.824100 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 12:50:15.824088915 +0000 UTC m=+24.227986750 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 12:50:13 crc kubenswrapper[4962]: E1003 12:50:13.824168 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 12:50:13 crc kubenswrapper[4962]: E1003 12:50:13.824186 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 12:50:13 crc kubenswrapper[4962]: E1003 12:50:13.824199 4962 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 12:50:13 crc kubenswrapper[4962]: E1003 12:50:13.824229 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 12:50:15.824219859 +0000 UTC m=+24.228117694 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 12:50:13 crc kubenswrapper[4962]: E1003 12:50:13.824283 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 12:50:13 crc kubenswrapper[4962]: E1003 12:50:13.824296 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 12:50:13 crc kubenswrapper[4962]: E1003 12:50:13.824306 4962 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 12:50:13 crc kubenswrapper[4962]: E1003 12:50:13.824334 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 12:50:15.824324962 +0000 UTC m=+24.228222797 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.226221 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.226245 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.226250 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:50:14 crc kubenswrapper[4962]: E1003 12:50:14.226335 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 12:50:14 crc kubenswrapper[4962]: E1003 12:50:14.226439 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 12:50:14 crc kubenswrapper[4962]: E1003 12:50:14.226476 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.229875 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.230420 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.231331 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.231952 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.232508 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.233041 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.233694 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.234230 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.234921 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.235470 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.236028 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.236710 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.237227 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.237711 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.238324 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.241577 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.242257 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.242618 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.243592 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.245443 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.245949 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.250556 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.257377 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.258126 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.260150 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.260770 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" 
path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.261873 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.262327 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.266807 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.267306 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.267768 4962 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.268045 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.270313 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.271121 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.271944 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.273318 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.273924 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.274800 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.275408 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.276486 4962 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-machine-config-operator/machine-config-daemon-46vck"] Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.276756 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-44fmz"] Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.276995 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-46vck" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.277454 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-sdd6t"] Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.277591 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-44fmz" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.277676 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ksp7d"] Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.277893 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.278561 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.280019 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.280072 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.281198 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.281249 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.281255 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.281210 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.281358 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.281678 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.281846 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.281947 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.282047 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.282074 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 03 12:50:14 crc 
kubenswrapper[4962]: I1003 12:50:14.282054 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.282194 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.282222 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.282234 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.282222 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.282344 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.283279 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.293423 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82494e0a2ec6e8ba7a2e713a8faf9e20bbacde0875e31f6db9962ece1a1a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-03T12:50:14Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.305481 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6c913a7a10a94baddfc4f58df9f5bc15cb9e7dd4b91dd313eca9f1a683259b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d69e23034588e75e4716b87b31f129f088940d29983f634a3d1b1e15289d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:14Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.319066 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:14Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.326792 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7-cni-binary-copy\") pod \"multus-additional-cni-plugins-44fmz\" (UID: \"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\") " pod="openshift-multus/multus-additional-cni-plugins-44fmz" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.326838 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fbc64268-3e78-44a2-8116-b62b5c13f005-host-run-netns\") pod \"multus-sdd6t\" (UID: \"fbc64268-3e78-44a2-8116-b62b5c13f005\") " pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.326861 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fbc64268-3e78-44a2-8116-b62b5c13f005-host-var-lib-kubelet\") pod \"multus-sdd6t\" (UID: \"fbc64268-3e78-44a2-8116-b62b5c13f005\") " pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc 
kubenswrapper[4962]: I1003 12:50:14.326882 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-systemd-units\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.326934 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fbc64268-3e78-44a2-8116-b62b5c13f005-multus-cni-dir\") pod \"multus-sdd6t\" (UID: \"fbc64268-3e78-44a2-8116-b62b5c13f005\") " pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.327171 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/fbc64268-3e78-44a2-8116-b62b5c13f005-host-run-k8s-cni-cncf-io\") pod \"multus-sdd6t\" (UID: \"fbc64268-3e78-44a2-8116-b62b5c13f005\") " pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.327192 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fbc64268-3e78-44a2-8116-b62b5c13f005-hostroot\") pod \"multus-sdd6t\" (UID: \"fbc64268-3e78-44a2-8116-b62b5c13f005\") " pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.327294 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-host-run-netns\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.327327 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/90186d9d-0ac4-4959-9fd8-b044098dc6ae-ovnkube-script-lib\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.327350 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fbc64268-3e78-44a2-8116-b62b5c13f005-host-run-multus-certs\") pod \"multus-sdd6t\" (UID: \"fbc64268-3e78-44a2-8116-b62b5c13f005\") " pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.327372 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-log-socket\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.327413 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-host-cni-bin\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 
12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.327432 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7-system-cni-dir\") pod \"multus-additional-cni-plugins-44fmz\" (UID: \"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\") " pod="openshift-multus/multus-additional-cni-plugins-44fmz" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.327500 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7-os-release\") pod \"multus-additional-cni-plugins-44fmz\" (UID: \"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\") " pod="openshift-multus/multus-additional-cni-plugins-44fmz" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.327522 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7-cnibin\") pod \"multus-additional-cni-plugins-44fmz\" (UID: \"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\") " pod="openshift-multus/multus-additional-cni-plugins-44fmz" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.327572 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-44fmz\" (UID: \"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\") " pod="openshift-multus/multus-additional-cni-plugins-44fmz" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.327594 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-var-lib-openvswitch\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.327653 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-host-run-ovn-kubernetes\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.327679 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fbc64268-3e78-44a2-8116-b62b5c13f005-cnibin\") pod \"multus-sdd6t\" (UID: \"fbc64268-3e78-44a2-8116-b62b5c13f005\") " pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.327733 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fbc64268-3e78-44a2-8116-b62b5c13f005-multus-daemon-config\") pod \"multus-sdd6t\" (UID: \"fbc64268-3e78-44a2-8116-b62b5c13f005\") " pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.327755 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdwhf\" (UniqueName: 
\"kubernetes.io/projected/90186d9d-0ac4-4959-9fd8-b044098dc6ae-kube-api-access-qdwhf\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.327811 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxhzn\" (UniqueName: \"kubernetes.io/projected/c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7-kube-api-access-wxhzn\") pod \"multus-additional-cni-plugins-44fmz\" (UID: \"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\") " pod="openshift-multus/multus-additional-cni-plugins-44fmz" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.327835 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-run-ovn\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.327881 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/90186d9d-0ac4-4959-9fd8-b044098dc6ae-env-overrides\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.327913 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e40a27aa-682e-4b25-a198-8054ba9f2477-rootfs\") pod \"machine-config-daemon-46vck\" (UID: \"e40a27aa-682e-4b25-a198-8054ba9f2477\") " pod="openshift-machine-config-operator/machine-config-daemon-46vck" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.327981 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e40a27aa-682e-4b25-a198-8054ba9f2477-proxy-tls\") pod \"machine-config-daemon-46vck\" (UID: \"e40a27aa-682e-4b25-a198-8054ba9f2477\") " pod="openshift-machine-config-operator/machine-config-daemon-46vck" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.328005 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktptf\" (UniqueName: \"kubernetes.io/projected/fbc64268-3e78-44a2-8116-b62b5c13f005-kube-api-access-ktptf\") pod \"multus-sdd6t\" (UID: \"fbc64268-3e78-44a2-8116-b62b5c13f005\") " pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.328067 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-host-slash\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.328094 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-run-openvswitch\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.328143 4962 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-etc-openvswitch\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.328211 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fbc64268-3e78-44a2-8116-b62b5c13f005-etc-kubernetes\") pod \"multus-sdd6t\" (UID: \"fbc64268-3e78-44a2-8116-b62b5c13f005\") " pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.328236 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-node-log\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.328260 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.328315 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdbr9\" (UniqueName: \"kubernetes.io/projected/e40a27aa-682e-4b25-a198-8054ba9f2477-kube-api-access-jdbr9\") pod \"machine-config-daemon-46vck\" (UID: \"e40a27aa-682e-4b25-a198-8054ba9f2477\") " pod="openshift-machine-config-operator/machine-config-daemon-46vck" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.328351 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-44fmz\" (UID: \"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\") " pod="openshift-multus/multus-additional-cni-plugins-44fmz" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.328375 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fbc64268-3e78-44a2-8116-b62b5c13f005-multus-conf-dir\") pod \"multus-sdd6t\" (UID: \"fbc64268-3e78-44a2-8116-b62b5c13f005\") " pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.328425 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fbc64268-3e78-44a2-8116-b62b5c13f005-os-release\") pod \"multus-sdd6t\" (UID: \"fbc64268-3e78-44a2-8116-b62b5c13f005\") " pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.328447 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-run-systemd\") pod \"ovnkube-node-ksp7d\" (UID: 
\"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.328472 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fbc64268-3e78-44a2-8116-b62b5c13f005-cni-binary-copy\") pod \"multus-sdd6t\" (UID: \"fbc64268-3e78-44a2-8116-b62b5c13f005\") " pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.328491 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/90186d9d-0ac4-4959-9fd8-b044098dc6ae-ovnkube-config\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.328513 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fbc64268-3e78-44a2-8116-b62b5c13f005-host-var-lib-cni-bin\") pod \"multus-sdd6t\" (UID: \"fbc64268-3e78-44a2-8116-b62b5c13f005\") " pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.328534 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fbc64268-3e78-44a2-8116-b62b5c13f005-host-var-lib-cni-multus\") pod \"multus-sdd6t\" (UID: \"fbc64268-3e78-44a2-8116-b62b5c13f005\") " pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.328553 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-host-cni-netd\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.328573 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/90186d9d-0ac4-4959-9fd8-b044098dc6ae-ovn-node-metrics-cert\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.328623 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e40a27aa-682e-4b25-a198-8054ba9f2477-mcd-auth-proxy-config\") pod \"machine-config-daemon-46vck\" (UID: \"e40a27aa-682e-4b25-a198-8054ba9f2477\") " pod="openshift-machine-config-operator/machine-config-daemon-46vck" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.328675 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fbc64268-3e78-44a2-8116-b62b5c13f005-system-cni-dir\") pod \"multus-sdd6t\" (UID: \"fbc64268-3e78-44a2-8116-b62b5c13f005\") " pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.328699 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/fbc64268-3e78-44a2-8116-b62b5c13f005-multus-socket-dir-parent\") pod \"multus-sdd6t\" (UID: \"fbc64268-3e78-44a2-8116-b62b5c13f005\") " pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.328720 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-host-kubelet\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.334212 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:14Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.346862 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:14Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.357219 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-g64qv" event={"ID":"a46fb9dd-57b2-4300-b9ab-2d40bcc4cd49","Type":"ContainerStarted","Data":"91103ceb967093cdbedc1aa6b2d68dd2c1cc96a1902cd0bb8f380a3d42e54497"} Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.357279 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-g64qv" event={"ID":"a46fb9dd-57b2-4300-b9ab-2d40bcc4cd49","Type":"ContainerStarted","Data":"13f1c749752b545f173ff26b48459378bcfbc5b71df9cd9b0cab4f755fd5baa2"} Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.358593 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pnhsr" event={"ID":"edada57d-7295-4f56-a850-caf58ebe77a9","Type":"ContainerStarted","Data":"249f1a223717c0f2030da538233d9f7b621c4b2e304dcb65316b1aef7741f8bd"} Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.358626 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pnhsr" event={"ID":"edada57d-7295-4f56-a850-caf58ebe77a9","Type":"ContainerStarted","Data":"7ea751829a6452f45967e539d076ef2506a5a4c4edc12258dac6bfbc6c5a7e27"} Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.364565 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6263076-d49e-4519-b516-d50ec2fb9d88\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb41be48a3ebc6ba936c2ec3b2a483349e8d597d0483a618a0d452a2ee13538c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b1520abe4d95d8e776c2f589c8f7ada74c46eef43f1bcaf441272ae75241a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcb83ba4c1a1eb1a63fce31b68210e9af2c5e722097b457b4642b4c32f60c2e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d334e30066203e6856129958923dcb174b5541a
9e41e11a4e574ac59ec86c861\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbdc43e3dcb77ff3ee44368fa74c53d2f360a508c04525f15879db5c6b2209b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:14Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.377715 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f337dda-821a-494b-98d6-775f0c1beb7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6ff72af1d85ecca71976d9d16b8ca0ee91d035662920a8c58d739e560c12ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8eb3048afe2307009fd3ccfbc314a1b60389c0ba17bad1b898578cb1b151df8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8877bc7bae269cbbb5a1452d42917b7a32af9bc45a303b25be8b1058ef937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://329c763b93db8ac9a4cd93231684840cdfa4c6f9d1318fcfc641f4e86b4fc39e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22239bfd4bc82de5ba41662ff973e8c1e0fa462857cae705d4940afccc550782\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T12:50:06Z\\\",\\\"message\\\":\\\"W1003 12:49:55.901194 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1003 
12:49:55.901561 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759495795 cert, and key in /tmp/serving-cert-3691789359/serving-signer.crt, /tmp/serving-cert-3691789359/serving-signer.key\\\\nI1003 12:49:56.093911 1 observer_polling.go:159] Starting file observer\\\\nW1003 12:49:56.097720 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1003 12:49:56.097855 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 12:49:56.098660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3691789359/tls.crt::/tmp/serving-cert-3691789359/tls.key\\\\\\\"\\\\nF1003 12:50:06.505868 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dafe1e9291df9776980832fd7d43a1d5e4caf4cd6824740f9e1f2282a6c4bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:14Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.387758 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pnhsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edada57d-7295-4f56-a850-caf58ebe77a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctqjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pnhsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:14Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.397592 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g64qv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a46fb9dd-57b2-4300-b9ab-2d40bcc4cd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7rc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g64qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:14Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.407703 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e40a27aa-682e-4b25-a198-8054ba9f2477\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-46vck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:14Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.420297 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:14Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.430089 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fbc64268-3e78-44a2-8116-b62b5c13f005-host-run-multus-certs\") pod \"multus-sdd6t\" (UID: \"fbc64268-3e78-44a2-8116-b62b5c13f005\") " pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.430137 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-log-socket\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.430163 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-host-cni-bin\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.430185 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7-os-release\") pod \"multus-additional-cni-plugins-44fmz\" (UID: \"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\") " pod="openshift-multus/multus-additional-cni-plugins-44fmz" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.430247 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7-system-cni-dir\") pod \"multus-additional-cni-plugins-44fmz\" (UID: \"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\") " pod="openshift-multus/multus-additional-cni-plugins-44fmz" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.430268 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" 
(UniqueName: \"kubernetes.io/host-path/c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7-cnibin\") pod \"multus-additional-cni-plugins-44fmz\" (UID: \"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\") " pod="openshift-multus/multus-additional-cni-plugins-44fmz" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.430293 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fbc64268-3e78-44a2-8116-b62b5c13f005-host-run-multus-certs\") pod \"multus-sdd6t\" (UID: \"fbc64268-3e78-44a2-8116-b62b5c13f005\") " pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.430317 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-44fmz\" (UID: \"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\") " pod="openshift-multus/multus-additional-cni-plugins-44fmz" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.430349 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-var-lib-openvswitch\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.430358 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-host-cni-bin\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.430370 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-host-run-ovn-kubernetes\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.430407 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fbc64268-3e78-44a2-8116-b62b5c13f005-cnibin\") pod \"multus-sdd6t\" (UID: \"fbc64268-3e78-44a2-8116-b62b5c13f005\") " pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.430412 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-host-run-ovn-kubernetes\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.430426 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fbc64268-3e78-44a2-8116-b62b5c13f005-multus-daemon-config\") pod \"multus-sdd6t\" (UID: \"fbc64268-3e78-44a2-8116-b62b5c13f005\") " pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.430443 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdwhf\" (UniqueName: 
\"kubernetes.io/projected/90186d9d-0ac4-4959-9fd8-b044098dc6ae-kube-api-access-qdwhf\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.430449 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7-system-cni-dir\") pod \"multus-additional-cni-plugins-44fmz\" (UID: \"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\") " pod="openshift-multus/multus-additional-cni-plugins-44fmz" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.430461 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxhzn\" (UniqueName: \"kubernetes.io/projected/c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7-kube-api-access-wxhzn\") pod \"multus-additional-cni-plugins-44fmz\" (UID: \"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\") " pod="openshift-multus/multus-additional-cni-plugins-44fmz" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.430471 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7-cnibin\") pod \"multus-additional-cni-plugins-44fmz\" (UID: \"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\") " pod="openshift-multus/multus-additional-cni-plugins-44fmz" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.430477 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-run-ovn\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.430498 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/90186d9d-0ac4-4959-9fd8-b044098dc6ae-env-overrides\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.430302 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-log-socket\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.430515 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e40a27aa-682e-4b25-a198-8054ba9f2477-rootfs\") pod \"machine-config-daemon-46vck\" (UID: \"e40a27aa-682e-4b25-a198-8054ba9f2477\") " pod="openshift-machine-config-operator/machine-config-daemon-46vck" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.430529 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e40a27aa-682e-4b25-a198-8054ba9f2477-proxy-tls\") pod \"machine-config-daemon-46vck\" (UID: \"e40a27aa-682e-4b25-a198-8054ba9f2477\") " pod="openshift-machine-config-operator/machine-config-daemon-46vck" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.430544 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktptf\" (UniqueName: 
\"kubernetes.io/projected/fbc64268-3e78-44a2-8116-b62b5c13f005-kube-api-access-ktptf\") pod \"multus-sdd6t\" (UID: \"fbc64268-3e78-44a2-8116-b62b5c13f005\") " pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.430558 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-host-slash\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.430558 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-var-lib-openvswitch\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.430592 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-run-openvswitch\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.430611 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-run-ovn\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.430618 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7-os-release\") pod \"multus-additional-cni-plugins-44fmz\" (UID: \"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\") " pod="openshift-multus/multus-additional-cni-plugins-44fmz" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.430633 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-etc-openvswitch\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.430713 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fbc64268-3e78-44a2-8116-b62b5c13f005-cnibin\") pod \"multus-sdd6t\" (UID: \"fbc64268-3e78-44a2-8116-b62b5c13f005\") " pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.430732 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fbc64268-3e78-44a2-8116-b62b5c13f005-etc-kubernetes\") pod \"multus-sdd6t\" (UID: \"fbc64268-3e78-44a2-8116-b62b5c13f005\") " pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.430747 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-node-log\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.430769 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.430788 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-44fmz\" (UID: \"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\") " pod="openshift-multus/multus-additional-cni-plugins-44fmz" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.430803 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fbc64268-3e78-44a2-8116-b62b5c13f005-multus-conf-dir\") pod \"multus-sdd6t\" (UID: \"fbc64268-3e78-44a2-8116-b62b5c13f005\") " pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.430819 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdbr9\" (UniqueName: \"kubernetes.io/projected/e40a27aa-682e-4b25-a198-8054ba9f2477-kube-api-access-jdbr9\") pod \"machine-config-daemon-46vck\" (UID: \"e40a27aa-682e-4b25-a198-8054ba9f2477\") " pod="openshift-machine-config-operator/machine-config-daemon-46vck" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.430880 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fbc64268-3e78-44a2-8116-b62b5c13f005-os-release\") pod \"multus-sdd6t\" (UID: \"fbc64268-3e78-44a2-8116-b62b5c13f005\") " pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.430895 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-run-systemd\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.430928 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fbc64268-3e78-44a2-8116-b62b5c13f005-cni-binary-copy\") pod \"multus-sdd6t\" (UID: \"fbc64268-3e78-44a2-8116-b62b5c13f005\") " pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.430945 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/90186d9d-0ac4-4959-9fd8-b044098dc6ae-ovnkube-config\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.430958 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fbc64268-3e78-44a2-8116-b62b5c13f005-etc-kubernetes\") pod \"multus-sdd6t\" (UID: \"fbc64268-3e78-44a2-8116-b62b5c13f005\") " pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 
crc kubenswrapper[4962]: I1003 12:50:14.430963 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fbc64268-3e78-44a2-8116-b62b5c13f005-host-var-lib-cni-bin\") pod \"multus-sdd6t\" (UID: \"fbc64268-3e78-44a2-8116-b62b5c13f005\") " pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.430977 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fbc64268-3e78-44a2-8116-b62b5c13f005-host-var-lib-cni-bin\") pod \"multus-sdd6t\" (UID: \"fbc64268-3e78-44a2-8116-b62b5c13f005\") " pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.430990 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-node-log\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.430996 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fbc64268-3e78-44a2-8116-b62b5c13f005-host-var-lib-cni-multus\") pod \"multus-sdd6t\" (UID: \"fbc64268-3e78-44a2-8116-b62b5c13f005\") " pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.431005 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.431016 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-host-cni-netd\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.431036 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/90186d9d-0ac4-4959-9fd8-b044098dc6ae-ovn-node-metrics-cert\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.431073 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-44fmz\" (UID: \"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\") " pod="openshift-multus/multus-additional-cni-plugins-44fmz" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.431086 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e40a27aa-682e-4b25-a198-8054ba9f2477-mcd-auth-proxy-config\") pod \"machine-config-daemon-46vck\" (UID: \"e40a27aa-682e-4b25-a198-8054ba9f2477\") " pod="openshift-machine-config-operator/machine-config-daemon-46vck" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 
12:50:14.431109 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fbc64268-3e78-44a2-8116-b62b5c13f005-system-cni-dir\") pod \"multus-sdd6t\" (UID: \"fbc64268-3e78-44a2-8116-b62b5c13f005\") " pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.431129 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fbc64268-3e78-44a2-8116-b62b5c13f005-multus-socket-dir-parent\") pod \"multus-sdd6t\" (UID: \"fbc64268-3e78-44a2-8116-b62b5c13f005\") " pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.431301 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fbc64268-3e78-44a2-8116-b62b5c13f005-multus-conf-dir\") pod \"multus-sdd6t\" (UID: \"fbc64268-3e78-44a2-8116-b62b5c13f005\") " pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.431357 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fbc64268-3e78-44a2-8116-b62b5c13f005-os-release\") pod \"multus-sdd6t\" (UID: \"fbc64268-3e78-44a2-8116-b62b5c13f005\") " pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.431384 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-run-systemd\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.431572 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-44fmz\" (UID: \"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\") " pod="openshift-multus/multus-additional-cni-plugins-44fmz" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.431734 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fbc64268-3e78-44a2-8116-b62b5c13f005-cni-binary-copy\") pod \"multus-sdd6t\" (UID: \"fbc64268-3e78-44a2-8116-b62b5c13f005\") " pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.431763 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-host-kubelet\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.431796 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7-cni-binary-copy\") pod \"multus-additional-cni-plugins-44fmz\" (UID: \"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\") " pod="openshift-multus/multus-additional-cni-plugins-44fmz" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.431811 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/fbc64268-3e78-44a2-8116-b62b5c13f005-host-run-netns\") pod \"multus-sdd6t\" (UID: \"fbc64268-3e78-44a2-8116-b62b5c13f005\") " pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.431867 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fbc64268-3e78-44a2-8116-b62b5c13f005-host-var-lib-kubelet\") pod \"multus-sdd6t\" (UID: \"fbc64268-3e78-44a2-8116-b62b5c13f005\") " pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.431891 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-systemd-units\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.431926 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fbc64268-3e78-44a2-8116-b62b5c13f005-multus-cni-dir\") pod \"multus-sdd6t\" (UID: \"fbc64268-3e78-44a2-8116-b62b5c13f005\") " pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.432019 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e40a27aa-682e-4b25-a198-8054ba9f2477-rootfs\") pod \"machine-config-daemon-46vck\" (UID: \"e40a27aa-682e-4b25-a198-8054ba9f2477\") " pod="openshift-machine-config-operator/machine-config-daemon-46vck" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.432033 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fbc64268-3e78-44a2-8116-b62b5c13f005-multus-socket-dir-parent\") pod \"multus-sdd6t\" (UID: \"fbc64268-3e78-44a2-8116-b62b5c13f005\") " pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.432075 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-host-cni-netd\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.432072 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/90186d9d-0ac4-4959-9fd8-b044098dc6ae-ovnkube-config\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.432105 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-etc-openvswitch\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.432106 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fbc64268-3e78-44a2-8116-b62b5c13f005-host-run-netns\") pod \"multus-sdd6t\" (UID: \"fbc64268-3e78-44a2-8116-b62b5c13f005\") " 
pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.432143 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-run-openvswitch\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.432204 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fbc64268-3e78-44a2-8116-b62b5c13f005-host-var-lib-kubelet\") pod \"multus-sdd6t\" (UID: \"fbc64268-3e78-44a2-8116-b62b5c13f005\") " pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.432233 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-systemd-units\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.432782 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e40a27aa-682e-4b25-a198-8054ba9f2477-mcd-auth-proxy-config\") pod \"machine-config-daemon-46vck\" (UID: \"e40a27aa-682e-4b25-a198-8054ba9f2477\") " pod="openshift-machine-config-operator/machine-config-daemon-46vck" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.432791 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fbc64268-3e78-44a2-8116-b62b5c13f005-multus-cni-dir\") pod \"multus-sdd6t\" (UID: \"fbc64268-3e78-44a2-8116-b62b5c13f005\") " pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.432793 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/90186d9d-0ac4-4959-9fd8-b044098dc6ae-env-overrides\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.432935 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fbc64268-3e78-44a2-8116-b62b5c13f005-multus-daemon-config\") pod \"multus-sdd6t\" (UID: \"fbc64268-3e78-44a2-8116-b62b5c13f005\") " pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.432974 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fbc64268-3e78-44a2-8116-b62b5c13f005-system-cni-dir\") pod \"multus-sdd6t\" (UID: \"fbc64268-3e78-44a2-8116-b62b5c13f005\") " pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.432994 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-host-kubelet\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.433020 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/fbc64268-3e78-44a2-8116-b62b5c13f005-host-run-k8s-cni-cncf-io\") pod \"multus-sdd6t\" (UID: \"fbc64268-3e78-44a2-8116-b62b5c13f005\") " pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.433036 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fbc64268-3e78-44a2-8116-b62b5c13f005-hostroot\") pod \"multus-sdd6t\" (UID: \"fbc64268-3e78-44a2-8116-b62b5c13f005\") " pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.433050 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-host-run-netns\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.433064 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/90186d9d-0ac4-4959-9fd8-b044098dc6ae-ovnkube-script-lib\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.433888 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/fbc64268-3e78-44a2-8116-b62b5c13f005-host-run-k8s-cni-cncf-io\") pod \"multus-sdd6t\" (UID: \"fbc64268-3e78-44a2-8116-b62b5c13f005\") " pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.433917 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fbc64268-3e78-44a2-8116-b62b5c13f005-hostroot\") pod \"multus-sdd6t\" (UID: \"fbc64268-3e78-44a2-8116-b62b5c13f005\") " pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.433939 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-host-run-netns\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.434320 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7-cni-binary-copy\") pod \"multus-additional-cni-plugins-44fmz\" (UID: \"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\") " pod="openshift-multus/multus-additional-cni-plugins-44fmz" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.434362 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-host-slash\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.434387 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fbc64268-3e78-44a2-8116-b62b5c13f005-host-var-lib-cni-multus\") pod \"multus-sdd6t\" (UID: 
\"fbc64268-3e78-44a2-8116-b62b5c13f005\") " pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.434428 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/90186d9d-0ac4-4959-9fd8-b044098dc6ae-ovnkube-script-lib\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.434781 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:14Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.436039 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/90186d9d-0ac4-4959-9fd8-b044098dc6ae-ovn-node-metrics-cert\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.444955 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e40a27aa-682e-4b25-a198-8054ba9f2477-proxy-tls\") pod \"machine-config-daemon-46vck\" (UID: \"e40a27aa-682e-4b25-a198-8054ba9f2477\") " pod="openshift-machine-config-operator/machine-config-daemon-46vck" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.449852 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktptf\" (UniqueName: \"kubernetes.io/projected/fbc64268-3e78-44a2-8116-b62b5c13f005-kube-api-access-ktptf\") pod \"multus-sdd6t\" (UID: \"fbc64268-3e78-44a2-8116-b62b5c13f005\") " pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.452528 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdwhf\" (UniqueName: \"kubernetes.io/projected/90186d9d-0ac4-4959-9fd8-b044098dc6ae-kube-api-access-qdwhf\") pod \"ovnkube-node-ksp7d\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.456180 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pnhsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edada57d-7295-4f56-a850-caf58ebe77a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f1a223717c0f2030da538233d9f7b621c4b2e304dcb65316b1aef7741f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctqjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pnhsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:14Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.457038 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdbr9\" (UniqueName: \"kubernetes.io/projected/e40a27aa-682e-4b25-a198-8054ba9f2477-kube-api-access-jdbr9\") pod \"machine-config-daemon-46vck\" (UID: \"e40a27aa-682e-4b25-a198-8054ba9f2477\") " pod="openshift-machine-config-operator/machine-config-daemon-46vck" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.461766 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxhzn\" (UniqueName: \"kubernetes.io/projected/c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7-kube-api-access-wxhzn\") pod \"multus-additional-cni-plugins-44fmz\" (UID: \"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\") " pod="openshift-multus/multus-additional-cni-plugins-44fmz" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.466014 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g64qv" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a46fb9dd-57b2-4300-b9ab-2d40bcc4cd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91103ceb967093cdbedc1aa6b2d68dd2c1cc96a1902cd0bb8f380a3d42e54497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7rc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g64qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:14Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.476657 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e40a27aa-682e-4b25-a198-8054ba9f2477\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-46vck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:14Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.494392 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90186d9d-0ac4-4959-9fd8-b044098dc6ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts
\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host
-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ksp7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-03T12:50:14Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.508413 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82494e0a2ec6e8ba7a2e713a8faf9e20bbacde0875e31f6db9962ece1a1a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:14Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.523680 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sdd6t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbc64268-3e78-44a2-8116-b62b5c13f005\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktptf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sdd6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:14Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.535224 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:14Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.548291 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6c913a7a10a94baddfc4f58df9f5bc15cb9e7dd4b91dd313eca9f1a683259b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d69e23034588e75e4716b87b31f129f088940d29983f634a3d1b1e15289d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:14Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.558305 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:14Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.568053 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:14Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.586351 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6263076-d49e-4519-b516-d50ec2fb9d88\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb41be48a3ebc6ba936c2ec3b2a483349e8d597d0483a618a0d452a2ee13538c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\
\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b1520abe4d95d8e776c2f589c8f7ada74c46eef43f1bcaf441272ae75241a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcb83ba4c1a1eb1a63fce31b68210e9af2c5e722097b457b4642b4c32f60c2e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d334e30066203e6856129958923dcb174b5541a9e41e11a4e574ac59ec86c861\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbdc43e3dcb77ff3ee44368fa74c53d2f360a508c04525f15879db5c6b2209b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cr
i-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:14Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.588430 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-46vck" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.596816 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-44fmz" Oct 03 12:50:14 crc kubenswrapper[4962]: W1003 12:50:14.597842 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode40a27aa_682e_4b25_a198_8054ba9f2477.slice/crio-da9440b012c31f339ce16fbd689b3e98736ff0f5287cbe7d595d7e55a26fc025 WatchSource:0}: Error finding container da9440b012c31f339ce16fbd689b3e98736ff0f5287cbe7d595d7e55a26fc025: Status 404 returned error can't find the container with id da9440b012c31f339ce16fbd689b3e98736ff0f5287cbe7d595d7e55a26fc025 Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.603002 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-sdd6t" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.605865 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f337dda-821a-494b-98d6-775f0c1beb7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6ff72af1d85ecca71976d9d16b8ca0ee91d035662920a8c58d739e560c12ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8eb3048afe2307009fd3ccfbc314a1b60389c0ba17bad1b898578cb1b151df8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8877bc7bae269cbbb5a1452d42917b7a32af9bc45a303b25be8b1058ef937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://329c763b93db8ac9a4cd93231684840cdfa4c6f9d1318fcfc641f4e86b4fc39e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22239bfd4bc82de5ba41662ff973e8c1e0fa462857cae705d4940afccc550782\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T12:50:06Z\\\",\\\"message\\\":\\\"W1003 12:49:55.901194 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1003 
12:49:55.901561 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759495795 cert, and key in /tmp/serving-cert-3691789359/serving-signer.crt, /tmp/serving-cert-3691789359/serving-signer.key\\\\nI1003 12:49:56.093911 1 observer_polling.go:159] Starting file observer\\\\nW1003 12:49:56.097720 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1003 12:49:56.097855 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 12:49:56.098660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3691789359/tls.crt::/tmp/serving-cert-3691789359/tls.key\\\\\\\"\\\\nF1003 12:50:06.505868 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dafe1e9291df9776980832fd7d43a1d5e4caf4cd6824740f9e1f2282a6c4bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:14Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:14 crc kubenswrapper[4962]: W1003 12:50:14.607609 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1a0c0a2_c2cf_4e0a_b82c_5f56c2fccec7.slice/crio-3a62a6fbf251b09e3d68341226f86666d83be88630a7f6945fed5e4726cfd0cd WatchSource:0}: Error finding container 3a62a6fbf251b09e3d68341226f86666d83be88630a7f6945fed5e4726cfd0cd: Status 404 returned error can't find the container with id 3a62a6fbf251b09e3d68341226f86666d83be88630a7f6945fed5e4726cfd0cd Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.610296 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:14 crc kubenswrapper[4962]: I1003 12:50:14.621912 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-44fmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-44fmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:14Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:14 crc kubenswrapper[4962]: W1003 12:50:14.626476 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbc64268_3e78_44a2_8116_b62b5c13f005.slice/crio-19025914c23b2e9cbe71d4df3f5f7a8381c4eb135068935000b1bf3d5030af9e WatchSource:0}: Error finding container 19025914c23b2e9cbe71d4df3f5f7a8381c4eb135068935000b1bf3d5030af9e: Status 404 returned error can't find 
the container with id 19025914c23b2e9cbe71d4df3f5f7a8381c4eb135068935000b1bf3d5030af9e Oct 03 12:50:14 crc kubenswrapper[4962]: W1003 12:50:14.640312 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90186d9d_0ac4_4959_9fd8_b044098dc6ae.slice/crio-9730df0aa7f1039b437ee08c7070e4944ec76d68d0870e8e1bf954355236280d WatchSource:0}: Error finding container 9730df0aa7f1039b437ee08c7070e4944ec76d68d0870e8e1bf954355236280d: Status 404 returned error can't find the container with id 9730df0aa7f1039b437ee08c7070e4944ec76d68d0870e8e1bf954355236280d Oct 03 12:50:15 crc kubenswrapper[4962]: I1003 12:50:15.364078 4962 generic.go:334] "Generic (PLEG): container finished" podID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerID="857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2" exitCode=0 Oct 03 12:50:15 crc kubenswrapper[4962]: I1003 12:50:15.364139 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" event={"ID":"90186d9d-0ac4-4959-9fd8-b044098dc6ae","Type":"ContainerDied","Data":"857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2"} Oct 03 12:50:15 crc kubenswrapper[4962]: I1003 12:50:15.364440 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" event={"ID":"90186d9d-0ac4-4959-9fd8-b044098dc6ae","Type":"ContainerStarted","Data":"9730df0aa7f1039b437ee08c7070e4944ec76d68d0870e8e1bf954355236280d"} Oct 03 12:50:15 crc kubenswrapper[4962]: I1003 12:50:15.366348 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerStarted","Data":"aa233ac457a03e63e2662227f70bb88c59e3ebecf41454f217278fb34ab8dc9e"} Oct 03 12:50:15 crc kubenswrapper[4962]: I1003 12:50:15.366412 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerStarted","Data":"1c066a5dc6399e5df82f7425020e84a5354658a5b77b1014a74126539aa1c738"} Oct 03 12:50:15 crc kubenswrapper[4962]: I1003 12:50:15.366426 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerStarted","Data":"da9440b012c31f339ce16fbd689b3e98736ff0f5287cbe7d595d7e55a26fc025"} Oct 03 12:50:15 crc kubenswrapper[4962]: I1003 12:50:15.367607 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"bb8d669985493d735cad4d4a09f36401be9ede84ee7e20e82293e7d5331c085e"} Oct 03 12:50:15 crc kubenswrapper[4962]: I1003 12:50:15.369675 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sdd6t" event={"ID":"fbc64268-3e78-44a2-8116-b62b5c13f005","Type":"ContainerStarted","Data":"087c5eaff8dd2b9ccb73982ae2022d139077f3a93f750a8d7f7fe8bb4a8f643e"} Oct 03 12:50:15 crc kubenswrapper[4962]: I1003 12:50:15.369711 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sdd6t" event={"ID":"fbc64268-3e78-44a2-8116-b62b5c13f005","Type":"ContainerStarted","Data":"19025914c23b2e9cbe71d4df3f5f7a8381c4eb135068935000b1bf3d5030af9e"} Oct 03 12:50:15 crc kubenswrapper[4962]: I1003 12:50:15.372193 4962 generic.go:334] 
"Generic (PLEG): container finished" podID="c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7" containerID="16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4" exitCode=0 Oct 03 12:50:15 crc kubenswrapper[4962]: I1003 12:50:15.372230 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-44fmz" event={"ID":"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7","Type":"ContainerDied","Data":"16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4"} Oct 03 12:50:15 crc kubenswrapper[4962]: I1003 12:50:15.372250 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-44fmz" event={"ID":"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7","Type":"ContainerStarted","Data":"3a62a6fbf251b09e3d68341226f86666d83be88630a7f6945fed5e4726cfd0cd"} Oct 03 12:50:15 crc kubenswrapper[4962]: I1003 12:50:15.380410 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sdd6t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbc64268-3e78-44a2-8116-b62b5c13f005\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktptf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sdd6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:15Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:15 crc kubenswrapper[4962]: I1003 12:50:15.398153 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82494e0a2ec6e8ba7a2e713a8faf9e20bbacde0875e31f6db9962ece1a1a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:15Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:15 crc kubenswrapper[4962]: I1003 12:50:15.415292 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:15Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:15 crc kubenswrapper[4962]: I1003 12:50:15.428149 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6c913a7a10a94baddfc4f58df9f5bc15cb9e7dd4b91dd313eca9f1a683259b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d69e23034588e75e4716b87b31f129f088940d29983f634a3d1b1e15289d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{
\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:15Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:15 crc kubenswrapper[4962]: I1003 12:50:15.442934 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:15Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:15 crc kubenswrapper[4962]: I1003 12:50:15.459499 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:15Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:15 crc kubenswrapper[4962]: I1003 12:50:15.481355 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f337dda-821a-494b-98d6-775f0c1beb7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6ff72af1d85ecca71976d9d16b8ca0ee91d035662920a8c58d739e560c12ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8eb3048afe2307009fd3ccfbc314a1b60389c0ba17bad1b898578cb1b151df8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8877bc7bae269cbbb5a1452d42917b7a32af9bc45a303b25be8b1058ef937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://329c763b93db8ac9a4cd93231684840cdfa4c6f9d1318fcfc641f4e86b4fc39e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22239bfd4bc82de5ba41662ff973e8c1e0fa462857cae705d4940afccc550782\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T12:50:06Z\\\",\\\"message\\\":\\\"W1003 12:49:55.901194 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1003 
12:49:55.901561 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759495795 cert, and key in /tmp/serving-cert-3691789359/serving-signer.crt, /tmp/serving-cert-3691789359/serving-signer.key\\\\nI1003 12:49:56.093911 1 observer_polling.go:159] Starting file observer\\\\nW1003 12:49:56.097720 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1003 12:49:56.097855 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 12:49:56.098660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3691789359/tls.crt::/tmp/serving-cert-3691789359/tls.key\\\\\\\"\\\\nF1003 12:50:06.505868 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dafe1e9291df9776980832fd7d43a1d5e4caf4cd6824740f9e1f2282a6c4bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:15Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:15 crc kubenswrapper[4962]: I1003 12:50:15.509621 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-44fmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-44fmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:15Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:15 crc kubenswrapper[4962]: I1003 12:50:15.545086 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6263076-d49e-4519-b516-d50ec2fb9d88\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb41be48a3ebc6ba936c2ec3b2a483349e8d597d0483a618a0d452a2ee13538c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b1520abe4d95d8e776c2f589c8f7ada74c46eef43f1bcaf441272ae75241a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcb83ba4c1a1eb1a63fce31b68210e9af2c5e722097b457b4642b4c32f60c2e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d334e30066203e6856129958923dcb174b5541a
9e41e11a4e574ac59ec86c861\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbdc43e3dcb77ff3ee44368fa74c53d2f360a508c04525f15879db5c6b2209b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:15Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:15 crc kubenswrapper[4962]: I1003 12:50:15.563352 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:15Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:15 crc kubenswrapper[4962]: I1003 12:50:15.572863 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pnhsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edada57d-7295-4f56-a850-caf58ebe77a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f1a223717c0f2030da538233d9f7b621c4b2e304dcb65316b1aef7741f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctqjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pnhsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:15Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:15 crc kubenswrapper[4962]: I1003 12:50:15.583570 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g64qv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a46fb9dd-57b2-4300-b9ab-2d40bcc4cd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91103ceb967093cdbedc1aa6b2d68dd2c1cc96a1902cd0bb8f380a3d42e54497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7rc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g64qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:15Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:15 crc kubenswrapper[4962]: I1003 12:50:15.595153 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e40a27aa-682e-4b25-a198-8054ba9f2477\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-46vck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:15Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:15 crc kubenswrapper[4962]: I1003 12:50:15.611806 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90186d9d-0ac4-4959-9fd8-b044098dc6ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ksp7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:15Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:15 crc kubenswrapper[4962]: I1003 12:50:15.624535 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:15Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:15 crc kubenswrapper[4962]: I1003 12:50:15.636598 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pnhsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edada57d-7295-4f56-a850-caf58ebe77a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f1a223717c0f2030da538233d9f7b621c4b2e304dcb65316b1aef7741f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctqjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pnhsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:15Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:15 crc kubenswrapper[4962]: I1003 12:50:15.655517 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g64qv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a46fb9dd-57b2-4300-b9ab-2d40bcc4cd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91103ceb967093cdbedc1aa6b2d68dd2c1cc96a1902cd0bb8f380a3d42e54497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7rc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g64qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:15Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:15 crc kubenswrapper[4962]: I1003 12:50:15.666870 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e40a27aa-682e-4b25-a198-8054ba9f2477\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa233ac457a03e63e2662227f70bb88c59e3ebecf41454f217278fb34ab8dc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c066a5dc6399e5df82f7425020e84a5354658a5b77b1014a74126539aa1c738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-46vck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:15Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:15 crc kubenswrapper[4962]: I1003 12:50:15.686114 4962 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90186d9d-0ac4-4959-9fd8-b044098dc6ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",
\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ksp7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:15Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:15 crc kubenswrapper[4962]: I1003 12:50:15.701727 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sdd6t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbc64268-3e78-44a2-8116-b62b5c13f005\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087c5eaff8dd2b9ccb73982ae2022d139077f3a93f750a8d7f7fe8bb4a8f643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":
\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktptf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sdd6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:15Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:15 crc kubenswrapper[4962]: I1003 12:50:15.716860 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82494e0a2ec6e8ba7a2e713a8faf9e20bbacde0875e31f6db9962ece1a1a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:15Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:15 crc kubenswrapper[4962]: I1003 12:50:15.729622 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:15Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:15 crc kubenswrapper[4962]: I1003 12:50:15.743609 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6c913a7a10a94baddfc4f58df9f5bc15cb9e7dd4b91dd313eca9f1a683259b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d69e23034588e75e4716b87b31f129f088940d29983f634a3d1b1e15289d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{
\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:15Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:15 crc kubenswrapper[4962]: I1003 12:50:15.746267 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:50:15 crc kubenswrapper[4962]: E1003 12:50:15.746548 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:50:19.746524125 +0000 UTC m=+28.150421960 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:50:15 crc kubenswrapper[4962]: I1003 12:50:15.754715 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8d669985493d735cad4d4a09f36401be9ede84ee7e20e82293e7d5331c085e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:15Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:15 crc kubenswrapper[4962]: I1003 12:50:15.765997 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:15Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:15 crc kubenswrapper[4962]: I1003 12:50:15.778088 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f337dda-821a-494b-98d6-775f0c1beb7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6ff72af1d85ecca71976d9d16b8ca0ee91d035662920a8c58d739e560c12ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8eb3048afe2307009fd3ccfbc314a1b60389c0ba17bad1b898578cb1b151df8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8877bc7bae269cbbb5a1452d42917b7a32af9bc45a303b25be8b1058ef937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://329c763b93db8ac9a4cd93231684840cdfa4c6f9d1318fcfc641f4e86b4fc39e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22239bfd4bc82de5ba41662ff973e8c1e0fa462857cae705d4940afccc550782\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T12:50:06Z\\\",\\\"message\\\":\\\"W1003 12:49:55.901194 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1003 
12:49:55.901561 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759495795 cert, and key in /tmp/serving-cert-3691789359/serving-signer.crt, /tmp/serving-cert-3691789359/serving-signer.key\\\\nI1003 12:49:56.093911 1 observer_polling.go:159] Starting file observer\\\\nW1003 12:49:56.097720 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1003 12:49:56.097855 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 12:49:56.098660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3691789359/tls.crt::/tmp/serving-cert-3691789359/tls.key\\\\\\\"\\\\nF1003 12:50:06.505868 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dafe1e9291df9776980832fd7d43a1d5e4caf4cd6824740f9e1f2282a6c4bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:15Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:15 crc kubenswrapper[4962]: I1003 12:50:15.791051 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-44fmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-44fmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:15Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:15 crc 
kubenswrapper[4962]: I1003 12:50:15.808695 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6263076-d49e-4519-b516-d50ec2fb9d88\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb41be48a3ebc6ba936c2ec3b2a483349e8d597d0483a618a0d452a2ee13538c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b1520abe4d95d8e776c2f589c8f7ada74c46eef43f1bcaf441272ae75241a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcb83ba4c1a1eb1a63fce31b68210e9af2c5e722097b457b4642b4c32f60c2e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d334e30066203e6856129958923dcb174b5541a9e41e11a4e574ac59ec86c861\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbdc43e3dcb77ff3ee44368fa74c53d2f360a508c04525f15879db5c6b2209b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:15Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:15 crc kubenswrapper[4962]: I1003 12:50:15.847341 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:50:15 crc kubenswrapper[4962]: I1003 12:50:15.847388 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:50:15 crc kubenswrapper[4962]: I1003 12:50:15.847407 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:50:15 crc kubenswrapper[4962]: I1003 12:50:15.847432 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:50:15 crc kubenswrapper[4962]: E1003 12:50:15.847520 4962 secret.go:188] Couldn't get secret 
openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 12:50:15 crc kubenswrapper[4962]: E1003 12:50:15.847542 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 12:50:15 crc kubenswrapper[4962]: E1003 12:50:15.847557 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 12:50:15 crc kubenswrapper[4962]: E1003 12:50:15.847568 4962 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 12:50:15 crc kubenswrapper[4962]: E1003 12:50:15.847598 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 12:50:19.847570337 +0000 UTC m=+28.251468172 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 12:50:15 crc kubenswrapper[4962]: E1003 12:50:15.847612 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 12:50:19.847606068 +0000 UTC m=+28.251503903 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 12:50:15 crc kubenswrapper[4962]: E1003 12:50:15.847630 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 12:50:15 crc kubenswrapper[4962]: E1003 12:50:15.847661 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 12:50:15 crc kubenswrapper[4962]: E1003 12:50:15.847667 4962 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 12:50:15 crc kubenswrapper[4962]: E1003 12:50:15.847696 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 12:50:19.84768275 +0000 UTC m=+28.251580575 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 12:50:15 crc kubenswrapper[4962]: E1003 12:50:15.847731 4962 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 12:50:15 crc kubenswrapper[4962]: E1003 12:50:15.847750 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 12:50:19.847744112 +0000 UTC m=+28.251641937 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 12:50:16 crc kubenswrapper[4962]: I1003 12:50:16.226692 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:50:16 crc kubenswrapper[4962]: E1003 12:50:16.227118 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 12:50:16 crc kubenswrapper[4962]: I1003 12:50:16.227184 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:50:16 crc kubenswrapper[4962]: E1003 12:50:16.227282 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 12:50:16 crc kubenswrapper[4962]: I1003 12:50:16.227338 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:50:16 crc kubenswrapper[4962]: E1003 12:50:16.227381 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 12:50:16 crc kubenswrapper[4962]: I1003 12:50:16.391980 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" event={"ID":"90186d9d-0ac4-4959-9fd8-b044098dc6ae","Type":"ContainerStarted","Data":"2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47"} Oct 03 12:50:16 crc kubenswrapper[4962]: I1003 12:50:16.392039 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" event={"ID":"90186d9d-0ac4-4959-9fd8-b044098dc6ae","Type":"ContainerStarted","Data":"0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56"} Oct 03 12:50:16 crc kubenswrapper[4962]: I1003 12:50:16.392052 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" event={"ID":"90186d9d-0ac4-4959-9fd8-b044098dc6ae","Type":"ContainerStarted","Data":"fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3"} Oct 03 12:50:16 crc kubenswrapper[4962]: I1003 12:50:16.392065 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" event={"ID":"90186d9d-0ac4-4959-9fd8-b044098dc6ae","Type":"ContainerStarted","Data":"31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c"} Oct 03 12:50:16 crc kubenswrapper[4962]: I1003 12:50:16.392080 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" event={"ID":"90186d9d-0ac4-4959-9fd8-b044098dc6ae","Type":"ContainerStarted","Data":"f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1"} Oct 03 12:50:16 crc kubenswrapper[4962]: I1003 12:50:16.392091 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" event={"ID":"90186d9d-0ac4-4959-9fd8-b044098dc6ae","Type":"ContainerStarted","Data":"5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866"} Oct 03 12:50:16 crc kubenswrapper[4962]: I1003 12:50:16.395014 4962 generic.go:334] "Generic (PLEG): container finished" 
podID="c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7" containerID="f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4" exitCode=0 Oct 03 12:50:16 crc kubenswrapper[4962]: I1003 12:50:16.395278 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-44fmz" event={"ID":"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7","Type":"ContainerDied","Data":"f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4"} Oct 03 12:50:16 crc kubenswrapper[4962]: I1003 12:50:16.408805 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:16Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:16 crc kubenswrapper[4962]: I1003 12:50:16.420327 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pnhsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edada57d-7295-4f56-a850-caf58ebe77a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f1a223717c0f2030da538233d9f7b621c4b2e304dcb65316b1aef7741f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctqjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pnhsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:16Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:16 crc kubenswrapper[4962]: I1003 12:50:16.430810 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g64qv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a46fb9dd-57b2-4300-b9ab-2d40bcc4cd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91103ceb967093cdbedc1aa6b2d68dd2c1cc96a1902cd0bb8f380a3d42e54497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7rc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g64qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:16Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:16 crc kubenswrapper[4962]: I1003 12:50:16.443531 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e40a27aa-682e-4b25-a198-8054ba9f2477\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa233ac457a03e63e2662227f70bb88c59e3ebecf41454f217278fb34ab8dc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c066a5dc6399e5df82f7425020e84a5354658a5b77b1014a74126539aa1c738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-46vck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:16Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:16 crc kubenswrapper[4962]: I1003 12:50:16.465954 4962 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90186d9d-0ac4-4959-9fd8-b044098dc6ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",
\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ksp7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:16Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:16 crc kubenswrapper[4962]: I1003 12:50:16.484514 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sdd6t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbc64268-3e78-44a2-8116-b62b5c13f005\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087c5eaff8dd2b9ccb73982ae2022d139077f3a93f750a8d7f7fe8bb4a8f643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":
\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktptf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sdd6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:16Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:16 crc kubenswrapper[4962]: I1003 12:50:16.501089 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82494e0a2ec6e8ba7a2e713a8faf9e20bbacde0875e31f6db9962ece1a1a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:16Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:16 crc kubenswrapper[4962]: I1003 12:50:16.512234 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:16Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:16 crc kubenswrapper[4962]: I1003 12:50:16.523201 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6c913a7a10a94baddfc4f58df9f5bc15cb9e7dd4b91dd313eca9f1a683259b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d69e23034588e75e4716b87b31f129f088940d29983f634a3d1b1e15289d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{
\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:16Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:16 crc kubenswrapper[4962]: I1003 12:50:16.533408 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8d669985493d735cad4d4a09f36401be9ede84ee7e20e82293e7d5331c085e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:16Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:16 crc kubenswrapper[4962]: I1003 12:50:16.544163 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:16Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:16 crc kubenswrapper[4962]: I1003 12:50:16.561692 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f337dda-821a-494b-98d6-775f0c1beb7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6ff72af1d85ecca71976d9d16b8ca0ee91d035662920a8c58d739e560c12ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8eb3048afe2307009fd3ccfbc314a1b60389c0ba17bad1b898578cb1b151df8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8877bc7bae269cbbb5a1452d42917b7a32af9bc45a303b25be8b1058ef937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://329c763b93db8ac9a4cd93231684840cdfa4c6f9d1318fcfc641f4e86b4fc39e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22239bfd4bc82de5ba41662ff973e8c1e0fa462857cae705d4940afccc550782\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T12:50:06Z\\\",\\\"message\\\":\\\"W1003 12:49:55.901194 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1003 
12:49:55.901561 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759495795 cert, and key in /tmp/serving-cert-3691789359/serving-signer.crt, /tmp/serving-cert-3691789359/serving-signer.key\\\\nI1003 12:49:56.093911 1 observer_polling.go:159] Starting file observer\\\\nW1003 12:49:56.097720 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1003 12:49:56.097855 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 12:49:56.098660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3691789359/tls.crt::/tmp/serving-cert-3691789359/tls.key\\\\\\\"\\\\nF1003 12:50:06.505868 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dafe1e9291df9776980832fd7d43a1d5e4caf4cd6824740f9e1f2282a6c4bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:16Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:16 crc kubenswrapper[4962]: I1003 12:50:16.575421 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-44fmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-
03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-44fmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:16Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:16 crc kubenswrapper[4962]: I1003 12:50:16.595421 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6263076-d49e-4519-b516-d50ec2fb9d88\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb41be48a3ebc6ba936c2ec3b2a483349e8d597d0483a618a0d452a2ee13538c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b1520abe4d95d8e776c2f589c8f7ada74c46eef43f1bcaf441272ae75241a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcb83ba4c1a1eb1a63fce31b68210e9af2c5e722097b457b4642b4c32f60c2e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageI
D\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d334e30066203e6856129958923dcb174b5541a9e41e11a4e574ac59ec86c861\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbdc43e3dcb77ff3ee44368fa74c53d2f360a508c04525f15879db5c6b2209b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:16Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:17 crc kubenswrapper[4962]: I1003 12:50:17.234947 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 12:50:17 crc kubenswrapper[4962]: I1003 12:50:17.238788 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 12:50:17 crc kubenswrapper[4962]: I1003 12:50:17.243036 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 03 12:50:17 crc kubenswrapper[4962]: I1003 12:50:17.246808 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:17Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:17 crc kubenswrapper[4962]: I1003 12:50:17.257378 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:17Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:17 crc kubenswrapper[4962]: I1003 12:50:17.267445 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6c913a7a10a94baddfc4f58df9f5bc15cb9e7dd4b91dd313eca9f1a683259b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d69e23034588e75e4716b87b31f129f088940d29983f634a3d1b1e15289d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:17Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:17 crc kubenswrapper[4962]: I1003 12:50:17.277733 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8d669985493d735cad4d4a09f36401be9ede84ee7e20e82293e7d5331c085e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:17Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:17 crc kubenswrapper[4962]: I1003 12:50:17.297214 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6263076-d49e-4519-b516-d50ec2fb9d88\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb41be48a3ebc6ba936c2ec3b2a483349e8d597d0483a618a0d452a2ee13538c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b1520abe4d95d8e776c2f589c8f7ada74c46eef43f1bcaf441272ae75241a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcb83ba4c1a1eb1a63fce31b68210e9af2c5e722097b457b4642b4c32f60c2e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d334e30066203e6856129958923dcb174b5541a
9e41e11a4e574ac59ec86c861\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbdc43e3dcb77ff3ee44368fa74c53d2f360a508c04525f15879db5c6b2209b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:17Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:17 crc kubenswrapper[4962]: I1003 12:50:17.311547 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f337dda-821a-494b-98d6-775f0c1beb7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6ff72af1d85ecca71976d9d16b8ca0ee91d035662920a8c58d739e560c12ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8eb3048afe2307009fd3ccfbc314a1b60389c0ba17bad1b898578cb1b151df8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8877bc7bae269cbbb5a1452d42917b7a32af9bc45a303b25be8b1058ef937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://329c763b93db8ac9a4cd93231684840cdfa4c6f9d1318fcfc641f4e86b4fc39e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22239bfd4bc82de5ba41662ff973e8c1e0fa462857cae705d4940afccc550782\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T12:50:06Z\\\",\\\"message\\\":\\\"W1003 12:49:55.901194 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1003 
12:49:55.901561 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759495795 cert, and key in /tmp/serving-cert-3691789359/serving-signer.crt, /tmp/serving-cert-3691789359/serving-signer.key\\\\nI1003 12:49:56.093911 1 observer_polling.go:159] Starting file observer\\\\nW1003 12:49:56.097720 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1003 12:49:56.097855 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 12:49:56.098660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3691789359/tls.crt::/tmp/serving-cert-3691789359/tls.key\\\\\\\"\\\\nF1003 12:50:06.505868 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dafe1e9291df9776980832fd7d43a1d5e4caf4cd6824740f9e1f2282a6c4bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:17Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:17 crc kubenswrapper[4962]: I1003 12:50:17.335361 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-44fmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-
03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-44fmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:17Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:17 crc kubenswrapper[4962]: I1003 12:50:17.358193 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e40a27aa-682e-4b25-a198-8054ba9f2477\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa233ac457a03e63e2662227f70bb88c59e3ebecf41454f217278fb34ab8dc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c066a5dc6399e5df82f7425020e84a5354658a5b77b1014a74126539aa1c738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\"
:\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-46vck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:17Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:17 crc kubenswrapper[4962]: I1003 12:50:17.380492 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90186d9d-0ac4-4959-9fd8-b044098dc6ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ksp7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:17Z 
is after 2025-08-24T17:21:41Z" Oct 03 12:50:17 crc kubenswrapper[4962]: I1003 12:50:17.398826 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:17Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:17 crc kubenswrapper[4962]: I1003 12:50:17.398939 4962 generic.go:334] "Generic (PLEG): container finished" podID="c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7" containerID="2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189" exitCode=0 Oct 03 12:50:17 crc kubenswrapper[4962]: I1003 12:50:17.398973 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-44fmz" event={"ID":"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7","Type":"ContainerDied","Data":"2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189"} Oct 03 12:50:17 crc kubenswrapper[4962]: I1003 12:50:17.409698 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pnhsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edada57d-7295-4f56-a850-caf58ebe77a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f1a223717c0f2030da538233d9f7b621c4b2e304dcb65316b1aef7741f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctqjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pnhsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:17Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:17 crc kubenswrapper[4962]: I1003 12:50:17.424517 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g64qv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a46fb9dd-57b2-4300-b9ab-2d40bcc4cd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91103ceb967093cdbedc1aa6b2d68dd2c1cc96a1902cd0bb8f380a3d42e54497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7rc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g64qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:17Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:17 crc kubenswrapper[4962]: I1003 12:50:17.442205 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82494e0a2ec6e8ba7a2e713a8faf9e20bbacde0875e31f6db9962ece1a1a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:17Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:17 crc kubenswrapper[4962]: I1003 12:50:17.455305 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sdd6t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbc64268-3e78-44a2-8116-b62b5c13f005\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087c5eaff8dd2b9ccb73982ae2022d139077f3a93f750a8d7f7fe8bb4a8f643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktptf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sdd6t\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:17Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:17 crc kubenswrapper[4962]: I1003 12:50:17.474136 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:17Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:17 crc kubenswrapper[4962]: I1003 12:50:17.486600 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6c913a7a10a94baddfc4f58df9f5bc15cb9e7dd4b91dd313eca9f1a683259b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d69e23034588e75e4716b87b31f129f088940d29983f634a3d1b1e15289d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:17Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:17 crc kubenswrapper[4962]: I1003 12:50:17.497105 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8d669985493d735cad4d4a09f36401be9ede84ee7e20e82293e7d5331c085e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:17Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:17 crc kubenswrapper[4962]: I1003 12:50:17.508285 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:17Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:17 crc kubenswrapper[4962]: I1003 12:50:17.518720 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53165c5e-33f0-44c9-b4fd-1da1c2b99a4e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86846072db8a357f62428197a22e8b8f829b013c9292a53091819ceb6eb7aeb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f016344ea9d266374fcbfbe80061313994a45b595ee3432a95b1bbd332b320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7249baacfeb01d1f25f6d9ce291033941226c4be4cba0376a47872b8b10f6ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6e876d0d7a29e1ceb505283a70c9b8fd0b4eee1dec1a2f04f3eeec4b09f4c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:17Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:17 crc kubenswrapper[4962]: I1003 12:50:17.530189 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f337dda-821a-494b-98d6-775f0c1beb7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6ff72af1d85ecca71976d9d16b8ca0ee91d035662920a8c58d739e560c12ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8eb3048afe2307009fd3ccfbc314a1b60389c0ba17bad1b898578cb1b151df8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8877bc7bae269cbbb5a1452d42917b7a32af9bc45a303b25be8b1058ef937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://329c763b93db8ac9a4cd93231684840cdfa4c6f9d1318fcfc641f4e86b4fc39e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://22239bfd4bc82de5ba41662ff973e8c1e0fa462857cae705d4940afccc550782\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T12:50:06Z\\\",\\\"message\\\":\\\"W1003 12:49:55.901194 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1003 12:49:55.901561 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759495795 cert, and key in /tmp/serving-cert-3691789359/serving-signer.crt, /tmp/serving-cert-3691789359/serving-signer.key\\\\nI1003 12:49:56.093911 1 observer_polling.go:159] Starting file observer\\\\nW1003 12:49:56.097720 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1003 12:49:56.097855 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 12:49:56.098660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3691789359/tls.crt::/tmp/serving-cert-3691789359/tls.key\\\\\\\"\\\\nF1003 12:50:06.505868 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dafe1e9291df9776980832fd7d43a1d5e4caf4cd6824740f9e1f2282a6c4bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPa
th\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:17Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:17 crc kubenswrapper[4962]: I1003 12:50:17.543686 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-44fmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-44fmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:17Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:17 crc kubenswrapper[4962]: I1003 12:50:17.562372 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6263076-d49e-4519-b516-d50ec2fb9d88\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb41be48a3ebc6ba936c2ec3b2a483349e8d597d0483a618a0d452a2ee13538c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b1520abe4d95d8e776c2f589c8f7ada74c46eef43f1bcaf441272ae75241a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resourc
e-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcb83ba4c1a1eb1a63fce31b68210e9af2c5e722097b457b4642b4c32f60c2e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d334e30066203e6856129958923dcb174b5541a9e41e11a4e574ac59ec86c861\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbdc43e3dcb77ff3ee44368fa74c53d2f360a508c04525f15879db5c6b2209b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
25-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:17Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:17 crc kubenswrapper[4962]: I1003 12:50:17.574304 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:17Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:17 crc kubenswrapper[4962]: I1003 12:50:17.587552 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pnhsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edada57d-7295-4f56-a850-caf58ebe77a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f1a223717c0f2030da538233d9f7b621c4b2e304dcb65316b1aef7741f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctqjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\
\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pnhsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:17Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:17 crc kubenswrapper[4962]: I1003 12:50:17.597170 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g64qv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a46fb9dd-57b2-4300-b9ab-2d40bcc4cd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91103ceb967093cdbedc1aa6b2d68dd2c1cc96a1902cd0bb8f380a3d42e54497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7rc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g64qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:17Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:17 crc kubenswrapper[4962]: I1003 12:50:17.607980 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e40a27aa-682e-4b25-a198-8054ba9f2477\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa233ac457a03e63e2662227f70bb88c59e3ebecf41454f217278fb34ab8dc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c066a5dc6399e5df82f7425020e84a5354658a5b77b1014a74126539aa1c738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-46vck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:17Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:17 crc kubenswrapper[4962]: I1003 12:50:17.627460 4962 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90186d9d-0ac4-4959-9fd8-b044098dc6ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",
\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ksp7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:17Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:17 crc kubenswrapper[4962]: I1003 12:50:17.637829 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sdd6t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbc64268-3e78-44a2-8116-b62b5c13f005\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087c5eaff8dd2b9ccb73982ae2022d139077f3a93f750a8d7f7fe8bb4a8f643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":
\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktptf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sdd6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:17Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:17 crc kubenswrapper[4962]: I1003 12:50:17.649188 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82494e0a2ec6e8ba7a2e713a8faf9e20bbacde0875e31f6db9962ece1a1a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:17Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.226934 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.227042 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:50:18 crc kubenswrapper[4962]: E1003 12:50:18.227185 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 12:50:18 crc kubenswrapper[4962]: E1003 12:50:18.227848 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.227872 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:50:18 crc kubenswrapper[4962]: E1003 12:50:18.228017 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.404106 4962 generic.go:334] "Generic (PLEG): container finished" podID="c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7" containerID="d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5" exitCode=0 Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.404157 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-44fmz" event={"ID":"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7","Type":"ContainerDied","Data":"d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5"} Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.429310 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90186d9d-0ac4-4959-9fd8-b044098dc6ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ksp7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:18Z 
is after 2025-08-24T17:21:41Z" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.444281 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:18Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.453771 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pnhsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edada57d-7295-4f56-a850-caf58ebe77a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f1a223717c0f2030da538233d9f7b621c4b2e304dcb65316b1aef7741f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctqjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pnhsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:18Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.463648 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g64qv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a46fb9dd-57b2-4300-b9ab-2d40bcc4cd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91103ceb967093cdbedc1aa6b2d68dd2c1cc96a1902cd0bb8f380a3d42e54497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7rc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g64qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:18Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.474793 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e40a27aa-682e-4b25-a198-8054ba9f2477\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa233ac457a03e63e2662227f70bb88c59e3ebecf41454f217278fb34ab8dc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c066a5dc6399e5df82f7425020e84a5354658a5b77b1014a74126539aa1c738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-46vck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:18Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.486086 4962 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82494e0a2ec6e8ba7a2e713a8faf9e20bbacde0875e31f6db9962ece1a1a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:18Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.495690 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.497015 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sdd6t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbc64268-3e78-44a2-8116-b62b5c13f005\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087c5eaff8dd2b9ccb73982ae2022d139077f3a93f750a8d7f7fe8bb4a8f643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktptf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sdd6t\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:18Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.497795 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.497822 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.497832 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.497977 4962 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.504233 4962 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.504441 4962 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.507308 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.507340 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.507348 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.507363 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.507373 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:18Z","lastTransitionTime":"2025-10-03T12:50:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.508452 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53165c5e-33f0-44c9-b4fd-1da1c2b99a4e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86846072db8a357f62428197a22e8b8f829b013c9292a53091819ceb6eb7aeb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f016344ea9d266374fcbfbe80061313994a45b595ee3432a95b1bbd332b320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7249baacfeb01d1f25f6d9ce291033941226c4be4cba0376a47872b8b10f6ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6e876d0d7a29e1ceb505283a70c9b8fd0b4eee1dec1a2f04f3eeec4b09f4c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:18Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.520648 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:18Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:18 crc kubenswrapper[4962]: E1003 12:50:18.522625 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f831723-ac1f-49ed-8733-e30832d406d9\\\",\\\"systemUUID\\\":\\\"16e8121c-ac81-46e8-9c72-10e496aaa780\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:18Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.527771 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.527806 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.527815 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.527828 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.527837 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:18Z","lastTransitionTime":"2025-10-03T12:50:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.534044 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6c913a7a10a94baddfc4f58df9f5bc15cb9e7dd4b91dd313eca9f1a683259b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d69e23034588e75e4716b87b31f129f088940d29983f634a3d1b1e15289d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:18Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:18 crc kubenswrapper[4962]: E1003 12:50:18.539782 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f831723-ac1f-49ed-8733-e30832d406d9\\\",\\\"systemUUID\\\":\\\"16e8121c-ac81-46e8-9c72-10e496aaa780\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:18Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.542728 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.542754 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.542762 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.542775 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.542784 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:18Z","lastTransitionTime":"2025-10-03T12:50:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.545245 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8d669985493d735cad4d4a09f36401be9ede84ee7e20e82293e7d5331c085e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:18Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:18 crc kubenswrapper[4962]: E1003 12:50:18.556346 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f831723-ac1f-49ed-8733-e30832d406d9\\\",\\\"systemUUID\\\":\\\"16e8121c-ac81-46e8-9c72-10e496aaa780\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:18Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.556713 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:18Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.560269 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.560355 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.560371 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.560389 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.560405 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:18Z","lastTransitionTime":"2025-10-03T12:50:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.576820 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6263076-d49e-4519-b516-d50ec2fb9d88\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb41be48a3ebc6ba936c2ec3b2a483349e8d597d0483a618a0d452a2ee13538c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b1520abe4d95d8e776c2f589c8f7ada74c46eef43f1bcaf441272ae75241a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcb83ba4c1a1eb1a63fce31b68210e9af2c5e722097b457b4642b4c32f60c2e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d334e30066203e6856129958923dcb174b5541a9e41e11a4e574ac59ec86c861\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbdc43e3dcb77ff3ee44368fa74c53d2f360a508c04525f15879db5c6b2209b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:18Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:18 crc kubenswrapper[4962]: E1003 12:50:18.577760 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f831723-ac1f-49ed-8733-e30832d406d9\\\",\\\"systemUUID\\\":\\\"16e8121c-ac81-46e8-9c72-10e496aaa780\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:18Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.583566 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.583602 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.583614 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.583630 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.583651 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:18Z","lastTransitionTime":"2025-10-03T12:50:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.590257 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f337dda-821a-494b-98d6-775f0c1beb7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6ff72af1d85ecca71976d9d16b8ca0ee91d035662920a8c58d739e560c12ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8eb3048afe2307009fd3ccfbc314a1b60389c0ba17bad1b898578cb1b151df8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8877bc7bae269cbbb5a1452d42917b7a32af9bc45a303b25be8b1058ef937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://329c763b93db8ac9a4cd93231684840cdfa4c6f9d1318fcfc641f4e86b4fc39e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22239bfd4bc82de5ba41662ff973e8c1e0fa462857cae705d4940afccc550782\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T12:50:06Z\\\",\\\"message\\\":\\\"W1003 12:49:55.901194 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1003 
12:49:55.901561 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759495795 cert, and key in /tmp/serving-cert-3691789359/serving-signer.crt, /tmp/serving-cert-3691789359/serving-signer.key\\\\nI1003 12:49:56.093911 1 observer_polling.go:159] Starting file observer\\\\nW1003 12:49:56.097720 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1003 12:49:56.097855 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 12:49:56.098660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3691789359/tls.crt::/tmp/serving-cert-3691789359/tls.key\\\\\\\"\\\\nF1003 12:50:06.505868 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dafe1e9291df9776980832fd7d43a1d5e4caf4cd6824740f9e1f2282a6c4bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:18Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:18 crc kubenswrapper[4962]: E1003 12:50:18.594786 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f831723-ac1f-49ed-8733-e30832d406d9\\\",\\\"systemUUID\\\":\\\"16e8121c-ac81-46e8-9c72-10e496aaa780\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:18Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:18 crc kubenswrapper[4962]: E1003 12:50:18.594966 4962 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.596800 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.596824 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.596833 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.596849 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.596860 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:18Z","lastTransitionTime":"2025-10-03T12:50:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.603764 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-44fmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-44fmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:18Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.698613 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.698665 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.698678 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.698693 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.698703 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:18Z","lastTransitionTime":"2025-10-03T12:50:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.800850 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.800888 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.800899 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.800942 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.800962 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:18Z","lastTransitionTime":"2025-10-03T12:50:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.903562 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.903597 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.903605 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.903618 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:18 crc kubenswrapper[4962]: I1003 12:50:18.903627 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:18Z","lastTransitionTime":"2025-10-03T12:50:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.005917 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.005967 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.005980 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.005997 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.006010 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:19Z","lastTransitionTime":"2025-10-03T12:50:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.108907 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.108935 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.108942 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.108955 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.108966 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:19Z","lastTransitionTime":"2025-10-03T12:50:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.211114 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.211163 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.211181 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.211205 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.211223 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:19Z","lastTransitionTime":"2025-10-03T12:50:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.313407 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.313458 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.313469 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.313516 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.313529 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:19Z","lastTransitionTime":"2025-10-03T12:50:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.411097 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" event={"ID":"90186d9d-0ac4-4959-9fd8-b044098dc6ae","Type":"ContainerStarted","Data":"daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1"} Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.415141 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.415183 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.415199 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.415214 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.415226 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:19Z","lastTransitionTime":"2025-10-03T12:50:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.415777 4962 generic.go:334] "Generic (PLEG): container finished" podID="c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7" containerID="1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e" exitCode=0 Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.415817 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-44fmz" event={"ID":"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7","Type":"ContainerDied","Data":"1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e"} Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.431344 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82494e0a2ec6e8ba7a2e713a8faf9e20bbacde0875e31f6db9962ece1a1a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:19Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.447164 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sdd6t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbc64268-3e78-44a2-8116-b62b5c13f005\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087c5eaff8dd2b9ccb73982ae2022d139077f3a93f750a8d7f7fe8bb4a8f643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktptf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sdd6t\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:19Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.461889 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8d669985493d735cad4d4a09f36401be9ede84ee7e20e82293e7d5331c085e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:19Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.479136 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:19Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.490983 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53165c5e-33f0-44c9-b4fd-1da1c2b99a4e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86846072db8a357f62428197a22e8b8f829b013c9292a53091819ceb6eb7aeb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f016344ea9d266374fcbfbe80061313994a45b595ee3432a95b1bbd332b320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7249baacfeb01d1f25f6d9ce291033941226c4be4cba0376a47872b8b10f6ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6e876d0d7a29e1ceb505283a70c9b8fd0b4eee1dec1a2f04f3eeec4b09f4c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:19Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.508070 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:19Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.518267 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.518315 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.518324 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.518338 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.518346 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:19Z","lastTransitionTime":"2025-10-03T12:50:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.522251 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6c913a7a10a94baddfc4f58df9f5bc15cb9e7dd4b91dd313eca9f1a683259b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d69e23034588e75e4716b87b31f129f088940d29983f634a3d1b1e15289d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:19Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.539410 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-44fmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mo
untPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-44fmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:19Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.557414 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6263076-d49e-4519-b516-d50ec2fb9d88\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb41be48a3ebc6ba936c2ec3b2a483349e8d597d0483a618a0d452a2ee13538c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b1520abe4d95d8e776c2f589c8f7ada74c46eef43f1bcaf441272ae75241a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcb83ba4c1a1eb1a63fce31b68210e9af2c5e722097b457b4642b4c32f60c2e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d334e30066203e6856129958923dcb174b5541a
9e41e11a4e574ac59ec86c861\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbdc43e3dcb77ff3ee44368fa74c53d2f360a508c04525f15879db5c6b2209b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:19Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.570560 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f337dda-821a-494b-98d6-775f0c1beb7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6ff72af1d85ecca71976d9d16b8ca0ee91d035662920a8c58d739e560c12ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8eb3048afe2307009fd3ccfbc314a1b60389c0ba17bad1b898578cb1b151df8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8877bc7bae269cbbb5a1452d42917b7a32af9bc45a303b25be8b1058ef937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://329c763b93db8ac9a4cd93231684840cdfa4c6f9d1318fcfc641f4e86b4fc39e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22239bfd4bc82de5ba41662ff973e8c1e0fa462857cae705d4940afccc550782\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T12:50:06Z\\\",\\\"message\\\":\\\"W1003 12:49:55.901194 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1003 
12:49:55.901561 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759495795 cert, and key in /tmp/serving-cert-3691789359/serving-signer.crt, /tmp/serving-cert-3691789359/serving-signer.key\\\\nI1003 12:49:56.093911 1 observer_polling.go:159] Starting file observer\\\\nW1003 12:49:56.097720 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1003 12:49:56.097855 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 12:49:56.098660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3691789359/tls.crt::/tmp/serving-cert-3691789359/tls.key\\\\\\\"\\\\nF1003 12:50:06.505868 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dafe1e9291df9776980832fd7d43a1d5e4caf4cd6824740f9e1f2282a6c4bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:19Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.580695 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g64qv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a46fb9dd-57b2-4300-b9ab-2d40bcc4cd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91103ceb967093cdbedc1aa6b2d68dd2c1cc96a1902cd0bb8f380a3d42e54497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7rc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g64qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:19Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.591438 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e40a27aa-682e-4b25-a198-8054ba9f2477\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa233ac457a03e63e2662227f70bb88c59e3ebecf41454f217278fb34ab8dc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c066a5dc6399e5df82f7425020e84a5354658a5b77b1014a74126539aa1c738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-46vck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:19Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.611344 4962 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90186d9d-0ac4-4959-9fd8-b044098dc6ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",
\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ksp7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:19Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.620901 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.620928 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.620936 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.620951 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.620962 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:19Z","lastTransitionTime":"2025-10-03T12:50:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.625461 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:19Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.635111 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pnhsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edada57d-7295-4f56-a850-caf58ebe77a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f1a223717c0f2030da538233d9f7b621c4b2e304dcb65316b1aef7741f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctqjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pnhsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:19Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.726308 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.726356 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.726391 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.726408 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.726416 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:19Z","lastTransitionTime":"2025-10-03T12:50:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.782995 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:50:19 crc kubenswrapper[4962]: E1003 12:50:19.783246 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:50:27.783214405 +0000 UTC m=+36.187112260 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.828597 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.828627 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.828659 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.828672 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.828681 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:19Z","lastTransitionTime":"2025-10-03T12:50:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.883686 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.883749 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.883777 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.883794 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:50:19 crc kubenswrapper[4962]: E1003 12:50:19.883868 4962 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 12:50:19 crc kubenswrapper[4962]: E1003 12:50:19.883901 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 12:50:19 crc kubenswrapper[4962]: E1003 12:50:19.883919 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 12:50:19 crc kubenswrapper[4962]: E1003 12:50:19.883929 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 12:50:19 crc kubenswrapper[4962]: E1003 12:50:19.883936 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 12:50:19 crc kubenswrapper[4962]: E1003 12:50:19.883942 4962 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 12:50:19 crc kubenswrapper[4962]: E1003 12:50:19.883946 4962 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 12:50:19 crc kubenswrapper[4962]: E1003 12:50:19.883925 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 12:50:27.883907157 +0000 UTC m=+36.287804992 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 12:50:19 crc kubenswrapper[4962]: E1003 12:50:19.883900 4962 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 12:50:19 crc kubenswrapper[4962]: E1003 12:50:19.883989 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 12:50:27.883974079 +0000 UTC m=+36.287871914 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 12:50:19 crc kubenswrapper[4962]: E1003 12:50:19.884001 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 12:50:27.883996149 +0000 UTC m=+36.287893984 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 12:50:19 crc kubenswrapper[4962]: E1003 12:50:19.884011 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 12:50:27.884005789 +0000 UTC m=+36.287903624 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.931212 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.931271 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.931289 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.931314 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:19 crc kubenswrapper[4962]: I1003 12:50:19.931331 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:19Z","lastTransitionTime":"2025-10-03T12:50:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.033699 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.033734 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.033743 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.033756 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.033764 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:20Z","lastTransitionTime":"2025-10-03T12:50:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.136087 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.136120 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.136129 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.136141 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.136149 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:20Z","lastTransitionTime":"2025-10-03T12:50:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.226427 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.226512 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.226425 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:50:20 crc kubenswrapper[4962]: E1003 12:50:20.226550 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 12:50:20 crc kubenswrapper[4962]: E1003 12:50:20.226611 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 12:50:20 crc kubenswrapper[4962]: E1003 12:50:20.226732 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.238423 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.238467 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.238483 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.238515 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.238531 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:20Z","lastTransitionTime":"2025-10-03T12:50:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.340434 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.340468 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.340477 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.340491 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.340499 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:20Z","lastTransitionTime":"2025-10-03T12:50:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.421244 4962 generic.go:334] "Generic (PLEG): container finished" podID="c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7" containerID="ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4" exitCode=0 Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.421284 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-44fmz" event={"ID":"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7","Type":"ContainerDied","Data":"ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4"} Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.434709 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:20Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.442611 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.442664 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.442674 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.442689 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.442701 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:20Z","lastTransitionTime":"2025-10-03T12:50:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.445527 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pnhsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edada57d-7295-4f56-a850-caf58ebe77a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f1a223717c0f2030da538233d9f7b621c4b2e304dcb65316b1aef7741f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctqjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pnhsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:20Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.454501 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g64qv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a46fb9dd-57b2-4300-b9ab-2d40bcc4cd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91103ceb967093cdbedc1aa6b2d68dd2c1cc96a1902cd0bb8f380a3d42e54497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7rc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g64qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:20Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.466066 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e40a27aa-682e-4b25-a198-8054ba9f2477\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa233ac457a03e63e2662227f70bb88c59e3ebecf41454f217278fb34ab8dc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c066a5dc6399e5df82f7425020e84a5354658a5b77b1014a74126539aa1c738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-46vck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:20Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.487762 4962 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90186d9d-0ac4-4959-9fd8-b044098dc6ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",
\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ksp7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:20Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.502014 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82494e0a2ec6e8ba7a2e713a8faf9e20bbacde0875e31f6db9962ece1a1a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:20Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.515143 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sdd6t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbc64268-3e78-44a2-8116-b62b5c13f005\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087c5eaff8dd2b9ccb73982ae2022d139077f3a93f750a8d7f7fe8bb4a8f643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktptf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"
hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sdd6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:20Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.525764 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53165c5e-33f0-44c9-b4fd-1da1c2b99a4e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86846072db8a357f62428197a22e8b8f829b013c9292a53091819ceb6eb7aeb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f016344ea9d266374fcbfbe80061313994a45b595ee3432a95b1bbd332b320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7249baacfeb01d1f25f6d9ce291033941226c4be4cba0376a47872b8b10f6ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed0828
7faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6e876d0d7a29e1ceb505283a70c9b8fd0b4eee1dec1a2f04f3eeec4b09f4c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:20Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.538214 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:20Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.544445 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.544478 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.544487 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.544508 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.544519 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:20Z","lastTransitionTime":"2025-10-03T12:50:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.549748 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6c913a7a10a94baddfc4f58df9f5bc15cb9e7dd4b91dd313eca9f1a683259b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d69e23034588e75e4716b87b31f129f088940d29983f634a3d1b1e15289d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:20Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.561575 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8d669985493d735cad4d4a09f36401be9ede84ee7e20e82293e7d5331c085e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:20Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.571379 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:20Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.593834 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6263076-d49e-4519-b516-d50ec2fb9d88\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb41be48a3ebc6ba936c2ec3b2a483349e8d597d0483a618a0d452a2ee13538c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b1520abe4d95d8e776c2f589c8f7ada74c46eef43f1bcaf441272ae75241a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcb83ba4c1a1eb1a63fce31b68210e9af2c5e722097b457b4642b4c32f60c2e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d334e30066203e6856129958923dcb174b5541a9e41e11a4e574ac59ec86c861\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbdc43e3dcb77ff3ee44368fa74c53d2f360a508c04525f15879db5c6b2209b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\
\\":{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:20Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.606461 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f337dda-821a-494b-98d6-775f0c1beb7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6ff72af1d85ecca71976d9d16b8ca0ee91d035662920a8c58d739e560c12ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8eb3048afe2307009fd3ccfbc314a1b60389c0ba17bad1b898578cb1b151df8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8877bc7bae269cbbb5a1452d42917b7a32af9bc45a303b25be8b1058ef937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://329c763b93db8ac9a4cd93231684840cdfa4c6f9d1318fcfc641f4e86b4fc39e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://22239bfd4bc82de5ba41662ff973e8c1e0fa462857cae705d4940afccc550782\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T12:50:06Z\\\",\\\"message\\\":\\\"W1003 12:49:55.901194 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1003 12:49:55.901561 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759495795 cert, and key in /tmp/serving-cert-3691789359/serving-signer.crt, /tmp/serving-cert-3691789359/serving-signer.key\\\\nI1003 12:49:56.093911 1 observer_polling.go:159] Starting file observer\\\\nW1003 12:49:56.097720 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1003 12:49:56.097855 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 12:49:56.098660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3691789359/tls.crt::/tmp/serving-cert-3691789359/tls.key\\\\\\\"\\\\nF1003 12:50:06.505868 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dafe1e9291df9776980832fd7d43a1d5e4caf4cd6824740f9e1f2282a6c4bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPa
th\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:20Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.620259 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-44fmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-44fmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:20Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.646795 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.646941 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.647034 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.647119 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.647198 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:20Z","lastTransitionTime":"2025-10-03T12:50:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.749985 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.750026 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.750036 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.750053 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.750063 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:20Z","lastTransitionTime":"2025-10-03T12:50:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.852031 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.852052 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.852060 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.852074 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.852084 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:20Z","lastTransitionTime":"2025-10-03T12:50:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.954295 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.954328 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.954337 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.954351 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:20 crc kubenswrapper[4962]: I1003 12:50:20.954363 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:20Z","lastTransitionTime":"2025-10-03T12:50:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.056526 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.056563 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.056574 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.056592 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.056602 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:21Z","lastTransitionTime":"2025-10-03T12:50:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.159208 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.159256 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.159265 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.159281 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.159292 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:21Z","lastTransitionTime":"2025-10-03T12:50:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.261361 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.261398 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.261407 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.261423 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.261433 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:21Z","lastTransitionTime":"2025-10-03T12:50:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.363702 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.363740 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.363750 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.363765 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.363777 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:21Z","lastTransitionTime":"2025-10-03T12:50:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.429878 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-44fmz" event={"ID":"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7","Type":"ContainerStarted","Data":"4fba23cccf497fd12c8f5ff3c93b757e30a8f5aab4ce0deca0f3ec0a49232d13"} Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.435033 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" event={"ID":"90186d9d-0ac4-4959-9fd8-b044098dc6ae","Type":"ContainerStarted","Data":"a463980f4ebcee263e55986db4af49d10d2a504b84316a79961f3f3a8d76ccb5"} Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.435406 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.435464 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.461119 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.462788 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.466620 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.466925 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.466934 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.466949 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.466958 4962 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:21Z","lastTransitionTime":"2025-10-03T12:50:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.468394 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6263076-d49e-4519-b516-d50ec2fb9d88\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb41be48a3ebc6ba936c2ec3b2a483349e8d597d0483a618a0d452a2ee13538c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b1520abe4d95d8e776c2f589c8f7ada74c46eef43f1bcaf441272ae75241a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcb83ba4c1a1eb1a63fce31b68210e9af2c5e722097b457b4642b4c32f60c2e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d334e30066203e6856129958923dcb174b5541a9e41e11a4e574ac59ec86c861\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbdc43e3dcb77ff3ee44368fa74c53d2f360a508c04525f15879db5c6b2209b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:21Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.484033 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f337dda-821a-494b-98d6-775f0c1beb7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6ff72af1d85ecca71976d9d16b8ca0ee91d035662920a8c58d739e560c12ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8eb3048afe2307009fd3ccfbc314a1b60389c0ba17bad1b898578cb1b151df8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8877bc7bae269cbbb5a1452d42917b7a32af9bc45a303b25be8b1058ef937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://329c763b93db8ac9a4cd93231684840cdfa4c6f9d1318fcfc641f4e86b4fc39e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22239bfd4bc82de5ba41662ff973e8c1e0fa462857cae705d4940afccc550782\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T12:50:06Z\\\",\\\"message\\\":\\\"W1003 12:49:55.901194 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1003 
12:49:55.901561 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759495795 cert, and key in /tmp/serving-cert-3691789359/serving-signer.crt, /tmp/serving-cert-3691789359/serving-signer.key\\\\nI1003 12:49:56.093911 1 observer_polling.go:159] Starting file observer\\\\nW1003 12:49:56.097720 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1003 12:49:56.097855 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 12:49:56.098660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3691789359/tls.crt::/tmp/serving-cert-3691789359/tls.key\\\\\\\"\\\\nF1003 12:50:06.505868 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dafe1e9291df9776980832fd7d43a1d5e4caf4cd6824740f9e1f2282a6c4bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:21Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.499188 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-44fmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fba23cccf497fd12c8f5ff3c93b757e30a8f5aab4ce0deca0f3ec0a49232d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\"
:true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\
\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-44fmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:21Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.519455 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90186d9d-0ac4-4959-9fd8-b044098dc6ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"n
ame\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ksp7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:21Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.533523 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:21Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.542593 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pnhsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edada57d-7295-4f56-a850-caf58ebe77a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f1a223717c0f2030da538233d9f7b621c4b2e304dcb65316b1aef7741f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctqjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pnhsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:21Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.553426 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g64qv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a46fb9dd-57b2-4300-b9ab-2d40bcc4cd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91103ceb967093cdbedc1aa6b2d68dd2c1cc96a1902cd0bb8f380a3d42e54497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7rc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g64qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:21Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.565324 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e40a27aa-682e-4b25-a198-8054ba9f2477\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa233ac457a03e63e2662227f70bb88c59e3ebecf41454f217278fb34ab8dc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c066a5dc6399e5df82f7425020e84a5354658a5b77b1014a74126539aa1c738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-46vck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:21Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.569591 4962 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.569656 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.569666 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.569682 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.569691 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:21Z","lastTransitionTime":"2025-10-03T12:50:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.580308 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82494e0a2ec6e8ba7a2e713a8faf9e20bbacde0875e31f6db9962ece1a1a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:21Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.594921 4962 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-sdd6t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbc64268-3e78-44a2-8116-b62b5c13f005\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087c5eaff8dd2b9ccb73982ae2022d139077f3a93f750a8d7f7fe8bb4a8f643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktptf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sdd6t\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:21Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.609149 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53165c5e-33f0-44c9-b4fd-1da1c2b99a4e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86846072db8a357f62428197a22e8b8f829b013c9292a53091819ceb6eb7aeb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f016344ea9d266374fcbfbe80061313994a45b595ee3432a95b1bbd332b320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7249baacfeb01d1f25f6d9ce291033941226c4be4cba0376a47872b8b10f6ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6e876d0d7a29e1ceb505283a70c9b8fd0b4eee1dec1a2f04f3eeec4b09f4c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:21Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.620759 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:21Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.633011 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6c913a7a10a94baddfc4f58df9f5bc15cb9e7dd4b91dd313eca9f1a683259b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d69e23034588e75e4716b87b31f129f088940d29983f634a3d1b1e15289d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:21Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.644745 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8d669985493d735cad4d4a09f36401be9ede84ee7e20e82293e7d5331c085e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:21Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.654859 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:21Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.666001 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82494e0a2ec6e8ba7a2e713a8faf9e20bbacde0875e31f6db9962ece1a1a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:21Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.671407 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.671623 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.671710 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.671777 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.671839 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:21Z","lastTransitionTime":"2025-10-03T12:50:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.677756 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sdd6t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbc64268-3e78-44a2-8116-b62b5c13f005\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087c5eaff8dd2b9ccb73982ae2022d139077f3a93f750a8d7f7fe8bb4a8f643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktptf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sdd6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:21Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.689148 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6c913a7a10a94baddfc4f58df9f5bc15cb9e7dd4b91dd313eca9f1a683259b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d69e23034588e75e4716b87b31f129f088940d29983f634a3d1b1e15289d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:21Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.699388 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8d669985493d735cad4d4a09f36401be9ede84ee7e20e82293e7d5331c085e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:21Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.711421 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:21Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.723075 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53165c5e-33f0-44c9-b4fd-1da1c2b99a4e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86846072db8a357f62428197a22e8b8f829b013c9292a53091819ceb6eb7aeb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f016344ea9d266374fcbfbe80061313994a45b595ee3432a95b1bbd332b320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7249baacfeb01d1f25f6d9ce291033941226c4be4cba0376a47872b8b10f6ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6e876d0d7a29e1ceb505283a70c9b8fd0b4eee1dec1a2f04f3eeec4b09f4c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:21Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.737573 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:21Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.752318 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-44fmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fba23cccf497fd12c8f5ff3c93b757e30a8f5aab4ce0deca0f3ec0a49232d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"
system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-44fmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:21Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.769578 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6263076-d49e-4519-b516-d50ec2fb9d88\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb41be48a3ebc6ba936c2ec3b2a483349e8d597d0483a618a0d452a2ee13538c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b1520abe4d95d8e776c2f589c8f7ada74c46eef43f1bcaf441272ae75241a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcb83ba4c1a1eb1a63fce31b68210e9af2c5e722097b457b4642b4c32f60c2e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d334e30066203e6856129958923dcb174b5541a9e41e11a4e574ac59ec86c861\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbdc43e3dcb77ff3ee44368fa74c53d2f360a508c04525f15879db5c6b2209b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/
log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:21Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.774159 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.774214 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.774234 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.774259 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.774277 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:21Z","lastTransitionTime":"2025-10-03T12:50:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.782347 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f337dda-821a-494b-98d6-775f0c1beb7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6ff72af1d85ecca71976d9d16b8ca0ee91d035662920a8c58d739e560c12ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8eb3048afe2307009fd3ccfbc314a1b60389c0ba17bad1b898578cb1b151df8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8877bc7bae269cbbb5a1452d42917b7a32af9bc45a303b25be8b1058ef937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-api
server-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://329c763b93db8ac9a4cd93231684840cdfa4c6f9d1318fcfc641f4e86b4fc39e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22239bfd4bc82de5ba41662ff973e8c1e0fa462857cae705d4940afccc550782\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T12:50:06Z\\\",\\\"message\\\":\\\"W1003 12:49:55.901194 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1003 12:49:55.901561 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759495795 cert, and key in /tmp/serving-cert-3691789359/serving-signer.crt, /tmp/serving-cert-3691789359/serving-signer.key\\\\nI1003 12:49:56.093911 1 observer_polling.go:159] Starting file observer\\\\nW1003 12:49:56.097720 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1003 12:49:56.097855 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 12:49:56.098660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3691789359/tls.crt::/tmp/serving-cert-3691789359/tls.key\\\\\\\"\\\\nF1003 12:50:06.505868 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dafe1e9291df9776980832fd7d43a1d5e4caf4cd6824740f9e1f2282a6c4bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:21Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.792535 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pnhsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edada57d-7295-4f56-a850-caf58ebe77a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f1a223717c0f2030da538233d9f7b621c4b2e304dcb65316b1aef7741f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctqjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pnhsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:21Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.801810 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g64qv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a46fb9dd-57b2-4300-b9ab-2d40bcc4cd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91103ceb967093cdbedc1aa6b2d68dd2c1cc96a1902cd0bb8f380a3d42e54497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7rc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g64qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:21Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.812126 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e40a27aa-682e-4b25-a198-8054ba9f2477\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa233ac457a03e63e2662227f70bb88c59e3ebecf41454f217278fb34ab8dc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c066a5dc6399e5df82f7425020e84a5354658a5b77b1014a74126539aa1c738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-46vck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:21Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.827612 4962 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90186d9d-0ac4-4959-9fd8-b044098dc6ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a463980f4ebcee263e55986db4af49d10d2a504b84316a79961f3f3a8d76ccb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ksp7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:21Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.838269 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:21Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.876310 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.876347 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.876357 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.876370 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.876380 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:21Z","lastTransitionTime":"2025-10-03T12:50:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.978524 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.978563 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.978575 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.978591 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:21 crc kubenswrapper[4962]: I1003 12:50:21.978602 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:21Z","lastTransitionTime":"2025-10-03T12:50:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.080625 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.080691 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.080701 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.080718 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.080727 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:22Z","lastTransitionTime":"2025-10-03T12:50:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.182790 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.182822 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.182831 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.182848 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.182857 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:22Z","lastTransitionTime":"2025-10-03T12:50:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.227156 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.227160 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.227245 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:50:22 crc kubenswrapper[4962]: E1003 12:50:22.227377 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 12:50:22 crc kubenswrapper[4962]: E1003 12:50:22.227695 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 12:50:22 crc kubenswrapper[4962]: E1003 12:50:22.227777 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.245399 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-44fmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fba23cccf497fd12c8f5ff3c93b757e30a8f5aab4ce0deca0f3ec0a49232d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-44fmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:22Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.266317 
4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6263076-d49e-4519-b516-d50ec2fb9d88\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb41be48a3ebc6ba936c2ec3b2a483349e8d597d0483a618a0d452a2ee13538c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b1520abe4d95d8e776c2f589c8f7ada74c46eef43f1bcaf441272ae75241a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcb83ba4c1a1eb1a63fce31b68210e9af2c5e722097b457b4642b4c32f60c2e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d334e30066203e6856129958923dcb174b5541a9e41e11a4e574ac59ec86c861\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbdc43e3dcb77ff3ee44368fa74c53d2f360a508c04525f15879db5c6b2209b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-0
3T12:49:54Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:22Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.284125 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f337dda-821a-494b-98d6-775f0c1beb7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6ff72af1d85ecca71976d9d16b8ca0ee91d035662920a8c58d739e560c12ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8eb3048afe2307009fd3ccfbc314a1b60389c0ba17bad1b898578cb1b151df8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8877bc7bae269cbbb5a1452d42917b7a32af9bc45a303b25be8b1058ef937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://329c763b93db8ac9a4cd93231684840cdfa4c6f9d1318fcfc641f4e86b4fc39e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22239bfd4bc82de5ba41662ff973e8c1e0fa462857cae705d4940afccc550782\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T12:50:06Z\\\",\\\"message\\\":\\\"W1003 12:49:55.901194 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1003 
12:49:55.901561 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759495795 cert, and key in /tmp/serving-cert-3691789359/serving-signer.crt, /tmp/serving-cert-3691789359/serving-signer.key\\\\nI1003 12:49:56.093911 1 observer_polling.go:159] Starting file observer\\\\nW1003 12:49:56.097720 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1003 12:49:56.097855 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 12:49:56.098660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3691789359/tls.crt::/tmp/serving-cert-3691789359/tls.key\\\\\\\"\\\\nF1003 12:50:06.505868 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dafe1e9291df9776980832fd7d43a1d5e4caf4cd6824740f9e1f2282a6c4bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:22Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.285470 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.285525 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.285542 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.285565 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.285582 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:22Z","lastTransitionTime":"2025-10-03T12:50:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.295559 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g64qv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a46fb9dd-57b2-4300-b9ab-2d40bcc4cd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91103ceb967093cdbedc1aa6b2d68dd2c1cc96a1902cd0bb8f380a3d42e54497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7rc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g64qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:22Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.306744 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e40a27aa-682e-4b25-a198-8054ba9f2477\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa233ac457a03e63e2662227f70bb88c59e3ebecf41454f217278fb34ab8dc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c066a5dc6399e5df82f7425020e84a5354658a5b77b1014a74126539aa1c738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-46vck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:22Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.329430 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90186d9d-0ac4-4959-9fd8-b044098dc6ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a463980f4ebcee263e55986db4af49d10d2a504b
84316a79961f3f3a8d76ccb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ksp7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:22Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.340064 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:22Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.348244 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pnhsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edada57d-7295-4f56-a850-caf58ebe77a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f1a223717c0f2030da538233d9f7b621c4b2e304dcb65316b1aef7741f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctqjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pnhsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:22Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.362904 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82494e0a2ec6e8ba7a2e713a8faf9e20bbacde0875e31f6db9962ece1a1a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:22Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.378649 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sdd6t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbc64268-3e78-44a2-8116-b62b5c13f005\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087c5eaff8dd2b9ccb73982ae2022d139077f3a93f750a8d7f7fe8bb4a8f643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktptf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sdd6t\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:22Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.387164 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.387189 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.387197 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.387211 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.387220 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:22Z","lastTransitionTime":"2025-10-03T12:50:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.389591 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8d669985493d735cad4d4a09f36401be9ede84ee7e20e82293e7d5331c085e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:22Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.401165 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:22Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.412034 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53165c5e-33f0-44c9-b4fd-1da1c2b99a4e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86846072db8a357f62428197a22e8b8f829b013c9292a53091819ceb6eb7aeb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f016344ea9d266374fcbfbe80061313994a45b595ee3432a95b1bbd332b320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7249baacfeb01d1f25f6d9ce291033941226c4be4cba0376a47872b8b10f6ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6e876d0d7a29e1ceb505283a70c9b8fd0b4eee1dec1a2f04f3eeec4b09f4c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:22Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.423189 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:22Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.434494 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6c913a7a10a94baddfc4f58df9f5bc15cb9e7dd4b91dd313eca9f1a683259b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d69e23034588e75e4716b87b31f129f088940d29983f634a3d1b1e15289d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:22Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.440545 4962 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.489718 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.489759 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.489768 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.489784 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.489796 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:22Z","lastTransitionTime":"2025-10-03T12:50:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.592837 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.593138 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.593255 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.593355 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.593441 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:22Z","lastTransitionTime":"2025-10-03T12:50:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.696023 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.696260 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.696351 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.696448 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.696532 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:22Z","lastTransitionTime":"2025-10-03T12:50:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.798484 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.798514 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.798522 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.798535 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.798544 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:22Z","lastTransitionTime":"2025-10-03T12:50:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.902370 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.902407 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.902416 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.902430 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:22 crc kubenswrapper[4962]: I1003 12:50:22.902440 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:22Z","lastTransitionTime":"2025-10-03T12:50:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.004849 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.004892 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.004945 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.004971 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.004984 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:23Z","lastTransitionTime":"2025-10-03T12:50:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.107295 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.107369 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.107411 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.107427 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.107438 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:23Z","lastTransitionTime":"2025-10-03T12:50:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.209198 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.209228 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.209238 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.209250 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.209258 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:23Z","lastTransitionTime":"2025-10-03T12:50:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.311944 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.311993 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.312001 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.312023 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.312034 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:23Z","lastTransitionTime":"2025-10-03T12:50:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.414405 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.414442 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.414454 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.414470 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.414481 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:23Z","lastTransitionTime":"2025-10-03T12:50:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.442872 4962 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.516950 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.516979 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.516987 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.516999 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.517007 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:23Z","lastTransitionTime":"2025-10-03T12:50:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.619853 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.619889 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.619899 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.619914 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.619925 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:23Z","lastTransitionTime":"2025-10-03T12:50:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.721258 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.721291 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.721299 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.721311 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.721320 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:23Z","lastTransitionTime":"2025-10-03T12:50:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.824016 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.824046 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.824054 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.824067 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.824076 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:23Z","lastTransitionTime":"2025-10-03T12:50:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.926784 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.926825 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.926835 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.926851 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:23 crc kubenswrapper[4962]: I1003 12:50:23.926862 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:23Z","lastTransitionTime":"2025-10-03T12:50:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.029311 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.029344 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.029352 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.029367 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.029375 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:24Z","lastTransitionTime":"2025-10-03T12:50:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.135404 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.135436 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.135446 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.135459 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.135468 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:24Z","lastTransitionTime":"2025-10-03T12:50:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.226487 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.226584 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:50:24 crc kubenswrapper[4962]: E1003 12:50:24.226658 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 12:50:24 crc kubenswrapper[4962]: E1003 12:50:24.226765 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.226877 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:50:24 crc kubenswrapper[4962]: E1003 12:50:24.226945 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.237373 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.237417 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.237430 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.237446 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.237459 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:24Z","lastTransitionTime":"2025-10-03T12:50:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.339620 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.339670 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.339679 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.339692 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.339700 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:24Z","lastTransitionTime":"2025-10-03T12:50:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.441990 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.442032 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.442076 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.442102 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.442115 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:24Z","lastTransitionTime":"2025-10-03T12:50:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.446485 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ksp7d_90186d9d-0ac4-4959-9fd8-b044098dc6ae/ovnkube-controller/0.log" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.449086 4962 generic.go:334] "Generic (PLEG): container finished" podID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerID="a463980f4ebcee263e55986db4af49d10d2a504b84316a79961f3f3a8d76ccb5" exitCode=1 Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.449129 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" event={"ID":"90186d9d-0ac4-4959-9fd8-b044098dc6ae","Type":"ContainerDied","Data":"a463980f4ebcee263e55986db4af49d10d2a504b84316a79961f3f3a8d76ccb5"} Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.449821 4962 scope.go:117] "RemoveContainer" containerID="a463980f4ebcee263e55986db4af49d10d2a504b84316a79961f3f3a8d76ccb5" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.466499 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:24Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.477748 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pnhsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edada57d-7295-4f56-a850-caf58ebe77a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f1a223717c0f2030da538233d9f7b621c4b2e304dcb65316b1aef7741f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctqjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pnhsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:24Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.488871 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g64qv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a46fb9dd-57b2-4300-b9ab-2d40bcc4cd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91103ceb967093cdbedc1aa6b2d68dd2c1cc96a1902cd0bb8f380a3d42e54497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7rc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g64qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:24Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.503986 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e40a27aa-682e-4b25-a198-8054ba9f2477\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa233ac457a03e63e2662227f70bb88c59e3ebecf41454f217278fb34ab8dc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c066a5dc6399e5df82f7425020e84a5354658a5b77b1014a74126539aa1c738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-46vck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:24Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.521420 4962 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90186d9d-0ac4-4959-9fd8-b044098dc6ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a463980f4ebcee263e55986db4af49d10d2a504b84316a79961f3f3a8d76ccb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a463980f4ebcee263e55986db4af49d10d2a504b84316a79961f3f3a8d76ccb5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T12:50:23Z\\\",\\\"message\\\":\\\"003 12:50:23.655440 6287 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 12:50:23.655594 6287 reflector.go:311] Stopping reflector *v1.UserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 12:50:23.656112 6287 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 12:50:23.656153 6287 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1003 12:50:23.656159 6287 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 12:50:23.656170 6287 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1003 12:50:23.656176 6287 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1003 12:50:23.656198 6287 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 12:50:23.656206 6287 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1003 12:50:23.656215 6287 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 12:50:23.656221 6287 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1003 12:50:23.656228 6287 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 12:50:23.656432 6287 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099
482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ksp7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:24Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.534400 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82494e0a2ec6e8ba7a2e713a8faf9e20bbacde0875e31f6db9962ece1a1a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:24Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.534499 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.544767 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.544807 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.544816 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.544829 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.544838 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:24Z","lastTransitionTime":"2025-10-03T12:50:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.548299 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sdd6t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbc64268-3e78-44a2-8116-b62b5c13f005\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087c5eaff8dd2b9ccb73982ae2022d139077f3a93f750a8d7f7fe8bb4a8f643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\
\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktptf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sdd6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:24Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.560430 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53165c5e-33f0-44c9-b4fd-1da1c2b99a4e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86846072db8a357f62428197a22e8b8f829b013c9292a53091819ceb6eb7aeb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f016344ea9d266374fcbfbe80061313994a45b595ee3432a95b1bbd332b320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7249baacfeb01d1f25f6d9ce291033941226c4be4cba0376a47872b8b10f6ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6e876d0d7a29e1ceb505283a70c9b8fd0b4eee1dec1a2f04f3eeec4b09f4c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:24Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.574810 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:24Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.589015 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6c913a7a10a94baddfc4f58df9f5bc15cb9e7dd4b91dd313eca9f1a683259b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d69e23034588e75e4716b87b31f129f088940d29983f634a3d1b1e15289d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:24Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.598623 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8d669985493d735cad4d4a09f36401be9ede84ee7e20e82293e7d5331c085e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:24Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.612175 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:24Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.630259 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6263076-d49e-4519-b516-d50ec2fb9d88\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb41be48a3ebc6ba936c2ec3b2a483349e8d597d0483a618a0d452a2ee13538c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b1520abe4d95d8e776c2f589c8f7ada74c46eef43f1bcaf441272ae75241a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcb83ba4c1a1eb1a63fce31b68210e9af2c5e722097b457b4642b4c32f60c2e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d334e30066203e6856129958923dcb174b5541a
9e41e11a4e574ac59ec86c861\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbdc43e3dcb77ff3ee44368fa74c53d2f360a508c04525f15879db5c6b2209b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:24Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.641831 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f337dda-821a-494b-98d6-775f0c1beb7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6ff72af1d85ecca71976d9d16b8ca0ee91d035662920a8c58d739e560c12ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8eb3048afe2307009fd3ccfbc314a1b60389c0ba17bad1b898578cb1b151df8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8877bc7bae269cbbb5a1452d42917b7a32af9bc45a303b25be8b1058ef937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://329c763b93db8ac9a4cd93231684840cdfa4c6f9d1318fcfc641f4e86b4fc39e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22239bfd4bc82de5ba41662ff973e8c1e0fa462857cae705d4940afccc550782\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T12:50:06Z\\\",\\\"message\\\":\\\"W1003 12:49:55.901194 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1003 
12:49:55.901561 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759495795 cert, and key in /tmp/serving-cert-3691789359/serving-signer.crt, /tmp/serving-cert-3691789359/serving-signer.key\\\\nI1003 12:49:56.093911 1 observer_polling.go:159] Starting file observer\\\\nW1003 12:49:56.097720 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1003 12:49:56.097855 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 12:49:56.098660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3691789359/tls.crt::/tmp/serving-cert-3691789359/tls.key\\\\\\\"\\\\nF1003 12:50:06.505868 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dafe1e9291df9776980832fd7d43a1d5e4caf4cd6824740f9e1f2282a6c4bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:24Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.646652 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.646692 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.646704 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.646723 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.646735 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:24Z","lastTransitionTime":"2025-10-03T12:50:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.656756 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-44fmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fba23cccf497fd12c8f5ff3c93b757e30a8f5aab4ce0deca0f3ec0a49232d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\
\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\
\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6
713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-44fmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:24Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.671135 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82494e0a2ec6e8ba7a2e713a8faf9e20bbacde0875e31f6db9962ece1a1a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:24Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.683292 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sdd6t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbc64268-3e78-44a2-8116-b62b5c13f005\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087c5eaff8dd2b9ccb73982ae2022d139077f3a93f750a8d7f7fe8bb4a8f643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktptf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sdd6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:24Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.695057 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53165c5e-33f0-44c9-b4fd-1da1c2b99a4e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86846072db8a357f62428197a22e8b8f829b013c9292a53091819ceb6eb7aeb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f016344ea9d266374fcbfbe80061313994a45b595ee3432a95b1bbd332b320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7249baacfeb01d1f25f6d9ce291033941226c4be4cba0376a47872b8b10f6ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha
256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6e876d0d7a29e1ceb505283a70c9b8fd0b4eee1dec1a2f04f3eeec4b09f4c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:24Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.706682 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:24Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.717706 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6c913a7a10a94baddfc4f58df9f5bc15cb9e7dd4b91dd313eca9f1a683259b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d69e23034588e75e4716b87b31f129f088940d29983f634a3d1b1e15289d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:24Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.728239 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8d669985493d735cad4d4a09f36401be9ede84ee7e20e82293e7d5331c085e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:24Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.740273 4962 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:24Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.752177 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.752209 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.752218 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.752236 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.752247 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:24Z","lastTransitionTime":"2025-10-03T12:50:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration 
file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.760600 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6263076-d49e-4519-b516-d50ec2fb9d88\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb41be48a3ebc6ba936c2ec3b2a483349e8d597d0483a618a0d452a2ee13538c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b1520abe4d95d8e776c2f589c8f7ada74c46eef43f1bcaf441272ae75241a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcb83ba4c1a1eb1a63fce31b68210e9af2c5e722097b457b4642b4c32f60c2e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d334e30066203e6856129958923dcb174b5541a9e41e11a4e574ac59ec86c861\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbdc43e3dcb77ff3ee44368fa74c53d2f360a508c04525f15879db5c6b2209b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9
\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:24Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.775434 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f337dda-821a-494b-98d6-775f0c1beb7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6ff72af1d85ecca71976d9d16b8ca0ee91d035662920a8c58d739e560c12ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8eb3048afe2307009fd3ccfbc314a1b60389c0ba17bad1b898578cb1b151df8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8877bc7bae269cbbb5a1452d42917b7a32af9bc45a303b25be8b1058ef937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://329c763b93db8ac9a4cd93231684840cdfa4c6f9d1318fcfc641f4e86b4fc39e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22239bfd4bc82de5ba41662ff973e8c1e0fa462857cae705d4940afccc550782\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T12:50:06Z\\\",\\\"message\\\":\\\"W1003 12:49:55.901194 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1003 12:49:55.901561 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759495795 cert, and key in /tmp/serving-cert-3691789359/serving-signer.crt, /tmp/serving-cert-3691789359/serving-signer.key\\\\nI1003 12:49:56.093911 1 observer_polling.go:159] Starting file observer\\\\nW1003 12:49:56.097720 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1003 12:49:56.097855 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 12:49:56.098660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3691789359/tls.crt::/tmp/serving-cert-3691789359/tls.key\\\\\\\"\\\\nF1003 12:50:06.505868 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dafe1e9291df9776980832fd7d43a1d5e4caf4cd6824740f9e1f2282a6c4bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:24Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.790820 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-44fmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fba23cccf497fd12c8f5ff3c93b757e30a8f5aab4ce0deca0f3ec0a49232d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\
\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-44fmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:24Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.803321 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:24Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.824554 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pnhsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edada57d-7295-4f56-a850-caf58ebe77a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f1a223717c0f2030da538233d9f7b621c4b2e304dcb65316b1aef7741f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctqjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pnhsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:24Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.835447 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g64qv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a46fb9dd-57b2-4300-b9ab-2d40bcc4cd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91103ceb967093cdbedc1aa6b2d68dd2c1cc96a1902cd0bb8f380a3d42e54497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7rc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g64qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:24Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.844746 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e40a27aa-682e-4b25-a198-8054ba9f2477\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa233ac457a03e63e2662227f70bb88c59e3ebecf41454f217278fb34ab8dc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c066a5dc6399e5df82f7425020e84a5354658a5b77b1014a74126539aa1c738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-46vck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:24Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.854659 4962 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.854698 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.854707 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.854722 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.854731 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:24Z","lastTransitionTime":"2025-10-03T12:50:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.862185 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90186d9d-0ac4-4959-9fd8-b044098dc6ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a463980f4ebcee263e55986db4af49d10d2a504b
84316a79961f3f3a8d76ccb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a463980f4ebcee263e55986db4af49d10d2a504b84316a79961f3f3a8d76ccb5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T12:50:23Z\\\",\\\"message\\\":\\\"003 12:50:23.655440 6287 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 12:50:23.655594 6287 reflector.go:311] Stopping reflector *v1.UserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 12:50:23.656112 6287 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 12:50:23.656153 6287 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1003 12:50:23.656159 6287 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 12:50:23.656170 6287 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1003 12:50:23.656176 6287 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1003 12:50:23.656198 6287 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 12:50:23.656206 6287 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1003 12:50:23.656215 6287 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 12:50:23.656221 6287 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1003 12:50:23.656228 6287 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 12:50:23.656432 6287 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099
482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ksp7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:24Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.956932 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.956970 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.956981 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.956996 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:24 crc kubenswrapper[4962]: I1003 12:50:24.957007 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:24Z","lastTransitionTime":"2025-10-03T12:50:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.059188 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.059246 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.059261 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.059277 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.059288 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:25Z","lastTransitionTime":"2025-10-03T12:50:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.161131 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.161161 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.161169 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.161182 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.161192 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:25Z","lastTransitionTime":"2025-10-03T12:50:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.263816 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.264109 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.264173 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.264261 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.264323 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:25Z","lastTransitionTime":"2025-10-03T12:50:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.366717 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.366752 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.366763 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.366780 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.366792 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:25Z","lastTransitionTime":"2025-10-03T12:50:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.454414 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ksp7d_90186d9d-0ac4-4959-9fd8-b044098dc6ae/ovnkube-controller/1.log" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.454863 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ksp7d_90186d9d-0ac4-4959-9fd8-b044098dc6ae/ovnkube-controller/0.log" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.457621 4962 generic.go:334] "Generic (PLEG): container finished" podID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerID="950cd712bed0d7eafeb1553551ccbf80f1ade9c74ceb2b4f62833dc77d08be00" exitCode=1 Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.457743 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" event={"ID":"90186d9d-0ac4-4959-9fd8-b044098dc6ae","Type":"ContainerDied","Data":"950cd712bed0d7eafeb1553551ccbf80f1ade9c74ceb2b4f62833dc77d08be00"} Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.457797 4962 scope.go:117] "RemoveContainer" containerID="a463980f4ebcee263e55986db4af49d10d2a504b84316a79961f3f3a8d76ccb5" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.458608 4962 scope.go:117] "RemoveContainer" containerID="950cd712bed0d7eafeb1553551ccbf80f1ade9c74ceb2b4f62833dc77d08be00" Oct 03 12:50:25 crc kubenswrapper[4962]: E1003 12:50:25.458790 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-ksp7d_openshift-ovn-kubernetes(90186d9d-0ac4-4959-9fd8-b044098dc6ae)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.468756 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.468790 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.468828 4962 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.468845 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.468857 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:25Z","lastTransitionTime":"2025-10-03T12:50:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.470468 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pnhsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edada57d-7295-4f56-a850-caf58ebe77a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f1a223717c0f2030da538233d9f7b621c4b2e304dcb65316b1aef7741f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctqjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pnhsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:25Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.481429 4962 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-dns/node-resolver-g64qv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a46fb9dd-57b2-4300-b9ab-2d40bcc4cd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91103ceb967093cdbedc1aa6b2d68dd2c1cc96a1902cd0bb8f380a3d42e54497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7rc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g64qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:25Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.493360 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e40a27aa-682e-4b25-a198-8054ba9f2477\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa233ac457a03e63e2662227f70bb88c59e3ebecf41454f217278fb34ab8dc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c066a5dc6399e5df82f7425020e84a5354658a5b77b1014a74126539aa1c738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-46vck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:25Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.509616 4962 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90186d9d-0ac4-4959-9fd8-b044098dc6ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950cd712bed0d7eafeb1553551ccbf80f1ade9c74ceb2b4f62833dc77d08be00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a463980f4ebcee263e55986db4af49d10d2a504b84316a79961f3f3a8d76ccb5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T12:50:23Z\\\",\\\"message\\\":\\\"003 12:50:23.655440 6287 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 12:50:23.655594 6287 reflector.go:311] Stopping reflector *v1.UserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 12:50:23.656112 6287 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 12:50:23.656153 6287 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1003 12:50:23.656159 6287 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 12:50:23.656170 6287 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1003 12:50:23.656176 6287 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1003 12:50:23.656198 6287 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 12:50:23.656206 6287 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1003 12:50:23.656215 6287 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 12:50:23.656221 6287 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1003 12:50:23.656228 6287 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 12:50:23.656432 6287 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950cd712bed0d7eafeb1553551ccbf80f1ade9c74ceb2b4f62833dc77d08be00\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T12:50:25Z\\\",\\\"message\\\":\\\"led to start default network controller: unable to create admin network policy 
controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:25Z is after 2025-08-24T17:21:41Z]\\\\nI1003 12:50:25.167923 6413 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-image-registry/image-registry_TCP_cluster\\\\\\\", UUID:\\\\\\\"83c1e277-3d22-42ae-a355-f7a0ff0bd171\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-image-registry/image-registry\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Gro\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa1
4c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ksp7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:25Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.519442 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:25Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.533851 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82494e0a2ec6e8ba7a2e713a8faf9e20bbacde0875e31f6db9962ece1a1a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:25Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.546801 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sdd6t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbc64268-3e78-44a2-8116-b62b5c13f005\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087c5eaff8dd2b9ccb73982ae2022d139077f3a93f750a8d7f7fe8bb4a8f643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktptf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sdd6t\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:25Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.557421 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6c913a7a10a94baddfc4f58df9f5bc15cb9e7dd4b91dd313eca9f1a683259b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d69e23034588e75e4716b87b31f129f088940d29983f634a3d1b1e15289d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:25Z is after 
2025-08-24T17:21:41Z" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.568152 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8d669985493d735cad4d4a09f36401be9ede84ee7e20e82293e7d5331c085e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:25Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.570304 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.570334 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.570345 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.570383 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.570394 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:25Z","lastTransitionTime":"2025-10-03T12:50:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.579911 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:25Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.590119 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53165c5e-33f0-44c9-b4fd-1da1c2b99a4e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86846072db8a357f62428197a22e8b8f829b013c9292a53091819ceb6eb7aeb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f016344ea9d266374fcbfbe80061313994a45b595ee3432a95b1bbd332b320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7249baacfeb01d1f25f6d9ce291033941226c4be4cba0376a47872b8b10f6ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6e876d0d7a29e1ceb505283a70c9b8fd0b4eee1dec1a2f04f3eeec4b09f4c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:25Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.600721 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:25Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.612954 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-44fmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fba23cccf497fd12c8f5ff3c93b757e30a8f5aab4ce0deca0f3ec0a49232d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"
system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-44fmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:25Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.632910 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6263076-d49e-4519-b516-d50ec2fb9d88\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb41be48a3ebc6ba936c2ec3b2a483349e8d597d0483a618a0d452a2ee13538c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b1520abe4d95d8e776c2f589c8f7ada74c46eef43f1bcaf441272ae75241a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcb83ba4c1a1eb1a63fce31b68210e9af2c5e722097b457b4642b4c32f60c2e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d334e30066203e6856129958923dcb174b5541a9e41e11a4e574ac59ec86c861\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbdc43e3dcb77ff3ee44368fa74c53d2f360a508c04525f15879db5c6b2209b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/
log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:25Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.645324 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f337dda-821a-494b-98d6-775f0c1beb7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6ff72af1d85ecca71976d9d16b8ca0ee91d035662920a8c58d739e560c12ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8eb3048afe2307009fd3ccfbc314a1b60389c0ba17bad1b898578cb1b151df8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8877bc7bae269cbbb5a1452d42917b7a32af9bc45a303b25be8b1058ef937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://329c763b93db8ac9a4cd93231684840cdfa4c6f9d1318fcfc641f4e86b4fc39e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22239bfd4bc82de5ba41662ff973e8c1e0fa462857cae705d4940afccc550782\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T12:50:06Z\\\",\\\"message\\\":\\\"W1003 12:49:55.901194 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1003 12:49:55.901561 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759495795 cert, and key in /tmp/serving-cert-3691789359/serving-signer.crt, /tmp/serving-cert-3691789359/serving-signer.key\\\\nI1003 12:49:56.093911 1 observer_polling.go:159] Starting file observer\\\\nW1003 12:49:56.097720 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1003 12:49:56.097855 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 12:49:56.098660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3691789359/tls.crt::/tmp/serving-cert-3691789359/tls.key\\\\\\\"\\\\nF1003 12:50:06.505868 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dafe1e9291df9776980832fd7d43a1d5e4caf4cd6824740f9e1f2282a6c4bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:25Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.672661 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.672870 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.672953 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.673039 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.673121 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:25Z","lastTransitionTime":"2025-10-03T12:50:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.775564 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.775832 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.775921 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.776004 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.776080 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:25Z","lastTransitionTime":"2025-10-03T12:50:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.878179 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.878210 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.878219 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.878233 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.878242 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:25Z","lastTransitionTime":"2025-10-03T12:50:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.980275 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.980316 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.980326 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.980344 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:25 crc kubenswrapper[4962]: I1003 12:50:25.980355 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:25Z","lastTransitionTime":"2025-10-03T12:50:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.004491 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kchhs"] Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.004953 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kchhs" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.007668 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.009475 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.024153 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-44fmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fba23cccf497fd12c8f5ff3c93b757e30a8f5aab4ce0deca0f3ec0a49232d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"n
ame\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\
\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-44fmz\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:26Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.040616 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6263076-d49e-4519-b516-d50ec2fb9d88\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb41be48a3ebc6ba936c2ec3b2a483349e8d597d0483a618a0d452a2ee13538c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b1520abe4d95d8e776c2f589c8f7ada74c46eef43f1bcaf441272ae75241a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcb83ba4c1a1eb1a63fce31b68210e9af2c5e722097b457b4642b4c32f60c2e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702
f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d334e30066203e6856129958923dcb174b5541a9e41e11a4e574ac59ec86c861\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbdc43e3dcb77ff3ee44368fa74c53d2f360a508c04525f15879db5c6b2209b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:26Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.054558 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f337dda-821a-494b-98d6-775f0c1beb7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6ff72af1d85ecca71976d9d16b8ca0ee91d035662920a8c58d739e560c12ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8eb3048afe2307009fd3ccfbc314a1b60389c0ba17bad1b898578cb1b151df8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8877bc7bae269cbbb5a1452d42917b7a32af9bc45a303b25be8b1058ef937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://329c763b93db8ac9a4cd93231684840cdfa4c6f9d1318fcfc641f4e86b4fc39e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22239bfd4bc82de5ba41662ff973e8c1e0fa462857cae705d4940afccc550782\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T12:50:06Z\\\",\\\"message\\\":\\\"W1003 12:49:55.901194 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1003 12:49:55.901561 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759495795 cert, and key in /tmp/serving-cert-3691789359/serving-signer.crt, /tmp/serving-cert-3691789359/serving-signer.key\\\\nI1003 12:49:56.093911 1 observer_polling.go:159] Starting file observer\\\\nW1003 12:49:56.097720 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1003 12:49:56.097855 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 12:49:56.098660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3691789359/tls.crt::/tmp/serving-cert-3691789359/tls.key\\\\\\\"\\\\nF1003 12:50:06.505868 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dafe1e9291df9776980832fd7d43a1d5e4caf4cd6824740f9e1f2282a6c4bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:26Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.065530 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g64qv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a46fb9dd-57b2-4300-b9ab-2d40bcc4cd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91103ceb967093cdbedc1aa6b2d68dd2c1cc96a1902cd0bb8f380a3d42e54497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7rc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g64qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:26Z is after 
2025-08-24T17:21:41Z" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.076610 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e40a27aa-682e-4b25-a198-8054ba9f2477\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa233ac457a03e63e2662227f70bb88c59e3ebecf41454f217278fb34ab8dc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c066a5dc6399e5df82f7425020e84a5354658a5b77b1014a74126539aa1c738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-46vck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:26Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.082042 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.082074 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.082082 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.082097 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.082107 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:26Z","lastTransitionTime":"2025-10-03T12:50:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.092428 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90186d9d-0ac4-4959-9fd8-b044098dc6ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950cd712bed0d7eafeb1553551ccbf80f1ade9c7
4ceb2b4f62833dc77d08be00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a463980f4ebcee263e55986db4af49d10d2a504b84316a79961f3f3a8d76ccb5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T12:50:23Z\\\",\\\"message\\\":\\\"003 12:50:23.655440 6287 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 12:50:23.655594 6287 reflector.go:311] Stopping reflector *v1.UserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 12:50:23.656112 6287 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 12:50:23.656153 6287 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1003 12:50:23.656159 6287 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 12:50:23.656170 6287 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1003 12:50:23.656176 6287 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1003 12:50:23.656198 6287 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 12:50:23.656206 6287 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1003 12:50:23.656215 6287 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 12:50:23.656221 6287 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1003 12:50:23.656228 6287 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 12:50:23.656432 6287 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950cd712bed0d7eafeb1553551ccbf80f1ade9c74ceb2b4f62833dc77d08be00\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T12:50:25Z\\\",\\\"message\\\":\\\"led to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:25Z is after 2025-08-24T17:21:41Z]\\\\nI1003 12:50:25.167923 6413 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-image-registry/image-registry_TCP_cluster\\\\\\\", UUID:\\\\\\\"83c1e277-3d22-42ae-a355-f7a0ff0bd171\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-image-registry/image-registry\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Gro\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\
\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ksp7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:26Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.102763 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kchhs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1198234-8682-43dc-9945-a826eba33888\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6wvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6wvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kchhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:26Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.111840 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:26Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.119449 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pnhsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edada57d-7295-4f56-a850-caf58ebe77a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f1a223717c0f2030da538233d9f7b621c4b2e304dcb65316b1aef7741f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctqjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\
\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pnhsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:26Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.130827 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82494e0a2ec6e8ba7a2e713a8faf9e20bbacde0875e31f6db9962ece1a1a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:26Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.141986 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sdd6t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbc64268-3e78-44a2-8116-b62b5c13f005\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087c5eaff8dd2b9ccb73982ae2022d139077f3a93f750a8d7f7fe8bb4a8f643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktptf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sdd6t\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:26Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.151682 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8d669985493d735cad4d4a09f36401be9ede84ee7e20e82293e7d5331c085e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:26Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.161803 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:26Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.172178 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d1198234-8682-43dc-9945-a826eba33888-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-kchhs\" (UID: \"d1198234-8682-43dc-9945-a826eba33888\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kchhs" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.172246 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6wvh\" (UniqueName: \"kubernetes.io/projected/d1198234-8682-43dc-9945-a826eba33888-kube-api-access-c6wvh\") pod \"ovnkube-control-plane-749d76644c-kchhs\" (UID: \"d1198234-8682-43dc-9945-a826eba33888\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kchhs" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.172268 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d1198234-8682-43dc-9945-a826eba33888-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-kchhs\" (UID: \"d1198234-8682-43dc-9945-a826eba33888\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kchhs" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.172325 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d1198234-8682-43dc-9945-a826eba33888-env-overrides\") pod \"ovnkube-control-plane-749d76644c-kchhs\" (UID: \"d1198234-8682-43dc-9945-a826eba33888\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kchhs" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.172460 4962 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53165c5e-33f0-44c9-b4fd-1da1c2b99a4e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86846072db8a357f62428197a22e8b8f829b013c9292a53091819ceb6eb7aeb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f016344ea9d266374fcbfbe80061313994a45b595ee3432a95b1bbd332b320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7249baacfeb01d1f25f6d9ce291033941226c4be4cba0376a47872b8b10f6ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6e876d0d7a29e1ceb505283a70c9b8fd0b4eee1dec1a2f04
f3eeec4b09f4c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:26Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.184050 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.184100 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.184113 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.184131 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.184145 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:26Z","lastTransitionTime":"2025-10-03T12:50:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.184436 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:26Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.196750 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6c913a7a10a94baddfc4f58df9f5bc15cb9e7dd4b91dd313eca9f1a683259b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d69e23034588e75e4716b87b31f129f088940d29983f634a3d1b1e15289d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:26Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.226323 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.226364 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.226330 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:50:26 crc kubenswrapper[4962]: E1003 12:50:26.226449 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 12:50:26 crc kubenswrapper[4962]: E1003 12:50:26.226559 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 12:50:26 crc kubenswrapper[4962]: E1003 12:50:26.226737 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.272867 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6wvh\" (UniqueName: \"kubernetes.io/projected/d1198234-8682-43dc-9945-a826eba33888-kube-api-access-c6wvh\") pod \"ovnkube-control-plane-749d76644c-kchhs\" (UID: \"d1198234-8682-43dc-9945-a826eba33888\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kchhs" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.272905 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d1198234-8682-43dc-9945-a826eba33888-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-kchhs\" (UID: \"d1198234-8682-43dc-9945-a826eba33888\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kchhs" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.272933 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d1198234-8682-43dc-9945-a826eba33888-env-overrides\") pod \"ovnkube-control-plane-749d76644c-kchhs\" (UID: \"d1198234-8682-43dc-9945-a826eba33888\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kchhs" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.272954 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d1198234-8682-43dc-9945-a826eba33888-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-kchhs\" (UID: \"d1198234-8682-43dc-9945-a826eba33888\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kchhs" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.273490 4962 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d1198234-8682-43dc-9945-a826eba33888-env-overrides\") pod \"ovnkube-control-plane-749d76644c-kchhs\" (UID: \"d1198234-8682-43dc-9945-a826eba33888\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kchhs" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.273714 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d1198234-8682-43dc-9945-a826eba33888-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-kchhs\" (UID: \"d1198234-8682-43dc-9945-a826eba33888\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kchhs" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.277887 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d1198234-8682-43dc-9945-a826eba33888-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-kchhs\" (UID: \"d1198234-8682-43dc-9945-a826eba33888\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kchhs" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.285959 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.286137 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.286203 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.286279 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.286344 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:26Z","lastTransitionTime":"2025-10-03T12:50:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.288308 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6wvh\" (UniqueName: \"kubernetes.io/projected/d1198234-8682-43dc-9945-a826eba33888-kube-api-access-c6wvh\") pod \"ovnkube-control-plane-749d76644c-kchhs\" (UID: \"d1198234-8682-43dc-9945-a826eba33888\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kchhs" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.318466 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kchhs" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.388325 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.388358 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.388367 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.388382 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.388392 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:26Z","lastTransitionTime":"2025-10-03T12:50:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.462289 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kchhs" event={"ID":"d1198234-8682-43dc-9945-a826eba33888","Type":"ContainerStarted","Data":"bfd0763fdbb213ffa36684c448229d46a3754f3b5000a5fd891438094aa89c23"} Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.464459 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ksp7d_90186d9d-0ac4-4959-9fd8-b044098dc6ae/ovnkube-controller/1.log" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.490121 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.490154 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.490162 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.490176 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.490188 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:26Z","lastTransitionTime":"2025-10-03T12:50:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.592332 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.592366 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.592375 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.592389 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.592399 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:26Z","lastTransitionTime":"2025-10-03T12:50:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.694834 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.694878 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.694889 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.694906 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.694916 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:26Z","lastTransitionTime":"2025-10-03T12:50:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.797249 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.797286 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.797298 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.797312 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.797324 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:26Z","lastTransitionTime":"2025-10-03T12:50:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.899709 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.899739 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.899746 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.899760 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:26 crc kubenswrapper[4962]: I1003 12:50:26.899768 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:26Z","lastTransitionTime":"2025-10-03T12:50:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.001908 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.001952 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.001962 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.001976 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.001985 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:27Z","lastTransitionTime":"2025-10-03T12:50:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.104372 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.104416 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.104426 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.104442 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.104452 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:27Z","lastTransitionTime":"2025-10-03T12:50:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.206749 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.206806 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.206815 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.206833 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.206844 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:27Z","lastTransitionTime":"2025-10-03T12:50:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.309460 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.309526 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.309544 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.309567 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.309584 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:27Z","lastTransitionTime":"2025-10-03T12:50:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.411908 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.411950 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.411962 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.411982 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.412015 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:27Z","lastTransitionTime":"2025-10-03T12:50:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.460149 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-5blzz"] Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.460873 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5blzz" Oct 03 12:50:27 crc kubenswrapper[4962]: E1003 12:50:27.460978 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5blzz" podUID="f2989e38-d4e7-42c9-8959-f87168a4ac14" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.471048 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kchhs" event={"ID":"d1198234-8682-43dc-9945-a826eba33888","Type":"ContainerStarted","Data":"2c06137bf819289f2987b5ce188c51c2c5d98a376a0343e2eb121498221019bf"} Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.471107 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kchhs" event={"ID":"d1198234-8682-43dc-9945-a826eba33888","Type":"ContainerStarted","Data":"8d869da7b2fe056fe8079fd55c78e1b02c5dbc137172a9c75514882a0a873714"} Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.491925 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f337dda-821a-494b-98d6-775f0c1beb7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6ff72af1d85ecca71976d9d16b8ca0ee91d035662920a8c58d739e560c12ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8eb3048afe23
07009fd3ccfbc314a1b60389c0ba17bad1b898578cb1b151df8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8877bc7bae269cbbb5a1452d42917b7a32af9bc45a303b25be8b1058ef937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://329c763b93db8ac9a4cd93231684840cdfa4c6f9d1318fcfc641f4e86b4fc39e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22239bfd4bc82de5ba41662ff973e8c1e0fa462857cae705d4940afccc550782\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T12:50:06Z\\\",\\\"message\\\":\\\"W1003 12:49:55.901194 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1003 12:49:55.901561 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759495795 cert, and key in /tmp/serving-cert-3691789359/serving-signer.crt, /tmp/serving-cert-3691789359/serving-signer.key\\\\nI1003 12:49:56.093911 1 observer_polling.go:159] Starting file observer\\\\nW1003 12:49:56.097720 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1003 12:49:56.097855 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 12:49:56.098660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3691789359/tls.crt::/tmp/serving-cert-3691789359/tls.key\\\\\\\"\\\\nF1003 12:50:06.505868 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dafe1e9291df9776980832fd7d43a1d5e4caf4cd6824740f9e1f2282a6c4bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:27Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.514566 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.514608 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.514619 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.514657 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.514670 4962 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:27Z","lastTransitionTime":"2025-10-03T12:50:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.516788 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-44fmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fba23cccf497fd12c8f5ff3c93b757e30a8f5aab4ce0deca0f3ec0a49232d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",
\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"co
ntainerID\\\":\\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-44fmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:27Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.534292 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6263076-d49e-4519-b516-d50ec2fb9d88\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb41be48a3ebc6ba936c2ec3b2a483349e8d597d0483a618a0d452a2ee13538c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b1520abe4d95d8e776c2f589c8f7ada74c46eef43f1bcaf441272ae75241a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcb83ba4c1a1eb1a63fce31b68210e9af2c5e722097b457b4642b4c32f60c2e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d334e30066203e6856129958923dcb174b5541a9e41e11a4e574ac59ec86c861\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbdc43e3dcb77ff3ee44368fa74c53d2f360a508c04525f15879db5c6b2209b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:27Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.546489 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:27Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.555024 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pnhsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edada57d-7295-4f56-a850-caf58ebe77a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f1a223717c0f2030da538233d9f7b621c4b2e304dcb65316b1aef7741f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctqjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pnhsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:27Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.563265 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g64qv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a46fb9dd-57b2-4300-b9ab-2d40bcc4cd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91103ceb967093cdbedc1aa6b2d68dd2c1cc96a1902cd0bb8f380a3d42e54497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7rc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g64qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:27Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.573037 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e40a27aa-682e-4b25-a198-8054ba9f2477\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa233ac457a03e63e2662227f70bb88c59e3ebecf41454f217278fb34ab8dc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c066a5dc6399e5df82f7425020e84a5354658a5b77b1014a74126539aa1c738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-46vck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:27Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.587440 4962 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90186d9d-0ac4-4959-9fd8-b044098dc6ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950cd712bed0d7eafeb1553551ccbf80f1ade9c74ceb2b4f62833dc77d08be00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a463980f4ebcee263e55986db4af49d10d2a504b84316a79961f3f3a8d76ccb5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T12:50:23Z\\\",\\\"message\\\":\\\"003 12:50:23.655440 6287 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 12:50:23.655594 6287 reflector.go:311] Stopping reflector *v1.UserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 12:50:23.656112 6287 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 12:50:23.656153 6287 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1003 12:50:23.656159 6287 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 12:50:23.656170 6287 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1003 12:50:23.656176 6287 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1003 12:50:23.656198 6287 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 12:50:23.656206 6287 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1003 12:50:23.656215 6287 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 12:50:23.656221 6287 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1003 12:50:23.656228 6287 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 12:50:23.656432 6287 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950cd712bed0d7eafeb1553551ccbf80f1ade9c74ceb2b4f62833dc77d08be00\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T12:50:25Z\\\",\\\"message\\\":\\\"led to start default network controller: unable to create admin network policy 
controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:25Z is after 2025-08-24T17:21:41Z]\\\\nI1003 12:50:25.167923 6413 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-image-registry/image-registry_TCP_cluster\\\\\\\", UUID:\\\\\\\"83c1e277-3d22-42ae-a355-f7a0ff0bd171\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-image-registry/image-registry\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Gro\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa1
4c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ksp7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:27Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.587582 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f2989e38-d4e7-42c9-8959-f87168a4ac14-metrics-certs\") pod \"network-metrics-daemon-5blzz\" (UID: \"f2989e38-d4e7-42c9-8959-f87168a4ac14\") " pod="openshift-multus/network-metrics-daemon-5blzz" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.587802 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrzmw\" (UniqueName: \"kubernetes.io/projected/f2989e38-d4e7-42c9-8959-f87168a4ac14-kube-api-access-qrzmw\") 
pod \"network-metrics-daemon-5blzz\" (UID: \"f2989e38-d4e7-42c9-8959-f87168a4ac14\") " pod="openshift-multus/network-metrics-daemon-5blzz" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.596279 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kchhs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1198234-8682-43dc-9945-a826eba33888\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6wvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6wvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kchhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:27Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.605732 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sdd6t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbc64268-3e78-44a2-8116-b62b5c13f005\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087c5eaff8dd2b9ccb73982ae2022d139077f3a93f750a8d7f7fe8bb4a8f643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktptf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sdd6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:27Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.613565 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5blzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2989e38-d4e7-42c9-8959-f87168a4ac14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrzmw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrzmw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:27Z\\\"}}\" for pod 
\"openshift-multus\"/\"network-metrics-daemon-5blzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:27Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.616543 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.616586 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.616598 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.616616 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.616628 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:27Z","lastTransitionTime":"2025-10-03T12:50:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.624415 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82494e0a2ec6e8ba7a2e713a8faf9e20bbacde0875e31f6db9962ece1a1a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:27Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.633752 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:27Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.643996 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6c913a7a10a94baddfc4f58df9f5bc15cb9e7dd4b91dd313eca9f1a683259b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d69e23034588e75e4716b87b31f129f088940d29983f634a3d1b1e15289d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:27Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.654412 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8d669985493d735cad4d4a09f36401be9ede84ee7e20e82293e7d5331c085e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:27Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.663568 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:27Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.677567 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53165c5e-33f0-44c9-b4fd-1da1c2b99a4e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86846072db8a357f62428197a22e8b8f829b013c9292a53091819ceb6eb7aeb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f016344ea9d266374fcbfbe80061313994a45b595ee3432a95b1bbd332b320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7249baacfeb01d1f25f6d9ce291033941226c4be4cba0376a47872b8b10f6ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6e876d0d7a29e1ceb505283a70c9b8fd0b4eee1dec1a2f04f3eeec4b09f4c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:27Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.688951 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrzmw\" (UniqueName: \"kubernetes.io/projected/f2989e38-d4e7-42c9-8959-f87168a4ac14-kube-api-access-qrzmw\") pod \"network-metrics-daemon-5blzz\" (UID: \"f2989e38-d4e7-42c9-8959-f87168a4ac14\") " pod="openshift-multus/network-metrics-daemon-5blzz" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.689012 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f2989e38-d4e7-42c9-8959-f87168a4ac14-metrics-certs\") pod \"network-metrics-daemon-5blzz\" (UID: \"f2989e38-d4e7-42c9-8959-f87168a4ac14\") " pod="openshift-multus/network-metrics-daemon-5blzz" Oct 03 12:50:27 crc kubenswrapper[4962]: E1003 12:50:27.689140 4962 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 12:50:27 crc kubenswrapper[4962]: E1003 12:50:27.689211 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2989e38-d4e7-42c9-8959-f87168a4ac14-metrics-certs podName:f2989e38-d4e7-42c9-8959-f87168a4ac14 nodeName:}" failed. No retries permitted until 2025-10-03 12:50:28.189191417 +0000 UTC m=+36.593089252 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f2989e38-d4e7-42c9-8959-f87168a4ac14-metrics-certs") pod "network-metrics-daemon-5blzz" (UID: "f2989e38-d4e7-42c9-8959-f87168a4ac14") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.695015 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6263076-d49e-4519-b516-d50ec2fb9d88\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb41be48a3ebc6ba936c2ec3b2a483349e8d597d0483a618a0d452a2ee13538c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b1520abe4d95d8e776c2f589c8f7ada74c46eef43f1bcaf441272ae75241a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcb83ba4c1a1eb1a63fce31b68210e9af2c5e722097b457b4642b4c32f60c2e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f5840
8f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d334e30066203e6856129958923dcb174b5541a9e41e11a4e574ac59ec86c861\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbdc43e3dcb77ff3ee44368fa74c53d2f360a508c04525f15879db5c6b2209b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\
\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:27Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.704010 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrzmw\" (UniqueName: \"kubernetes.io/projected/f2989e38-d4e7-42c9-8959-f87168a4ac14-kube-api-access-qrzmw\") pod \"network-metrics-daemon-5blzz\" (UID: \"f2989e38-d4e7-42c9-8959-f87168a4ac14\") " pod="openshift-multus/network-metrics-daemon-5blzz" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.708233 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f337dda-821a-494b-98d6-775f0c1beb7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6ff72af1d85ecca71976d9d16b8ca0ee91d035662920a8c58d739e560c12ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8eb3048afe2307009fd3ccfbc314a1b60389c0ba17bad1b898578cb1b151df8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8877bc7bae269cbbb5a1452d42917b7a32af9bc45a303b25be8b1058ef937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://329c763b93db8ac9a4cd93231684840cdfa4c6f9d1318fcfc641f4e86b4fc39e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22239bfd4bc82de5ba41662ff973e8c1e0fa462857cae705d4940afccc550782\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T12:50:06Z\\\",\\\"message\\\":\\\"W1003 12:49:55.901194 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1003 12:49:55.901561 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759495795 cert, and key in /tmp/serving-cert-3691789359/serving-signer.crt, /tmp/serving-cert-3691789359/serving-signer.key\\\\nI1003 12:49:56.093911 1 observer_polling.go:159] Starting file observer\\\\nW1003 12:49:56.097720 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1003 12:49:56.097855 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 12:49:56.098660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3691789359/tls.crt::/tmp/serving-cert-3691789359/tls.key\\\\\\\"\\\\nF1003 12:50:06.505868 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dafe1e9291df9776980832fd7d43a1d5e4caf4cd6824740f9e1f2282a6c4bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:27Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.718535 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.718585 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.718597 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.718618 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.718658 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:27Z","lastTransitionTime":"2025-10-03T12:50:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.723018 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-44fmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fba23cccf497fd12c8f5ff3c93b757e30a8f5aab4ce0deca0f3ec0a49232d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-44fmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:27Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.734278 4962 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:27Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.742918 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pnhsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edada57d-7295-4f56-a850-caf58ebe77a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f1a223717c0f2030da538233d9f7b621c4b2e304dcb65316b1aef7741f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctqjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pnhsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:27Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.751624 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g64qv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a46fb9dd-57b2-4300-b9ab-2d40bcc4cd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91103ceb967093cdbedc1aa6b2d68dd2c1cc96a1902cd0bb8f380a3d42e54497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7rc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g64qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:27Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.760909 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e40a27aa-682e-4b25-a198-8054ba9f2477\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa233ac457a03e63e2662227f70bb88c59e3ebecf41454f217278fb34ab8dc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c066a5dc6399e5df82f7425020e84a5354658a5b77b1014a74126539aa1c738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-46vck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:27Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.776566 4962 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90186d9d-0ac4-4959-9fd8-b044098dc6ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950cd712bed0d7eafeb1553551ccbf80f1ade9c74ceb2b4f62833dc77d08be00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a463980f4ebcee263e55986db4af49d10d2a504b84316a79961f3f3a8d76ccb5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T12:50:23Z\\\",\\\"message\\\":\\\"003 12:50:23.655440 6287 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 12:50:23.655594 6287 reflector.go:311] Stopping reflector *v1.UserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 12:50:23.656112 6287 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 12:50:23.656153 6287 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1003 12:50:23.656159 6287 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 12:50:23.656170 6287 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1003 12:50:23.656176 6287 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1003 12:50:23.656198 6287 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 12:50:23.656206 6287 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1003 12:50:23.656215 6287 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 12:50:23.656221 6287 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1003 12:50:23.656228 6287 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 12:50:23.656432 6287 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950cd712bed0d7eafeb1553551ccbf80f1ade9c74ceb2b4f62833dc77d08be00\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T12:50:25Z\\\",\\\"message\\\":\\\"led to start default network controller: unable to create admin network policy 
controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:25Z is after 2025-08-24T17:21:41Z]\\\\nI1003 12:50:25.167923 6413 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-image-registry/image-registry_TCP_cluster\\\\\\\", UUID:\\\\\\\"83c1e277-3d22-42ae-a355-f7a0ff0bd171\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-image-registry/image-registry\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Gro\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa1
4c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ksp7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:27Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.787051 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kchhs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1198234-8682-43dc-9945-a826eba33888\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d869da7b2fe056fe8079fd55c78e1b02c5dbc137172a9c75514882a0a873714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6wvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c06137bf819289f2987b5ce188c51c2c5d98a376a0343e2eb121498221019bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6wvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kchhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:27Z is after 2025-08-24T17:21:41Z" Oct 03 
12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.789937 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:50:27 crc kubenswrapper[4962]: E1003 12:50:27.790173 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:50:43.790142936 +0000 UTC m=+52.194040781 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.798481 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82494e0a2ec6e8ba7a2e713a8faf9e20bbacde0875e31f6db9962ece1a1a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:27Z is after 
2025-08-24T17:21:41Z" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.808616 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sdd6t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbc64268-3e78-44a2-8116-b62b5c13f005\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087c5eaff8dd2b9ccb73982ae2022d139077f3a93f750a8d7f7fe8bb4a8f643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktptf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"pod
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sdd6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:27Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.817507 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5blzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2989e38-d4e7-42c9-8959-f87168a4ac14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrzmw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrzmw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:27Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5blzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:27Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.820659 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.820694 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.820703 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.820717 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.820728 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:27Z","lastTransitionTime":"2025-10-03T12:50:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.828451 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53165c5e-33f0-44c9-b4fd-1da1c2b99a4e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86846072db8a357f62428197a22e8b8f829b013c9292a53091819ceb6eb7aeb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f016344ea9d266374fcbfbe80061313994a45b595ee3432a95b1bbd332b320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7249baacfeb01d1f25f6d9ce291033941226c4be4cba0376a47872b8b10f6ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6e876d0d7a29e1ceb505283a70c9b8fd0b4eee1dec1a2f04f3eeec4b09f4c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:27Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.839128 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:27Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.849782 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6c913a7a10a94baddfc4f58df9f5bc15cb9e7dd4b91dd313eca9f1a683259b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d69e23034588e75e4716b87b31f129f088940d29983f634a3d1b1e15289d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:27Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.861582 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8d669985493d735cad4d4a09f36401be9ede84ee7e20e82293e7d5331c085e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:27Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.873846 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:27Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.891424 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.891489 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.891586 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:50:27 crc kubenswrapper[4962]: E1003 12:50:27.891610 4962 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 12:50:27 crc kubenswrapper[4962]: E1003 12:50:27.891693 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 12:50:27 crc kubenswrapper[4962]: E1003 12:50:27.891705 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.891610 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:50:27 crc 
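
Both status_manager.go failures above embed the rejected strategic-merge patch as a Go-quoted string inside err="...". To inspect one of these payloads offline, one level of quoting can be undone with strconv.Unquote and the JSON pretty-printed; a minimal Go sketch — the embedded fragment is shortened here for illustration, in practice the full escaped string from the journal entry would be pasted in:

package main

import (
    "encoding/json"
    "fmt"
    "strconv"
)

func main() {
    // Shortened fragment of an escaped patch body as it appears inside
    // err="..." above; backticks keep the backslashes literal.
    escaped := `"{\"metadata\":{\"uid\":\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\"}}"`

    // Undo one level of Go quoting to recover the raw JSON patch.
    raw, err := strconv.Unquote(escaped)
    if err != nil {
        panic(err)
    }

    var patch map[string]any
    if err := json.Unmarshal([]byte(raw), &patch); err != nil {
        panic(err)
    }
    pretty, _ := json.MarshalIndent(patch, "", "  ")
    fmt.Println(string(pretty))
}
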
kubenswrapper[4962]: E1003 12:50:27.891721 4962 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 12:50:27 crc kubenswrapper[4962]: E1003 12:50:27.891727 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 12:50:27 crc kubenswrapper[4962]: E1003 12:50:27.891916 4962 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 12:50:27 crc kubenswrapper[4962]: E1003 12:50:27.891707 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 12:50:43.891690923 +0000 UTC m=+52.295588758 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 12:50:27 crc kubenswrapper[4962]: E1003 12:50:27.891725 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 12:50:27 crc kubenswrapper[4962]: E1003 12:50:27.892032 4962 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 12:50:27 crc kubenswrapper[4962]: E1003 12:50:27.892039 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 12:50:43.892006972 +0000 UTC m=+52.295904837 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 12:50:27 crc kubenswrapper[4962]: E1003 12:50:27.892077 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 12:50:43.892061783 +0000 UTC m=+52.295959728 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 12:50:27 crc kubenswrapper[4962]: E1003 12:50:27.892099 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 12:50:43.892089334 +0000 UTC m=+52.295987259 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.923029 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.923061 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.923069 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.923083 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:27 crc kubenswrapper[4962]: I1003 12:50:27.923092 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:27Z","lastTransitionTime":"2025-10-03T12:50:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.025910 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.025950 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.025959 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.025975 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.025984 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:28Z","lastTransitionTime":"2025-10-03T12:50:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
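
Note how nestedpendingoperations.go does not retry a failed mount immediately: it stamps a "No retries permitted until" deadline, and the durationBeforeRetry values in this log (1s for a fresh failure, 16s by this point) roughly double on each consecutive failure. A minimal sketch of that backoff shape, assuming a 1s initial delay and an arbitrary cap — the real kubelet logic in nestedpendingoperations.go differs in detail:

package main

import (
    "fmt"
    "time"
)

// nextDelay doubles the previous retry delay up to an assumed cap,
// mirroring the 1s -> 2s -> 4s -> 8s -> 16s progression implied by the
// durationBeforeRetry values above.
func nextDelay(prev time.Duration) time.Duration {
    const (
        initial  = 1 * time.Second // assumed initial delay
        maxDelay = 2 * time.Minute // assumed cap, for illustration only
    )
    if prev == 0 {
        return initial
    }
    if next := prev * 2; next < maxDelay {
        return next
    }
    return maxDelay
}

func main() {
    var d time.Duration
    for attempt := 1; attempt <= 6; attempt++ {
        d = nextDelay(d)
        fmt.Printf("attempt %d: no retries permitted for %s\n", attempt, d)
    }
}
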
Has your network provider started?"} Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.128548 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.128595 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.128604 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.128618 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.128627 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:28Z","lastTransitionTime":"2025-10-03T12:50:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.194560 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f2989e38-d4e7-42c9-8959-f87168a4ac14-metrics-certs\") pod \"network-metrics-daemon-5blzz\" (UID: \"f2989e38-d4e7-42c9-8959-f87168a4ac14\") " pod="openshift-multus/network-metrics-daemon-5blzz" Oct 03 12:50:28 crc kubenswrapper[4962]: E1003 12:50:28.194714 4962 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 12:50:28 crc kubenswrapper[4962]: E1003 12:50:28.194777 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2989e38-d4e7-42c9-8959-f87168a4ac14-metrics-certs podName:f2989e38-d4e7-42c9-8959-f87168a4ac14 nodeName:}" failed. No retries permitted until 2025-10-03 12:50:29.194758747 +0000 UTC m=+37.598656582 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f2989e38-d4e7-42c9-8959-f87168a4ac14-metrics-certs") pod "network-metrics-daemon-5blzz" (UID: "f2989e38-d4e7-42c9-8959-f87168a4ac14") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.227204 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.227248 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:50:28 crc kubenswrapper[4962]: E1003 12:50:28.227339 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 12:50:28 crc kubenswrapper[4962]: E1003 12:50:28.227479 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.227200 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:50:28 crc kubenswrapper[4962]: E1003 12:50:28.227713 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.232772 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.232815 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.232826 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.232842 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.232854 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:28Z","lastTransitionTime":"2025-10-03T12:50:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.335863 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.335904 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.335914 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.335930 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.335941 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:28Z","lastTransitionTime":"2025-10-03T12:50:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.438179 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.438469 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.438551 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.438660 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.438763 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:28Z","lastTransitionTime":"2025-10-03T12:50:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.541206 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.541240 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.541255 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.541270 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.541280 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:28Z","lastTransitionTime":"2025-10-03T12:50:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.644067 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.644104 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.644112 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.644125 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.644134 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:28Z","lastTransitionTime":"2025-10-03T12:50:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.669064 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.669111 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.669124 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.669146 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.669159 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:28Z","lastTransitionTime":"2025-10-03T12:50:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
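
The "Node became not ready" block from setters.go repeats roughly every 100 ms because the kubelet's fast startup status loop re-evaluates the Ready condition on each pass while the runtime network stays down. A local stand-in for the condition shape printed above, just enough to reproduce the log's JSON — not the real k8s.io/api type:

package main

import (
    "encoding/json"
    "fmt"
    "time"
)

// nodeCondition mirrors the fields visible in the setters.go output above.
type nodeCondition struct {
    Type               string `json:"type"`
    Status             string `json:"status"`
    LastHeartbeatTime  string `json:"lastHeartbeatTime"`
    LastTransitionTime string `json:"lastTransitionTime"`
    Reason             string `json:"reason"`
    Message            string `json:"message"`
}

func main() {
    now := time.Now().UTC().Format(time.RFC3339)
    cond := nodeCondition{
        Type:               "Ready",
        Status:             "False",
        LastHeartbeatTime:  now,
        LastTransitionTime: now,
        Reason:             "KubeletNotReady",
        Message:            "container runtime network not ready: NetworkReady=false ...",
    }
    out, _ := json.Marshal(cond)
    fmt.Printf("Node became not ready: condition=%s\n", out)
}
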
Has your network provider started?"} Oct 03 12:50:28 crc kubenswrapper[4962]: E1003 12:50:28.686433 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f831723-ac1f-49ed-8733-e30832d406d9\\\",\\\"systemUUID\\\":\\\"16e8121c-ac81-46e8-9c72-10e496aaa780\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:28Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.691378 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.691420 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
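
The node-status patch above fails for the same reason as the earlier pod patches: the network-node-identity webhook's serving certificate expired on 2025-08-24, and the kubelet's client-side TLS verification rejects it. The "current time ... is after ..." wording comes from Go's crypto/x509 verifier when the clock falls outside the certificate's NotBefore/NotAfter window; a sketch of that check against a PEM file — the path here is hypothetical:

package main

import (
    "crypto/x509"
    "encoding/pem"
    "fmt"
    "os"
    "time"
)

func main() {
    // Hypothetical path; on the node the relevant certificate is the one
    // served by the webhook on 127.0.0.1:9743.
    data, err := os.ReadFile("/tmp/webhook-serving-cert.pem")
    if err != nil {
        panic(err)
    }
    block, _ := pem.Decode(data)
    if block == nil {
        panic("no PEM block found")
    }
    cert, err := x509.ParseCertificate(block.Bytes)
    if err != nil {
        panic(err)
    }

    now := time.Now()
    switch {
    case now.Before(cert.NotBefore):
        fmt.Printf("not yet valid: current time %s is before %s\n",
            now.Format(time.RFC3339), cert.NotBefore.Format(time.RFC3339))
    case now.After(cert.NotAfter):
        fmt.Printf("expired: current time %s is after %s\n",
            now.Format(time.RFC3339), cert.NotAfter.Format(time.RFC3339))
    default:
        fmt.Println("certificate is within its validity window")
    }
}
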
event="NodeHasNoDiskPressure" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.691431 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.691447 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.691458 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:28Z","lastTransitionTime":"2025-10-03T12:50:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:28 crc kubenswrapper[4962]: E1003 12:50:28.703966 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f831723-ac1f-49ed-8733-e30832d406d9\\\",\\\"systemUUID\\\":\\\"16e8121c-ac81-46e8-9c72-10e496aaa780\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:28Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.708187 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.708212 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
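
kubelet_node_status.go makes a bounded number of back-to-back attempts per sync (five in current kubelet sources) before giving up until the next period, which is why the same large patch and the same webhook error appear several times in a row here. A minimal sketch of that retry loop:

package main

import (
    "errors"
    "fmt"
)

// updateWithRetry re-invokes update up to attempts times, returning the
// last error if none succeed — the shape behind the repeated
// "Error updating node status, will retry" entries above.
func updateWithRetry(update func() error, attempts int) error {
    var err error
    for i := 0; i < attempts; i++ {
        if err = update(); err == nil {
            return nil
        }
    }
    return fmt.Errorf("update node status exceeded retry count: %w", err)
}

func main() {
    webhookDown := errors.New(`failed calling webhook "node.network-node-identity.openshift.io"`)
    fmt.Println(updateWithRetry(func() error { return webhookDown }, 5))
}
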
event="NodeHasNoDiskPressure" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.708220 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.708233 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.708242 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:28Z","lastTransitionTime":"2025-10-03T12:50:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:28 crc kubenswrapper[4962]: E1003 12:50:28.720315 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f831723-ac1f-49ed-8733-e30832d406d9\\\",\\\"systemUUID\\\":\\\"16e8121c-ac81-46e8-9c72-10e496aaa780\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:28Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.723628 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.723666 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.723675 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.723686 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.723694 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:28Z","lastTransitionTime":"2025-10-03T12:50:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:28 crc kubenswrapper[4962]: E1003 12:50:28.734928 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f831723-ac1f-49ed-8733-e30832d406d9\\\",\\\"systemUUID\\\":\\\"16e8121c-ac81-46e8-9c72-10e496aaa780\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:28Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.738922 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.738964 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.738977 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.738994 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.739006 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:28Z","lastTransitionTime":"2025-10-03T12:50:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:28 crc kubenswrapper[4962]: E1003 12:50:28.751268 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f831723-ac1f-49ed-8733-e30832d406d9\\\",\\\"systemUUID\\\":\\\"16e8121c-ac81-46e8-9c72-10e496aaa780\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:28Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:28 crc kubenswrapper[4962]: E1003 12:50:28.751543 4962 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.753373 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.753408 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.753419 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.753434 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.753445 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:28Z","lastTransitionTime":"2025-10-03T12:50:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.855671 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.855705 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.855714 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.855729 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.855738 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:28Z","lastTransitionTime":"2025-10-03T12:50:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.957945 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.958203 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.958268 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.958338 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:28 crc kubenswrapper[4962]: I1003 12:50:28.958407 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:28Z","lastTransitionTime":"2025-10-03T12:50:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.061576 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.061861 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.061979 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.062107 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.062233 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:29Z","lastTransitionTime":"2025-10-03T12:50:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.164110 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.164156 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.164171 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.164192 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.164207 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:29Z","lastTransitionTime":"2025-10-03T12:50:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.204597 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f2989e38-d4e7-42c9-8959-f87168a4ac14-metrics-certs\") pod \"network-metrics-daemon-5blzz\" (UID: \"f2989e38-d4e7-42c9-8959-f87168a4ac14\") " pod="openshift-multus/network-metrics-daemon-5blzz" Oct 03 12:50:29 crc kubenswrapper[4962]: E1003 12:50:29.204761 4962 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 12:50:29 crc kubenswrapper[4962]: E1003 12:50:29.204835 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2989e38-d4e7-42c9-8959-f87168a4ac14-metrics-certs podName:f2989e38-d4e7-42c9-8959-f87168a4ac14 nodeName:}" failed. No retries permitted until 2025-10-03 12:50:31.204816894 +0000 UTC m=+39.608714729 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f2989e38-d4e7-42c9-8959-f87168a4ac14-metrics-certs") pod "network-metrics-daemon-5blzz" (UID: "f2989e38-d4e7-42c9-8959-f87168a4ac14") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.226436 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5blzz" Oct 03 12:50:29 crc kubenswrapper[4962]: E1003 12:50:29.226553 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5blzz" podUID="f2989e38-d4e7-42c9-8959-f87168a4ac14" Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.266350 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.266388 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.266399 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.266416 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.266426 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:29Z","lastTransitionTime":"2025-10-03T12:50:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.368400 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.368442 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.368451 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.368465 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.368474 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:29Z","lastTransitionTime":"2025-10-03T12:50:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.471181 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.471238 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.471249 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.471263 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.471271 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:29Z","lastTransitionTime":"2025-10-03T12:50:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.573459 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.573500 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.573509 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.573524 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.573533 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:29Z","lastTransitionTime":"2025-10-03T12:50:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.676173 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.676224 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.676240 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.676261 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.676276 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:29Z","lastTransitionTime":"2025-10-03T12:50:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.778782 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.778833 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.778851 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.778873 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.778889 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:29Z","lastTransitionTime":"2025-10-03T12:50:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.881989 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.882025 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.882034 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.882054 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.882064 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:29Z","lastTransitionTime":"2025-10-03T12:50:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.984772 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.984816 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.984825 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.984839 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:29 crc kubenswrapper[4962]: I1003 12:50:29.984848 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:29Z","lastTransitionTime":"2025-10-03T12:50:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:30 crc kubenswrapper[4962]: I1003 12:50:30.086978 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:30 crc kubenswrapper[4962]: I1003 12:50:30.087049 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:30 crc kubenswrapper[4962]: I1003 12:50:30.087065 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:30 crc kubenswrapper[4962]: I1003 12:50:30.087086 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:30 crc kubenswrapper[4962]: I1003 12:50:30.087100 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:30Z","lastTransitionTime":"2025-10-03T12:50:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:30 crc kubenswrapper[4962]: I1003 12:50:30.189567 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:30 crc kubenswrapper[4962]: I1003 12:50:30.189662 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:30 crc kubenswrapper[4962]: I1003 12:50:30.189677 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:30 crc kubenswrapper[4962]: I1003 12:50:30.189697 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:30 crc kubenswrapper[4962]: I1003 12:50:30.189707 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:30Z","lastTransitionTime":"2025-10-03T12:50:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:30 crc kubenswrapper[4962]: I1003 12:50:30.226375 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:50:30 crc kubenswrapper[4962]: I1003 12:50:30.226450 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:50:30 crc kubenswrapper[4962]: E1003 12:50:30.226494 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 12:50:30 crc kubenswrapper[4962]: I1003 12:50:30.226535 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:50:30 crc kubenswrapper[4962]: E1003 12:50:30.226740 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 12:50:30 crc kubenswrapper[4962]: E1003 12:50:30.226677 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 12:50:30 crc kubenswrapper[4962]: I1003 12:50:30.292326 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:30 crc kubenswrapper[4962]: I1003 12:50:30.292376 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:30 crc kubenswrapper[4962]: I1003 12:50:30.292388 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:30 crc kubenswrapper[4962]: I1003 12:50:30.292405 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:30 crc kubenswrapper[4962]: I1003 12:50:30.292419 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:30Z","lastTransitionTime":"2025-10-03T12:50:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:30 crc kubenswrapper[4962]: I1003 12:50:30.399029 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:30 crc kubenswrapper[4962]: I1003 12:50:30.399079 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:30 crc kubenswrapper[4962]: I1003 12:50:30.399089 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:30 crc kubenswrapper[4962]: I1003 12:50:30.399104 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:30 crc kubenswrapper[4962]: I1003 12:50:30.399117 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:30Z","lastTransitionTime":"2025-10-03T12:50:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:30 crc kubenswrapper[4962]: I1003 12:50:30.500969 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:30 crc kubenswrapper[4962]: I1003 12:50:30.501002 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:30 crc kubenswrapper[4962]: I1003 12:50:30.501009 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:30 crc kubenswrapper[4962]: I1003 12:50:30.501023 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:30 crc kubenswrapper[4962]: I1003 12:50:30.501031 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:30Z","lastTransitionTime":"2025-10-03T12:50:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:30 crc kubenswrapper[4962]: I1003 12:50:30.603676 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:30 crc kubenswrapper[4962]: I1003 12:50:30.603910 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:30 crc kubenswrapper[4962]: I1003 12:50:30.603944 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:30 crc kubenswrapper[4962]: I1003 12:50:30.603977 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:30 crc kubenswrapper[4962]: I1003 12:50:30.604002 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:30Z","lastTransitionTime":"2025-10-03T12:50:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:30 crc kubenswrapper[4962]: I1003 12:50:30.706430 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:30 crc kubenswrapper[4962]: I1003 12:50:30.706469 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:30 crc kubenswrapper[4962]: I1003 12:50:30.706482 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:30 crc kubenswrapper[4962]: I1003 12:50:30.706498 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:30 crc kubenswrapper[4962]: I1003 12:50:30.706508 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:30Z","lastTransitionTime":"2025-10-03T12:50:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:30 crc kubenswrapper[4962]: I1003 12:50:30.808957 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:30 crc kubenswrapper[4962]: I1003 12:50:30.809006 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:30 crc kubenswrapper[4962]: I1003 12:50:30.809019 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:30 crc kubenswrapper[4962]: I1003 12:50:30.809036 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:30 crc kubenswrapper[4962]: I1003 12:50:30.809047 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:30Z","lastTransitionTime":"2025-10-03T12:50:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:30 crc kubenswrapper[4962]: I1003 12:50:30.911682 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:30 crc kubenswrapper[4962]: I1003 12:50:30.911722 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:30 crc kubenswrapper[4962]: I1003 12:50:30.911731 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:30 crc kubenswrapper[4962]: I1003 12:50:30.911745 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:30 crc kubenswrapper[4962]: I1003 12:50:30.911757 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:30Z","lastTransitionTime":"2025-10-03T12:50:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.014045 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.014091 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.014105 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.014126 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.014144 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:31Z","lastTransitionTime":"2025-10-03T12:50:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.116935 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.116991 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.117014 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.117064 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.117091 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:31Z","lastTransitionTime":"2025-10-03T12:50:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.219311 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.219344 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.219357 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.219372 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.219383 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:31Z","lastTransitionTime":"2025-10-03T12:50:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.222854 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f2989e38-d4e7-42c9-8959-f87168a4ac14-metrics-certs\") pod \"network-metrics-daemon-5blzz\" (UID: \"f2989e38-d4e7-42c9-8959-f87168a4ac14\") " pod="openshift-multus/network-metrics-daemon-5blzz" Oct 03 12:50:31 crc kubenswrapper[4962]: E1003 12:50:31.222956 4962 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 12:50:31 crc kubenswrapper[4962]: E1003 12:50:31.223014 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2989e38-d4e7-42c9-8959-f87168a4ac14-metrics-certs podName:f2989e38-d4e7-42c9-8959-f87168a4ac14 nodeName:}" failed. No retries permitted until 2025-10-03 12:50:35.222998005 +0000 UTC m=+43.626895840 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f2989e38-d4e7-42c9-8959-f87168a4ac14-metrics-certs") pod "network-metrics-daemon-5blzz" (UID: "f2989e38-d4e7-42c9-8959-f87168a4ac14") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.226949 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5blzz" Oct 03 12:50:31 crc kubenswrapper[4962]: E1003 12:50:31.227091 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5blzz" podUID="f2989e38-d4e7-42c9-8959-f87168a4ac14" Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.321402 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.321465 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.321486 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.321512 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.321529 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:31Z","lastTransitionTime":"2025-10-03T12:50:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.424030 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.424069 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.424077 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.424091 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.424100 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:31Z","lastTransitionTime":"2025-10-03T12:50:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.527758 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.527822 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.527844 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.527871 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.527891 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:31Z","lastTransitionTime":"2025-10-03T12:50:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.629770 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.629818 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.629834 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.629854 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.629865 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:31Z","lastTransitionTime":"2025-10-03T12:50:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.732529 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.732560 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.732571 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.732589 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.732601 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:31Z","lastTransitionTime":"2025-10-03T12:50:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.835115 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.835154 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.835164 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.835177 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.835186 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:31Z","lastTransitionTime":"2025-10-03T12:50:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.937726 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.937767 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.937779 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.937795 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:31 crc kubenswrapper[4962]: I1003 12:50:31.937807 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:31Z","lastTransitionTime":"2025-10-03T12:50:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.039947 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.039987 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.039998 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.040017 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.040028 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:32Z","lastTransitionTime":"2025-10-03T12:50:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.142407 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.142473 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.142486 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.142500 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.142510 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:32Z","lastTransitionTime":"2025-10-03T12:50:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.226743 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.226782 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:50:32 crc kubenswrapper[4962]: E1003 12:50:32.226869 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.226899 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:50:32 crc kubenswrapper[4962]: E1003 12:50:32.227069 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 12:50:32 crc kubenswrapper[4962]: E1003 12:50:32.227142 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.245503 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.245827 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.246005 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.246157 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.246297 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:32Z","lastTransitionTime":"2025-10-03T12:50:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.247187 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6c913a7a10a94baddfc4f58df9f5bc15cb9e7dd4b91dd313eca9f1a683259b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d69e23034588e75e4716b87b31f129f088940d29983f634a3d1b1e15289d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:32Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.259180 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8d669985493d735cad4d4a09f36401be9ede84ee7e20e82293e7d5331c085e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:32Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.275244 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:32Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.289393 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53165c5e-33f0-44c9-b4fd-1da1c2b99a4e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86846072db8a357f62428197a22e8b8f829b013c9292a53091819ceb6eb7aeb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f016344ea9d266374fcbfbe80061313994a45b595ee3432a95b1bbd332b320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7249baacfeb01d1f25f6d9ce291033941226c4be4cba0376a47872b8b10f6ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6e876d0d7a29e1ceb505283a70c9b8fd0b4eee1dec1a2f04f3eeec4b09f4c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:32Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.301123 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:32Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.315955 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-44fmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fba23cccf497fd12c8f5ff3c93b757e30a8f5aab4ce0deca0f3ec0a49232d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"
system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-44fmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:32Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.340376 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6263076-d49e-4519-b516-d50ec2fb9d88\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb41be48a3ebc6ba936c2ec3b2a483349e8d597d0483a618a0d452a2ee13538c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b1520abe4d95d8e776c2f589c8f7ada74c46eef43f1bcaf441272ae75241a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcb83ba4c1a1eb1a63fce31b68210e9af2c5e722097b457b4642b4c32f60c2e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d334e30066203e6856129958923dcb174b5541a9e41e11a4e574ac59ec86c861\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbdc43e3dcb77ff3ee44368fa74c53d2f360a508c04525f15879db5c6b2209b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/
log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:32Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.351427 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.351488 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.351500 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.351516 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.351527 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:32Z","lastTransitionTime":"2025-10-03T12:50:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.355142 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f337dda-821a-494b-98d6-775f0c1beb7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6ff72af1d85ecca71976d9d16b8ca0ee91d035662920a8c58d739e560c12ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8eb3048afe2307009fd3ccfbc314a1b60389c0ba17bad1b898578cb1b151df8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8877bc7bae269cbbb5a1452d42917b7a32af9bc45a303b25be8b1058ef937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\
":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://329c763b93db8ac9a4cd93231684840cdfa4c6f9d1318fcfc641f4e86b4fc39e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22239bfd4bc82de5ba41662ff973e8c1e0fa462857cae705d4940afccc550782\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T12:50:06Z\\\",\\\"message\\\":\\\"W1003 12:49:55.901194 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1003 12:49:55.901561 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759495795 cert, and key in /tmp/serving-cert-3691789359/serving-signer.crt, /tmp/serving-cert-3691789359/serving-signer.key\\\\nI1003 12:49:56.093911 1 observer_polling.go:159] Starting file observer\\\\nW1003 12:49:56.097720 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1003 12:49:56.097855 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 12:49:56.098660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3691789359/tls.crt::/tmp/serving-cert-3691789359/tls.key\\\\\\\"\\\\nF1003 12:50:06.505868 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dafe1e9291df9776980832fd7d43a1d5e4caf4cd6824740f9e1f2282a6c4bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:32Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.368283 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pnhsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edada57d-7295-4f56-a850-caf58ebe77a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f1a223717c0f2030da538233d9f7b621c4b2e304dcb65316b1aef7741f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctqjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pnhsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:32Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.378366 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g64qv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a46fb9dd-57b2-4300-b9ab-2d40bcc4cd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91103ceb967093cdbedc1aa6b2d68dd2c1cc96a1902cd0bb8f380a3d42e54497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7rc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g64qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:32Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.389385 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e40a27aa-682e-4b25-a198-8054ba9f2477\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa233ac457a03e63e2662227f70bb88c59e3ebecf41454f217278fb34ab8dc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c066a5dc6399e5df82f7425020e84a5354658a5b77b1014a74126539aa1c738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-46vck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:32Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.409219 4962 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90186d9d-0ac4-4959-9fd8-b044098dc6ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950cd712bed0d7eafeb1553551ccbf80f1ade9c74ceb2b4f62833dc77d08be00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a463980f4ebcee263e55986db4af49d10d2a504b84316a79961f3f3a8d76ccb5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T12:50:23Z\\\",\\\"message\\\":\\\"003 12:50:23.655440 6287 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 12:50:23.655594 6287 reflector.go:311] Stopping reflector *v1.UserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 12:50:23.656112 6287 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 12:50:23.656153 6287 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1003 12:50:23.656159 6287 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 12:50:23.656170 6287 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1003 12:50:23.656176 6287 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1003 12:50:23.656198 6287 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 12:50:23.656206 6287 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1003 12:50:23.656215 6287 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 12:50:23.656221 6287 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1003 12:50:23.656228 6287 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 12:50:23.656432 6287 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950cd712bed0d7eafeb1553551ccbf80f1ade9c74ceb2b4f62833dc77d08be00\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T12:50:25Z\\\",\\\"message\\\":\\\"led to start default network controller: unable to create admin network policy 
controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:25Z is after 2025-08-24T17:21:41Z]\\\\nI1003 12:50:25.167923 6413 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-image-registry/image-registry_TCP_cluster\\\\\\\", UUID:\\\\\\\"83c1e277-3d22-42ae-a355-f7a0ff0bd171\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-image-registry/image-registry\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Gro\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa1
4c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ksp7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:32Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.423687 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kchhs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1198234-8682-43dc-9945-a826eba33888\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d869da7b2fe056fe8079fd55c78e1b02c5dbc137172a9c75514882a0a873714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6wvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c06137bf819289f2987b5ce188c51c2c5d98a376a0343e2eb121498221019bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6wvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kchhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:32Z is after 2025-08-24T17:21:41Z" Oct 03 
12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.437367 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:32Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.448429 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5blzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2989e38-d4e7-42c9-8959-f87168a4ac14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrzmw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrzmw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:27Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5blzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:32Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.454014 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.454045 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.454056 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.454072 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.454083 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:32Z","lastTransitionTime":"2025-10-03T12:50:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.462792 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82494e0a2ec6e8ba7a2e713a8faf9e20bbacde0875e31f6db9962ece1a1a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:32Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.475291 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sdd6t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbc64268-3e78-44a2-8116-b62b5c13f005\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087c5eaff8dd2b9ccb73982ae2022d139077f3a93f750a8d7f7fe8bb4a8f643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktptf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sdd6t\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:32Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.555678 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.555720 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.555741 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.555753 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.555762 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:32Z","lastTransitionTime":"2025-10-03T12:50:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.657856 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.657909 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.657923 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.657938 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.657948 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:32Z","lastTransitionTime":"2025-10-03T12:50:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.760189 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.760253 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.760264 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.760280 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.760291 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:32Z","lastTransitionTime":"2025-10-03T12:50:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.863301 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.863342 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.863358 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.863374 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.863386 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:32Z","lastTransitionTime":"2025-10-03T12:50:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.965572 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.965609 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.965620 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.965664 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:32 crc kubenswrapper[4962]: I1003 12:50:32.965681 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:32Z","lastTransitionTime":"2025-10-03T12:50:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.069365 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.069444 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.069458 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.069479 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.069492 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:33Z","lastTransitionTime":"2025-10-03T12:50:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.173530 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.173580 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.173589 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.173605 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.173615 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:33Z","lastTransitionTime":"2025-10-03T12:50:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.226449 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5blzz"
Oct 03 12:50:33 crc kubenswrapper[4962]: E1003 12:50:33.226679 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5blzz" podUID="f2989e38-d4e7-42c9-8959-f87168a4ac14"
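
The NodeNotReady condition itself comes from the CNI runtime check: kubelet finds no CNI configuration under /etc/kubernetes/cni/net.d/, so sandbox creation is skipped ("No sandbox for pod can be found") until the network plugin writes its config. A quick way to watch for that from the node (assumptions: shell access on the node and, for the second command, a configured crictl):

    # kubelet polls this directory; a *.conf or *.conflist file should appear
    # once the network plugin comes up
    ls -l /etc/kubernetes/cni/net.d/
    # the container runtime's view of the same network readiness condition
    crictl info
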
pod="openshift-multus/network-metrics-daemon-5blzz" podUID="f2989e38-d4e7-42c9-8959-f87168a4ac14" Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.276284 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.276330 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.276338 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.276359 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.276370 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:33Z","lastTransitionTime":"2025-10-03T12:50:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.378960 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.378994 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.379002 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.379014 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.379024 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:33Z","lastTransitionTime":"2025-10-03T12:50:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.481175 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.481211 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.481220 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.481235 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.481245 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:33Z","lastTransitionTime":"2025-10-03T12:50:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.583926 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.583974 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.583986 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.584003 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.584015 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:33Z","lastTransitionTime":"2025-10-03T12:50:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.686274 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.686330 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.686345 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.686366 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.686380 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:33Z","lastTransitionTime":"2025-10-03T12:50:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.788718 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.788778 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.788792 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.788813 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.788827 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:33Z","lastTransitionTime":"2025-10-03T12:50:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.892063 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.892132 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.892154 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.892182 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.892205 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:33Z","lastTransitionTime":"2025-10-03T12:50:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.995065 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.995134 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.995157 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.995190 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:33 crc kubenswrapper[4962]: I1003 12:50:33.995211 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:33Z","lastTransitionTime":"2025-10-03T12:50:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:34 crc kubenswrapper[4962]: I1003 12:50:34.097740 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:34 crc kubenswrapper[4962]: I1003 12:50:34.097814 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:34 crc kubenswrapper[4962]: I1003 12:50:34.097838 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:34 crc kubenswrapper[4962]: I1003 12:50:34.097867 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:34 crc kubenswrapper[4962]: I1003 12:50:34.097889 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:34Z","lastTransitionTime":"2025-10-03T12:50:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:34 crc kubenswrapper[4962]: I1003 12:50:34.199915 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:34 crc kubenswrapper[4962]: I1003 12:50:34.199953 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:34 crc kubenswrapper[4962]: I1003 12:50:34.199963 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:34 crc kubenswrapper[4962]: I1003 12:50:34.199980 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:34 crc kubenswrapper[4962]: I1003 12:50:34.199990 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:34Z","lastTransitionTime":"2025-10-03T12:50:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:34 crc kubenswrapper[4962]: I1003 12:50:34.227049 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:50:34 crc kubenswrapper[4962]: I1003 12:50:34.227116 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:50:34 crc kubenswrapper[4962]: I1003 12:50:34.227048 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:50:34 crc kubenswrapper[4962]: E1003 12:50:34.227166 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 12:50:34 crc kubenswrapper[4962]: E1003 12:50:34.227230 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 12:50:34 crc kubenswrapper[4962]: E1003 12:50:34.227277 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 12:50:34 crc kubenswrapper[4962]: I1003 12:50:34.302801 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:34 crc kubenswrapper[4962]: I1003 12:50:34.302854 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:34 crc kubenswrapper[4962]: I1003 12:50:34.302871 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:34 crc kubenswrapper[4962]: I1003 12:50:34.302894 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:34 crc kubenswrapper[4962]: I1003 12:50:34.302915 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:34Z","lastTransitionTime":"2025-10-03T12:50:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:34 crc kubenswrapper[4962]: I1003 12:50:34.405780 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:34 crc kubenswrapper[4962]: I1003 12:50:34.405849 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:34 crc kubenswrapper[4962]: I1003 12:50:34.405869 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:34 crc kubenswrapper[4962]: I1003 12:50:34.405897 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:34 crc kubenswrapper[4962]: I1003 12:50:34.405915 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:34Z","lastTransitionTime":"2025-10-03T12:50:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Oct 03 12:50:34 crc kubenswrapper[4962]: I1003 12:50:34.508822 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:50:34 crc kubenswrapper[4962]: I1003 12:50:34.509181 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:50:34 crc kubenswrapper[4962]: I1003 12:50:34.509382 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:50:34 crc kubenswrapper[4962]: I1003 12:50:34.509567 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:50:34 crc kubenswrapper[4962]: I1003 12:50:34.509786 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:34Z","lastTransitionTime":"2025-10-03T12:50:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:50:34 crc kubenswrapper[4962]: I1003 12:50:34.612965 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:50:34 crc kubenswrapper[4962]: I1003 12:50:34.612998 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:50:34 crc kubenswrapper[4962]: I1003 12:50:34.613007 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:50:34 crc kubenswrapper[4962]: I1003 12:50:34.613022 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:50:34 crc kubenswrapper[4962]: I1003 12:50:34.613032 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:34Z","lastTransitionTime":"2025-10-03T12:50:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:50:34 crc kubenswrapper[4962]: I1003 12:50:34.715095 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:50:34 crc kubenswrapper[4962]: I1003 12:50:34.715158 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:50:34 crc kubenswrapper[4962]: I1003 12:50:34.715179 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:50:34 crc kubenswrapper[4962]: I1003 12:50:34.715201 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:50:34 crc kubenswrapper[4962]: I1003 12:50:34.715215 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:34Z","lastTransitionTime":"2025-10-03T12:50:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:50:34 crc kubenswrapper[4962]: I1003 12:50:34.816918 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:50:34 crc kubenswrapper[4962]: I1003 12:50:34.816957 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:50:34 crc kubenswrapper[4962]: I1003 12:50:34.816968 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:50:34 crc kubenswrapper[4962]: I1003 12:50:34.816987 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:50:34 crc kubenswrapper[4962]: I1003 12:50:34.816999 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:34Z","lastTransitionTime":"2025-10-03T12:50:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:50:34 crc kubenswrapper[4962]: I1003 12:50:34.919583 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:50:34 crc kubenswrapper[4962]: I1003 12:50:34.919626 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:50:34 crc kubenswrapper[4962]: I1003 12:50:34.919660 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:50:34 crc kubenswrapper[4962]: I1003 12:50:34.919679 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:50:34 crc kubenswrapper[4962]: I1003 12:50:34.919692 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:34Z","lastTransitionTime":"2025-10-03T12:50:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.022210 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.022247 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.022258 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.022274 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.022285 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:35Z","lastTransitionTime":"2025-10-03T12:50:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.124925 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.124957 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.124965 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.124980 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.124990 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:35Z","lastTransitionTime":"2025-10-03T12:50:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.226182 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5blzz"
Oct 03 12:50:35 crc kubenswrapper[4962]: E1003 12:50:35.226342 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5blzz" podUID="f2989e38-d4e7-42c9-8959-f87168a4ac14"
Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.227113 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.227146 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.227157 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.227173 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.227184 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:35Z","lastTransitionTime":"2025-10-03T12:50:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.265433 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f2989e38-d4e7-42c9-8959-f87168a4ac14-metrics-certs\") pod \"network-metrics-daemon-5blzz\" (UID: \"f2989e38-d4e7-42c9-8959-f87168a4ac14\") " pod="openshift-multus/network-metrics-daemon-5blzz"
Oct 03 12:50:35 crc kubenswrapper[4962]: E1003 12:50:35.265562 4962 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 03 12:50:35 crc kubenswrapper[4962]: E1003 12:50:35.265610 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2989e38-d4e7-42c9-8959-f87168a4ac14-metrics-certs podName:f2989e38-d4e7-42c9-8959-f87168a4ac14 nodeName:}" failed. No retries permitted until 2025-10-03 12:50:43.265596823 +0000 UTC m=+51.669494658 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f2989e38-d4e7-42c9-8959-f87168a4ac14-metrics-certs") pod "network-metrics-daemon-5blzz" (UID: "f2989e38-d4e7-42c9-8959-f87168a4ac14") : object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.329214 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.329248 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.329256 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.329268 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.329278 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:35Z","lastTransitionTime":"2025-10-03T12:50:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.432269 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.432331 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.432343 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.432360 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.432373 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:35Z","lastTransitionTime":"2025-10-03T12:50:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.534716 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.534761 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.534771 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.534789 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.534798 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:35Z","lastTransitionTime":"2025-10-03T12:50:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
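
The metrics-certs failure above is a different symptom of the same startup window: "object ... not registered" typically means the kubelet's secret manager has not yet registered and synced that object for the pod, so volume setup is requeued (next retry at 12:50:43 per the nestedpendingoperations line) rather than failed permanently. Once the control plane is reachable, the secret itself can be confirmed (names taken from the log; assumes working credentials):

    # the secret the volume plugin is waiting on
    oc -n openshift-multus get secret metrics-daemon-secret
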
Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.575597 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d"
Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.576386 4962 scope.go:117] "RemoveContainer" containerID="950cd712bed0d7eafeb1553551ccbf80f1ade9c74ceb2b4f62833dc77d08be00"
Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.596399 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-44fmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fba23cccf497fd12c8f5ff3c93b757e30a8f5aab4ce0deca0f3ec0a49232d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os
-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-44fmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:35Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.622144 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6263076-d49e-4519-b516-d50ec2fb9d88\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb41be48a3ebc6ba936c2ec3b2a483349e8d597d0483a618a0d452a2ee13538c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b1520abe4d95d8e776c2f589c8f7ada74c46eef43f1bcaf441272ae75241a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcb83ba4c1a1eb1a63fce31b68210e9af2c5e722097b457b4642b4c32f60c2e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d334e30066203e6856129958923dcb174b5541a9e41e11a4e574ac59ec86c861\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbdc43e3dcb77ff3ee44368fa74c53d2f360a508c04525f15879db5c6b2209b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:35Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.639092 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.639149 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.639164 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.639184 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.639199 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:35Z","lastTransitionTime":"2025-10-03T12:50:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.643129 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f337dda-821a-494b-98d6-775f0c1beb7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6ff72af1d85ecca71976d9d16b8ca0ee91d035662920a8c58d739e560c12ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8eb3048afe2307009fd3ccfbc314a1b60389c0ba17bad1b898578cb1b151df8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8877bc7bae269cbbb5a1452d42917b7a32af9bc45a303b25be8b1058ef937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://329c763b93db8ac9a4cd93231684840cdfa4c6f9d1318fcfc641f4e86b4fc39e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22239bfd4bc82de5ba41662ff973e8c1e0fa462857cae705d4940afccc550782\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T12:50:06Z\\\",\\\"message\\\":\\\"W1003 12:49:55.901194 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1003 12:49:55.901561 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759495795 cert, and key in /tmp/serving-cert-3691789359/serving-signer.crt, /tmp/serving-cert-3691789359/serving-signer.key\\\\nI1003 12:49:56.093911 1 observer_polling.go:159] Starting file observer\\\\nW1003 12:49:56.097720 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1003 12:49:56.097855 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 12:49:56.098660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3691789359/tls.crt::/tmp/serving-cert-3691789359/tls.key\\\\\\\"\\\\nF1003 12:50:06.505868 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dafe1e9291df9776980832fd7d43a1d5e4caf4cd6824740f9e1f2282a6c4bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:35Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.655125 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pnhsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edada57d-7295-4f56-a850-caf58ebe77a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f1a223717c0f2030da538233d9f7b621c4b2e304dcb65316b1aef7741f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctqjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pnhsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:35Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.664754 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g64qv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a46fb9dd-57b2-4300-b9ab-2d40bcc4cd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91103ceb967093cdbedc1aa6b2d68dd2c1cc96a1902cd0bb8f380a3d42e54497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7rc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g64qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:35Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.676066 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e40a27aa-682e-4b25-a198-8054ba9f2477\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa233ac457a03e63e2662227f70bb88c59e3ebecf41454f217278fb34ab8dc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c066a5dc6399e5df82f7425020e84a5354658a5b77b1014a74126539aa1c738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-46vck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:35Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.694194 4962 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90186d9d-0ac4-4959-9fd8-b044098dc6ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950cd712bed0d7eafeb1553551ccbf80f1ade9c74ceb2b4f62833dc77d08be00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950cd712bed0d7eafeb1553551ccbf80f1ade9c74ceb2b4f62833dc77d08be00\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T12:50:25Z\\\",\\\"message\\\":\\\"led to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:25Z is after 2025-08-24T17:21:41Z]\\\\nI1003 12:50:25.167923 6413 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-image-registry/image-registry_TCP_cluster\\\\\\\", UUID:\\\\\\\"83c1e277-3d22-42ae-a355-f7a0ff0bd171\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-image-registry/image-registry\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Gro\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ksp7d_openshift-ovn-kubernetes(90186d9d-0ac4-4959-9fd8-b044098dc6ae)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ksp7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:35Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.707019 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kchhs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1198234-8682-43dc-9945-a826eba33888\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d869da7b2fe056fe8079fd55c78e1b02c5dbc137172a9c75514882a0a873714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6wvh
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c06137bf819289f2987b5ce188c51c2c5d98a376a0343e2eb121498221019bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6wvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kchhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:35Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.720130 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:35Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.730937 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5blzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2989e38-d4e7-42c9-8959-f87168a4ac14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrzmw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrzmw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:27Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5blzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:35Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.741318 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.741388 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.741398 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.741413 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.741424 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:35Z","lastTransitionTime":"2025-10-03T12:50:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.744182 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82494e0a2ec6e8ba7a2e713a8faf9e20bbacde0875e31f6db9962ece1a1a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:35Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.759696 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sdd6t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbc64268-3e78-44a2-8116-b62b5c13f005\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087c5eaff8dd2b9ccb73982ae2022d139077f3a93f750a8d7f7fe8bb4a8f643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktptf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sdd6t\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:35Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.773623 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6c913a7a10a94baddfc4f58df9f5bc15cb9e7dd4b91dd313eca9f1a683259b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d69e23034588e75e4716b87b31f129f088940d29983f634a3d1b1e15289d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:35Z is after 
2025-08-24T17:21:41Z" Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.784021 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8d669985493d735cad4d4a09f36401be9ede84ee7e20e82293e7d5331c085e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:35Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.797069 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:35Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.809318 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53165c5e-33f0-44c9-b4fd-1da1c2b99a4e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86846072db8a357f62428197a22e8b8f829b013c9292a53091819ceb6eb7aeb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f016344ea9d266374fcbfbe80061313994a45b595ee3432a95b1bbd332b320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7249baacfeb01d1f25f6d9ce291033941226c4be4cba0376a47872b8b10f6ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6e876d0d7a29e1ceb505283a70c9b8fd0b4eee1dec1a2f04f3eeec4b09f4c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:35Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.819914 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:35Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.843738 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.843774 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.843784 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.843796 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.843806 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:35Z","lastTransitionTime":"2025-10-03T12:50:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.945750 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.945829 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.945842 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.945876 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:35 crc kubenswrapper[4962]: I1003 12:50:35.945892 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:35Z","lastTransitionTime":"2025-10-03T12:50:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.048037 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.048079 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.048090 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.048106 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.048117 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:36Z","lastTransitionTime":"2025-10-03T12:50:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.150495 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.150532 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.150541 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.150557 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.150568 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:36Z","lastTransitionTime":"2025-10-03T12:50:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.226633 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.226711 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.226733 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:50:36 crc kubenswrapper[4962]: E1003 12:50:36.227014 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 12:50:36 crc kubenswrapper[4962]: E1003 12:50:36.227117 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 12:50:36 crc kubenswrapper[4962]: E1003 12:50:36.227199 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.252759 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.252801 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.252810 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.252827 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.252837 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:36Z","lastTransitionTime":"2025-10-03T12:50:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.355786 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.355825 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.355836 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.355851 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.355864 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:36Z","lastTransitionTime":"2025-10-03T12:50:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.458323 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.458361 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.458371 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.458384 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.458392 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:36Z","lastTransitionTime":"2025-10-03T12:50:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.497286 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ksp7d_90186d9d-0ac4-4959-9fd8-b044098dc6ae/ovnkube-controller/2.log" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.498122 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ksp7d_90186d9d-0ac4-4959-9fd8-b044098dc6ae/ovnkube-controller/1.log" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.500549 4962 generic.go:334] "Generic (PLEG): container finished" podID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerID="8bdd4ae057f0e89cf8aeff9645035e668883b7bb9ce4d6d083a3e64f6f7fc12a" exitCode=1 Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.500577 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" event={"ID":"90186d9d-0ac4-4959-9fd8-b044098dc6ae","Type":"ContainerDied","Data":"8bdd4ae057f0e89cf8aeff9645035e668883b7bb9ce4d6d083a3e64f6f7fc12a"} Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.500607 4962 scope.go:117] "RemoveContainer" containerID="950cd712bed0d7eafeb1553551ccbf80f1ade9c74ceb2b4f62833dc77d08be00" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.501330 4962 scope.go:117] "RemoveContainer" containerID="8bdd4ae057f0e89cf8aeff9645035e668883b7bb9ce4d6d083a3e64f6f7fc12a" Oct 03 12:50:36 crc kubenswrapper[4962]: E1003 12:50:36.501518 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-ksp7d_openshift-ovn-kubernetes(90186d9d-0ac4-4959-9fd8-b044098dc6ae)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae"
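The sequence just above is the crash loop at the root of the repeating NodeNotReady condition: ovnkube-controller, the container that writes the OVN-Kubernetes CNI config, exits with code 1 and goes into back-off, so /etc/kubernetes/cni/net.d/ stays empty and kubelet keeps surfacing "no CNI configuration file". A minimal sketch of that readiness condition, assuming only that a CNI config counts as present when any *.conf, *.conflist, or *.json file exists in the conf dir named in the log (an approximation of the libcni directory scan, not the actual kubelet code):

```go
// cniready.go - rough sketch of the "no CNI configuration file" condition
// logged above. Assumption: a CNI config counts as present if any *.conf,
// *.conflist, or *.json file exists in the conf dir named in the log.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	confDir := "/etc/kubernetes/cni/net.d" // directory named in the kubelet message
	var found []string
	for _, pattern := range []string{"*.conf", "*.conflist", "*.json"} {
		matches, err := filepath.Glob(filepath.Join(confDir, pattern))
		if err != nil {
			fmt.Fprintln(os.Stderr, "glob:", err)
			os.Exit(1)
		}
		found = append(found, matches...)
	}
	if len(found) == 0 {
		// The state kubelet keeps reporting as NetworkReady=false:
		fmt.Printf("no CNI configuration file in %s\n", confDir)
		os.Exit(1)
	}
	fmt.Println("CNI config present:", found)
}
```

Until ovnkube-controller stays up long enough to write that file, the node's Ready condition will keep reporting False.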
Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.511289 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kchhs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1198234-8682-43dc-9945-a826eba33888\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d869da7b2fe056fe8079fd55c78e1b02c5dbc137172a9c75514882a0a873714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6wvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c06137bf819289f2987b5ce188c51c2c5d98a376a0343e2eb121498221019bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6wvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kchhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:36Z is after 2025-08-24T17:21:41Z"
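Every status patch in this window is rejected by the same admission webhook: the serving certificate behind https://127.0.0.1:9743/ expired on 2025-08-24T17:21:41Z while the node clock reads 2025-10-03, so each Post fails TLS verification before the patch is even evaluated. A minimal sketch for confirming the certificate's validity window from the node (InsecureSkipVerify is deliberate: verification is exactly what fails here, and the goal is only to read the dates):

```go
// certwindow.go - sketch: dial the webhook endpoint named in the log and
// print the validity window of the serving certificate it presents.
package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	addr := "127.0.0.1:9743" // webhook endpoint from the log
	// Skip verification so an expired certificate does not abort the
	// handshake; we only want to inspect the dates, not trust the peer.
	conn, err := tls.Dial("tcp", addr, &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		log.Fatalf("dial %s: %v", addr, err)
	}
	defer conn.Close()

	cert := conn.ConnectionState().PeerCertificates[0]
	fmt.Printf("subject:   %s\n", cert.Subject)
	fmt.Printf("notBefore: %s\n", cert.NotBefore.Format(time.RFC3339))
	fmt.Printf("notAfter:  %s\n", cert.NotAfter.Format(time.RFC3339))
	// When time.Now() is past NotAfter, Go's verifier produces the error
	// seen in the log: "x509: certificate has expired or is not yet valid".
	fmt.Printf("expired:   %v\n", time.Now().After(cert.NotAfter))
}
```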
Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.522839 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:36Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.532230 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pnhsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edada57d-7295-4f56-a850-caf58ebe77a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f1a223717c0f2030da538233d9f7b621c4b2e304dcb65316b1aef7741f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctqjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pnhsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:36Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.540948 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g64qv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a46fb9dd-57b2-4300-b9ab-2d40bcc4cd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91103ceb967093cdbedc1aa6b2d68dd2c1cc96a1902cd0bb8f380a3d42e54497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7rc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g64qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:36Z is after 2025-08-24T17:21:41Z"
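The patch bodies in these entries are hard to read because the journal shows them double-escaped: the kubelet quotes the patch once inside its structured err="..." field, so the inner JSON quotes appear as \\\" sequences. To inspect one offline, undo one level of Go quoting and pretty-print the result; a small sketch with an abridged sample (only the uid and phase are real values from the node-resolver entry above, the remaining fields are elided):

```go
// patchdump.go - sketch: recover a readable status patch from the escaped
// form seen in the journal. The sample below is abridged; only uid and
// phase are taken from the surrounding entries.
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"log"
	"strconv"
)

func main() {
	// The patch as it appears after extracting the err="..." value: a
	// Go-quoted string whose contents are the JSON document.
	escaped := `"{\"metadata\":{\"uid\":\"a46fb9dd-57b2-4300-b9ab-2d40bcc4cd49\"},\"status\":{\"phase\":\"Running\"}}"`

	raw, err := strconv.Unquote(escaped) // strips quotes, resolves \" escapes
	if err != nil {
		log.Fatal(err)
	}
	var pretty bytes.Buffer
	if err := json.Indent(&pretty, []byte(raw), "", "  "); err != nil {
		log.Fatal(err)
	}
	fmt.Println(pretty.String())
}
```

The $setElementOrder/conditions key in the full payloads is a strategic-merge-patch directive: it pins the ordering of the conditions list while only the changed entries are sent.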
Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.550021 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e40a27aa-682e-4b25-a198-8054ba9f2477\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa233ac457a03e63e2662227f70bb88c59e3ebecf41454f217278fb34ab8dc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c066a5dc6399e5df82f7425020e84a5354658a5b77b1014a74126539aa1c738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-46vck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:36Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.560716 4962 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.560754 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.560764 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.560784 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.560795 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:36Z","lastTransitionTime":"2025-10-03T12:50:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.566166 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90186d9d-0ac4-4959-9fd8-b044098dc6ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bdd4ae057f0e89cf8aeff9645035e668883b7bb
9ce4d6d083a3e64f6f7fc12a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950cd712bed0d7eafeb1553551ccbf80f1ade9c74ceb2b4f62833dc77d08be00\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T12:50:25Z\\\",\\\"message\\\":\\\"led to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:25Z is after 2025-08-24T17:21:41Z]\\\\nI1003 12:50:25.167923 6413 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-image-registry/image-registry_TCP_cluster\\\\\\\", UUID:\\\\\\\"83c1e277-3d22-42ae-a355-f7a0ff0bd171\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-image-registry/image-registry\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Gro\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bdd4ae057f0e89cf8aeff9645035e668883b7bb9ce4d6d083a3e64f6f7fc12a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T12:50:36Z\\\",\\\"message\\\":\\\"val\\\\nI1003 12:50:36.331444 6633 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 12:50:36.331482 6633 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1003 12:50:36.331508 6633 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1003 12:50:36.331546 6633 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1003 12:50:36.331576 6633 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 12:50:36.331603 6633 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1003 12:50:36.331630 6633 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 12:50:36.332846 6633 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 12:50:36.332881 6633 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1003 12:50:36.332887 6633 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1003 12:50:36.332916 6633 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1003 12:50:36.332927 6633 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 12:50:36.332923 6633 
handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1003 12:50:36.332968 6633 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 12:50:36.332969 6633 factory.go:656] Stopping watch factory\\\\nI1003 12:50:36.332975 6633 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.
168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ksp7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:36Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.579316 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82494e0a2ec6e8ba7a2e713a8faf9e20bbacde0875e31f6db9962ece1a1a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:36Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.593604 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sdd6t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbc64268-3e78-44a2-8116-b62b5c13f005\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087c5eaff8dd2b9ccb73982ae2022d139077f3a93f750a8d7f7fe8bb4a8f643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktptf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sdd6t\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:36Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.602797 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5blzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2989e38-d4e7-42c9-8959-f87168a4ac14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrzmw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrzmw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:27Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5blzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:36Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:36 crc 
kubenswrapper[4962]: I1003 12:50:36.613885 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53165c5e-33f0-44c9-b4fd-1da1c2b99a4e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86846072db8a357f62428197a22e8b8f829b013c9292a53091819ceb6eb7aeb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f016344ea9d266374fcbfbe80061313994a45b595ee3432a95b1bbd332b320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7249baacfeb01d1f25f6d9ce291033941226c4be4cba0376a47872b8b10f6ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\
":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6e876d0d7a29e1ceb505283a70c9b8fd0b4eee1dec1a2f04f3eeec4b09f4c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:36Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.623687 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:36Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.636135 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6c913a7a10a94baddfc4f58df9f5bc15cb9e7dd4b91dd313eca9f1a683259b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d69e23034588e75e4716b87b31f129f088940d29983f634a3d1b1e15289d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:36Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.646983 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8d669985493d735cad4d4a09f36401be9ede84ee7e20e82293e7d5331c085e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:36Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.658859 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:36Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.662670 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.662724 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.662734 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.662751 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.662760 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:36Z","lastTransitionTime":"2025-10-03T12:50:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.680614 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6263076-d49e-4519-b516-d50ec2fb9d88\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb41be48a3ebc6ba936c2ec3b2a483349e8d597d0483a618a0d452a2ee13538c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b1520abe4d95d8e776c2f589c8f7ada74c46eef43f1bcaf441272ae75241a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcb83ba4c1a1eb1a63fce31b68210e9af2c5e722097b457b4642b4c32f60c2e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d334e30066203e6856129958923dcb174b5541a9e41e11a4e574ac59ec86c861\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbdc43e3dcb77ff3ee44368fa74c53d2f360a508c04525f15879db5c6b2209b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:36Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.694766 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f337dda-821a-494b-98d6-775f0c1beb7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6ff72af1d85ecca71976d9d16b8ca0ee91d035662920a8c58d739e560c12ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8eb3048afe2307009fd3ccfbc314a1b60389c0ba17bad1b898578cb1b151df8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8877bc7bae269cbbb5a1452d42917b7a32af9bc45a303b25be8b1058ef937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://329c763b93db8ac9a4cd93231684840cdfa4c6f9d1318fcfc641f4e86b4fc39e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22239bfd4bc82de5ba41662ff973e8c1e0fa462857cae705d4940afccc550782\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T12:50:06Z\\\",\\\"message\\\":\\\"W1003 12:49:55.901194 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1003 12:49:55.901561 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759495795 cert, and key in /tmp/serving-cert-3691789359/serving-signer.crt, /tmp/serving-cert-3691789359/serving-signer.key\\\\nI1003 12:49:56.093911 1 observer_polling.go:159] Starting file observer\\\\nW1003 12:49:56.097720 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1003 12:49:56.097855 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 12:49:56.098660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3691789359/tls.crt::/tmp/serving-cert-3691789359/tls.key\\\\\\\"\\\\nF1003 12:50:06.505868 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dafe1e9291df9776980832fd7d43a1d5e4caf4cd6824740f9e1f2282a6c4bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:36Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.707362 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-44fmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fba23cccf497fd12c8f5ff3c93b757e30a8f5aab4ce0deca0f3ec0a49232d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\
\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-44fmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:36Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.765232 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.765281 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.765294 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.765311 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.765322 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:36Z","lastTransitionTime":"2025-10-03T12:50:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.868029 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.868073 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.868084 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.868102 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.868114 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:36Z","lastTransitionTime":"2025-10-03T12:50:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.970165 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.970223 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.970237 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.970255 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:36 crc kubenswrapper[4962]: I1003 12:50:36.970266 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:36Z","lastTransitionTime":"2025-10-03T12:50:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:37 crc kubenswrapper[4962]: I1003 12:50:37.072552 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:37 crc kubenswrapper[4962]: I1003 12:50:37.072877 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:37 crc kubenswrapper[4962]: I1003 12:50:37.072952 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:37 crc kubenswrapper[4962]: I1003 12:50:37.073054 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:37 crc kubenswrapper[4962]: I1003 12:50:37.073145 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:37Z","lastTransitionTime":"2025-10-03T12:50:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:37 crc kubenswrapper[4962]: I1003 12:50:37.176254 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:37 crc kubenswrapper[4962]: I1003 12:50:37.176527 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:37 crc kubenswrapper[4962]: I1003 12:50:37.176619 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:37 crc kubenswrapper[4962]: I1003 12:50:37.176754 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:37 crc kubenswrapper[4962]: I1003 12:50:37.176832 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:37Z","lastTransitionTime":"2025-10-03T12:50:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:37 crc kubenswrapper[4962]: I1003 12:50:37.226575 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5blzz" Oct 03 12:50:37 crc kubenswrapper[4962]: E1003 12:50:37.226739 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5blzz" podUID="f2989e38-d4e7-42c9-8959-f87168a4ac14" Oct 03 12:50:37 crc kubenswrapper[4962]: I1003 12:50:37.278445 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:37 crc kubenswrapper[4962]: I1003 12:50:37.278494 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:37 crc kubenswrapper[4962]: I1003 12:50:37.278510 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:37 crc kubenswrapper[4962]: I1003 12:50:37.278530 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:37 crc kubenswrapper[4962]: I1003 12:50:37.278548 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:37Z","lastTransitionTime":"2025-10-03T12:50:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:37 crc kubenswrapper[4962]: I1003 12:50:37.381591 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:37 crc kubenswrapper[4962]: I1003 12:50:37.381701 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:37 crc kubenswrapper[4962]: I1003 12:50:37.381727 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:37 crc kubenswrapper[4962]: I1003 12:50:37.381763 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:37 crc kubenswrapper[4962]: I1003 12:50:37.381788 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:37Z","lastTransitionTime":"2025-10-03T12:50:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 03 12:50:37 crc kubenswrapper[4962]: I1003 12:50:37.484525 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:50:37 crc kubenswrapper[4962]: I1003 12:50:37.484788 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:50:37 crc kubenswrapper[4962]: I1003 12:50:37.484854 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:50:37 crc kubenswrapper[4962]: I1003 12:50:37.484943 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:50:37 crc kubenswrapper[4962]: I1003 12:50:37.485018 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:37Z","lastTransitionTime":"2025-10-03T12:50:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:50:37 crc kubenswrapper[4962]: I1003 12:50:37.504701 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ksp7d_90186d9d-0ac4-4959-9fd8-b044098dc6ae/ovnkube-controller/2.log"
[The five-entry status cycle above (NodeHasSufficientMemory, NodeHasNoDiskPressure, NodeHasSufficientPID, NodeNotReady, "Node became not ready") repeats with identical content at roughly 100 ms intervals at 12:50:37.587, .689, .792, .894, .997, 12:50:38.100 and 12:50:38.202; repetitions omitted.]
Oct 03 12:50:38 crc kubenswrapper[4962]: I1003 12:50:38.226485 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 03 12:50:38 crc kubenswrapper[4962]: I1003 12:50:38.226549 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 03 12:50:38 crc kubenswrapper[4962]: I1003 12:50:38.226549 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 03 12:50:38 crc kubenswrapper[4962]: E1003 12:50:38.227004 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 03 12:50:38 crc kubenswrapper[4962]: E1003 12:50:38.227047 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 03 12:50:38 crc kubenswrapper[4962]: E1003 12:50:38.226874 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
[The status cycle repeats at 12:50:38.305, .408, .510, .612, .715, .817, .919 and .935; repetitions omitted.]
Oct 03 12:50:38 crc kubenswrapper[4962]: E1003 12:50:38.947229 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f831723-ac1f-49ed-8733-e30832d406d9\\\",\\\"systemUUID\\\":\\\"16e8121c-ac81-46e8-9c72-10e496aaa780\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:38Z is after 2025-08-24T17:21:41Z"
[The status cycle repeats at 12:50:38.950; repetitions omitted.]
event="NodeHasNoDiskPressure" Oct 03 12:50:38 crc kubenswrapper[4962]: I1003 12:50:38.950381 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:38 crc kubenswrapper[4962]: I1003 12:50:38.950397 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:38 crc kubenswrapper[4962]: I1003 12:50:38.950408 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:38Z","lastTransitionTime":"2025-10-03T12:50:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:38 crc kubenswrapper[4962]: E1003 12:50:38.975099 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f831723-ac1f-49ed-8733-e30832d406d9\\\",\\\"systemUUID\\\":\\\"16e8121c-ac81-46e8-9c72-10e496aaa780\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:38Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:38 crc kubenswrapper[4962]: I1003 12:50:38.979496 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:38 crc kubenswrapper[4962]: I1003 12:50:38.979526 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 12:50:38 crc kubenswrapper[4962]: I1003 12:50:38.979535 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:38 crc kubenswrapper[4962]: I1003 12:50:38.979550 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:38 crc kubenswrapper[4962]: I1003 12:50:38.979582 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:38Z","lastTransitionTime":"2025-10-03T12:50:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:39 crc kubenswrapper[4962]: E1003 12:50:39.010512 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f831723-ac1f-49ed-8733-e30832d406d9\\\",\\\"systemUUID\\\":\\\"16e8121c-ac81-46e8-9c72-10e496aaa780\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:39Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.013979 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.014024 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.014037 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.014056 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.014067 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:39Z","lastTransitionTime":"2025-10-03T12:50:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:39 crc kubenswrapper[4962]: E1003 12:50:39.025455 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f831723-ac1f-49ed-8733-e30832d406d9\\\",\\\"systemUUID\\\":\\\"16e8121c-ac81-46e8-9c72-10e496aaa780\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:39Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.029007 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.029042 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.029050 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.029063 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.029074 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:39Z","lastTransitionTime":"2025-10-03T12:50:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:39 crc kubenswrapper[4962]: E1003 12:50:39.040066 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f831723-ac1f-49ed-8733-e30832d406d9\\\",\\\"systemUUID\\\":\\\"16e8121c-ac81-46e8-9c72-10e496aaa780\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:39Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:39 crc kubenswrapper[4962]: E1003 12:50:39.040424 4962 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.041718 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.041828 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.041900 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.041967 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.042029 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:39Z","lastTransitionTime":"2025-10-03T12:50:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.144700 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.144744 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.144754 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.144767 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.144777 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:39Z","lastTransitionTime":"2025-10-03T12:50:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.250170 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5blzz" Oct 03 12:50:39 crc kubenswrapper[4962]: E1003 12:50:39.250389 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5blzz" podUID="f2989e38-d4e7-42c9-8959-f87168a4ac14" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.251531 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.251595 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.251612 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.251656 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.251672 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:39Z","lastTransitionTime":"2025-10-03T12:50:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.354367 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.354415 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.354426 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.354444 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.354455 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:39Z","lastTransitionTime":"2025-10-03T12:50:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.456955 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.457003 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.457016 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.457035 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.457049 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:39Z","lastTransitionTime":"2025-10-03T12:50:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.560524 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.560576 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.560591 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.560611 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.560626 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:39Z","lastTransitionTime":"2025-10-03T12:50:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.663503 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.663552 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.663560 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.663575 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.663586 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:39Z","lastTransitionTime":"2025-10-03T12:50:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.766829 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.766909 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.766922 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.766944 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.766958 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:39Z","lastTransitionTime":"2025-10-03T12:50:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.872428 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.872508 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.872524 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.872548 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.872570 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:39Z","lastTransitionTime":"2025-10-03T12:50:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.976087 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.976143 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.976159 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.976182 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:39 crc kubenswrapper[4962]: I1003 12:50:39.976200 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:39Z","lastTransitionTime":"2025-10-03T12:50:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:40 crc kubenswrapper[4962]: I1003 12:50:40.079200 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:40 crc kubenswrapper[4962]: I1003 12:50:40.079516 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:40 crc kubenswrapper[4962]: I1003 12:50:40.079616 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:40 crc kubenswrapper[4962]: I1003 12:50:40.079743 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:40 crc kubenswrapper[4962]: I1003 12:50:40.079841 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:40Z","lastTransitionTime":"2025-10-03T12:50:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:40 crc kubenswrapper[4962]: I1003 12:50:40.181830 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:40 crc kubenswrapper[4962]: I1003 12:50:40.181869 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:40 crc kubenswrapper[4962]: I1003 12:50:40.181879 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:40 crc kubenswrapper[4962]: I1003 12:50:40.181897 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:40 crc kubenswrapper[4962]: I1003 12:50:40.181955 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:40Z","lastTransitionTime":"2025-10-03T12:50:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:40 crc kubenswrapper[4962]: I1003 12:50:40.226337 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:50:40 crc kubenswrapper[4962]: I1003 12:50:40.226375 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:50:40 crc kubenswrapper[4962]: E1003 12:50:40.226484 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 12:50:40 crc kubenswrapper[4962]: I1003 12:50:40.226535 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:50:40 crc kubenswrapper[4962]: E1003 12:50:40.226611 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 12:50:40 crc kubenswrapper[4962]: E1003 12:50:40.226690 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 12:50:40 crc kubenswrapper[4962]: I1003 12:50:40.284118 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:40 crc kubenswrapper[4962]: I1003 12:50:40.284160 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:40 crc kubenswrapper[4962]: I1003 12:50:40.284168 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:40 crc kubenswrapper[4962]: I1003 12:50:40.284181 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:40 crc kubenswrapper[4962]: I1003 12:50:40.284191 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:40Z","lastTransitionTime":"2025-10-03T12:50:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:40 crc kubenswrapper[4962]: I1003 12:50:40.386458 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:40 crc kubenswrapper[4962]: I1003 12:50:40.386500 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:40 crc kubenswrapper[4962]: I1003 12:50:40.386511 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:40 crc kubenswrapper[4962]: I1003 12:50:40.386527 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:40 crc kubenswrapper[4962]: I1003 12:50:40.386539 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:40Z","lastTransitionTime":"2025-10-03T12:50:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:40 crc kubenswrapper[4962]: I1003 12:50:40.489007 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:40 crc kubenswrapper[4962]: I1003 12:50:40.489052 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:40 crc kubenswrapper[4962]: I1003 12:50:40.489064 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:40 crc kubenswrapper[4962]: I1003 12:50:40.489081 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:40 crc kubenswrapper[4962]: I1003 12:50:40.489093 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:40Z","lastTransitionTime":"2025-10-03T12:50:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:40 crc kubenswrapper[4962]: I1003 12:50:40.591414 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:40 crc kubenswrapper[4962]: I1003 12:50:40.591453 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:40 crc kubenswrapper[4962]: I1003 12:50:40.591464 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:40 crc kubenswrapper[4962]: I1003 12:50:40.591480 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:40 crc kubenswrapper[4962]: I1003 12:50:40.591491 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:40Z","lastTransitionTime":"2025-10-03T12:50:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:40 crc kubenswrapper[4962]: I1003 12:50:40.693372 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:40 crc kubenswrapper[4962]: I1003 12:50:40.693426 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:40 crc kubenswrapper[4962]: I1003 12:50:40.693437 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:40 crc kubenswrapper[4962]: I1003 12:50:40.693452 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:40 crc kubenswrapper[4962]: I1003 12:50:40.693461 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:40Z","lastTransitionTime":"2025-10-03T12:50:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:40 crc kubenswrapper[4962]: I1003 12:50:40.795682 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:40 crc kubenswrapper[4962]: I1003 12:50:40.795714 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:40 crc kubenswrapper[4962]: I1003 12:50:40.795723 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:40 crc kubenswrapper[4962]: I1003 12:50:40.795737 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:40 crc kubenswrapper[4962]: I1003 12:50:40.795745 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:40Z","lastTransitionTime":"2025-10-03T12:50:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:40 crc kubenswrapper[4962]: I1003 12:50:40.898511 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:40 crc kubenswrapper[4962]: I1003 12:50:40.898566 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:40 crc kubenswrapper[4962]: I1003 12:50:40.898578 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:40 crc kubenswrapper[4962]: I1003 12:50:40.898596 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:40 crc kubenswrapper[4962]: I1003 12:50:40.898611 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:40Z","lastTransitionTime":"2025-10-03T12:50:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.000993 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.001039 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.001050 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.001068 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.001080 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:41Z","lastTransitionTime":"2025-10-03T12:50:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.103006 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.103047 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.103058 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.103076 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.103086 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:41Z","lastTransitionTime":"2025-10-03T12:50:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.206073 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.206120 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.206131 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.206147 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.206160 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:41Z","lastTransitionTime":"2025-10-03T12:50:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.226746 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5blzz" Oct 03 12:50:41 crc kubenswrapper[4962]: E1003 12:50:41.226904 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5blzz" podUID="f2989e38-d4e7-42c9-8959-f87168a4ac14" Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.310300 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.310335 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.310347 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.310362 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.310370 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:41Z","lastTransitionTime":"2025-10-03T12:50:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.412928 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.412978 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.412988 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.413005 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.413020 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:41Z","lastTransitionTime":"2025-10-03T12:50:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.515630 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.515692 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.515700 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.515715 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.515724 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:41Z","lastTransitionTime":"2025-10-03T12:50:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.618056 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.618093 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.618101 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.618116 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.618124 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:41Z","lastTransitionTime":"2025-10-03T12:50:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.719675 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.719892 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.719907 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.719922 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.719932 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:41Z","lastTransitionTime":"2025-10-03T12:50:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.822106 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.822157 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.822168 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.822185 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.822194 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:41Z","lastTransitionTime":"2025-10-03T12:50:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.923900 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.923956 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.923974 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.923991 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:41 crc kubenswrapper[4962]: I1003 12:50:41.924006 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:41Z","lastTransitionTime":"2025-10-03T12:50:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.025487 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.025534 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.025544 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.025558 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.025567 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:42Z","lastTransitionTime":"2025-10-03T12:50:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.127925 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.127981 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.127990 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.128004 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.128014 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:42Z","lastTransitionTime":"2025-10-03T12:50:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.226331 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.226432 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:50:42 crc kubenswrapper[4962]: E1003 12:50:42.226473 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.226335 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:50:42 crc kubenswrapper[4962]: E1003 12:50:42.226552 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 12:50:42 crc kubenswrapper[4962]: E1003 12:50:42.226684 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.232929 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.232973 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.232984 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.233006 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.233021 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:42Z","lastTransitionTime":"2025-10-03T12:50:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.240615 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:42Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.251936 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53165c5e-33f0-44c9-b4fd-1da1c2b99a4e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86846072db8a357f62428197a22e8b8f829b013c9292a53091819ceb6eb7aeb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f016344ea9d266374fcbfbe80061313994a45b595ee3432a95b1bbd332b320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7249baacfeb01d1f25f6d9ce291033941226c4be4cba0376a47872b8b10f6ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6e876d0d7a29e1ceb505283a70c9b8fd0b4eee1dec1a2f04f3eeec4b09f4c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:42Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.263423 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:42Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.273789 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6c913a7a10a94baddfc4f58df9f5bc15cb9e7dd4b91dd313eca9f1a683259b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d69e23034588e75e4716b87b31f129f088940d29983f634a3d1b1e15289d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:42Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.282972 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8d669985493d735cad4d4a09f36401be9ede84ee7e20e82293e7d5331c085e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:42Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.299028 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6263076-d49e-4519-b516-d50ec2fb9d88\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb41be48a3ebc6ba936c2ec3b2a483349e8d597d0483a618a0d452a2ee13538c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b1520abe4d95d8e776c2f589c8f7ada74c46eef43f1bcaf441272ae75241a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcb83ba4c1a1eb1a63fce31b68210e9af2c5e722097b457b4642b4c32f60c2e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d334e30066203e6856129958923dcb174b5541a
9e41e11a4e574ac59ec86c861\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbdc43e3dcb77ff3ee44368fa74c53d2f360a508c04525f15879db5c6b2209b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:42Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.309939 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f337dda-821a-494b-98d6-775f0c1beb7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6ff72af1d85ecca71976d9d16b8ca0ee91d035662920a8c58d739e560c12ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8eb3048afe2307009fd3ccfbc314a1b
60389c0ba17bad1b898578cb1b151df8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8877bc7bae269cbbb5a1452d42917b7a32af9bc45a303b25be8b1058ef937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://329c763b93db8ac9a4cd93231684840cdfa4c6f9d1318fcfc641f4e86b4fc39e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22239bfd4bc82de5ba41662ff973e8c1e0fa462857cae705d4940afccc550782\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T12:50:06Z\\\",\\\"message\\\":\\\"W1003 12:49:55.901194 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1003 12:49:55.901561 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759495795 cert, and key in /tmp/serving-cert-3691789359/serving-signer.crt, /tmp/serving-cert-3691789359/serving-signer.key\\\\nI1003 12:49:56.093911 1 observer_polling.go:159] Starting file observer\\\\nW1003 12:49:56.097720 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1003 12:49:56.097855 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 12:49:56.098660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3691789359/tls.crt::/tmp/serving-cert-3691789359/tls.key\\\\\\\"\\\\nF1003 12:50:06.505868 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dafe1e9291df9776980832fd7d43a1d5e4caf4cd6824740f9e1f2282a6c4bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:42Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.322647 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-44fmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fba23cccf497fd12c8f5ff3c93b757e30a8f5aab4ce0deca0f3ec0a49232d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-44fmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:42Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.333447 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e40a27aa-682e-4b25-a198-8054ba9f2477\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa233ac457a03e63e2662227f70bb88c59e3ebecf41454f217278fb34ab8dc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c066a5dc6399e5df82f7425020e84a5354658a5b77b1014a74126539aa1c738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-46vck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:42Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.334623 4962 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.334675 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.334693 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.334710 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.334721 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:42Z","lastTransitionTime":"2025-10-03T12:50:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.349000 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90186d9d-0ac4-4959-9fd8-b044098dc6ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bdd4ae057f0e89cf8aeff9645035e668883b7bb
9ce4d6d083a3e64f6f7fc12a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950cd712bed0d7eafeb1553551ccbf80f1ade9c74ceb2b4f62833dc77d08be00\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T12:50:25Z\\\",\\\"message\\\":\\\"led to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:25Z is after 2025-08-24T17:21:41Z]\\\\nI1003 12:50:25.167923 6413 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-image-registry/image-registry_TCP_cluster\\\\\\\", UUID:\\\\\\\"83c1e277-3d22-42ae-a355-f7a0ff0bd171\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-image-registry/image-registry\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Gro\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bdd4ae057f0e89cf8aeff9645035e668883b7bb9ce4d6d083a3e64f6f7fc12a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T12:50:36Z\\\",\\\"message\\\":\\\"val\\\\nI1003 12:50:36.331444 6633 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 12:50:36.331482 6633 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1003 12:50:36.331508 6633 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1003 12:50:36.331546 6633 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1003 12:50:36.331576 6633 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 12:50:36.331603 6633 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1003 12:50:36.331630 6633 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 12:50:36.332846 6633 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 12:50:36.332881 6633 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1003 12:50:36.332887 6633 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1003 12:50:36.332916 6633 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1003 12:50:36.332927 6633 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 12:50:36.332923 6633 
handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1003 12:50:36.332968 6633 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 12:50:36.332969 6633 factory.go:656] Stopping watch factory\\\\nI1003 12:50:36.332975 6633 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.
168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ksp7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:42Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.359534 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kchhs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1198234-8682-43dc-9945-a826eba33888\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d869da7b2fe056fe8079fd55c78e1b02c5dbc137172a9c75514882a0a873714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6wvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c06137bf819289f2987b5ce188c51c2c5d98a376a0343e2eb121498221019bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6wvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kchhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:42Z is after 2025-08-24T17:21:41Z" Oct 03 
12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.369333 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:42Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.377675 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pnhsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edada57d-7295-4f56-a850-caf58ebe77a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f1a223717c0f2030da538233d9f7b621c4b2e304dcb65316b1aef7741f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctqjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pnhsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:42Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.386115 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g64qv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a46fb9dd-57b2-4300-b9ab-2d40bcc4cd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91103ceb967093cdbedc1aa6b2d68dd2c1cc96a1902cd0bb8f380a3d42e54497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7rc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g64qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:42Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.400153 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82494e0a2ec6e8ba7a2e713a8faf9e20bbacde0875e31f6db9962ece1a1a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:42Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.411490 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sdd6t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbc64268-3e78-44a2-8116-b62b5c13f005\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087c5eaff8dd2b9ccb73982ae2022d139077f3a93f750a8d7f7fe8bb4a8f643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktptf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sdd6t\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:42Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.421367 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5blzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2989e38-d4e7-42c9-8959-f87168a4ac14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrzmw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrzmw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:27Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5blzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:42Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:42 crc 
kubenswrapper[4962]: I1003 12:50:42.437112 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.437306 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.437399 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.437489 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.437566 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:42Z","lastTransitionTime":"2025-10-03T12:50:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.539778 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.540116 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.540219 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.540304 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.540382 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:42Z","lastTransitionTime":"2025-10-03T12:50:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.642879 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.642927 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.642943 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.642976 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.642991 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:42Z","lastTransitionTime":"2025-10-03T12:50:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.744824 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.744869 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.744883 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.744899 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.744909 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:42Z","lastTransitionTime":"2025-10-03T12:50:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.847256 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.847296 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.847306 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.847323 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.847334 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:42Z","lastTransitionTime":"2025-10-03T12:50:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.949789 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.949835 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.949851 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.949874 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:42 crc kubenswrapper[4962]: I1003 12:50:42.949883 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:42Z","lastTransitionTime":"2025-10-03T12:50:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.052250 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.052300 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.052311 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.052336 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.052348 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:43Z","lastTransitionTime":"2025-10-03T12:50:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.154879 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.154916 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.154926 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.154941 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.154953 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:43Z","lastTransitionTime":"2025-10-03T12:50:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.226295 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5blzz" Oct 03 12:50:43 crc kubenswrapper[4962]: E1003 12:50:43.226436 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5blzz" podUID="f2989e38-d4e7-42c9-8959-f87168a4ac14" Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.257143 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.257189 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.257202 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.257220 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.257234 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:43Z","lastTransitionTime":"2025-10-03T12:50:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.343325 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f2989e38-d4e7-42c9-8959-f87168a4ac14-metrics-certs\") pod \"network-metrics-daemon-5blzz\" (UID: \"f2989e38-d4e7-42c9-8959-f87168a4ac14\") " pod="openshift-multus/network-metrics-daemon-5blzz" Oct 03 12:50:43 crc kubenswrapper[4962]: E1003 12:50:43.343452 4962 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 12:50:43 crc kubenswrapper[4962]: E1003 12:50:43.343502 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2989e38-d4e7-42c9-8959-f87168a4ac14-metrics-certs podName:f2989e38-d4e7-42c9-8959-f87168a4ac14 nodeName:}" failed. No retries permitted until 2025-10-03 12:50:59.343487322 +0000 UTC m=+67.747385157 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f2989e38-d4e7-42c9-8959-f87168a4ac14-metrics-certs") pod "network-metrics-daemon-5blzz" (UID: "f2989e38-d4e7-42c9-8959-f87168a4ac14") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.358780 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.358806 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.358814 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.358827 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.358843 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:43Z","lastTransitionTime":"2025-10-03T12:50:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.461311 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.461350 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.461361 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.461375 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.461385 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:43Z","lastTransitionTime":"2025-10-03T12:50:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.563286 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.563326 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.563336 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.563352 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.563363 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:43Z","lastTransitionTime":"2025-10-03T12:50:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.665664 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.665712 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.665723 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.665738 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.665748 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:43Z","lastTransitionTime":"2025-10-03T12:50:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.768049 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.768085 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.768094 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.768108 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.768118 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:43Z","lastTransitionTime":"2025-10-03T12:50:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.849184 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:50:43 crc kubenswrapper[4962]: E1003 12:50:43.849370 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:15.849331388 +0000 UTC m=+84.253229253 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.871124 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.871344 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.871365 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.871399 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.871423 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:43Z","lastTransitionTime":"2025-10-03T12:50:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.951139 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.951268 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.951335 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.951425 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:50:43 crc kubenswrapper[4962]: E1003 12:50:43.951460 4962 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 12:50:43 crc kubenswrapper[4962]: E1003 12:50:43.951556 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 12:50:43 crc kubenswrapper[4962]: E1003 12:50:43.951581 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 12:51:15.951551494 +0000 UTC m=+84.355449349 (durationBeforeRetry 32s). 
Oct 03 12:50:43 crc kubenswrapper[4962]: E1003 12:50:43.951581 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 12:51:15.951551494 +0000 UTC m=+84.355449349 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Oct 03 12:50:43 crc kubenswrapper[4962]: E1003 12:50:43.951594 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 03 12:50:43 crc kubenswrapper[4962]: E1003 12:50:43.951598 4962 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Oct 03 12:50:43 crc kubenswrapper[4962]: E1003 12:50:43.951609 4962 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 03 12:50:43 crc kubenswrapper[4962]: E1003 12:50:43.951560 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 03 12:50:43 crc kubenswrapper[4962]: E1003 12:50:43.951733 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 12:51:15.951712139 +0000 UTC m=+84.355609974 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 03 12:50:43 crc kubenswrapper[4962]: E1003 12:50:43.951738 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 03 12:50:43 crc kubenswrapper[4962]: E1003 12:50:43.951758 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 12:51:15.9517484 +0000 UTC m=+84.355646465 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Oct 03 12:50:43 crc kubenswrapper[4962]: E1003 12:50:43.951772 4962 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 03 12:50:43 crc kubenswrapper[4962]: E1003 12:50:43.951869 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 12:51:15.951834282 +0000 UTC m=+84.355732157 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
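Editor's note: the "object ... not registered" wording above typically means the kubelet's own configmap/secret manager has not (re)registered those objects for the pod after the restart; it does not necessarily mean they are missing server-side. A hedged client-go sketch to check the server side directly; the kubeconfig path is an assumption, and the object names are copied from the errors above:

package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Kubeconfig location is an assumption; adjust for the target host.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/var/lib/kubelet/kubeconfig")
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	// ConfigMaps named in the MountVolume.SetUp failures.
	for _, ref := range []struct{ ns, name string }{
		{"openshift-network-console", "networking-console-plugin"},
		{"openshift-network-diagnostics", "kube-root-ca.crt"},
		{"openshift-network-diagnostics", "openshift-service-ca.crt"},
	} {
		_, err := cs.CoreV1().ConfigMaps(ref.ns).Get(context.TODO(), ref.name, metav1.GetOptions{})
		fmt.Printf("configmap %s/%s: err=%v\n", ref.ns, ref.name, err)
	}
	// Secret named in the networking-console-plugin-cert failure.
	_, err = cs.CoreV1().Secrets("openshift-network-console").Get(context.TODO(), "networking-console-plugin-cert", metav1.GetOptions{})
	fmt.Printf("secret openshift-network-console/networking-console-plugin-cert: err=%v\n", err)
}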
Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.974382 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.974415 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.974425 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.974442 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:50:43 crc kubenswrapper[4962]: I1003 12:50:43.974453 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:43Z","lastTransitionTime":"2025-10-03T12:50:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:50:44 crc kubenswrapper[4962]: I1003 12:50:44.076993 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:50:44 crc kubenswrapper[4962]: I1003 12:50:44.077027 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:50:44 crc kubenswrapper[4962]: I1003 12:50:44.077035 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:50:44 crc kubenswrapper[4962]: I1003 12:50:44.077047 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:50:44 crc kubenswrapper[4962]: I1003 12:50:44.077056 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:44Z","lastTransitionTime":"2025-10-03T12:50:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:50:44 crc kubenswrapper[4962]: I1003 12:50:44.179820 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:50:44 crc kubenswrapper[4962]: I1003 12:50:44.179886 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:50:44 crc kubenswrapper[4962]: I1003 12:50:44.179904 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:50:44 crc kubenswrapper[4962]: I1003 12:50:44.179932 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:50:44 crc kubenswrapper[4962]: I1003 12:50:44.179951 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:44Z","lastTransitionTime":"2025-10-03T12:50:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:50:44 crc kubenswrapper[4962]: I1003 12:50:44.227277 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 03 12:50:44 crc kubenswrapper[4962]: I1003 12:50:44.227294 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 03 12:50:44 crc kubenswrapper[4962]: I1003 12:50:44.227328 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 12:50:44 crc kubenswrapper[4962]: E1003 12:50:44.227688 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 12:50:44 crc kubenswrapper[4962]: E1003 12:50:44.227775 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 12:50:44 crc kubenswrapper[4962]: I1003 12:50:44.282759 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:44 crc kubenswrapper[4962]: I1003 12:50:44.282813 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:44 crc kubenswrapper[4962]: I1003 12:50:44.282823 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:44 crc kubenswrapper[4962]: I1003 12:50:44.282846 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:44 crc kubenswrapper[4962]: I1003 12:50:44.282860 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:44Z","lastTransitionTime":"2025-10-03T12:50:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:44 crc kubenswrapper[4962]: I1003 12:50:44.386569 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:44 crc kubenswrapper[4962]: I1003 12:50:44.386707 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:44 crc kubenswrapper[4962]: I1003 12:50:44.386740 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:44 crc kubenswrapper[4962]: I1003 12:50:44.386777 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:44 crc kubenswrapper[4962]: I1003 12:50:44.386809 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:44Z","lastTransitionTime":"2025-10-03T12:50:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Oct 03 12:50:45 crc kubenswrapper[4962]: I1003 12:50:45.108476 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:50:45 crc kubenswrapper[4962]: I1003 12:50:45.108513 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:50:45 crc kubenswrapper[4962]: I1003 12:50:45.108521 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:50:45 crc kubenswrapper[4962]: I1003 12:50:45.108533 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:50:45 crc kubenswrapper[4962]: I1003 12:50:45.108541 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:45Z","lastTransitionTime":"2025-10-03T12:50:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:50:45 crc kubenswrapper[4962]: I1003 12:50:45.210616 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:50:45 crc kubenswrapper[4962]: I1003 12:50:45.210731 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:50:45 crc kubenswrapper[4962]: I1003 12:50:45.210751 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:50:45 crc kubenswrapper[4962]: I1003 12:50:45.210773 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:50:45 crc kubenswrapper[4962]: I1003 12:50:45.210788 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:45Z","lastTransitionTime":"2025-10-03T12:50:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:50:45 crc kubenswrapper[4962]: I1003 12:50:45.226066 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5blzz"
pod="openshift-multus/network-metrics-daemon-5blzz" podUID="f2989e38-d4e7-42c9-8959-f87168a4ac14" Oct 03 12:50:45 crc kubenswrapper[4962]: I1003 12:50:45.312956 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:45 crc kubenswrapper[4962]: I1003 12:50:45.313005 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:45 crc kubenswrapper[4962]: I1003 12:50:45.313019 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:45 crc kubenswrapper[4962]: I1003 12:50:45.313041 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:45 crc kubenswrapper[4962]: I1003 12:50:45.313057 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:45Z","lastTransitionTime":"2025-10-03T12:50:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:45 crc kubenswrapper[4962]: I1003 12:50:45.415814 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:45 crc kubenswrapper[4962]: I1003 12:50:45.415855 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:45 crc kubenswrapper[4962]: I1003 12:50:45.415865 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:45 crc kubenswrapper[4962]: I1003 12:50:45.415915 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:45 crc kubenswrapper[4962]: I1003 12:50:45.415926 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:45Z","lastTransitionTime":"2025-10-03T12:50:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
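Editor's note: the setters.go records carry the node's Ready condition as inline JSON. A minimal Go sketch decoding that payload; the struct is local to the example and mirrors only the fields visible in the log, not the full Kubernetes NodeCondition type:

package main

import (
	"encoding/json"
	"fmt"
)

// Mirrors just the fields printed in the setters.go payload above.
type nodeCondition struct {
	Type               string `json:"type"`
	Status             string `json:"status"`
	LastHeartbeatTime  string `json:"lastHeartbeatTime"`
	LastTransitionTime string `json:"lastTransitionTime"`
	Reason             string `json:"reason"`
	Message            string `json:"message"`
}

func main() {
	// Condition JSON copied from one of the "Node became not ready" records.
	raw := `{"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:45Z","lastTransitionTime":"2025-10-03T12:50:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}`
	var c nodeCondition
	if err := json.Unmarshal([]byte(raw), &c); err != nil {
		panic(err)
	}
	fmt.Printf("%s=%s (%s): %s\n", c.Type, c.Status, c.Reason, c.Message)
}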
Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.041267 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.048392 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"]
Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.054897 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82494e0a2ec6e8ba7a2e713a8faf9e20bbacde0875e31f6db9962ece1a1a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:46Z is after 2025-08-24T17:21:41Z"
Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.067547 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sdd6t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbc64268-3e78-44a2-8116-b62b5c13f005\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087c5eaff8dd2b9ccb73982ae2022d139077f3a93f750a8d7f7fe8bb4a8f643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktptf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sdd6t\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:46Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.077920 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5blzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2989e38-d4e7-42c9-8959-f87168a4ac14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrzmw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrzmw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:27Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5blzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:46Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:46 crc 
kubenswrapper[4962]: I1003 12:50:46.089164 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53165c5e-33f0-44c9-b4fd-1da1c2b99a4e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86846072db8a357f62428197a22e8b8f829b013c9292a53091819ceb6eb7aeb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f016344ea9d266374fcbfbe80061313994a45b595ee3432a95b1bbd332b320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7249baacfeb01d1f25f6d9ce291033941226c4be4cba0376a47872b8b10f6ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\
":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6e876d0d7a29e1ceb505283a70c9b8fd0b4eee1dec1a2f04f3eeec4b09f4c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:46Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.100554 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:46Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.111475 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6c913a7a10a94baddfc4f58df9f5bc15cb9e7dd4b91dd313eca9f1a683259b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d69e23034588e75e4716b87b31f129f088940d29983f634a3d1b1e15289d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:46Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.122667 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8d669985493d735cad4d4a09f36401be9ede84ee7e20e82293e7d5331c085e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:46Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.134263 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.134295 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.134303 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.134317 4962 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.134328 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:46Z","lastTransitionTime":"2025-10-03T12:50:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.134954 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:46Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.153018 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6263076-d49e-4519-b516-d50ec2fb9d88\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb41be48a3ebc6ba936c2ec3b2a483349e8d597d0483a618a0d452a2ee13538c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b1520abe4d95d8e776c2f589c8f7ada74c46eef43f1bcaf441272ae75241a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcb83ba4c1a1eb1a63fce31b68210e9af2c5e722097b457b4642b4c32f60c2e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d334e30066203e6856129958923dcb174b5541a9e41e11a4e574ac59ec86c861\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbdc43e3dcb77ff3ee44368fa74c53d2f360a508c04525f15879db5c6b2209b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\
\\":{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:46Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.168192 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f337dda-821a-494b-98d6-775f0c1beb7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6ff72af1d85ecca71976d9d16b8ca0ee91d035662920a8c58d739e560c12ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8eb3048afe2307009fd3ccfbc314a1b60389c0ba17bad1b898578cb1b151df8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8877bc7bae269cbbb5a1452d42917b7a32af9bc45a303b25be8b1058ef937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://329c763b93db8ac9a4cd93231684840cdfa4c6f9d1318fcfc641f4e86b4fc39e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22239bfd4bc82de5ba41662ff973e8c1e0fa462857cae705d4940afccc550782\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T12:50:06Z\\\",\\\"message\\\":\\\"W1003 12:49:55.901194 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1003 12:49:55.901561 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759495795 cert, and key in /tmp/serving-cert-3691789359/serving-signer.crt, /tmp/serving-cert-3691789359/serving-signer.key\\\\nI1003 12:49:56.093911 1 observer_polling.go:159] Starting file observer\\\\nW1003 12:49:56.097720 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1003 12:49:56.097855 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 12:49:56.098660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3691789359/tls.crt::/tmp/serving-cert-3691789359/tls.key\\\\\\\"\\\\nF1003 12:50:06.505868 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dafe1e9291df9776980832fd7d43a1d5e4caf4cd6824740f9e1f2282a6c4bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:46Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.181875 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-44fmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fba23cccf497fd12c8f5ff3c93b757e30a8f5aab4ce0deca0f3ec0a49232d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\
\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-44fmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:46Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.192917 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:46Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.201697 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pnhsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edada57d-7295-4f56-a850-caf58ebe77a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f1a223717c0f2030da538233d9f7b621c4b2e304dcb65316b1aef7741f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctqjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pnhsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:46Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.210494 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g64qv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a46fb9dd-57b2-4300-b9ab-2d40bcc4cd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91103ceb967093cdbedc1aa6b2d68dd2c1cc96a1902cd0bb8f380a3d42e54497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7rc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g64qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:46Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.221093 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e40a27aa-682e-4b25-a198-8054ba9f2477\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa233ac457a03e63e2662227f70bb88c59e3ebecf41454f217278fb34ab8dc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c066a5dc6399e5df82f7425020e84a5354658a5b77b1014a74126539aa1c738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-46vck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:46Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.227046 4962 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.227091 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:50:46 crc kubenswrapper[4962]: E1003 12:50:46.227144 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.227183 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:50:46 crc kubenswrapper[4962]: E1003 12:50:46.227281 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 12:50:46 crc kubenswrapper[4962]: E1003 12:50:46.227357 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.236601 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.236660 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.236670 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.236686 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.236697 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:46Z","lastTransitionTime":"2025-10-03T12:50:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.240050 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90186d9d-0ac4-4959-9fd8-b044098dc6ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bdd4ae057f0e89cf8aeff9645035e668883b7bb9ce4d6d083a3e64f6f7fc12a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950cd712bed0d7eafeb1553551ccbf80f1ade9c74ceb2b4f62833dc77d08be00\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T12:50:25Z\\\",\\\"message\\\":\\\"led to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:25Z is after 2025-08-24T17:21:41Z]\\\\nI1003 12:50:25.167923 6413 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-image-registry/image-registry_TCP_cluster\\\\\\\", UUID:\\\\\\\"83c1e277-3d22-42ae-a355-f7a0ff0bd171\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-image-registry/image-registry\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, 
Gro\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bdd4ae057f0e89cf8aeff9645035e668883b7bb9ce4d6d083a3e64f6f7fc12a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T12:50:36Z\\\",\\\"message\\\":\\\"val\\\\nI1003 12:50:36.331444 6633 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 12:50:36.331482 6633 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1003 12:50:36.331508 6633 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1003 12:50:36.331546 6633 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1003 12:50:36.331576 6633 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 12:50:36.331603 6633 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1003 12:50:36.331630 6633 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 12:50:36.332846 6633 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 12:50:36.332881 6633 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1003 12:50:36.332887 6633 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1003 12:50:36.332916 6633 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1003 12:50:36.332927 6633 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 12:50:36.332923 6633 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1003 12:50:36.332968 6633 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 12:50:36.332969 6633 factory.go:656] Stopping watch factory\\\\nI1003 12:50:36.332975 6633 handler.go:208] Removed *v1.NetworkPolicy 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ksp7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:46Z is after 2025-08-24T17:21:41Z"
Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.250198 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kchhs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1198234-8682-43dc-9945-a826eba33888\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d869da7b2fe056fe8079fd55c78e1b02c5dbc137172a9c75514882a0a873714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6wvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c06137bf819289f2987b5ce188c51c2c5d98a376a0343e2eb121498221019bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6wvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kchhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:46Z is after 2025-08-24T17:21:41Z"
Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.339579 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.339674 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.339695 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.339722 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.339740 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:46Z","lastTransitionTime":"2025-10-03T12:50:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.442230 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.442272 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.442281 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.442295 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.442304 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:46Z","lastTransitionTime":"2025-10-03T12:50:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.544443 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.544496 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.544509 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.544523 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.544536 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:46Z","lastTransitionTime":"2025-10-03T12:50:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.646930 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.646965 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.646982 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.647000 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.647012 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:46Z","lastTransitionTime":"2025-10-03T12:50:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.749225 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.749277 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.749293 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.749310 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.749323 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:46Z","lastTransitionTime":"2025-10-03T12:50:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.851086 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.851151 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.851169 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.851184 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.851197 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:46Z","lastTransitionTime":"2025-10-03T12:50:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.953389 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.953430 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.953442 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.953458 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:46 crc kubenswrapper[4962]: I1003 12:50:46.953469 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:46Z","lastTransitionTime":"2025-10-03T12:50:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.056031 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.056112 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.056122 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.056138 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.056150 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:47Z","lastTransitionTime":"2025-10-03T12:50:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.158346 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.158391 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.158402 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.158418 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.158428 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:47Z","lastTransitionTime":"2025-10-03T12:50:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.226500 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5blzz" Oct 03 12:50:47 crc kubenswrapper[4962]: E1003 12:50:47.226630 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5blzz" podUID="f2989e38-d4e7-42c9-8959-f87168a4ac14" Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.261095 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.261147 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.261159 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.261176 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.261187 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:47Z","lastTransitionTime":"2025-10-03T12:50:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.363144 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.363184 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.363194 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.363209 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.363219 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:47Z","lastTransitionTime":"2025-10-03T12:50:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.465276 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.465316 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.465325 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.465339 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.465350 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:47Z","lastTransitionTime":"2025-10-03T12:50:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.567773 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.567812 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.567824 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.567839 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.567847 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:47Z","lastTransitionTime":"2025-10-03T12:50:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.670236 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.670273 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.670284 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.670297 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.670305 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:47Z","lastTransitionTime":"2025-10-03T12:50:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.772600 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.772662 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.772674 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.772694 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.772705 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:47Z","lastTransitionTime":"2025-10-03T12:50:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.874717 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.874851 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.874871 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.874900 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.874912 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:47Z","lastTransitionTime":"2025-10-03T12:50:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.977423 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.977453 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.977460 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.977475 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:47 crc kubenswrapper[4962]: I1003 12:50:47.977484 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:47Z","lastTransitionTime":"2025-10-03T12:50:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:48 crc kubenswrapper[4962]: I1003 12:50:48.079418 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:48 crc kubenswrapper[4962]: I1003 12:50:48.079463 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:48 crc kubenswrapper[4962]: I1003 12:50:48.079495 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:48 crc kubenswrapper[4962]: I1003 12:50:48.079532 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:48 crc kubenswrapper[4962]: I1003 12:50:48.079544 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:48Z","lastTransitionTime":"2025-10-03T12:50:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:48 crc kubenswrapper[4962]: I1003 12:50:48.182141 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:48 crc kubenswrapper[4962]: I1003 12:50:48.182189 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:48 crc kubenswrapper[4962]: I1003 12:50:48.182196 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:48 crc kubenswrapper[4962]: I1003 12:50:48.182214 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:48 crc kubenswrapper[4962]: I1003 12:50:48.182243 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:48Z","lastTransitionTime":"2025-10-03T12:50:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:48 crc kubenswrapper[4962]: I1003 12:50:48.227038 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:50:48 crc kubenswrapper[4962]: I1003 12:50:48.227047 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:50:48 crc kubenswrapper[4962]: I1003 12:50:48.227099 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:50:48 crc kubenswrapper[4962]: E1003 12:50:48.227177 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 12:50:48 crc kubenswrapper[4962]: E1003 12:50:48.227284 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 12:50:48 crc kubenswrapper[4962]: E1003 12:50:48.227350 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 12:50:48 crc kubenswrapper[4962]: I1003 12:50:48.284878 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:48 crc kubenswrapper[4962]: I1003 12:50:48.284932 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:48 crc kubenswrapper[4962]: I1003 12:50:48.284942 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:48 crc kubenswrapper[4962]: I1003 12:50:48.284988 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:48 crc kubenswrapper[4962]: I1003 12:50:48.285004 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:48Z","lastTransitionTime":"2025-10-03T12:50:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:48 crc kubenswrapper[4962]: I1003 12:50:48.387259 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:48 crc kubenswrapper[4962]: I1003 12:50:48.387295 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:48 crc kubenswrapper[4962]: I1003 12:50:48.387303 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:48 crc kubenswrapper[4962]: I1003 12:50:48.387317 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:48 crc kubenswrapper[4962]: I1003 12:50:48.387326 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:48Z","lastTransitionTime":"2025-10-03T12:50:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:48 crc kubenswrapper[4962]: I1003 12:50:48.489986 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:48 crc kubenswrapper[4962]: I1003 12:50:48.490013 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:48 crc kubenswrapper[4962]: I1003 12:50:48.490021 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:48 crc kubenswrapper[4962]: I1003 12:50:48.490033 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:48 crc kubenswrapper[4962]: I1003 12:50:48.490041 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:48Z","lastTransitionTime":"2025-10-03T12:50:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:48 crc kubenswrapper[4962]: I1003 12:50:48.592692 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:48 crc kubenswrapper[4962]: I1003 12:50:48.592733 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:48 crc kubenswrapper[4962]: I1003 12:50:48.592741 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:48 crc kubenswrapper[4962]: I1003 12:50:48.592756 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:48 crc kubenswrapper[4962]: I1003 12:50:48.592767 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:48Z","lastTransitionTime":"2025-10-03T12:50:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:48 crc kubenswrapper[4962]: I1003 12:50:48.694941 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:48 crc kubenswrapper[4962]: I1003 12:50:48.694985 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:48 crc kubenswrapper[4962]: I1003 12:50:48.694996 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:48 crc kubenswrapper[4962]: I1003 12:50:48.695012 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:48 crc kubenswrapper[4962]: I1003 12:50:48.695024 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:48Z","lastTransitionTime":"2025-10-03T12:50:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:48 crc kubenswrapper[4962]: I1003 12:50:48.797764 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:48 crc kubenswrapper[4962]: I1003 12:50:48.797816 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:48 crc kubenswrapper[4962]: I1003 12:50:48.797825 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:48 crc kubenswrapper[4962]: I1003 12:50:48.797866 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:48 crc kubenswrapper[4962]: I1003 12:50:48.797886 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:48Z","lastTransitionTime":"2025-10-03T12:50:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:48 crc kubenswrapper[4962]: I1003 12:50:48.899978 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:48 crc kubenswrapper[4962]: I1003 12:50:48.900015 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:48 crc kubenswrapper[4962]: I1003 12:50:48.900024 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:48 crc kubenswrapper[4962]: I1003 12:50:48.900036 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:48 crc kubenswrapper[4962]: I1003 12:50:48.900046 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:48Z","lastTransitionTime":"2025-10-03T12:50:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.002790 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.003022 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.003104 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.003185 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.003248 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:49Z","lastTransitionTime":"2025-10-03T12:50:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.105151 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.105725 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.105791 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.105859 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.105947 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:49Z","lastTransitionTime":"2025-10-03T12:50:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.186167 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.186479 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.186566 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.186683 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.186791 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:49Z","lastTransitionTime":"2025-10-03T12:50:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:49 crc kubenswrapper[4962]: E1003 12:50:49.197519 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f831723-ac1f-49ed-8733-e30832d406d9\\\",\\\"systemUUID\\\":\\\"16e8121c-ac81-46e8-9c72-10e496aaa780\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:49Z is after 2025-08-24T17:21:41Z"
Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.201131 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.201180 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.201192 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.201208 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.201219 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:49Z","lastTransitionTime":"2025-10-03T12:50:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:49 crc kubenswrapper[4962]: E1003 12:50:49.211500 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f831723-ac1f-49ed-8733-e30832d406d9\\\",\\\"systemUUID\\\":\\\"16e8121c-ac81-46e8-9c72-10e496aaa780\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:49Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.214699 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.214726 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.214738 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.214754 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.214765 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:49Z","lastTransitionTime":"2025-10-03T12:50:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:49 crc kubenswrapper[4962]: E1003 12:50:49.225146 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f831723-ac1f-49ed-8733-e30832d406d9\\\",\\\"systemUUID\\\":\\\"16e8121c-ac81-46e8-9c72-10e496aaa780\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:49Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.226516 4962 util.go:30] "No sandbox for pod can be found. 
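
Every "Error updating node status, will retry" in this stretch fails the same way: the kubelet's status patch goes through the node.network-node-identity.openshift.io admission webhook at https://127.0.0.1:9743, whose serving certificate expired on 2025-08-24T17:21:41Z while the node clock reads 2025-10-03. A sketch for confirming the certificate window from the node itself, assuming openssl is installed on the host (the address and port are taken from the webhook URL in the errors above):

  $ openssl s_client -connect 127.0.0.1:9743 </dev/null 2>/dev/null | openssl x509 -noout -dates
  notBefore=...                       # issuance time, not recorded in this log
  notAfter=Aug 24 17:21:41 2025 GMT   # should match the x509 expiry quoted in the errors above

On CRC this pattern typically means the VM sat powered off past its certificate lifetime; CRC attempts certificate rotation on the next start, and if rotation cannot complete, recreating the instance (crc delete, then crc start) is the usual way out.
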
Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.226516 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5blzz"
Oct 03 12:50:49 crc kubenswrapper[4962]: E1003 12:50:49.226615 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5blzz" podUID="f2989e38-d4e7-42c9-8959-f87168a4ac14"
Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.228829 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.228885 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.228901 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.228926 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.228942 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:49Z","lastTransitionTime":"2025-10-03T12:50:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
[... Oct 03 12:50:49 crc kubenswrapper[4962]: E1003 12:50:49.240961 4962 kubelet_node_status.go:585] "Error updating node status, will retry": status payload and x509 webhook failure identical to the 12:50:49.211500 attempt above ...]
Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.244134 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.244168 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
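
Independently of the webhook failure, the Ready condition above stays False for its own reason: the kubelet reports NetworkPluginNotReady until a CNI config appears in /etc/kubernetes/cni/net.d/, and on this cluster that file is written by the network plugin pods (multus / OVN-Kubernetes, both visible in this log) once they come up. A minimal check, with the directory path taken from the log message (any specific conflist file name is an assumption; it varies by plugin and version):

  $ ls -l /etc/kubernetes/cni/net.d/
  total 0    # empty while the network provider is still down; a conflist such as 00-multus.conf appears once it starts
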
event="NodeHasNoDiskPressure" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.244176 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.244191 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.244199 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:49Z","lastTransitionTime":"2025-10-03T12:50:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:49 crc kubenswrapper[4962]: E1003 12:50:49.256320 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f831723-ac1f-49ed-8733-e30832d406d9\\\",\\\"systemUUID\\\":\\\"16e8121c-ac81-46e8-9c72-10e496aaa780\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:49Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:49 crc kubenswrapper[4962]: E1003 12:50:49.256659 4962 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.258422 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
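
The "update node status exceeds retry count" entry closes one sync cycle: the kubelet attempts the status patch a fixed number of times per cycle (nodeStatusUpdateRetry, 5 in the upstream kubelet source) before giving up, which matches the run of "will retry" errors ending at 12:50:49.256320 above; the whole cycle then repeats at the next node-status interval. One way to count the burst in this journal, assuming the systemd unit is kubelet.service as in the "Starting Kubernetes Kubelet" line at boot:

  $ journalctl -u kubelet --since "2025-10-03 12:50:49" --until "2025-10-03 12:50:50" | grep -c 'Error updating node status, will retry'
  5    # expect one burst of 5 per failed cycle (timestamps may straddle the window edges)
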
event="NodeHasSufficientMemory" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.258539 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.258602 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.258701 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.258798 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:49Z","lastTransitionTime":"2025-10-03T12:50:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.360237 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.360274 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.360287 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.360301 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.360312 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:49Z","lastTransitionTime":"2025-10-03T12:50:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.462466 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.462783 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.462890 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.462988 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.463088 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:49Z","lastTransitionTime":"2025-10-03T12:50:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.566320 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.566379 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.566392 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.566413 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.566462 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:49Z","lastTransitionTime":"2025-10-03T12:50:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.668468 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.668506 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.668516 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.668532 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.668543 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:49Z","lastTransitionTime":"2025-10-03T12:50:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.771034 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.771082 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.771095 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.771113 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.771124 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:49Z","lastTransitionTime":"2025-10-03T12:50:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.873850 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.873891 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.873899 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.873913 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.873921 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:49Z","lastTransitionTime":"2025-10-03T12:50:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.976327 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.976377 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.976389 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.976408 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.976421 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:49Z","lastTransitionTime":"2025-10-03T12:50:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.078517 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.078557 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.078570 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.078586 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.078597 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:50Z","lastTransitionTime":"2025-10-03T12:50:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.181358 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.181398 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.181407 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.181422 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.181432 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:50Z","lastTransitionTime":"2025-10-03T12:50:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.227023 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.227068 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:50:50 crc kubenswrapper[4962]: E1003 12:50:50.227219 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.227238 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:50:50 crc kubenswrapper[4962]: E1003 12:50:50.227358 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 12:50:50 crc kubenswrapper[4962]: E1003 12:50:50.227425 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
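The loop above repeats roughly every hundred milliseconds: the kubelet records the same four node events, then re-sets the Ready condition to False because no CNI configuration file exists yet in /etc/kubernetes/cni/net.d/, and the three pods that need a pod-network sandbox fail to sync for the same reason. Since the payload after `condition=` is plain JSON, entries like these can be triaged mechanically; a minimal sketch (Python 3, standard library only; the `line` value is abridged from the setters.go entries above):

```python
import json
import re

# Abridged from a setters.go:603 entry above; the condition payload is JSON.
line = ('Oct 03 12:50:49 crc kubenswrapper[4962]: I1003 12:50:49.258798 4962 '
        'setters.go:603] "Node became not ready" node="crc" '
        'condition={"type":"Ready","status":"False","reason":"KubeletNotReady",'
        '"message":"container runtime network not ready: NetworkReady=false"}')

m = re.search(r'condition=(\{.*\})', line)
if m:
    cond = json.loads(m.group(1))
    # -> Ready=False (KubeletNotReady): container runtime network not ready: ...
    print(f"{cond['type']}={cond['status']} ({cond['reason']}): {cond['message']}")
```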
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.227952 4962 scope.go:117] "RemoveContainer" containerID="8bdd4ae057f0e89cf8aeff9645035e668883b7bb9ce4d6d083a3e64f6f7fc12a" Oct 03 12:50:50 crc kubenswrapper[4962]: E1003 12:50:50.228147 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-ksp7d_openshift-ovn-kubernetes(90186d9d-0ac4-4959-9fd8-b044098dc6ae)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.241489 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53165c5e-33f0-44c9-b4fd-1da1c2b99a4e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86846072db8a357f62428197a22e8b8f829b013c9292a53091819ceb6eb7aeb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f016344ea9d266374fcbfbe80061313994a45b595ee3432a95b1bbd332b320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7249baacfeb01d1f25f6d9ce291033941226c4be4cba0376a47872b
8b10f6ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6e876d0d7a29e1ceb505283a70c9b8fd0b4eee1dec1a2f04f3eeec4b09f4c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:50Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.252781 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:50Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.263888 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6c913a7a10a94baddfc4f58df9f5bc15cb9e7dd4b91dd313eca9f1a683259b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d69e23034588e75e4716b87b31f129f088940d29983f634a3d1b1e15289d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:50Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.273465 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8d669985493d735cad4d4a09f36401be9ede84ee7e20e82293e7d5331c085e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:50Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.283446 4962 
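The four status-patch failures above (and every one that follows) share a single root cause: each PATCH to the API server is intercepted by the pod.network-node-identity.openshift.io admission webhook at https://127.0.0.1:9743, whose serving certificate expired on 2025-08-24T17:21:41Z, well before the node's current clock of 2025-10-03T12:50:50Z. The verification failure can be reproduced from the node itself; a sketch (Python 3, standard library; depending on the local trust configuration the handshake may fail with an unknown-CA error before the expiry check is even reached):

```python
import socket
import ssl

# Webhook endpoint taken from the "failed calling webhook" entries above.
HOST, PORT = "127.0.0.1", 9743

ctx = ssl.create_default_context()
ctx.check_hostname = False  # the kubelet dials a bare IP, not a hostname

try:
    with socket.create_connection((HOST, PORT), timeout=10) as sock:
        with ctx.wrap_socket(sock) as tls:
            print("handshake OK, notAfter =", tls.getpeercert().get("notAfter"))
except ssl.SSLCertVerificationError as err:
    # An expired serving certificate lands here, mirroring the kubelet's
    # "x509: certificate has expired or is not yet valid" failure.
    print("verification failed:", err.verify_message)
```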
Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.283446 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.283475 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.283484 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.283497 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.283506 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:50Z","lastTransitionTime":"2025-10-03T12:50:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.286385 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:50Z is after 2025-08-24T17:21:41Z"
Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.303470 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6263076-d49e-4519-b516-d50ec2fb9d88\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb41be48a3ebc6ba936c2ec3b2a483349e8d597d0483a618a0d452a2ee13538c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b1520abe4d95d8e776c2f589c8f7ada74c46eef43f1bcaf441272ae75241a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcb83ba4c1a1eb1a63fce31b68210e9af2c5e722097b457b4642b4c32f60c2e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d334e30066203e6856129958923dcb174b5541a9e41e11a4e574ac59ec86c861\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbdc43e3dcb77ff3ee44368fa74c53d2f360a508c04525f15879db5c6b2209b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:50Z is after 2025-08-24T17:21:41Z"
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f337dda-821a-494b-98d6-775f0c1beb7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6ff72af1d85ecca71976d9d16b8ca0ee91d035662920a8c58d739e560c12ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8eb3048afe2307009fd3ccfbc314a1b60389c0ba17bad1b898578cb1b151df8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8877bc7bae269cbbb5a1452d42917b7a32af9bc45a303b25be8b1058ef937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://329c763b93db8ac9a4cd93231684840cdfa4c6f9d1318fcfc641f4e86b4fc39e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22239bfd4bc82de5ba41662ff973e8c1e0fa462857cae705d4940afccc550782\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T12:50:06Z\\\",\\\"message\\\":\\\"W1003 12:49:55.901194 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1003 12:49:55.901561 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759495795 cert, and key in /tmp/serving-cert-3691789359/serving-signer.crt, /tmp/serving-cert-3691789359/serving-signer.key\\\\nI1003 12:49:56.093911 1 observer_polling.go:159] Starting file observer\\\\nW1003 12:49:56.097720 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1003 12:49:56.097855 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 12:49:56.098660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3691789359/tls.crt::/tmp/serving-cert-3691789359/tls.key\\\\\\\"\\\\nF1003 12:50:06.505868 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dafe1e9291df9776980832fd7d43a1d5e4caf4cd6824740f9e1f2282a6c4bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:50Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.328876 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"348e88ca-6e7d-4ab2-b45d-553888f848c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9a16c565e235b0d93e373cc23d6d8ee7d889a7caec61ef9ff50c3e904cd3893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e84d29d6915b3a6b184d69f18945d3fd277719a6cc6d503b2899df586882a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\
\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://372bc3d3078a6ab39767510fe9090adb082ed3de331d851831b710df064bef5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e104d4f9cfb2e600dba4f462bdf9ee4aa5453afbb5174857fc61fc85d90f9642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e104d4f9cfb2e600dba4f462bdf9ee4aa5453afbb5174857fc61fc85d90f9642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:50Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.344347 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-44fmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fba23cccf497fd12c8f5ff3c93b757e30a8f5aab4ce0deca0f3ec0a49232d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-44fmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:50Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.366944 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90186d9d-0ac4-4959-9fd8-b044098dc6ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bdd4ae057f0e89cf8aeff9645035e668883b7bb9ce4d6d083a3e64f6f7fc12a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bdd4ae057f0e89cf8aeff9645035e668883b7bb9ce4d6d083a3e64f6f7fc12a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T12:50:36Z\\\",\\\"message\\\":\\\"val\\\\nI1003 12:50:36.331444 6633 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 12:50:36.331482 6633 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1003 12:50:36.331508 6633 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1003 12:50:36.331546 6633 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1003 12:50:36.331576 6633 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 12:50:36.331603 6633 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1003 12:50:36.331630 6633 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 12:50:36.332846 6633 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 12:50:36.332881 6633 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1003 12:50:36.332887 6633 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1003 12:50:36.332916 6633 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1003 12:50:36.332927 6633 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 12:50:36.332923 6633 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1003 12:50:36.332968 6633 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 12:50:36.332969 6633 factory.go:656] Stopping watch factory\\\\nI1003 12:50:36.332975 6633 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ksp7d_openshift-ovn-kubernetes(90186d9d-0ac4-4959-9fd8-b044098dc6ae)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ksp7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:50Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.378223 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kchhs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1198234-8682-43dc-9945-a826eba33888\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d869da7b2fe056fe8079fd55c78e1b02c5dbc137172a9c75514882a0a873714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6wvh
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c06137bf819289f2987b5ce188c51c2c5d98a376a0343e2eb121498221019bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6wvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kchhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:50Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.385461 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.385667 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.385739 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.385811 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.385874 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:50Z","lastTransitionTime":"2025-10-03T12:50:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.390184 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:50Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.400120 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pnhsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edada57d-7295-4f56-a850-caf58ebe77a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f1a223717c0f2030da538233d9f7b621c4b2e304dcb65316b1aef7741f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctqjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pnhsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:50Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.410237 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g64qv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a46fb9dd-57b2-4300-b9ab-2d40bcc4cd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91103ceb967093cdbedc1aa6b2d68dd2c1cc96a1902cd0bb8f380a3d42e54497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7rc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g64qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:50Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.421848 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e40a27aa-682e-4b25-a198-8054ba9f2477\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa233ac457a03e63e2662227f70bb88c59e3ebecf41454f217278fb34ab8dc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c066a5dc6399e5df82f7425020e84a5354658a5b77b1014a74126539aa1c738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-46vck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:50Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.435451 4962 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82494e0a2ec6e8ba7a2e713a8faf9e20bbacde0875e31f6db9962ece1a1a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:50Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.449941 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sdd6t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbc64268-3e78-44a2-8116-b62b5c13f005\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087c5eaff8dd2b9ccb73982ae2022d139077f3a93f750a8d7f7fe8bb4a8f643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktptf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sdd6t\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:50Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.461943 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5blzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2989e38-d4e7-42c9-8959-f87168a4ac14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrzmw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrzmw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:27Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5blzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:50Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:50 crc 
kubenswrapper[4962]: I1003 12:50:50.488263 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.488312 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.488324 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.488345 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.488359 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:50Z","lastTransitionTime":"2025-10-03T12:50:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.590413 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.590484 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.590495 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.590508 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.590517 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:50Z","lastTransitionTime":"2025-10-03T12:50:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.692864 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.692891 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.692900 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.692912 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.692921 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:50Z","lastTransitionTime":"2025-10-03T12:50:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.796219 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.796284 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.796298 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.796319 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.796523 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:50Z","lastTransitionTime":"2025-10-03T12:50:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.898994 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.899061 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.899074 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.899091 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:50 crc kubenswrapper[4962]: I1003 12:50:50.899102 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:50Z","lastTransitionTime":"2025-10-03T12:50:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.001625 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.001687 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.001696 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.001711 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.001720 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:51Z","lastTransitionTime":"2025-10-03T12:50:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.103618 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.103673 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.103681 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.103696 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.103705 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:51Z","lastTransitionTime":"2025-10-03T12:50:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.206259 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.206295 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.206307 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.206322 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.206332 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:51Z","lastTransitionTime":"2025-10-03T12:50:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.226836 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5blzz" Oct 03 12:50:51 crc kubenswrapper[4962]: E1003 12:50:51.226985 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5blzz" podUID="f2989e38-d4e7-42c9-8959-f87168a4ac14" Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.308403 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.308457 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.308466 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.308482 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.308491 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:51Z","lastTransitionTime":"2025-10-03T12:50:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.410556 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.410589 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.410598 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.410612 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.410621 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:51Z","lastTransitionTime":"2025-10-03T12:50:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.513897 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.513977 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.514001 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.514031 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.514048 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:51Z","lastTransitionTime":"2025-10-03T12:50:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.617797 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.617866 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.617883 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.617909 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.617926 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:51Z","lastTransitionTime":"2025-10-03T12:50:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.721823 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.721881 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.721898 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.721922 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.721937 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:51Z","lastTransitionTime":"2025-10-03T12:50:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.824707 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.824777 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.824787 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.824802 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.824811 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:51Z","lastTransitionTime":"2025-10-03T12:50:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.926969 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.927012 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.927026 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.927042 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:51 crc kubenswrapper[4962]: I1003 12:50:51.927053 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:51Z","lastTransitionTime":"2025-10-03T12:50:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.031123 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.031560 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.031882 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.031987 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.032079 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:52Z","lastTransitionTime":"2025-10-03T12:50:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.134744 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.134773 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.134780 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.134792 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.134803 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:52Z","lastTransitionTime":"2025-10-03T12:50:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.226138 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.226210 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:50:52 crc kubenswrapper[4962]: E1003 12:50:52.226605 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.226259 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:50:52 crc kubenswrapper[4962]: E1003 12:50:52.226773 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 12:50:52 crc kubenswrapper[4962]: E1003 12:50:52.226860 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.238008 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.238059 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.238077 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.238136 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.238157 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:52Z","lastTransitionTime":"2025-10-03T12:50:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.240831 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53165c5e-33f0-44c9-b4fd-1da1c2b99a4e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86846072db8a357f62428197a22e8b8f829b013c9292a53091819ceb6eb7aeb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f016344ea9d266374fcbfbe80061313994a45b595ee3432a95b1bbd332b320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771ae
e1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7249baacfeb01d1f25f6d9ce291033941226c4be4cba0376a47872b8b10f6ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6e876d0d7a29e1ceb505283a70c9b8fd0b4eee1dec1a2f04f3eeec4b09f4c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:52Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.253268 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:52Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.266233 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6c913a7a10a94baddfc4f58df9f5bc15cb9e7dd4b91dd313eca9f1a683259b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d69e23034588e75e4716b87b31f129f088940d29983f634a3d1b1e15289d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:52Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.278177 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8d669985493d735cad4d4a09f36401be9ede84ee7e20e82293e7d5331c085e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:52Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.289974 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:52Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.307923 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6263076-d49e-4519-b516-d50ec2fb9d88\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb41be48a3ebc6ba936c2ec3b2a483349e8d597d0483a618a0d452a2ee13538c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b1520abe4d95d8e776c2f589c8f7ada74c46eef43f1bcaf441272ae75241a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcb83ba4c1a1eb1a63fce31b68210e9af2c5e722097b457b4642b4c32f60c2e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d334e30066203e6856129958923dcb174b5541a9e41e11a4e574ac59ec86c861\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbdc43e3dcb77ff3ee44368fa74c53d2f360a508c04525f15879db5c6b2209b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\
\\":{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:52Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.322846 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f337dda-821a-494b-98d6-775f0c1beb7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6ff72af1d85ecca71976d9d16b8ca0ee91d035662920a8c58d739e560c12ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8eb3048afe2307009fd3ccfbc314a1b60389c0ba17bad1b898578cb1b151df8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8877bc7bae269cbbb5a1452d42917b7a32af9bc45a303b25be8b1058ef937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://329c763b93db8ac9a4cd93231684840cdfa4c6f9d1318fcfc641f4e86b4fc39e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22239bfd4bc82de5ba41662ff973e8c1e0fa462857cae705d4940afccc550782\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T12:50:06Z\\\",\\\"message\\\":\\\"W1003 12:49:55.901194 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1003 12:49:55.901561 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759495795 cert, and key in /tmp/serving-cert-3691789359/serving-signer.crt, /tmp/serving-cert-3691789359/serving-signer.key\\\\nI1003 12:49:56.093911 1 observer_polling.go:159] Starting file observer\\\\nW1003 12:49:56.097720 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1003 12:49:56.097855 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 12:49:56.098660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3691789359/tls.crt::/tmp/serving-cert-3691789359/tls.key\\\\\\\"\\\\nF1003 12:50:06.505868 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dafe1e9291df9776980832fd7d43a1d5e4caf4cd6824740f9e1f2282a6c4bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:52Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.338783 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"348e88ca-6e7d-4ab2-b45d-553888f848c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9a16c565e235b0d93e373cc23d6d8ee7d889a7caec61ef9ff50c3e904cd3893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e84d29d6915b3a6b184d69f18945d3fd277719a6cc6d503b2899df586882a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\
\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://372bc3d3078a6ab39767510fe9090adb082ed3de331d851831b710df064bef5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e104d4f9cfb2e600dba4f462bdf9ee4aa5453afbb5174857fc61fc85d90f9642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e104d4f9cfb2e600dba4f462bdf9ee4aa5453afbb5174857fc61fc85d90f9642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:52Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.340321 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.340397 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.340418 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.340450 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.340474 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:52Z","lastTransitionTime":"2025-10-03T12:50:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.358336 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-44fmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fba23cccf497fd12c8f5ff3c93b757e30a8f5aab4ce0deca0f3ec0a49232d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-44fmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:52Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.372025 4962 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:52Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.383207 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pnhsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edada57d-7295-4f56-a850-caf58ebe77a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f1a223717c0f2030da538233d9f7b621c4b2e304dcb65316b1aef7741f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctqjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pnhsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:52Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.394556 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g64qv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a46fb9dd-57b2-4300-b9ab-2d40bcc4cd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91103ceb967093cdbedc1aa6b2d68dd2c1cc96a1902cd0bb8f380a3d42e54497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7rc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g64qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:52Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.407673 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e40a27aa-682e-4b25-a198-8054ba9f2477\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa233ac457a03e63e2662227f70bb88c59e3ebecf41454f217278fb34ab8dc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c066a5dc6399e5df82f7425020e84a5354658a5b77b1014a74126539aa1c738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-46vck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:52Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.432062 4962 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90186d9d-0ac4-4959-9fd8-b044098dc6ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bdd4ae057f0e89cf8aeff9645035e668883b7bb9ce4d6d083a3e64f6f7fc12a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bdd4ae057f0e89cf8aeff9645035e668883b7bb9ce4d6d083a3e64f6f7fc12a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T12:50:36Z\\\",\\\"message\\\":\\\"val\\\\nI1003 12:50:36.331444 6633 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 12:50:36.331482 6633 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1003 12:50:36.331508 6633 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1003 12:50:36.331546 6633 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1003 12:50:36.331576 6633 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 12:50:36.331603 6633 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1003 12:50:36.331630 6633 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 12:50:36.332846 6633 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 12:50:36.332881 6633 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1003 12:50:36.332887 6633 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1003 12:50:36.332916 6633 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1003 12:50:36.332927 6633 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 12:50:36.332923 6633 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1003 12:50:36.332968 6633 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 12:50:36.332969 6633 factory.go:656] Stopping watch factory\\\\nI1003 12:50:36.332975 6633 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ksp7d_openshift-ovn-kubernetes(90186d9d-0ac4-4959-9fd8-b044098dc6ae)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ksp7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:52Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.443139 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.443191 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.443205 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.443225 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.443238 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:52Z","lastTransitionTime":"2025-10-03T12:50:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.445919 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kchhs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1198234-8682-43dc-9945-a826eba33888\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d869da7b2fe056fe8079fd55c78e1b02c5dbc137172a9c75514882a0a873714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6wvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c06137bf819289f2987b5ce188c51c2c5d98a376a0343e2eb121498221019bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6wvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kchhs\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:52Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.461141 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82494e0a2ec6e8ba7a2e713a8faf9e20bbacde0875e31f6db9962ece1a1a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:52Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.475015 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sdd6t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbc64268-3e78-44a2-8116-b62b5c13f005\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087c5eaff8dd2b9ccb73982ae2022d139077f3a93f750a8d7f7fe8bb4a8f643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktptf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sdd6t\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:52Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.486665 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5blzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2989e38-d4e7-42c9-8959-f87168a4ac14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrzmw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrzmw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:27Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5blzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:52Z is after 2025-08-24T17:21:41Z" Oct 03 12:50:52 crc 
kubenswrapper[4962]: I1003 12:50:52.546170 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.546216 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.546226 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.546243 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.546254 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:52Z","lastTransitionTime":"2025-10-03T12:50:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.648708 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.648741 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.648749 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.648762 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.648772 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:52Z","lastTransitionTime":"2025-10-03T12:50:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.755152 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.755271 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.755317 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.755351 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.755376 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:52Z","lastTransitionTime":"2025-10-03T12:50:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.858273 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.858311 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.858319 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.858331 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.858340 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:52Z","lastTransitionTime":"2025-10-03T12:50:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.961137 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.961174 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.961263 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.961279 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:52 crc kubenswrapper[4962]: I1003 12:50:52.961289 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:52Z","lastTransitionTime":"2025-10-03T12:50:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:53 crc kubenswrapper[4962]: I1003 12:50:53.063951 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:53 crc kubenswrapper[4962]: I1003 12:50:53.064045 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:53 crc kubenswrapper[4962]: I1003 12:50:53.064057 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:53 crc kubenswrapper[4962]: I1003 12:50:53.064075 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:53 crc kubenswrapper[4962]: I1003 12:50:53.064086 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:53Z","lastTransitionTime":"2025-10-03T12:50:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:53 crc kubenswrapper[4962]: I1003 12:50:53.166849 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:53 crc kubenswrapper[4962]: I1003 12:50:53.166891 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:53 crc kubenswrapper[4962]: I1003 12:50:53.166899 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:53 crc kubenswrapper[4962]: I1003 12:50:53.166914 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:53 crc kubenswrapper[4962]: I1003 12:50:53.166929 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:53Z","lastTransitionTime":"2025-10-03T12:50:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:53 crc kubenswrapper[4962]: I1003 12:50:53.226600 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5blzz" Oct 03 12:50:53 crc kubenswrapper[4962]: E1003 12:50:53.226757 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5blzz" podUID="f2989e38-d4e7-42c9-8959-f87168a4ac14" Oct 03 12:50:53 crc kubenswrapper[4962]: I1003 12:50:53.269891 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:53 crc kubenswrapper[4962]: I1003 12:50:53.269923 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:53 crc kubenswrapper[4962]: I1003 12:50:53.269932 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:53 crc kubenswrapper[4962]: I1003 12:50:53.269944 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:53 crc kubenswrapper[4962]: I1003 12:50:53.269953 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:53Z","lastTransitionTime":"2025-10-03T12:50:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:53 crc kubenswrapper[4962]: I1003 12:50:53.371695 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:53 crc kubenswrapper[4962]: I1003 12:50:53.371775 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:53 crc kubenswrapper[4962]: I1003 12:50:53.371788 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:53 crc kubenswrapper[4962]: I1003 12:50:53.371804 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:53 crc kubenswrapper[4962]: I1003 12:50:53.371813 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:53Z","lastTransitionTime":"2025-10-03T12:50:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:53 crc kubenswrapper[4962]: I1003 12:50:53.474216 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:53 crc kubenswrapper[4962]: I1003 12:50:53.474351 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:53 crc kubenswrapper[4962]: I1003 12:50:53.474365 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:53 crc kubenswrapper[4962]: I1003 12:50:53.474382 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:53 crc kubenswrapper[4962]: I1003 12:50:53.474394 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:53Z","lastTransitionTime":"2025-10-03T12:50:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:53 crc kubenswrapper[4962]: I1003 12:50:53.576245 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:53 crc kubenswrapper[4962]: I1003 12:50:53.576285 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:53 crc kubenswrapper[4962]: I1003 12:50:53.576297 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:53 crc kubenswrapper[4962]: I1003 12:50:53.576314 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:53 crc kubenswrapper[4962]: I1003 12:50:53.576326 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:53Z","lastTransitionTime":"2025-10-03T12:50:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:54 crc kubenswrapper[4962]: I1003 12:50:54.226150 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:50:54 crc kubenswrapper[4962]: I1003 12:50:54.226150 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:50:54 crc kubenswrapper[4962]: I1003 12:50:54.226277 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:50:54 crc kubenswrapper[4962]: E1003 12:50:54.226458 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 12:50:54 crc kubenswrapper[4962]: E1003 12:50:54.226564 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 12:50:54 crc kubenswrapper[4962]: E1003 12:50:54.226624 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 12:50:54 crc kubenswrapper[4962]: I1003 12:50:54.293164 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:54 crc kubenswrapper[4962]: I1003 12:50:54.293743 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:54 crc kubenswrapper[4962]: I1003 12:50:54.293835 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:54 crc kubenswrapper[4962]: I1003 12:50:54.293926 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:54 crc kubenswrapper[4962]: I1003 12:50:54.294038 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:54Z","lastTransitionTime":"2025-10-03T12:50:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:55 crc kubenswrapper[4962]: I1003 12:50:55.226177 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5blzz" Oct 03 12:50:55 crc kubenswrapper[4962]: E1003 12:50:55.226288 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5blzz" podUID="f2989e38-d4e7-42c9-8959-f87168a4ac14" Oct 03 12:50:55 crc kubenswrapper[4962]: I1003 12:50:55.321433 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:55 crc kubenswrapper[4962]: I1003 12:50:55.321474 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:55 crc kubenswrapper[4962]: I1003 12:50:55.321484 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:55 crc kubenswrapper[4962]: I1003 12:50:55.321498 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:55 crc kubenswrapper[4962]: I1003 12:50:55.321507 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:55Z","lastTransitionTime":"2025-10-03T12:50:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:55 crc kubenswrapper[4962]: I1003 12:50:55.423511 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:55 crc kubenswrapper[4962]: I1003 12:50:55.423552 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:55 crc kubenswrapper[4962]: I1003 12:50:55.423562 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:55 crc kubenswrapper[4962]: I1003 12:50:55.423576 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:55 crc kubenswrapper[4962]: I1003 12:50:55.423585 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:55Z","lastTransitionTime":"2025-10-03T12:50:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:56 crc kubenswrapper[4962]: I1003 12:50:56.147305 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:56 crc kubenswrapper[4962]: I1003 12:50:56.147340 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:56 crc kubenswrapper[4962]: I1003 12:50:56.147349 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:56 crc kubenswrapper[4962]: I1003 12:50:56.147363 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:56 crc kubenswrapper[4962]: I1003 12:50:56.147372 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:56Z","lastTransitionTime":"2025-10-03T12:50:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:56 crc kubenswrapper[4962]: I1003 12:50:56.226249 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:50:56 crc kubenswrapper[4962]: I1003 12:50:56.226279 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:50:56 crc kubenswrapper[4962]: E1003 12:50:56.226401 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 12:50:56 crc kubenswrapper[4962]: E1003 12:50:56.226486 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 12:50:56 crc kubenswrapper[4962]: I1003 12:50:56.226279 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:50:56 crc kubenswrapper[4962]: E1003 12:50:56.226780 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 12:50:56 crc kubenswrapper[4962]: I1003 12:50:56.250080 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:56 crc kubenswrapper[4962]: I1003 12:50:56.250142 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:56 crc kubenswrapper[4962]: I1003 12:50:56.250155 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:56 crc kubenswrapper[4962]: I1003 12:50:56.250170 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:56 crc kubenswrapper[4962]: I1003 12:50:56.250180 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:56Z","lastTransitionTime":"2025-10-03T12:50:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:56 crc kubenswrapper[4962]: I1003 12:50:56.352545 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:56 crc kubenswrapper[4962]: I1003 12:50:56.352604 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:56 crc kubenswrapper[4962]: I1003 12:50:56.352615 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:56 crc kubenswrapper[4962]: I1003 12:50:56.352631 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:56 crc kubenswrapper[4962]: I1003 12:50:56.352660 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:56Z","lastTransitionTime":"2025-10-03T12:50:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:57 crc kubenswrapper[4962]: I1003 12:50:57.070593 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:57 crc kubenswrapper[4962]: I1003 12:50:57.070652 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:57 crc kubenswrapper[4962]: I1003 12:50:57.070666 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:57 crc kubenswrapper[4962]: I1003 12:50:57.070681 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:57 crc kubenswrapper[4962]: I1003 12:50:57.070691 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:57Z","lastTransitionTime":"2025-10-03T12:50:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:57 crc kubenswrapper[4962]: I1003 12:50:57.172597 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:57 crc kubenswrapper[4962]: I1003 12:50:57.172650 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:57 crc kubenswrapper[4962]: I1003 12:50:57.172664 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:57 crc kubenswrapper[4962]: I1003 12:50:57.172680 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:57 crc kubenswrapper[4962]: I1003 12:50:57.172689 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:57Z","lastTransitionTime":"2025-10-03T12:50:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:57 crc kubenswrapper[4962]: I1003 12:50:57.226509 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5blzz" Oct 03 12:50:57 crc kubenswrapper[4962]: E1003 12:50:57.226654 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5blzz" podUID="f2989e38-d4e7-42c9-8959-f87168a4ac14" Oct 03 12:50:57 crc kubenswrapper[4962]: I1003 12:50:57.275463 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:57 crc kubenswrapper[4962]: I1003 12:50:57.275516 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:57 crc kubenswrapper[4962]: I1003 12:50:57.275530 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:57 crc kubenswrapper[4962]: I1003 12:50:57.275551 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:57 crc kubenswrapper[4962]: I1003 12:50:57.275562 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:57Z","lastTransitionTime":"2025-10-03T12:50:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:57 crc kubenswrapper[4962]: I1003 12:50:57.378398 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:57 crc kubenswrapper[4962]: I1003 12:50:57.378442 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:57 crc kubenswrapper[4962]: I1003 12:50:57.378452 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:57 crc kubenswrapper[4962]: I1003 12:50:57.378469 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:57 crc kubenswrapper[4962]: I1003 12:50:57.378480 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:57Z","lastTransitionTime":"2025-10-03T12:50:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:58 crc kubenswrapper[4962]: I1003 12:50:58.095160 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:58 crc kubenswrapper[4962]: I1003 12:50:58.095204 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:58 crc kubenswrapper[4962]: I1003 12:50:58.095215 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:58 crc kubenswrapper[4962]: I1003 12:50:58.095229 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:58 crc kubenswrapper[4962]: I1003 12:50:58.095239 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:58Z","lastTransitionTime":"2025-10-03T12:50:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:58 crc kubenswrapper[4962]: I1003 12:50:58.197561 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:58 crc kubenswrapper[4962]: I1003 12:50:58.197597 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:58 crc kubenswrapper[4962]: I1003 12:50:58.197606 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:58 crc kubenswrapper[4962]: I1003 12:50:58.197620 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:58 crc kubenswrapper[4962]: I1003 12:50:58.197662 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:58Z","lastTransitionTime":"2025-10-03T12:50:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:58 crc kubenswrapper[4962]: I1003 12:50:58.227149 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:50:58 crc kubenswrapper[4962]: I1003 12:50:58.227212 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:50:58 crc kubenswrapper[4962]: I1003 12:50:58.227168 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:50:58 crc kubenswrapper[4962]: E1003 12:50:58.227320 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 12:50:58 crc kubenswrapper[4962]: E1003 12:50:58.227523 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 12:50:58 crc kubenswrapper[4962]: E1003 12:50:58.227611 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 12:50:58 crc kubenswrapper[4962]: I1003 12:50:58.300685 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:58 crc kubenswrapper[4962]: I1003 12:50:58.300726 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:58 crc kubenswrapper[4962]: I1003 12:50:58.300736 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:58 crc kubenswrapper[4962]: I1003 12:50:58.300751 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:58 crc kubenswrapper[4962]: I1003 12:50:58.300760 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:58Z","lastTransitionTime":"2025-10-03T12:50:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:58 crc kubenswrapper[4962]: I1003 12:50:58.402848 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:58 crc kubenswrapper[4962]: I1003 12:50:58.402878 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:58 crc kubenswrapper[4962]: I1003 12:50:58.402889 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:58 crc kubenswrapper[4962]: I1003 12:50:58.402903 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:58 crc kubenswrapper[4962]: I1003 12:50:58.402913 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:58Z","lastTransitionTime":"2025-10-03T12:50:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.226529 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5blzz"
Oct 03 12:50:59 crc kubenswrapper[4962]: E1003 12:50:59.226819 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5blzz" podUID="f2989e38-d4e7-42c9-8959-f87168a4ac14"
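Every sync failure in this stretch reduces to the same check: the kubelet finds no CNI configuration under /etc/kubernetes/cni/net.d/. A stdlib-only sketch of that directory probe; the path is verbatim from the error, the probe itself is illustrative:

```python
import pathlib

# Path copied verbatim from the kubelet error; the probe is illustrative.
cni_dir = pathlib.Path("/etc/kubernetes/cni/net.d")

confs = sorted(p.name for p in cni_dir.iterdir()
               if p.suffix in {".conf", ".conflist", ".json"}) if cni_dir.is_dir() else []

# An empty result is the NetworkPluginNotReady state logged above: the
# network provider has not written its CNI configuration yet.
print(confs if confs else "no CNI configuration file found")
```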
pod="openshift-multus/network-metrics-daemon-5blzz" podUID="f2989e38-d4e7-42c9-8959-f87168a4ac14" Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.324596 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.324905 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.325005 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.325095 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.325170 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:59Z","lastTransitionTime":"2025-10-03T12:50:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.401385 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.401426 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.401436 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.401451 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.401462 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:59Z","lastTransitionTime":"2025-10-03T12:50:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.405489 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f2989e38-d4e7-42c9-8959-f87168a4ac14-metrics-certs\") pod \"network-metrics-daemon-5blzz\" (UID: \"f2989e38-d4e7-42c9-8959-f87168a4ac14\") " pod="openshift-multus/network-metrics-daemon-5blzz" Oct 03 12:50:59 crc kubenswrapper[4962]: E1003 12:50:59.405606 4962 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 12:50:59 crc kubenswrapper[4962]: E1003 12:50:59.405688 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2989e38-d4e7-42c9-8959-f87168a4ac14-metrics-certs podName:f2989e38-d4e7-42c9-8959-f87168a4ac14 nodeName:}" failed. No retries permitted until 2025-10-03 12:51:31.405673512 +0000 UTC m=+99.809571347 (durationBeforeRetry 32s). 
Oct 03 12:50:59 crc kubenswrapper[4962]: E1003 12:50:59.413158 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f831723-ac1f-49ed-8733-e30832d406d9\\\",\\\"systemUUID\\\":\\\"16e8121c-ac81-46e8-9c72-10e496aaa780\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:59Z is after 
2025-08-24T17:21:41Z" Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.416432 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.416479 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.416513 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.416528 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.416538 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:59Z","lastTransitionTime":"2025-10-03T12:50:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:59 crc kubenswrapper[4962]: E1003 12:50:59.428070 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f831723-ac1f-49ed-8733-e30832d406d9\\\",\\\"systemUUID\\\":\\\"16e8121c-ac81-46e8-9c72-10e496aaa780\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:59Z is after 
2025-08-24T17:21:41Z" Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.462981 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.463115 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.463262 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.463399 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.463545 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:59Z","lastTransitionTime":"2025-10-03T12:50:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:59 crc kubenswrapper[4962]: E1003 12:50:59.475453 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:50:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f831723-ac1f-49ed-8733-e30832d406d9\\\",\\\"systemUUID\\\":\\\"16e8121c-ac81-46e8-9c72-10e496aaa780\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:50:59Z is after 
2025-08-24T17:21:41Z" Oct 03 12:50:59 crc kubenswrapper[4962]: E1003 12:50:59.475616 4962 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.478281 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.478318 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.478328 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.478344 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.478356 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:59Z","lastTransitionTime":"2025-10-03T12:50:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.581080 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.581129 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.581143 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.581188 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.581202 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:59Z","lastTransitionTime":"2025-10-03T12:50:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.683338 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.683377 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.683385 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.683400 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.683410 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:59Z","lastTransitionTime":"2025-10-03T12:50:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.786450 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.786482 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.786492 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.786508 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.786519 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:59Z","lastTransitionTime":"2025-10-03T12:50:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.889001 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.889046 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.889058 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.889074 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.889084 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:59Z","lastTransitionTime":"2025-10-03T12:50:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.991823 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.991852 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.991863 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.991875 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:50:59 crc kubenswrapper[4962]: I1003 12:50:59.991885 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:50:59Z","lastTransitionTime":"2025-10-03T12:50:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:00 crc kubenswrapper[4962]: I1003 12:51:00.093800 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:00 crc kubenswrapper[4962]: I1003 12:51:00.093839 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:00 crc kubenswrapper[4962]: I1003 12:51:00.093851 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:00 crc kubenswrapper[4962]: I1003 12:51:00.093867 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:00 crc kubenswrapper[4962]: I1003 12:51:00.093877 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:00Z","lastTransitionTime":"2025-10-03T12:51:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:00 crc kubenswrapper[4962]: I1003 12:51:00.195792 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:00 crc kubenswrapper[4962]: I1003 12:51:00.195822 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:00 crc kubenswrapper[4962]: I1003 12:51:00.195832 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:00 crc kubenswrapper[4962]: I1003 12:51:00.195845 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:00 crc kubenswrapper[4962]: I1003 12:51:00.195854 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:00Z","lastTransitionTime":"2025-10-03T12:51:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:00 crc kubenswrapper[4962]: I1003 12:51:00.227050 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:51:00 crc kubenswrapper[4962]: I1003 12:51:00.227105 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:51:00 crc kubenswrapper[4962]: I1003 12:51:00.227105 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:51:00 crc kubenswrapper[4962]: E1003 12:51:00.227180 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 12:51:00 crc kubenswrapper[4962]: E1003 12:51:00.227313 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 12:51:00 crc kubenswrapper[4962]: E1003 12:51:00.227390 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 12:51:00 crc kubenswrapper[4962]: I1003 12:51:00.298398 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:00 crc kubenswrapper[4962]: I1003 12:51:00.298451 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:00 crc kubenswrapper[4962]: I1003 12:51:00.298464 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:00 crc kubenswrapper[4962]: I1003 12:51:00.298481 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:00 crc kubenswrapper[4962]: I1003 12:51:00.298493 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:00Z","lastTransitionTime":"2025-10-03T12:51:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:00 crc kubenswrapper[4962]: I1003 12:51:00.401036 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:00 crc kubenswrapper[4962]: I1003 12:51:00.401097 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:00 crc kubenswrapper[4962]: I1003 12:51:00.401109 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:00 crc kubenswrapper[4962]: I1003 12:51:00.401124 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:00 crc kubenswrapper[4962]: I1003 12:51:00.401134 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:00Z","lastTransitionTime":"2025-10-03T12:51:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:00 crc kubenswrapper[4962]: I1003 12:51:00.503154 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:00 crc kubenswrapper[4962]: I1003 12:51:00.503217 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:00 crc kubenswrapper[4962]: I1003 12:51:00.503229 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:00 crc kubenswrapper[4962]: I1003 12:51:00.503248 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:00 crc kubenswrapper[4962]: I1003 12:51:00.503259 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:00Z","lastTransitionTime":"2025-10-03T12:51:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:00 crc kubenswrapper[4962]: I1003 12:51:00.605467 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:00 crc kubenswrapper[4962]: I1003 12:51:00.605512 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:00 crc kubenswrapper[4962]: I1003 12:51:00.605521 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:00 crc kubenswrapper[4962]: I1003 12:51:00.605536 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:00 crc kubenswrapper[4962]: I1003 12:51:00.605546 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:00Z","lastTransitionTime":"2025-10-03T12:51:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:00 crc kubenswrapper[4962]: I1003 12:51:00.707727 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:00 crc kubenswrapper[4962]: I1003 12:51:00.707771 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:00 crc kubenswrapper[4962]: I1003 12:51:00.707781 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:00 crc kubenswrapper[4962]: I1003 12:51:00.707795 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:00 crc kubenswrapper[4962]: I1003 12:51:00.707807 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:00Z","lastTransitionTime":"2025-10-03T12:51:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:00 crc kubenswrapper[4962]: I1003 12:51:00.810754 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:00 crc kubenswrapper[4962]: I1003 12:51:00.810785 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:00 crc kubenswrapper[4962]: I1003 12:51:00.810794 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:00 crc kubenswrapper[4962]: I1003 12:51:00.810807 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:00 crc kubenswrapper[4962]: I1003 12:51:00.810815 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:00Z","lastTransitionTime":"2025-10-03T12:51:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:00 crc kubenswrapper[4962]: I1003 12:51:00.912954 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:00 crc kubenswrapper[4962]: I1003 12:51:00.913006 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:00 crc kubenswrapper[4962]: I1003 12:51:00.913020 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:00 crc kubenswrapper[4962]: I1003 12:51:00.913043 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:00 crc kubenswrapper[4962]: I1003 12:51:00.913058 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:00Z","lastTransitionTime":"2025-10-03T12:51:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.016014 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.016081 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.016095 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.016112 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.016123 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:01Z","lastTransitionTime":"2025-10-03T12:51:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.118215 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.118250 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.118261 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.118276 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.118287 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:01Z","lastTransitionTime":"2025-10-03T12:51:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.221199 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.221260 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.221270 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.221296 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.221310 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:01Z","lastTransitionTime":"2025-10-03T12:51:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.226528 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5blzz" Oct 03 12:51:01 crc kubenswrapper[4962]: E1003 12:51:01.226733 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5blzz" podUID="f2989e38-d4e7-42c9-8959-f87168a4ac14" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.228148 4962 scope.go:117] "RemoveContainer" containerID="8bdd4ae057f0e89cf8aeff9645035e668883b7bb9ce4d6d083a3e64f6f7fc12a" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.325923 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.326440 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.326453 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.326475 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.326490 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:01Z","lastTransitionTime":"2025-10-03T12:51:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.434348 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.434391 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.434401 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.434415 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.434426 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:01Z","lastTransitionTime":"2025-10-03T12:51:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.537873 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.537917 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.537928 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.537943 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.537956 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:01Z","lastTransitionTime":"2025-10-03T12:51:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.573991 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-sdd6t_fbc64268-3e78-44a2-8116-b62b5c13f005/kube-multus/0.log" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.574052 4962 generic.go:334] "Generic (PLEG): container finished" podID="fbc64268-3e78-44a2-8116-b62b5c13f005" containerID="087c5eaff8dd2b9ccb73982ae2022d139077f3a93f750a8d7f7fe8bb4a8f643e" exitCode=1 Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.574098 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sdd6t" event={"ID":"fbc64268-3e78-44a2-8116-b62b5c13f005","Type":"ContainerDied","Data":"087c5eaff8dd2b9ccb73982ae2022d139077f3a93f750a8d7f7fe8bb4a8f643e"} Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.574699 4962 scope.go:117] "RemoveContainer" containerID="087c5eaff8dd2b9ccb73982ae2022d139077f3a93f750a8d7f7fe8bb4a8f643e" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.576416 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ksp7d_90186d9d-0ac4-4959-9fd8-b044098dc6ae/ovnkube-controller/2.log" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.579882 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" event={"ID":"90186d9d-0ac4-4959-9fd8-b044098dc6ae","Type":"ContainerStarted","Data":"57fcf06043f46f40efa58819dcfaaaa37c0426f8b80a9b322df4bdfb3ddbcddd"} Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.580470 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.590610 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82494e0a2ec6e8ba7a2e713a8faf9e20bbacde0875e31f6db9962ece1a1a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:01Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.603789 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sdd6t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbc64268-3e78-44a2-8116-b62b5c13f005\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087c5eaff8dd2b9ccb73982ae2022d139077f3a93f750a8d7f7fe8bb4a8f643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://087c5eaff8dd2b9ccb73982ae2022d139077f3a93f750a8d7f7fe8bb4a8f643e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T12:51:01Z\\\",\\\"message\\\":\\\"2025-10-03T12:50:16+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b16755df-1fce-45b7-a7d8-55a972e98425\\\\n2025-10-03T12:50:16+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b16755df-1fce-45b7-a7d8-55a972e98425 to /host/opt/cni/bin/\\\\n2025-10-03T12:50:16Z [verbose] multus-daemon started\\\\n2025-10-03T12:50:16Z [verbose] Readiness Indicator file check\\\\n2025-10-03T12:51:01Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktptf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-sdd6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:01Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.618106 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5blzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2989e38-d4e7-42c9-8959-f87168a4ac14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrzmw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrzmw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:27Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5blzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-03T12:51:01Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.632911 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53165c5e-33f0-44c9-b4fd-1da1c2b99a4e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86846072db8a357f62428197a22e8b8f829b013c9292a53091819ceb6eb7aeb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f016344ea9d266374fcbfbe80061313994a45b595ee3432a95b1bbd332b320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7249baacfeb01d1f25f6d9ce291033941226c4be4cba0376a47872b8b10f6ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"reso
urce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6e876d0d7a29e1ceb505283a70c9b8fd0b4eee1dec1a2f04f3eeec4b09f4c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:01Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.640412 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.640457 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.640471 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.640489 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.640502 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:01Z","lastTransitionTime":"2025-10-03T12:51:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.650144 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:01Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.670347 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6c913a7a10a94baddfc4f58df9f5bc15cb9e7dd4b91dd313eca9f1a683259b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d69e23034588e75e4716b87b31f129f088940d29983f634a3d1b1e15289d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:01Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.685850 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8d669985493d735cad4d4a09f36401be9ede84ee7e20e82293e7d5331c085e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:01Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.699781 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:01Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.722389 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6263076-d49e-4519-b516-d50ec2fb9d88\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb41be48a3ebc6ba936c2ec3b2a483349e8d597d0483a618a0d452a2ee13538c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b1520abe4d95d8e776c2f589c8f7ada74c46eef43f1bcaf441272ae75241a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcb83ba4c1a1eb1a63fce31b68210e9af2c5e722097b457b4642b4c32f60c2e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d334e30066203e6856129958923dcb174b5541a9e41e11a4e574ac59ec86c861\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbdc43e3dcb77ff3ee44368fa74c53d2f360a508c04525f15879db5c6b2209b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\
\\":{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:01Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.737543 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f337dda-821a-494b-98d6-775f0c1beb7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6ff72af1d85ecca71976d9d16b8ca0ee91d035662920a8c58d739e560c12ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8eb3048afe2307009fd3ccfbc314a1b60389c0ba17bad1b898578cb1b151df8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8877bc7bae269cbbb5a1452d42917b7a32af9bc45a303b25be8b1058ef937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://329c763b93db8ac9a4cd93231684840cdfa4c6f9d1318fcfc641f4e86b4fc39e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22239bfd4bc82de5ba41662ff973e8c1e0fa462857cae705d4940afccc550782\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T12:50:06Z\\\",\\\"message\\\":\\\"W1003 12:49:55.901194 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1003 12:49:55.901561 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759495795 cert, and key in /tmp/serving-cert-3691789359/serving-signer.crt, /tmp/serving-cert-3691789359/serving-signer.key\\\\nI1003 12:49:56.093911 1 observer_polling.go:159] Starting file observer\\\\nW1003 12:49:56.097720 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1003 12:49:56.097855 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 12:49:56.098660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3691789359/tls.crt::/tmp/serving-cert-3691789359/tls.key\\\\\\\"\\\\nF1003 12:50:06.505868 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dafe1e9291df9776980832fd7d43a1d5e4caf4cd6824740f9e1f2282a6c4bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:01Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.742315 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.742346 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.742359 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.742375 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.742387 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:01Z","lastTransitionTime":"2025-10-03T12:51:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.752248 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"348e88ca-6e7d-4ab2-b45d-553888f848c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9a16c565e235b0d93e373cc23d6d8ee7d889a7caec61ef9ff50c3e904cd3893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e84d29d6915b3a6b184d69f18945d3fd277719a6cc6d503b2899df586882a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://372bc3d3078a6ab39767510fe9090adb082ed3de331d851831b710df064bef5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e104d4f9cfb2e600dba4f462bdf9ee4aa5453afbb5174857fc61fc85d90f9642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e104d4f9cfb2e600dba4f462bdf9ee4aa5453afbb5174857fc61fc85d90f9642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:01Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.770297 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-44fmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fba23cccf497fd12c8f5ff3c93b757e30a8f5aab4ce0deca0f3ec0a49232d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\
\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-44fmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:01Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.786405 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:01Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.800035 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pnhsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edada57d-7295-4f56-a850-caf58ebe77a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f1a223717c0f2030da538233d9f7b621c4b2e304dcb65316b1aef7741f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctqjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pnhsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:01Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.815491 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g64qv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a46fb9dd-57b2-4300-b9ab-2d40bcc4cd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91103ceb967093cdbedc1aa6b2d68dd2c1cc96a1902cd0bb8f380a3d42e54497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7rc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g64qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:01Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.828128 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e40a27aa-682e-4b25-a198-8054ba9f2477\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa233ac457a03e63e2662227f70bb88c59e3ebecf41454f217278fb34ab8dc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c066a5dc6399e5df82f7425020e84a5354658a5b77b1014a74126539aa1c738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-46vck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:01Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.845148 4962 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.845187 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.845198 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.845214 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.845225 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:01Z","lastTransitionTime":"2025-10-03T12:51:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.848682 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90186d9d-0ac4-4959-9fd8-b044098dc6ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bdd4ae057f0e89cf8aeff9645035e668883b7bb
9ce4d6d083a3e64f6f7fc12a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bdd4ae057f0e89cf8aeff9645035e668883b7bb9ce4d6d083a3e64f6f7fc12a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T12:50:36Z\\\",\\\"message\\\":\\\"val\\\\nI1003 12:50:36.331444 6633 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 12:50:36.331482 6633 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1003 12:50:36.331508 6633 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1003 12:50:36.331546 6633 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1003 12:50:36.331576 6633 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 12:50:36.331603 6633 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1003 12:50:36.331630 6633 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 12:50:36.332846 6633 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 12:50:36.332881 6633 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1003 12:50:36.332887 6633 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1003 12:50:36.332916 6633 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1003 12:50:36.332927 6633 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 12:50:36.332923 6633 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1003 12:50:36.332968 6633 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 12:50:36.332969 6633 factory.go:656] Stopping watch factory\\\\nI1003 12:50:36.332975 6633 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ksp7d_openshift-ovn-kubernetes(90186d9d-0ac4-4959-9fd8-b044098dc6ae)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ksp7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:01Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.861461 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kchhs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1198234-8682-43dc-9945-a826eba33888\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d869da7b2fe056fe8079fd55c78e1b02c5dbc137172a9c75514882a0a873714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6wvh
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c06137bf819289f2987b5ce188c51c2c5d98a376a0343e2eb121498221019bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6wvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kchhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:01Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.880559 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82494e0a2ec6e8ba7a2e713a8faf9e20bbacde0875e31f6db9962ece1a1a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:01Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.894266 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sdd6t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbc64268-3e78-44a2-8116-b62b5c13f005\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087c5eaff8dd2b9ccb73982ae2022d139077f3a93f750a8d7f7fe8bb4a8f643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://087c5eaff8dd2b9ccb73982ae2022d139077f3a93f750a8d7f7fe8bb4a8f643e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T12:51:01Z\\\",\\\"message\\\":\\\"2025-10-03T12:50:16+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b16755df-1fce-45b7-a7d8-55a972e98425\\\\n2025-10-03T12:50:16+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b16755df-1fce-45b7-a7d8-55a972e98425 to /host/opt/cni/bin/\\\\n2025-10-03T12:50:16Z [verbose] multus-daemon started\\\\n2025-10-03T12:50:16Z [verbose] Readiness Indicator file check\\\\n2025-10-03T12:51:01Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktptf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-sdd6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:01Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.908210 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5blzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2989e38-d4e7-42c9-8959-f87168a4ac14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrzmw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrzmw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:27Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5blzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-03T12:51:01Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.924046 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53165c5e-33f0-44c9-b4fd-1da1c2b99a4e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86846072db8a357f62428197a22e8b8f829b013c9292a53091819ceb6eb7aeb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f016344ea9d266374fcbfbe80061313994a45b595ee3432a95b1bbd332b320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7249baacfeb01d1f25f6d9ce291033941226c4be4cba0376a47872b8b10f6ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"reso
urce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6e876d0d7a29e1ceb505283a70c9b8fd0b4eee1dec1a2f04f3eeec4b09f4c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:01Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.941864 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:01Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.947178 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.947245 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.947259 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.947274 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.947601 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:01Z","lastTransitionTime":"2025-10-03T12:51:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.959223 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6c913a7a10a94baddfc4f58df9f5bc15cb9e7dd4b91dd313eca9f1a683259b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d69e23034588e75e4716b87b31f129f088940d29983f634a3d1b1e15289d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:01Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.971685 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8d669985493d735cad4d4a09f36401be9ede84ee7e20e82293e7d5331c085e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:01Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:01 crc kubenswrapper[4962]: I1003 12:51:01.982647 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:01Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.004331 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6263076-d49e-4519-b516-d50ec2fb9d88\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb41be48a3ebc6ba936c2ec3b2a483349e8d597d0483a618a0d452a2ee13538c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b1520abe4d95d8e776c2f589c8f7ada74c46eef43f1bcaf441272ae75241a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcb83ba4c1a1eb1a63fce31b68210e9af2c5e722097b457b4642b4c32f60c2e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d334e30066203e6856129958923dcb174b5541a9e41e11a4e574ac59ec86c861\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbdc43e3dcb77ff3ee44368fa74c53d2f360a508c04525f15879db5c6b2209b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\
\\":{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.023869 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f337dda-821a-494b-98d6-775f0c1beb7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6ff72af1d85ecca71976d9d16b8ca0ee91d035662920a8c58d739e560c12ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8eb3048afe2307009fd3ccfbc314a1b60389c0ba17bad1b898578cb1b151df8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8877bc7bae269cbbb5a1452d42917b7a32af9bc45a303b25be8b1058ef937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://329c763b93db8ac9a4cd93231684840cdfa4c6f9d1318fcfc641f4e86b4fc39e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22239bfd4bc82de5ba41662ff973e8c1e0fa462857cae705d4940afccc550782\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T12:50:06Z\\\",\\\"message\\\":\\\"W1003 12:49:55.901194 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1003 12:49:55.901561 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759495795 cert, and key in /tmp/serving-cert-3691789359/serving-signer.crt, /tmp/serving-cert-3691789359/serving-signer.key\\\\nI1003 12:49:56.093911 1 observer_polling.go:159] Starting file observer\\\\nW1003 12:49:56.097720 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1003 12:49:56.097855 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 12:49:56.098660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3691789359/tls.crt::/tmp/serving-cert-3691789359/tls.key\\\\\\\"\\\\nF1003 12:50:06.505868 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dafe1e9291df9776980832fd7d43a1d5e4caf4cd6824740f9e1f2282a6c4bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.034884 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"348e88ca-6e7d-4ab2-b45d-553888f848c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9a16c565e235b0d93e373cc23d6d8ee7d889a7caec61ef9ff50c3e904cd3893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e84d29d6915b3a6b184d69f18945d3fd277719a6cc6d503b2899df586882a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\
\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://372bc3d3078a6ab39767510fe9090adb082ed3de331d851831b710df064bef5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e104d4f9cfb2e600dba4f462bdf9ee4aa5453afbb5174857fc61fc85d90f9642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e104d4f9cfb2e600dba4f462bdf9ee4aa5453afbb5174857fc61fc85d90f9642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.047483 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-44fmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fba23cccf497fd12c8f5ff3c93b757e30a8f5aab4ce0deca0f3ec0a49232d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-44fmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.049881 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.049906 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:02 crc 
kubenswrapper[4962]: I1003 12:51:02.049915 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.049930 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.049941 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:02Z","lastTransitionTime":"2025-10-03T12:51:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.067134 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90186d9d-0ac4-4959-9fd8-b044098dc6ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57fcf06043f46f40efa58819dcfaaaa37c0426f8
b80a9b322df4bdfb3ddbcddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bdd4ae057f0e89cf8aeff9645035e668883b7bb9ce4d6d083a3e64f6f7fc12a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T12:50:36Z\\\",\\\"message\\\":\\\"val\\\\nI1003 12:50:36.331444 6633 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 12:50:36.331482 6633 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1003 12:50:36.331508 6633 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1003 12:50:36.331546 6633 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1003 12:50:36.331576 6633 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 12:50:36.331603 6633 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1003 12:50:36.331630 6633 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 12:50:36.332846 6633 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 12:50:36.332881 6633 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1003 12:50:36.332887 6633 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1003 12:50:36.332916 6633 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1003 12:50:36.332927 6633 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 12:50:36.332923 6633 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1003 12:50:36.332968 6633 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 12:50:36.332969 6633 factory.go:656] Stopping watch factory\\\\nI1003 12:50:36.332975 6633 handler.go:208] Removed *v1.NetworkPolicy 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ksp7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.079287 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kchhs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1198234-8682-43dc-9945-a826eba33888\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d869da7b2fe056fe8079fd55c78e1b02c5dbc137172a9c75514882a0a873714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6wvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c06137bf819289f2987b5ce188c51c2c5d98a376a0343e2eb121498221019bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6wvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kchhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 
12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.091928 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.102495 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pnhsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edada57d-7295-4f56-a850-caf58ebe77a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f1a223717c0f2030da538233d9f7b621c4b2e304dcb65316b1aef7741f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctqjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pnhsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.115474 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g64qv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a46fb9dd-57b2-4300-b9ab-2d40bcc4cd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91103ceb967093cdbedc1aa6b2d68dd2c1cc96a1902cd0bb8f380a3d42e54497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7rc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g64qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.125749 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e40a27aa-682e-4b25-a198-8054ba9f2477\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa233ac457a03e63e2662227f70bb88c59e3ebecf41454f217278fb34ab8dc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c066a5dc6399e5df82f7425020e84a5354658a5b77b1014a74126539aa1c738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-46vck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.152235 4962 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.152270 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.152280 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.152294 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.152303 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:02Z","lastTransitionTime":"2025-10-03T12:51:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.227017 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.227021 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:51:02 crc kubenswrapper[4962]: E1003 12:51:02.227181 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.227043 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:51:02 crc kubenswrapper[4962]: E1003 12:51:02.227340 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 12:51:02 crc kubenswrapper[4962]: E1003 12:51:02.227430 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.241444 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82494e0a2ec6e8ba7a2e713a8faf9e20bbacde0875e31f6db9962ece1a1a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.253317 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sdd6t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbc64268-3e78-44a2-8116-b62b5c13f005\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087c5eaff8dd2b9ccb73982ae2022d139077f3a93f750a8d7f7fe8bb4a8f643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://087c5eaff8dd2b9ccb73982ae2022d139077f3a93f750a8d7f7fe8bb4a8f643e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T12:51:01Z\\\",\\\"message\\\":\\\"2025-10-03T12:50:16+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b16755df-1fce-45b7-a7d8-55a972e98425\\\\n2025-10-03T12:50:16+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b16755df-1fce-45b7-a7d8-55a972e98425 to /host/opt/cni/bin/\\\\n2025-10-03T12:50:16Z [verbose] multus-daemon started\\\\n2025-10-03T12:50:16Z [verbose] Readiness Indicator file check\\\\n2025-10-03T12:51:01Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the 
condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktptf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sdd6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.254572 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.254607 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.254616 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.254630 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.254653 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:02Z","lastTransitionTime":"2025-10-03T12:51:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.263123 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5blzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2989e38-d4e7-42c9-8959-f87168a4ac14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrzmw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrzmw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:27Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5blzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.275145 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53165c5e-33f0-44c9-b4fd-1da1c2b99a4e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86846072db8a357f62428197a22e8b8f829b013c9292a53091819ceb6eb7aeb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f016344ea9d266374fcbfbe80061313994a45b595ee3432a95b1bbd332b320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7249baacfeb01d1f25f6d9ce291033941226c4be4cba0376a47872b8b10f6ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6e876d0d7a29e1ceb505283a70c9b8fd0b4eee1dec1a2f04f3eeec4b09f4c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.287520 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.299307 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6c913a7a10a94baddfc4f58df9f5bc15cb9e7dd4b91dd313eca9f1a683259b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d69e23034588e75e4716b87b31f129f088940d29983f634a3d1b1e15289d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.309268 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8d669985493d735cad4d4a09f36401be9ede84ee7e20e82293e7d5331c085e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.318948 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.337705 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6263076-d49e-4519-b516-d50ec2fb9d88\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb41be48a3ebc6ba936c2ec3b2a483349e8d597d0483a618a0d452a2ee13538c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b1520abe4d95d8e776c2f589c8f7ada74c46eef43f1bcaf441272ae75241a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcb83ba4c1a1eb1a63fce31b68210e9af2c5e722097b457b4642b4c32f60c2e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d334e30066203e6856129958923dcb174b5541a
9e41e11a4e574ac59ec86c861\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbdc43e3dcb77ff3ee44368fa74c53d2f360a508c04525f15879db5c6b2209b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.349618 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f337dda-821a-494b-98d6-775f0c1beb7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6ff72af1d85ecca71976d9d16b8ca0ee91d035662920a8c58d739e560c12ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8eb3048afe2307009fd3ccfbc314a1b
60389c0ba17bad1b898578cb1b151df8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8877bc7bae269cbbb5a1452d42917b7a32af9bc45a303b25be8b1058ef937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://329c763b93db8ac9a4cd93231684840cdfa4c6f9d1318fcfc641f4e86b4fc39e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22239bfd4bc82de5ba41662ff973e8c1e0fa462857cae705d4940afccc550782\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T12:50:06Z\\\",\\\"message\\\":\\\"W1003 12:49:55.901194 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1003 12:49:55.901561 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759495795 cert, and key in /tmp/serving-cert-3691789359/serving-signer.crt, /tmp/serving-cert-3691789359/serving-signer.key\\\\nI1003 12:49:56.093911 1 observer_polling.go:159] Starting file observer\\\\nW1003 12:49:56.097720 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1003 12:49:56.097855 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 12:49:56.098660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3691789359/tls.crt::/tmp/serving-cert-3691789359/tls.key\\\\\\\"\\\\nF1003 12:50:06.505868 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dafe1e9291df9776980832fd7d43a1d5e4caf4cd6824740f9e1f2282a6c4bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.356246 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.356283 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.356293 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.356310 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.356321 4962 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:02Z","lastTransitionTime":"2025-10-03T12:51:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.361273 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"348e88ca-6e7d-4ab2-b45d-553888f848c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9a16c565e235b0d93e373cc23d6d8ee7d889a7caec61ef9ff50c3e904cd3893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e84d29d6915b3a6b184d69f18945d3fd277719a6cc6d503b2899df586882a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://372bc3d3078a6ab39767510fe9090adb082ed3de331d851831b710df064bef5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controlle
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e104d4f9cfb2e600dba4f462bdf9ee4aa5453afbb5174857fc61fc85d90f9642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e104d4f9cfb2e600dba4f462bdf9ee4aa5453afbb5174857fc61fc85d90f9642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.374343 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-44fmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fba23cccf497fd12c8f5ff3c93b757e30a8f5aab4ce0deca0f3ec0a49232d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-44fmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.385795 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.396667 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pnhsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edada57d-7295-4f56-a850-caf58ebe77a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f1a223717c0f2030da538233d9f7b621c4b2e304dcb65316b1aef7741f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctqjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pnhsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.406283 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g64qv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a46fb9dd-57b2-4300-b9ab-2d40bcc4cd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91103ceb967093cdbedc1aa6b2d68dd2c1cc96a1902cd0bb8f380a3d42e54497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7rc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g64qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.417829 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e40a27aa-682e-4b25-a198-8054ba9f2477\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa233ac457a03e63e2662227f70bb88c59e3ebecf41454f217278fb34ab8dc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c066a5dc6399e5df82f7425020e84a5354658a5b77b1014a74126539aa1c738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-46vck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.435971 4962 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90186d9d-0ac4-4959-9fd8-b044098dc6ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57fcf06043f46f40efa58819dcfaaaa37c0426f8b80a9b322df4bdfb3ddbcddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bdd4ae057f0e89cf8aeff9645035e668883b7bb9ce4d6d083a3e64f6f7fc12a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T12:50:36Z\\\",\\\"message\\\":\\\"val\\\\nI1003 12:50:36.331444 6633 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 12:50:36.331482 6633 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1003 12:50:36.331508 6633 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1003 12:50:36.331546 6633 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1003 12:50:36.331576 6633 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 12:50:36.331603 6633 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1003 12:50:36.331630 6633 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 12:50:36.332846 6633 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 12:50:36.332881 6633 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1003 12:50:36.332887 6633 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1003 12:50:36.332916 6633 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1003 12:50:36.332927 6633 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 12:50:36.332923 6633 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1003 12:50:36.332968 6633 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 12:50:36.332969 6633 factory.go:656] Stopping watch factory\\\\nI1003 12:50:36.332975 6633 handler.go:208] Removed *v1.NetworkPolicy 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ksp7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.446654 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kchhs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1198234-8682-43dc-9945-a826eba33888\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d869da7b2fe056fe8079fd55c78e1b02c5dbc137172a9c75514882a0a873714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6wvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c06137bf819289f2987b5ce188c51c2c5d98a376a0343e2eb121498221019bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6wvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kchhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 
12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.458273 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.458324 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.458334 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.458348 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.458358 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:02Z","lastTransitionTime":"2025-10-03T12:51:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.560145 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.560195 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.560205 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.560222 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.560233 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:02Z","lastTransitionTime":"2025-10-03T12:51:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.584355 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ksp7d_90186d9d-0ac4-4959-9fd8-b044098dc6ae/ovnkube-controller/3.log" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.584890 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ksp7d_90186d9d-0ac4-4959-9fd8-b044098dc6ae/ovnkube-controller/2.log" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.586852 4962 generic.go:334] "Generic (PLEG): container finished" podID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerID="57fcf06043f46f40efa58819dcfaaaa37c0426f8b80a9b322df4bdfb3ddbcddd" exitCode=1 Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.586920 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" event={"ID":"90186d9d-0ac4-4959-9fd8-b044098dc6ae","Type":"ContainerDied","Data":"57fcf06043f46f40efa58819dcfaaaa37c0426f8b80a9b322df4bdfb3ddbcddd"} Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.586959 4962 scope.go:117] "RemoveContainer" containerID="8bdd4ae057f0e89cf8aeff9645035e668883b7bb9ce4d6d083a3e64f6f7fc12a" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.587709 4962 scope.go:117] "RemoveContainer" containerID="57fcf06043f46f40efa58819dcfaaaa37c0426f8b80a9b322df4bdfb3ddbcddd" Oct 03 12:51:02 crc kubenswrapper[4962]: E1003 12:51:02.587958 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-ksp7d_openshift-ovn-kubernetes(90186d9d-0ac4-4959-9fd8-b044098dc6ae)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.589536 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-sdd6t_fbc64268-3e78-44a2-8116-b62b5c13f005/kube-multus/0.log" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.589569 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sdd6t" event={"ID":"fbc64268-3e78-44a2-8116-b62b5c13f005","Type":"ContainerStarted","Data":"55381b2122f7d63231ef917dc3901e367b66dd7e75eb6fb7c3d049b81113c77f"} Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.601416 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82494e0a2ec6e8ba7a2e713a8faf9e20bbacde0875e31f6db9962ece1a1a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.612925 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sdd6t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbc64268-3e78-44a2-8116-b62b5c13f005\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://087c5eaff8dd2b9ccb73982ae2022d139077f3a93f750a8d7f7fe8bb4a8f643e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://087c5eaff8dd2b9ccb73982ae2022d139077f3a93f750a8d7f7fe8bb4a8f643e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T12:51:01Z\\\",\\\"message\\\":\\\"2025-10-03T12:50:16+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b16755df-1fce-45b7-a7d8-55a972e98425\\\\n2025-10-03T12:50:16+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b16755df-1fce-45b7-a7d8-55a972e98425 to /host/opt/cni/bin/\\\\n2025-10-03T12:50:16Z [verbose] multus-daemon started\\\\n2025-10-03T12:50:16Z [verbose] Readiness Indicator file check\\\\n2025-10-03T12:51:01Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktptf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-sdd6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.623551 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5blzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2989e38-d4e7-42c9-8959-f87168a4ac14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrzmw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrzmw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:27Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5blzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.633060 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8d669985493d735cad4d4a09f36401be9ede84ee7e20e82293e7d5331c085e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.643292 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.654207 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53165c5e-33f0-44c9-b4fd-1da1c2b99a4e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86846072db8a357f62428197a22e8b8f829b013c9292a53091819ceb6eb7aeb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f016344ea9d266374fcbfbe80061313994a45b595ee3432a95b1bbd332b320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7249baacfeb01d1f25f6d9ce291033941226c4be4cba0376a47872b8b10f6ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6e876d0d7a29e1ceb505283a70c9b8fd0b4eee1dec1a2f04f3eeec4b09f4c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.662260 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.662282 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.662290 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.662302 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.662311 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:02Z","lastTransitionTime":"2025-10-03T12:51:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.664885 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.675472 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6c913a7a10a94baddfc4f58df9f5bc15cb9e7dd4b91dd313eca9f1a683259b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d69e23034588e75e4716b87b31f129f088940d29983f634a3d1b1e15289d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.688056 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-44fmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fba23cccf497fd12c8f5ff3c93b757e30a8f5aab4ce0deca0f3ec0a49232d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-44fmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.704358 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6263076-d49e-4519-b516-d50ec2fb9d88\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb41be48a3ebc6ba936c2ec3b2a483349e8d597d0483a618a0d452a2ee13538c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b1520abe4d95d8e776c2f589c8f7ada74c46eef43f1bcaf441272ae75241a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcb83ba4c1a1eb1a63fce31b68210e9af2c5e722097b457b4642b4c32f60c2e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d334e30066203e6856129958923dcb174b5541a
9e41e11a4e574ac59ec86c861\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbdc43e3dcb77ff3ee44368fa74c53d2f360a508c04525f15879db5c6b2209b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.715770 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f337dda-821a-494b-98d6-775f0c1beb7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6ff72af1d85ecca71976d9d16b8ca0ee91d035662920a8c58d739e560c12ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8eb3048afe2307009fd3ccfbc314a1b
60389c0ba17bad1b898578cb1b151df8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8877bc7bae269cbbb5a1452d42917b7a32af9bc45a303b25be8b1058ef937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://329c763b93db8ac9a4cd93231684840cdfa4c6f9d1318fcfc641f4e86b4fc39e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22239bfd4bc82de5ba41662ff973e8c1e0fa462857cae705d4940afccc550782\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T12:50:06Z\\\",\\\"message\\\":\\\"W1003 12:49:55.901194 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1003 12:49:55.901561 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759495795 cert, and key in /tmp/serving-cert-3691789359/serving-signer.crt, /tmp/serving-cert-3691789359/serving-signer.key\\\\nI1003 12:49:56.093911 1 observer_polling.go:159] Starting file observer\\\\nW1003 12:49:56.097720 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1003 12:49:56.097855 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 12:49:56.098660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3691789359/tls.crt::/tmp/serving-cert-3691789359/tls.key\\\\\\\"\\\\nF1003 12:50:06.505868 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dafe1e9291df9776980832fd7d43a1d5e4caf4cd6824740f9e1f2282a6c4bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.725048 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"348e88ca-6e7d-4ab2-b45d-553888f848c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9a16c565e235b0d93e373cc23d6d8ee7d889a7caec61ef9ff50c3e904cd3893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e84d29d6915b3a6b184d69f18945d3fd277719a6cc6d503b2899df586882a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://372bc3d3078a6ab39767510fe9090adb082ed3de331d851831b710df064bef5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e104d4f9cfb2e600dba4f462bdf9ee4aa5453afbb5174857fc61fc85d90f9642\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e104d4f9cfb2e600dba4f462bdf9ee4aa5453afbb5174857fc61fc85d90f9642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.733397 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g64qv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a46fb9dd-57b2-4300-b9ab-2d40bcc4cd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91103ceb967093cdbedc1aa6b2d68dd2c1cc96a1902cd0bb8f380a3d42e54497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7rc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod 
\"openshift-dns\"/\"node-resolver-g64qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.742732 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e40a27aa-682e-4b25-a198-8054ba9f2477\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa233ac457a03e63e2662227f70bb88c59e3ebecf41454f217278fb34ab8dc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c066a5dc6399e5df82f7425020e84a5354658a5b77b1014a74126539aa1c738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\
\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-46vck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.757709 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90186d9d-0ac4-4959-9fd8-b044098dc6ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\
\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57fcf06043f46f40efa58819dcfaaaa37c0426f8b80a9b322df4bdfb3ddbcddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bdd4ae057f0e89cf8aeff9645035e668883b7bb9ce4d6d083a3e64f6f7fc12a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T12:50:36Z\\\",\\\"message\\\":\\\"val\\\\nI1003 12:50:36.331444 6633 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 12:50:36.331482 6633 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1003 12:50:36.331508 6633 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1003 12:50:36.331546 6633 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1003 12:50:36.331576 6633 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 12:50:36.331603 6633 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1003 12:50:36.331630 6633 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 12:50:36.332846 6633 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 12:50:36.332881 6633 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1003 12:50:36.332887 6633 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1003 12:50:36.332916 6633 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1003 12:50:36.332927 6633 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 12:50:36.332923 6633 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1003 12:50:36.332968 6633 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 12:50:36.332969 6633 factory.go:656] Stopping 
watch factory\\\\nI1003 12:50:36.332975 6633 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57fcf06043f46f40efa58819dcfaaaa37c0426f8b80a9b322df4bdfb3ddbcddd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T12:51:02Z\\\",\\\"message\\\":\\\"089997 6951 lb_config.go:1031] Cluster endpoints for openshift-kube-storage-version-migrator-operator/metrics for network=default are: map[]\\\\nI1003 12:51:02.088389 6951 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}\\\\nI1003 12:51:02.090054 6951 services_controller.go:360] Finished syncing service api on namespace openshift-oauth-apiserver for network=default : 3.971777ms\\\\nI1003 12:51:02.090075 6951 services_controller.go:356] Processing sync for service openshift-network-diagnostics/network-check-target for network=default\\\\nI1003 12:51:02.089757 6951 ovnkube.go:599] Stopped ovnkube\\\\nI1003 12:51:02.089611 6951 services_controller.go:453] Built service openshift-kube-scheduler/scheduler template LB for network=default: []services.LB{}\\\\nI1003 12:51:02.090155 6951 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1003 12:51:02.090195 6951 services_controller.go:454] Service openshift-kube-scheduler/scheduler for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1003 12:51:02.090220 6951 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ksp7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.764220 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.764264 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.764275 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.764291 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.764302 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:02Z","lastTransitionTime":"2025-10-03T12:51:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.767184 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kchhs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1198234-8682-43dc-9945-a826eba33888\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d869da7b2fe056fe8079fd55c78e1b02c5dbc137172a9c75514882a0a873714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6wvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c06137bf819289f2987b5ce188c51c2c5d98a376a0343e2eb121498221019bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6wvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kchhs\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.778210 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.787130 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pnhsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edada57d-7295-4f56-a850-caf58ebe77a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f1a223717c0f2030da538233d9f7b621c4b2e304dcb65316b1aef7741f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctqjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pnhsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.799579 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82494e0a2ec6e8ba7a2e713a8faf9e20bbacde0875e31f6db9962ece1a1a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.812504 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sdd6t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbc64268-3e78-44a2-8116-b62b5c13f005\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55381b2122f7d63231ef917dc3901e367b66dd7e75eb6fb7c3d049b81113c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://087c5eaff8dd2b9ccb73982ae2022d139077f3a93f750a8d7f7fe8bb4a8f643e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T12:51:01Z\\\",\\\"message\\\":\\\"2025-10-03T12:50:16+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b16755df-1fce-45b7-a7d8-55a972e98425\\\\n2025-10-03T12:50:16+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b16755df-1fce-45b7-a7d8-55a972e98425 to /host/opt/cni/bin/\\\\n2025-10-03T12:50:16Z [verbose] multus-daemon started\\\\n2025-10-03T12:50:16Z [verbose] Readiness Indicator file check\\\\n2025-10-03T12:51:01Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktptf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sdd6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.823146 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5blzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2989e38-d4e7-42c9-8959-f87168a4ac14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrzmw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrzmw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:27Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5blzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.833459 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8d669985493d735cad4d4a09f36401be9ede84ee7e20e82293e7d5331c085e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.843704 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.855613 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53165c5e-33f0-44c9-b4fd-1da1c2b99a4e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86846072db8a357f62428197a22e8b8f829b013c9292a53091819ceb6eb7aeb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f016344ea9d266374fcbfbe80061313994a45b595ee3432a95b1bbd332b320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7249baacfeb01d1f25f6d9ce291033941226c4be4cba0376a47872b8b10f6ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6e876d0d7a29e1ceb505283a70c9b8fd0b4eee1dec1a2f04f3eeec4b09f4c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.866193 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.866223 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.866235 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.866250 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.866258 4962 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:02Z","lastTransitionTime":"2025-10-03T12:51:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.867061 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.888964 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6c913a7a10a94baddfc4f58df9f5bc15cb9e7dd4b91dd313eca9f1a683259b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d69e23034588e75e4716b87b31f129f088940d29983f634a3d1b1e15289d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.924229 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-44fmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fba23cccf497fd12c8f5ff3c93b757e30a8f5aab4ce0deca0f3ec0a49232d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt
\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.1
26.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-44fmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.943161 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6263076-d49e-4519-b516-d50ec2fb9d88\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb41be48a3ebc6ba936c2ec3b2a483349e8d597d0483a618a0d452a2ee13538c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b1520abe4d95d8e776c2f589c8f7ada74c46eef43f1bcaf441272ae75241a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcb83ba4c1a1eb1a63fce31b68210e9af2c5e722097b457b4642b4c32f60c2e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d334e30066203e6856129958923dcb174b5541a9e41e11a4e574ac59ec86c861\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbdc43e3dcb77ff3ee44368fa74c53d2f360a508c04525f15879db5c6b2209b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.958424 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f337dda-821a-494b-98d6-775f0c1beb7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6ff72af1d85ecca71976d9d16b8ca0ee91d035662920a8c58d739e560c12ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8eb3048afe2307009fd3ccfbc314a1b60389c0ba17bad1b898578cb1b151df8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8877bc7bae269cbbb5a1452d42917b7a32af9bc45a303b25be8b1058ef937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://329c763b93db8ac9a4cd93231684840cdfa4c6f9d1318fcfc641f4e86b4fc39e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22239bfd4bc82de5ba41662ff973e8c1e0fa462857cae705d4940afccc550782\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T12:50:06Z\\\",\\\"message\\\":\\\"W1003 12:49:55.901194 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1003 12:49:55.901561 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759495795 cert, and key in /tmp/serving-cert-3691789359/serving-signer.crt, /tmp/serving-cert-3691789359/serving-signer.key\\\\nI1003 12:49:56.093911 1 observer_polling.go:159] Starting file observer\\\\nW1003 12:49:56.097720 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1003 12:49:56.097855 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 12:49:56.098660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3691789359/tls.crt::/tmp/serving-cert-3691789359/tls.key\\\\\\\"\\\\nF1003 12:50:06.505868 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dafe1e9291df9776980832fd7d43a1d5e4caf4cd6824740f9e1f2282a6c4bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.968226 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.968259 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.968268 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.968286 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.968300 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:02Z","lastTransitionTime":"2025-10-03T12:51:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.968789 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"348e88ca-6e7d-4ab2-b45d-553888f848c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9a16c565e235b0d93e373cc23d6d8ee7d889a7caec61ef9ff50c3e904cd3893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e84d29d6915b3a6b184d69f18945d3fd277719a6cc6d503b2899df586882a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://372bc3d3078a6ab39767510fe9090adb082ed3de331d851831b710df064bef5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e104d4f9cfb2e600dba4f462bdf9ee4aa5453afbb5174857fc61fc85d90f9642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e104d4f9cfb2e600dba4f462bdf9ee4aa5453afbb5174857fc61fc85d90f9642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:02 crc kubenswrapper[4962]: I1003 12:51:02.982187 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g64qv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a46fb9dd-57b2-4300-b9ab-2d40bcc4cd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91103ceb967093cdbedc1aa6b2d68dd2c1cc96a1902cd0bb8f380a3d42e54497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7rc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11
\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g64qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:02Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.023910 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e40a27aa-682e-4b25-a198-8054ba9f2477\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa233ac457a03e63e2662227f70bb88c59e3ebecf41454f217278fb34ab8dc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c066a5dc6399e5df82f7425020e84a5354658a5b77b1014a74126539aa1c738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-46vck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:03Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.069410 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90186d9d-0ac4-4959-9fd8-b044098dc6ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57fcf06043f46f40efa58819dcfaaaa37c0426f8
b80a9b322df4bdfb3ddbcddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bdd4ae057f0e89cf8aeff9645035e668883b7bb9ce4d6d083a3e64f6f7fc12a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T12:50:36Z\\\",\\\"message\\\":\\\"val\\\\nI1003 12:50:36.331444 6633 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 12:50:36.331482 6633 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1003 12:50:36.331508 6633 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1003 12:50:36.331546 6633 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1003 12:50:36.331576 6633 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 12:50:36.331603 6633 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1003 12:50:36.331630 6633 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 12:50:36.332846 6633 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 12:50:36.332881 6633 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1003 12:50:36.332887 6633 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1003 12:50:36.332916 6633 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1003 12:50:36.332927 6633 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 12:50:36.332923 6633 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1003 12:50:36.332968 6633 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 12:50:36.332969 6633 factory.go:656] Stopping watch factory\\\\nI1003 12:50:36.332975 6633 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57fcf06043f46f40efa58819dcfaaaa37c0426f8b80a9b322df4bdfb3ddbcddd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T12:51:02Z\\\",\\\"message\\\":\\\"089997 6951 lb_config.go:1031] Cluster endpoints for openshift-kube-storage-version-migrator-operator/metrics for network=default are: map[]\\\\nI1003 12:51:02.088389 6951 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}\\\\nI1003 12:51:02.090054 6951 services_controller.go:360] Finished syncing service api on namespace openshift-oauth-apiserver for network=default : 3.971777ms\\\\nI1003 12:51:02.090075 6951 services_controller.go:356] Processing sync for service openshift-network-diagnostics/network-check-target for network=default\\\\nI1003 12:51:02.089757 6951 ovnkube.go:599] Stopped ovnkube\\\\nI1003 12:51:02.089611 6951 services_controller.go:453] Built service openshift-kube-scheduler/scheduler template LB for network=default: []services.LB{}\\\\nI1003 12:51:02.090155 6951 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1003 12:51:02.090195 6951 services_controller.go:454] Service openshift-kube-scheduler/scheduler for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 
(cluster) 0 (per node) and 0 (template) load balancers\\\\nF1003 12:51:02.090220 6951 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"
initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ksp7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:03Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.070942 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.070973 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.070982 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.070995 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.071003 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:03Z","lastTransitionTime":"2025-10-03T12:51:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.102997 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kchhs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1198234-8682-43dc-9945-a826eba33888\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d869da7b2fe056fe8079fd55c78e1b02c5dbc137172a9c75514882a0a873714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6wvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c06137bf819289f2987b5ce188c51c2c5d98a376a0343e2eb121498221019bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6wvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kchhs\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:03Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.145480 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:03Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.173504 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.173536 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.173546 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.173561 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.173571 4962 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:03Z","lastTransitionTime":"2025-10-03T12:51:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.184109 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pnhsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edada57d-7295-4f56-a850-caf58ebe77a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f1a223717c0f2030da538233d9f7b621c4b2e304dcb65316b1aef7741f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctqjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pnhsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:03Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.226923 4962 util.go:30] "No sandbox for pod can be found. 
Oct 03 12:51:03 crc kubenswrapper[4962]: E1003 12:51:03.227043 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5blzz" podUID="f2989e38-d4e7-42c9-8959-f87168a4ac14"
Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.236984 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"]
Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.275846 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.275897 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.275906 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.275921 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.275931 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:03Z","lastTransitionTime":"2025-10-03T12:51:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.378344 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.378384 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.378395 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.378412 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.378425 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:03Z","lastTransitionTime":"2025-10-03T12:51:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.480358 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.480408 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.480422 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.480441 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.480454 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:03Z","lastTransitionTime":"2025-10-03T12:51:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.582574 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.582606 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.582614 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.582627 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.582649 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:03Z","lastTransitionTime":"2025-10-03T12:51:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.594297 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ksp7d_90186d9d-0ac4-4959-9fd8-b044098dc6ae/ovnkube-controller/3.log"
Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.599131 4962 scope.go:117] "RemoveContainer" containerID="57fcf06043f46f40efa58819dcfaaaa37c0426f8b80a9b322df4bdfb3ddbcddd"
Oct 03 12:51:03 crc kubenswrapper[4962]: E1003 12:51:03.599364 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-ksp7d_openshift-ovn-kubernetes(90186d9d-0ac4-4959-9fd8-b044098dc6ae)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae"
Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.613312 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a38f90-f921-463b-8484-156bdbff17d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ad2b9fc4c53924d160f9bcbf329977550ed3f9e5724a7930bc19b137f412208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0f0e7ddaee0852f8955a31ea974c460c5af2dae4ea15f6fa68d65fe9f0d63e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0f0e7ddaee0852f8955a31ea974c460c5af2dae4ea15f6fa68d65fe9f0d63e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:03Z is after 2025-08-24T17:21:41Z"
Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.627145 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:03Z is after 2025-08-24T17:21:41Z"
Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.638450 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pnhsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edada57d-7295-4f56-a850-caf58ebe77a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f1a223717c0f2030da538233d9f7b621c4b2e304dcb65316b1aef7741f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctqjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pnhsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:03Z is after 2025-08-24T17:21:41Z"
Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.649375 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g64qv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a46fb9dd-57b2-4300-b9ab-2d40bcc4cd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91103ceb967093cdbedc1aa6b2d68dd2c1cc96a1902cd0bb8f380a3d42e54497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7rc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g64qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:03Z is after 2025-08-24T17:21:41Z"
Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.659731 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e40a27aa-682e-4b25-a198-8054ba9f2477\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa233ac457a03e63e2662227f70bb88c59e3ebecf41454f217278fb34ab8dc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c066a5dc6399e5df82f7425020e84a5354658a5b77b1014a74126539aa1c738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-46vck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:03Z is after 2025-08-24T17:21:41Z"
Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.678880 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90186d9d-0ac4-4959-9fd8-b044098dc6ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57fcf06043f46f40efa58819dcfaaaa37c0426f8b80a9b322df4bdfb3ddbcddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57fcf06043f46f40efa58819dcfaaaa37c0426f8b80a9b322df4bdfb3ddbcddd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T12:51:02Z\\\",\\\"message\\\":\\\"089997 6951 lb_config.go:1031] Cluster endpoints for openshift-kube-storage-version-migrator-operator/metrics for network=default are: map[]\\\\nI1003 12:51:02.088389 6951 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}\\\\nI1003 12:51:02.090054 6951 services_controller.go:360] Finished syncing service api on namespace openshift-oauth-apiserver for network=default : 3.971777ms\\\\nI1003 12:51:02.090075 6951 services_controller.go:356] Processing sync for service openshift-network-diagnostics/network-check-target for network=default\\\\nI1003 12:51:02.089757 6951 ovnkube.go:599] Stopped ovnkube\\\\nI1003 12:51:02.089611 6951 services_controller.go:453] Built service openshift-kube-scheduler/scheduler template LB for network=default: []services.LB{}\\\\nI1003 12:51:02.090155 6951 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1003 12:51:02.090195 6951 services_controller.go:454] Service openshift-kube-scheduler/scheduler for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1003 12:51:02.090220 6951 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:51:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-ksp7d_openshift-ovn-kubernetes(90186d9d-0ac4-4959-9fd8-b044098dc6ae)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ksp7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:03Z is after 2025-08-24T17:21:41Z"
Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.685101 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.685130 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.685138 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.685151 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.685160 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:03Z","lastTransitionTime":"2025-10-03T12:51:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.690751 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kchhs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1198234-8682-43dc-9945-a826eba33888\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d869da7b2fe056fe8079fd55c78e1b02c5dbc137172a9c75514882a0a873714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6wvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c06137bf819289f2987b5ce188c51c2c5d98a376a0343e2eb121498221019bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6wvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kchhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:03Z is after 2025-08-24T17:21:41Z"
Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.703810 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82494e0a2ec6e8ba7a2e713a8faf9e20bbacde0875e31f6db9962ece1a1a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:03Z is after 2025-08-24T17:21:41Z"
Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.717219 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sdd6t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbc64268-3e78-44a2-8116-b62b5c13f005\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55381b2122f7d63231ef917dc3901e367b66dd7e75eb6fb7c3d049b81113c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://087c5eaff8dd2b9ccb73982ae2022d139077f3a93f750a8d7f7fe8bb4a8f643e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T12:51:01Z\\\",\\\"message\\\":\\\"2025-10-03T12:50:16+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b16755df-1fce-45b7-a7d8-55a972e98425\\\\n2025-10-03T12:50:16+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b16755df-1fce-45b7-a7d8-55a972e98425 to /host/opt/cni/bin/\\\\n2025-10-03T12:50:16Z [verbose] multus-daemon started\\\\n2025-10-03T12:50:16Z [verbose] Readiness Indicator file check\\\\n2025-10-03T12:51:01Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktptf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sdd6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:03Z is after 2025-08-24T17:21:41Z"
Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.726862 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5blzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2989e38-d4e7-42c9-8959-f87168a4ac14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrzmw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrzmw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:27Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5blzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:03Z is after 2025-08-24T17:21:41Z"
Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.738735 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53165c5e-33f0-44c9-b4fd-1da1c2b99a4e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86846072db8a357f62428197a22e8b8f829b013c9292a53091819ceb6eb7aeb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f016344ea9d266374fcbfbe80061313994a45b595ee3432a95b1bbd332b320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7249baacfeb01d1f25f6d9ce291033941226c4be4cba0376a47872b8b10f6ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6e876d0d7a29e1ceb505283a70c9b8fd0b4eee1dec1a2f04f3eeec4b09f4c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:03Z is after 2025-08-24T17:21:41Z"
Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.751345 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:03Z is after 2025-08-24T17:21:41Z"
Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.784646 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6c913a7a10a94baddfc4f58df9f5bc15cb9e7dd4b91dd313eca9f1a683259b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d69e23034588e75e4716b87b31f129f088940d29983f634a3d1b1e15289d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:03Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.787037 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.787059 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.787070 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.787081 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.787092 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:03Z","lastTransitionTime":"2025-10-03T12:51:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.795550 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8d669985493d735cad4d4a09f36401be9ede84ee7e20e82293e7d5331c085e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:03Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.806103 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:03Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.850716 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6263076-d49e-4519-b516-d50ec2fb9d88\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb41be48a3ebc6ba936c2ec3b2a483349e8d597d0483a618a0d452a2ee13538c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\
\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b1520abe4d95d8e776c2f589c8f7ada74c46eef43f1bcaf441272ae75241a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcb83ba4c1a1eb1a63fce31b68210e9af2c5e722097b457b4642b4c32f60c2e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d334e30066203e6856129958923dcb174b5541a9e41e11a4e574ac59ec86c861\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbdc43e3dcb77ff3ee44368fa74c53d2f360a508c04525f15879db5c6b2209b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cr
i-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:03Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.885972 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f337dda-821a-494b-98d6-775f0c1beb7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6ff72af1d85ecca71976d9d16b8ca0ee91d035662920a8c58d739e560c12ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8eb3048afe2307009fd3ccfbc314a1b60389c0ba17bad1b898578cb1b151df8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8877bc7bae269cbbb5a1452d42917b7a32af9bc45a303b25be8b1058ef937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://329c763b93db8ac9a4cd93231684840cdfa4c6f9d1318fcfc641f4e86b4fc39e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22239bfd4bc82de5ba41662ff973e8c1e0fa462857cae705d4940afccc550782\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T12:50:06Z\\\",\\\"message\\\":\\\"W1003 12:49:55.901194 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1003 12:49:55.901561 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759495795 cert, and key in /tmp/serving-cert-3691789359/serving-signer.crt, /tmp/serving-cert-3691789359/serving-signer.key\\\\nI1003 12:49:56.093911 1 observer_polling.go:159] Starting file observer\\\\nW1003 12:49:56.097720 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1003 12:49:56.097855 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 12:49:56.098660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3691789359/tls.crt::/tmp/serving-cert-3691789359/tls.key\\\\\\\"\\\\nF1003 12:50:06.505868 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dafe1e9291df9776980832fd7d43a1d5e4caf4cd6824740f9e1f2282a6c4bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:03Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.893368 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.893491 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.893534 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.893557 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.893577 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:03Z","lastTransitionTime":"2025-10-03T12:51:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.923715 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"348e88ca-6e7d-4ab2-b45d-553888f848c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9a16c565e235b0d93e373cc23d6d8ee7d889a7caec61ef9ff50c3e904cd3893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e84d29d6915b3a6b184d69f18945d3fd277719a6cc6d503b2899df586882a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://372bc3d3078a6ab39767510fe9090adb082ed3de331d851831b710df064bef5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e104d4f9cfb2e600dba4f462bdf9ee4aa5453afbb5174857fc61fc85d90f9642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e104d4f9cfb2e600dba4f462bdf9ee4aa5453afbb5174857fc61fc85d90f9642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:03Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.965032 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-44fmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fba23cccf497fd12c8f5ff3c93b757e30a8f5aab4ce0deca0f3ec0a49232d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\
\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-44fmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:03Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.995326 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.995368 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.995380 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.995394 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:03 crc kubenswrapper[4962]: I1003 12:51:03.995406 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:03Z","lastTransitionTime":"2025-10-03T12:51:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:04 crc kubenswrapper[4962]: I1003 12:51:04.097955 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:04 crc kubenswrapper[4962]: I1003 12:51:04.098002 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:04 crc kubenswrapper[4962]: I1003 12:51:04.098014 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:04 crc kubenswrapper[4962]: I1003 12:51:04.098035 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:04 crc kubenswrapper[4962]: I1003 12:51:04.098046 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:04Z","lastTransitionTime":"2025-10-03T12:51:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:04 crc kubenswrapper[4962]: I1003 12:51:04.201068 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:04 crc kubenswrapper[4962]: I1003 12:51:04.201109 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:04 crc kubenswrapper[4962]: I1003 12:51:04.201119 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:04 crc kubenswrapper[4962]: I1003 12:51:04.201138 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:04 crc kubenswrapper[4962]: I1003 12:51:04.201183 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:04Z","lastTransitionTime":"2025-10-03T12:51:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:04 crc kubenswrapper[4962]: I1003 12:51:04.226433 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:51:04 crc kubenswrapper[4962]: I1003 12:51:04.226499 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:51:04 crc kubenswrapper[4962]: I1003 12:51:04.226455 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:51:04 crc kubenswrapper[4962]: E1003 12:51:04.226591 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 12:51:04 crc kubenswrapper[4962]: E1003 12:51:04.226808 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 12:51:04 crc kubenswrapper[4962]: E1003 12:51:04.226854 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 12:51:04 crc kubenswrapper[4962]: I1003 12:51:04.303693 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:04 crc kubenswrapper[4962]: I1003 12:51:04.303731 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:04 crc kubenswrapper[4962]: I1003 12:51:04.303744 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:04 crc kubenswrapper[4962]: I1003 12:51:04.303795 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:04 crc kubenswrapper[4962]: I1003 12:51:04.303809 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:04Z","lastTransitionTime":"2025-10-03T12:51:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:04 crc kubenswrapper[4962]: I1003 12:51:04.406481 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:04 crc kubenswrapper[4962]: I1003 12:51:04.406541 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:04 crc kubenswrapper[4962]: I1003 12:51:04.406554 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:04 crc kubenswrapper[4962]: I1003 12:51:04.406574 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:04 crc kubenswrapper[4962]: I1003 12:51:04.406587 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:04Z","lastTransitionTime":"2025-10-03T12:51:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 03 12:51:04 crc kubenswrapper[4962]: I1003 12:51:04.509207 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:04 crc kubenswrapper[4962]: I1003 12:51:04.509248 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:04 crc kubenswrapper[4962]: I1003 12:51:04.509260 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:04 crc kubenswrapper[4962]: I1003 12:51:04.509275 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:04 crc kubenswrapper[4962]: I1003 12:51:04.509315 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:04Z","lastTransitionTime":"2025-10-03T12:51:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:04 crc kubenswrapper[4962]: I1003 12:51:04.611414 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:04 crc kubenswrapper[4962]: I1003 12:51:04.611465 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:04 crc kubenswrapper[4962]: I1003 12:51:04.611496 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:04 crc kubenswrapper[4962]: I1003 12:51:04.611512 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:04 crc kubenswrapper[4962]: I1003 12:51:04.611523 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:04Z","lastTransitionTime":"2025-10-03T12:51:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:04 crc kubenswrapper[4962]: I1003 12:51:04.713653 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:04 crc kubenswrapper[4962]: I1003 12:51:04.713688 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:04 crc kubenswrapper[4962]: I1003 12:51:04.713698 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:04 crc kubenswrapper[4962]: I1003 12:51:04.713712 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:04 crc kubenswrapper[4962]: I1003 12:51:04.713721 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:04Z","lastTransitionTime":"2025-10-03T12:51:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:04 crc kubenswrapper[4962]: I1003 12:51:04.815541 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:04 crc kubenswrapper[4962]: I1003 12:51:04.815577 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:04 crc kubenswrapper[4962]: I1003 12:51:04.815589 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:04 crc kubenswrapper[4962]: I1003 12:51:04.815606 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:04 crc kubenswrapper[4962]: I1003 12:51:04.815619 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:04Z","lastTransitionTime":"2025-10-03T12:51:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:04 crc kubenswrapper[4962]: I1003 12:51:04.918006 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:04 crc kubenswrapper[4962]: I1003 12:51:04.918061 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:04 crc kubenswrapper[4962]: I1003 12:51:04.918071 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:04 crc kubenswrapper[4962]: I1003 12:51:04.918089 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:04 crc kubenswrapper[4962]: I1003 12:51:04.918098 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:04Z","lastTransitionTime":"2025-10-03T12:51:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.020099 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.020151 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.020162 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.020178 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.020193 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:05Z","lastTransitionTime":"2025-10-03T12:51:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.122464 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.122504 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.122515 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.122529 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.122538 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:05Z","lastTransitionTime":"2025-10-03T12:51:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.225212 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.225253 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.225264 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.225282 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.225293 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:05Z","lastTransitionTime":"2025-10-03T12:51:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.226382 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5blzz"
Oct 03 12:51:05 crc kubenswrapper[4962]: E1003 12:51:05.226492 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5blzz" podUID="f2989e38-d4e7-42c9-8959-f87168a4ac14"
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.328029 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.328060 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.328067 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.328081 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.328090 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:05Z","lastTransitionTime":"2025-10-03T12:51:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.430442 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.430487 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.430502 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.430519 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.430528 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:05Z","lastTransitionTime":"2025-10-03T12:51:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.532880 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.532917 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.532925 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.532940 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.532954 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:05Z","lastTransitionTime":"2025-10-03T12:51:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.635295 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.635332 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.635346 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.635362 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.635372 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:05Z","lastTransitionTime":"2025-10-03T12:51:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.737761 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.737794 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.737802 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.737816 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.737828 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:05Z","lastTransitionTime":"2025-10-03T12:51:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.840857 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.840907 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.840916 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.840937 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.840948 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:05Z","lastTransitionTime":"2025-10-03T12:51:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.944007 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.944049 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.944059 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.944076 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:05 crc kubenswrapper[4962]: I1003 12:51:05.944087 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:05Z","lastTransitionTime":"2025-10-03T12:51:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.046523 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.046579 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.046593 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.046613 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.046625 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:06Z","lastTransitionTime":"2025-10-03T12:51:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.149002 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.149061 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.149071 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.149090 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.149103 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:06Z","lastTransitionTime":"2025-10-03T12:51:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.227020 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.227142 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 03 12:51:06 crc kubenswrapper[4962]: E1003 12:51:06.227213 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.227280 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 03 12:51:06 crc kubenswrapper[4962]: E1003 12:51:06.227368 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 03 12:51:06 crc kubenswrapper[4962]: E1003 12:51:06.227502 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.251967 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.252030 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.252043 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.252066 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.252081 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:06Z","lastTransitionTime":"2025-10-03T12:51:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.354298 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.354347 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.354359 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.354380 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.354397 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:06Z","lastTransitionTime":"2025-10-03T12:51:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.456878 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.456919 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.456929 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.456943 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.456953 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:06Z","lastTransitionTime":"2025-10-03T12:51:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.559328 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.559564 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.559689 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.559798 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.559878 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:06Z","lastTransitionTime":"2025-10-03T12:51:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.662522 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.662548 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.662556 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.662571 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.662592 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:06Z","lastTransitionTime":"2025-10-03T12:51:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.765476 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.765518 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.765526 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.765542 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.765554 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:06Z","lastTransitionTime":"2025-10-03T12:51:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.868178 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.868218 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.868227 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.868240 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.868249 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:06Z","lastTransitionTime":"2025-10-03T12:51:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.970730 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.970779 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.970792 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.970808 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:06 crc kubenswrapper[4962]: I1003 12:51:06.970820 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:06Z","lastTransitionTime":"2025-10-03T12:51:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.074163 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.074239 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.074263 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.074297 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.074318 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:07Z","lastTransitionTime":"2025-10-03T12:51:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.176933 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.176977 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.176989 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.177011 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.177026 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:07Z","lastTransitionTime":"2025-10-03T12:51:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.226997 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5blzz"
Oct 03 12:51:07 crc kubenswrapper[4962]: E1003 12:51:07.227287 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5blzz" podUID="f2989e38-d4e7-42c9-8959-f87168a4ac14"
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.279809 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.279885 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.279899 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.279922 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.279938 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:07Z","lastTransitionTime":"2025-10-03T12:51:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.382460 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.382513 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.382526 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.382543 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.382553 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:07Z","lastTransitionTime":"2025-10-03T12:51:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.484848 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.484897 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.484913 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.484930 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.484941 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:07Z","lastTransitionTime":"2025-10-03T12:51:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.587397 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.587441 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.587452 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.587473 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.587487 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:07Z","lastTransitionTime":"2025-10-03T12:51:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.690615 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.690697 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.690705 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.690722 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.690731 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:07Z","lastTransitionTime":"2025-10-03T12:51:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.793340 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.793394 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.793402 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.793416 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.793425 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:07Z","lastTransitionTime":"2025-10-03T12:51:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.895514 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.895553 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.895565 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.895580 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.895591 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:07Z","lastTransitionTime":"2025-10-03T12:51:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.997922 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.997953 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.997962 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.997974 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:07 crc kubenswrapper[4962]: I1003 12:51:07.997983 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:07Z","lastTransitionTime":"2025-10-03T12:51:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:08 crc kubenswrapper[4962]: I1003 12:51:08.100236 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:08 crc kubenswrapper[4962]: I1003 12:51:08.100275 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:08 crc kubenswrapper[4962]: I1003 12:51:08.100287 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:08 crc kubenswrapper[4962]: I1003 12:51:08.100302 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:08 crc kubenswrapper[4962]: I1003 12:51:08.100311 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:08Z","lastTransitionTime":"2025-10-03T12:51:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:08 crc kubenswrapper[4962]: I1003 12:51:08.202476 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:08 crc kubenswrapper[4962]: I1003 12:51:08.202533 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:08 crc kubenswrapper[4962]: I1003 12:51:08.202547 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:08 crc kubenswrapper[4962]: I1003 12:51:08.202563 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:08 crc kubenswrapper[4962]: I1003 12:51:08.202573 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:08Z","lastTransitionTime":"2025-10-03T12:51:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:08 crc kubenswrapper[4962]: I1003 12:51:08.226765 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 03 12:51:08 crc kubenswrapper[4962]: I1003 12:51:08.226794 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 03 12:51:08 crc kubenswrapper[4962]: E1003 12:51:08.226901 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 03 12:51:08 crc kubenswrapper[4962]: I1003 12:51:08.227137 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 03 12:51:08 crc kubenswrapper[4962]: E1003 12:51:08.227228 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 03 12:51:08 crc kubenswrapper[4962]: E1003 12:51:08.227324 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 03 12:51:08 crc kubenswrapper[4962]: I1003 12:51:08.305024 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:08 crc kubenswrapper[4962]: I1003 12:51:08.305104 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:08 crc kubenswrapper[4962]: I1003 12:51:08.305119 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:08 crc kubenswrapper[4962]: I1003 12:51:08.305147 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:08 crc kubenswrapper[4962]: I1003 12:51:08.305176 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:08Z","lastTransitionTime":"2025-10-03T12:51:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:08 crc kubenswrapper[4962]: I1003 12:51:08.408090 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:08 crc kubenswrapper[4962]: I1003 12:51:08.408165 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:08 crc kubenswrapper[4962]: I1003 12:51:08.408179 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:08 crc kubenswrapper[4962]: I1003 12:51:08.408208 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:08 crc kubenswrapper[4962]: I1003 12:51:08.408222 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:08Z","lastTransitionTime":"2025-10-03T12:51:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:08 crc kubenswrapper[4962]: I1003 12:51:08.510311 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:08 crc kubenswrapper[4962]: I1003 12:51:08.510346 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:08 crc kubenswrapper[4962]: I1003 12:51:08.510359 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:08 crc kubenswrapper[4962]: I1003 12:51:08.510375 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:08 crc kubenswrapper[4962]: I1003 12:51:08.510387 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:08Z","lastTransitionTime":"2025-10-03T12:51:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:08 crc kubenswrapper[4962]: I1003 12:51:08.612236 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:08 crc kubenswrapper[4962]: I1003 12:51:08.612282 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:08 crc kubenswrapper[4962]: I1003 12:51:08.612292 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:08 crc kubenswrapper[4962]: I1003 12:51:08.612308 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:08 crc kubenswrapper[4962]: I1003 12:51:08.612321 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:08Z","lastTransitionTime":"2025-10-03T12:51:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:08 crc kubenswrapper[4962]: I1003 12:51:08.716080 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:08 crc kubenswrapper[4962]: I1003 12:51:08.716132 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:08 crc kubenswrapper[4962]: I1003 12:51:08.716144 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:08 crc kubenswrapper[4962]: I1003 12:51:08.716163 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:08 crc kubenswrapper[4962]: I1003 12:51:08.716177 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:08Z","lastTransitionTime":"2025-10-03T12:51:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:08 crc kubenswrapper[4962]: I1003 12:51:08.818839 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:08 crc kubenswrapper[4962]: I1003 12:51:08.818872 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:08 crc kubenswrapper[4962]: I1003 12:51:08.818881 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:08 crc kubenswrapper[4962]: I1003 12:51:08.819083 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:08 crc kubenswrapper[4962]: I1003 12:51:08.819094 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:08Z","lastTransitionTime":"2025-10-03T12:51:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:08 crc kubenswrapper[4962]: I1003 12:51:08.921929 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:08 crc kubenswrapper[4962]: I1003 12:51:08.921980 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:08 crc kubenswrapper[4962]: I1003 12:51:08.921992 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:08 crc kubenswrapper[4962]: I1003 12:51:08.922012 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:08 crc kubenswrapper[4962]: I1003 12:51:08.922026 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:08Z","lastTransitionTime":"2025-10-03T12:51:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.024835 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.024872 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.024883 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.024897 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.024905 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:09Z","lastTransitionTime":"2025-10-03T12:51:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.127291 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.127329 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.127359 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.127373 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.127382 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:09Z","lastTransitionTime":"2025-10-03T12:51:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.226584 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5blzz"
Oct 03 12:51:09 crc kubenswrapper[4962]: E1003 12:51:09.226733 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5blzz" podUID="f2989e38-d4e7-42c9-8959-f87168a4ac14"
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.228870 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.228895 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.228903 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.228914 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.228923 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:09Z","lastTransitionTime":"2025-10-03T12:51:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.331372 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.331414 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.331423 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.331437 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.331447 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:09Z","lastTransitionTime":"2025-10-03T12:51:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.433590 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.433658 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.433674 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.433689 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.433700 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:09Z","lastTransitionTime":"2025-10-03T12:51:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.536276 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.536332 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.536345 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.536364 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.536378 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:09Z","lastTransitionTime":"2025-10-03T12:51:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.639798 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.639872 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.639895 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.639928 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.639950 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:09Z","lastTransitionTime":"2025-10-03T12:51:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.743142 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.743224 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.743249 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.743281 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.743303 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:09Z","lastTransitionTime":"2025-10-03T12:51:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.845687 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.845729 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.845739 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.845752 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.845761 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:09Z","lastTransitionTime":"2025-10-03T12:51:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.848460 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.848524 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.848534 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.848546 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.848554 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:09Z","lastTransitionTime":"2025-10-03T12:51:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:09 crc kubenswrapper[4962]: E1003 12:51:09.860812 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:51:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:51:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:51:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:51:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f831723-ac1f-49ed-8733-e30832d406d9\\\",\\\"systemUUID\\\":\\\"16e8121c-ac81-46e8-9c72-10e496aaa780\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:09Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.864612 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.864692 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.864706 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.864724 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.864736 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:09Z","lastTransitionTime":"2025-10-03T12:51:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:09 crc kubenswrapper[4962]: E1003 12:51:09.878696 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:51:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:51:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:51:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:51:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f831723-ac1f-49ed-8733-e30832d406d9\\\",\\\"systemUUID\\\":\\\"16e8121c-ac81-46e8-9c72-10e496aaa780\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:09Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.882861 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.882895 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.882907 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.882927 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.882940 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:09Z","lastTransitionTime":"2025-10-03T12:51:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:09 crc kubenswrapper[4962]: E1003 12:51:09.894656 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:51:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:51:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:51:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:51:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f831723-ac1f-49ed-8733-e30832d406d9\\\",\\\"systemUUID\\\":\\\"16e8121c-ac81-46e8-9c72-10e496aaa780\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:09Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.899396 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.899429 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.899439 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.899454 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.899466 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:09Z","lastTransitionTime":"2025-10-03T12:51:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:09 crc kubenswrapper[4962]: E1003 12:51:09.909489 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:51:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:51:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:51:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:51:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f831723-ac1f-49ed-8733-e30832d406d9\\\",\\\"systemUUID\\\":\\\"16e8121c-ac81-46e8-9c72-10e496aaa780\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:09Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.912403 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.912434 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.912442 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.912455 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.912464 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:09Z","lastTransitionTime":"2025-10-03T12:51:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:09 crc kubenswrapper[4962]: E1003 12:51:09.923933 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:51:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:51:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:51:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:51:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f831723-ac1f-49ed-8733-e30832d406d9\\\",\\\"systemUUID\\\":\\\"16e8121c-ac81-46e8-9c72-10e496aaa780\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:09Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:09 crc kubenswrapper[4962]: E1003 12:51:09.924144 4962 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.947970 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.948014 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.948023 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.948039 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:09 crc kubenswrapper[4962]: I1003 12:51:09.948050 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:09Z","lastTransitionTime":"2025-10-03T12:51:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.050257 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.050292 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.050300 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.050312 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.050322 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:10Z","lastTransitionTime":"2025-10-03T12:51:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.152072 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.152107 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.152115 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.152129 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.152138 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:10Z","lastTransitionTime":"2025-10-03T12:51:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.227010 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.227081 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.227010 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:51:10 crc kubenswrapper[4962]: E1003 12:51:10.227179 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 12:51:10 crc kubenswrapper[4962]: E1003 12:51:10.227305 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 12:51:10 crc kubenswrapper[4962]: E1003 12:51:10.227345 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.254665 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.254714 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.254725 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.254742 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.254756 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:10Z","lastTransitionTime":"2025-10-03T12:51:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.357238 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.357292 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.357305 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.357324 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.357335 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:10Z","lastTransitionTime":"2025-10-03T12:51:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.460256 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.460326 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.460344 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.460370 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.460388 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:10Z","lastTransitionTime":"2025-10-03T12:51:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.567553 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.567605 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.567618 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.567673 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.567686 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:10Z","lastTransitionTime":"2025-10-03T12:51:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.669914 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.669951 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.669959 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.669976 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.669985 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:10Z","lastTransitionTime":"2025-10-03T12:51:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.772770 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.772814 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.772826 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.772843 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.772855 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:10Z","lastTransitionTime":"2025-10-03T12:51:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.875795 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.875851 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.875867 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.875893 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.875915 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:10Z","lastTransitionTime":"2025-10-03T12:51:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.978861 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.978926 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.978941 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.978961 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:10 crc kubenswrapper[4962]: I1003 12:51:10.978974 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:10Z","lastTransitionTime":"2025-10-03T12:51:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:11 crc kubenswrapper[4962]: I1003 12:51:11.081300 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:11 crc kubenswrapper[4962]: I1003 12:51:11.081335 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:11 crc kubenswrapper[4962]: I1003 12:51:11.081343 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:11 crc kubenswrapper[4962]: I1003 12:51:11.081358 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:11 crc kubenswrapper[4962]: I1003 12:51:11.081367 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:11Z","lastTransitionTime":"2025-10-03T12:51:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:11 crc kubenswrapper[4962]: I1003 12:51:11.183939 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:11 crc kubenswrapper[4962]: I1003 12:51:11.183973 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:11 crc kubenswrapper[4962]: I1003 12:51:11.183982 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:11 crc kubenswrapper[4962]: I1003 12:51:11.183994 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:11 crc kubenswrapper[4962]: I1003 12:51:11.184003 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:11Z","lastTransitionTime":"2025-10-03T12:51:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:11 crc kubenswrapper[4962]: I1003 12:51:11.226481 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5blzz" Oct 03 12:51:11 crc kubenswrapper[4962]: E1003 12:51:11.226669 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5blzz" podUID="f2989e38-d4e7-42c9-8959-f87168a4ac14" Oct 03 12:51:11 crc kubenswrapper[4962]: I1003 12:51:11.285735 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:11 crc kubenswrapper[4962]: I1003 12:51:11.285779 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:11 crc kubenswrapper[4962]: I1003 12:51:11.285791 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:11 crc kubenswrapper[4962]: I1003 12:51:11.285806 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:11 crc kubenswrapper[4962]: I1003 12:51:11.285820 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:11Z","lastTransitionTime":"2025-10-03T12:51:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:11 crc kubenswrapper[4962]: I1003 12:51:11.388134 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:11 crc kubenswrapper[4962]: I1003 12:51:11.388171 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:11 crc kubenswrapper[4962]: I1003 12:51:11.388184 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:11 crc kubenswrapper[4962]: I1003 12:51:11.388202 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:11 crc kubenswrapper[4962]: I1003 12:51:11.388213 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:11Z","lastTransitionTime":"2025-10-03T12:51:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:11 crc kubenswrapper[4962]: I1003 12:51:11.489916 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:11 crc kubenswrapper[4962]: I1003 12:51:11.489965 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:11 crc kubenswrapper[4962]: I1003 12:51:11.489979 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:11 crc kubenswrapper[4962]: I1003 12:51:11.490001 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:11 crc kubenswrapper[4962]: I1003 12:51:11.490012 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:11Z","lastTransitionTime":"2025-10-03T12:51:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:11 crc kubenswrapper[4962]: I1003 12:51:11.592066 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:11 crc kubenswrapper[4962]: I1003 12:51:11.592118 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:11 crc kubenswrapper[4962]: I1003 12:51:11.592132 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:11 crc kubenswrapper[4962]: I1003 12:51:11.592150 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:11 crc kubenswrapper[4962]: I1003 12:51:11.592159 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:11Z","lastTransitionTime":"2025-10-03T12:51:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:11 crc kubenswrapper[4962]: I1003 12:51:11.694186 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:11 crc kubenswrapper[4962]: I1003 12:51:11.694230 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:11 crc kubenswrapper[4962]: I1003 12:51:11.694239 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:11 crc kubenswrapper[4962]: I1003 12:51:11.694254 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:11 crc kubenswrapper[4962]: I1003 12:51:11.694262 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:11Z","lastTransitionTime":"2025-10-03T12:51:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:11 crc kubenswrapper[4962]: I1003 12:51:11.796349 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:11 crc kubenswrapper[4962]: I1003 12:51:11.796380 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:11 crc kubenswrapper[4962]: I1003 12:51:11.796390 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:11 crc kubenswrapper[4962]: I1003 12:51:11.796403 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:11 crc kubenswrapper[4962]: I1003 12:51:11.796412 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:11Z","lastTransitionTime":"2025-10-03T12:51:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:11 crc kubenswrapper[4962]: I1003 12:51:11.898812 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:11 crc kubenswrapper[4962]: I1003 12:51:11.898852 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:11 crc kubenswrapper[4962]: I1003 12:51:11.898865 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:11 crc kubenswrapper[4962]: I1003 12:51:11.898881 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:11 crc kubenswrapper[4962]: I1003 12:51:11.898894 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:11Z","lastTransitionTime":"2025-10-03T12:51:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.001501 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.001537 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.001548 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.001563 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.001575 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:12Z","lastTransitionTime":"2025-10-03T12:51:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.103599 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.103649 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.103660 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.103675 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.103684 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:12Z","lastTransitionTime":"2025-10-03T12:51:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.205856 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.205903 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.205916 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.205933 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.205945 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:12Z","lastTransitionTime":"2025-10-03T12:51:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.226475 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:51:12 crc kubenswrapper[4962]: E1003 12:51:12.226599 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.226690 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.226734 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:51:12 crc kubenswrapper[4962]: E1003 12:51:12.226829 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 12:51:12 crc kubenswrapper[4962]: E1003 12:51:12.226950 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.239973 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kchhs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1198234-8682-43dc-9945-a826eba33888\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d869da7b2fe056fe8079fd55c78e1b02c5dbc137172a9c75514882a0a873714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6wvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c06137bf819289f2987b5ce188c51c2c5d98a376a0343e2eb121498221019bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099
482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6wvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kchhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:12Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.252488 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a38f90-f921-463b-8484-156bdbff17d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ad2b9fc4c53924d160f9bcbf329977550ed3f9e5724a7930bc19b137f412208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0f0e7ddaee0852f8955a31ea974c460c5af2dae4ea15f6fa68d65fe9f0d63e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0f0e7ddaee0852f8955a31ea974c460c5af2dae4ea15f6fa68d65fe9f0d63e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:12Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.264042 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:12Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.274344 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pnhsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edada57d-7295-4f56-a850-caf58ebe77a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f1a223717c0f2030da538233d9f7b621c4b2e304dcb65316b1aef7741f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctqjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pnhsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:12Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.284427 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g64qv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a46fb9dd-57b2-4300-b9ab-2d40bcc4cd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91103ceb967093cdbedc1aa6b2d68dd2c1cc96a1902cd0bb8f380a3d42e54497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7rc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g64qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:12Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.294487 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e40a27aa-682e-4b25-a198-8054ba9f2477\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa233ac457a03e63e2662227f70bb88c59e3ebecf41454f217278fb34ab8dc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c066a5dc6399e5df82f7425020e84a5354658a5b77b1014a74126539aa1c738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-46vck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:12Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.309816 4962 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.309859 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.309869 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.309886 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.309895 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:12Z","lastTransitionTime":"2025-10-03T12:51:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.317838 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90186d9d-0ac4-4959-9fd8-b044098dc6ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57fcf06043f46f40efa58819dcfaaaa37c0426f8
b80a9b322df4bdfb3ddbcddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57fcf06043f46f40efa58819dcfaaaa37c0426f8b80a9b322df4bdfb3ddbcddd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T12:51:02Z\\\",\\\"message\\\":\\\"089997 6951 lb_config.go:1031] Cluster endpoints for openshift-kube-storage-version-migrator-operator/metrics for network=default are: map[]\\\\nI1003 12:51:02.088389 6951 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}\\\\nI1003 12:51:02.090054 6951 services_controller.go:360] Finished syncing service api on namespace openshift-oauth-apiserver for network=default : 3.971777ms\\\\nI1003 12:51:02.090075 6951 services_controller.go:356] Processing sync for service openshift-network-diagnostics/network-check-target for network=default\\\\nI1003 12:51:02.089757 6951 ovnkube.go:599] Stopped ovnkube\\\\nI1003 12:51:02.089611 6951 services_controller.go:453] Built service openshift-kube-scheduler/scheduler template LB for network=default: []services.LB{}\\\\nI1003 12:51:02.090155 6951 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1003 12:51:02.090195 6951 services_controller.go:454] Service openshift-kube-scheduler/scheduler for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1003 12:51:02.090220 6951 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:51:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ksp7d_openshift-ovn-kubernetes(90186d9d-0ac4-4959-9fd8-b044098dc6ae)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ksp7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:12Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.329996 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82494e0a2ec6e8ba7a2e713a8faf9e20bbacde0875e31f6db9962ece1a1a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:12Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.342668 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sdd6t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbc64268-3e78-44a2-8116-b62b5c13f005\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55381b2122f7d63231ef917dc3901e367b66dd7e75eb6fb7c3d049b81113c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://087c5eaff8dd2b9ccb73982ae2022d139077f3a93f750a8d7f7fe8bb4a8f643e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T12:51:01Z\\\",\\\"message\\\":\\\"2025-10-03T12:50:16+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b16755df-1fce-45b7-a7d8-55a972e98425\\\\n2025-10-03T12:50:16+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b16755df-1fce-45b7-a7d8-55a972e98425 to /host/opt/cni/bin/\\\\n2025-10-03T12:50:16Z [verbose] multus-daemon started\\\\n2025-10-03T12:50:16Z [verbose] Readiness Indicator file check\\\\n2025-10-03T12:51:01Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktptf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sdd6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:12Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.354670 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5blzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2989e38-d4e7-42c9-8959-f87168a4ac14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrzmw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrzmw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:27Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5blzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:12Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.365838 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53165c5e-33f0-44c9-b4fd-1da1c2b99a4e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86846072db8a357f62428197a22e8b8f829b013c9292a53091819ceb6eb7aeb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f016344ea9d266374fcbfbe80061313994a45b595ee3432a95b1bbd332b320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7249baacfeb01d1f25f6d9ce291033941226c4be4cba0376a47872b8b10f6ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6e876d0d7a29e1ceb505283a70c9b8fd0b4eee1dec1a2f04f3eeec4b09f4c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:12Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.377094 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:12Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.390539 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6c913a7a10a94baddfc4f58df9f5bc15cb9e7dd4b91dd313eca9f1a683259b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d69e23034588e75e4716b87b31f129f088940d29983f634a3d1b1e15289d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:12Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.401718 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8d669985493d735cad4d4a09f36401be9ede84ee7e20e82293e7d5331c085e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:12Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.412239 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:12Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.412686 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.412786 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.412889 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.413106 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.413300 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:12Z","lastTransitionTime":"2025-10-03T12:51:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.431049 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6263076-d49e-4519-b516-d50ec2fb9d88\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb41be48a3ebc6ba936c2ec3b2a483349e8d597d0483a618a0d452a2ee13538c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b1520abe4d95d8e776c2f589c8f7ada74c46eef43f1bcaf441272ae75241a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcb83ba4c1a1eb1a63fce31b68210e9af2c5e722097b457b4642b4c32f60c2e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d334e30066203e6856129958923dcb174b5541a9e41e11a4e574ac59ec86c861\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbdc43e3dcb77ff3ee44368fa74c53d2f360a508c04525f15879db5c6b2209b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:12Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.444609 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f337dda-821a-494b-98d6-775f0c1beb7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6ff72af1d85ecca71976d9d16b8ca0ee91d035662920a8c58d739e560c12ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8eb3048afe2307009fd3ccfbc314a1b60389c0ba17bad1b898578cb1b151df8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8877bc7bae269cbbb5a1452d42917b7a32af9bc45a303b25be8b1058ef937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://329c763b93db8ac9a4cd93231684840cdfa4c6f9d1318fcfc641f4e86b4fc39e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22239bfd4bc82de5ba41662ff973e8c1e0fa462857cae705d4940afccc550782\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T12:50:06Z\\\",\\\"message\\\":\\\"W1003 12:49:55.901194 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1003 12:49:55.901561 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759495795 cert, and key in /tmp/serving-cert-3691789359/serving-signer.crt, /tmp/serving-cert-3691789359/serving-signer.key\\\\nI1003 12:49:56.093911 1 observer_polling.go:159] Starting file observer\\\\nW1003 12:49:56.097720 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1003 12:49:56.097855 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 12:49:56.098660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3691789359/tls.crt::/tmp/serving-cert-3691789359/tls.key\\\\\\\"\\\\nF1003 12:50:06.505868 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dafe1e9291df9776980832fd7d43a1d5e4caf4cd6824740f9e1f2282a6c4bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:12Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.456178 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"348e88ca-6e7d-4ab2-b45d-553888f848c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9a16c565e235b0d93e373cc23d6d8ee7d889a7caec61ef9ff50c3e904cd3893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e84d29d6915b3a6b184d69f18945d3fd277719a6cc6d503b2899df586882a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\
\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://372bc3d3078a6ab39767510fe9090adb082ed3de331d851831b710df064bef5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e104d4f9cfb2e600dba4f462bdf9ee4aa5453afbb5174857fc61fc85d90f9642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e104d4f9cfb2e600dba4f462bdf9ee4aa5453afbb5174857fc61fc85d90f9642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:12Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.471065 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-44fmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fba23cccf497fd12c8f5ff3c93b757e30a8f5aab4ce0deca0f3ec0a49232d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-44fmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:12Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.515706 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.515749 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:12 crc 
kubenswrapper[4962]: I1003 12:51:12.515762 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.515776 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.515785 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:12Z","lastTransitionTime":"2025-10-03T12:51:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.618219 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.618262 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.618274 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.618291 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.618308 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:12Z","lastTransitionTime":"2025-10-03T12:51:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.720130 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.720178 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.720195 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.720213 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.720224 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:12Z","lastTransitionTime":"2025-10-03T12:51:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.823870 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.823904 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.823914 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.823929 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.823940 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:12Z","lastTransitionTime":"2025-10-03T12:51:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.926202 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.926511 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.926537 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.926564 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:12 crc kubenswrapper[4962]: I1003 12:51:12.926582 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:12Z","lastTransitionTime":"2025-10-03T12:51:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.028540 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.028578 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.028588 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.028601 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.028610 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:13Z","lastTransitionTime":"2025-10-03T12:51:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.131312 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.131354 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.131378 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.131397 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.131410 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:13Z","lastTransitionTime":"2025-10-03T12:51:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.226522 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5blzz" Oct 03 12:51:13 crc kubenswrapper[4962]: E1003 12:51:13.226673 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5blzz" podUID="f2989e38-d4e7-42c9-8959-f87168a4ac14" Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.234518 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.234609 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.234628 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.234699 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.234720 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:13Z","lastTransitionTime":"2025-10-03T12:51:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.337986 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.338228 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.338291 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.338385 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.338452 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:13Z","lastTransitionTime":"2025-10-03T12:51:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.441283 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.441598 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.441741 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.441840 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.441928 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:13Z","lastTransitionTime":"2025-10-03T12:51:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.544769 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.545016 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.545123 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.545409 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.545504 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:13Z","lastTransitionTime":"2025-10-03T12:51:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.648260 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.648528 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.648618 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.648735 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.648810 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:13Z","lastTransitionTime":"2025-10-03T12:51:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.751217 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.751276 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.751290 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.751340 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.751355 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:13Z","lastTransitionTime":"2025-10-03T12:51:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.853113 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.853154 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.853165 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.853184 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.853196 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:13Z","lastTransitionTime":"2025-10-03T12:51:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.955975 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.956028 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.956055 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.956068 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:13 crc kubenswrapper[4962]: I1003 12:51:13.956076 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:13Z","lastTransitionTime":"2025-10-03T12:51:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.058558 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.058595 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.058607 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.058623 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.058665 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:14Z","lastTransitionTime":"2025-10-03T12:51:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.161805 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.161857 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.161870 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.161888 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.161900 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:14Z","lastTransitionTime":"2025-10-03T12:51:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.226596 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:51:14 crc kubenswrapper[4962]: E1003 12:51:14.227007 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.226742 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:51:14 crc kubenswrapper[4962]: E1003 12:51:14.227194 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.226682 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:51:14 crc kubenswrapper[4962]: E1003 12:51:14.227386 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.264085 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.264143 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.264161 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.264183 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.264200 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:14Z","lastTransitionTime":"2025-10-03T12:51:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.366668 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.366725 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.366742 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.366765 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.366785 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:14Z","lastTransitionTime":"2025-10-03T12:51:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.469769 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.469809 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.469820 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.469835 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.469845 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:14Z","lastTransitionTime":"2025-10-03T12:51:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.572458 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.572501 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.572509 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.572522 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.572530 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:14Z","lastTransitionTime":"2025-10-03T12:51:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.674656 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.674695 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.674705 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.674720 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.674731 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:14Z","lastTransitionTime":"2025-10-03T12:51:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.777065 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.777112 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.777123 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.777141 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.777151 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:14Z","lastTransitionTime":"2025-10-03T12:51:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.879335 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.879385 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.879400 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.879421 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.879438 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:14Z","lastTransitionTime":"2025-10-03T12:51:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.981622 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.981687 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.981695 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.981710 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:14 crc kubenswrapper[4962]: I1003 12:51:14.981720 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:14Z","lastTransitionTime":"2025-10-03T12:51:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.084083 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.084127 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.084139 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.084153 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.084162 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:15Z","lastTransitionTime":"2025-10-03T12:51:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.186810 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.186848 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.186856 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.186868 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.186878 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:15Z","lastTransitionTime":"2025-10-03T12:51:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.226448 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5blzz" Oct 03 12:51:15 crc kubenswrapper[4962]: E1003 12:51:15.226578 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5blzz" podUID="f2989e38-d4e7-42c9-8959-f87168a4ac14" Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.289699 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.289767 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.289782 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.289801 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.289815 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:15Z","lastTransitionTime":"2025-10-03T12:51:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.392597 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.392678 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.392693 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.392712 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.392726 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:15Z","lastTransitionTime":"2025-10-03T12:51:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.495321 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.495369 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.495382 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.495400 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.495414 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:15Z","lastTransitionTime":"2025-10-03T12:51:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.597854 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.597928 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.597945 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.597975 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.597991 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:15Z","lastTransitionTime":"2025-10-03T12:51:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.701422 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.701470 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.701487 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.701508 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.701526 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:15Z","lastTransitionTime":"2025-10-03T12:51:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.805541 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.805629 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.805672 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.805694 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.805710 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:15Z","lastTransitionTime":"2025-10-03T12:51:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.869837 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:15 crc kubenswrapper[4962]: E1003 12:51:15.870067 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:52:19.870028861 +0000 UTC m=+148.273926696 (durationBeforeRetry 1m4s). 
Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.909183 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.909261 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.909279 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.909314 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.909338 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:15Z","lastTransitionTime":"2025-10-03T12:51:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.971046 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.971086 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.971103 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 03 12:51:15 crc kubenswrapper[4962]: I1003 12:51:15.971133 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 03 12:51:15 crc kubenswrapper[4962]: E1003 12:51:15.971231 4962 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
"openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 12:51:15 crc kubenswrapper[4962]: E1003 12:51:15.971274 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 12:52:19.971261549 +0000 UTC m=+148.375159384 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 12:51:15 crc kubenswrapper[4962]: E1003 12:51:15.971408 4962 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 12:51:15 crc kubenswrapper[4962]: E1003 12:51:15.971465 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 12:51:15 crc kubenswrapper[4962]: E1003 12:51:15.971556 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 12:51:15 crc kubenswrapper[4962]: E1003 12:51:15.971586 4962 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 12:51:15 crc kubenswrapper[4962]: E1003 12:51:15.971583 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 12:52:19.971535827 +0000 UTC m=+148.375433742 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 12:51:15 crc kubenswrapper[4962]: E1003 12:51:15.971763 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 12:52:19.971723482 +0000 UTC m=+148.375621327 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 12:51:15 crc kubenswrapper[4962]: E1003 12:51:15.971478 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 12:51:15 crc kubenswrapper[4962]: E1003 12:51:15.971816 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 12:51:15 crc kubenswrapper[4962]: E1003 12:51:15.971834 4962 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 12:51:15 crc kubenswrapper[4962]: E1003 12:51:15.971882 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 12:52:19.971869626 +0000 UTC m=+148.375767691 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.012649 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.012695 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.012706 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.012720 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.012886 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:16Z","lastTransitionTime":"2025-10-03T12:51:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.116012 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.116061 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.116078 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.116100 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.116116 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:16Z","lastTransitionTime":"2025-10-03T12:51:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.219189 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.219239 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.219251 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.219272 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.219285 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:16Z","lastTransitionTime":"2025-10-03T12:51:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.226953 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.226984 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.227017 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:51:16 crc kubenswrapper[4962]: E1003 12:51:16.227217 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 12:51:16 crc kubenswrapper[4962]: E1003 12:51:16.227385 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 12:51:16 crc kubenswrapper[4962]: E1003 12:51:16.227776 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.228503 4962 scope.go:117] "RemoveContainer" containerID="57fcf06043f46f40efa58819dcfaaaa37c0426f8b80a9b322df4bdfb3ddbcddd" Oct 03 12:51:16 crc kubenswrapper[4962]: E1003 12:51:16.228725 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-ksp7d_openshift-ovn-kubernetes(90186d9d-0ac4-4959-9fd8-b044098dc6ae)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.322324 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.322374 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.322391 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.322414 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.322431 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:16Z","lastTransitionTime":"2025-10-03T12:51:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.424961 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.425006 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.425022 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.425046 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.425061 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:16Z","lastTransitionTime":"2025-10-03T12:51:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.528587 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.528625 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.528658 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.528676 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.528690 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:16Z","lastTransitionTime":"2025-10-03T12:51:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.631809 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.631866 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.631877 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.631908 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.631919 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:16Z","lastTransitionTime":"2025-10-03T12:51:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.734270 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.734333 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.734344 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.734360 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.734371 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:16Z","lastTransitionTime":"2025-10-03T12:51:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.838619 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.838753 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.838774 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.838803 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.838822 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:16Z","lastTransitionTime":"2025-10-03T12:51:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.941562 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.941622 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.941631 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.941665 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:16 crc kubenswrapper[4962]: I1003 12:51:16.941676 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:16Z","lastTransitionTime":"2025-10-03T12:51:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.044485 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.044526 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.044537 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.044552 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.044561 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:17Z","lastTransitionTime":"2025-10-03T12:51:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.147462 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.147577 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.147588 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.147606 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.147617 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:17Z","lastTransitionTime":"2025-10-03T12:51:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.226747 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5blzz" Oct 03 12:51:17 crc kubenswrapper[4962]: E1003 12:51:17.226988 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5blzz" podUID="f2989e38-d4e7-42c9-8959-f87168a4ac14" Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.249749 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.249833 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.249849 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.249864 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.249881 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:17Z","lastTransitionTime":"2025-10-03T12:51:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.352628 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.352689 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.352700 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.352717 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.352728 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:17Z","lastTransitionTime":"2025-10-03T12:51:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.454732 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.454780 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.454791 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.454807 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.454818 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:17Z","lastTransitionTime":"2025-10-03T12:51:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.557495 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.557527 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.557560 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.557575 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.557585 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:17Z","lastTransitionTime":"2025-10-03T12:51:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.659403 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.659445 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.659456 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.659472 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.659482 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:17Z","lastTransitionTime":"2025-10-03T12:51:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.761562 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.761931 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.761941 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.761956 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.761965 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:17Z","lastTransitionTime":"2025-10-03T12:51:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.864096 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.864125 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.864133 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.864148 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.864156 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:17Z","lastTransitionTime":"2025-10-03T12:51:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.966893 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.966923 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.966932 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.966945 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:17 crc kubenswrapper[4962]: I1003 12:51:17.966953 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:17Z","lastTransitionTime":"2025-10-03T12:51:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.070160 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.070234 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.070260 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.070289 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.070317 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:18Z","lastTransitionTime":"2025-10-03T12:51:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.172788 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.172826 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.172833 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.172845 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.172854 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:18Z","lastTransitionTime":"2025-10-03T12:51:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.226579 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.226579 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.226604 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:51:18 crc kubenswrapper[4962]: E1003 12:51:18.226864 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 12:51:18 crc kubenswrapper[4962]: E1003 12:51:18.226983 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 12:51:18 crc kubenswrapper[4962]: E1003 12:51:18.227151 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.275502 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.275560 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.275575 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.275593 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.275603 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:18Z","lastTransitionTime":"2025-10-03T12:51:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.378698 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.378764 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.378781 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.378797 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.378809 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:18Z","lastTransitionTime":"2025-10-03T12:51:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.481253 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.481299 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.481310 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.481326 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.481338 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:18Z","lastTransitionTime":"2025-10-03T12:51:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.584220 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.584258 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.584268 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.584281 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.584289 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:18Z","lastTransitionTime":"2025-10-03T12:51:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.686871 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.686906 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.686914 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.686926 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.686934 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:18Z","lastTransitionTime":"2025-10-03T12:51:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.789300 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.789334 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.789612 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.792198 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.792217 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:18Z","lastTransitionTime":"2025-10-03T12:51:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.894094 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.894140 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.894156 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.894171 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.894181 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:18Z","lastTransitionTime":"2025-10-03T12:51:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.996401 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.996442 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.996458 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.996474 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:18 crc kubenswrapper[4962]: I1003 12:51:18.996484 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:18Z","lastTransitionTime":"2025-10-03T12:51:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.098802 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.098850 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.098860 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.098879 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.098891 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:19Z","lastTransitionTime":"2025-10-03T12:51:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.200443 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.200473 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.200482 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.200495 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.200504 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:19Z","lastTransitionTime":"2025-10-03T12:51:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.226319 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5blzz" Oct 03 12:51:19 crc kubenswrapper[4962]: E1003 12:51:19.226500 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5blzz" podUID="f2989e38-d4e7-42c9-8959-f87168a4ac14" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.303060 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.303120 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.303131 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.303148 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.303159 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:19Z","lastTransitionTime":"2025-10-03T12:51:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.405790 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.405837 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.405848 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.405869 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.405881 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:19Z","lastTransitionTime":"2025-10-03T12:51:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.508529 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.508589 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.508607 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.508674 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.508692 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:19Z","lastTransitionTime":"2025-10-03T12:51:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.610754 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.610790 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.610800 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.610814 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.610824 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:19Z","lastTransitionTime":"2025-10-03T12:51:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.712943 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.712997 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.713007 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.713021 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.713032 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:19Z","lastTransitionTime":"2025-10-03T12:51:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.814914 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.814952 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.814962 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.814979 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.814993 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:19Z","lastTransitionTime":"2025-10-03T12:51:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.916959 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.916990 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.916999 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.917012 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.917022 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:19Z","lastTransitionTime":"2025-10-03T12:51:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.931197 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.931233 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.931244 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.931262 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.931273 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:19Z","lastTransitionTime":"2025-10-03T12:51:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:19 crc kubenswrapper[4962]: E1003 12:51:19.950323 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:51:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:51:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:51:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:51:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f831723-ac1f-49ed-8733-e30832d406d9\\\",\\\"systemUUID\\\":\\\"16e8121c-ac81-46e8-9c72-10e496aaa780\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:19Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.958593 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.958622 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.958646 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.958662 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.958671 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:19Z","lastTransitionTime":"2025-10-03T12:51:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:19 crc kubenswrapper[4962]: E1003 12:51:19.974942 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:51:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:51:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:51:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:51:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f831723-ac1f-49ed-8733-e30832d406d9\\\",\\\"systemUUID\\\":\\\"16e8121c-ac81-46e8-9c72-10e496aaa780\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:19Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.978412 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.978460 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.978469 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.978482 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.978491 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:19Z","lastTransitionTime":"2025-10-03T12:51:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:19 crc kubenswrapper[4962]: E1003 12:51:19.990845 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:51:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:51:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:51:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:51:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f831723-ac1f-49ed-8733-e30832d406d9\\\",\\\"systemUUID\\\":\\\"16e8121c-ac81-46e8-9c72-10e496aaa780\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:19Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.994300 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.994330 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.994340 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.994374 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:19 crc kubenswrapper[4962]: I1003 12:51:19.994386 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:19Z","lastTransitionTime":"2025-10-03T12:51:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:20 crc kubenswrapper[4962]: E1003 12:51:20.004969 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:51:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:51:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:51:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:51:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f831723-ac1f-49ed-8733-e30832d406d9\\\",\\\"systemUUID\\\":\\\"16e8121c-ac81-46e8-9c72-10e496aaa780\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:20Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.008511 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.008543 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.008551 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.008565 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.008576 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:20Z","lastTransitionTime":"2025-10-03T12:51:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:20 crc kubenswrapper[4962]: E1003 12:51:20.018442 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:51:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:51:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:51:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T12:51:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f831723-ac1f-49ed-8733-e30832d406d9\\\",\\\"systemUUID\\\":\\\"16e8121c-ac81-46e8-9c72-10e496aaa780\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:20Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:20 crc kubenswrapper[4962]: E1003 12:51:20.018583 4962 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.019777 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.019805 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.019812 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.019822 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.019833 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:20Z","lastTransitionTime":"2025-10-03T12:51:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.121653 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.121690 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.121701 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.121717 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.121728 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:20Z","lastTransitionTime":"2025-10-03T12:51:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.223672 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.223718 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.223726 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.223742 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.223751 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:20Z","lastTransitionTime":"2025-10-03T12:51:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.226899 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.226981 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.227026 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:51:20 crc kubenswrapper[4962]: E1003 12:51:20.226999 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 12:51:20 crc kubenswrapper[4962]: E1003 12:51:20.227170 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 12:51:20 crc kubenswrapper[4962]: E1003 12:51:20.227288 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.325630 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.325694 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.325705 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.325719 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.325730 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:20Z","lastTransitionTime":"2025-10-03T12:51:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.428273 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.428306 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.428319 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.428333 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.428345 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:20Z","lastTransitionTime":"2025-10-03T12:51:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.531284 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.531376 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.531393 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.531419 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.531437 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:20Z","lastTransitionTime":"2025-10-03T12:51:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.634579 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.634619 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.634627 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.634689 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.634702 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:20Z","lastTransitionTime":"2025-10-03T12:51:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.738131 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.738435 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.738570 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.738800 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.739045 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:20Z","lastTransitionTime":"2025-10-03T12:51:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.842274 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.842330 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.842352 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.842379 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.842399 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:20Z","lastTransitionTime":"2025-10-03T12:51:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.945621 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.945695 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.945710 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.945727 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:20 crc kubenswrapper[4962]: I1003 12:51:20.945739 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:20Z","lastTransitionTime":"2025-10-03T12:51:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.048025 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.048066 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.048077 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.048092 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.048103 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:21Z","lastTransitionTime":"2025-10-03T12:51:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.150964 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.151004 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.151018 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.151035 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.151047 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:21Z","lastTransitionTime":"2025-10-03T12:51:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.227083 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5blzz" Oct 03 12:51:21 crc kubenswrapper[4962]: E1003 12:51:21.227277 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5blzz" podUID="f2989e38-d4e7-42c9-8959-f87168a4ac14" Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.253989 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.254048 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.254058 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.254074 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.254085 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:21Z","lastTransitionTime":"2025-10-03T12:51:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.357230 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.357297 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.357315 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.357341 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.357357 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:21Z","lastTransitionTime":"2025-10-03T12:51:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.460469 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.460552 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.460577 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.460602 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.460621 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:21Z","lastTransitionTime":"2025-10-03T12:51:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.562529 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.562568 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.562577 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.562592 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.562601 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:21Z","lastTransitionTime":"2025-10-03T12:51:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.664077 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.664115 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.664125 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.664141 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.664152 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:21Z","lastTransitionTime":"2025-10-03T12:51:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.767222 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.767288 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.767302 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.767321 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.767334 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:21Z","lastTransitionTime":"2025-10-03T12:51:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.869329 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.869386 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.869397 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.869412 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.869421 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:21Z","lastTransitionTime":"2025-10-03T12:51:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.972053 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.972096 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.972130 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.972153 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:21 crc kubenswrapper[4962]: I1003 12:51:21.972164 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:21Z","lastTransitionTime":"2025-10-03T12:51:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:22 crc kubenswrapper[4962]: I1003 12:51:22.074722 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:22 crc kubenswrapper[4962]: I1003 12:51:22.074757 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:22 crc kubenswrapper[4962]: I1003 12:51:22.074775 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:22 crc kubenswrapper[4962]: I1003 12:51:22.074791 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:22 crc kubenswrapper[4962]: I1003 12:51:22.074801 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:22Z","lastTransitionTime":"2025-10-03T12:51:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:22 crc kubenswrapper[4962]: I1003 12:51:22.177822 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:22 crc kubenswrapper[4962]: I1003 12:51:22.177875 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:22 crc kubenswrapper[4962]: I1003 12:51:22.177886 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:22 crc kubenswrapper[4962]: I1003 12:51:22.177904 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:22 crc kubenswrapper[4962]: I1003 12:51:22.177915 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:22Z","lastTransitionTime":"2025-10-03T12:51:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:22 crc kubenswrapper[4962]: I1003 12:51:22.227099 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:51:22 crc kubenswrapper[4962]: I1003 12:51:22.227106 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:51:22 crc kubenswrapper[4962]: I1003 12:51:22.227125 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:51:22 crc kubenswrapper[4962]: E1003 12:51:22.227246 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 12:51:22 crc kubenswrapper[4962]: E1003 12:51:22.227342 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 12:51:22 crc kubenswrapper[4962]: E1003 12:51:22.227688 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 12:51:22 crc kubenswrapper[4962]: I1003 12:51:22.238766 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:22Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:22 crc kubenswrapper[4962]: I1003 12:51:22.249003 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53165c5e-33f0-44c9-b4fd-1da1c2b99a4e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86846072db8a357f62428197a22e8b8f829b013c9292a53091819ceb6eb7aeb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f016344ea9d266374fcbfbe80061313994a45b595ee3432a95b1bbd332b320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7249baacfeb01d1f25f6d9ce291033941226c4be4cba0376a47872b8b10f6ae6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6e876d0d7a29e1ceb505283a70c9b8fd0b4eee1dec1a2f04f3eeec4b09f4c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:22Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:22 crc kubenswrapper[4962]: I1003 12:51:22.258916 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:22Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:22 crc kubenswrapper[4962]: I1003 12:51:22.269282 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6c913a7a10a94baddfc4f58df9f5bc15cb9e7dd4b91dd313eca9f1a683259b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d69e23034588e75e4716b87b31f129f088940d29983f634a3d1b1e15289d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:22Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:22 crc kubenswrapper[4962]: I1003 12:51:22.279028 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb8d669985493d735cad4d4a09f36401be9ede84ee7e20e82293e7d5331c085e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:22Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:22 crc kubenswrapper[4962]: I1003 12:51:22.280112 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:22 crc kubenswrapper[4962]: I1003 12:51:22.280179 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:22 crc kubenswrapper[4962]: I1003 12:51:22.280196 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:22 crc kubenswrapper[4962]: I1003 12:51:22.280219 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:22 crc kubenswrapper[4962]: I1003 12:51:22.280235 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:22Z","lastTransitionTime":"2025-10-03T12:51:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:22 crc kubenswrapper[4962]: I1003 12:51:22.296141 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6263076-d49e-4519-b516-d50ec2fb9d88\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb41be48a3ebc6ba936c2ec3b2a483349e8d597d0483a618a0d452a2ee13538c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b1520abe4d95d8e776c2f589c8f7ada74c46eef43f1bcaf441272ae75241a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcb83ba4c1a1eb1a63fce31b68210e9af2c5e722097b457b4642b4c32f60c2e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d334e30066203e6856129958923dcb174b5541a9e41e11a4e574ac59ec86c861\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbdc43e3dcb77ff3ee44368fa74c53d2f360a508c04525f15879db5c6b2209b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76bfcba6f23fa3f07cf6f0e1e0ea41d281121ed2a09133dfbb13cd2a2e84016e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeadd063bcd1e3c3036cb93df691713543384425f4cdfa7716593dd99ad115f9\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06a03a377d8c2109c3ac03774f7236ac15e099d69ca9530e2c6b853fd28a1114\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:22Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:22 crc kubenswrapper[4962]: I1003 12:51:22.309750 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f337dda-821a-494b-98d6-775f0c1beb7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6ff72af1d85ecca71976d9d16b8ca0ee91d035662920a8c58d739e560c12ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8eb3048afe2307009fd3ccfbc314a1b60389c0ba17bad1b898578cb1b151df8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f8877bc7bae269cbbb5a1452d42917b7a32af9bc45a303b25be8b1058ef937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://329c763b93db8ac9a4cd93231684840cdfa4c6f9d1318fcfc641f4e86b4fc39e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22239bfd4bc82de5ba41662ff973e8c1e0fa462857cae705d4940afccc550782\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T12:50:06Z\\\",\\\"message\\\":\\\"W1003 12:49:55.901194 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1003 12:49:55.901561 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759495795 cert, and key in /tmp/serving-cert-3691789359/serving-signer.crt, /tmp/serving-cert-3691789359/serving-signer.key\\\\nI1003 12:49:56.093911 1 observer_polling.go:159] Starting file observer\\\\nW1003 12:49:56.097720 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1003 12:49:56.097855 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 12:49:56.098660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3691789359/tls.crt::/tmp/serving-cert-3691789359/tls.key\\\\\\\"\\\\nF1003 12:50:06.505868 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dafe1e9291df9776980832fd7d43a1d5e4caf4cd6824740f9e1f2282a6c4bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://655bb4d26e62ec501e3a4be6c70f45df1916a5bcb9ac782fae31417f09b4f01d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:22Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:22 crc kubenswrapper[4962]: I1003 12:51:22.321957 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"348e88ca-6e7d-4ab2-b45d-553888f848c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9a16c565e235b0d93e373cc23d6d8ee7d889a7caec61ef9ff50c3e904cd3893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e84d29d6915b3a6b184d69f18945d3fd277719a6cc6d503b2899df586882a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\
\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://372bc3d3078a6ab39767510fe9090adb082ed3de331d851831b710df064bef5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e104d4f9cfb2e600dba4f462bdf9ee4aa5453afbb5174857fc61fc85d90f9642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e104d4f9cfb2e600dba4f462bdf9ee4aa5453afbb5174857fc61fc85d90f9642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:22Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:22 crc kubenswrapper[4962]: I1003 12:51:22.334126 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-44fmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1a0c0a2-c2cf-4e0a-b82c-5f56c2fccec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fba23cccf497fd12c8f5ff3c93b757e30a8f5aab4ce0deca0f3ec0a49232d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16548acc84770589ac18e77e78dfc8e4f4979f836f77d414551f1e4a98e07fa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f14793519896c14e96e56f5010daa53b894965a2d6b23d6e545f346e0f493ec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdff32b133c4770278fcdd9efbb98f50a271836697db5b0736ef5bfd3746189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3237b217cf5233f908d7891c0b6a60aded9d4d89f62b34888415fa2d8c297e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1063aeb80a3a8409bf4c6a261109ca967743e67177940ce04822559bf2d2206e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab177ac1755637aed48c3db1f4855ea5420bc320bef2bb96028495b7c565dfd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-44fmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:22Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:22 crc kubenswrapper[4962]: I1003 12:51:22.348726 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e40a27aa-682e-4b25-a198-8054ba9f2477\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa233ac457a03e63e2662227f70bb88c59e3ebecf41454f217278fb34ab8dc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c066a5dc6399e5df82f7425020e84a5354658a5b77b1014a74126539aa1c738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdbr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-46vck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:22Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:22 crc kubenswrapper[4962]: I1003 12:51:22.368884 4962 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90186d9d-0ac4-4959-9fd8-b044098dc6ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57fcf06043f46f40efa58819dcfaaaa37c0426f8b80a9b322df4bdfb3ddbcddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57fcf06043f46f40efa58819dcfaaaa37c0426f8b80a9b322df4bdfb3ddbcddd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T12:51:02Z\\\",\\\"message\\\":\\\"089997 6951 lb_config.go:1031] Cluster endpoints for openshift-kube-storage-version-migrator-operator/metrics for network=default are: map[]\\\\nI1003 12:51:02.088389 6951 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}\\\\nI1003 12:51:02.090054 6951 services_controller.go:360] Finished syncing service api on namespace openshift-oauth-apiserver for network=default : 3.971777ms\\\\nI1003 12:51:02.090075 6951 services_controller.go:356] Processing sync for service openshift-network-diagnostics/network-check-target for network=default\\\\nI1003 12:51:02.089757 6951 ovnkube.go:599] Stopped ovnkube\\\\nI1003 12:51:02.089611 6951 services_controller.go:453] Built service openshift-kube-scheduler/scheduler template LB for network=default: []services.LB{}\\\\nI1003 12:51:02.090155 6951 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1003 12:51:02.090195 6951 services_controller.go:454] Service openshift-kube-scheduler/scheduler for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1003 12:51:02.090220 6951 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:51:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ksp7d_openshift-ovn-kubernetes(90186d9d-0ac4-4959-9fd8-b044098dc6ae)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdwhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ksp7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:22Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:22 crc kubenswrapper[4962]: I1003 12:51:22.378302 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kchhs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1198234-8682-43dc-9945-a826eba33888\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d869da7b2fe056fe8079fd55c78e1b02c5dbc137172a9c75514882a0a873714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6wvh
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c06137bf819289f2987b5ce188c51c2c5d98a376a0343e2eb121498221019bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6wvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kchhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:22Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:22 crc kubenswrapper[4962]: I1003 12:51:22.382071 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:22 crc kubenswrapper[4962]: I1003 12:51:22.382238 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:22 crc kubenswrapper[4962]: I1003 12:51:22.382373 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:22 crc kubenswrapper[4962]: I1003 12:51:22.382402 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:22 crc kubenswrapper[4962]: I1003 12:51:22.382416 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:22Z","lastTransitionTime":"2025-10-03T12:51:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:22 crc kubenswrapper[4962]: I1003 12:51:22.388227 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a38f90-f921-463b-8484-156bdbff17d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ad2b9fc4c53924d160f9bcbf329977550ed3f9e5724a7930bc19b137f412208\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0f0e7ddaee0852f8955a31ea974c460c5af2dae4ea15f6fa68d65fe9f0d63e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0f0e7ddaee0852f8955a31ea974c460c5af2dae4ea15f6fa68d65fe9f0d63e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T12:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T12:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:49:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:22Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:22 crc kubenswrapper[4962]: I1003 12:51:22.401482 4962 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:22Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:22 crc kubenswrapper[4962]: I1003 12:51:22.412255 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pnhsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edada57d-7295-4f56-a850-caf58ebe77a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f1a223717c0f2030da538233d9f7b621c4b2e304dcb65316b1aef7741f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ctqjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pnhsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:22Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:22 crc kubenswrapper[4962]: I1003 12:51:22.424314 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g64qv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a46fb9dd-57b2-4300-b9ab-2d40bcc4cd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91103ceb967093cdbedc1aa6b2d68dd2c1cc96a1902cd0bb8f380a3d42e54497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7rc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g64qv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:22Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:22 crc kubenswrapper[4962]: I1003 12:51:22.436070 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82494e0a2ec6e8ba7a2e713a8faf9e20bbacde0875e31f6db9962ece1a1a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:50:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:22Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:22 crc kubenswrapper[4962]: I1003 12:51:22.447234 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sdd6t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbc64268-3e78-44a2-8116-b62b5c13f005\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55381b2122f7d63231ef917dc3901e367b66dd7e75eb6fb7c3d049b81113c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://087c5eaff8dd2b9ccb73982ae2022d139077f3a93f750a8d7f7fe8bb4a8f643e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T12:51:01Z\\\",\\\"message\\\":\\\"2025-10-03T12:50:16+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b16755df-1fce-45b7-a7d8-55a972e98425\\\\n2025-10-03T12:50:16+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b16755df-1fce-45b7-a7d8-55a972e98425 to /host/opt/cni/bin/\\\\n2025-10-03T12:50:16Z [verbose] multus-daemon started\\\\n2025-10-03T12:50:16Z [verbose] Readiness Indicator file check\\\\n2025-10-03T12:51:01Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T12:50:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T12:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktptf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sdd6t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:22Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:22 crc kubenswrapper[4962]: I1003 12:51:22.456439 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5blzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2989e38-d4e7-42c9-8959-f87168a4ac14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T12:50:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrzmw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrzmw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T12:50:27Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5blzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T12:51:22Z is after 2025-08-24T17:21:41Z" Oct 03 12:51:22 crc kubenswrapper[4962]: I1003 12:51:22.484356 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:22 crc kubenswrapper[4962]: I1003 12:51:22.484407 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:22 crc kubenswrapper[4962]: I1003 12:51:22.484429 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:22 crc kubenswrapper[4962]: I1003 12:51:22.484449 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:22 crc kubenswrapper[4962]: I1003 12:51:22.484462 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:22Z","lastTransitionTime":"2025-10-03T12:51:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 03 12:51:23 crc kubenswrapper[4962]: I1003 12:51:23.202879 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 12:51:23 crc kubenswrapper[4962]: I1003 12:51:23.202912 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 12:51:23 crc kubenswrapper[4962]: I1003 12:51:23.202920 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 12:51:23 crc kubenswrapper[4962]: I1003 12:51:23.202933 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 12:51:23 crc kubenswrapper[4962]: I1003 12:51:23.202942 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:23Z","lastTransitionTime":"2025-10-03T12:51:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 12:51:23 crc kubenswrapper[4962]: I1003 12:51:23.226365 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5blzz"
Oct 03 12:51:23 crc kubenswrapper[4962]: E1003 12:51:23.226496 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5blzz" podUID="f2989e38-d4e7-42c9-8959-f87168a4ac14"
Has your network provider started?"} Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.024024 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.024105 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.024118 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.024163 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.024179 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:24Z","lastTransitionTime":"2025-10-03T12:51:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.127350 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.127415 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.127438 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.127461 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.127475 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:24Z","lastTransitionTime":"2025-10-03T12:51:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.226866 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.226923 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.226966 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:51:24 crc kubenswrapper[4962]: E1003 12:51:24.227016 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 12:51:24 crc kubenswrapper[4962]: E1003 12:51:24.227078 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 12:51:24 crc kubenswrapper[4962]: E1003 12:51:24.227195 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.230085 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.230137 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.230154 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.230174 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.230190 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:24Z","lastTransitionTime":"2025-10-03T12:51:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.333989 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.334064 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.334100 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.334128 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.334148 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:24Z","lastTransitionTime":"2025-10-03T12:51:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.436756 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.436805 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.436852 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.436876 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.436892 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:24Z","lastTransitionTime":"2025-10-03T12:51:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.538966 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.539000 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.539008 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.539022 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.539032 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:24Z","lastTransitionTime":"2025-10-03T12:51:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.640724 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.640756 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.640764 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.640777 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.640786 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:24Z","lastTransitionTime":"2025-10-03T12:51:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.743661 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.743704 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.743715 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.743730 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.743742 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:24Z","lastTransitionTime":"2025-10-03T12:51:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.846031 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.846067 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.846079 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.846097 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.846107 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:24Z","lastTransitionTime":"2025-10-03T12:51:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.948571 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.948624 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.948662 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.948679 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:24 crc kubenswrapper[4962]: I1003 12:51:24.948689 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:24Z","lastTransitionTime":"2025-10-03T12:51:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.051134 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.051172 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.051183 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.051199 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.051209 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:25Z","lastTransitionTime":"2025-10-03T12:51:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.153342 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.153379 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.153391 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.153407 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.153417 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:25Z","lastTransitionTime":"2025-10-03T12:51:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.226423 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5blzz" Oct 03 12:51:25 crc kubenswrapper[4962]: E1003 12:51:25.226541 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
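The setters.go:603 entries above show the exact Ready condition the kubelet keeps writing to the Node object while the CNI configuration is missing. Below is a minimal sketch of how that payload is shaped, built with the upstream k8s.io/api types; it illustrates the condition structure only, not the kubelet's own status-setter code path, and it needs the k8s.io/api and k8s.io/apimachinery modules on the module path.

    package main

    import (
        "encoding/json"
        "fmt"

        corev1 "k8s.io/api/core/v1"
        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
    )

    func main() {
        // Rebuild the condition payload seen in the setters.go:603 entries:
        // Ready=False with reason KubeletNotReady while the runtime reports
        // NetworkReady=false.
        cond := corev1.NodeCondition{
            Type:               corev1.NodeReady,
            Status:             corev1.ConditionFalse,
            LastHeartbeatTime:  metav1.Now(),
            LastTransitionTime: metav1.Now(),
            Reason:             "KubeletNotReady",
            Message: "container runtime network not ready: NetworkReady=false " +
                "reason:NetworkPluginNotReady message:Network plugin returns error: " +
                "no CNI configuration file in /etc/kubernetes/cni/net.d/. " +
                "Has your network provider started?",
        }
        out, _ := json.Marshal(cond)
        fmt.Println(string(out))
    }

Marshaling this struct produces the same field names as the condition={...} payload in the log ("type", "status", "lastHeartbeatTime", "lastTransitionTime", "reason", "message"); only the timestamps differ.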
pod="openshift-multus/network-metrics-daemon-5blzz" podUID="f2989e38-d4e7-42c9-8959-f87168a4ac14" Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.256496 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.256539 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.256548 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.256562 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.256570 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:25Z","lastTransitionTime":"2025-10-03T12:51:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.358943 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.358979 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.358995 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.359010 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.359022 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:25Z","lastTransitionTime":"2025-10-03T12:51:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.460822 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.460861 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.460870 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.460884 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.460894 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:25Z","lastTransitionTime":"2025-10-03T12:51:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.563691 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.563729 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.563740 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.563758 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.563770 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:25Z","lastTransitionTime":"2025-10-03T12:51:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.666251 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.666302 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.666313 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.666333 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.666346 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:25Z","lastTransitionTime":"2025-10-03T12:51:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.768423 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.768459 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.768469 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.768483 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.768495 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:25Z","lastTransitionTime":"2025-10-03T12:51:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.870898 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.871004 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.871014 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.871029 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.871038 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:25Z","lastTransitionTime":"2025-10-03T12:51:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.973341 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.973384 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.973395 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.973414 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:25 crc kubenswrapper[4962]: I1003 12:51:25.973426 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:25Z","lastTransitionTime":"2025-10-03T12:51:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:26 crc kubenswrapper[4962]: I1003 12:51:26.075330 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:26 crc kubenswrapper[4962]: I1003 12:51:26.075379 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:26 crc kubenswrapper[4962]: I1003 12:51:26.075394 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:26 crc kubenswrapper[4962]: I1003 12:51:26.075414 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:26 crc kubenswrapper[4962]: I1003 12:51:26.075429 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:26Z","lastTransitionTime":"2025-10-03T12:51:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:26 crc kubenswrapper[4962]: I1003 12:51:26.176950 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:26 crc kubenswrapper[4962]: I1003 12:51:26.176976 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:26 crc kubenswrapper[4962]: I1003 12:51:26.176985 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:26 crc kubenswrapper[4962]: I1003 12:51:26.176996 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:26 crc kubenswrapper[4962]: I1003 12:51:26.177004 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:26Z","lastTransitionTime":"2025-10-03T12:51:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:26 crc kubenswrapper[4962]: I1003 12:51:26.227096 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:51:26 crc kubenswrapper[4962]: I1003 12:51:26.227159 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:51:26 crc kubenswrapper[4962]: I1003 12:51:26.227153 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:51:26 crc kubenswrapper[4962]: E1003 12:51:26.227312 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
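The repeated "no CNI configuration file in /etc/kubernetes/cni/net.d/" error is the network-readiness probe failing because the config directory is empty. A rough stdlib approximation of that check is sketched below; the real path goes through libcni, which also parses and validates the files, so the extension list here is an assumption for illustration.

    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    // hasCNIConfig reports whether dir contains at least one file with an
    // extension a CNI config loader would consider: .conf, .conflist, .json.
    func hasCNIConfig(dir string) (bool, error) {
        entries, err := os.ReadDir(dir)
        if err != nil {
            return false, err
        }
        for _, e := range entries {
            if e.IsDir() {
                continue
            }
            switch filepath.Ext(e.Name()) {
            case ".conf", ".conflist", ".json":
                return true, nil
            }
        }
        return false, nil
    }

    func main() {
        ok, err := hasCNIConfig("/etc/kubernetes/cni/net.d")
        switch {
        case err != nil:
            fmt.Println("cannot read CNI conf dir:", err)
        case !ok:
            // This is the state the log is stuck in: NetworkReady=false.
            fmt.Println("no CNI configuration file found; NetworkReady stays false")
        default:
            fmt.Println("CNI configuration present; NetworkReady can become true")
        }
    }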
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 12:51:26 crc kubenswrapper[4962]: E1003 12:51:26.227474 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 12:51:26 crc kubenswrapper[4962]: E1003 12:51:26.227864 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 12:51:26 crc kubenswrapper[4962]: I1003 12:51:26.280311 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:26 crc kubenswrapper[4962]: I1003 12:51:26.280389 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:26 crc kubenswrapper[4962]: I1003 12:51:26.280410 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:26 crc kubenswrapper[4962]: I1003 12:51:26.280440 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:26 crc kubenswrapper[4962]: I1003 12:51:26.280462 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:26Z","lastTransitionTime":"2025-10-03T12:51:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:26 crc kubenswrapper[4962]: I1003 12:51:26.383278 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:26 crc kubenswrapper[4962]: I1003 12:51:26.383338 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:26 crc kubenswrapper[4962]: I1003 12:51:26.383350 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:26 crc kubenswrapper[4962]: I1003 12:51:26.383368 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:26 crc kubenswrapper[4962]: I1003 12:51:26.383379 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:26Z","lastTransitionTime":"2025-10-03T12:51:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:26 crc kubenswrapper[4962]: I1003 12:51:26.486072 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:26 crc kubenswrapper[4962]: I1003 12:51:26.486137 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:26 crc kubenswrapper[4962]: I1003 12:51:26.486160 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:26 crc kubenswrapper[4962]: I1003 12:51:26.486189 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:26 crc kubenswrapper[4962]: I1003 12:51:26.486210 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:26Z","lastTransitionTime":"2025-10-03T12:51:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:26 crc kubenswrapper[4962]: I1003 12:51:26.594611 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:26 crc kubenswrapper[4962]: I1003 12:51:26.594701 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:26 crc kubenswrapper[4962]: I1003 12:51:26.594716 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:26 crc kubenswrapper[4962]: I1003 12:51:26.594737 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:26 crc kubenswrapper[4962]: I1003 12:51:26.594752 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:26Z","lastTransitionTime":"2025-10-03T12:51:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:26 crc kubenswrapper[4962]: I1003 12:51:26.698148 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:26 crc kubenswrapper[4962]: I1003 12:51:26.698196 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:26 crc kubenswrapper[4962]: I1003 12:51:26.698208 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:26 crc kubenswrapper[4962]: I1003 12:51:26.698226 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:26 crc kubenswrapper[4962]: I1003 12:51:26.698239 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:26Z","lastTransitionTime":"2025-10-03T12:51:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:26 crc kubenswrapper[4962]: I1003 12:51:26.800873 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:26 crc kubenswrapper[4962]: I1003 12:51:26.800918 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:26 crc kubenswrapper[4962]: I1003 12:51:26.800932 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:26 crc kubenswrapper[4962]: I1003 12:51:26.800949 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:26 crc kubenswrapper[4962]: I1003 12:51:26.800962 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:26Z","lastTransitionTime":"2025-10-03T12:51:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:26 crc kubenswrapper[4962]: I1003 12:51:26.903301 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:26 crc kubenswrapper[4962]: I1003 12:51:26.903336 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:26 crc kubenswrapper[4962]: I1003 12:51:26.903346 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:26 crc kubenswrapper[4962]: I1003 12:51:26.903385 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:26 crc kubenswrapper[4962]: I1003 12:51:26.903399 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:26Z","lastTransitionTime":"2025-10-03T12:51:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.006266 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.006313 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.006321 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.006338 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.006350 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:27Z","lastTransitionTime":"2025-10-03T12:51:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.109150 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.109186 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.109197 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.109215 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.109227 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:27Z","lastTransitionTime":"2025-10-03T12:51:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.211261 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.211288 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.211296 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.211312 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.211320 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:27Z","lastTransitionTime":"2025-10-03T12:51:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.227142 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5blzz" Oct 03 12:51:27 crc kubenswrapper[4962]: E1003 12:51:27.227515 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
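From 12:51:24 onward the same five-entry cycle (three NodeHasSufficient* events, NodeNotReady, then the setters.go line) repeats roughly every 100 ms. When skimming a capture like this, a filter that collapses consecutive entries that are identical once timestamps are stripped makes the one-off entries (sandbox creation, container restarts) stand out. A hypothetical stdlib-only helper, with the prefix regex tailored to the "crc kubenswrapper" lines in this log:

    package main

    import (
        "bufio"
        "fmt"
        "os"
        "regexp"
    )

    // tsRE strips the journald prefix and the klog header so that entries
    // differing only in time compare equal; isoRE blanks the RFC3339
    // timestamps embedded in the condition={...} payloads.
    var (
        tsRE  = regexp.MustCompile(`^\w{3} \d\d \d\d:\d\d:\d\d crc kubenswrapper\[\d+\]: [IWE]\d{4} \d\d:\d\d:\d\d\.\d+\s+\d+\s+`)
        isoRE = regexp.MustCompile(`\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}Z`)
    )

    func main() {
        sc := bufio.NewScanner(os.Stdin)
        sc.Buffer(make([]byte, 1024*1024), 1024*1024) // journal lines can be long
        var last string
        count := 0
        flush := func() {
            if count > 1 {
                fmt.Printf("    (last message repeated %d times)\n", count-1)
            }
        }
        for sc.Scan() {
            line := sc.Text()
            key := isoRE.ReplaceAllString(tsRE.ReplaceAllString(line, ""), "<ts>")
            if key == last {
                count++
                continue
            }
            flush()
            fmt.Println(line)
            last, count = key, 1
        }
        flush()
    }

Fed this section on stdin (one journal entry per line), it would print each distinct message once with a repeat count, leaving only a handful of lines per second.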
pod="openshift-multus/network-metrics-daemon-5blzz" podUID="f2989e38-d4e7-42c9-8959-f87168a4ac14" Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.313879 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.313920 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.313930 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.313946 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.313960 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:27Z","lastTransitionTime":"2025-10-03T12:51:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.416809 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.416841 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.416852 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.416866 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.416878 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:27Z","lastTransitionTime":"2025-10-03T12:51:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.519604 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.519727 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.519751 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.519780 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.519802 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:27Z","lastTransitionTime":"2025-10-03T12:51:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.622055 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.622109 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.622136 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.622160 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.622176 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:27Z","lastTransitionTime":"2025-10-03T12:51:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.724739 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.724819 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.724839 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.724906 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.724945 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:27Z","lastTransitionTime":"2025-10-03T12:51:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.827273 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.827318 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.827333 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.827352 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.827368 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:27Z","lastTransitionTime":"2025-10-03T12:51:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.929209 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.929252 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.929276 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.929301 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:27 crc kubenswrapper[4962]: I1003 12:51:27.929316 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:27Z","lastTransitionTime":"2025-10-03T12:51:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.031283 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.031341 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.031359 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.031381 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.031399 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:28Z","lastTransitionTime":"2025-10-03T12:51:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.134745 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.134805 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.134822 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.134845 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.134864 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:28Z","lastTransitionTime":"2025-10-03T12:51:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.227146 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.227208 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:51:28 crc kubenswrapper[4962]: E1003 12:51:28.227334 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.227354 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:51:28 crc kubenswrapper[4962]: E1003 12:51:28.227450 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 12:51:28 crc kubenswrapper[4962]: E1003 12:51:28.227560 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
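Each "No sandbox for pod can be found. Need to start a new one" entry means the kubelet is about to ask the container runtime (CRI-O on a CRC node) to create a fresh pod sandbox over the CRI gRPC API, and sandbox creation is precisely the step that cannot complete while net.d is empty. A bare-bones sketch of that call using the published CRI client stubs follows; the socket path is an assumption, and the real kubelet passes a far fuller PodSandboxConfig (DNS, port mappings, cgroups, security context).

    package main

    import (
        "context"
        "fmt"
        "time"

        "google.golang.org/grpc"
        "google.golang.org/grpc/credentials/insecure"
        runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
    )

    func main() {
        ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
        defer cancel()

        // Assumed CRI-O socket path for a CRC node.
        conn, err := grpc.NewClient("unix:///var/run/crio/crio.sock",
            grpc.WithTransportCredentials(insecure.NewCredentials()))
        if err != nil {
            panic(err)
        }
        defer conn.Close()

        rt := runtimeapi.NewRuntimeServiceClient(conn)
        // Minimal sandbox config for the pod named in the log.
        resp, err := rt.RunPodSandbox(ctx, &runtimeapi.RunPodSandboxRequest{
            Config: &runtimeapi.PodSandboxConfig{
                Metadata: &runtimeapi.PodSandboxMetadata{
                    Name:      "network-metrics-daemon-5blzz",
                    Namespace: "openshift-multus",
                    Uid:       "f2989e38-d4e7-42c9-8959-f87168a4ac14",
                },
            },
        })
        if err != nil {
            // While /etc/kubernetes/cni/net.d/ is empty this fails, and the
            // kubelet logs "Error syncing pod, skipping" instead.
            fmt.Println("RunPodSandbox failed:", err)
            return
        }
        fmt.Println("sandbox id:", resp.PodSandboxId)
    }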
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.236919 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.236957 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.236966 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.236981 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.236993 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:28Z","lastTransitionTime":"2025-10-03T12:51:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.339581 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.339660 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.339670 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.339686 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.339696 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:28Z","lastTransitionTime":"2025-10-03T12:51:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.442034 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.442073 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.442083 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.442098 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.442108 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:28Z","lastTransitionTime":"2025-10-03T12:51:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.544828 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.544877 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.544887 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.544908 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.544919 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:28Z","lastTransitionTime":"2025-10-03T12:51:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.647395 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.647448 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.647459 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.647477 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.647489 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:28Z","lastTransitionTime":"2025-10-03T12:51:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.750525 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.750580 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.750596 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.750619 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.750664 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:28Z","lastTransitionTime":"2025-10-03T12:51:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.853157 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.853199 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.853209 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.853224 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.853236 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:28Z","lastTransitionTime":"2025-10-03T12:51:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.955789 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.955844 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.955860 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.955883 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:28 crc kubenswrapper[4962]: I1003 12:51:28.955900 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:28Z","lastTransitionTime":"2025-10-03T12:51:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.059185 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.059258 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.059281 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.059309 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.059912 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:29Z","lastTransitionTime":"2025-10-03T12:51:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.162761 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.162834 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.162857 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.162884 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.162903 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:29Z","lastTransitionTime":"2025-10-03T12:51:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.226852 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5blzz" Oct 03 12:51:29 crc kubenswrapper[4962]: E1003 12:51:29.227048 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
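In the entries just below, the kubelet removes a dead container and then declines to restart ovnkube-controller: "back-off 40s restarting failed container". That crash loop is the plausible root cause of this whole section, since in OVN-Kubernetes deployments the ovnkube-node pod is what writes the CNI configuration the kubelet keeps reporting as missing. The 40s figure is consistent with the kubelet's doubling container-restart backoff; the 10s base and 5m cap in the sketch below are assumptions from my reading of kubelet defaults, not something this log states.

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Assumed kubelet restart backoff: start at 10s, double on each
        // consecutive failure, cap at 5m.
        base, maxDelay := 10*time.Second, 5*time.Minute
        d := base
        for i := 1; i <= 8; i++ {
            fmt.Printf("failure %d: back-off %s\n", i, d)
            d *= 2
            if d > maxDelay {
                d = maxDelay
            }
        }
    }

Under that schedule, "back-off 40s" corresponds to the third consecutive failure, so by 12:51:29 the container has already crashed a few times.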
pod="openshift-multus/network-metrics-daemon-5blzz" podUID="f2989e38-d4e7-42c9-8959-f87168a4ac14" Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.227792 4962 scope.go:117] "RemoveContainer" containerID="57fcf06043f46f40efa58819dcfaaaa37c0426f8b80a9b322df4bdfb3ddbcddd" Oct 03 12:51:29 crc kubenswrapper[4962]: E1003 12:51:29.227995 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-ksp7d_openshift-ovn-kubernetes(90186d9d-0ac4-4959-9fd8-b044098dc6ae)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.265777 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.265827 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.265841 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.265863 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.265876 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:29Z","lastTransitionTime":"2025-10-03T12:51:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.367624 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.367693 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.367705 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.367721 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.367731 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:29Z","lastTransitionTime":"2025-10-03T12:51:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.470184 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.470222 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.470232 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.470247 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.470256 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:29Z","lastTransitionTime":"2025-10-03T12:51:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.572661 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.572702 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.572710 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.572724 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.572735 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:29Z","lastTransitionTime":"2025-10-03T12:51:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.674370 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.674414 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.674427 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.674445 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.674457 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:29Z","lastTransitionTime":"2025-10-03T12:51:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.777592 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.777714 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.777731 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.777747 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.777761 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:29Z","lastTransitionTime":"2025-10-03T12:51:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.880345 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.880597 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.880678 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.880741 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.880816 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:29Z","lastTransitionTime":"2025-10-03T12:51:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.984085 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.984339 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.984404 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.984501 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:29 crc kubenswrapper[4962]: I1003 12:51:29.984560 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:29Z","lastTransitionTime":"2025-10-03T12:51:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 12:51:30 crc kubenswrapper[4962]: I1003 12:51:30.087345 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:30 crc kubenswrapper[4962]: I1003 12:51:30.087378 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:30 crc kubenswrapper[4962]: I1003 12:51:30.087388 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:30 crc kubenswrapper[4962]: I1003 12:51:30.087402 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:30 crc kubenswrapper[4962]: I1003 12:51:30.087411 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:30Z","lastTransitionTime":"2025-10-03T12:51:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:30 crc kubenswrapper[4962]: I1003 12:51:30.150220 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 12:51:30 crc kubenswrapper[4962]: I1003 12:51:30.150285 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 12:51:30 crc kubenswrapper[4962]: I1003 12:51:30.150298 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 12:51:30 crc kubenswrapper[4962]: I1003 12:51:30.150316 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 12:51:30 crc kubenswrapper[4962]: I1003 12:51:30.150327 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T12:51:30Z","lastTransitionTime":"2025-10-03T12:51:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 12:51:30 crc kubenswrapper[4962]: I1003 12:51:30.193179 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-4dljd"] Oct 03 12:51:30 crc kubenswrapper[4962]: I1003 12:51:30.193580 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4dljd"
Oct 03 12:51:30 crc kubenswrapper[4962]: I1003 12:51:30.195574 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Oct 03 12:51:30 crc kubenswrapper[4962]: I1003 12:51:30.195934 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Oct 03 12:51:30 crc kubenswrapper[4962]: I1003 12:51:30.196259 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Oct 03 12:51:30 crc kubenswrapper[4962]: I1003 12:51:30.197001 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Oct 03 12:51:30 crc kubenswrapper[4962]: I1003 12:51:30.226569 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 03 12:51:30 crc kubenswrapper[4962]: E1003 12:51:30.226940 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 03 12:51:30 crc kubenswrapper[4962]: I1003 12:51:30.227202 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 03 12:51:30 crc kubenswrapper[4962]: E1003 12:51:30.227335 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 03 12:51:30 crc kubenswrapper[4962]: I1003 12:51:30.227697 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 03 12:51:30 crc kubenswrapper[4962]: E1003 12:51:30.227811 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
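
Note: every "Error syncing pod, skipping ... network is not ready" entry above is the same gate firing: pods that need pod networking cannot be synced while the CRI runtime reports NetworkReady=false, and the kubelet simply retries them on later sync passes. A minimal Go sketch of that check, with illustrative types and names rather than the kubelet's actual code:

    package main

    import "fmt"

    // RuntimeStatus mirrors what the CRI runtime reports back to the kubelet.
    type RuntimeStatus struct {
        NetworkReady bool
        Reason       string // e.g. "NetworkPluginNotReady"
        Message      string // e.g. "no CNI configuration file in /etc/kubernetes/cni/net.d/"
    }

    // canSyncPod refuses to sync any pod that needs pod networking until the
    // network plugin is ready, which is why the same message repeats for the
    // same pods until a CNI configuration file appears.
    func canSyncPod(hostNetwork bool, rs RuntimeStatus) error {
        if !hostNetwork && !rs.NetworkReady {
            return fmt.Errorf("network is not ready: %s: %s", rs.Reason, rs.Message)
        }
        return nil
    }

    func main() {
        rs := RuntimeStatus{
            NetworkReady: false,
            Reason:       "NetworkPluginNotReady",
            Message:      "no CNI configuration file in /etc/kubernetes/cni/net.d/",
        }
        if err := canSyncPod(false, rs); err != nil {
            fmt.Println("Error syncing pod, skipping:", err)
        }
    }
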
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 12:51:30 crc kubenswrapper[4962]: I1003 12:51:30.233357 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=73.233339475 podStartE2EDuration="1m13.233339475s" podCreationTimestamp="2025-10-03 12:50:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:30.233252253 +0000 UTC m=+98.637150108" watchObservedRunningTime="2025-10-03 12:51:30.233339475 +0000 UTC m=+98.637237310" Oct 03 12:51:30 crc kubenswrapper[4962]: I1003 12:51:30.298332 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=78.298310129 podStartE2EDuration="1m18.298310129s" podCreationTimestamp="2025-10-03 12:50:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:30.295008848 +0000 UTC m=+98.698906703" watchObservedRunningTime="2025-10-03 12:51:30.298310129 +0000 UTC m=+98.702207964" Oct 03 12:51:30 crc kubenswrapper[4962]: I1003 12:51:30.308565 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=78.3085461 podStartE2EDuration="1m18.3085461s" podCreationTimestamp="2025-10-03 12:50:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:30.308431206 +0000 UTC m=+98.712329041" watchObservedRunningTime="2025-10-03 12:51:30.3085461 +0000 UTC m=+98.712443935" Oct 03 12:51:30 crc kubenswrapper[4962]: I1003 12:51:30.317860 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e31ea8af-5e14-4310-be97-c340f3715511-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-4dljd\" (UID: \"e31ea8af-5e14-4310-be97-c340f3715511\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4dljd" Oct 03 12:51:30 crc kubenswrapper[4962]: I1003 12:51:30.317909 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e31ea8af-5e14-4310-be97-c340f3715511-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-4dljd\" (UID: \"e31ea8af-5e14-4310-be97-c340f3715511\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4dljd" Oct 03 12:51:30 crc kubenswrapper[4962]: I1003 12:51:30.317934 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e31ea8af-5e14-4310-be97-c340f3715511-service-ca\") pod \"cluster-version-operator-5c965bbfc6-4dljd\" (UID: \"e31ea8af-5e14-4310-be97-c340f3715511\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4dljd" Oct 03 12:51:30 crc kubenswrapper[4962]: I1003 12:51:30.317957 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e31ea8af-5e14-4310-be97-c340f3715511-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-4dljd\" (UID: 
\"e31ea8af-5e14-4310-be97-c340f3715511\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4dljd" Oct 03 12:51:30 crc kubenswrapper[4962]: I1003 12:51:30.318101 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e31ea8af-5e14-4310-be97-c340f3715511-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-4dljd\" (UID: \"e31ea8af-5e14-4310-be97-c340f3715511\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4dljd" Oct 03 12:51:30 crc kubenswrapper[4962]: I1003 12:51:30.322753 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=44.322742449 podStartE2EDuration="44.322742449s" podCreationTimestamp="2025-10-03 12:50:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:30.322160253 +0000 UTC m=+98.726058088" watchObservedRunningTime="2025-10-03 12:51:30.322742449 +0000 UTC m=+98.726640284" Oct 03 12:51:30 crc kubenswrapper[4962]: I1003 12:51:30.352093 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-44fmz" podStartSLOduration=77.352075734 podStartE2EDuration="1m17.352075734s" podCreationTimestamp="2025-10-03 12:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:30.335460918 +0000 UTC m=+98.739358763" watchObservedRunningTime="2025-10-03 12:51:30.352075734 +0000 UTC m=+98.755973569" Oct 03 12:51:30 crc kubenswrapper[4962]: I1003 12:51:30.352306 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podStartSLOduration=77.35230302 podStartE2EDuration="1m17.35230302s" podCreationTimestamp="2025-10-03 12:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:30.352018453 +0000 UTC m=+98.755916298" watchObservedRunningTime="2025-10-03 12:51:30.35230302 +0000 UTC m=+98.756200855" Oct 03 12:51:30 crc kubenswrapper[4962]: I1003 12:51:30.382058 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kchhs" podStartSLOduration=76.382040587 podStartE2EDuration="1m16.382040587s" podCreationTimestamp="2025-10-03 12:50:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:30.382034266 +0000 UTC m=+98.785932101" watchObservedRunningTime="2025-10-03 12:51:30.382040587 +0000 UTC m=+98.785938422" Oct 03 12:51:30 crc kubenswrapper[4962]: I1003 12:51:30.407235 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=27.407218068 podStartE2EDuration="27.407218068s" podCreationTimestamp="2025-10-03 12:51:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:30.396100532 +0000 UTC m=+98.799998387" watchObservedRunningTime="2025-10-03 12:51:30.407218068 +0000 UTC m=+98.811115903" Oct 03 12:51:30 crc kubenswrapper[4962]: I1003 
Oct 03 12:51:30 crc kubenswrapper[4962]: I1003 12:51:30.419457 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e31ea8af-5e14-4310-be97-c340f3715511-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-4dljd\" (UID: \"e31ea8af-5e14-4310-be97-c340f3715511\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4dljd"
Oct 03 12:51:30 crc kubenswrapper[4962]: I1003 12:51:30.419707 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e31ea8af-5e14-4310-be97-c340f3715511-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-4dljd\" (UID: \"e31ea8af-5e14-4310-be97-c340f3715511\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4dljd"
Oct 03 12:51:30 crc kubenswrapper[4962]: I1003 12:51:30.419916 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e31ea8af-5e14-4310-be97-c340f3715511-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-4dljd\" (UID: \"e31ea8af-5e14-4310-be97-c340f3715511\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4dljd"
Oct 03 12:51:30 crc kubenswrapper[4962]: I1003 12:51:30.420033 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e31ea8af-5e14-4310-be97-c340f3715511-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-4dljd\" (UID: \"e31ea8af-5e14-4310-be97-c340f3715511\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4dljd"
Oct 03 12:51:30 crc kubenswrapper[4962]: I1003 12:51:30.421030 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e31ea8af-5e14-4310-be97-c340f3715511-service-ca\") pod \"cluster-version-operator-5c965bbfc6-4dljd\" (UID: \"e31ea8af-5e14-4310-be97-c340f3715511\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4dljd"
Oct 03 12:51:30 crc kubenswrapper[4962]: I1003 12:51:30.419844 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e31ea8af-5e14-4310-be97-c340f3715511-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-4dljd\" (UID: \"e31ea8af-5e14-4310-be97-c340f3715511\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4dljd"
Oct 03 12:51:30 crc kubenswrapper[4962]: I1003 12:51:30.419592 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e31ea8af-5e14-4310-be97-c340f3715511-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-4dljd\" (UID: \"e31ea8af-5e14-4310-be97-c340f3715511\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4dljd"
Oct 03 12:51:30 crc kubenswrapper[4962]: I1003 12:51:30.422210 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e31ea8af-5e14-4310-be97-c340f3715511-service-ca\") pod \"cluster-version-operator-5c965bbfc6-4dljd\" (UID: \"e31ea8af-5e14-4310-be97-c340f3715511\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4dljd"
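
Note: the "started"/"succeeded" timestamps above interleave out of order (.419844 and .419592 are logged after .421030) because each mount runs on its own goroutine, with a pending-operations table ensuring at most one operation per volume is in flight; completion order is not submission order. A minimal sketch of that pattern, assuming nothing about the real operation executor:

    package main

    import (
        "fmt"
        "sync"
    )

    func main() {
        volumes := []string{"etc-cvo-updatepayloads", "etc-ssl-certs",
            "kube-api-access", "serving-cert", "service-ca"}

        inFlight := map[string]bool{} // at most one pending operation per volume
        var mu sync.Mutex
        var wg sync.WaitGroup

        for _, v := range volumes {
            mu.Lock()
            if inFlight[v] {
                mu.Unlock()
                continue // a duplicate request for the same volume is refused
            }
            inFlight[v] = true
            mu.Unlock()

            wg.Add(1)
            go func(name string) { // runs concurrently; finish order may differ
                defer wg.Done()
                fmt.Println("MountVolume.SetUp succeeded for", name)
            }(v)
        }
        wg.Wait()
    }
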
Oct 03 12:51:30 crc kubenswrapper[4962]: I1003 12:51:30.428357 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e31ea8af-5e14-4310-be97-c340f3715511-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-4dljd\" (UID: \"e31ea8af-5e14-4310-be97-c340f3715511\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4dljd"
Oct 03 12:51:30 crc kubenswrapper[4962]: I1003 12:51:30.428969 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-g64qv" podStartSLOduration=77.428956874 podStartE2EDuration="1m17.428956874s" podCreationTimestamp="2025-10-03 12:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:30.428173743 +0000 UTC m=+98.832071618" watchObservedRunningTime="2025-10-03 12:51:30.428956874 +0000 UTC m=+98.832854709"
Oct 03 12:51:30 crc kubenswrapper[4962]: I1003 12:51:30.435747 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e31ea8af-5e14-4310-be97-c340f3715511-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-4dljd\" (UID: \"e31ea8af-5e14-4310-be97-c340f3715511\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4dljd"
Oct 03 12:51:30 crc kubenswrapper[4962]: I1003 12:51:30.454971 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-sdd6t" podStartSLOduration=77.454953058 podStartE2EDuration="1m17.454953058s" podCreationTimestamp="2025-10-03 12:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:30.45432343 +0000 UTC m=+98.858221265" watchObservedRunningTime="2025-10-03 12:51:30.454953058 +0000 UTC m=+98.858850893"
Oct 03 12:51:30 crc kubenswrapper[4962]: I1003 12:51:30.508274 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4dljd"
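
Note: a few entries below, the metrics-certs mount for network-metrics-daemon-5blzz fails ("metrics-daemon-secret" not registered) and the retry is pushed out with durationBeforeRetry 1m4s. 64s is consistent with a per-operation backoff that doubles on every failure (1s, 2s, 4s, ... 1m4s) up to a cap; the constants in this sketch are assumptions for illustration, not the kubelet's actual values:

    package main

    import (
        "fmt"
        "time"
    )

    // nextBackoff doubles the wait after each failure, up to a maximum delay.
    func nextBackoff(current, maxDelay time.Duration) time.Duration {
        next := current * 2
        if next > maxDelay {
            return maxDelay
        }
        return next
    }

    func main() {
        d := 500 * time.Millisecond            // assumed initial delay
        limit := 2*time.Minute + 2*time.Second // assumed cap
        for i := 0; i < 8; i++ {
            d = nextBackoff(d, limit)
            fmt.Println(d) // 1s 2s 4s 8s 16s 32s 1m4s 2m2s
        }
    }
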
Oct 03 12:51:30 crc kubenswrapper[4962]: I1003 12:51:30.673871 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4dljd" event={"ID":"e31ea8af-5e14-4310-be97-c340f3715511","Type":"ContainerStarted","Data":"3940b78745010bc962535884990726b8a910e8acf2eeceeb6ed901e74a0d2fa6"}
Oct 03 12:51:30 crc kubenswrapper[4962]: I1003 12:51:30.674224 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4dljd" event={"ID":"e31ea8af-5e14-4310-be97-c340f3715511","Type":"ContainerStarted","Data":"725889fa0ac01cc3a7b653353ab03ebe19e8b684b51a7bf28a168f14eb8109f3"}
Oct 03 12:51:30 crc kubenswrapper[4962]: I1003 12:51:30.690293 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4dljd" podStartSLOduration=77.690273306 podStartE2EDuration="1m17.690273306s" podCreationTimestamp="2025-10-03 12:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:30.688760815 +0000 UTC m=+99.092658660" watchObservedRunningTime="2025-10-03 12:51:30.690273306 +0000 UTC m=+99.094171141"
Oct 03 12:51:31 crc kubenswrapper[4962]: I1003 12:51:31.227060 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5blzz"
Oct 03 12:51:31 crc kubenswrapper[4962]: E1003 12:51:31.227318 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5blzz" podUID="f2989e38-d4e7-42c9-8959-f87168a4ac14"
Oct 03 12:51:31 crc kubenswrapper[4962]: I1003 12:51:31.431474 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f2989e38-d4e7-42c9-8959-f87168a4ac14-metrics-certs\") pod \"network-metrics-daemon-5blzz\" (UID: \"f2989e38-d4e7-42c9-8959-f87168a4ac14\") " pod="openshift-multus/network-metrics-daemon-5blzz"
Oct 03 12:51:31 crc kubenswrapper[4962]: E1003 12:51:31.431667 4962 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 03 12:51:31 crc kubenswrapper[4962]: E1003 12:51:31.431726 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2989e38-d4e7-42c9-8959-f87168a4ac14-metrics-certs podName:f2989e38-d4e7-42c9-8959-f87168a4ac14 nodeName:}" failed. No retries permitted until 2025-10-03 12:52:35.431709965 +0000 UTC m=+163.835607800 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f2989e38-d4e7-42c9-8959-f87168a4ac14-metrics-certs") pod "network-metrics-daemon-5blzz" (UID: "f2989e38-d4e7-42c9-8959-f87168a4ac14") : object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 03 12:51:32 crc kubenswrapper[4962]: I1003 12:51:32.227013 4962 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:51:32 crc kubenswrapper[4962]: I1003 12:51:32.227161 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:51:32 crc kubenswrapper[4962]: E1003 12:51:32.228165 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 12:51:32 crc kubenswrapper[4962]: I1003 12:51:32.228231 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:51:32 crc kubenswrapper[4962]: E1003 12:51:32.228429 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 12:51:32 crc kubenswrapper[4962]: E1003 12:51:32.228499 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 12:51:33 crc kubenswrapper[4962]: I1003 12:51:33.226733 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5blzz" Oct 03 12:51:33 crc kubenswrapper[4962]: E1003 12:51:33.226870 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5blzz" podUID="f2989e38-d4e7-42c9-8959-f87168a4ac14" Oct 03 12:51:34 crc kubenswrapper[4962]: I1003 12:51:34.227194 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:51:34 crc kubenswrapper[4962]: I1003 12:51:34.227346 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:51:34 crc kubenswrapper[4962]: I1003 12:51:34.227271 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:51:34 crc kubenswrapper[4962]: E1003 12:51:34.227475 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 12:51:34 crc kubenswrapper[4962]: E1003 12:51:34.227735 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 12:51:34 crc kubenswrapper[4962]: E1003 12:51:34.227770 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 12:51:35 crc kubenswrapper[4962]: I1003 12:51:35.226262 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5blzz" Oct 03 12:51:35 crc kubenswrapper[4962]: E1003 12:51:35.226410 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5blzz" podUID="f2989e38-d4e7-42c9-8959-f87168a4ac14" Oct 03 12:51:36 crc kubenswrapper[4962]: I1003 12:51:36.226627 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:51:36 crc kubenswrapper[4962]: I1003 12:51:36.226715 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:51:36 crc kubenswrapper[4962]: I1003 12:51:36.226729 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:51:36 crc kubenswrapper[4962]: E1003 12:51:36.226807 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 12:51:36 crc kubenswrapper[4962]: E1003 12:51:36.226935 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 12:51:36 crc kubenswrapper[4962]: E1003 12:51:36.227104 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 12:51:37 crc kubenswrapper[4962]: I1003 12:51:37.227279 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5blzz" Oct 03 12:51:37 crc kubenswrapper[4962]: E1003 12:51:37.227526 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5blzz" podUID="f2989e38-d4e7-42c9-8959-f87168a4ac14" Oct 03 12:51:38 crc kubenswrapper[4962]: I1003 12:51:38.226453 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:51:38 crc kubenswrapper[4962]: I1003 12:51:38.226594 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:51:38 crc kubenswrapper[4962]: E1003 12:51:38.226712 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 12:51:38 crc kubenswrapper[4962]: E1003 12:51:38.226809 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 12:51:38 crc kubenswrapper[4962]: I1003 12:51:38.226600 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:51:38 crc kubenswrapper[4962]: E1003 12:51:38.226881 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 12:51:39 crc kubenswrapper[4962]: I1003 12:51:39.226760 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5blzz" Oct 03 12:51:39 crc kubenswrapper[4962]: E1003 12:51:39.227181 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5blzz" podUID="f2989e38-d4e7-42c9-8959-f87168a4ac14" Oct 03 12:51:40 crc kubenswrapper[4962]: I1003 12:51:40.226575 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:51:40 crc kubenswrapper[4962]: I1003 12:51:40.226623 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:51:40 crc kubenswrapper[4962]: E1003 12:51:40.226773 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 12:51:40 crc kubenswrapper[4962]: E1003 12:51:40.226866 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 12:51:40 crc kubenswrapper[4962]: I1003 12:51:40.226608 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:51:40 crc kubenswrapper[4962]: E1003 12:51:40.227764 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 12:51:41 crc kubenswrapper[4962]: I1003 12:51:41.227117 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5blzz" Oct 03 12:51:41 crc kubenswrapper[4962]: E1003 12:51:41.227288 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5blzz" podUID="f2989e38-d4e7-42c9-8959-f87168a4ac14" Oct 03 12:51:42 crc kubenswrapper[4962]: I1003 12:51:42.226965 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:51:42 crc kubenswrapper[4962]: I1003 12:51:42.227047 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:51:42 crc kubenswrapper[4962]: E1003 12:51:42.227971 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 12:51:42 crc kubenswrapper[4962]: I1003 12:51:42.227991 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:51:42 crc kubenswrapper[4962]: E1003 12:51:42.228115 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 12:51:42 crc kubenswrapper[4962]: E1003 12:51:42.228205 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 12:51:43 crc kubenswrapper[4962]: I1003 12:51:43.226727 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5blzz" Oct 03 12:51:43 crc kubenswrapper[4962]: E1003 12:51:43.226835 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5blzz" podUID="f2989e38-d4e7-42c9-8959-f87168a4ac14" Oct 03 12:51:43 crc kubenswrapper[4962]: I1003 12:51:43.227336 4962 scope.go:117] "RemoveContainer" containerID="57fcf06043f46f40efa58819dcfaaaa37c0426f8b80a9b322df4bdfb3ddbcddd" Oct 03 12:51:43 crc kubenswrapper[4962]: I1003 12:51:43.712323 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ksp7d_90186d9d-0ac4-4959-9fd8-b044098dc6ae/ovnkube-controller/3.log" Oct 03 12:51:43 crc kubenswrapper[4962]: I1003 12:51:43.714974 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" event={"ID":"90186d9d-0ac4-4959-9fd8-b044098dc6ae","Type":"ContainerStarted","Data":"5fb6150ea04e5f5b54cf103491ae188a7ea9c636e9761537cac84914cba53b6a"} Oct 03 12:51:43 crc kubenswrapper[4962]: I1003 12:51:43.715325 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:51:43 crc kubenswrapper[4962]: I1003 12:51:43.743832 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" podStartSLOduration=90.743797109 podStartE2EDuration="1m30.743797109s" podCreationTimestamp="2025-10-03 12:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:43.741462085 +0000 UTC m=+112.145359930" watchObservedRunningTime="2025-10-03 12:51:43.743797109 +0000 UTC m=+112.147694944" Oct 03 12:51:44 crc kubenswrapper[4962]: I1003 12:51:44.070540 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5blzz"] Oct 03 12:51:44 crc kubenswrapper[4962]: I1003 12:51:44.070645 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5blzz" Oct 03 12:51:44 crc kubenswrapper[4962]: E1003 12:51:44.070739 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5blzz" podUID="f2989e38-d4e7-42c9-8959-f87168a4ac14" Oct 03 12:51:44 crc kubenswrapper[4962]: I1003 12:51:44.226861 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:51:44 crc kubenswrapper[4962]: I1003 12:51:44.226863 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:51:44 crc kubenswrapper[4962]: E1003 12:51:44.227040 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 12:51:44 crc kubenswrapper[4962]: I1003 12:51:44.226863 4962 util.go:30] "No sandbox for pod can be found. 
Oct 03 12:51:44 crc kubenswrapper[4962]: E1003 12:51:44.227676 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 03 12:51:44 crc kubenswrapper[4962]: E1003 12:51:44.227113 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.226505 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.226532 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5blzz"
Oct 03 12:51:46 crc kubenswrapper[4962]: E1003 12:51:46.226654 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.226671 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.226608 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 03 12:51:46 crc kubenswrapper[4962]: E1003 12:51:46.226814 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 03 12:51:46 crc kubenswrapper[4962]: E1003 12:51:46.226732 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5blzz" podUID="f2989e38-d4e7-42c9-8959-f87168a4ac14"
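
Note: immediately below, the node finally flips to Ready: a CNI configuration has appeared, the runtime starts reporting NetworkReady=true, the kubelet records NodeReady and fast-updates its status, and the API server starts handing it the waiting operator pods (the burst of "SyncLoop ADD" entries). A sketch of how the Ready condition is derived from network readiness; the condition strings follow this log, while the helper itself is illustrative:

    package main

    import "fmt"

    type condition struct {
        Type, Status, Reason, Message string
    }

    // readyCondition mirrors the two states seen in this log: KubeletNotReady
    // while the runtime network is down, KubeletReady once a CNI config exists.
    func readyCondition(networkReady bool) condition {
        if !networkReady {
            return condition{"Ready", "False", "KubeletNotReady",
                "container runtime network not ready: NetworkReady=false"}
        }
        return condition{"Ready", "True", "KubeletReady",
            "kubelet is posting ready status"}
    }

    func main() {
        fmt.Printf("%+v\n", readyCondition(false)) // the 12:51:28-12:51:44 state
        fmt.Printf("%+v\n", readyCondition(true))  // the state from 12:51:46 on
    }
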
pod="openshift-multus/network-metrics-daemon-5blzz" podUID="f2989e38-d4e7-42c9-8959-f87168a4ac14" Oct 03 12:51:46 crc kubenswrapper[4962]: E1003 12:51:46.226895 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.398983 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.399233 4962 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.433296 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pkncv"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.433662 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pkncv" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.437708 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-5zs6v"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.438367 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-p5zwj"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.438731 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-p5zwj" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.439593 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-5zs6v" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.441700 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.452819 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-s2sdt"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.452929 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.452929 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.468289 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.468412 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.468821 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.468940 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.468881 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-xrdnq"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.468956 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-s2sdt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.469134 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.468835 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.469427 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.469333 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrdnq" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.469790 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.469298 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-rbkn2"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.468829 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.470281 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.470360 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.470435 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.470491 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.470577 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdkb"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.470691 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rbkn2" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.470844 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/7b83dbab-ab28-4769-b812-be82be9db67e-image-import-ca\") pod \"apiserver-76f77b778f-s2sdt\" (UID: \"7b83dbab-ab28-4769-b812-be82be9db67e\") " pod="openshift-apiserver/apiserver-76f77b778f-s2sdt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.470888 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwt7n\" (UniqueName: \"kubernetes.io/projected/bdd54121-ea41-4870-a612-28c7cdc242dd-kube-api-access-zwt7n\") pod \"openshift-apiserver-operator-796bbdcf4f-pkncv\" (UID: \"bdd54121-ea41-4870-a612-28c7cdc242dd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pkncv" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.470891 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.470915 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b83dbab-ab28-4769-b812-be82be9db67e-config\") pod \"apiserver-76f77b778f-s2sdt\" (UID: \"7b83dbab-ab28-4769-b812-be82be9db67e\") " pod="openshift-apiserver/apiserver-76f77b778f-s2sdt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.470980 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/7b83dbab-ab28-4769-b812-be82be9db67e-audit\") pod \"apiserver-76f77b778f-s2sdt\" (UID: \"7b83dbab-ab28-4769-b812-be82be9db67e\") " pod="openshift-apiserver/apiserver-76f77b778f-s2sdt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.471003 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtzcf\" (UniqueName: \"kubernetes.io/projected/9c7cf52e-ea38-43a7-bd33-f546b4d5f57c-kube-api-access-gtzcf\") pod \"controller-manager-879f6c89f-p5zwj\" (UID: \"9c7cf52e-ea38-43a7-bd33-f546b4d5f57c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p5zwj" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.471026 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7b83dbab-ab28-4769-b812-be82be9db67e-node-pullsecrets\") pod \"apiserver-76f77b778f-s2sdt\" (UID: \"7b83dbab-ab28-4769-b812-be82be9db67e\") " pod="openshift-apiserver/apiserver-76f77b778f-s2sdt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.471046 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7b83dbab-ab28-4769-b812-be82be9db67e-audit-dir\") pod \"apiserver-76f77b778f-s2sdt\" (UID: \"7b83dbab-ab28-4769-b812-be82be9db67e\") " pod="openshift-apiserver/apiserver-76f77b778f-s2sdt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.471070 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7b83dbab-ab28-4769-b812-be82be9db67e-etcd-serving-ca\") pod \"apiserver-76f77b778f-s2sdt\" (UID: \"7b83dbab-ab28-4769-b812-be82be9db67e\") " pod="openshift-apiserver/apiserver-76f77b778f-s2sdt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.471092 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c7cf52e-ea38-43a7-bd33-f546b4d5f57c-config\") pod \"controller-manager-879f6c89f-p5zwj\" (UID: \"9c7cf52e-ea38-43a7-bd33-f546b4d5f57c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p5zwj" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.471114 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9c7cf52e-ea38-43a7-bd33-f546b4d5f57c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-p5zwj\" (UID: \"9c7cf52e-ea38-43a7-bd33-f546b4d5f57c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p5zwj" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.471160 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msqnj\" (UniqueName: \"kubernetes.io/projected/7b83dbab-ab28-4769-b812-be82be9db67e-kube-api-access-msqnj\") pod \"apiserver-76f77b778f-s2sdt\" (UID: \"7b83dbab-ab28-4769-b812-be82be9db67e\") " pod="openshift-apiserver/apiserver-76f77b778f-s2sdt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.471186 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdd54121-ea41-4870-a612-28c7cdc242dd-config\") pod \"openshift-apiserver-operator-796bbdcf4f-pkncv\" (UID: 
\"bdd54121-ea41-4870-a612-28c7cdc242dd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pkncv" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.471214 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b83dbab-ab28-4769-b812-be82be9db67e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-s2sdt\" (UID: \"7b83dbab-ab28-4769-b812-be82be9db67e\") " pod="openshift-apiserver/apiserver-76f77b778f-s2sdt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.471237 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/201d1b81-b0bd-4584-9406-77ac0888ae49-config\") pod \"machine-api-operator-5694c8668f-5zs6v\" (UID: \"201d1b81-b0bd-4584-9406-77ac0888ae49\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5zs6v" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.471328 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b83dbab-ab28-4769-b812-be82be9db67e-serving-cert\") pod \"apiserver-76f77b778f-s2sdt\" (UID: \"7b83dbab-ab28-4769-b812-be82be9db67e\") " pod="openshift-apiserver/apiserver-76f77b778f-s2sdt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.471377 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c7cf52e-ea38-43a7-bd33-f546b4d5f57c-serving-cert\") pod \"controller-manager-879f6c89f-p5zwj\" (UID: \"9c7cf52e-ea38-43a7-bd33-f546b4d5f57c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p5zwj" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.471408 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bdd54121-ea41-4870-a612-28c7cdc242dd-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-pkncv\" (UID: \"bdd54121-ea41-4870-a612-28c7cdc242dd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pkncv" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.471563 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7b83dbab-ab28-4769-b812-be82be9db67e-encryption-config\") pod \"apiserver-76f77b778f-s2sdt\" (UID: \"7b83dbab-ab28-4769-b812-be82be9db67e\") " pod="openshift-apiserver/apiserver-76f77b778f-s2sdt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.471607 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/201d1b81-b0bd-4584-9406-77ac0888ae49-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-5zs6v\" (UID: \"201d1b81-b0bd-4584-9406-77ac0888ae49\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5zs6v" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.471654 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c7cf52e-ea38-43a7-bd33-f546b4d5f57c-client-ca\") pod \"controller-manager-879f6c89f-p5zwj\" (UID: \"9c7cf52e-ea38-43a7-bd33-f546b4d5f57c\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-p5zwj" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.471704 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/201d1b81-b0bd-4584-9406-77ac0888ae49-images\") pod \"machine-api-operator-5694c8668f-5zs6v\" (UID: \"201d1b81-b0bd-4584-9406-77ac0888ae49\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5zs6v" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.471748 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7b83dbab-ab28-4769-b812-be82be9db67e-etcd-client\") pod \"apiserver-76f77b778f-s2sdt\" (UID: \"7b83dbab-ab28-4769-b812-be82be9db67e\") " pod="openshift-apiserver/apiserver-76f77b778f-s2sdt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.471776 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s7p4\" (UniqueName: \"kubernetes.io/projected/201d1b81-b0bd-4584-9406-77ac0888ae49-kube-api-access-5s7p4\") pod \"machine-api-operator-5694c8668f-5zs6v\" (UID: \"201d1b81-b0bd-4584-9406-77ac0888ae49\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5zs6v" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.471870 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8brbj"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.472006 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdkb" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.472748 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8brbj" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.473577 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.473791 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.473811 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.473845 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.475681 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-q777j"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.476176 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-wgl5v"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.476507 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-wgl5v" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.476932 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-q777j" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.479768 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-4z5v6"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.480433 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-4z5v6" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.483732 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-p5zwj"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.484655 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-t4sln"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.485167 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-t4sln" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.488851 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.488913 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.488976 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.489054 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.488849 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.489060 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.489197 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.489275 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.489285 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.489347 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.489371 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.489419 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.489474 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.489489 4962 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.489494 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.489547 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.489557 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.489612 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.489667 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.489672 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.489710 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.489787 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.489804 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.489862 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.489907 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.489970 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.490015 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.490059 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.490086 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.490151 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.491228 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gjzdl"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.491599 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gjzdl" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.491832 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.491923 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.491996 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.492008 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.492079 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.492129 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.492169 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.492263 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.492300 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.492455 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.492525 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.492142 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.492676 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.492267 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.492751 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.492650 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.492797 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.492921 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.493229 4962 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.492749 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.494280 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.494376 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-xrdnq"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.495204 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x4nlh"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.495678 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x4nlh" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.507736 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.510241 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r628t"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.515317 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.515997 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.518722 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.519015 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.519244 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.525850 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.525957 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.526888 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-sd8km"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.545121 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-hszb5"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.545463 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-wlnbb"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.545786 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6pkx8"] Oct 03 12:51:46 crc kubenswrapper[4962]: 
I1003 12:51:46.545816 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r628t" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.546088 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.546174 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-hszb5" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.546147 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.546410 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-sd8km" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.546594 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-wlnbb" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.548137 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-5vx4v"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.548816 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5vx4v" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.549095 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-gtf6m"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.549734 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-gtf6m" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.550434 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.550696 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-5zs6v"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.551473 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.552429 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.553616 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.555988 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.556136 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.556270 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.556403 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.556606 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-hcwtn"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.557213 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-hcwtn" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.559723 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pkncv"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.560488 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-s2sdt"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.562158 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8brbj"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.563722 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.564568 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.565507 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.565609 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.566097 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.566841 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.566871 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.567003 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.567058 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.568610 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.569685 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnnwg"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.570193 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnnwg" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.571109 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ndncb"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.571504 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ndncb" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.572561 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.572767 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bdd54121-ea41-4870-a612-28c7cdc242dd-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-pkncv\" (UID: \"bdd54121-ea41-4870-a612-28c7cdc242dd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pkncv" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.572810 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/994f46cf-ed06-420d-a2fd-52547aadd0ce-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-xrdnq\" (UID: \"994f46cf-ed06-420d-a2fd-52547aadd0ce\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrdnq" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.572831 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/994f46cf-ed06-420d-a2fd-52547aadd0ce-audit-dir\") pod \"apiserver-7bbb656c7d-xrdnq\" (UID: \"994f46cf-ed06-420d-a2fd-52547aadd0ce\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrdnq" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.572853 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/693d9472-5b36-4df8-a6fb-d3e3aece0cdb-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-x4nlh\" (UID: \"693d9472-5b36-4df8-a6fb-d3e3aece0cdb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x4nlh" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.572884 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/796d19ea-1d92-4dcb-9e10-305ddbe1b283-oauth-serving-cert\") pod \"console-f9d7485db-wgl5v\" (UID: \"796d19ea-1d92-4dcb-9e10-305ddbe1b283\") " pod="openshift-console/console-f9d7485db-wgl5v" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.572904 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/796d19ea-1d92-4dcb-9e10-305ddbe1b283-trusted-ca-bundle\") pod \"console-f9d7485db-wgl5v\" (UID: \"796d19ea-1d92-4dcb-9e10-305ddbe1b283\") " pod="openshift-console/console-f9d7485db-wgl5v" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.572926 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7b83dbab-ab28-4769-b812-be82be9db67e-encryption-config\") pod \"apiserver-76f77b778f-s2sdt\" (UID: \"7b83dbab-ab28-4769-b812-be82be9db67e\") " pod="openshift-apiserver/apiserver-76f77b778f-s2sdt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.572949 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/796d19ea-1d92-4dcb-9e10-305ddbe1b283-service-ca\") pod 
\"console-f9d7485db-wgl5v\" (UID: \"796d19ea-1d92-4dcb-9e10-305ddbe1b283\") " pod="openshift-console/console-f9d7485db-wgl5v" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.572971 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhtj4\" (UniqueName: \"kubernetes.io/projected/d6b25ec8-0fa2-4d8e-81e3-51e15eee578a-kube-api-access-qhtj4\") pod \"openshift-config-operator-7777fb866f-t4sln\" (UID: \"d6b25ec8-0fa2-4d8e-81e3-51e15eee578a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-t4sln" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.572997 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/201d1b81-b0bd-4584-9406-77ac0888ae49-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-5zs6v\" (UID: \"201d1b81-b0bd-4584-9406-77ac0888ae49\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5zs6v" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.573018 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c7cf52e-ea38-43a7-bd33-f546b4d5f57c-client-ca\") pod \"controller-manager-879f6c89f-p5zwj\" (UID: \"9c7cf52e-ea38-43a7-bd33-f546b4d5f57c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p5zwj" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.573039 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/994f46cf-ed06-420d-a2fd-52547aadd0ce-serving-cert\") pod \"apiserver-7bbb656c7d-xrdnq\" (UID: \"994f46cf-ed06-420d-a2fd-52547aadd0ce\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrdnq" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.573063 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df5a2d1b-1d06-4201-8546-1a6f67a2511e-config\") pod \"etcd-operator-b45778765-sd8km\" (UID: \"df5a2d1b-1d06-4201-8546-1a6f67a2511e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd8km" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.573086 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/201d1b81-b0bd-4584-9406-77ac0888ae49-images\") pod \"machine-api-operator-5694c8668f-5zs6v\" (UID: \"201d1b81-b0bd-4584-9406-77ac0888ae49\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5zs6v" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.573105 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/796d19ea-1d92-4dcb-9e10-305ddbe1b283-console-oauth-config\") pod \"console-f9d7485db-wgl5v\" (UID: \"796d19ea-1d92-4dcb-9e10-305ddbe1b283\") " pod="openshift-console/console-f9d7485db-wgl5v" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.573143 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/796d19ea-1d92-4dcb-9e10-305ddbe1b283-console-config\") pod \"console-f9d7485db-wgl5v\" (UID: \"796d19ea-1d92-4dcb-9e10-305ddbe1b283\") " pod="openshift-console/console-f9d7485db-wgl5v" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.573167 4962 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/994f46cf-ed06-420d-a2fd-52547aadd0ce-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-xrdnq\" (UID: \"994f46cf-ed06-420d-a2fd-52547aadd0ce\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrdnq" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.573189 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7b83dbab-ab28-4769-b812-be82be9db67e-etcd-client\") pod \"apiserver-76f77b778f-s2sdt\" (UID: \"7b83dbab-ab28-4769-b812-be82be9db67e\") " pod="openshift-apiserver/apiserver-76f77b778f-s2sdt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.573209 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/796d19ea-1d92-4dcb-9e10-305ddbe1b283-console-serving-cert\") pod \"console-f9d7485db-wgl5v\" (UID: \"796d19ea-1d92-4dcb-9e10-305ddbe1b283\") " pod="openshift-console/console-f9d7485db-wgl5v" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.573231 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/003637f9-ebbe-4587-bdf0-071bfca642dd-metrics-tls\") pod \"dns-operator-744455d44c-hcwtn\" (UID: \"003637f9-ebbe-4587-bdf0-071bfca642dd\") " pod="openshift-dns-operator/dns-operator-744455d44c-hcwtn" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.573251 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df5a2d1b-1d06-4201-8546-1a6f67a2511e-serving-cert\") pod \"etcd-operator-b45778765-sd8km\" (UID: \"df5a2d1b-1d06-4201-8546-1a6f67a2511e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd8km" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.573274 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s7p4\" (UniqueName: \"kubernetes.io/projected/201d1b81-b0bd-4584-9406-77ac0888ae49-kube-api-access-5s7p4\") pod \"machine-api-operator-5694c8668f-5zs6v\" (UID: \"201d1b81-b0bd-4584-9406-77ac0888ae49\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5zs6v" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.573293 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/df5a2d1b-1d06-4201-8546-1a6f67a2511e-etcd-service-ca\") pod \"etcd-operator-b45778765-sd8km\" (UID: \"df5a2d1b-1d06-4201-8546-1a6f67a2511e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd8km" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.573318 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwt7n\" (UniqueName: \"kubernetes.io/projected/bdd54121-ea41-4870-a612-28c7cdc242dd-kube-api-access-zwt7n\") pod \"openshift-apiserver-operator-796bbdcf4f-pkncv\" (UID: \"bdd54121-ea41-4870-a612-28c7cdc242dd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pkncv" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.573339 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b83dbab-ab28-4769-b812-be82be9db67e-config\") pod 
\"apiserver-76f77b778f-s2sdt\" (UID: \"7b83dbab-ab28-4769-b812-be82be9db67e\") " pod="openshift-apiserver/apiserver-76f77b778f-s2sdt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.573363 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/7b83dbab-ab28-4769-b812-be82be9db67e-image-import-ca\") pod \"apiserver-76f77b778f-s2sdt\" (UID: \"7b83dbab-ab28-4769-b812-be82be9db67e\") " pod="openshift-apiserver/apiserver-76f77b778f-s2sdt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.573387 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bf3d2489-9ec1-4479-9f8c-a519812581f8-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ndncb\" (UID: \"bf3d2489-9ec1-4479-9f8c-a519812581f8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ndncb" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.573409 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/7b83dbab-ab28-4769-b812-be82be9db67e-audit\") pod \"apiserver-76f77b778f-s2sdt\" (UID: \"7b83dbab-ab28-4769-b812-be82be9db67e\") " pod="openshift-apiserver/apiserver-76f77b778f-s2sdt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.573431 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb48dfa2-82c3-4c23-b66d-77ac0166326f-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-qnnwg\" (UID: \"bb48dfa2-82c3-4c23-b66d-77ac0166326f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnnwg" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.573456 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtzcf\" (UniqueName: \"kubernetes.io/projected/9c7cf52e-ea38-43a7-bd33-f546b4d5f57c-kube-api-access-gtzcf\") pod \"controller-manager-879f6c89f-p5zwj\" (UID: \"9c7cf52e-ea38-43a7-bd33-f546b4d5f57c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p5zwj" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.573477 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mskz\" (UniqueName: \"kubernetes.io/projected/796d19ea-1d92-4dcb-9e10-305ddbe1b283-kube-api-access-7mskz\") pod \"console-f9d7485db-wgl5v\" (UID: \"796d19ea-1d92-4dcb-9e10-305ddbe1b283\") " pod="openshift-console/console-f9d7485db-wgl5v" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.573503 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttdz7\" (UniqueName: \"kubernetes.io/projected/003637f9-ebbe-4587-bdf0-071bfca642dd-kube-api-access-ttdz7\") pod \"dns-operator-744455d44c-hcwtn\" (UID: \"003637f9-ebbe-4587-bdf0-071bfca642dd\") " pod="openshift-dns-operator/dns-operator-744455d44c-hcwtn" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.573524 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb48dfa2-82c3-4c23-b66d-77ac0166326f-config\") pod \"kube-controller-manager-operator-78b949d7b-qnnwg\" (UID: \"bb48dfa2-82c3-4c23-b66d-77ac0166326f\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnnwg" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.573543 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf3d2489-9ec1-4479-9f8c-a519812581f8-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ndncb\" (UID: \"bf3d2489-9ec1-4479-9f8c-a519812581f8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ndncb" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.573568 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/994f46cf-ed06-420d-a2fd-52547aadd0ce-encryption-config\") pod \"apiserver-7bbb656c7d-xrdnq\" (UID: \"994f46cf-ed06-420d-a2fd-52547aadd0ce\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrdnq" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.573590 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf3d2489-9ec1-4479-9f8c-a519812581f8-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ndncb\" (UID: \"bf3d2489-9ec1-4479-9f8c-a519812581f8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ndncb" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.573614 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7b83dbab-ab28-4769-b812-be82be9db67e-node-pullsecrets\") pod \"apiserver-76f77b778f-s2sdt\" (UID: \"7b83dbab-ab28-4769-b812-be82be9db67e\") " pod="openshift-apiserver/apiserver-76f77b778f-s2sdt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.573655 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7b83dbab-ab28-4769-b812-be82be9db67e-audit-dir\") pod \"apiserver-76f77b778f-s2sdt\" (UID: \"7b83dbab-ab28-4769-b812-be82be9db67e\") " pod="openshift-apiserver/apiserver-76f77b778f-s2sdt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.573676 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6b25ec8-0fa2-4d8e-81e3-51e15eee578a-serving-cert\") pod \"openshift-config-operator-7777fb866f-t4sln\" (UID: \"d6b25ec8-0fa2-4d8e-81e3-51e15eee578a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-t4sln" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.573698 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmqgg\" (UniqueName: \"kubernetes.io/projected/df5a2d1b-1d06-4201-8546-1a6f67a2511e-kube-api-access-kmqgg\") pod \"etcd-operator-b45778765-sd8km\" (UID: \"df5a2d1b-1d06-4201-8546-1a6f67a2511e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd8km" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.573725 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g5qg\" (UniqueName: \"kubernetes.io/projected/693d9472-5b36-4df8-a6fb-d3e3aece0cdb-kube-api-access-2g5qg\") pod \"openshift-controller-manager-operator-756b6f6bc6-x4nlh\" (UID: \"693d9472-5b36-4df8-a6fb-d3e3aece0cdb\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x4nlh" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.573757 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7b83dbab-ab28-4769-b812-be82be9db67e-etcd-serving-ca\") pod \"apiserver-76f77b778f-s2sdt\" (UID: \"7b83dbab-ab28-4769-b812-be82be9db67e\") " pod="openshift-apiserver/apiserver-76f77b778f-s2sdt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.573785 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/d6b25ec8-0fa2-4d8e-81e3-51e15eee578a-available-featuregates\") pod \"openshift-config-operator-7777fb866f-t4sln\" (UID: \"d6b25ec8-0fa2-4d8e-81e3-51e15eee578a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-t4sln" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.573816 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/df5a2d1b-1d06-4201-8546-1a6f67a2511e-etcd-client\") pod \"etcd-operator-b45778765-sd8km\" (UID: \"df5a2d1b-1d06-4201-8546-1a6f67a2511e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd8km" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.573863 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c7cf52e-ea38-43a7-bd33-f546b4d5f57c-config\") pod \"controller-manager-879f6c89f-p5zwj\" (UID: \"9c7cf52e-ea38-43a7-bd33-f546b4d5f57c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p5zwj" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.573895 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9c7cf52e-ea38-43a7-bd33-f546b4d5f57c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-p5zwj\" (UID: \"9c7cf52e-ea38-43a7-bd33-f546b4d5f57c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p5zwj" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.573920 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2bc7\" (UniqueName: \"kubernetes.io/projected/a5971c52-f20d-40a2-9e80-e1c02e83cec0-kube-api-access-h2bc7\") pod \"downloads-7954f5f757-gtf6m\" (UID: \"a5971c52-f20d-40a2-9e80-e1c02e83cec0\") " pod="openshift-console/downloads-7954f5f757-gtf6m" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.573964 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/df5a2d1b-1d06-4201-8546-1a6f67a2511e-etcd-ca\") pod \"etcd-operator-b45778765-sd8km\" (UID: \"df5a2d1b-1d06-4201-8546-1a6f67a2511e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd8km" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.573998 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bb48dfa2-82c3-4c23-b66d-77ac0166326f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-qnnwg\" (UID: \"bb48dfa2-82c3-4c23-b66d-77ac0166326f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnnwg" Oct 03 12:51:46 crc 
kubenswrapper[4962]: I1003 12:51:46.574029 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psfsr\" (UniqueName: \"kubernetes.io/projected/994f46cf-ed06-420d-a2fd-52547aadd0ce-kube-api-access-psfsr\") pod \"apiserver-7bbb656c7d-xrdnq\" (UID: \"994f46cf-ed06-420d-a2fd-52547aadd0ce\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrdnq" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.574061 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msqnj\" (UniqueName: \"kubernetes.io/projected/7b83dbab-ab28-4769-b812-be82be9db67e-kube-api-access-msqnj\") pod \"apiserver-76f77b778f-s2sdt\" (UID: \"7b83dbab-ab28-4769-b812-be82be9db67e\") " pod="openshift-apiserver/apiserver-76f77b778f-s2sdt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.574087 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdd54121-ea41-4870-a612-28c7cdc242dd-config\") pod \"openshift-apiserver-operator-796bbdcf4f-pkncv\" (UID: \"bdd54121-ea41-4870-a612-28c7cdc242dd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pkncv" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.574125 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/994f46cf-ed06-420d-a2fd-52547aadd0ce-audit-policies\") pod \"apiserver-7bbb656c7d-xrdnq\" (UID: \"994f46cf-ed06-420d-a2fd-52547aadd0ce\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrdnq" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.574152 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/693d9472-5b36-4df8-a6fb-d3e3aece0cdb-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-x4nlh\" (UID: \"693d9472-5b36-4df8-a6fb-d3e3aece0cdb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x4nlh" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.574187 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b83dbab-ab28-4769-b812-be82be9db67e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-s2sdt\" (UID: \"7b83dbab-ab28-4769-b812-be82be9db67e\") " pod="openshift-apiserver/apiserver-76f77b778f-s2sdt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.574218 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/201d1b81-b0bd-4584-9406-77ac0888ae49-config\") pod \"machine-api-operator-5694c8668f-5zs6v\" (UID: \"201d1b81-b0bd-4584-9406-77ac0888ae49\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5zs6v" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.574240 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b83dbab-ab28-4769-b812-be82be9db67e-serving-cert\") pod \"apiserver-76f77b778f-s2sdt\" (UID: \"7b83dbab-ab28-4769-b812-be82be9db67e\") " pod="openshift-apiserver/apiserver-76f77b778f-s2sdt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.574278 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9c7cf52e-ea38-43a7-bd33-f546b4d5f57c-serving-cert\") pod \"controller-manager-879f6c89f-p5zwj\" (UID: \"9c7cf52e-ea38-43a7-bd33-f546b4d5f57c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p5zwj" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.574308 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/994f46cf-ed06-420d-a2fd-52547aadd0ce-etcd-client\") pod \"apiserver-7bbb656c7d-xrdnq\" (UID: \"994f46cf-ed06-420d-a2fd-52547aadd0ce\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrdnq" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.575285 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c7cf52e-ea38-43a7-bd33-f546b4d5f57c-client-ca\") pod \"controller-manager-879f6c89f-p5zwj\" (UID: \"9c7cf52e-ea38-43a7-bd33-f546b4d5f57c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p5zwj" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.575349 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-4z5v6"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.575377 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r628t"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.575439 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7b83dbab-ab28-4769-b812-be82be9db67e-audit-dir\") pod \"apiserver-76f77b778f-s2sdt\" (UID: \"7b83dbab-ab28-4769-b812-be82be9db67e\") " pod="openshift-apiserver/apiserver-76f77b778f-s2sdt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.575525 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.576359 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b83dbab-ab28-4769-b812-be82be9db67e-config\") pod \"apiserver-76f77b778f-s2sdt\" (UID: \"7b83dbab-ab28-4769-b812-be82be9db67e\") " pod="openshift-apiserver/apiserver-76f77b778f-s2sdt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.576569 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-q777j"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.577251 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/7b83dbab-ab28-4769-b812-be82be9db67e-image-import-ca\") pod \"apiserver-76f77b778f-s2sdt\" (UID: \"7b83dbab-ab28-4769-b812-be82be9db67e\") " pod="openshift-apiserver/apiserver-76f77b778f-s2sdt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.577527 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/7b83dbab-ab28-4769-b812-be82be9db67e-audit\") pod \"apiserver-76f77b778f-s2sdt\" (UID: \"7b83dbab-ab28-4769-b812-be82be9db67e\") " pod="openshift-apiserver/apiserver-76f77b778f-s2sdt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.577811 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/7b83dbab-ab28-4769-b812-be82be9db67e-node-pullsecrets\") pod \"apiserver-76f77b778f-s2sdt\" (UID: \"7b83dbab-ab28-4769-b812-be82be9db67e\") " pod="openshift-apiserver/apiserver-76f77b778f-s2sdt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.578298 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ct99q"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.589101 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ct99q" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.584044 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdd54121-ea41-4870-a612-28c7cdc242dd-config\") pod \"openshift-apiserver-operator-796bbdcf4f-pkncv\" (UID: \"bdd54121-ea41-4870-a612-28c7cdc242dd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pkncv" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.587717 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/201d1b81-b0bd-4584-9406-77ac0888ae49-images\") pod \"machine-api-operator-5694c8668f-5zs6v\" (UID: \"201d1b81-b0bd-4584-9406-77ac0888ae49\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5zs6v" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.590135 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c7cf52e-ea38-43a7-bd33-f546b4d5f57c-config\") pod \"controller-manager-879f6c89f-p5zwj\" (UID: \"9c7cf52e-ea38-43a7-bd33-f546b4d5f57c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p5zwj" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.590650 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7b83dbab-ab28-4769-b812-be82be9db67e-encryption-config\") pod \"apiserver-76f77b778f-s2sdt\" (UID: \"7b83dbab-ab28-4769-b812-be82be9db67e\") " pod="openshift-apiserver/apiserver-76f77b778f-s2sdt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.590731 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/201d1b81-b0bd-4584-9406-77ac0888ae49-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-5zs6v\" (UID: \"201d1b81-b0bd-4584-9406-77ac0888ae49\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5zs6v" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.590875 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bdd54121-ea41-4870-a612-28c7cdc242dd-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-pkncv\" (UID: \"bdd54121-ea41-4870-a612-28c7cdc242dd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pkncv" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.590962 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7b83dbab-ab28-4769-b812-be82be9db67e-etcd-client\") pod \"apiserver-76f77b778f-s2sdt\" (UID: \"7b83dbab-ab28-4769-b812-be82be9db67e\") " pod="openshift-apiserver/apiserver-76f77b778f-s2sdt" Oct 03 12:51:46 crc 
kubenswrapper[4962]: I1003 12:51:46.591770 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9c7cf52e-ea38-43a7-bd33-f546b4d5f57c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-p5zwj\" (UID: \"9c7cf52e-ea38-43a7-bd33-f546b4d5f57c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p5zwj" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.592080 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b83dbab-ab28-4769-b812-be82be9db67e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-s2sdt\" (UID: \"7b83dbab-ab28-4769-b812-be82be9db67e\") " pod="openshift-apiserver/apiserver-76f77b778f-s2sdt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.592758 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.593524 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/201d1b81-b0bd-4584-9406-77ac0888ae49-config\") pod \"machine-api-operator-5694c8668f-5zs6v\" (UID: \"201d1b81-b0bd-4584-9406-77ac0888ae49\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5zs6v" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.593802 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-jfpvr"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.595216 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c7cf52e-ea38-43a7-bd33-f546b4d5f57c-serving-cert\") pod \"controller-manager-879f6c89f-p5zwj\" (UID: \"9c7cf52e-ea38-43a7-bd33-f546b4d5f57c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p5zwj" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.595872 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pbd2f"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.596326 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-jfpvr" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.597209 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pbd2f" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.597497 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.598146 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7b83dbab-ab28-4769-b812-be82be9db67e-etcd-serving-ca\") pod \"apiserver-76f77b778f-s2sdt\" (UID: \"7b83dbab-ab28-4769-b812-be82be9db67e\") " pod="openshift-apiserver/apiserver-76f77b778f-s2sdt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.611955 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.612023 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.612554 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b83dbab-ab28-4769-b812-be82be9db67e-serving-cert\") pod \"apiserver-76f77b778f-s2sdt\" (UID: \"7b83dbab-ab28-4769-b812-be82be9db67e\") " pod="openshift-apiserver/apiserver-76f77b778f-s2sdt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.614936 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324925-rjxv2"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.615033 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.616440 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-826w7"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.616963 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-826w7" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.617322 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324925-rjxv2" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.617616 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hdljv"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.621574 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hdljv" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.622207 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wdw2z"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.622941 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wdw2z" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.628869 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bfcs7"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.630014 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bfcs7" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.630244 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-llhkr"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.631822 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-llhkr" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.633030 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5csg7"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.637279 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-wbchr"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.637484 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5csg7" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.639353 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-vnd7f"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.640324 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-r2b97"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.642234 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-xxhzg"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.640599 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-vnd7f" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.641893 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.639547 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wbchr" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.642367 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-r2b97" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.644312 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-gtf6m"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.644548 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-hszb5"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.644655 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdkb"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.644522 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xxhzg" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.645131 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gjzdl"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.646235 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-wgl5v"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.648335 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnnwg"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.652704 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x4nlh"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.655908 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-hcwtn"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.656217 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.657172 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ndncb"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.658260 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-b2vsf"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.658954 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-b2vsf" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.659555 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-sd8km"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.660483 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-t4sln"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.661382 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324925-rjxv2"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.662470 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ct99q"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.663840 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6pkx8"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.666011 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pbd2f"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.667239 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-wbchr"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.669677 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-vnd7f"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.671197 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress-canary/ingress-canary-r2b97"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.673507 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-llhkr"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.674860 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-5vx4v"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.674981 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df5a2d1b-1d06-4201-8546-1a6f67a2511e-serving-cert\") pod \"etcd-operator-b45778765-sd8km\" (UID: \"df5a2d1b-1d06-4201-8546-1a6f67a2511e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd8km" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.675009 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/796d19ea-1d92-4dcb-9e10-305ddbe1b283-console-serving-cert\") pod \"console-f9d7485db-wgl5v\" (UID: \"796d19ea-1d92-4dcb-9e10-305ddbe1b283\") " pod="openshift-console/console-f9d7485db-wgl5v" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.675028 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/003637f9-ebbe-4587-bdf0-071bfca642dd-metrics-tls\") pod \"dns-operator-744455d44c-hcwtn\" (UID: \"003637f9-ebbe-4587-bdf0-071bfca642dd\") " pod="openshift-dns-operator/dns-operator-744455d44c-hcwtn" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.675049 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/df5a2d1b-1d06-4201-8546-1a6f67a2511e-etcd-service-ca\") pod \"etcd-operator-b45778765-sd8km\" (UID: \"df5a2d1b-1d06-4201-8546-1a6f67a2511e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd8km" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.675064 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bf3d2489-9ec1-4479-9f8c-a519812581f8-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ndncb\" (UID: \"bf3d2489-9ec1-4479-9f8c-a519812581f8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ndncb" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.675087 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb48dfa2-82c3-4c23-b66d-77ac0166326f-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-qnnwg\" (UID: \"bb48dfa2-82c3-4c23-b66d-77ac0166326f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnnwg" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.675101 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb48dfa2-82c3-4c23-b66d-77ac0166326f-config\") pod \"kube-controller-manager-operator-78b949d7b-qnnwg\" (UID: \"bb48dfa2-82c3-4c23-b66d-77ac0166326f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnnwg" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.675115 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/bf3d2489-9ec1-4479-9f8c-a519812581f8-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ndncb\" (UID: \"bf3d2489-9ec1-4479-9f8c-a519812581f8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ndncb" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.675137 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mskz\" (UniqueName: \"kubernetes.io/projected/796d19ea-1d92-4dcb-9e10-305ddbe1b283-kube-api-access-7mskz\") pod \"console-f9d7485db-wgl5v\" (UID: \"796d19ea-1d92-4dcb-9e10-305ddbe1b283\") " pod="openshift-console/console-f9d7485db-wgl5v" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.675153 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttdz7\" (UniqueName: \"kubernetes.io/projected/003637f9-ebbe-4587-bdf0-071bfca642dd-kube-api-access-ttdz7\") pod \"dns-operator-744455d44c-hcwtn\" (UID: \"003637f9-ebbe-4587-bdf0-071bfca642dd\") " pod="openshift-dns-operator/dns-operator-744455d44c-hcwtn" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.675170 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/994f46cf-ed06-420d-a2fd-52547aadd0ce-encryption-config\") pod \"apiserver-7bbb656c7d-xrdnq\" (UID: \"994f46cf-ed06-420d-a2fd-52547aadd0ce\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrdnq" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.675184 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf3d2489-9ec1-4479-9f8c-a519812581f8-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ndncb\" (UID: \"bf3d2489-9ec1-4479-9f8c-a519812581f8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ndncb" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.675201 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmqgg\" (UniqueName: \"kubernetes.io/projected/df5a2d1b-1d06-4201-8546-1a6f67a2511e-kube-api-access-kmqgg\") pod \"etcd-operator-b45778765-sd8km\" (UID: \"df5a2d1b-1d06-4201-8546-1a6f67a2511e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd8km" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.675217 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6b25ec8-0fa2-4d8e-81e3-51e15eee578a-serving-cert\") pod \"openshift-config-operator-7777fb866f-t4sln\" (UID: \"d6b25ec8-0fa2-4d8e-81e3-51e15eee578a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-t4sln" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.675233 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2g5qg\" (UniqueName: \"kubernetes.io/projected/693d9472-5b36-4df8-a6fb-d3e3aece0cdb-kube-api-access-2g5qg\") pod \"openshift-controller-manager-operator-756b6f6bc6-x4nlh\" (UID: \"693d9472-5b36-4df8-a6fb-d3e3aece0cdb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x4nlh" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.675250 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/d6b25ec8-0fa2-4d8e-81e3-51e15eee578a-available-featuregates\") pod \"openshift-config-operator-7777fb866f-t4sln\" (UID: \"d6b25ec8-0fa2-4d8e-81e3-51e15eee578a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-t4sln" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.675264 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/df5a2d1b-1d06-4201-8546-1a6f67a2511e-etcd-client\") pod \"etcd-operator-b45778765-sd8km\" (UID: \"df5a2d1b-1d06-4201-8546-1a6f67a2511e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd8km" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.675281 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2bc7\" (UniqueName: \"kubernetes.io/projected/a5971c52-f20d-40a2-9e80-e1c02e83cec0-kube-api-access-h2bc7\") pod \"downloads-7954f5f757-gtf6m\" (UID: \"a5971c52-f20d-40a2-9e80-e1c02e83cec0\") " pod="openshift-console/downloads-7954f5f757-gtf6m" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.675305 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/df5a2d1b-1d06-4201-8546-1a6f67a2511e-etcd-ca\") pod \"etcd-operator-b45778765-sd8km\" (UID: \"df5a2d1b-1d06-4201-8546-1a6f67a2511e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd8km" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.675327 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bb48dfa2-82c3-4c23-b66d-77ac0166326f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-qnnwg\" (UID: \"bb48dfa2-82c3-4c23-b66d-77ac0166326f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnnwg" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.675344 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psfsr\" (UniqueName: \"kubernetes.io/projected/994f46cf-ed06-420d-a2fd-52547aadd0ce-kube-api-access-psfsr\") pod \"apiserver-7bbb656c7d-xrdnq\" (UID: \"994f46cf-ed06-420d-a2fd-52547aadd0ce\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrdnq" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.675365 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/994f46cf-ed06-420d-a2fd-52547aadd0ce-audit-policies\") pod \"apiserver-7bbb656c7d-xrdnq\" (UID: \"994f46cf-ed06-420d-a2fd-52547aadd0ce\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrdnq" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.675383 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/693d9472-5b36-4df8-a6fb-d3e3aece0cdb-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-x4nlh\" (UID: \"693d9472-5b36-4df8-a6fb-d3e3aece0cdb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x4nlh" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.675402 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/994f46cf-ed06-420d-a2fd-52547aadd0ce-etcd-client\") pod \"apiserver-7bbb656c7d-xrdnq\" (UID: \"994f46cf-ed06-420d-a2fd-52547aadd0ce\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrdnq" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.675419 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/994f46cf-ed06-420d-a2fd-52547aadd0ce-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-xrdnq\" (UID: \"994f46cf-ed06-420d-a2fd-52547aadd0ce\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrdnq" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.675436 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/994f46cf-ed06-420d-a2fd-52547aadd0ce-audit-dir\") pod \"apiserver-7bbb656c7d-xrdnq\" (UID: \"994f46cf-ed06-420d-a2fd-52547aadd0ce\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrdnq" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.675481 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/693d9472-5b36-4df8-a6fb-d3e3aece0cdb-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-x4nlh\" (UID: \"693d9472-5b36-4df8-a6fb-d3e3aece0cdb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x4nlh" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.675505 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/796d19ea-1d92-4dcb-9e10-305ddbe1b283-oauth-serving-cert\") pod \"console-f9d7485db-wgl5v\" (UID: \"796d19ea-1d92-4dcb-9e10-305ddbe1b283\") " pod="openshift-console/console-f9d7485db-wgl5v" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.675536 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/796d19ea-1d92-4dcb-9e10-305ddbe1b283-trusted-ca-bundle\") pod \"console-f9d7485db-wgl5v\" (UID: \"796d19ea-1d92-4dcb-9e10-305ddbe1b283\") " pod="openshift-console/console-f9d7485db-wgl5v" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.675553 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhtj4\" (UniqueName: \"kubernetes.io/projected/d6b25ec8-0fa2-4d8e-81e3-51e15eee578a-kube-api-access-qhtj4\") pod \"openshift-config-operator-7777fb866f-t4sln\" (UID: \"d6b25ec8-0fa2-4d8e-81e3-51e15eee578a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-t4sln" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.675569 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/796d19ea-1d92-4dcb-9e10-305ddbe1b283-service-ca\") pod \"console-f9d7485db-wgl5v\" (UID: \"796d19ea-1d92-4dcb-9e10-305ddbe1b283\") " pod="openshift-console/console-f9d7485db-wgl5v" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.675585 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df5a2d1b-1d06-4201-8546-1a6f67a2511e-config\") pod \"etcd-operator-b45778765-sd8km\" (UID: \"df5a2d1b-1d06-4201-8546-1a6f67a2511e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd8km" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.675602 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/994f46cf-ed06-420d-a2fd-52547aadd0ce-serving-cert\") pod \"apiserver-7bbb656c7d-xrdnq\" (UID: \"994f46cf-ed06-420d-a2fd-52547aadd0ce\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrdnq" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.675617 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/796d19ea-1d92-4dcb-9e10-305ddbe1b283-console-oauth-config\") pod \"console-f9d7485db-wgl5v\" (UID: \"796d19ea-1d92-4dcb-9e10-305ddbe1b283\") " pod="openshift-console/console-f9d7485db-wgl5v" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.675658 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/796d19ea-1d92-4dcb-9e10-305ddbe1b283-console-config\") pod \"console-f9d7485db-wgl5v\" (UID: \"796d19ea-1d92-4dcb-9e10-305ddbe1b283\") " pod="openshift-console/console-f9d7485db-wgl5v" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.675678 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/994f46cf-ed06-420d-a2fd-52547aadd0ce-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-xrdnq\" (UID: \"994f46cf-ed06-420d-a2fd-52547aadd0ce\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrdnq" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.675761 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.675926 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-b2vsf"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.676993 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/693d9472-5b36-4df8-a6fb-d3e3aece0cdb-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-x4nlh\" (UID: \"693d9472-5b36-4df8-a6fb-d3e3aece0cdb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x4nlh" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.677100 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df5a2d1b-1d06-4201-8546-1a6f67a2511e-config\") pod \"etcd-operator-b45778765-sd8km\" (UID: \"df5a2d1b-1d06-4201-8546-1a6f67a2511e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd8km" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.677118 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/796d19ea-1d92-4dcb-9e10-305ddbe1b283-service-ca\") pod \"console-f9d7485db-wgl5v\" (UID: \"796d19ea-1d92-4dcb-9e10-305ddbe1b283\") " pod="openshift-console/console-f9d7485db-wgl5v" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.677386 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-5hcj2"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.677672 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/796d19ea-1d92-4dcb-9e10-305ddbe1b283-trusted-ca-bundle\") pod \"console-f9d7485db-wgl5v\" (UID: \"796d19ea-1d92-4dcb-9e10-305ddbe1b283\") " pod="openshift-console/console-f9d7485db-wgl5v" Oct 03 12:51:46 crc 
kubenswrapper[4962]: I1003 12:51:46.677888 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/994f46cf-ed06-420d-a2fd-52547aadd0ce-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-xrdnq\" (UID: \"994f46cf-ed06-420d-a2fd-52547aadd0ce\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrdnq" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.677972 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/796d19ea-1d92-4dcb-9e10-305ddbe1b283-oauth-serving-cert\") pod \"console-f9d7485db-wgl5v\" (UID: \"796d19ea-1d92-4dcb-9e10-305ddbe1b283\") " pod="openshift-console/console-f9d7485db-wgl5v" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.678047 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/994f46cf-ed06-420d-a2fd-52547aadd0ce-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-xrdnq\" (UID: \"994f46cf-ed06-420d-a2fd-52547aadd0ce\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrdnq" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.678068 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/994f46cf-ed06-420d-a2fd-52547aadd0ce-audit-policies\") pod \"apiserver-7bbb656c7d-xrdnq\" (UID: \"994f46cf-ed06-420d-a2fd-52547aadd0ce\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrdnq" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.678333 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/994f46cf-ed06-420d-a2fd-52547aadd0ce-audit-dir\") pod \"apiserver-7bbb656c7d-xrdnq\" (UID: \"994f46cf-ed06-420d-a2fd-52547aadd0ce\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrdnq" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.679094 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/796d19ea-1d92-4dcb-9e10-305ddbe1b283-console-config\") pod \"console-f9d7485db-wgl5v\" (UID: \"796d19ea-1d92-4dcb-9e10-305ddbe1b283\") " pod="openshift-console/console-f9d7485db-wgl5v" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.679362 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-5hcj2" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.679808 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/796d19ea-1d92-4dcb-9e10-305ddbe1b283-console-oauth-config\") pod \"console-f9d7485db-wgl5v\" (UID: \"796d19ea-1d92-4dcb-9e10-305ddbe1b283\") " pod="openshift-console/console-f9d7485db-wgl5v" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.680207 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/994f46cf-ed06-420d-a2fd-52547aadd0ce-serving-cert\") pod \"apiserver-7bbb656c7d-xrdnq\" (UID: \"994f46cf-ed06-420d-a2fd-52547aadd0ce\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrdnq" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.680465 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/d6b25ec8-0fa2-4d8e-81e3-51e15eee578a-available-featuregates\") pod \"openshift-config-operator-7777fb866f-t4sln\" (UID: \"d6b25ec8-0fa2-4d8e-81e3-51e15eee578a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-t4sln" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.682033 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/994f46cf-ed06-420d-a2fd-52547aadd0ce-etcd-client\") pod \"apiserver-7bbb656c7d-xrdnq\" (UID: \"994f46cf-ed06-420d-a2fd-52547aadd0ce\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrdnq" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.682094 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-v8m25"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.683281 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-v8m25" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.684678 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-826w7"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.686278 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hdljv"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.687025 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/994f46cf-ed06-420d-a2fd-52547aadd0ce-encryption-config\") pod \"apiserver-7bbb656c7d-xrdnq\" (UID: \"994f46cf-ed06-420d-a2fd-52547aadd0ce\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrdnq" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.688330 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wdw2z"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.688394 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/796d19ea-1d92-4dcb-9e10-305ddbe1b283-console-serving-cert\") pod \"console-f9d7485db-wgl5v\" (UID: \"796d19ea-1d92-4dcb-9e10-305ddbe1b283\") " pod="openshift-console/console-f9d7485db-wgl5v" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.688456 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6b25ec8-0fa2-4d8e-81e3-51e15eee578a-serving-cert\") pod \"openshift-config-operator-7777fb866f-t4sln\" (UID: \"d6b25ec8-0fa2-4d8e-81e3-51e15eee578a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-t4sln" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.688903 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/693d9472-5b36-4df8-a6fb-d3e3aece0cdb-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-x4nlh\" (UID: \"693d9472-5b36-4df8-a6fb-d3e3aece0cdb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x4nlh" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.690152 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-xxhzg"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.690981 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-v8m25"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.692520 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-jfpvr"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.694498 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bfcs7"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.695655 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5csg7"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.696745 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-5hcj2"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.697313 4962 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.697822 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-w5ndz"] Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.698358 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-w5ndz" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.715481 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.736973 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.757079 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.776723 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.788063 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df5a2d1b-1d06-4201-8546-1a6f67a2511e-serving-cert\") pod \"etcd-operator-b45778765-sd8km\" (UID: \"df5a2d1b-1d06-4201-8546-1a6f67a2511e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd8km" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.796467 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.801739 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/df5a2d1b-1d06-4201-8546-1a6f67a2511e-etcd-client\") pod \"etcd-operator-b45778765-sd8km\" (UID: \"df5a2d1b-1d06-4201-8546-1a6f67a2511e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd8km" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.817387 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.818891 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/df5a2d1b-1d06-4201-8546-1a6f67a2511e-etcd-ca\") pod \"etcd-operator-b45778765-sd8km\" (UID: \"df5a2d1b-1d06-4201-8546-1a6f67a2511e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd8km" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.836242 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.836842 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/df5a2d1b-1d06-4201-8546-1a6f67a2511e-etcd-service-ca\") pod \"etcd-operator-b45778765-sd8km\" (UID: \"df5a2d1b-1d06-4201-8546-1a6f67a2511e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd8km" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.855927 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.875001 4962 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.895602 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.915650 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.936593 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.956397 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.976030 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 03 12:51:46 crc kubenswrapper[4962]: I1003 12:51:46.995442 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.015825 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.040877 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.055874 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.075180 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.096186 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.115687 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.156324 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.175778 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.196485 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.216739 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.229910 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/003637f9-ebbe-4587-bdf0-071bfca642dd-metrics-tls\") pod \"dns-operator-744455d44c-hcwtn\" (UID: \"003637f9-ebbe-4587-bdf0-071bfca642dd\") " pod="openshift-dns-operator/dns-operator-744455d44c-hcwtn" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.236255 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 03 12:51:47 crc 
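
The reflector.go:368 "Caches populated" lines mark client-go reflectors finishing their initial LIST for each Secret and ConfigMap the kubelet tracks; the kubelet's own secret/configmap manager wires one narrow reflector per object (hence the per-object names in the messages), but the sync point being logged is the same idea as a generic informer reaching HasSynced. A minimal client-go sketch of that mechanism, assuming a reachable kubeconfig at the conventional home path; the namespace and resync interval are illustrative.

    package main

    import (
        "fmt"
        "time"

        "k8s.io/client-go/informers"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/cache"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        // Assumes a kubeconfig at ~/.kube/config; adjust as needed.
        cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
        if err != nil {
            panic(err)
        }
        client, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            panic(err)
        }

        // Watch Secrets in a single namespace, as the per-namespace lines above do.
        factory := informers.NewSharedInformerFactoryWithOptions(
            client, 10*time.Minute, informers.WithNamespace("openshift-etcd-operator"))
        secrets := factory.Core().V1().Secrets().Informer()

        stop := make(chan struct{})
        defer close(stop)
        factory.Start(stop)

        // The moment equivalent to "Caches populated for *v1.Secret ...":
        // the initial LIST is done and local reads can be served from cache.
        if !cache.WaitForCacheSync(stop, secrets.HasSynced) {
            panic("cache never synced")
        }
        fmt.Println("secret cache populated")
    }
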
kubenswrapper[4962]: I1003 12:51:47.257160 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.275876 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.280126 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb48dfa2-82c3-4c23-b66d-77ac0166326f-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-qnnwg\" (UID: \"bb48dfa2-82c3-4c23-b66d-77ac0166326f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnnwg" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.296605 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.297206 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb48dfa2-82c3-4c23-b66d-77ac0166326f-config\") pod \"kube-controller-manager-operator-78b949d7b-qnnwg\" (UID: \"bb48dfa2-82c3-4c23-b66d-77ac0166326f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnnwg" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.315655 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.335954 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.344247 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf3d2489-9ec1-4479-9f8c-a519812581f8-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ndncb\" (UID: \"bf3d2489-9ec1-4479-9f8c-a519812581f8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ndncb" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.376120 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.377342 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf3d2489-9ec1-4479-9f8c-a519812581f8-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ndncb\" (UID: \"bf3d2489-9ec1-4479-9f8c-a519812581f8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ndncb" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.395857 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.429547 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s7p4\" (UniqueName: \"kubernetes.io/projected/201d1b81-b0bd-4584-9406-77ac0888ae49-kube-api-access-5s7p4\") pod \"machine-api-operator-5694c8668f-5zs6v\" (UID: 
\"201d1b81-b0bd-4584-9406-77ac0888ae49\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5zs6v" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.449951 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwt7n\" (UniqueName: \"kubernetes.io/projected/bdd54121-ea41-4870-a612-28c7cdc242dd-kube-api-access-zwt7n\") pod \"openshift-apiserver-operator-796bbdcf4f-pkncv\" (UID: \"bdd54121-ea41-4870-a612-28c7cdc242dd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pkncv" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.468283 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtzcf\" (UniqueName: \"kubernetes.io/projected/9c7cf52e-ea38-43a7-bd33-f546b4d5f57c-kube-api-access-gtzcf\") pod \"controller-manager-879f6c89f-p5zwj\" (UID: \"9c7cf52e-ea38-43a7-bd33-f546b4d5f57c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p5zwj" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.489810 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msqnj\" (UniqueName: \"kubernetes.io/projected/7b83dbab-ab28-4769-b812-be82be9db67e-kube-api-access-msqnj\") pod \"apiserver-76f77b778f-s2sdt\" (UID: \"7b83dbab-ab28-4769-b812-be82be9db67e\") " pod="openshift-apiserver/apiserver-76f77b778f-s2sdt" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.492718 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-s2sdt" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.495916 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.516100 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.522233 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-5zs6v" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.535912 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.556261 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.577056 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.596698 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.614941 4962 request.go:700] Waited for 1.01801003s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-service-ca/secrets?fieldSelector=metadata.name%3Dservice-ca-dockercfg-pn86c&limit=500&resourceVersion=0 Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.617022 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.635806 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.655088 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.669768 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-5zs6v"] Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.676247 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.686818 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-s2sdt"] Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.695424 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.709587 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pkncv" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.716118 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.727906 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-5zs6v" event={"ID":"201d1b81-b0bd-4584-9406-77ac0888ae49","Type":"ContainerStarted","Data":"fd19bde8db8596566a3772deb53a0ddb1c267dbe0da4e444c7a53aec7f65e1a1"} Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.728901 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-s2sdt" event={"ID":"7b83dbab-ab28-4769-b812-be82be9db67e","Type":"ContainerStarted","Data":"39acee0ccb4a0ea1c40fe73caa2316cef6a891e2e7e028a88d24edb8eb5e94bc"} Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.736755 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.756244 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.764304 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-p5zwj" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.775594 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.796532 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.816755 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.837563 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.855448 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.878521 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pkncv"] Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.878717 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 03 12:51:47 crc kubenswrapper[4962]: W1003 12:51:47.891065 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdd54121_ea41_4870_a612_28c7cdc242dd.slice/crio-4bd994ada6c3ab94d07cc6478e39089bb22476c87edda9cfd063527512c6a276 WatchSource:0}: Error finding container 4bd994ada6c3ab94d07cc6478e39089bb22476c87edda9cfd063527512c6a276: Status 404 returned error can't find the container with id 4bd994ada6c3ab94d07cc6478e39089bb22476c87edda9cfd063527512c6a276 Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 
12:51:47.895899 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.918433 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.935194 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.938569 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-p5zwj"] Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.962782 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.977091 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 03 12:51:47 crc kubenswrapper[4962]: I1003 12:51:47.997021 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 03 12:51:48 crc kubenswrapper[4962]: W1003 12:51:48.003180 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c7cf52e_ea38_43a7_bd33_f546b4d5f57c.slice/crio-4fa2507ffc77520ae604b1481020b41432a465d5250ae855a18b722d4478f3b7 WatchSource:0}: Error finding container 4fa2507ffc77520ae604b1481020b41432a465d5250ae855a18b722d4478f3b7: Status 404 returned error can't find the container with id 4fa2507ffc77520ae604b1481020b41432a465d5250ae855a18b722d4478f3b7 Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.015666 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.036738 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.056021 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.076436 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.095234 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.116427 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.135893 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.156887 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.176324 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 03 
12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.196440 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.215451 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.226952 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.227004 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5blzz" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.227193 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.227554 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.236177 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.255513 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.275791 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.295941 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.316041 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.336658 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.356456 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.375870 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.395968 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.431188 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2bc7\" (UniqueName: \"kubernetes.io/projected/a5971c52-f20d-40a2-9e80-e1c02e83cec0-kube-api-access-h2bc7\") pod \"downloads-7954f5f757-gtf6m\" (UID: \"a5971c52-f20d-40a2-9e80-e1c02e83cec0\") " pod="openshift-console/downloads-7954f5f757-gtf6m" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.449363 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttdz7\" (UniqueName: 
\"kubernetes.io/projected/003637f9-ebbe-4587-bdf0-071bfca642dd-kube-api-access-ttdz7\") pod \"dns-operator-744455d44c-hcwtn\" (UID: \"003637f9-ebbe-4587-bdf0-071bfca642dd\") " pod="openshift-dns-operator/dns-operator-744455d44c-hcwtn" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.469377 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mskz\" (UniqueName: \"kubernetes.io/projected/796d19ea-1d92-4dcb-9e10-305ddbe1b283-kube-api-access-7mskz\") pod \"console-f9d7485db-wgl5v\" (UID: \"796d19ea-1d92-4dcb-9e10-305ddbe1b283\") " pod="openshift-console/console-f9d7485db-wgl5v" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.489230 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bf3d2489-9ec1-4479-9f8c-a519812581f8-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ndncb\" (UID: \"bf3d2489-9ec1-4479-9f8c-a519812581f8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ndncb" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.509443 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhtj4\" (UniqueName: \"kubernetes.io/projected/d6b25ec8-0fa2-4d8e-81e3-51e15eee578a-kube-api-access-qhtj4\") pod \"openshift-config-operator-7777fb866f-t4sln\" (UID: \"d6b25ec8-0fa2-4d8e-81e3-51e15eee578a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-t4sln" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.529284 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bb48dfa2-82c3-4c23-b66d-77ac0166326f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-qnnwg\" (UID: \"bb48dfa2-82c3-4c23-b66d-77ac0166326f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnnwg" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.549282 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psfsr\" (UniqueName: \"kubernetes.io/projected/994f46cf-ed06-420d-a2fd-52547aadd0ce-kube-api-access-psfsr\") pod \"apiserver-7bbb656c7d-xrdnq\" (UID: \"994f46cf-ed06-420d-a2fd-52547aadd0ce\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrdnq" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.555384 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.566037 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-gtf6m" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.575493 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-hcwtn" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.575949 4962 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.581690 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnnwg" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.589211 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ndncb" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.596938 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.615475 4962 request.go:700] Waited for 1.935017896s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager-operator/serviceaccounts/openshift-controller-manager-operator/token Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.636490 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g5qg\" (UniqueName: \"kubernetes.io/projected/693d9472-5b36-4df8-a6fb-d3e3aece0cdb-kube-api-access-2g5qg\") pod \"openshift-controller-manager-operator-756b6f6bc6-x4nlh\" (UID: \"693d9472-5b36-4df8-a6fb-d3e3aece0cdb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x4nlh" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.656582 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.657300 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmqgg\" (UniqueName: \"kubernetes.io/projected/df5a2d1b-1d06-4201-8546-1a6f67a2511e-kube-api-access-kmqgg\") pod \"etcd-operator-b45778765-sd8km\" (UID: \"df5a2d1b-1d06-4201-8546-1a6f67a2511e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd8km" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.678791 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.699822 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.716118 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.735549 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-5zs6v" event={"ID":"201d1b81-b0bd-4584-9406-77ac0888ae49","Type":"ContainerStarted","Data":"37ca12b3c565311c0ace90b31128dbb5a21fcc5b0e1c8cb53435cf7544e2d9f0"} Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.735598 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-5zs6v" event={"ID":"201d1b81-b0bd-4584-9406-77ac0888ae49","Type":"ContainerStarted","Data":"534eb4c604688680ea9cde2c9b1bd1022b2a49c3c298d20d2ce69d0cedf75c71"} Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.736433 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrdnq" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.736446 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.737707 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pkncv" event={"ID":"bdd54121-ea41-4870-a612-28c7cdc242dd","Type":"ContainerStarted","Data":"230cee14c87badac4942d8d2531da9703fd9e7a90de89a30bed3ae5a2c785f5b"} Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.737766 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pkncv" event={"ID":"bdd54121-ea41-4870-a612-28c7cdc242dd","Type":"ContainerStarted","Data":"4bd994ada6c3ab94d07cc6478e39089bb22476c87edda9cfd063527512c6a276"} Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.745157 4962 generic.go:334] "Generic (PLEG): container finished" podID="7b83dbab-ab28-4769-b812-be82be9db67e" containerID="995d3819924bd074fbfcb86d3839ca3d3db84c8232716a099b354792466dab63" exitCode=0 Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.745250 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-s2sdt" event={"ID":"7b83dbab-ab28-4769-b812-be82be9db67e","Type":"ContainerDied","Data":"995d3819924bd074fbfcb86d3839ca3d3db84c8232716a099b354792466dab63"} Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.747532 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-p5zwj" event={"ID":"9c7cf52e-ea38-43a7-bd33-f546b4d5f57c","Type":"ContainerStarted","Data":"d92c2e6777babd0fa498684e08fd9baa005c943f7cae64a55ea88f34fb85ec47"} Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.747582 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-p5zwj" event={"ID":"9c7cf52e-ea38-43a7-bd33-f546b4d5f57c","Type":"ContainerStarted","Data":"4fa2507ffc77520ae604b1481020b41432a465d5250ae855a18b722d4478f3b7"} Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.747829 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-p5zwj" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.750119 4962 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-p5zwj container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.750174 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-p5zwj" podUID="9c7cf52e-ea38-43a7-bd33-f546b4d5f57c" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.756174 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.767508 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-wgl5v" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.795094 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-gtf6m"] Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.801014 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff0b79bc-bc68-4ce3-a3ec-bcd3e6b5b9fa-serving-cert\") pod \"authentication-operator-69f744f599-4z5v6\" (UID: \"ff0b79bc-bc68-4ce3-a3ec-bcd3e6b5b9fa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4z5v6" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.801123 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.801151 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-q777j\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q777j" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.801178 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/afbd04f3-97c3-46d0-8b5d-17c630f20f42-stats-auth\") pod \"router-default-5444994796-wlnbb\" (UID: \"afbd04f3-97c3-46d0-8b5d-17c630f20f42\") " pod="openshift-ingress/router-default-5444994796-wlnbb" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.801201 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2982d523-afe6-4ab4-9778-5dbe578a243b-registry-tls\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.801223 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5e63df1c-39c0-4610-87dd-9772a75ddac9-auth-proxy-config\") pod \"machine-approver-56656f9798-rbkn2\" (UID: \"5e63df1c-39c0-4610-87dd-9772a75ddac9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rbkn2" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.801246 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/398018f7-8c31-40f9-bd6a-170564176a58-client-ca\") pod \"route-controller-manager-6576b87f9c-tvdkb\" (UID: \"398018f7-8c31-40f9-bd6a-170564176a58\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdkb" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.801275 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/0d5189b2-b3f9-464a-b267-6e70a2687f99-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-gjzdl\" (UID: \"0d5189b2-b3f9-464a-b267-6e70a2687f99\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gjzdl" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.801295 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2982d523-afe6-4ab4-9778-5dbe578a243b-bound-sa-token\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.801311 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/0d5189b2-b3f9-464a-b267-6e70a2687f99-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-gjzdl\" (UID: \"0d5189b2-b3f9-464a-b267-6e70a2687f99\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gjzdl" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.801330 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/91a19e46-bca3-43f1-a0f6-0d8805a405db-trusted-ca\") pod \"ingress-operator-5b745b69d9-5vx4v\" (UID: \"91a19e46-bca3-43f1-a0f6-0d8805a405db\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5vx4v" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.801354 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-q777j\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q777j" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.801372 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbf2z\" (UniqueName: \"kubernetes.io/projected/ff0b79bc-bc68-4ce3-a3ec-bcd3e6b5b9fa-kube-api-access-sbf2z\") pod \"authentication-operator-69f744f599-4z5v6\" (UID: \"ff0b79bc-bc68-4ce3-a3ec-bcd3e6b5b9fa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4z5v6" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.801387 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzckp\" (UniqueName: \"kubernetes.io/projected/2714766b-33f3-4280-80c2-3e3c8cead5cd-kube-api-access-vzckp\") pod \"cluster-samples-operator-665b6dd947-8brbj\" (UID: \"2714766b-33f3-4280-80c2-3e3c8cead5cd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8brbj" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.801406 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/91a19e46-bca3-43f1-a0f6-0d8805a405db-bound-sa-token\") pod \"ingress-operator-5b745b69d9-5vx4v\" (UID: \"91a19e46-bca3-43f1-a0f6-0d8805a405db\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5vx4v" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.801423 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nll5\" (UniqueName: \"kubernetes.io/projected/398018f7-8c31-40f9-bd6a-170564176a58-kube-api-access-9nll5\") pod \"route-controller-manager-6576b87f9c-tvdkb\" (UID: \"398018f7-8c31-40f9-bd6a-170564176a58\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdkb" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.801439 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/5e63df1c-39c0-4610-87dd-9772a75ddac9-machine-approver-tls\") pod \"machine-approver-56656f9798-rbkn2\" (UID: \"5e63df1c-39c0-4610-87dd-9772a75ddac9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rbkn2" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.801455 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfj7w\" (UniqueName: \"kubernetes.io/projected/5e63df1c-39c0-4610-87dd-9772a75ddac9-kube-api-access-nfj7w\") pod \"machine-approver-56656f9798-rbkn2\" (UID: \"5e63df1c-39c0-4610-87dd-9772a75ddac9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rbkn2" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.801471 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqpj6\" (UniqueName: \"kubernetes.io/projected/afbd04f3-97c3-46d0-8b5d-17c630f20f42-kube-api-access-rqpj6\") pod \"router-default-5444994796-wlnbb\" (UID: \"afbd04f3-97c3-46d0-8b5d-17c630f20f42\") " pod="openshift-ingress/router-default-5444994796-wlnbb" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.801498 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-q777j\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q777j" Oct 03 12:51:48 crc kubenswrapper[4962]: E1003 12:51:48.803437 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:49.3034194 +0000 UTC m=+117.707317235 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.803583 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-t4sln" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.804123 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-q777j\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q777j" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.804218 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f64523a2-c2b2-4f8d-9a51-74a3a64f8ca4-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-r628t\" (UID: \"f64523a2-c2b2-4f8d-9a51-74a3a64f8ca4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r628t" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.804262 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f64523a2-c2b2-4f8d-9a51-74a3a64f8ca4-config\") pod \"kube-apiserver-operator-766d6c64bb-r628t\" (UID: \"f64523a2-c2b2-4f8d-9a51-74a3a64f8ca4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r628t" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.804289 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d66f8fa5-4d41-40e1-87aa-e2b08e132cbc-serving-cert\") pod \"console-operator-58897d9998-hszb5\" (UID: \"d66f8fa5-4d41-40e1-87aa-e2b08e132cbc\") " pod="openshift-console-operator/console-operator-58897d9998-hszb5" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.804702 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2982d523-afe6-4ab4-9778-5dbe578a243b-registry-certificates\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.804773 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/afbd04f3-97c3-46d0-8b5d-17c630f20f42-default-certificate\") pod \"router-default-5444994796-wlnbb\" (UID: \"afbd04f3-97c3-46d0-8b5d-17c630f20f42\") " pod="openshift-ingress/router-default-5444994796-wlnbb" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.804845 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-q777j\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q777j" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.804933 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc49x\" (UniqueName: \"kubernetes.io/projected/91a19e46-bca3-43f1-a0f6-0d8805a405db-kube-api-access-xc49x\") pod \"ingress-operator-5b745b69d9-5vx4v\" (UID: 
\"91a19e46-bca3-43f1-a0f6-0d8805a405db\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5vx4v" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.805121 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3e3f372c-8948-4d84-aee2-441d77e3201a-audit-policies\") pod \"oauth-openshift-558db77b4-q777j\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q777j" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.805195 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2982d523-afe6-4ab4-9778-5dbe578a243b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.805281 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-q777j\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q777j" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.805446 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0d5189b2-b3f9-464a-b267-6e70a2687f99-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-gjzdl\" (UID: \"0d5189b2-b3f9-464a-b267-6e70a2687f99\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gjzdl" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.805835 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjfd2\" (UniqueName: \"kubernetes.io/projected/3e3f372c-8948-4d84-aee2-441d77e3201a-kube-api-access-tjfd2\") pod \"oauth-openshift-558db77b4-q777j\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q777j" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.806216 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d66f8fa5-4d41-40e1-87aa-e2b08e132cbc-config\") pod \"console-operator-58897d9998-hszb5\" (UID: \"d66f8fa5-4d41-40e1-87aa-e2b08e132cbc\") " pod="openshift-console-operator/console-operator-58897d9998-hszb5" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.806267 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmb7d\" (UniqueName: \"kubernetes.io/projected/d66f8fa5-4d41-40e1-87aa-e2b08e132cbc-kube-api-access-xmb7d\") pod \"console-operator-58897d9998-hszb5\" (UID: \"d66f8fa5-4d41-40e1-87aa-e2b08e132cbc\") " pod="openshift-console-operator/console-operator-58897d9998-hszb5" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.806824 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2714766b-33f3-4280-80c2-3e3c8cead5cd-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-8brbj\" (UID: 
\"2714766b-33f3-4280-80c2-3e3c8cead5cd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8brbj" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.806936 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2982d523-afe6-4ab4-9778-5dbe578a243b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.806982 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/398018f7-8c31-40f9-bd6a-170564176a58-serving-cert\") pod \"route-controller-manager-6576b87f9c-tvdkb\" (UID: \"398018f7-8c31-40f9-bd6a-170564176a58\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdkb" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.807011 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-q777j\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q777j" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.807049 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d66f8fa5-4d41-40e1-87aa-e2b08e132cbc-trusted-ca\") pod \"console-operator-58897d9998-hszb5\" (UID: \"d66f8fa5-4d41-40e1-87aa-e2b08e132cbc\") " pod="openshift-console-operator/console-operator-58897d9998-hszb5" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.807089 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v5c9\" (UniqueName: \"kubernetes.io/projected/0d5189b2-b3f9-464a-b267-6e70a2687f99-kube-api-access-4v5c9\") pod \"cluster-image-registry-operator-dc59b4c8b-gjzdl\" (UID: \"0d5189b2-b3f9-464a-b267-6e70a2687f99\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gjzdl" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.807114 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/afbd04f3-97c3-46d0-8b5d-17c630f20f42-metrics-certs\") pod \"router-default-5444994796-wlnbb\" (UID: \"afbd04f3-97c3-46d0-8b5d-17c630f20f42\") " pod="openshift-ingress/router-default-5444994796-wlnbb" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.807146 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-q777j\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q777j" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.807176 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff0b79bc-bc68-4ce3-a3ec-bcd3e6b5b9fa-config\") pod 
\"authentication-operator-69f744f599-4z5v6\" (UID: \"ff0b79bc-bc68-4ce3-a3ec-bcd3e6b5b9fa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4z5v6" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.807205 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m42qn\" (UniqueName: \"kubernetes.io/projected/2982d523-afe6-4ab4-9778-5dbe578a243b-kube-api-access-m42qn\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.807243 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/91a19e46-bca3-43f1-a0f6-0d8805a405db-metrics-tls\") pod \"ingress-operator-5b745b69d9-5vx4v\" (UID: \"91a19e46-bca3-43f1-a0f6-0d8805a405db\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5vx4v" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.807281 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e63df1c-39c0-4610-87dd-9772a75ddac9-config\") pod \"machine-approver-56656f9798-rbkn2\" (UID: \"5e63df1c-39c0-4610-87dd-9772a75ddac9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rbkn2" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.807331 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-q777j\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q777j" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.807371 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/afbd04f3-97c3-46d0-8b5d-17c630f20f42-service-ca-bundle\") pod \"router-default-5444994796-wlnbb\" (UID: \"afbd04f3-97c3-46d0-8b5d-17c630f20f42\") " pod="openshift-ingress/router-default-5444994796-wlnbb" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.807424 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f64523a2-c2b2-4f8d-9a51-74a3a64f8ca4-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-r628t\" (UID: \"f64523a2-c2b2-4f8d-9a51-74a3a64f8ca4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r628t" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.807450 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-q777j\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q777j" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.807486 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/398018f7-8c31-40f9-bd6a-170564176a58-config\") 
pod \"route-controller-manager-6576b87f9c-tvdkb\" (UID: \"398018f7-8c31-40f9-bd6a-170564176a58\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdkb" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.807516 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2982d523-afe6-4ab4-9778-5dbe578a243b-trusted-ca\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.807541 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3e3f372c-8948-4d84-aee2-441d77e3201a-audit-dir\") pod \"oauth-openshift-558db77b4-q777j\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q777j" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.807564 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-q777j\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q777j" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.807596 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff0b79bc-bc68-4ce3-a3ec-bcd3e6b5b9fa-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-4z5v6\" (UID: \"ff0b79bc-bc68-4ce3-a3ec-bcd3e6b5b9fa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4z5v6" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.807652 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff0b79bc-bc68-4ce3-a3ec-bcd3e6b5b9fa-service-ca-bundle\") pod \"authentication-operator-69f744f599-4z5v6\" (UID: \"ff0b79bc-bc68-4ce3-a3ec-bcd3e6b5b9fa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4z5v6" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.810287 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x4nlh" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.816906 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 03 12:51:48 crc kubenswrapper[4962]: W1003 12:51:48.818956 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5971c52_f20d_40a2_9e80_e1c02e83cec0.slice/crio-ab105f069588956fd12f5ff29b2c1578b50e1ae33b295d37c737f7aa12f6c87b WatchSource:0}: Error finding container ab105f069588956fd12f5ff29b2c1578b50e1ae33b295d37c737f7aa12f6c87b: Status 404 returned error can't find the container with id ab105f069588956fd12f5ff29b2c1578b50e1ae33b295d37c737f7aa12f6c87b Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.835352 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.841797 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-sd8km" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.856763 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.880223 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.897884 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.911977 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.912229 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/2edf0825-d4bd-4a22-a65f-be54b0502600-mountpoint-dir\") pod \"csi-hostpathplugin-5hcj2\" (UID: \"2edf0825-d4bd-4a22-a65f-be54b0502600\") " pod="hostpath-provisioner/csi-hostpathplugin-5hcj2" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.912255 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d9b871c7-1b7d-4b51-ae32-c179956c4de7-proxy-tls\") pod \"machine-config-controller-84d6567774-llhkr\" (UID: \"d9b871c7-1b7d-4b51-ae32-c179956c4de7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-llhkr" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.912835 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a156df7d-5cb7-4d30-b183-90c66b7f9009-webhook-cert\") pod \"packageserver-d55dfcdfc-hdljv\" (UID: \"a156df7d-5cb7-4d30-b183-90c66b7f9009\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hdljv" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.912888 4962 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-q777j\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q777j" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.912914 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0d5189b2-b3f9-464a-b267-6e70a2687f99-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-gjzdl\" (UID: \"0d5189b2-b3f9-464a-b267-6e70a2687f99\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gjzdl" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.912941 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e381f38c-5571-4221-8661-2ea67e6e2a52-signing-cabundle\") pod \"service-ca-9c57cc56f-jfpvr\" (UID: \"e381f38c-5571-4221-8661-2ea67e6e2a52\") " pod="openshift-service-ca/service-ca-9c57cc56f-jfpvr" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.913015 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqk68\" (UniqueName: \"kubernetes.io/projected/2edf0825-d4bd-4a22-a65f-be54b0502600-kube-api-access-mqk68\") pod \"csi-hostpathplugin-5hcj2\" (UID: \"2edf0825-d4bd-4a22-a65f-be54b0502600\") " pod="hostpath-provisioner/csi-hostpathplugin-5hcj2" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.913051 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjfd2\" (UniqueName: \"kubernetes.io/projected/3e3f372c-8948-4d84-aee2-441d77e3201a-kube-api-access-tjfd2\") pod \"oauth-openshift-558db77b4-q777j\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q777j" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.913074 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a156df7d-5cb7-4d30-b183-90c66b7f9009-apiservice-cert\") pod \"packageserver-d55dfcdfc-hdljv\" (UID: \"a156df7d-5cb7-4d30-b183-90c66b7f9009\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hdljv" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.913166 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d66f8fa5-4d41-40e1-87aa-e2b08e132cbc-config\") pod \"console-operator-58897d9998-hszb5\" (UID: \"d66f8fa5-4d41-40e1-87aa-e2b08e132cbc\") " pod="openshift-console-operator/console-operator-58897d9998-hszb5" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.913197 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd9cp\" (UniqueName: \"kubernetes.io/projected/380dd699-f44d-4294-81c0-b26e75d00678-kube-api-access-jd9cp\") pod \"dns-default-v8m25\" (UID: \"380dd699-f44d-4294-81c0-b26e75d00678\") " pod="openshift-dns/dns-default-v8m25" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.913261 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/cb5bb7ba-79c7-4251-8025-68e5c9997447-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wdw2z\" (UID: \"cb5bb7ba-79c7-4251-8025-68e5c9997447\") " pod="openshift-marketplace/marketplace-operator-79b997595-wdw2z" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.913399 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvxrp\" (UniqueName: \"kubernetes.io/projected/44f09fd4-533a-4ed1-b31a-be8f976b2855-kube-api-access-nvxrp\") pod \"ingress-canary-r2b97\" (UID: \"44f09fd4-533a-4ed1-b31a-be8f976b2855\") " pod="openshift-ingress-canary/ingress-canary-r2b97" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.913425 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmb7d\" (UniqueName: \"kubernetes.io/projected/d66f8fa5-4d41-40e1-87aa-e2b08e132cbc-kube-api-access-xmb7d\") pod \"console-operator-58897d9998-hszb5\" (UID: \"d66f8fa5-4d41-40e1-87aa-e2b08e132cbc\") " pod="openshift-console-operator/console-operator-58897d9998-hszb5" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.913491 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2714766b-33f3-4280-80c2-3e3c8cead5cd-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-8brbj\" (UID: \"2714766b-33f3-4280-80c2-3e3c8cead5cd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8brbj" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.913518 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a478f682-ff2f-4920-b535-24b1675ce2c7-config-volume\") pod \"collect-profiles-29324925-rjxv2\" (UID: \"a478f682-ff2f-4920-b535-24b1675ce2c7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324925-rjxv2" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.913543 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2982d523-afe6-4ab4-9778-5dbe578a243b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.913567 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/398018f7-8c31-40f9-bd6a-170564176a58-serving-cert\") pod \"route-controller-manager-6576b87f9c-tvdkb\" (UID: \"398018f7-8c31-40f9-bd6a-170564176a58\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdkb" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.913591 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-q777j\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q777j" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.913775 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d66f8fa5-4d41-40e1-87aa-e2b08e132cbc-trusted-ca\") pod 
\"console-operator-58897d9998-hszb5\" (UID: \"d66f8fa5-4d41-40e1-87aa-e2b08e132cbc\") " pod="openshift-console-operator/console-operator-58897d9998-hszb5" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.913824 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4v5c9\" (UniqueName: \"kubernetes.io/projected/0d5189b2-b3f9-464a-b267-6e70a2687f99-kube-api-access-4v5c9\") pod \"cluster-image-registry-operator-dc59b4c8b-gjzdl\" (UID: \"0d5189b2-b3f9-464a-b267-6e70a2687f99\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gjzdl" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.913893 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-q777j\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q777j" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.913918 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff0b79bc-bc68-4ce3-a3ec-bcd3e6b5b9fa-config\") pod \"authentication-operator-69f744f599-4z5v6\" (UID: \"ff0b79bc-bc68-4ce3-a3ec-bcd3e6b5b9fa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4z5v6" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.913942 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/afbd04f3-97c3-46d0-8b5d-17c630f20f42-metrics-certs\") pod \"router-default-5444994796-wlnbb\" (UID: \"afbd04f3-97c3-46d0-8b5d-17c630f20f42\") " pod="openshift-ingress/router-default-5444994796-wlnbb" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.913966 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8628d57e-47fc-4269-b96f-7e04ebfd320d-images\") pod \"machine-config-operator-74547568cd-wbchr\" (UID: \"8628d57e-47fc-4269-b96f-7e04ebfd320d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wbchr" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.914036 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m42qn\" (UniqueName: \"kubernetes.io/projected/2982d523-afe6-4ab4-9778-5dbe578a243b-kube-api-access-m42qn\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.914060 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/91a19e46-bca3-43f1-a0f6-0d8805a405db-metrics-tls\") pod \"ingress-operator-5b745b69d9-5vx4v\" (UID: \"91a19e46-bca3-43f1-a0f6-0d8805a405db\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5vx4v" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.914086 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz7pr\" (UniqueName: \"kubernetes.io/projected/c23b80bd-b6e3-4d28-814f-bf8ba17b9bbf-kube-api-access-bz7pr\") pod \"multus-admission-controller-857f4d67dd-vnd7f\" (UID: \"c23b80bd-b6e3-4d28-814f-bf8ba17b9bbf\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-vnd7f" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.914136 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e63df1c-39c0-4610-87dd-9772a75ddac9-config\") pod \"machine-approver-56656f9798-rbkn2\" (UID: \"5e63df1c-39c0-4610-87dd-9772a75ddac9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rbkn2" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.914160 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-q777j\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q777j" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.914183 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/a156df7d-5cb7-4d30-b183-90c66b7f9009-tmpfs\") pod \"packageserver-d55dfcdfc-hdljv\" (UID: \"a156df7d-5cb7-4d30-b183-90c66b7f9009\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hdljv" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.914207 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtgzp\" (UniqueName: \"kubernetes.io/projected/51b4dfee-5b09-40cb-a284-3d1d16c03cd3-kube-api-access-dtgzp\") pod \"kube-storage-version-migrator-operator-b67b599dd-ct99q\" (UID: \"51b4dfee-5b09-40cb-a284-3d1d16c03cd3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ct99q" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.914251 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/afbd04f3-97c3-46d0-8b5d-17c630f20f42-service-ca-bundle\") pod \"router-default-5444994796-wlnbb\" (UID: \"afbd04f3-97c3-46d0-8b5d-17c630f20f42\") " pod="openshift-ingress/router-default-5444994796-wlnbb" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.914325 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f64523a2-c2b2-4f8d-9a51-74a3a64f8ca4-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-r628t\" (UID: \"f64523a2-c2b2-4f8d-9a51-74a3a64f8ca4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r628t" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.914350 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51b4dfee-5b09-40cb-a284-3d1d16c03cd3-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-ct99q\" (UID: \"51b4dfee-5b09-40cb-a284-3d1d16c03cd3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ct99q" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.914385 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-q777j\" (UID: 
\"3e3f372c-8948-4d84-aee2-441d77e3201a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q777j" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.915526 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d66f8fa5-4d41-40e1-87aa-e2b08e132cbc-config\") pod \"console-operator-58897d9998-hszb5\" (UID: \"d66f8fa5-4d41-40e1-87aa-e2b08e132cbc\") " pod="openshift-console-operator/console-operator-58897d9998-hszb5" Oct 03 12:51:48 crc kubenswrapper[4962]: E1003 12:51:48.916604 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:49.416580936 +0000 UTC m=+117.820479001 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.917332 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e63df1c-39c0-4610-87dd-9772a75ddac9-config\") pod \"machine-approver-56656f9798-rbkn2\" (UID: \"5e63df1c-39c0-4610-87dd-9772a75ddac9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rbkn2" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.917577 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff0b79bc-bc68-4ce3-a3ec-bcd3e6b5b9fa-config\") pod \"authentication-operator-69f744f599-4z5v6\" (UID: \"ff0b79bc-bc68-4ce3-a3ec-bcd3e6b5b9fa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4z5v6" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.918434 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/398018f7-8c31-40f9-bd6a-170564176a58-config\") pod \"route-controller-manager-6576b87f9c-tvdkb\" (UID: \"398018f7-8c31-40f9-bd6a-170564176a58\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdkb" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.918474 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d9b871c7-1b7d-4b51-ae32-c179956c4de7-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-llhkr\" (UID: \"d9b871c7-1b7d-4b51-ae32-c179956c4de7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-llhkr" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.918503 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccmpk\" (UniqueName: \"kubernetes.io/projected/ec276030-49bd-4751-93f2-456157bd157d-kube-api-access-ccmpk\") pod \"control-plane-machine-set-operator-78cbb6b69f-pbd2f\" (UID: \"ec276030-49bd-4751-93f2-456157bd157d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pbd2f" Oct 03 12:51:48 
crc kubenswrapper[4962]: I1003 12:51:48.918545 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/289b8912-7a01-4dc3-aec2-6082ddbe1698-srv-cert\") pod \"olm-operator-6b444d44fb-826w7\" (UID: \"289b8912-7a01-4dc3-aec2-6082ddbe1698\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-826w7" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.918599 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2982d523-afe6-4ab4-9778-5dbe578a243b-trusted-ca\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.918625 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3e3f372c-8948-4d84-aee2-441d77e3201a-audit-dir\") pod \"oauth-openshift-558db77b4-q777j\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q777j" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.918671 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-q777j\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q777j" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.918718 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff0b79bc-bc68-4ce3-a3ec-bcd3e6b5b9fa-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-4z5v6\" (UID: \"ff0b79bc-bc68-4ce3-a3ec-bcd3e6b5b9fa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4z5v6" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.918760 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff0b79bc-bc68-4ce3-a3ec-bcd3e6b5b9fa-service-ca-bundle\") pod \"authentication-operator-69f744f599-4z5v6\" (UID: \"ff0b79bc-bc68-4ce3-a3ec-bcd3e6b5b9fa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4z5v6" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.918785 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt2ft\" (UniqueName: \"kubernetes.io/projected/1bd2edae-b62c-44cf-8973-bb8cf3c8ee7b-kube-api-access-bt2ft\") pod \"catalog-operator-68c6474976-bfcs7\" (UID: \"1bd2edae-b62c-44cf-8973-bb8cf3c8ee7b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bfcs7" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.918814 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff0b79bc-bc68-4ce3-a3ec-bcd3e6b5b9fa-serving-cert\") pod \"authentication-operator-69f744f599-4z5v6\" (UID: \"ff0b79bc-bc68-4ce3-a3ec-bcd3e6b5b9fa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4z5v6" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.918895 4962 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.918922 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-q777j\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q777j" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.918946 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/afbd04f3-97c3-46d0-8b5d-17c630f20f42-stats-auth\") pod \"router-default-5444994796-wlnbb\" (UID: \"afbd04f3-97c3-46d0-8b5d-17c630f20f42\") " pod="openshift-ingress/router-default-5444994796-wlnbb" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.918973 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2982d523-afe6-4ab4-9778-5dbe578a243b-registry-tls\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.918995 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5e63df1c-39c0-4610-87dd-9772a75ddac9-auth-proxy-config\") pod \"machine-approver-56656f9798-rbkn2\" (UID: \"5e63df1c-39c0-4610-87dd-9772a75ddac9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rbkn2" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.919019 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62126d09-dac2-4f55-9bdf-f07437e37b6f-serving-cert\") pod \"service-ca-operator-777779d784-b2vsf\" (UID: \"62126d09-dac2-4f55-9bdf-f07437e37b6f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b2vsf" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.919036 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/afbd04f3-97c3-46d0-8b5d-17c630f20f42-service-ca-bundle\") pod \"router-default-5444994796-wlnbb\" (UID: \"afbd04f3-97c3-46d0-8b5d-17c630f20f42\") " pod="openshift-ingress/router-default-5444994796-wlnbb" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.919043 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/380dd699-f44d-4294-81c0-b26e75d00678-config-volume\") pod \"dns-default-v8m25\" (UID: \"380dd699-f44d-4294-81c0-b26e75d00678\") " pod="openshift-dns/dns-default-v8m25" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.919032 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-system-trusted-ca-bundle\") pod 
\"oauth-openshift-558db77b4-q777j\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q777j" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.919999 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff0b79bc-bc68-4ce3-a3ec-bcd3e6b5b9fa-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-4z5v6\" (UID: \"ff0b79bc-bc68-4ce3-a3ec-bcd3e6b5b9fa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4z5v6" Oct 03 12:51:48 crc kubenswrapper[4962]: E1003 12:51:48.920180 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:49.420156834 +0000 UTC m=+117.824054839 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.921373 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff0b79bc-bc68-4ce3-a3ec-bcd3e6b5b9fa-service-ca-bundle\") pod \"authentication-operator-69f744f599-4z5v6\" (UID: \"ff0b79bc-bc68-4ce3-a3ec-bcd3e6b5b9fa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4z5v6" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.921718 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/398018f7-8c31-40f9-bd6a-170564176a58-config\") pod \"route-controller-manager-6576b87f9c-tvdkb\" (UID: \"398018f7-8c31-40f9-bd6a-170564176a58\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdkb" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.924737 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0d5189b2-b3f9-464a-b267-6e70a2687f99-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-gjzdl\" (UID: \"0d5189b2-b3f9-464a-b267-6e70a2687f99\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gjzdl" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.924858 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3e3f372c-8948-4d84-aee2-441d77e3201a-audit-dir\") pod \"oauth-openshift-558db77b4-q777j\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q777j" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.926142 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-q777j\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q777j" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 
12:51:48.926943 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5e63df1c-39c0-4610-87dd-9772a75ddac9-auth-proxy-config\") pod \"machine-approver-56656f9798-rbkn2\" (UID: \"5e63df1c-39c0-4610-87dd-9772a75ddac9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rbkn2" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.927189 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-q777j\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q777j" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.927806 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/398018f7-8c31-40f9-bd6a-170564176a58-serving-cert\") pod \"route-controller-manager-6576b87f9c-tvdkb\" (UID: \"398018f7-8c31-40f9-bd6a-170564176a58\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdkb" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.928139 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/44f09fd4-533a-4ed1-b31a-be8f976b2855-cert\") pod \"ingress-canary-r2b97\" (UID: \"44f09fd4-533a-4ed1-b31a-be8f976b2855\") " pod="openshift-ingress-canary/ingress-canary-r2b97" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.928168 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c23b80bd-b6e3-4d28-814f-bf8ba17b9bbf-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-vnd7f\" (UID: \"c23b80bd-b6e3-4d28-814f-bf8ba17b9bbf\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vnd7f" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.928220 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxqkh\" (UniqueName: \"kubernetes.io/projected/ed078080-50cf-44fb-ade6-16575106862c-kube-api-access-wxqkh\") pod \"package-server-manager-789f6589d5-5csg7\" (UID: \"ed078080-50cf-44fb-ade6-16575106862c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5csg7" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.928245 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2wm2\" (UniqueName: \"kubernetes.io/projected/0c1ee82c-3d26-42f8-b083-d73f7e25448f-kube-api-access-m2wm2\") pod \"migrator-59844c95c7-xxhzg\" (UID: \"0c1ee82c-3d26-42f8-b083-d73f7e25448f\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xxhzg" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.928267 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/3b901edd-ff16-4b32-a22d-5e8feda7f19b-node-bootstrap-token\") pod \"machine-config-server-w5ndz\" (UID: \"3b901edd-ff16-4b32-a22d-5e8feda7f19b\") " pod="openshift-machine-config-operator/machine-config-server-w5ndz" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.929186 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/398018f7-8c31-40f9-bd6a-170564176a58-client-ca\") pod \"route-controller-manager-6576b87f9c-tvdkb\" (UID: \"398018f7-8c31-40f9-bd6a-170564176a58\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdkb" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.929221 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0d5189b2-b3f9-464a-b267-6e70a2687f99-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-gjzdl\" (UID: \"0d5189b2-b3f9-464a-b267-6e70a2687f99\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gjzdl" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.929278 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-q777j\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q777j" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.929343 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8tgh\" (UniqueName: \"kubernetes.io/projected/8628d57e-47fc-4269-b96f-7e04ebfd320d-kube-api-access-x8tgh\") pod \"machine-config-operator-74547568cd-wbchr\" (UID: \"8628d57e-47fc-4269-b96f-7e04ebfd320d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wbchr" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.929373 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/afbd04f3-97c3-46d0-8b5d-17c630f20f42-metrics-certs\") pod \"router-default-5444994796-wlnbb\" (UID: \"afbd04f3-97c3-46d0-8b5d-17c630f20f42\") " pod="openshift-ingress/router-default-5444994796-wlnbb" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.929413 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2982d523-afe6-4ab4-9778-5dbe578a243b-bound-sa-token\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.929438 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n42k\" (UniqueName: \"kubernetes.io/projected/289b8912-7a01-4dc3-aec2-6082ddbe1698-kube-api-access-6n42k\") pod \"olm-operator-6b444d44fb-826w7\" (UID: \"289b8912-7a01-4dc3-aec2-6082ddbe1698\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-826w7" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.929747 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/0d5189b2-b3f9-464a-b267-6e70a2687f99-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-gjzdl\" (UID: \"0d5189b2-b3f9-464a-b267-6e70a2687f99\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gjzdl" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.929771 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/91a19e46-bca3-43f1-a0f6-0d8805a405db-trusted-ca\") pod \"ingress-operator-5b745b69d9-5vx4v\" (UID: \"91a19e46-bca3-43f1-a0f6-0d8805a405db\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5vx4v" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.929834 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-q777j\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q777j" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.929854 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbf2z\" (UniqueName: \"kubernetes.io/projected/ff0b79bc-bc68-4ce3-a3ec-bcd3e6b5b9fa-kube-api-access-sbf2z\") pod \"authentication-operator-69f744f599-4z5v6\" (UID: \"ff0b79bc-bc68-4ce3-a3ec-bcd3e6b5b9fa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4z5v6" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.929873 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a478f682-ff2f-4920-b535-24b1675ce2c7-secret-volume\") pod \"collect-profiles-29324925-rjxv2\" (UID: \"a478f682-ff2f-4920-b535-24b1675ce2c7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324925-rjxv2" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.929895 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2edf0825-d4bd-4a22-a65f-be54b0502600-socket-dir\") pod \"csi-hostpathplugin-5hcj2\" (UID: \"2edf0825-d4bd-4a22-a65f-be54b0502600\") " pod="hostpath-provisioner/csi-hostpathplugin-5hcj2" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.929916 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51b4dfee-5b09-40cb-a284-3d1d16c03cd3-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-ct99q\" (UID: \"51b4dfee-5b09-40cb-a284-3d1d16c03cd3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ct99q" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.929937 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzckp\" (UniqueName: \"kubernetes.io/projected/2714766b-33f3-4280-80c2-3e3c8cead5cd-kube-api-access-vzckp\") pod \"cluster-samples-operator-665b6dd947-8brbj\" (UID: \"2714766b-33f3-4280-80c2-3e3c8cead5cd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8brbj" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.929974 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e381f38c-5571-4221-8661-2ea67e6e2a52-signing-key\") pod \"service-ca-9c57cc56f-jfpvr\" (UID: \"e381f38c-5571-4221-8661-2ea67e6e2a52\") " pod="openshift-service-ca/service-ca-9c57cc56f-jfpvr" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.930563 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-947t2\" (UniqueName: 
\"kubernetes.io/projected/62126d09-dac2-4f55-9bdf-f07437e37b6f-kube-api-access-947t2\") pod \"service-ca-operator-777779d784-b2vsf\" (UID: \"62126d09-dac2-4f55-9bdf-f07437e37b6f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b2vsf" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.930629 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzm62\" (UniqueName: \"kubernetes.io/projected/e381f38c-5571-4221-8661-2ea67e6e2a52-kube-api-access-hzm62\") pod \"service-ca-9c57cc56f-jfpvr\" (UID: \"e381f38c-5571-4221-8661-2ea67e6e2a52\") " pod="openshift-service-ca/service-ca-9c57cc56f-jfpvr" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.930684 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62126d09-dac2-4f55-9bdf-f07437e37b6f-config\") pod \"service-ca-operator-777779d784-b2vsf\" (UID: \"62126d09-dac2-4f55-9bdf-f07437e37b6f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b2vsf" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.930745 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/91a19e46-bca3-43f1-a0f6-0d8805a405db-bound-sa-token\") pod \"ingress-operator-5b745b69d9-5vx4v\" (UID: \"91a19e46-bca3-43f1-a0f6-0d8805a405db\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5vx4v" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.930771 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/3b901edd-ff16-4b32-a22d-5e8feda7f19b-certs\") pod \"machine-config-server-w5ndz\" (UID: \"3b901edd-ff16-4b32-a22d-5e8feda7f19b\") " pod="openshift-machine-config-operator/machine-config-server-w5ndz" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.930790 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/2edf0825-d4bd-4a22-a65f-be54b0502600-csi-data-dir\") pod \"csi-hostpathplugin-5hcj2\" (UID: \"2edf0825-d4bd-4a22-a65f-be54b0502600\") " pod="hostpath-provisioner/csi-hostpathplugin-5hcj2" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.930854 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nll5\" (UniqueName: \"kubernetes.io/projected/398018f7-8c31-40f9-bd6a-170564176a58-kube-api-access-9nll5\") pod \"route-controller-manager-6576b87f9c-tvdkb\" (UID: \"398018f7-8c31-40f9-bd6a-170564176a58\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdkb" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.930873 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/289b8912-7a01-4dc3-aec2-6082ddbe1698-profile-collector-cert\") pod \"olm-operator-6b444d44fb-826w7\" (UID: \"289b8912-7a01-4dc3-aec2-6082ddbe1698\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-826w7" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.930930 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2sb4\" (UniqueName: \"kubernetes.io/projected/a156df7d-5cb7-4d30-b183-90c66b7f9009-kube-api-access-q2sb4\") pod 
\"packageserver-d55dfcdfc-hdljv\" (UID: \"a156df7d-5cb7-4d30-b183-90c66b7f9009\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hdljv" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.931007 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/5e63df1c-39c0-4610-87dd-9772a75ddac9-machine-approver-tls\") pod \"machine-approver-56656f9798-rbkn2\" (UID: \"5e63df1c-39c0-4610-87dd-9772a75ddac9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rbkn2" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.932330 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/398018f7-8c31-40f9-bd6a-170564176a58-client-ca\") pod \"route-controller-manager-6576b87f9c-tvdkb\" (UID: \"398018f7-8c31-40f9-bd6a-170564176a58\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdkb" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.933051 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfj7w\" (UniqueName: \"kubernetes.io/projected/5e63df1c-39c0-4610-87dd-9772a75ddac9-kube-api-access-nfj7w\") pod \"machine-approver-56656f9798-rbkn2\" (UID: \"5e63df1c-39c0-4610-87dd-9772a75ddac9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rbkn2" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.933093 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqpj6\" (UniqueName: \"kubernetes.io/projected/afbd04f3-97c3-46d0-8b5d-17c630f20f42-kube-api-access-rqpj6\") pod \"router-default-5444994796-wlnbb\" (UID: \"afbd04f3-97c3-46d0-8b5d-17c630f20f42\") " pod="openshift-ingress/router-default-5444994796-wlnbb" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.933153 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/2edf0825-d4bd-4a22-a65f-be54b0502600-plugins-dir\") pod \"csi-hostpathplugin-5hcj2\" (UID: \"2edf0825-d4bd-4a22-a65f-be54b0502600\") " pod="hostpath-provisioner/csi-hostpathplugin-5hcj2" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.933187 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsbdh\" (UniqueName: \"kubernetes.io/projected/d9b871c7-1b7d-4b51-ae32-c179956c4de7-kube-api-access-qsbdh\") pod \"machine-config-controller-84d6567774-llhkr\" (UID: \"d9b871c7-1b7d-4b51-ae32-c179956c4de7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-llhkr" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.933996 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-q777j\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q777j" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.934061 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-q777j\" (UID: 
\"3e3f372c-8948-4d84-aee2-441d77e3201a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q777j" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.934104 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2mfl\" (UniqueName: \"kubernetes.io/projected/3b901edd-ff16-4b32-a22d-5e8feda7f19b-kube-api-access-t2mfl\") pod \"machine-config-server-w5ndz\" (UID: \"3b901edd-ff16-4b32-a22d-5e8feda7f19b\") " pod="openshift-machine-config-operator/machine-config-server-w5ndz" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.934134 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8628d57e-47fc-4269-b96f-7e04ebfd320d-auth-proxy-config\") pod \"machine-config-operator-74547568cd-wbchr\" (UID: \"8628d57e-47fc-4269-b96f-7e04ebfd320d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wbchr" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.934193 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2982d523-afe6-4ab4-9778-5dbe578a243b-registry-certificates\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.934222 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f64523a2-c2b2-4f8d-9a51-74a3a64f8ca4-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-r628t\" (UID: \"f64523a2-c2b2-4f8d-9a51-74a3a64f8ca4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r628t" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.934251 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f64523a2-c2b2-4f8d-9a51-74a3a64f8ca4-config\") pod \"kube-apiserver-operator-766d6c64bb-r628t\" (UID: \"f64523a2-c2b2-4f8d-9a51-74a3a64f8ca4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r628t" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.934278 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d66f8fa5-4d41-40e1-87aa-e2b08e132cbc-serving-cert\") pod \"console-operator-58897d9998-hszb5\" (UID: \"d66f8fa5-4d41-40e1-87aa-e2b08e132cbc\") " pod="openshift-console-operator/console-operator-58897d9998-hszb5" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.934306 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwhld\" (UniqueName: \"kubernetes.io/projected/cb5bb7ba-79c7-4251-8025-68e5c9997447-kube-api-access-pwhld\") pod \"marketplace-operator-79b997595-wdw2z\" (UID: \"cb5bb7ba-79c7-4251-8025-68e5c9997447\") " pod="openshift-marketplace/marketplace-operator-79b997595-wdw2z" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.937847 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/5e63df1c-39c0-4610-87dd-9772a75ddac9-machine-approver-tls\") pod \"machine-approver-56656f9798-rbkn2\" (UID: \"5e63df1c-39c0-4610-87dd-9772a75ddac9\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rbkn2" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.938195 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.938272 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2982d523-afe6-4ab4-9778-5dbe578a243b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.939075 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/91a19e46-bca3-43f1-a0f6-0d8805a405db-trusted-ca\") pod \"ingress-operator-5b745b69d9-5vx4v\" (UID: \"91a19e46-bca3-43f1-a0f6-0d8805a405db\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5vx4v" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.939865 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2714766b-33f3-4280-80c2-3e3c8cead5cd-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-8brbj\" (UID: \"2714766b-33f3-4280-80c2-3e3c8cead5cd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8brbj" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.940684 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/91a19e46-bca3-43f1-a0f6-0d8805a405db-metrics-tls\") pod \"ingress-operator-5b745b69d9-5vx4v\" (UID: \"91a19e46-bca3-43f1-a0f6-0d8805a405db\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5vx4v" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.941297 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-q777j\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q777j" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.942377 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-q777j\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q777j" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.944773 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8628d57e-47fc-4269-b96f-7e04ebfd320d-proxy-tls\") pod \"machine-config-operator-74547568cd-wbchr\" (UID: \"8628d57e-47fc-4269-b96f-7e04ebfd320d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wbchr" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.944854 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/ec276030-49bd-4751-93f2-456157bd157d-control-plane-machine-set-operator-tls\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-pbd2f\" (UID: \"ec276030-49bd-4751-93f2-456157bd157d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pbd2f" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.944881 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdgvh\" (UniqueName: \"kubernetes.io/projected/a478f682-ff2f-4920-b535-24b1675ce2c7-kube-api-access-bdgvh\") pod \"collect-profiles-29324925-rjxv2\" (UID: \"a478f682-ff2f-4920-b535-24b1675ce2c7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324925-rjxv2" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.945167 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-q777j\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q777j" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.945303 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/afbd04f3-97c3-46d0-8b5d-17c630f20f42-default-certificate\") pod \"router-default-5444994796-wlnbb\" (UID: \"afbd04f3-97c3-46d0-8b5d-17c630f20f42\") " pod="openshift-ingress/router-default-5444994796-wlnbb" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.945679 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1bd2edae-b62c-44cf-8973-bb8cf3c8ee7b-srv-cert\") pod \"catalog-operator-68c6474976-bfcs7\" (UID: \"1bd2edae-b62c-44cf-8973-bb8cf3c8ee7b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bfcs7" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.946068 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc49x\" (UniqueName: \"kubernetes.io/projected/91a19e46-bca3-43f1-a0f6-0d8805a405db-kube-api-access-xc49x\") pod \"ingress-operator-5b745b69d9-5vx4v\" (UID: \"91a19e46-bca3-43f1-a0f6-0d8805a405db\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5vx4v" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.946193 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/ed078080-50cf-44fb-ade6-16575106862c-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-5csg7\" (UID: \"ed078080-50cf-44fb-ade6-16575106862c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5csg7" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.946226 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/380dd699-f44d-4294-81c0-b26e75d00678-metrics-tls\") pod \"dns-default-v8m25\" (UID: \"380dd699-f44d-4294-81c0-b26e75d00678\") " pod="openshift-dns/dns-default-v8m25" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.946502 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cb5bb7ba-79c7-4251-8025-68e5c9997447-marketplace-trusted-ca\") pod 
\"marketplace-operator-79b997595-wdw2z\" (UID: \"cb5bb7ba-79c7-4251-8025-68e5c9997447\") " pod="openshift-marketplace/marketplace-operator-79b997595-wdw2z" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.946530 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1bd2edae-b62c-44cf-8973-bb8cf3c8ee7b-profile-collector-cert\") pod \"catalog-operator-68c6474976-bfcs7\" (UID: \"1bd2edae-b62c-44cf-8973-bb8cf3c8ee7b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bfcs7" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.946564 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2982d523-afe6-4ab4-9778-5dbe578a243b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.946617 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3e3f372c-8948-4d84-aee2-441d77e3201a-audit-policies\") pod \"oauth-openshift-558db77b4-q777j\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q777j" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.946661 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2edf0825-d4bd-4a22-a65f-be54b0502600-registration-dir\") pod \"csi-hostpathplugin-5hcj2\" (UID: \"2edf0825-d4bd-4a22-a65f-be54b0502600\") " pod="hostpath-provisioner/csi-hostpathplugin-5hcj2" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.947124 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2982d523-afe6-4ab4-9778-5dbe578a243b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.947338 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3e3f372c-8948-4d84-aee2-441d77e3201a-audit-policies\") pod \"oauth-openshift-558db77b4-q777j\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q777j" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.949121 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/afbd04f3-97c3-46d0-8b5d-17c630f20f42-stats-auth\") pod \"router-default-5444994796-wlnbb\" (UID: \"afbd04f3-97c3-46d0-8b5d-17c630f20f42\") " pod="openshift-ingress/router-default-5444994796-wlnbb" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.949499 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-q777j\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q777j" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.951774 4962 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-q777j\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q777j" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.952362 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-q777j\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q777j" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.956333 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2982d523-afe6-4ab4-9778-5dbe578a243b-registry-tls\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.957458 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/0d5189b2-b3f9-464a-b267-6e70a2687f99-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-gjzdl\" (UID: \"0d5189b2-b3f9-464a-b267-6e70a2687f99\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gjzdl" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.957588 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2982d523-afe6-4ab4-9778-5dbe578a243b-registry-certificates\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.965174 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f64523a2-c2b2-4f8d-9a51-74a3a64f8ca4-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-r628t\" (UID: \"f64523a2-c2b2-4f8d-9a51-74a3a64f8ca4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r628t" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.965895 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-q777j\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q777j" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.966084 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d66f8fa5-4d41-40e1-87aa-e2b08e132cbc-serving-cert\") pod \"console-operator-58897d9998-hszb5\" (UID: \"d66f8fa5-4d41-40e1-87aa-e2b08e132cbc\") " pod="openshift-console-operator/console-operator-58897d9998-hszb5" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.966319 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-q777j\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q777j" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.963733 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff0b79bc-bc68-4ce3-a3ec-bcd3e6b5b9fa-serving-cert\") pod \"authentication-operator-69f744f599-4z5v6\" (UID: \"ff0b79bc-bc68-4ce3-a3ec-bcd3e6b5b9fa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4z5v6" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.967356 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f64523a2-c2b2-4f8d-9a51-74a3a64f8ca4-config\") pod \"kube-apiserver-operator-766d6c64bb-r628t\" (UID: \"f64523a2-c2b2-4f8d-9a51-74a3a64f8ca4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r628t" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.972727 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/afbd04f3-97c3-46d0-8b5d-17c630f20f42-default-certificate\") pod \"router-default-5444994796-wlnbb\" (UID: \"afbd04f3-97c3-46d0-8b5d-17c630f20f42\") " pod="openshift-ingress/router-default-5444994796-wlnbb" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.973072 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2982d523-afe6-4ab4-9778-5dbe578a243b-trusted-ca\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.978450 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjfd2\" (UniqueName: \"kubernetes.io/projected/3e3f372c-8948-4d84-aee2-441d77e3201a-kube-api-access-tjfd2\") pod \"oauth-openshift-558db77b4-q777j\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q777j" Oct 03 12:51:48 crc kubenswrapper[4962]: I1003 12:51:48.983797 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d66f8fa5-4d41-40e1-87aa-e2b08e132cbc-trusted-ca\") pod \"console-operator-58897d9998-hszb5\" (UID: \"d66f8fa5-4d41-40e1-87aa-e2b08e132cbc\") " pod="openshift-console-operator/console-operator-58897d9998-hszb5" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.016921 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v5c9\" (UniqueName: \"kubernetes.io/projected/0d5189b2-b3f9-464a-b267-6e70a2687f99-kube-api-access-4v5c9\") pod \"cluster-image-registry-operator-dc59b4c8b-gjzdl\" (UID: \"0d5189b2-b3f9-464a-b267-6e70a2687f99\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gjzdl" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.020392 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f64523a2-c2b2-4f8d-9a51-74a3a64f8ca4-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-r628t\" (UID: \"f64523a2-c2b2-4f8d-9a51-74a3a64f8ca4\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r628t" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.020562 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-wgl5v"] Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.039221 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmb7d\" (UniqueName: \"kubernetes.io/projected/d66f8fa5-4d41-40e1-87aa-e2b08e132cbc-kube-api-access-xmb7d\") pod \"console-operator-58897d9998-hszb5\" (UID: \"d66f8fa5-4d41-40e1-87aa-e2b08e132cbc\") " pod="openshift-console-operator/console-operator-58897d9998-hszb5" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.071862 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.072123 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8628d57e-47fc-4269-b96f-7e04ebfd320d-auth-proxy-config\") pod \"machine-config-operator-74547568cd-wbchr\" (UID: \"8628d57e-47fc-4269-b96f-7e04ebfd320d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wbchr" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.072156 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwhld\" (UniqueName: \"kubernetes.io/projected/cb5bb7ba-79c7-4251-8025-68e5c9997447-kube-api-access-pwhld\") pod \"marketplace-operator-79b997595-wdw2z\" (UID: \"cb5bb7ba-79c7-4251-8025-68e5c9997447\") " pod="openshift-marketplace/marketplace-operator-79b997595-wdw2z" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.072196 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdgvh\" (UniqueName: \"kubernetes.io/projected/a478f682-ff2f-4920-b535-24b1675ce2c7-kube-api-access-bdgvh\") pod \"collect-profiles-29324925-rjxv2\" (UID: \"a478f682-ff2f-4920-b535-24b1675ce2c7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324925-rjxv2" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.072218 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8628d57e-47fc-4269-b96f-7e04ebfd320d-proxy-tls\") pod \"machine-config-operator-74547568cd-wbchr\" (UID: \"8628d57e-47fc-4269-b96f-7e04ebfd320d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wbchr" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.072243 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/ec276030-49bd-4751-93f2-456157bd157d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-pbd2f\" (UID: \"ec276030-49bd-4751-93f2-456157bd157d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pbd2f" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.072269 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1bd2edae-b62c-44cf-8973-bb8cf3c8ee7b-srv-cert\") pod 
\"catalog-operator-68c6474976-bfcs7\" (UID: \"1bd2edae-b62c-44cf-8973-bb8cf3c8ee7b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bfcs7" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.072298 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/ed078080-50cf-44fb-ade6-16575106862c-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-5csg7\" (UID: \"ed078080-50cf-44fb-ade6-16575106862c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5csg7" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.072322 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/380dd699-f44d-4294-81c0-b26e75d00678-metrics-tls\") pod \"dns-default-v8m25\" (UID: \"380dd699-f44d-4294-81c0-b26e75d00678\") " pod="openshift-dns/dns-default-v8m25" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.072356 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1bd2edae-b62c-44cf-8973-bb8cf3c8ee7b-profile-collector-cert\") pod \"catalog-operator-68c6474976-bfcs7\" (UID: \"1bd2edae-b62c-44cf-8973-bb8cf3c8ee7b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bfcs7" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.072387 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cb5bb7ba-79c7-4251-8025-68e5c9997447-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wdw2z\" (UID: \"cb5bb7ba-79c7-4251-8025-68e5c9997447\") " pod="openshift-marketplace/marketplace-operator-79b997595-wdw2z" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.072410 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2edf0825-d4bd-4a22-a65f-be54b0502600-registration-dir\") pod \"csi-hostpathplugin-5hcj2\" (UID: \"2edf0825-d4bd-4a22-a65f-be54b0502600\") " pod="hostpath-provisioner/csi-hostpathplugin-5hcj2" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.072430 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d9b871c7-1b7d-4b51-ae32-c179956c4de7-proxy-tls\") pod \"machine-config-controller-84d6567774-llhkr\" (UID: \"d9b871c7-1b7d-4b51-ae32-c179956c4de7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-llhkr" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.072450 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a156df7d-5cb7-4d30-b183-90c66b7f9009-webhook-cert\") pod \"packageserver-d55dfcdfc-hdljv\" (UID: \"a156df7d-5cb7-4d30-b183-90c66b7f9009\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hdljv" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.072471 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/2edf0825-d4bd-4a22-a65f-be54b0502600-mountpoint-dir\") pod \"csi-hostpathplugin-5hcj2\" (UID: \"2edf0825-d4bd-4a22-a65f-be54b0502600\") " pod="hostpath-provisioner/csi-hostpathplugin-5hcj2" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 
12:51:49.072500 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e381f38c-5571-4221-8661-2ea67e6e2a52-signing-cabundle\") pod \"service-ca-9c57cc56f-jfpvr\" (UID: \"e381f38c-5571-4221-8661-2ea67e6e2a52\") " pod="openshift-service-ca/service-ca-9c57cc56f-jfpvr" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.072535 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqk68\" (UniqueName: \"kubernetes.io/projected/2edf0825-d4bd-4a22-a65f-be54b0502600-kube-api-access-mqk68\") pod \"csi-hostpathplugin-5hcj2\" (UID: \"2edf0825-d4bd-4a22-a65f-be54b0502600\") " pod="hostpath-provisioner/csi-hostpathplugin-5hcj2" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.072558 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a156df7d-5cb7-4d30-b183-90c66b7f9009-apiservice-cert\") pod \"packageserver-d55dfcdfc-hdljv\" (UID: \"a156df7d-5cb7-4d30-b183-90c66b7f9009\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hdljv" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.072579 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd9cp\" (UniqueName: \"kubernetes.io/projected/380dd699-f44d-4294-81c0-b26e75d00678-kube-api-access-jd9cp\") pod \"dns-default-v8m25\" (UID: \"380dd699-f44d-4294-81c0-b26e75d00678\") " pod="openshift-dns/dns-default-v8m25" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.072606 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cb5bb7ba-79c7-4251-8025-68e5c9997447-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wdw2z\" (UID: \"cb5bb7ba-79c7-4251-8025-68e5c9997447\") " pod="openshift-marketplace/marketplace-operator-79b997595-wdw2z" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.072631 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvxrp\" (UniqueName: \"kubernetes.io/projected/44f09fd4-533a-4ed1-b31a-be8f976b2855-kube-api-access-nvxrp\") pod \"ingress-canary-r2b97\" (UID: \"44f09fd4-533a-4ed1-b31a-be8f976b2855\") " pod="openshift-ingress-canary/ingress-canary-r2b97" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.072672 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a478f682-ff2f-4920-b535-24b1675ce2c7-config-volume\") pod \"collect-profiles-29324925-rjxv2\" (UID: \"a478f682-ff2f-4920-b535-24b1675ce2c7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324925-rjxv2" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.072699 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8628d57e-47fc-4269-b96f-7e04ebfd320d-images\") pod \"machine-config-operator-74547568cd-wbchr\" (UID: \"8628d57e-47fc-4269-b96f-7e04ebfd320d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wbchr" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.072729 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz7pr\" (UniqueName: \"kubernetes.io/projected/c23b80bd-b6e3-4d28-814f-bf8ba17b9bbf-kube-api-access-bz7pr\") pod 
\"multus-admission-controller-857f4d67dd-vnd7f\" (UID: \"c23b80bd-b6e3-4d28-814f-bf8ba17b9bbf\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vnd7f" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.072754 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/a156df7d-5cb7-4d30-b183-90c66b7f9009-tmpfs\") pod \"packageserver-d55dfcdfc-hdljv\" (UID: \"a156df7d-5cb7-4d30-b183-90c66b7f9009\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hdljv" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.072774 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtgzp\" (UniqueName: \"kubernetes.io/projected/51b4dfee-5b09-40cb-a284-3d1d16c03cd3-kube-api-access-dtgzp\") pod \"kube-storage-version-migrator-operator-b67b599dd-ct99q\" (UID: \"51b4dfee-5b09-40cb-a284-3d1d16c03cd3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ct99q" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.072799 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51b4dfee-5b09-40cb-a284-3d1d16c03cd3-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-ct99q\" (UID: \"51b4dfee-5b09-40cb-a284-3d1d16c03cd3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ct99q" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.072839 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d9b871c7-1b7d-4b51-ae32-c179956c4de7-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-llhkr\" (UID: \"d9b871c7-1b7d-4b51-ae32-c179956c4de7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-llhkr" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.072862 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccmpk\" (UniqueName: \"kubernetes.io/projected/ec276030-49bd-4751-93f2-456157bd157d-kube-api-access-ccmpk\") pod \"control-plane-machine-set-operator-78cbb6b69f-pbd2f\" (UID: \"ec276030-49bd-4751-93f2-456157bd157d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pbd2f" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.072886 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/289b8912-7a01-4dc3-aec2-6082ddbe1698-srv-cert\") pod \"olm-operator-6b444d44fb-826w7\" (UID: \"289b8912-7a01-4dc3-aec2-6082ddbe1698\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-826w7" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.072913 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt2ft\" (UniqueName: \"kubernetes.io/projected/1bd2edae-b62c-44cf-8973-bb8cf3c8ee7b-kube-api-access-bt2ft\") pod \"catalog-operator-68c6474976-bfcs7\" (UID: \"1bd2edae-b62c-44cf-8973-bb8cf3c8ee7b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bfcs7" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.072949 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/380dd699-f44d-4294-81c0-b26e75d00678-config-volume\") pod 
\"dns-default-v8m25\" (UID: \"380dd699-f44d-4294-81c0-b26e75d00678\") " pod="openshift-dns/dns-default-v8m25" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.072972 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62126d09-dac2-4f55-9bdf-f07437e37b6f-serving-cert\") pod \"service-ca-operator-777779d784-b2vsf\" (UID: \"62126d09-dac2-4f55-9bdf-f07437e37b6f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b2vsf" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.072997 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/44f09fd4-533a-4ed1-b31a-be8f976b2855-cert\") pod \"ingress-canary-r2b97\" (UID: \"44f09fd4-533a-4ed1-b31a-be8f976b2855\") " pod="openshift-ingress-canary/ingress-canary-r2b97" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.073018 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c23b80bd-b6e3-4d28-814f-bf8ba17b9bbf-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-vnd7f\" (UID: \"c23b80bd-b6e3-4d28-814f-bf8ba17b9bbf\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vnd7f" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.073043 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxqkh\" (UniqueName: \"kubernetes.io/projected/ed078080-50cf-44fb-ade6-16575106862c-kube-api-access-wxqkh\") pod \"package-server-manager-789f6589d5-5csg7\" (UID: \"ed078080-50cf-44fb-ade6-16575106862c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5csg7" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.073066 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2wm2\" (UniqueName: \"kubernetes.io/projected/0c1ee82c-3d26-42f8-b083-d73f7e25448f-kube-api-access-m2wm2\") pod \"migrator-59844c95c7-xxhzg\" (UID: \"0c1ee82c-3d26-42f8-b083-d73f7e25448f\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xxhzg" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.073088 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/3b901edd-ff16-4b32-a22d-5e8feda7f19b-node-bootstrap-token\") pod \"machine-config-server-w5ndz\" (UID: \"3b901edd-ff16-4b32-a22d-5e8feda7f19b\") " pod="openshift-machine-config-operator/machine-config-server-w5ndz" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.073119 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8tgh\" (UniqueName: \"kubernetes.io/projected/8628d57e-47fc-4269-b96f-7e04ebfd320d-kube-api-access-x8tgh\") pod \"machine-config-operator-74547568cd-wbchr\" (UID: \"8628d57e-47fc-4269-b96f-7e04ebfd320d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wbchr" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.073152 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n42k\" (UniqueName: \"kubernetes.io/projected/289b8912-7a01-4dc3-aec2-6082ddbe1698-kube-api-access-6n42k\") pod \"olm-operator-6b444d44fb-826w7\" (UID: \"289b8912-7a01-4dc3-aec2-6082ddbe1698\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-826w7" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 
12:51:49.073183 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a478f682-ff2f-4920-b535-24b1675ce2c7-secret-volume\") pod \"collect-profiles-29324925-rjxv2\" (UID: \"a478f682-ff2f-4920-b535-24b1675ce2c7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324925-rjxv2" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.073206 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2edf0825-d4bd-4a22-a65f-be54b0502600-socket-dir\") pod \"csi-hostpathplugin-5hcj2\" (UID: \"2edf0825-d4bd-4a22-a65f-be54b0502600\") " pod="hostpath-provisioner/csi-hostpathplugin-5hcj2" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.073238 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51b4dfee-5b09-40cb-a284-3d1d16c03cd3-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-ct99q\" (UID: \"51b4dfee-5b09-40cb-a284-3d1d16c03cd3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ct99q" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.073260 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-947t2\" (UniqueName: \"kubernetes.io/projected/62126d09-dac2-4f55-9bdf-f07437e37b6f-kube-api-access-947t2\") pod \"service-ca-operator-777779d784-b2vsf\" (UID: \"62126d09-dac2-4f55-9bdf-f07437e37b6f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b2vsf" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.073284 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e381f38c-5571-4221-8661-2ea67e6e2a52-signing-key\") pod \"service-ca-9c57cc56f-jfpvr\" (UID: \"e381f38c-5571-4221-8661-2ea67e6e2a52\") " pod="openshift-service-ca/service-ca-9c57cc56f-jfpvr" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.073309 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzm62\" (UniqueName: \"kubernetes.io/projected/e381f38c-5571-4221-8661-2ea67e6e2a52-kube-api-access-hzm62\") pod \"service-ca-9c57cc56f-jfpvr\" (UID: \"e381f38c-5571-4221-8661-2ea67e6e2a52\") " pod="openshift-service-ca/service-ca-9c57cc56f-jfpvr" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.073331 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62126d09-dac2-4f55-9bdf-f07437e37b6f-config\") pod \"service-ca-operator-777779d784-b2vsf\" (UID: \"62126d09-dac2-4f55-9bdf-f07437e37b6f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b2vsf" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.073361 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/3b901edd-ff16-4b32-a22d-5e8feda7f19b-certs\") pod \"machine-config-server-w5ndz\" (UID: \"3b901edd-ff16-4b32-a22d-5e8feda7f19b\") " pod="openshift-machine-config-operator/machine-config-server-w5ndz" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.073384 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/2edf0825-d4bd-4a22-a65f-be54b0502600-csi-data-dir\") pod \"csi-hostpathplugin-5hcj2\" (UID: 
\"2edf0825-d4bd-4a22-a65f-be54b0502600\") " pod="hostpath-provisioner/csi-hostpathplugin-5hcj2" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.073412 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/289b8912-7a01-4dc3-aec2-6082ddbe1698-profile-collector-cert\") pod \"olm-operator-6b444d44fb-826w7\" (UID: \"289b8912-7a01-4dc3-aec2-6082ddbe1698\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-826w7" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.073436 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2sb4\" (UniqueName: \"kubernetes.io/projected/a156df7d-5cb7-4d30-b183-90c66b7f9009-kube-api-access-q2sb4\") pod \"packageserver-d55dfcdfc-hdljv\" (UID: \"a156df7d-5cb7-4d30-b183-90c66b7f9009\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hdljv" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.073479 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsbdh\" (UniqueName: \"kubernetes.io/projected/d9b871c7-1b7d-4b51-ae32-c179956c4de7-kube-api-access-qsbdh\") pod \"machine-config-controller-84d6567774-llhkr\" (UID: \"d9b871c7-1b7d-4b51-ae32-c179956c4de7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-llhkr" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.073505 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/2edf0825-d4bd-4a22-a65f-be54b0502600-plugins-dir\") pod \"csi-hostpathplugin-5hcj2\" (UID: \"2edf0825-d4bd-4a22-a65f-be54b0502600\") " pod="hostpath-provisioner/csi-hostpathplugin-5hcj2" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.073531 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2mfl\" (UniqueName: \"kubernetes.io/projected/3b901edd-ff16-4b32-a22d-5e8feda7f19b-kube-api-access-t2mfl\") pod \"machine-config-server-w5ndz\" (UID: \"3b901edd-ff16-4b32-a22d-5e8feda7f19b\") " pod="openshift-machine-config-operator/machine-config-server-w5ndz" Oct 03 12:51:49 crc kubenswrapper[4962]: E1003 12:51:49.073816 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:49.573797681 +0000 UTC m=+117.977695516 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.076167 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/2edf0825-d4bd-4a22-a65f-be54b0502600-mountpoint-dir\") pod \"csi-hostpathplugin-5hcj2\" (UID: \"2edf0825-d4bd-4a22-a65f-be54b0502600\") " pod="hostpath-provisioner/csi-hostpathplugin-5hcj2" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.077001 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e381f38c-5571-4221-8661-2ea67e6e2a52-signing-cabundle\") pod \"service-ca-9c57cc56f-jfpvr\" (UID: \"e381f38c-5571-4221-8661-2ea67e6e2a52\") " pod="openshift-service-ca/service-ca-9c57cc56f-jfpvr" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.077794 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2edf0825-d4bd-4a22-a65f-be54b0502600-socket-dir\") pod \"csi-hostpathplugin-5hcj2\" (UID: \"2edf0825-d4bd-4a22-a65f-be54b0502600\") " pod="hostpath-provisioner/csi-hostpathplugin-5hcj2" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.079418 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/380dd699-f44d-4294-81c0-b26e75d00678-config-volume\") pod \"dns-default-v8m25\" (UID: \"380dd699-f44d-4294-81c0-b26e75d00678\") " pod="openshift-dns/dns-default-v8m25" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.081410 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51b4dfee-5b09-40cb-a284-3d1d16c03cd3-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-ct99q\" (UID: \"51b4dfee-5b09-40cb-a284-3d1d16c03cd3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ct99q" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.081436 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/2edf0825-d4bd-4a22-a65f-be54b0502600-csi-data-dir\") pod \"csi-hostpathplugin-5hcj2\" (UID: \"2edf0825-d4bd-4a22-a65f-be54b0502600\") " pod="hostpath-provisioner/csi-hostpathplugin-5hcj2" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.081918 4962 util.go:30] "No sandbox for pod can be found. 
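Note on the E1003 nestedpendingoperations / UnmountVolume.TearDown entry a few lines up: the kubelet cannot build a CSI client for kubevirt.io.hostpath-provisioner because that driver has not yet announced itself on the kubelet's plugin-registration socket; the csi-hostpathplugin-5hcj2 pod, whose registration-dir, socket-dir, plugins-dir and csi-data-dir host-path volumes are being set up in the surrounding entries, is what eventually performs that registration. Until it does, every operation against pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 fails fast and is re-queued, which is why the log says "No retries permitted until ..." with durationBeforeRetry 500ms. A minimal Go sketch of that lookup-then-requeue shape (illustrative types and names, not kubelet's actual implementation):

    // Sketch only: a registry lookup that fails until the node plugin
    // registers, with the caller re-queueing the operation after a delay.
    package main

    import (
        "fmt"
        "sync"
        "time"
    )

    type driverRegistry struct {
        mu      sync.RWMutex
        drivers map[string]bool // filled in when a plugin registers on the kubelet socket
    }

    func (r *driverRegistry) lookup(name string) error {
        r.mu.RLock()
        defer r.mu.RUnlock()
        if !r.drivers[name] {
            return fmt.Errorf("driver name %s not found in the list of registered CSI drivers", name)
        }
        return nil
    }

    func main() {
        reg := &driverRegistry{drivers: map[string]bool{}}
        const backoff = 500 * time.Millisecond

        // Stand-in for the csi-hostpathplugin pod registering a moment later.
        go func() {
            time.Sleep(3 * backoff)
            reg.mu.Lock()
            reg.drivers["kubevirt.io.hostpath-provisioner"] = true
            reg.mu.Unlock()
        }()

        for {
            if err := reg.lookup("kubevirt.io.hostpath-provisioner"); err != nil {
                fmt.Printf("operation failed: %v; no retries permitted until %s (durationBeforeRetry %s)\n",
                    err, time.Now().Add(backoff).Format(time.RFC3339Nano), backoff)
                time.Sleep(backoff)
                continue
            }
            fmt.Println("driver registered; TearDown/MountDevice can proceed")
            return
        }
    }

The transient error resolves itself as soon as the driver pod registers, with no operator intervention; the fixed 500ms delay here simply keeps the kubelet from spinning on a driver that is still starting.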
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.083471 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d9b871c7-1b7d-4b51-ae32-c179956c4de7-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-llhkr\" (UID: \"d9b871c7-1b7d-4b51-ae32-c179956c4de7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-llhkr"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.083817 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/2edf0825-d4bd-4a22-a65f-be54b0502600-plugins-dir\") pod \"csi-hostpathplugin-5hcj2\" (UID: \"2edf0825-d4bd-4a22-a65f-be54b0502600\") " pod="hostpath-provisioner/csi-hostpathplugin-5hcj2"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.084622 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a478f682-ff2f-4920-b535-24b1675ce2c7-config-volume\") pod \"collect-profiles-29324925-rjxv2\" (UID: \"a478f682-ff2f-4920-b535-24b1675ce2c7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324925-rjxv2"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.086232 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/a156df7d-5cb7-4d30-b183-90c66b7f9009-tmpfs\") pod \"packageserver-d55dfcdfc-hdljv\" (UID: \"a156df7d-5cb7-4d30-b183-90c66b7f9009\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hdljv"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.087989 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8628d57e-47fc-4269-b96f-7e04ebfd320d-auth-proxy-config\") pod \"machine-config-operator-74547568cd-wbchr\" (UID: \"8628d57e-47fc-4269-b96f-7e04ebfd320d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wbchr"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.088266 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2edf0825-d4bd-4a22-a65f-be54b0502600-registration-dir\") pod \"csi-hostpathplugin-5hcj2\" (UID: \"2edf0825-d4bd-4a22-a65f-be54b0502600\") " pod="hostpath-provisioner/csi-hostpathplugin-5hcj2"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.089687 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cb5bb7ba-79c7-4251-8025-68e5c9997447-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wdw2z\" (UID: \"cb5bb7ba-79c7-4251-8025-68e5c9997447\") " pod="openshift-marketplace/marketplace-operator-79b997595-wdw2z"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.092219 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51b4dfee-5b09-40cb-a284-3d1d16c03cd3-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-ct99q\" (UID: \"51b4dfee-5b09-40cb-a284-3d1d16c03cd3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ct99q"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.093019 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62126d09-dac2-4f55-9bdf-f07437e37b6f-config\") pod \"service-ca-operator-777779d784-b2vsf\" (UID: \"62126d09-dac2-4f55-9bdf-f07437e37b6f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b2vsf"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.093446 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a478f682-ff2f-4920-b535-24b1675ce2c7-secret-volume\") pod \"collect-profiles-29324925-rjxv2\" (UID: \"a478f682-ff2f-4920-b535-24b1675ce2c7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324925-rjxv2"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.094576 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a156df7d-5cb7-4d30-b183-90c66b7f9009-apiservice-cert\") pod \"packageserver-d55dfcdfc-hdljv\" (UID: \"a156df7d-5cb7-4d30-b183-90c66b7f9009\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hdljv"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.097002 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ndncb"]
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.097056 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-hcwtn"]
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.098497 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnnwg"]
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.101116 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-xrdnq"]
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.113219 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/ec276030-49bd-4751-93f2-456157bd157d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-pbd2f\" (UID: \"ec276030-49bd-4751-93f2-456157bd157d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pbd2f"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.113548 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1bd2edae-b62c-44cf-8973-bb8cf3c8ee7b-srv-cert\") pod \"catalog-operator-68c6474976-bfcs7\" (UID: \"1bd2edae-b62c-44cf-8973-bb8cf3c8ee7b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bfcs7"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.113730 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cb5bb7ba-79c7-4251-8025-68e5c9997447-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wdw2z\" (UID: \"cb5bb7ba-79c7-4251-8025-68e5c9997447\") " pod="openshift-marketplace/marketplace-operator-79b997595-wdw2z"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.114058 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62126d09-dac2-4f55-9bdf-f07437e37b6f-serving-cert\") pod \"service-ca-operator-777779d784-b2vsf\" (UID: \"62126d09-dac2-4f55-9bdf-f07437e37b6f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b2vsf"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.114110 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a156df7d-5cb7-4d30-b183-90c66b7f9009-webhook-cert\") pod \"packageserver-d55dfcdfc-hdljv\" (UID: \"a156df7d-5cb7-4d30-b183-90c66b7f9009\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hdljv"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.114138 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/289b8912-7a01-4dc3-aec2-6082ddbe1698-srv-cert\") pod \"olm-operator-6b444d44fb-826w7\" (UID: \"289b8912-7a01-4dc3-aec2-6082ddbe1698\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-826w7"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.114134 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/289b8912-7a01-4dc3-aec2-6082ddbe1698-profile-collector-cert\") pod \"olm-operator-6b444d44fb-826w7\" (UID: \"289b8912-7a01-4dc3-aec2-6082ddbe1698\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-826w7"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.114387 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8628d57e-47fc-4269-b96f-7e04ebfd320d-images\") pod \"machine-config-operator-74547568cd-wbchr\" (UID: \"8628d57e-47fc-4269-b96f-7e04ebfd320d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wbchr"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.114679 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/3b901edd-ff16-4b32-a22d-5e8feda7f19b-certs\") pod \"machine-config-server-w5ndz\" (UID: \"3b901edd-ff16-4b32-a22d-5e8feda7f19b\") " pod="openshift-machine-config-operator/machine-config-server-w5ndz"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.118604 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/44f09fd4-533a-4ed1-b31a-be8f976b2855-cert\") pod \"ingress-canary-r2b97\" (UID: \"44f09fd4-533a-4ed1-b31a-be8f976b2855\") " pod="openshift-ingress-canary/ingress-canary-r2b97"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.118950 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r628t"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.123147 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c23b80bd-b6e3-4d28-814f-bf8ba17b9bbf-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-vnd7f\" (UID: \"c23b80bd-b6e3-4d28-814f-bf8ba17b9bbf\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vnd7f"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.124259 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8628d57e-47fc-4269-b96f-7e04ebfd320d-proxy-tls\") pod \"machine-config-operator-74547568cd-wbchr\" (UID: \"8628d57e-47fc-4269-b96f-7e04ebfd320d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wbchr"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.125244 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/ed078080-50cf-44fb-ade6-16575106862c-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-5csg7\" (UID: \"ed078080-50cf-44fb-ade6-16575106862c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5csg7"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.125602 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-hszb5"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.131013 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/3b901edd-ff16-4b32-a22d-5e8feda7f19b-node-bootstrap-token\") pod \"machine-config-server-w5ndz\" (UID: \"3b901edd-ff16-4b32-a22d-5e8feda7f19b\") " pod="openshift-machine-config-operator/machine-config-server-w5ndz"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.133293 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1bd2edae-b62c-44cf-8973-bb8cf3c8ee7b-profile-collector-cert\") pod \"catalog-operator-68c6474976-bfcs7\" (UID: \"1bd2edae-b62c-44cf-8973-bb8cf3c8ee7b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bfcs7"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.133437 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d9b871c7-1b7d-4b51-ae32-c179956c4de7-proxy-tls\") pod \"machine-config-controller-84d6567774-llhkr\" (UID: \"d9b871c7-1b7d-4b51-ae32-c179956c4de7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-llhkr"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.135011 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e381f38c-5571-4221-8661-2ea67e6e2a52-signing-key\") pod \"service-ca-9c57cc56f-jfpvr\" (UID: \"e381f38c-5571-4221-8661-2ea67e6e2a52\") " pod="openshift-service-ca/service-ca-9c57cc56f-jfpvr"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.135152 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/380dd699-f44d-4294-81c0-b26e75d00678-metrics-tls\") pod \"dns-default-v8m25\" (UID: \"380dd699-f44d-4294-81c0-b26e75d00678\") " pod="openshift-dns/dns-default-v8m25"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.135518 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2982d523-afe6-4ab4-9778-5dbe578a243b-bound-sa-token\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.135703 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0d5189b2-b3f9-464a-b267-6e70a2687f99-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-gjzdl\" (UID: \"0d5189b2-b3f9-464a-b267-6e70a2687f99\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gjzdl"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.136010 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/91a19e46-bca3-43f1-a0f6-0d8805a405db-bound-sa-token\") pod \"ingress-operator-5b745b69d9-5vx4v\" (UID: \"91a19e46-bca3-43f1-a0f6-0d8805a405db\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5vx4v"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.142937 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nll5\" (UniqueName: \"kubernetes.io/projected/398018f7-8c31-40f9-bd6a-170564176a58-kube-api-access-9nll5\") pod \"route-controller-manager-6576b87f9c-tvdkb\" (UID: \"398018f7-8c31-40f9-bd6a-170564176a58\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdkb"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.149375 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfj7w\" (UniqueName: \"kubernetes.io/projected/5e63df1c-39c0-4610-87dd-9772a75ddac9-kube-api-access-nfj7w\") pod \"machine-approver-56656f9798-rbkn2\" (UID: \"5e63df1c-39c0-4610-87dd-9772a75ddac9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rbkn2"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.158483 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqpj6\" (UniqueName: \"kubernetes.io/projected/afbd04f3-97c3-46d0-8b5d-17c630f20f42-kube-api-access-rqpj6\") pod \"router-default-5444994796-wlnbb\" (UID: \"afbd04f3-97c3-46d0-8b5d-17c630f20f42\") " pod="openshift-ingress/router-default-5444994796-wlnbb"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.174270 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-t4sln"]
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.184788 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8"
Oct 03 12:51:49 crc kubenswrapper[4962]: E1003 12:51:49.185289 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:49.685272611 +0000 UTC m=+118.089170446 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.191518 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m42qn\" (UniqueName: \"kubernetes.io/projected/2982d523-afe6-4ab4-9778-5dbe578a243b-kube-api-access-m42qn\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.203410 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzckp\" (UniqueName: \"kubernetes.io/projected/2714766b-33f3-4280-80c2-3e3c8cead5cd-kube-api-access-vzckp\") pod \"cluster-samples-operator-665b6dd947-8brbj\" (UID: \"2714766b-33f3-4280-80c2-3e3c8cead5cd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8brbj"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.235794 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc49x\" (UniqueName: \"kubernetes.io/projected/91a19e46-bca3-43f1-a0f6-0d8805a405db-kube-api-access-xc49x\") pod \"ingress-operator-5b745b69d9-5vx4v\" (UID: \"91a19e46-bca3-43f1-a0f6-0d8805a405db\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5vx4v"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.242661 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbf2z\" (UniqueName: \"kubernetes.io/projected/ff0b79bc-bc68-4ce3-a3ec-bcd3e6b5b9fa-kube-api-access-sbf2z\") pod \"authentication-operator-69f744f599-4z5v6\" (UID: \"ff0b79bc-bc68-4ce3-a3ec-bcd3e6b5b9fa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4z5v6"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.250652 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gjzdl"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.286295 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 12:51:49 crc kubenswrapper[4962]: E1003 12:51:49.286772 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:49.786757326 +0000 UTC m=+118.190655161 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
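The E1003 MountVolume.MountDevice failure above is the mount-side twin of the teardown failures: MountDevice (the CSI NodeStageVolume step for the image-registry PVC) also needs a CSI client for kubevirt.io.hostpath-provisioner, and it is re-queued on the same 500ms cadence until the driver registers. One way to confirm from the API side whether the node has accepted a driver registration is to read the CSINode object, which the kubelet updates as plugins register. A minimal client-go sketch (an assumption for illustration: it presumes a reachable kubeconfig; the node name "crc" is taken from this log):

    // Sketch: list the CSI drivers the node "crc" has registered.
    package main

    import (
        "context"
        "fmt"
        "log"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
        if err != nil {
            log.Fatal(err)
        }
        cs, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            log.Fatal(err)
        }
        // The CSINode spec mirrors what the kubelet has accepted over its
        // plugin-registration socket.
        n, err := cs.StorageV1().CSINodes().Get(context.Background(), "crc", metav1.GetOptions{})
        if err != nil {
            log.Fatal(err)
        }
        for _, d := range n.Spec.Drivers {
            fmt.Println("registered CSI driver:", d.Name)
        }
    }

An empty spec.drivers list while csi-hostpathplugin-5hcj2 is still starting would be consistent with the failures in this log; once kubevirt.io.hostpath-provisioner appears there, the retried MountDevice and TearDown operations go through.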
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.296945 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2mfl\" (UniqueName: \"kubernetes.io/projected/3b901edd-ff16-4b32-a22d-5e8feda7f19b-kube-api-access-t2mfl\") pod \"machine-config-server-w5ndz\" (UID: \"3b901edd-ff16-4b32-a22d-5e8feda7f19b\") " pod="openshift-machine-config-operator/machine-config-server-w5ndz"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.305002 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccmpk\" (UniqueName: \"kubernetes.io/projected/ec276030-49bd-4751-93f2-456157bd157d-kube-api-access-ccmpk\") pod \"control-plane-machine-set-operator-78cbb6b69f-pbd2f\" (UID: \"ec276030-49bd-4751-93f2-456157bd157d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pbd2f"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.335462 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt2ft\" (UniqueName: \"kubernetes.io/projected/1bd2edae-b62c-44cf-8973-bb8cf3c8ee7b-kube-api-access-bt2ft\") pod \"catalog-operator-68c6474976-bfcs7\" (UID: \"1bd2edae-b62c-44cf-8973-bb8cf3c8ee7b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bfcs7"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.336142 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqk68\" (UniqueName: \"kubernetes.io/projected/2edf0825-d4bd-4a22-a65f-be54b0502600-kube-api-access-mqk68\") pod \"csi-hostpathplugin-5hcj2\" (UID: \"2edf0825-d4bd-4a22-a65f-be54b0502600\") " pod="hostpath-provisioner/csi-hostpathplugin-5hcj2"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.342370 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rbkn2"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.342388 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-w5ndz"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.349413 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdkb"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.354591 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2wm2\" (UniqueName: \"kubernetes.io/projected/0c1ee82c-3d26-42f8-b083-d73f7e25448f-kube-api-access-m2wm2\") pod \"migrator-59844c95c7-xxhzg\" (UID: \"0c1ee82c-3d26-42f8-b083-d73f7e25448f\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xxhzg"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.359453 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8brbj"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.375987 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2sb4\" (UniqueName: \"kubernetes.io/projected/a156df7d-5cb7-4d30-b183-90c66b7f9009-kube-api-access-q2sb4\") pod \"packageserver-d55dfcdfc-hdljv\" (UID: \"a156df7d-5cb7-4d30-b183-90c66b7f9009\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hdljv"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.387980 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8"
Oct 03 12:51:49 crc kubenswrapper[4962]: E1003 12:51:49.388407 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:49.888391785 +0000 UTC m=+118.292289620 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.390510 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsbdh\" (UniqueName: \"kubernetes.io/projected/d9b871c7-1b7d-4b51-ae32-c179956c4de7-kube-api-access-qsbdh\") pod \"machine-config-controller-84d6567774-llhkr\" (UID: \"d9b871c7-1b7d-4b51-ae32-c179956c4de7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-llhkr"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.401852 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-4z5v6"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.415158 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz7pr\" (UniqueName: \"kubernetes.io/projected/c23b80bd-b6e3-4d28-814f-bf8ba17b9bbf-kube-api-access-bz7pr\") pod \"multus-admission-controller-857f4d67dd-vnd7f\" (UID: \"c23b80bd-b6e3-4d28-814f-bf8ba17b9bbf\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vnd7f"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.420808 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x4nlh"]
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.421931 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-sd8km"]
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.450126 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-wlnbb"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.457004 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtgzp\" (UniqueName: \"kubernetes.io/projected/51b4dfee-5b09-40cb-a284-3d1d16c03cd3-kube-api-access-dtgzp\") pod \"kube-storage-version-migrator-operator-b67b599dd-ct99q\" (UID: \"51b4dfee-5b09-40cb-a284-3d1d16c03cd3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ct99q"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.459779 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5vx4v"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.466817 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd9cp\" (UniqueName: \"kubernetes.io/projected/380dd699-f44d-4294-81c0-b26e75d00678-kube-api-access-jd9cp\") pod \"dns-default-v8m25\" (UID: \"380dd699-f44d-4294-81c0-b26e75d00678\") " pod="openshift-dns/dns-default-v8m25"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.489177 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 12:51:49 crc kubenswrapper[4962]: E1003 12:51:49.489622 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:49.989605883 +0000 UTC m=+118.393503718 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.501009 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-947t2\" (UniqueName: \"kubernetes.io/projected/62126d09-dac2-4f55-9bdf-f07437e37b6f-kube-api-access-947t2\") pod \"service-ca-operator-777779d784-b2vsf\" (UID: \"62126d09-dac2-4f55-9bdf-f07437e37b6f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b2vsf"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.513068 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ct99q"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.529931 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-q777j"]
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.530288 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pbd2f"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.536673 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvxrp\" (UniqueName: \"kubernetes.io/projected/44f09fd4-533a-4ed1-b31a-be8f976b2855-kube-api-access-nvxrp\") pod \"ingress-canary-r2b97\" (UID: \"44f09fd4-533a-4ed1-b31a-be8f976b2855\") " pod="openshift-ingress-canary/ingress-canary-r2b97"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.545937 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hdljv"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.561317 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdgvh\" (UniqueName: \"kubernetes.io/projected/a478f682-ff2f-4920-b535-24b1675ce2c7-kube-api-access-bdgvh\") pod \"collect-profiles-29324925-rjxv2\" (UID: \"a478f682-ff2f-4920-b535-24b1675ce2c7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324925-rjxv2"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.563206 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bfcs7"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.566881 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-llhkr"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.580605 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwhld\" (UniqueName: \"kubernetes.io/projected/cb5bb7ba-79c7-4251-8025-68e5c9997447-kube-api-access-pwhld\") pod \"marketplace-operator-79b997595-wdw2z\" (UID: \"cb5bb7ba-79c7-4251-8025-68e5c9997447\") " pod="openshift-marketplace/marketplace-operator-79b997595-wdw2z"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.580785 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8tgh\" (UniqueName: \"kubernetes.io/projected/8628d57e-47fc-4269-b96f-7e04ebfd320d-kube-api-access-x8tgh\") pod \"machine-config-operator-74547568cd-wbchr\" (UID: \"8628d57e-47fc-4269-b96f-7e04ebfd320d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wbchr"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.583966 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-vnd7f"
Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.589343 4962 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wbchr" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.590735 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:49 crc kubenswrapper[4962]: E1003 12:51:49.591318 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:50.091297344 +0000 UTC m=+118.495195179 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.602152 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-r2b97" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.605999 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxqkh\" (UniqueName: \"kubernetes.io/projected/ed078080-50cf-44fb-ade6-16575106862c-kube-api-access-wxqkh\") pod \"package-server-manager-789f6589d5-5csg7\" (UID: \"ed078080-50cf-44fb-ade6-16575106862c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5csg7" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.606676 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xxhzg" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.608429 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n42k\" (UniqueName: \"kubernetes.io/projected/289b8912-7a01-4dc3-aec2-6082ddbe1698-kube-api-access-6n42k\") pod \"olm-operator-6b444d44fb-826w7\" (UID: \"289b8912-7a01-4dc3-aec2-6082ddbe1698\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-826w7" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.611420 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-b2vsf" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.643360 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-v8m25" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.645121 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-5hcj2" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.645685 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-hszb5"] Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.655860 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r628t"] Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.662481 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzm62\" (UniqueName: \"kubernetes.io/projected/e381f38c-5571-4221-8661-2ea67e6e2a52-kube-api-access-hzm62\") pod \"service-ca-9c57cc56f-jfpvr\" (UID: \"e381f38c-5571-4221-8661-2ea67e6e2a52\") " pod="openshift-service-ca/service-ca-9c57cc56f-jfpvr" Oct 03 12:51:49 crc kubenswrapper[4962]: E1003 12:51:49.694289 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:50.19426767 +0000 UTC m=+118.598165505 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.693974 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.695928 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:49 crc kubenswrapper[4962]: E1003 12:51:49.696489 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:50.19644916 +0000 UTC m=+118.600346995 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:49 crc kubenswrapper[4962]: W1003 12:51:49.722168 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e3f372c_8948_4d84_aee2_441d77e3201a.slice/crio-381397d9330b258a906bab5f805296f4bc8d365f616c2793f30b75763fe2bf6c WatchSource:0}: Error finding container 381397d9330b258a906bab5f805296f4bc8d365f616c2793f30b75763fe2bf6c: Status 404 returned error can't find the container with id 381397d9330b258a906bab5f805296f4bc8d365f616c2793f30b75763fe2bf6c Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.793210 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-gtf6m" event={"ID":"a5971c52-f20d-40a2-9e80-e1c02e83cec0","Type":"ContainerStarted","Data":"65775113e7e65b47fd0d5f9b736fbbb3083ee33d817af884276f4d64681b2d13"} Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.793256 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-gtf6m" event={"ID":"a5971c52-f20d-40a2-9e80-e1c02e83cec0","Type":"ContainerStarted","Data":"ab105f069588956fd12f5ff29b2c1578b50e1ae33b295d37c737f7aa12f6c87b"} Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.796558 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.796863 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-gtf6m" Oct 03 12:51:49 crc kubenswrapper[4962]: E1003 12:51:49.797951 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:50.297926635 +0000 UTC m=+118.701824470 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.800778 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:49 crc kubenswrapper[4962]: E1003 12:51:49.801170 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:50.301158804 +0000 UTC m=+118.705056639 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.801534 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-wgl5v" event={"ID":"796d19ea-1d92-4dcb-9e10-305ddbe1b283","Type":"ContainerStarted","Data":"0e0828f0e2c4ceb4ef205cd46fcd15c3cf85f7e0bdacfdafa36921b76ec186d8"} Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.801576 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-wgl5v" event={"ID":"796d19ea-1d92-4dcb-9e10-305ddbe1b283","Type":"ContainerStarted","Data":"b39998ea4c7a9f15afd415f99d1994c5c178eb3bce843b09b94dc9f71581ed88"} Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.802497 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-jfpvr" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.806016 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-w5ndz" event={"ID":"3b901edd-ff16-4b32-a22d-5e8feda7f19b","Type":"ContainerStarted","Data":"8a2ea6b7025d93457f3808ec4d386feb2a5a8b86add9b7a161ebd03ece977d7e"} Oct 03 12:51:49 crc kubenswrapper[4962]: W1003 12:51:49.811166 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf64523a2_c2b2_4f8d_9a51_74a3a64f8ca4.slice/crio-993c69903bf3cdaa9fae8b984e51b6a17781de15b7747551230a04e37d40998c WatchSource:0}: Error finding container 993c69903bf3cdaa9fae8b984e51b6a17781de15b7747551230a04e37d40998c: Status 404 returned error can't find the container with id 993c69903bf3cdaa9fae8b984e51b6a17781de15b7747551230a04e37d40998c Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.820221 4962 patch_prober.go:28] interesting pod/downloads-7954f5f757-gtf6m container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.820280 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-gtf6m" podUID="a5971c52-f20d-40a2-9e80-e1c02e83cec0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.835508 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-826w7" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.840876 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324925-rjxv2" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.852921 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wdw2z" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.857725 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-q777j" event={"ID":"3e3f372c-8948-4d84-aee2-441d77e3201a","Type":"ContainerStarted","Data":"381397d9330b258a906bab5f805296f4bc8d365f616c2793f30b75763fe2bf6c"} Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.862022 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-t4sln" event={"ID":"d6b25ec8-0fa2-4d8e-81e3-51e15eee578a","Type":"ContainerStarted","Data":"a1d3c89385631adfb76470aa7dc0197bfc741efe5728d61500ab665ae9602540"} Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.863729 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-sd8km" event={"ID":"df5a2d1b-1d06-4201-8546-1a6f67a2511e","Type":"ContainerStarted","Data":"82b77f4435f8d2f760373372201c8ac98c55feaee7b67be7005713441db75b68"} Oct 03 12:51:49 crc kubenswrapper[4962]: W1003 12:51:49.870706 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e63df1c_39c0_4610_87dd_9772a75ddac9.slice/crio-57482205eb5b5b36efcb6a448e14b8d5de220d91cecd53efcfcf8fac15bfc87d WatchSource:0}: Error finding container 57482205eb5b5b36efcb6a448e14b8d5de220d91cecd53efcfcf8fac15bfc87d: Status 404 returned error can't find the container with id 57482205eb5b5b36efcb6a448e14b8d5de220d91cecd53efcfcf8fac15bfc87d Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.874456 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5csg7" Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.897804 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-s2sdt" event={"ID":"7b83dbab-ab28-4769-b812-be82be9db67e","Type":"ContainerStarted","Data":"d14f3030b6236d8972a9f3edab5c558a5c52e15d77ae48b5c3bd7741a59c5ce8"} Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.902368 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:49 crc kubenswrapper[4962]: E1003 12:51:49.903480 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:50.403436261 +0000 UTC m=+118.807334106 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.932129 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x4nlh" event={"ID":"693d9472-5b36-4df8-a6fb-d3e3aece0cdb","Type":"ContainerStarted","Data":"0fa6fb1c41779dbae342b3c2724a4ab13f1a7530cfcf6c12c2e37060aec5e28c"} Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.948264 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrdnq" event={"ID":"994f46cf-ed06-420d-a2fd-52547aadd0ce","Type":"ContainerStarted","Data":"aa1c8953d2d4885b24cf296d93a89f0262e6233f967251bdc82329f8f993be60"} Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.963066 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ndncb" event={"ID":"bf3d2489-9ec1-4479-9f8c-a519812581f8","Type":"ContainerStarted","Data":"b0fbca94698633db2daa697af6b6820fefc6e48eefccb6a72a2bd822e49fb765"} Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.964659 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnnwg" event={"ID":"bb48dfa2-82c3-4c23-b66d-77ac0166326f","Type":"ContainerStarted","Data":"d880309186b6ffcd86762de1a51de4156c31f093028fc4b665eadf132c1919fc"} Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.967329 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-hcwtn" event={"ID":"003637f9-ebbe-4587-bdf0-071bfca642dd","Type":"ContainerStarted","Data":"02424a943c73ebda1e3de77e43546b0fc213691ff280a896a48fdcd8e5a7afe9"} Oct 03 12:51:49 crc kubenswrapper[4962]: I1003 12:51:49.992782 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-p5zwj" Oct 03 12:51:50 crc kubenswrapper[4962]: I1003 12:51:50.004738 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:50 crc kubenswrapper[4962]: E1003 12:51:50.005358 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:50.505342368 +0000 UTC m=+118.909240203 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:50 crc kubenswrapper[4962]: I1003 12:51:50.061024 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gjzdl"] Oct 03 12:51:50 crc kubenswrapper[4962]: I1003 12:51:50.076557 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8brbj"] Oct 03 12:51:50 crc kubenswrapper[4962]: I1003 12:51:50.108237 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:50 crc kubenswrapper[4962]: E1003 12:51:50.111136 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:50.611111971 +0000 UTC m=+119.015009806 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:50 crc kubenswrapper[4962]: I1003 12:51:50.177186 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdkb"] Oct 03 12:51:50 crc kubenswrapper[4962]: I1003 12:51:50.209731 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:50 crc kubenswrapper[4962]: E1003 12:51:50.211498 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:50.711482755 +0000 UTC m=+119.115380590 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:50 crc kubenswrapper[4962]: I1003 12:51:50.268697 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-gtf6m" podStartSLOduration=97.268676875 podStartE2EDuration="1m37.268676875s" podCreationTimestamp="2025-10-03 12:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:50.237094368 +0000 UTC m=+118.640992213" watchObservedRunningTime="2025-10-03 12:51:50.268676875 +0000 UTC m=+118.672574710" Oct 03 12:51:50 crc kubenswrapper[4962]: I1003 12:51:50.280738 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-4z5v6"] Oct 03 12:51:50 crc kubenswrapper[4962]: I1003 12:51:50.312824 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:50 crc kubenswrapper[4962]: E1003 12:51:50.313365 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:50.813347801 +0000 UTC m=+119.217245626 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:50 crc kubenswrapper[4962]: I1003 12:51:50.390759 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-5vx4v"] Oct 03 12:51:50 crc kubenswrapper[4962]: W1003 12:51:50.396775 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod398018f7_8c31_40f9_bd6a_170564176a58.slice/crio-960510cddc8a8ebed6a70ad7dc3bc460d3790f0a06fc6157a6807cd6c945951d WatchSource:0}: Error finding container 960510cddc8a8ebed6a70ad7dc3bc460d3790f0a06fc6157a6807cd6c945951d: Status 404 returned error can't find the container with id 960510cddc8a8ebed6a70ad7dc3bc460d3790f0a06fc6157a6807cd6c945951d Oct 03 12:51:50 crc kubenswrapper[4962]: I1003 12:51:50.414849 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:50 crc kubenswrapper[4962]: E1003 12:51:50.415398 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:50.915378422 +0000 UTC m=+119.319276347 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:50 crc kubenswrapper[4962]: W1003 12:51:50.417856 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff0b79bc_bc68_4ce3_a3ec_bcd3e6b5b9fa.slice/crio-1d8b8405ab27bf7a19f0185976d4506e5ab9a99c3ddd2534cc706cef579d5ebc WatchSource:0}: Error finding container 1d8b8405ab27bf7a19f0185976d4506e5ab9a99c3ddd2534cc706cef579d5ebc: Status 404 returned error can't find the container with id 1d8b8405ab27bf7a19f0185976d4506e5ab9a99c3ddd2534cc706cef579d5ebc Oct 03 12:51:50 crc kubenswrapper[4962]: I1003 12:51:50.485910 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-llhkr"] Oct 03 12:51:50 crc kubenswrapper[4962]: I1003 12:51:50.516205 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:50 crc kubenswrapper[4962]: E1003 12:51:50.516486 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:51.016387504 +0000 UTC m=+119.420285339 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:50 crc kubenswrapper[4962]: I1003 12:51:50.516626 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:50 crc kubenswrapper[4962]: E1003 12:51:50.516894 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:51.016881857 +0000 UTC m=+119.420779692 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:50 crc kubenswrapper[4962]: I1003 12:51:50.555305 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pkncv" podStartSLOduration=97.555289631 podStartE2EDuration="1m37.555289631s" podCreationTimestamp="2025-10-03 12:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:50.553232015 +0000 UTC m=+118.957129850" watchObservedRunningTime="2025-10-03 12:51:50.555289631 +0000 UTC m=+118.959187466" Oct 03 12:51:50 crc kubenswrapper[4962]: I1003 12:51:50.618806 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:50 crc kubenswrapper[4962]: E1003 12:51:50.619074 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:51.119060912 +0000 UTC m=+119.522958747 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:50 crc kubenswrapper[4962]: W1003 12:51:50.678561 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9b871c7_1b7d_4b51_ae32_c179956c4de7.slice/crio-75ba454def70467cbfc50dc21589d14ea7fbb2bc430fcdc90b7a48320b6bbf82 WatchSource:0}: Error finding container 75ba454def70467cbfc50dc21589d14ea7fbb2bc430fcdc90b7a48320b6bbf82: Status 404 returned error can't find the container with id 75ba454def70467cbfc50dc21589d14ea7fbb2bc430fcdc90b7a48320b6bbf82 Oct 03 12:51:50 crc kubenswrapper[4962]: I1003 12:51:50.722835 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:50 crc kubenswrapper[4962]: E1003 12:51:50.723168 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-03 12:51:51.223154419 +0000 UTC m=+119.627052254 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:50 crc kubenswrapper[4962]: I1003 12:51:50.782596 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-wgl5v" podStartSLOduration=97.782575259 podStartE2EDuration="1m37.782575259s" podCreationTimestamp="2025-10-03 12:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:50.778135487 +0000 UTC m=+119.182033332" watchObservedRunningTime="2025-10-03 12:51:50.782575259 +0000 UTC m=+119.186473094" Oct 03 12:51:50 crc kubenswrapper[4962]: I1003 12:51:50.830325 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:50 crc kubenswrapper[4962]: E1003 12:51:50.831155 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:51.331136762 +0000 UTC m=+119.735034597 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:50 crc kubenswrapper[4962]: I1003 12:51:50.936443 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:50 crc kubenswrapper[4962]: E1003 12:51:50.936818 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:51.436806652 +0000 UTC m=+119.840704487 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:50 crc kubenswrapper[4962]: I1003 12:51:50.937624 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-5zs6v" podStartSLOduration=97.937605504 podStartE2EDuration="1m37.937605504s" podCreationTimestamp="2025-10-03 12:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:50.899778776 +0000 UTC m=+119.303676691" watchObservedRunningTime="2025-10-03 12:51:50.937605504 +0000 UTC m=+119.341503349" Oct 03 12:51:50 crc kubenswrapper[4962]: I1003 12:51:50.996759 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-p5zwj" podStartSLOduration=97.996742207 podStartE2EDuration="1m37.996742207s" podCreationTimestamp="2025-10-03 12:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:50.993889939 +0000 UTC m=+119.397787774" watchObservedRunningTime="2025-10-03 12:51:50.996742207 +0000 UTC m=+119.400640032" Oct 03 12:51:51 crc kubenswrapper[4962]: I1003 12:51:51.012427 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rbkn2" event={"ID":"5e63df1c-39c0-4610-87dd-9772a75ddac9","Type":"ContainerStarted","Data":"57482205eb5b5b36efcb6a448e14b8d5de220d91cecd53efcfcf8fac15bfc87d"} Oct 03 12:51:51 crc kubenswrapper[4962]: I1003 12:51:51.020175 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-4z5v6" event={"ID":"ff0b79bc-bc68-4ce3-a3ec-bcd3e6b5b9fa","Type":"ContainerStarted","Data":"1d8b8405ab27bf7a19f0185976d4506e5ab9a99c3ddd2534cc706cef579d5ebc"} Oct 03 12:51:51 crc kubenswrapper[4962]: I1003 12:51:51.025922 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-hszb5" event={"ID":"d66f8fa5-4d41-40e1-87aa-e2b08e132cbc","Type":"ContainerStarted","Data":"9d07bf005780bef664e57c7cabf7c6becffd7a3866ad3d5f648095d794d10c34"} Oct 03 12:51:51 crc kubenswrapper[4962]: I1003 12:51:51.040529 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:51 crc kubenswrapper[4962]: E1003 12:51:51.040958 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:51.54094131 +0000 UTC m=+119.944839145 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:51 crc kubenswrapper[4962]: I1003 12:51:51.098014 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-hcwtn" event={"ID":"003637f9-ebbe-4587-bdf0-071bfca642dd","Type":"ContainerStarted","Data":"41b0b1121426564d86cd474ec0ecbeef04bafc564925248a365c8f709787b713"} Oct 03 12:51:51 crc kubenswrapper[4962]: I1003 12:51:51.101269 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r628t" event={"ID":"f64523a2-c2b2-4f8d-9a51-74a3a64f8ca4","Type":"ContainerStarted","Data":"993c69903bf3cdaa9fae8b984e51b6a17781de15b7747551230a04e37d40998c"} Oct 03 12:51:51 crc kubenswrapper[4962]: I1003 12:51:51.113774 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-s2sdt" event={"ID":"7b83dbab-ab28-4769-b812-be82be9db67e","Type":"ContainerStarted","Data":"29b91bc3c7c82cfba1c8608916fecc02fb9ceb1d82fe584a32e21f6052d06197"} Oct 03 12:51:51 crc kubenswrapper[4962]: I1003 12:51:51.116556 4962 generic.go:334] "Generic (PLEG): container finished" podID="994f46cf-ed06-420d-a2fd-52547aadd0ce" containerID="66d5334066fe7a4d89f2dd22bf9025b825ca5891fea067da3a5ea566d29fd50b" exitCode=0 Oct 03 12:51:51 crc kubenswrapper[4962]: I1003 12:51:51.117466 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrdnq" event={"ID":"994f46cf-ed06-420d-a2fd-52547aadd0ce","Type":"ContainerDied","Data":"66d5334066fe7a4d89f2dd22bf9025b825ca5891fea067da3a5ea566d29fd50b"} Oct 03 12:51:51 crc kubenswrapper[4962]: I1003 12:51:51.132827 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-wlnbb" event={"ID":"afbd04f3-97c3-46d0-8b5d-17c630f20f42","Type":"ContainerStarted","Data":"d2bac61a9fc8e634a06a47f42a617e140a9268e7f89d5d0e6f88575afb5d0c71"} Oct 03 12:51:51 crc kubenswrapper[4962]: I1003 12:51:51.134445 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdkb" event={"ID":"398018f7-8c31-40f9-bd6a-170564176a58","Type":"ContainerStarted","Data":"960510cddc8a8ebed6a70ad7dc3bc460d3790f0a06fc6157a6807cd6c945951d"} Oct 03 12:51:51 crc kubenswrapper[4962]: I1003 12:51:51.144393 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:51 crc kubenswrapper[4962]: E1003 12:51:51.144765 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:51.644751049 +0000 UTC m=+120.048648874 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:51 crc kubenswrapper[4962]: I1003 12:51:51.147784 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-llhkr" event={"ID":"d9b871c7-1b7d-4b51-ae32-c179956c4de7","Type":"ContainerStarted","Data":"75ba454def70467cbfc50dc21589d14ea7fbb2bc430fcdc90b7a48320b6bbf82"} Oct 03 12:51:51 crc kubenswrapper[4962]: I1003 12:51:51.149064 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnnwg" event={"ID":"bb48dfa2-82c3-4c23-b66d-77ac0166326f","Type":"ContainerStarted","Data":"3421b909a52767c57d2e421bed9fedea84bef6e41b350840b8b277755a4468bd"} Oct 03 12:51:51 crc kubenswrapper[4962]: I1003 12:51:51.185852 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gjzdl" event={"ID":"0d5189b2-b3f9-464a-b267-6e70a2687f99","Type":"ContainerStarted","Data":"261e3949308835dd09e6d56b6f2f9bd488e92f5fba850de6bbfee6d4e13e1c03"} Oct 03 12:51:51 crc kubenswrapper[4962]: I1003 12:51:51.199361 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ndncb" event={"ID":"bf3d2489-9ec1-4479-9f8c-a519812581f8","Type":"ContainerStarted","Data":"9ec6cde03eac8b76363e1e71a52bfb8cfe07a2052965f066aeb2bd0004318f83"} Oct 03 12:51:51 crc kubenswrapper[4962]: I1003 12:51:51.217280 4962 generic.go:334] "Generic (PLEG): container finished" podID="d6b25ec8-0fa2-4d8e-81e3-51e15eee578a" containerID="eb9cf21fe46601693a512f2df19ac465a94aa6dba517b83c611f04f903247c76" exitCode=0 Oct 03 12:51:51 crc kubenswrapper[4962]: I1003 12:51:51.217411 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-t4sln" event={"ID":"d6b25ec8-0fa2-4d8e-81e3-51e15eee578a","Type":"ContainerDied","Data":"eb9cf21fe46601693a512f2df19ac465a94aa6dba517b83c611f04f903247c76"} Oct 03 12:51:51 crc kubenswrapper[4962]: I1003 12:51:51.232418 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5vx4v" event={"ID":"91a19e46-bca3-43f1-a0f6-0d8805a405db","Type":"ContainerStarted","Data":"f0b5f1af4b503ec0618eee35cf4694a853f1abda4be0c590e48f13024116ff71"} Oct 03 12:51:51 crc kubenswrapper[4962]: I1003 12:51:51.239006 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8brbj" event={"ID":"2714766b-33f3-4280-80c2-3e3c8cead5cd","Type":"ContainerStarted","Data":"65c71e0b8cc562eca696c8b17bf7611913a1ea6ced68d8278c6855c743a010e7"} Oct 03 12:51:51 crc kubenswrapper[4962]: I1003 12:51:51.239774 4962 patch_prober.go:28] interesting pod/downloads-7954f5f757-gtf6m container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Oct 03 12:51:51 crc kubenswrapper[4962]: I1003 
12:51:51.239808 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-gtf6m" podUID="a5971c52-f20d-40a2-9e80-e1c02e83cec0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Oct 03 12:51:51 crc kubenswrapper[4962]: I1003 12:51:51.245941 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:51 crc kubenswrapper[4962]: E1003 12:51:51.246069 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:51.746039309 +0000 UTC m=+120.149937134 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:51 crc kubenswrapper[4962]: I1003 12:51:51.246338 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:51 crc kubenswrapper[4962]: E1003 12:51:51.248171 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:51.748155297 +0000 UTC m=+120.152053132 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:51 crc kubenswrapper[4962]: I1003 12:51:51.347963 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:51 crc kubenswrapper[4962]: E1003 12:51:51.348195 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-03 12:51:51.848161312 +0000 UTC m=+120.252059147 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:51 crc kubenswrapper[4962]: I1003 12:51:51.348664 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:51 crc kubenswrapper[4962]: E1003 12:51:51.349704 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:51.849682653 +0000 UTC m=+120.253580678 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:51 crc kubenswrapper[4962]: I1003 12:51:51.399467 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-s2sdt" podStartSLOduration=98.399449758 podStartE2EDuration="1m38.399449758s" podCreationTimestamp="2025-10-03 12:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:51.392549099 +0000 UTC m=+119.796446934" watchObservedRunningTime="2025-10-03 12:51:51.399449758 +0000 UTC m=+119.803347593" Oct 03 12:51:51 crc kubenswrapper[4962]: I1003 12:51:51.438846 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ndncb" podStartSLOduration=98.438827799 podStartE2EDuration="1m38.438827799s" podCreationTimestamp="2025-10-03 12:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:51.438670025 +0000 UTC m=+119.842567860" watchObservedRunningTime="2025-10-03 12:51:51.438827799 +0000 UTC m=+119.842725634" Oct 03 12:51:51 crc kubenswrapper[4962]: I1003 12:51:51.451052 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:51 crc kubenswrapper[4962]: E1003 12:51:51.451562 4962 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:51.951543428 +0000 UTC m=+120.355441273 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:51 crc kubenswrapper[4962]: I1003 12:51:51.556863 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:51 crc kubenswrapper[4962]: E1003 12:51:51.557224 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:52.057209178 +0000 UTC m=+120.461107013 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:51 crc kubenswrapper[4962]: I1003 12:51:51.583322 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnnwg" podStartSLOduration=98.583305055 podStartE2EDuration="1m38.583305055s" podCreationTimestamp="2025-10-03 12:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:51.578560274 +0000 UTC m=+119.982458109" watchObservedRunningTime="2025-10-03 12:51:51.583305055 +0000 UTC m=+119.987202890" Oct 03 12:51:51 crc kubenswrapper[4962]: I1003 12:51:51.665606 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:51 crc kubenswrapper[4962]: E1003 12:51:51.666248 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:52.16622363 +0000 UTC m=+120.570121465 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:51 crc kubenswrapper[4962]: I1003 12:51:51.772273 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:51 crc kubenswrapper[4962]: E1003 12:51:51.772833 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:52.272820066 +0000 UTC m=+120.676717891 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:51 crc kubenswrapper[4962]: I1003 12:51:51.802182 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x4nlh" podStartSLOduration=98.802163681 podStartE2EDuration="1m38.802163681s" podCreationTimestamp="2025-10-03 12:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:51.632086803 +0000 UTC m=+120.035984658" watchObservedRunningTime="2025-10-03 12:51:51.802163681 +0000 UTC m=+120.206061516" Oct 03 12:51:51 crc kubenswrapper[4962]: I1003 12:51:51.802465 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bfcs7"] Oct 03 12:51:51 crc kubenswrapper[4962]: I1003 12:51:51.813575 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hdljv"] Oct 03 12:51:51 crc kubenswrapper[4962]: I1003 12:51:51.817091 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-xxhzg"] Oct 03 12:51:51 crc kubenswrapper[4962]: I1003 12:51:51.874287 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:51 crc kubenswrapper[4962]: E1003 12:51:51.874482 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:52.374456485 +0000 UTC m=+120.778354320 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:51 crc kubenswrapper[4962]: I1003 12:51:51.874544 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:51 crc kubenswrapper[4962]: E1003 12:51:51.874898 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:52.374885867 +0000 UTC m=+120.778783702 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:51 crc kubenswrapper[4962]: I1003 12:51:51.975159 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:51 crc kubenswrapper[4962]: E1003 12:51:51.975563 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:52.47554598 +0000 UTC m=+120.879443815 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:51 crc kubenswrapper[4962]: W1003 12:51:51.975891 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1bd2edae_b62c_44cf_8973_bb8cf3c8ee7b.slice/crio-8c57933e8c43f7532e889f21b724eb8399e7add6e2576ab4c32a4853d1385b97 WatchSource:0}: Error finding container 8c57933e8c43f7532e889f21b724eb8399e7add6e2576ab4c32a4853d1385b97: Status 404 returned error can't find the container with id 8c57933e8c43f7532e889f21b724eb8399e7add6e2576ab4c32a4853d1385b97 Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.034576 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324925-rjxv2"] Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.076974 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:52 crc kubenswrapper[4962]: E1003 12:51:52.077291 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:52.577279412 +0000 UTC m=+120.981177247 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.147752 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pbd2f"] Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.177577 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:52 crc kubenswrapper[4962]: E1003 12:51:52.177724 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:52.677702568 +0000 UTC m=+121.081600393 (durationBeforeRetry 500ms). 
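Each failed volume operation is re-queued behind a "not before" timestamp, which is what the nestedpendingoperations.go:348 entries report: the operation may not be retried until the logged time, 500ms after the failure. A small sketch of that gating pattern, assuming the fixed 500ms delay seen in every durationBeforeRetry here; opGate and tryRun are illustrative names, not kubelet's API:

package main

import (
	"fmt"
	"time"
)

// opGate remembers when a failed operation may next be attempted.
type opGate struct {
	notBefore time.Time
	backoff   time.Duration
}

// tryRun refuses to start the operation during the backoff window and,
// on failure, schedules the next permitted attempt.
func (g *opGate) tryRun(now time.Time, op func() error) error {
	if now.Before(g.notBefore) {
		return fmt.Errorf("no retries permitted until %s", g.notBefore.Format(time.RFC3339Nano))
	}
	if err := op(); err != nil {
		if g.backoff == 0 {
			g.backoff = 500 * time.Millisecond // initial delay seen in the log
		}
		g.notBefore = now.Add(g.backoff)
		return err
	}
	g.backoff = 0 // success clears the gate
	return nil
}

func main() {
	g := &opGate{}
	failing := func() error { return fmt.Errorf("driver not registered") }

	now := time.Now()
	fmt.Println(g.tryRun(now, failing))                           // fails, arms the gate
	fmt.Println(g.tryRun(now.Add(100*time.Millisecond), failing)) // still gated
	fmt.Println(g.tryRun(now.Add(600*time.Millisecond), failing)) // retried after 500ms
}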
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.177838 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:52 crc kubenswrapper[4962]: E1003 12:51:52.178269 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:52.678254423 +0000 UTC m=+121.082152258 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.210207 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ct99q"] Oct 03 12:51:52 crc kubenswrapper[4962]: W1003 12:51:52.218363 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec276030_49bd_4751_93f2_456157bd157d.slice/crio-a971100fe8f4d80319c74b5d1581d34adf9d5db0e1603ed2f01d121ac6972fde WatchSource:0}: Error finding container a971100fe8f4d80319c74b5d1581d34adf9d5db0e1603ed2f01d121ac6972fde: Status 404 returned error can't find the container with id a971100fe8f4d80319c74b5d1581d34adf9d5db0e1603ed2f01d121ac6972fde Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.283589 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:52 crc kubenswrapper[4962]: E1003 12:51:52.284531 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:52.784298584 +0000 UTC m=+121.188196419 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.300063 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324925-rjxv2" event={"ID":"a478f682-ff2f-4920-b535-24b1675ce2c7","Type":"ContainerStarted","Data":"996d923191875e313654462af3f4d3e943018812f9bf17c9e76404af30d17251"} Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.300525 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-v8m25"] Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.300573 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-826w7"] Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.377560 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-w5ndz" event={"ID":"3b901edd-ff16-4b32-a22d-5e8feda7f19b","Type":"ContainerStarted","Data":"4fdb58bc9b579bb0ca2335a10df99489c43e30f98541a31833eae6a80952b0d4"} Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.385613 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:52 crc kubenswrapper[4962]: E1003 12:51:52.385971 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:52.885955294 +0000 UTC m=+121.289853129 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.392507 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hdljv" event={"ID":"a156df7d-5cb7-4d30-b183-90c66b7f9009","Type":"ContainerStarted","Data":"b73160dba2f924f8f773f0f5f64f9c3eb48d9daba5fbdecbf2b55c8259cd31d6"} Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.398782 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-4z5v6" event={"ID":"ff0b79bc-bc68-4ce3-a3ec-bcd3e6b5b9fa","Type":"ContainerStarted","Data":"82a0df075a7dd7b5fcbfd6ad2f4d5bc58b6654cafa66921a8f7978a820861da5"} Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.411710 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-llhkr" event={"ID":"d9b871c7-1b7d-4b51-ae32-c179956c4de7","Type":"ContainerStarted","Data":"2ec7eaed5b909ad9bdc02cfc4d3a6839ef9fbc31ea7e9f63ffcc1ee36e21a5f6"} Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.426376 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-sd8km" event={"ID":"df5a2d1b-1d06-4201-8546-1a6f67a2511e","Type":"ContainerStarted","Data":"59e472a14cc35304dd9d459b1c2c03faf8d082f04a24f480ae8cf1d47d84cd89"} Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.430060 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wdw2z"] Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.442673 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xxhzg" event={"ID":"0c1ee82c-3d26-42f8-b083-d73f7e25448f","Type":"ContainerStarted","Data":"74357884e2507773f4adc75fdddb2c2e7127c672e0c20bc689e487870a4cd6a8"} Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.450785 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pbd2f" event={"ID":"ec276030-49bd-4751-93f2-456157bd157d","Type":"ContainerStarted","Data":"a971100fe8f4d80319c74b5d1581d34adf9d5db0e1603ed2f01d121ac6972fde"} Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.451860 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bfcs7" event={"ID":"1bd2edae-b62c-44cf-8973-bb8cf3c8ee7b","Type":"ContainerStarted","Data":"8c57933e8c43f7532e889f21b724eb8399e7add6e2576ab4c32a4853d1385b97"} Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.453077 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x4nlh" event={"ID":"693d9472-5b36-4df8-a6fb-d3e3aece0cdb","Type":"ContainerStarted","Data":"d7cc57ca0a236e6176dc9c36adc494209388edd036595cfd4615ec00a47917c7"} Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.453447 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["hostpath-provisioner/csi-hostpathplugin-5hcj2"] Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.477698 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-vnd7f"] Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.481384 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-q777j" Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.485598 4962 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-q777j container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" start-of-body= Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.485648 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-q777j" podUID="3e3f372c-8948-4d84-aee2-441d77e3201a" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.486533 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-t4sln" event={"ID":"d6b25ec8-0fa2-4d8e-81e3-51e15eee578a","Type":"ContainerStarted","Data":"4e5bed21d3456beacb585e0ced3d34fa14961b1eaf591f3edf1119e3a461e477"} Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.490016 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.491323 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-t4sln" Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.491577 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5vx4v" event={"ID":"91a19e46-bca3-43f1-a0f6-0d8805a405db","Type":"ContainerStarted","Data":"f967164a55cb3b4258ea4f30296a46a7bb25515f528b33de2d10531f17fb8267"} Oct 03 12:51:52 crc kubenswrapper[4962]: E1003 12:51:52.492364 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:52.992347424 +0000 UTC m=+121.396245259 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.493152 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-s2sdt" Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.493190 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-s2sdt" Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.501029 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-hszb5" event={"ID":"d66f8fa5-4d41-40e1-87aa-e2b08e132cbc","Type":"ContainerStarted","Data":"9b16a11951fd9a82add74b810aa8e2bec664422a84292509e84843f330fd3e89"} Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.501070 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-hszb5" Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.508818 4962 patch_prober.go:28] interesting pod/console-operator-58897d9998-hszb5 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/readyz\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.508865 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-hszb5" podUID="d66f8fa5-4d41-40e1-87aa-e2b08e132cbc" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.16:8443/readyz\": dial tcp 10.217.0.16:8443: connect: connection refused" Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.519161 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-r2b97"] Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.580706 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-b2vsf"] Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.591991 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:52 crc kubenswrapper[4962]: E1003 12:51:52.594100 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:53.094089556 +0000 UTC m=+121.497987391 (durationBeforeRetry 500ms). 
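The readiness-probe failures in these entries are plain TCP refusals: the prober issues an HTTP GET against the container's declared endpoint, and until the process is listening the dial fails, which kubelet records as probeResult="failure". A self-contained sketch of such a check, assuming (as kubelet's HTTP prober does) that the serving certificate is not verified; probeHTTP is an illustrative helper, not the prober API:

package main

import (
	"crypto/tls"
	"fmt"
	"net/http"
	"time"
)

// probeHTTP returns nil when the endpoint answers with a 2xx/3xx status.
// A refused connection (container not listening yet) surfaces as a dial
// error, exactly the output quoted in the log entries above.
func probeHTTP(url string, timeout time.Duration) error {
	client := &http.Client{
		Timeout: timeout,
		// Self-signed serving certs are common on these endpoints, so the
		// check must not fail on TLS verification.
		Transport: &http.Transport{TLSClientConfig: &tls.Config{InsecureSkipVerify: true}},
	}
	resp, err := client.Get(url)
	if err != nil {
		return err // e.g. "dial tcp 10.217.0.16:8443: connect: connection refused"
	}
	defer resp.Body.Close()
	if resp.StatusCode >= 200 && resp.StatusCode < 400 {
		return nil
	}
	return fmt.Errorf("HTTP probe failed with statuscode: %d", resp.StatusCode)
}

func main() {
	// Endpoint taken from the console-operator entries above; it refuses
	// connections until the container starts listening.
	if err := probeHTTP("https://10.217.0.16:8443/readyz", 1*time.Second); err != nil {
		fmt.Println("Probe failed:", err)
	}
}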
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.601448 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-wbchr"] Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.603969 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5csg7"] Oct 03 12:51:52 crc kubenswrapper[4962]: W1003 12:51:52.661968 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62126d09_dac2_4f55_9bdf_f07437e37b6f.slice/crio-795ee5ba16499f1d2f4d116837601430eeb60cac7cce4b967f319ef24d4381ec WatchSource:0}: Error finding container 795ee5ba16499f1d2f4d116837601430eeb60cac7cce4b967f319ef24d4381ec: Status 404 returned error can't find the container with id 795ee5ba16499f1d2f4d116837601430eeb60cac7cce4b967f319ef24d4381ec Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.663079 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-jfpvr"] Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.694930 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:52 crc kubenswrapper[4962]: E1003 12:51:52.695412 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:53.195396407 +0000 UTC m=+121.599294242 (durationBeforeRetry 500ms). 
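The manager.go:1169 entries are logged at W (warning) rather than E severity, which suggests a tolerated race on container creation: a cgroup watch event arrives, but by the time the container is looked up the runtime returns a 404/not-found, so the event is dropped and a later event or housekeeping pass picks the container up. A sketch of that tolerate-not-found pattern, with errNotFound and inspect as simulated stand-ins:

package main

import (
	"errors"
	"fmt"
)

var errNotFound = errors.New("can't find the container with id")

// inspect simulates the 404 lookup reported in the log.
func inspect(id string) error {
	return fmt.Errorf("%w %s", errNotFound, id)
}

// handleWatchEvent warns and skips on not-found instead of failing,
// mirroring the W-level treatment in the log.
func handleWatchEvent(id string) {
	if err := inspect(id); err != nil {
		if errors.Is(err, errNotFound) {
			fmt.Println("Failed to process watch event:", err)
			return
		}
		panic(err) // anything else would be a real failure
	}
}

func main() {
	handleWatchEvent("795ee5ba16499f1d2f4d116837601430eeb60cac7cce4b967f319ef24d4381ec")
}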
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.718697 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-hszb5" podStartSLOduration=99.718673805 podStartE2EDuration="1m39.718673805s" podCreationTimestamp="2025-10-03 12:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:52.654186586 +0000 UTC m=+121.058084411" watchObservedRunningTime="2025-10-03 12:51:52.718673805 +0000 UTC m=+121.122571650" Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.723344 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-w5ndz" podStartSLOduration=6.723325473 podStartE2EDuration="6.723325473s" podCreationTimestamp="2025-10-03 12:51:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:52.679008677 +0000 UTC m=+121.082906522" watchObservedRunningTime="2025-10-03 12:51:52.723325473 +0000 UTC m=+121.127223308" Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.764972 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-sd8km" podStartSLOduration=99.764954516 podStartE2EDuration="1m39.764954516s" podCreationTimestamp="2025-10-03 12:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:52.706127681 +0000 UTC m=+121.110025526" watchObservedRunningTime="2025-10-03 12:51:52.764954516 +0000 UTC m=+121.168852341" Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.780874 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-q777j" podStartSLOduration=99.780219535 podStartE2EDuration="1m39.780219535s" podCreationTimestamp="2025-10-03 12:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:52.768578965 +0000 UTC m=+121.172476790" watchObservedRunningTime="2025-10-03 12:51:52.780219535 +0000 UTC m=+121.184117370" Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.796758 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:52 crc kubenswrapper[4962]: E1003 12:51:52.797187 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-03 12:51:53.29717472 +0000 UTC m=+121.701072555 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.826882 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-4z5v6" podStartSLOduration=99.826862835 podStartE2EDuration="1m39.826862835s" podCreationTimestamp="2025-10-03 12:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:52.823137703 +0000 UTC m=+121.227035538" watchObservedRunningTime="2025-10-03 12:51:52.826862835 +0000 UTC m=+121.230760660" Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.856265 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-t4sln" podStartSLOduration=99.856246451 podStartE2EDuration="1m39.856246451s" podCreationTimestamp="2025-10-03 12:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:52.853869886 +0000 UTC m=+121.257767711" watchObservedRunningTime="2025-10-03 12:51:52.856246451 +0000 UTC m=+121.260144286" Oct 03 12:51:52 crc kubenswrapper[4962]: I1003 12:51:52.899015 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:52 crc kubenswrapper[4962]: E1003 12:51:52.899409 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:53.399394726 +0000 UTC m=+121.803292561 (durationBeforeRetry 500ms). 
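The pod_startup_latency_tracker entries are timestamp arithmetic: the E2E duration matches watchObservedRunningTime minus podCreationTimestamp, and the SLO duration additionally excludes the image-pull window, so with zero-valued firstStartedPulling/lastFinishedPulling (images already present) the two durations coincide, as they do in every entry here. A short sketch reproducing the console-operator numbers above; the field relationships are inferred from the logged values:

package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

	// Values from the console-operator-58897d9998-hszb5 entry above.
	created, err := time.Parse(layout, "2025-10-03 12:50:13 +0000 UTC")
	if err != nil {
		panic(err)
	}
	observed, err := time.Parse(layout, "2025-10-03 12:51:52.718673805 +0000 UTC")
	if err != nil {
		panic(err)
	}

	e2e := observed.Sub(created)
	fmt.Println(e2e.Seconds()) // 99.718673805 -> podStartSLOduration
	fmt.Println(e2e)           // 1m39.718673805s -> podStartE2EDuration

	// firstStartedPulling/lastFinishedPulling are the zero time in these
	// entries, so no pull window is subtracted and SLO == E2E duration.
}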
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.005024 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:53 crc kubenswrapper[4962]: E1003 12:51:53.005598 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:53.5055762 +0000 UTC m=+121.909474115 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.106369 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:53 crc kubenswrapper[4962]: E1003 12:51:53.106578 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:53.606550821 +0000 UTC m=+122.010448646 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.106726 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:53 crc kubenswrapper[4962]: E1003 12:51:53.107284 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:53.607271911 +0000 UTC m=+122.011169736 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.209601 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.209747 4962 patch_prober.go:28] interesting pod/apiserver-76f77b778f-s2sdt container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 03 12:51:53 crc kubenswrapper[4962]: [+]log ok Oct 03 12:51:53 crc kubenswrapper[4962]: [+]etcd ok Oct 03 12:51:53 crc kubenswrapper[4962]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 03 12:51:53 crc kubenswrapper[4962]: [+]poststarthook/generic-apiserver-start-informers ok Oct 03 12:51:53 crc kubenswrapper[4962]: [+]poststarthook/max-in-flight-filter ok Oct 03 12:51:53 crc kubenswrapper[4962]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 03 12:51:53 crc kubenswrapper[4962]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 03 12:51:53 crc kubenswrapper[4962]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Oct 03 12:51:53 crc kubenswrapper[4962]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Oct 03 12:51:53 crc kubenswrapper[4962]: [+]poststarthook/project.openshift.io-projectcache ok Oct 03 12:51:53 crc kubenswrapper[4962]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 03 12:51:53 crc kubenswrapper[4962]: [-]poststarthook/openshift.io-startinformers failed: reason withheld 
Oct 03 12:51:53 crc kubenswrapper[4962]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 03 12:51:53 crc kubenswrapper[4962]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 03 12:51:53 crc kubenswrapper[4962]: livez check failed Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.209821 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-s2sdt" podUID="7b83dbab-ab28-4769-b812-be82be9db67e" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 12:51:53 crc kubenswrapper[4962]: E1003 12:51:53.210084 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:53.710069132 +0000 UTC m=+122.113966967 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.314520 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:53 crc kubenswrapper[4962]: E1003 12:51:53.314900 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:53.814886809 +0000 UTC m=+122.218784644 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.415627 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:53 crc kubenswrapper[4962]: E1003 12:51:53.415803 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:53.915771958 +0000 UTC m=+122.319669793 (durationBeforeRetry 500ms). 
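The multi-line startup-probe body quoted above follows the Kubernetes aggregated-healthz convention: one line per check, "[+]" for a passing check and "[-]" for a failing one whose detail is withheld, with the endpoint returning HTTP 500 (the logged statuscode) until every check passes. A small sketch that extracts the failing checks from such a body; failedChecks is an illustrative helper:

package main

import (
	"bufio"
	"fmt"
	"strings"
)

// failedChecks returns the names of checks reported as "[-]...failed".
func failedChecks(body string) []string {
	var failed []string
	sc := bufio.NewScanner(strings.NewReader(body))
	for sc.Scan() {
		line := sc.Text()
		if strings.HasPrefix(line, "[-]") {
			// e.g. "[-]poststarthook/openshift.io-startinformers failed: reason withheld"
			name := strings.TrimPrefix(line, "[-]")
			failed = append(failed, strings.SplitN(name, " ", 2)[0])
		}
	}
	return failed
}

func main() {
	// Abbreviated version of the probe output logged above.
	body := `[+]ping ok
[+]etcd ok
[-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
[-]poststarthook/openshift.io-startinformers failed: reason withheld
livez check failed`
	fmt.Println(failedChecks(body))
}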
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.416078 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:53 crc kubenswrapper[4962]: E1003 12:51:53.416409 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:53.916398075 +0000 UTC m=+122.320295910 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.516848 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:53 crc kubenswrapper[4962]: E1003 12:51:53.517287 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:54.017269454 +0000 UTC m=+122.421167289 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.521387 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-wlnbb" event={"ID":"afbd04f3-97c3-46d0-8b5d-17c630f20f42","Type":"ContainerStarted","Data":"e55e97eb4772f3b8dc73b6520baedac9d6a926780e203016cee7dbc0f31d9ad6"} Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.531785 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pbd2f" event={"ID":"ec276030-49bd-4751-93f2-456157bd157d","Type":"ContainerStarted","Data":"aa893c9554bb09b46cbde552982e72a0781c45a52664111ec99513e45f8c7c50"} Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.536022 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324925-rjxv2" event={"ID":"a478f682-ff2f-4920-b535-24b1675ce2c7","Type":"ContainerStarted","Data":"f9d7a5e0fa6177245020e0234f9dc8d850b511ad38518ff97edcd25405891500"} Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.537980 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-jfpvr" event={"ID":"e381f38c-5571-4221-8661-2ea67e6e2a52","Type":"ContainerStarted","Data":"e3f3f0c7587c32b2712c80a98e532bd861121c1b4d263c265d9876ded4e78351"} Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.538017 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-jfpvr" event={"ID":"e381f38c-5571-4221-8661-2ea67e6e2a52","Type":"ContainerStarted","Data":"2d5ecc05defdfc3646bdf3723ae841d231c79adc7ad70035417ac492f9371f36"} Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.542414 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-v8m25" event={"ID":"380dd699-f44d-4294-81c0-b26e75d00678","Type":"ContainerStarted","Data":"b42c8b67b448589cc2b9049c2b4ac14e09aa2180508ce87471ad122adb9f54e5"} Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.542471 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-v8m25" event={"ID":"380dd699-f44d-4294-81c0-b26e75d00678","Type":"ContainerStarted","Data":"6e9140d1d469815327c9f7d58d6f95856fafd499b1677a6e4c48a69fe7f1a723"} Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.543922 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bfcs7" event={"ID":"1bd2edae-b62c-44cf-8973-bb8cf3c8ee7b","Type":"ContainerStarted","Data":"8d472ea861065e60117dd39263483a572e8aaed85bad4d154d3bacd0b5a1fa1d"} Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.545087 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bfcs7" Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.548193 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-826w7" 
event={"ID":"289b8912-7a01-4dc3-aec2-6082ddbe1698","Type":"ContainerStarted","Data":"bf9ab84d9d5be3ba97e8328eb7c291234e2fb77a46dbb5b986dcfe2bb3abfb02"} Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.548278 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-826w7" event={"ID":"289b8912-7a01-4dc3-aec2-6082ddbe1698","Type":"ContainerStarted","Data":"af72d9248b03c142ed83107ba53babe9d3858db732049a60714b6cad9fecc144"} Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.548624 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-826w7" Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.548719 4962 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-bfcs7 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.548754 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bfcs7" podUID="1bd2edae-b62c-44cf-8973-bb8cf3c8ee7b" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.552786 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-wlnbb" podStartSLOduration=100.552770298 podStartE2EDuration="1m40.552770298s" podCreationTimestamp="2025-10-03 12:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:53.551773341 +0000 UTC m=+121.955671196" watchObservedRunningTime="2025-10-03 12:51:53.552770298 +0000 UTC m=+121.956668133" Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.553564 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5hcj2" event={"ID":"2edf0825-d4bd-4a22-a65f-be54b0502600","Type":"ContainerStarted","Data":"edfb132290584f10822791d9e20441b551042408e68ed7775f6106554770cefa"} Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.554891 4962 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-826w7 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body= Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.554944 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-826w7" podUID="289b8912-7a01-4dc3-aec2-6082ddbe1698" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.570390 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-826w7" podStartSLOduration=99.570373351 podStartE2EDuration="1m39.570373351s" podCreationTimestamp="2025-10-03 12:50:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2025-10-03 12:51:53.568564701 +0000 UTC m=+121.972462546" watchObservedRunningTime="2025-10-03 12:51:53.570373351 +0000 UTC m=+121.974271186" Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.579161 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrdnq" event={"ID":"994f46cf-ed06-420d-a2fd-52547aadd0ce","Type":"ContainerStarted","Data":"511d2609ba9863f15aa517b28ef7c28a673796a78144e598d1b21cd7b3bdf574"} Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.619270 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-q777j" event={"ID":"3e3f372c-8948-4d84-aee2-441d77e3201a","Type":"ContainerStarted","Data":"506813cd13577fc65e9dc8a3c28804090538ddc490d5e5bda602d1f44da15a2e"} Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.619592 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.620167 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pbd2f" podStartSLOduration=100.620154597 podStartE2EDuration="1m40.620154597s" podCreationTimestamp="2025-10-03 12:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:53.588512319 +0000 UTC m=+121.992410154" watchObservedRunningTime="2025-10-03 12:51:53.620154597 +0000 UTC m=+122.024052432" Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.620324 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bfcs7" podStartSLOduration=100.620319542 podStartE2EDuration="1m40.620319542s" podCreationTimestamp="2025-10-03 12:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:53.61843606 +0000 UTC m=+122.022333905" watchObservedRunningTime="2025-10-03 12:51:53.620319542 +0000 UTC m=+122.024217377" Oct 03 12:51:53 crc kubenswrapper[4962]: E1003 12:51:53.620948 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:54.120932339 +0000 UTC m=+122.524830174 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.642586 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-hcwtn" event={"ID":"003637f9-ebbe-4587-bdf0-071bfca642dd","Type":"ContainerStarted","Data":"c1a7081d62d83edb879e37f94194187133b719048f2dbfca1954a6d2388de194"} Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.643853 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-jfpvr" podStartSLOduration=99.643832687 podStartE2EDuration="1m39.643832687s" podCreationTimestamp="2025-10-03 12:50:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:53.642150371 +0000 UTC m=+122.046048206" watchObservedRunningTime="2025-10-03 12:51:53.643832687 +0000 UTC m=+122.047730522" Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.667101 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8brbj" event={"ID":"2714766b-33f3-4280-80c2-3e3c8cead5cd","Type":"ContainerStarted","Data":"a24ee41ff667ca8abbb308aae85b4853308207011a493bd0f0ebb8edd97be8e0"} Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.667156 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8brbj" event={"ID":"2714766b-33f3-4280-80c2-3e3c8cead5cd","Type":"ContainerStarted","Data":"b4119413c2bb7161d7ffbe9104ad50b5111c2093d7c62e0658796902c782d365"} Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.675879 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xxhzg" event={"ID":"0c1ee82c-3d26-42f8-b083-d73f7e25448f","Type":"ContainerStarted","Data":"f83eb1a60695dfc5312e22218e42b57b310b035523d195d1eab97fb0ae02b62e"} Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.675941 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xxhzg" event={"ID":"0c1ee82c-3d26-42f8-b083-d73f7e25448f","Type":"ContainerStarted","Data":"183266a915b41c6c6f1a311e32cbd45b31ef44bf056992a1388b32f207c2b568"} Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.686546 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r628t" event={"ID":"f64523a2-c2b2-4f8d-9a51-74a3a64f8ca4","Type":"ContainerStarted","Data":"688bcca0792f5fe94e4b1eb78df41145eaaa69ecbc094a5c2eab92d8ff07c63d"} Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.693984 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rbkn2" event={"ID":"5e63df1c-39c0-4610-87dd-9772a75ddac9","Type":"ContainerStarted","Data":"fecd16b09c2c6b193e879b5b3d6fde79787a17d81d631ee68d9d95859c8b011f"} Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.694035 4962 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rbkn2" event={"ID":"5e63df1c-39c0-4610-87dd-9772a75ddac9","Type":"ContainerStarted","Data":"4cd6a746d3d06f371adae957332d885891fb1aee8c38486a6e4a94f0caffc48e"} Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.702916 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrdnq" podStartSLOduration=99.702898368 podStartE2EDuration="1m39.702898368s" podCreationTimestamp="2025-10-03 12:50:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:53.701592152 +0000 UTC m=+122.105489997" watchObservedRunningTime="2025-10-03 12:51:53.702898368 +0000 UTC m=+122.106796223" Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.704251 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29324925-rjxv2" podStartSLOduration=100.704241735 podStartE2EDuration="1m40.704241735s" podCreationTimestamp="2025-10-03 12:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:53.676854574 +0000 UTC m=+122.080752409" watchObservedRunningTime="2025-10-03 12:51:53.704241735 +0000 UTC m=+122.108139570" Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.709742 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wdw2z" event={"ID":"cb5bb7ba-79c7-4251-8025-68e5c9997447","Type":"ContainerStarted","Data":"363465b4562fc3c04e8a0d9e73af449ae5672e4ec239659e98559860970a81f7"} Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.709799 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wdw2z" event={"ID":"cb5bb7ba-79c7-4251-8025-68e5c9997447","Type":"ContainerStarted","Data":"396a3c9af848cb4a60f82bf80db8bfc3c704af491f02527a28a54a5bfac35488"} Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.710749 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-wdw2z" Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.716521 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-b2vsf" event={"ID":"62126d09-dac2-4f55-9bdf-f07437e37b6f","Type":"ContainerStarted","Data":"5935a58d9d64916665760e299a7781e09f0319fe520f1bc5262f60e9ddf8d143"} Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.716595 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-b2vsf" event={"ID":"62126d09-dac2-4f55-9bdf-f07437e37b6f","Type":"ContainerStarted","Data":"795ee5ba16499f1d2f4d116837601430eeb60cac7cce4b967f319ef24d4381ec"} Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.718499 4962 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-wdw2z container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.718563 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-wdw2z" 
podUID="cb5bb7ba-79c7-4251-8025-68e5c9997447" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.723421 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:53 crc kubenswrapper[4962]: E1003 12:51:53.724795 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:54.224778679 +0000 UTC m=+122.628676514 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.730307 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8brbj" podStartSLOduration=100.73028364 podStartE2EDuration="1m40.73028364s" podCreationTimestamp="2025-10-03 12:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:53.726168307 +0000 UTC m=+122.130066152" watchObservedRunningTime="2025-10-03 12:51:53.73028364 +0000 UTC m=+122.134181475" Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.739172 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrdnq" Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.740325 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrdnq" Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.740751 4962 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-xrdnq container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.9:8443/livez\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.740827 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrdnq" podUID="994f46cf-ed06-420d-a2fd-52547aadd0ce" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.9:8443/livez\": dial tcp 10.217.0.9:8443: connect: connection refused" Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.744241 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gjzdl" event={"ID":"0d5189b2-b3f9-464a-b267-6e70a2687f99","Type":"ContainerStarted","Data":"7b8a3ba60f9d97eeae4d299e23d7510f73fa85f0e96deca5c241c58e372ebb35"} Oct 03 12:51:53 crc 
kubenswrapper[4962]: I1003 12:51:53.764878 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xxhzg" podStartSLOduration=100.764854339 podStartE2EDuration="1m40.764854339s" podCreationTimestamp="2025-10-03 12:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:53.755207744 +0000 UTC m=+122.159105599" watchObservedRunningTime="2025-10-03 12:51:53.764854339 +0000 UTC m=+122.168752184" Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.767407 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wbchr" event={"ID":"8628d57e-47fc-4269-b96f-7e04ebfd320d","Type":"ContainerStarted","Data":"10e9a4f6e5e14112f4252f5cb77bb0c986e025401da155084c137f2d4b3d1813"} Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.767459 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wbchr" event={"ID":"8628d57e-47fc-4269-b96f-7e04ebfd320d","Type":"ContainerStarted","Data":"46fa339557b28add3c2aa8bd6862387c75c856370b045d6f0510386379f97477"} Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.789433 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-hcwtn" podStartSLOduration=100.789406673 podStartE2EDuration="1m40.789406673s" podCreationTimestamp="2025-10-03 12:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:53.781769083 +0000 UTC m=+122.185666948" watchObservedRunningTime="2025-10-03 12:51:53.789406673 +0000 UTC m=+122.193304508" Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.798053 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ct99q" event={"ID":"51b4dfee-5b09-40cb-a284-3d1d16c03cd3","Type":"ContainerStarted","Data":"e38d624cd011e671b67d54452314fb2fbbf72f271aa656b9750940ef4248416e"} Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.798110 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ct99q" event={"ID":"51b4dfee-5b09-40cb-a284-3d1d16c03cd3","Type":"ContainerStarted","Data":"fb754a6f78036d8e5bc0f651c9d812c11d5145bd3e7c72917e50a5a2423d910d"} Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.811835 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-vnd7f" event={"ID":"c23b80bd-b6e3-4d28-814f-bf8ba17b9bbf","Type":"ContainerStarted","Data":"306a9344193aeebb1e5c624200fecd40f8a367f445e29743ea264d7817ad7b0a"} Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.829589 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:53 crc kubenswrapper[4962]: E1003 12:51:53.835023 4962 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:54.335006724 +0000 UTC m=+122.738904559 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.836248 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5csg7" event={"ID":"ed078080-50cf-44fb-ade6-16575106862c","Type":"ContainerStarted","Data":"9107a8687693adff3a99475cd7a3cf7df5b2949e811231bbfdf2f4a15197a28d"} Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.836285 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5csg7" event={"ID":"ed078080-50cf-44fb-ade6-16575106862c","Type":"ContainerStarted","Data":"c5fe818a62ff5c3e6a94596b1a02829211c3b142af37216ed5a7d2b9ae84fbbd"} Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.836852 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5csg7" Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.857522 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-r2b97" event={"ID":"44f09fd4-533a-4ed1-b31a-be8f976b2855","Type":"ContainerStarted","Data":"7c475f0aff81bbb76bfc384edbf8635992edbdde6e24854fedb440e619a63b0a"} Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.857567 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-r2b97" event={"ID":"44f09fd4-533a-4ed1-b31a-be8f976b2855","Type":"ContainerStarted","Data":"c77f242012942ed90b32138695a6d79d8d3e605fe8ca7da1e2622dfdf192a911"} Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.861649 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdkb" event={"ID":"398018f7-8c31-40f9-bd6a-170564176a58","Type":"ContainerStarted","Data":"42510e9ff05104233b4492ce06da77827c1819b6006c611c7f9c431e2b590979"} Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.862209 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdkb" Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.871692 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-b2vsf" podStartSLOduration=99.87167208 podStartE2EDuration="1m39.87167208s" podCreationTimestamp="2025-10-03 12:50:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:53.871354312 +0000 UTC m=+122.275252157" watchObservedRunningTime="2025-10-03 12:51:53.87167208 +0000 UTC m=+122.275569935" Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.874339 4962 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r628t" podStartSLOduration=100.874325073 podStartE2EDuration="1m40.874325073s" podCreationTimestamp="2025-10-03 12:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:53.822349037 +0000 UTC m=+122.226246882" watchObservedRunningTime="2025-10-03 12:51:53.874325073 +0000 UTC m=+122.278222908" Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.906392 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-llhkr" event={"ID":"d9b871c7-1b7d-4b51-ae32-c179956c4de7","Type":"ContainerStarted","Data":"c08d4e5adf4e6b4e4cd7433eb0621db05e20c60015b5247d7c7231198a5c8bd7"} Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.909058 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hdljv" event={"ID":"a156df7d-5cb7-4d30-b183-90c66b7f9009","Type":"ContainerStarted","Data":"3625a820bb24f1a602c82c67a9d2c529c37363530c453db570956859e0c9b49d"} Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.909715 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hdljv" Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.916145 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rbkn2" podStartSLOduration=100.91612923 podStartE2EDuration="1m40.91612923s" podCreationTimestamp="2025-10-03 12:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:53.910772653 +0000 UTC m=+122.314670498" watchObservedRunningTime="2025-10-03 12:51:53.91612923 +0000 UTC m=+122.320027065" Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.926179 4962 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-hdljv container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body= Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.926240 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hdljv" podUID="a156df7d-5cb7-4d30-b183-90c66b7f9009" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.930158 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5vx4v" event={"ID":"91a19e46-bca3-43f1-a0f6-0d8805a405db","Type":"ContainerStarted","Data":"181b816812739a6837abf2bbbe1b012a1f5ade9015d2d6cd0a7a25a10e69fbbc"} Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.931223 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:53 crc kubenswrapper[4962]: E1003 12:51:53.933182 4962 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:54.433167938 +0000 UTC m=+122.837065773 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.995032 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gjzdl" podStartSLOduration=100.995018196 podStartE2EDuration="1m40.995018196s" podCreationTimestamp="2025-10-03 12:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:53.991510539 +0000 UTC m=+122.395408374" watchObservedRunningTime="2025-10-03 12:51:53.995018196 +0000 UTC m=+122.398916031" Oct 03 12:51:53 crc kubenswrapper[4962]: I1003 12:51:53.997701 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-wdw2z" podStartSLOduration=100.997686719 podStartE2EDuration="1m40.997686719s" podCreationTimestamp="2025-10-03 12:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:53.957912227 +0000 UTC m=+122.361810062" watchObservedRunningTime="2025-10-03 12:51:53.997686719 +0000 UTC m=+122.401584554" Oct 03 12:51:54 crc kubenswrapper[4962]: I1003 12:51:54.037316 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:54 crc kubenswrapper[4962]: E1003 12:51:54.047749 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:54.539327252 +0000 UTC m=+122.943225087 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:54 crc kubenswrapper[4962]: I1003 12:51:54.055561 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wbchr" podStartSLOduration=101.055543177 podStartE2EDuration="1m41.055543177s" podCreationTimestamp="2025-10-03 12:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:54.020909516 +0000 UTC m=+122.424807351" watchObservedRunningTime="2025-10-03 12:51:54.055543177 +0000 UTC m=+122.459441012" Oct 03 12:51:54 crc kubenswrapper[4962]: I1003 12:51:54.056831 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdkb" podStartSLOduration=100.056826192 podStartE2EDuration="1m40.056826192s" podCreationTimestamp="2025-10-03 12:50:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:54.055146906 +0000 UTC m=+122.459044741" watchObservedRunningTime="2025-10-03 12:51:54.056826192 +0000 UTC m=+122.460724027" Oct 03 12:51:54 crc kubenswrapper[4962]: I1003 12:51:54.082961 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-llhkr" podStartSLOduration=101.082940769 podStartE2EDuration="1m41.082940769s" podCreationTimestamp="2025-10-03 12:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:54.081380206 +0000 UTC m=+122.485278041" watchObservedRunningTime="2025-10-03 12:51:54.082940769 +0000 UTC m=+122.486838604" Oct 03 12:51:54 crc kubenswrapper[4962]: I1003 12:51:54.139374 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:54 crc kubenswrapper[4962]: E1003 12:51:54.139749 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:54.639721327 +0000 UTC m=+123.043619162 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:54 crc kubenswrapper[4962]: I1003 12:51:54.139920 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:54 crc kubenswrapper[4962]: E1003 12:51:54.140290 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:54.640277532 +0000 UTC m=+123.044175367 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:54 crc kubenswrapper[4962]: I1003 12:51:54.146951 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5csg7" podStartSLOduration=100.146931835 podStartE2EDuration="1m40.146931835s" podCreationTimestamp="2025-10-03 12:50:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:54.113014664 +0000 UTC m=+122.516912499" watchObservedRunningTime="2025-10-03 12:51:54.146931835 +0000 UTC m=+122.550829670" Oct 03 12:51:54 crc kubenswrapper[4962]: I1003 12:51:54.148185 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-r2b97" podStartSLOduration=8.148180109 podStartE2EDuration="8.148180109s" podCreationTimestamp="2025-10-03 12:51:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:54.14638428 +0000 UTC m=+122.550282115" watchObservedRunningTime="2025-10-03 12:51:54.148180109 +0000 UTC m=+122.552077944" Oct 03 12:51:54 crc kubenswrapper[4962]: I1003 12:51:54.182736 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hdljv" podStartSLOduration=100.182721497 podStartE2EDuration="1m40.182721497s" podCreationTimestamp="2025-10-03 12:50:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:54.182288845 +0000 UTC m=+122.586186680" watchObservedRunningTime="2025-10-03 12:51:54.182721497 +0000 UTC m=+122.586619332" Oct 03 12:51:54 crc kubenswrapper[4962]: I1003 
12:51:54.213399 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5vx4v" podStartSLOduration=101.213384779 podStartE2EDuration="1m41.213384779s" podCreationTimestamp="2025-10-03 12:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:54.211405855 +0000 UTC m=+122.615303700" watchObservedRunningTime="2025-10-03 12:51:54.213384779 +0000 UTC m=+122.617282614" Oct 03 12:51:54 crc kubenswrapper[4962]: I1003 12:51:54.232407 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ct99q" podStartSLOduration=101.232390141 podStartE2EDuration="1m41.232390141s" podCreationTimestamp="2025-10-03 12:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:54.230899 +0000 UTC m=+122.634796835" watchObservedRunningTime="2025-10-03 12:51:54.232390141 +0000 UTC m=+122.636287966" Oct 03 12:51:54 crc kubenswrapper[4962]: I1003 12:51:54.247313 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:54 crc kubenswrapper[4962]: E1003 12:51:54.247828 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:54.747807974 +0000 UTC m=+123.151705809 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:54 crc kubenswrapper[4962]: I1003 12:51:54.292773 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-hszb5" Oct 03 12:51:54 crc kubenswrapper[4962]: I1003 12:51:54.349290 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:54 crc kubenswrapper[4962]: E1003 12:51:54.349584 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:54.849571817 +0000 UTC m=+123.253469652 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:54 crc kubenswrapper[4962]: I1003 12:51:54.449801 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:54 crc kubenswrapper[4962]: E1003 12:51:54.449985 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:54.949960262 +0000 UTC m=+123.353858107 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:54 crc kubenswrapper[4962]: I1003 12:51:54.450166 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:54 crc kubenswrapper[4962]: E1003 12:51:54.450508 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:54.950497826 +0000 UTC m=+123.354395711 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:54 crc kubenswrapper[4962]: I1003 12:51:54.450568 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-wlnbb" Oct 03 12:51:54 crc kubenswrapper[4962]: I1003 12:51:54.455752 4962 patch_prober.go:28] interesting pod/router-default-5444994796-wlnbb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 12:51:54 crc kubenswrapper[4962]: [-]has-synced failed: reason withheld Oct 03 12:51:54 crc kubenswrapper[4962]: [+]process-running ok Oct 03 12:51:54 crc kubenswrapper[4962]: healthz check failed Oct 03 12:51:54 crc kubenswrapper[4962]: I1003 12:51:54.455803 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wlnbb" podUID="afbd04f3-97c3-46d0-8b5d-17c630f20f42" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 12:51:54 crc kubenswrapper[4962]: I1003 12:51:54.480339 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdkb" Oct 03 12:51:54 crc kubenswrapper[4962]: I1003 12:51:54.551541 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:54 crc kubenswrapper[4962]: E1003 12:51:54.551740 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:55.051710314 +0000 UTC m=+123.455608149 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:54 crc kubenswrapper[4962]: I1003 12:51:54.552125 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:54 crc kubenswrapper[4962]: E1003 12:51:54.552437 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:55.052429184 +0000 UTC m=+123.456327019 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:54 crc kubenswrapper[4962]: I1003 12:51:54.587206 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-q777j" Oct 03 12:51:54 crc kubenswrapper[4962]: I1003 12:51:54.653251 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:54 crc kubenswrapper[4962]: E1003 12:51:54.653651 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:55.153621291 +0000 UTC m=+123.557519126 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:54 crc kubenswrapper[4962]: I1003 12:51:54.755064 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:54 crc kubenswrapper[4962]: E1003 12:51:54.755480 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:55.255465996 +0000 UTC m=+123.659363831 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:54 crc kubenswrapper[4962]: I1003 12:51:54.856176 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:54 crc kubenswrapper[4962]: E1003 12:51:54.856324 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:55.356302444 +0000 UTC m=+123.760200289 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:54 crc kubenswrapper[4962]: I1003 12:51:54.856455 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:54 crc kubenswrapper[4962]: E1003 12:51:54.856765 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:55.356757296 +0000 UTC m=+123.760655131 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:54 crc kubenswrapper[4962]: I1003 12:51:54.936992 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wbchr" event={"ID":"8628d57e-47fc-4269-b96f-7e04ebfd320d","Type":"ContainerStarted","Data":"83751e14fc5dbaf0195f6150df77051fbe468287abafc75c0579412dbb0c5644"} Oct 03 12:51:54 crc kubenswrapper[4962]: I1003 12:51:54.938571 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-v8m25" event={"ID":"380dd699-f44d-4294-81c0-b26e75d00678","Type":"ContainerStarted","Data":"b24f111490c72418fe37bfe89d57caad981b41bf93de39a920a0a5d766171ffc"} Oct 03 12:51:54 crc kubenswrapper[4962]: I1003 12:51:54.938758 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-v8m25" Oct 03 12:51:54 crc kubenswrapper[4962]: I1003 12:51:54.939932 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-vnd7f" event={"ID":"c23b80bd-b6e3-4d28-814f-bf8ba17b9bbf","Type":"ContainerStarted","Data":"4f236467c383108ba97f4a4b480e2d1f538d1ae2f68eead0a611e21a6a09bec4"} Oct 03 12:51:54 crc kubenswrapper[4962]: I1003 12:51:54.939977 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-vnd7f" event={"ID":"c23b80bd-b6e3-4d28-814f-bf8ba17b9bbf","Type":"ContainerStarted","Data":"d67d969ebf7421930ba89c4dc802befeca290bf0905712e67e4efa032a2a28a6"} Oct 03 12:51:54 crc kubenswrapper[4962]: I1003 12:51:54.943120 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5csg7" 
event={"ID":"ed078080-50cf-44fb-ade6-16575106862c","Type":"ContainerStarted","Data":"6f8e55e22ea473267879086c247b645083ab5807623c1de4fc0188f4c09e58c4"} Oct 03 12:51:54 crc kubenswrapper[4962]: I1003 12:51:54.946406 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5hcj2" event={"ID":"2edf0825-d4bd-4a22-a65f-be54b0502600","Type":"ContainerStarted","Data":"57ab895b54c67a9e1cfe41270275489b2b98217efd29f242989b850a91687396"} Oct 03 12:51:54 crc kubenswrapper[4962]: I1003 12:51:54.949748 4962 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-wdw2z container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Oct 03 12:51:54 crc kubenswrapper[4962]: I1003 12:51:54.949793 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-wdw2z" podUID="cb5bb7ba-79c7-4251-8025-68e5c9997447" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" Oct 03 12:51:54 crc kubenswrapper[4962]: I1003 12:51:54.953015 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bfcs7" Oct 03 12:51:54 crc kubenswrapper[4962]: I1003 12:51:54.957462 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:54 crc kubenswrapper[4962]: E1003 12:51:54.957865 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:55.45784692 +0000 UTC m=+123.861744755 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:54 crc kubenswrapper[4962]: I1003 12:51:54.958184 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-826w7" Oct 03 12:51:54 crc kubenswrapper[4962]: I1003 12:51:54.977709 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-v8m25" podStartSLOduration=8.977690124 podStartE2EDuration="8.977690124s" podCreationTimestamp="2025-10-03 12:51:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:54.975031772 +0000 UTC m=+123.378929607" watchObservedRunningTime="2025-10-03 12:51:54.977690124 +0000 UTC m=+123.381587959" Oct 03 12:51:54 crc kubenswrapper[4962]: I1003 12:51:54.992787 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-vnd7f" podStartSLOduration=101.992763468 podStartE2EDuration="1m41.992763468s" podCreationTimestamp="2025-10-03 12:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:54.992737987 +0000 UTC m=+123.396635832" watchObservedRunningTime="2025-10-03 12:51:54.992763468 +0000 UTC m=+123.396661303" Oct 03 12:51:55 crc kubenswrapper[4962]: I1003 12:51:55.059223 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:55 crc kubenswrapper[4962]: E1003 12:51:55.061476 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:55.561458353 +0000 UTC m=+123.965356278 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:55 crc kubenswrapper[4962]: I1003 12:51:55.160926 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:55 crc kubenswrapper[4962]: E1003 12:51:55.161194 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:55.661101328 +0000 UTC m=+124.064999163 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:55 crc kubenswrapper[4962]: I1003 12:51:55.161402 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:55 crc kubenswrapper[4962]: E1003 12:51:55.161715 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:55.661703685 +0000 UTC m=+124.065601520 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:55 crc kubenswrapper[4962]: I1003 12:51:55.262462 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:55 crc kubenswrapper[4962]: E1003 12:51:55.262720 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:55.762685746 +0000 UTC m=+124.166583581 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:55 crc kubenswrapper[4962]: I1003 12:51:55.262928 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:55 crc kubenswrapper[4962]: E1003 12:51:55.263318 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:55.763302733 +0000 UTC m=+124.167200578 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:55 crc kubenswrapper[4962]: I1003 12:51:55.364325 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:55 crc kubenswrapper[4962]: E1003 12:51:55.364476 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:55.864450069 +0000 UTC m=+124.268347904 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:55 crc kubenswrapper[4962]: I1003 12:51:55.364605 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:55 crc kubenswrapper[4962]: E1003 12:51:55.364977 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:55.864961143 +0000 UTC m=+124.268859058 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:55 crc kubenswrapper[4962]: I1003 12:51:55.453747 4962 patch_prober.go:28] interesting pod/router-default-5444994796-wlnbb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 12:51:55 crc kubenswrapper[4962]: [-]has-synced failed: reason withheld Oct 03 12:51:55 crc kubenswrapper[4962]: [+]process-running ok Oct 03 12:51:55 crc kubenswrapper[4962]: healthz check failed Oct 03 12:51:55 crc kubenswrapper[4962]: I1003 12:51:55.453842 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wlnbb" podUID="afbd04f3-97c3-46d0-8b5d-17c630f20f42" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 12:51:55 crc kubenswrapper[4962]: I1003 12:51:55.465946 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:55 crc kubenswrapper[4962]: E1003 12:51:55.466454 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:55.966434598 +0000 UTC m=+124.370332433 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:55 crc kubenswrapper[4962]: I1003 12:51:55.567425 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:55 crc kubenswrapper[4962]: E1003 12:51:55.567845 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:56.067831241 +0000 UTC m=+124.471729076 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:55 crc kubenswrapper[4962]: I1003 12:51:55.668566 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:55 crc kubenswrapper[4962]: E1003 12:51:55.668787 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:56.168762151 +0000 UTC m=+124.572659986 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:55 crc kubenswrapper[4962]: I1003 12:51:55.673840 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hdljv" Oct 03 12:51:55 crc kubenswrapper[4962]: I1003 12:51:55.770699 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:55 crc kubenswrapper[4962]: E1003 12:51:55.771030 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:56.271018028 +0000 UTC m=+124.674915863 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:55 crc kubenswrapper[4962]: I1003 12:51:55.872047 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:55 crc kubenswrapper[4962]: E1003 12:51:55.872218 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:56.372193654 +0000 UTC m=+124.776091489 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:55 crc kubenswrapper[4962]: I1003 12:51:55.872311 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:55 crc kubenswrapper[4962]: E1003 12:51:55.872588 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:56.372577145 +0000 UTC m=+124.776474980 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:55 crc kubenswrapper[4962]: I1003 12:51:55.951147 4962 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-wdw2z container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Oct 03 12:51:55 crc kubenswrapper[4962]: I1003 12:51:55.951186 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-wdw2z" podUID="cb5bb7ba-79c7-4251-8025-68e5c9997447" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" Oct 03 12:51:55 crc kubenswrapper[4962]: I1003 12:51:55.973356 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:55 crc kubenswrapper[4962]: E1003 12:51:55.973538 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:56.473511865 +0000 UTC m=+124.877409690 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:55 crc kubenswrapper[4962]: I1003 12:51:55.973663 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:55 crc kubenswrapper[4962]: E1003 12:51:55.973986 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:56.473976968 +0000 UTC m=+124.877874803 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:55 crc kubenswrapper[4962]: I1003 12:51:55.980024 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-v9wqz"] Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.004424 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v9wqz" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.015521 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v9wqz"] Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.015874 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.071977 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.073182 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.074363 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.074843 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f791e711-ff4d-47b5-aa50-efbf71dc1ac2-utilities\") pod \"certified-operators-v9wqz\" (UID: \"f791e711-ff4d-47b5-aa50-efbf71dc1ac2\") " pod="openshift-marketplace/certified-operators-v9wqz" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.074970 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj6pv\" (UniqueName: \"kubernetes.io/projected/f791e711-ff4d-47b5-aa50-efbf71dc1ac2-kube-api-access-nj6pv\") pod \"certified-operators-v9wqz\" (UID: \"f791e711-ff4d-47b5-aa50-efbf71dc1ac2\") " pod="openshift-marketplace/certified-operators-v9wqz" Oct 03 12:51:56 crc kubenswrapper[4962]: E1003 12:51:56.075015 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:56.57499365 +0000 UTC m=+124.978891485 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.075270 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f791e711-ff4d-47b5-aa50-efbf71dc1ac2-catalog-content\") pod \"certified-operators-v9wqz\" (UID: \"f791e711-ff4d-47b5-aa50-efbf71dc1ac2\") " pod="openshift-marketplace/certified-operators-v9wqz" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.080190 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.080191 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.093044 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.173216 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dg2fc"] Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.174351 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dg2fc" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.176070 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.176452 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nj6pv\" (UniqueName: \"kubernetes.io/projected/f791e711-ff4d-47b5-aa50-efbf71dc1ac2-kube-api-access-nj6pv\") pod \"certified-operators-v9wqz\" (UID: \"f791e711-ff4d-47b5-aa50-efbf71dc1ac2\") " pod="openshift-marketplace/certified-operators-v9wqz" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.176507 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f791e711-ff4d-47b5-aa50-efbf71dc1ac2-catalog-content\") pod \"certified-operators-v9wqz\" (UID: \"f791e711-ff4d-47b5-aa50-efbf71dc1ac2\") " pod="openshift-marketplace/certified-operators-v9wqz" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.176547 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cb8d277c-2383-4000-b828-1f3d43dc660b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"cb8d277c-2383-4000-b828-1f3d43dc660b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.176565 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cb8d277c-2383-4000-b828-1f3d43dc660b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: 
\"cb8d277c-2383-4000-b828-1f3d43dc660b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.176616 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.176666 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f791e711-ff4d-47b5-aa50-efbf71dc1ac2-utilities\") pod \"certified-operators-v9wqz\" (UID: \"f791e711-ff4d-47b5-aa50-efbf71dc1ac2\") " pod="openshift-marketplace/certified-operators-v9wqz" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.177071 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f791e711-ff4d-47b5-aa50-efbf71dc1ac2-utilities\") pod \"certified-operators-v9wqz\" (UID: \"f791e711-ff4d-47b5-aa50-efbf71dc1ac2\") " pod="openshift-marketplace/certified-operators-v9wqz" Oct 03 12:51:56 crc kubenswrapper[4962]: E1003 12:51:56.177323 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:56.677308408 +0000 UTC m=+125.081206253 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.179486 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f791e711-ff4d-47b5-aa50-efbf71dc1ac2-catalog-content\") pod \"certified-operators-v9wqz\" (UID: \"f791e711-ff4d-47b5-aa50-efbf71dc1ac2\") " pod="openshift-marketplace/certified-operators-v9wqz" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.186624 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dg2fc"] Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.206898 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj6pv\" (UniqueName: \"kubernetes.io/projected/f791e711-ff4d-47b5-aa50-efbf71dc1ac2-kube-api-access-nj6pv\") pod \"certified-operators-v9wqz\" (UID: \"f791e711-ff4d-47b5-aa50-efbf71dc1ac2\") " pod="openshift-marketplace/certified-operators-v9wqz" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.278178 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 
12:51:56.278402 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdb2f075-4c59-41c9-b77e-550905415bdb-utilities\") pod \"community-operators-dg2fc\" (UID: \"fdb2f075-4c59-41c9-b77e-550905415bdb\") " pod="openshift-marketplace/community-operators-dg2fc" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.278462 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrl2n\" (UniqueName: \"kubernetes.io/projected/fdb2f075-4c59-41c9-b77e-550905415bdb-kube-api-access-rrl2n\") pod \"community-operators-dg2fc\" (UID: \"fdb2f075-4c59-41c9-b77e-550905415bdb\") " pod="openshift-marketplace/community-operators-dg2fc" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.278511 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cb8d277c-2383-4000-b828-1f3d43dc660b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"cb8d277c-2383-4000-b828-1f3d43dc660b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.278537 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cb8d277c-2383-4000-b828-1f3d43dc660b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"cb8d277c-2383-4000-b828-1f3d43dc660b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.278590 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdb2f075-4c59-41c9-b77e-550905415bdb-catalog-content\") pod \"community-operators-dg2fc\" (UID: \"fdb2f075-4c59-41c9-b77e-550905415bdb\") " pod="openshift-marketplace/community-operators-dg2fc" Oct 03 12:51:56 crc kubenswrapper[4962]: E1003 12:51:56.278773 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:56.778751012 +0000 UTC m=+125.182648847 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.278835 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cb8d277c-2383-4000-b828-1f3d43dc660b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"cb8d277c-2383-4000-b828-1f3d43dc660b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.307202 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cb8d277c-2383-4000-b828-1f3d43dc660b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"cb8d277c-2383-4000-b828-1f3d43dc660b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.331798 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v9wqz" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.377580 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5k7x6"] Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.378828 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5k7x6" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.379712 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdb2f075-4c59-41c9-b77e-550905415bdb-catalog-content\") pod \"community-operators-dg2fc\" (UID: \"fdb2f075-4c59-41c9-b77e-550905415bdb\") " pod="openshift-marketplace/community-operators-dg2fc" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.379760 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.379841 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdb2f075-4c59-41c9-b77e-550905415bdb-utilities\") pod \"community-operators-dg2fc\" (UID: \"fdb2f075-4c59-41c9-b77e-550905415bdb\") " pod="openshift-marketplace/community-operators-dg2fc" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.379888 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrl2n\" (UniqueName: \"kubernetes.io/projected/fdb2f075-4c59-41c9-b77e-550905415bdb-kube-api-access-rrl2n\") pod \"community-operators-dg2fc\" (UID: \"fdb2f075-4c59-41c9-b77e-550905415bdb\") " pod="openshift-marketplace/community-operators-dg2fc" Oct 03 12:51:56 crc kubenswrapper[4962]: E1003 12:51:56.380534 4962 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:56.880518745 +0000 UTC m=+125.284416580 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.380845 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdb2f075-4c59-41c9-b77e-550905415bdb-catalog-content\") pod \"community-operators-dg2fc\" (UID: \"fdb2f075-4c59-41c9-b77e-550905415bdb\") " pod="openshift-marketplace/community-operators-dg2fc" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.380930 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdb2f075-4c59-41c9-b77e-550905415bdb-utilities\") pod \"community-operators-dg2fc\" (UID: \"fdb2f075-4c59-41c9-b77e-550905415bdb\") " pod="openshift-marketplace/community-operators-dg2fc" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.393579 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.396165 4962 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.396792 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5k7x6"] Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.403094 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrl2n\" (UniqueName: \"kubernetes.io/projected/fdb2f075-4c59-41c9-b77e-550905415bdb-kube-api-access-rrl2n\") pod \"community-operators-dg2fc\" (UID: \"fdb2f075-4c59-41c9-b77e-550905415bdb\") " pod="openshift-marketplace/community-operators-dg2fc" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.456856 4962 patch_prober.go:28] interesting pod/router-default-5444994796-wlnbb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 12:51:56 crc kubenswrapper[4962]: [-]has-synced failed: reason withheld Oct 03 12:51:56 crc kubenswrapper[4962]: [+]process-running ok Oct 03 12:51:56 crc kubenswrapper[4962]: healthz check failed Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.456926 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wlnbb" podUID="afbd04f3-97c3-46d0-8b5d-17c630f20f42" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.480428 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:56 crc kubenswrapper[4962]: E1003 12:51:56.480596 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:56.980575912 +0000 UTC m=+125.384473747 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.480703 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.480768 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/535af756-29c8-4753-bf86-34b327119a7d-catalog-content\") pod \"certified-operators-5k7x6\" (UID: \"535af756-29c8-4753-bf86-34b327119a7d\") " pod="openshift-marketplace/certified-operators-5k7x6" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.480842 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/535af756-29c8-4753-bf86-34b327119a7d-utilities\") pod \"certified-operators-5k7x6\" (UID: \"535af756-29c8-4753-bf86-34b327119a7d\") " pod="openshift-marketplace/certified-operators-5k7x6" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.480864 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdp4c\" (UniqueName: \"kubernetes.io/projected/535af756-29c8-4753-bf86-34b327119a7d-kube-api-access-bdp4c\") pod \"certified-operators-5k7x6\" (UID: \"535af756-29c8-4753-bf86-34b327119a7d\") " pod="openshift-marketplace/certified-operators-5k7x6" Oct 03 12:51:56 crc kubenswrapper[4962]: E1003 12:51:56.481115 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:56.981104836 +0000 UTC m=+125.385002771 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.487836 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dg2fc" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.580418 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sr86x"] Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.581541 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sr86x" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.582252 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.582450 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/535af756-29c8-4753-bf86-34b327119a7d-catalog-content\") pod \"certified-operators-5k7x6\" (UID: \"535af756-29c8-4753-bf86-34b327119a7d\") " pod="openshift-marketplace/certified-operators-5k7x6" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.582506 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/535af756-29c8-4753-bf86-34b327119a7d-utilities\") pod \"certified-operators-5k7x6\" (UID: \"535af756-29c8-4753-bf86-34b327119a7d\") " pod="openshift-marketplace/certified-operators-5k7x6" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.582523 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdp4c\" (UniqueName: \"kubernetes.io/projected/535af756-29c8-4753-bf86-34b327119a7d-kube-api-access-bdp4c\") pod \"certified-operators-5k7x6\" (UID: \"535af756-29c8-4753-bf86-34b327119a7d\") " pod="openshift-marketplace/certified-operators-5k7x6" Oct 03 12:51:56 crc kubenswrapper[4962]: E1003 12:51:56.582872 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:57.082857699 +0000 UTC m=+125.486755534 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.583532 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/535af756-29c8-4753-bf86-34b327119a7d-catalog-content\") pod \"certified-operators-5k7x6\" (UID: \"535af756-29c8-4753-bf86-34b327119a7d\") " pod="openshift-marketplace/certified-operators-5k7x6" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.583756 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/535af756-29c8-4753-bf86-34b327119a7d-utilities\") pod \"certified-operators-5k7x6\" (UID: \"535af756-29c8-4753-bf86-34b327119a7d\") " pod="openshift-marketplace/certified-operators-5k7x6" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.592335 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sr86x"] Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.637541 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdp4c\" (UniqueName: \"kubernetes.io/projected/535af756-29c8-4753-bf86-34b327119a7d-kube-api-access-bdp4c\") pod \"certified-operators-5k7x6\" (UID: \"535af756-29c8-4753-bf86-34b327119a7d\") " pod="openshift-marketplace/certified-operators-5k7x6" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.683521 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74bc700b-741d-4d4c-8758-9c259caa9f4b-utilities\") pod \"community-operators-sr86x\" (UID: \"74bc700b-741d-4d4c-8758-9c259caa9f4b\") " pod="openshift-marketplace/community-operators-sr86x" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.683565 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74bc700b-741d-4d4c-8758-9c259caa9f4b-catalog-content\") pod \"community-operators-sr86x\" (UID: \"74bc700b-741d-4d4c-8758-9c259caa9f4b\") " pod="openshift-marketplace/community-operators-sr86x" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.683708 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.683744 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smrmr\" (UniqueName: \"kubernetes.io/projected/74bc700b-741d-4d4c-8758-9c259caa9f4b-kube-api-access-smrmr\") pod \"community-operators-sr86x\" (UID: \"74bc700b-741d-4d4c-8758-9c259caa9f4b\") " pod="openshift-marketplace/community-operators-sr86x" Oct 03 12:51:56 crc kubenswrapper[4962]: E1003 12:51:56.684093 
4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:57.184075857 +0000 UTC m=+125.587973692 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.693916 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5k7x6" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.793250 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.793465 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smrmr\" (UniqueName: \"kubernetes.io/projected/74bc700b-741d-4d4c-8758-9c259caa9f4b-kube-api-access-smrmr\") pod \"community-operators-sr86x\" (UID: \"74bc700b-741d-4d4c-8758-9c259caa9f4b\") " pod="openshift-marketplace/community-operators-sr86x" Oct 03 12:51:56 crc kubenswrapper[4962]: E1003 12:51:56.793514 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:57.293486739 +0000 UTC m=+125.697384574 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.793550 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74bc700b-741d-4d4c-8758-9c259caa9f4b-utilities\") pod \"community-operators-sr86x\" (UID: \"74bc700b-741d-4d4c-8758-9c259caa9f4b\") " pod="openshift-marketplace/community-operators-sr86x" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.793605 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74bc700b-741d-4d4c-8758-9c259caa9f4b-catalog-content\") pod \"community-operators-sr86x\" (UID: \"74bc700b-741d-4d4c-8758-9c259caa9f4b\") " pod="openshift-marketplace/community-operators-sr86x" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.794069 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74bc700b-741d-4d4c-8758-9c259caa9f4b-catalog-content\") pod \"community-operators-sr86x\" (UID: \"74bc700b-741d-4d4c-8758-9c259caa9f4b\") " pod="openshift-marketplace/community-operators-sr86x" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.794169 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74bc700b-741d-4d4c-8758-9c259caa9f4b-utilities\") pod \"community-operators-sr86x\" (UID: \"74bc700b-741d-4d4c-8758-9c259caa9f4b\") " pod="openshift-marketplace/community-operators-sr86x" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.814998 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.837102 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smrmr\" (UniqueName: \"kubernetes.io/projected/74bc700b-741d-4d4c-8758-9c259caa9f4b-kube-api-access-smrmr\") pod \"community-operators-sr86x\" (UID: \"74bc700b-741d-4d4c-8758-9c259caa9f4b\") " pod="openshift-marketplace/community-operators-sr86x" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.905961 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:56 crc kubenswrapper[4962]: E1003 12:51:56.906740 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:57.406723557 +0000 UTC m=+125.810621392 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.907015 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v9wqz"] Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.952516 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sr86x" Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.975742 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5hcj2" event={"ID":"2edf0825-d4bd-4a22-a65f-be54b0502600","Type":"ContainerStarted","Data":"5af2da8a9f6465b689e36e894ee6578cffbebfc8e79952150a6d8c278481b3ac"} Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.975791 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5hcj2" event={"ID":"2edf0825-d4bd-4a22-a65f-be54b0502600","Type":"ContainerStarted","Data":"f0ef1869ca0ef32892700e8b659f38013fa42bd58d9a561cd52e04957a5ea3f4"} Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.975804 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5hcj2" event={"ID":"2edf0825-d4bd-4a22-a65f-be54b0502600","Type":"ContainerStarted","Data":"22c35c964964673b65032b2d443e37f4ed22bd51de1c1637dc19f11ec9f4615d"} Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.977088 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"cb8d277c-2383-4000-b828-1f3d43dc660b","Type":"ContainerStarted","Data":"edf0d94c255f2afc7721e086cd1e6d817fbfb770e25cc9488264ddc27e49cfd4"} Oct 03 12:51:56 crc kubenswrapper[4962]: I1003 12:51:56.979184 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v9wqz" event={"ID":"f791e711-ff4d-47b5-aa50-efbf71dc1ac2","Type":"ContainerStarted","Data":"f47cfbc0b5455defa626748b5c5c8a86fe98ce950dae0e4d6fe76bd6010660ec"} Oct 03 12:51:57 crc kubenswrapper[4962]: I1003 12:51:57.005112 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-5hcj2" podStartSLOduration=11.005091367 podStartE2EDuration="11.005091367s" podCreationTimestamp="2025-10-03 12:51:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:57.001741305 +0000 UTC m=+125.405639140" watchObservedRunningTime="2025-10-03 12:51:57.005091367 +0000 UTC m=+125.408989202" Oct 03 12:51:57 crc kubenswrapper[4962]: I1003 12:51:57.007540 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:57 crc kubenswrapper[4962]: E1003 12:51:57.008101 4962 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:57.508072849 +0000 UTC m=+125.911970704 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:57 crc kubenswrapper[4962]: I1003 12:51:57.110531 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:57 crc kubenswrapper[4962]: E1003 12:51:57.112604 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:57.612587907 +0000 UTC m=+126.016485742 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:57 crc kubenswrapper[4962]: I1003 12:51:57.167801 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5k7x6"] Oct 03 12:51:57 crc kubenswrapper[4962]: I1003 12:51:57.212743 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:57 crc kubenswrapper[4962]: E1003 12:51:57.213002 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:57.712983123 +0000 UTC m=+126.116880958 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:57 crc kubenswrapper[4962]: I1003 12:51:57.216025 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:57 crc kubenswrapper[4962]: E1003 12:51:57.216386 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:57.716370496 +0000 UTC m=+126.120268331 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:57 crc kubenswrapper[4962]: I1003 12:51:57.243042 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dg2fc"] Oct 03 12:51:57 crc kubenswrapper[4962]: I1003 12:51:57.254875 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sr86x"] Oct 03 12:51:57 crc kubenswrapper[4962]: W1003 12:51:57.256443 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdb2f075_4c59_41c9_b77e_550905415bdb.slice/crio-f4509c3499dcec2d71b57b45d371e8462980948bd4237bc41403da692903ea0b WatchSource:0}: Error finding container f4509c3499dcec2d71b57b45d371e8462980948bd4237bc41403da692903ea0b: Status 404 returned error can't find the container with id f4509c3499dcec2d71b57b45d371e8462980948bd4237bc41403da692903ea0b Oct 03 12:51:57 crc kubenswrapper[4962]: W1003 12:51:57.268795 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74bc700b_741d_4d4c_8758_9c259caa9f4b.slice/crio-9a3e713b316e1364cde9cf594a99f6d7fdd167183adc2978108aa566a5df4efe WatchSource:0}: Error finding container 9a3e713b316e1364cde9cf594a99f6d7fdd167183adc2978108aa566a5df4efe: Status 404 returned error can't find the container with id 9a3e713b316e1364cde9cf594a99f6d7fdd167183adc2978108aa566a5df4efe Oct 03 12:51:57 crc kubenswrapper[4962]: I1003 12:51:57.317087 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:57 
crc kubenswrapper[4962]: E1003 12:51:57.317262 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 12:51:57.817237354 +0000 UTC m=+126.221135189 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:57 crc kubenswrapper[4962]: I1003 12:51:57.317411 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:57 crc kubenswrapper[4962]: E1003 12:51:57.317757 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 12:51:57.817743268 +0000 UTC m=+126.221641103 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6pkx8" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 12:51:57 crc kubenswrapper[4962]: I1003 12:51:57.386700 4962 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-03T12:51:56.396186935Z","Handler":null,"Name":""} Oct 03 12:51:57 crc kubenswrapper[4962]: I1003 12:51:57.398750 4962 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 03 12:51:57 crc kubenswrapper[4962]: I1003 12:51:57.398781 4962 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 03 12:51:57 crc kubenswrapper[4962]: I1003 12:51:57.418104 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 12:51:57 crc kubenswrapper[4962]: I1003 12:51:57.447980 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: 
"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 03 12:51:57 crc kubenswrapper[4962]: I1003 12:51:57.453728 4962 patch_prober.go:28] interesting pod/router-default-5444994796-wlnbb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 12:51:57 crc kubenswrapper[4962]: [-]has-synced failed: reason withheld Oct 03 12:51:57 crc kubenswrapper[4962]: [+]process-running ok Oct 03 12:51:57 crc kubenswrapper[4962]: healthz check failed Oct 03 12:51:57 crc kubenswrapper[4962]: I1003 12:51:57.454078 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wlnbb" podUID="afbd04f3-97c3-46d0-8b5d-17c630f20f42" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 12:51:57 crc kubenswrapper[4962]: I1003 12:51:57.498059 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-s2sdt" Oct 03 12:51:57 crc kubenswrapper[4962]: I1003 12:51:57.502686 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-s2sdt" Oct 03 12:51:57 crc kubenswrapper[4962]: I1003 12:51:57.522124 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:57 crc kubenswrapper[4962]: I1003 12:51:57.525434 4962 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 03 12:51:57 crc kubenswrapper[4962]: I1003 12:51:57.525482 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:57 crc kubenswrapper[4962]: I1003 12:51:57.588190 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6pkx8\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:57 crc kubenswrapper[4962]: I1003 12:51:57.809683 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-t4sln" Oct 03 12:51:57 crc kubenswrapper[4962]: I1003 12:51:57.835206 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 03 12:51:57 crc kubenswrapper[4962]: I1003 12:51:57.844566 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:51:57 crc kubenswrapper[4962]: I1003 12:51:57.986095 4962 generic.go:334] "Generic (PLEG): container finished" podID="f791e711-ff4d-47b5-aa50-efbf71dc1ac2" containerID="6dc0bd5ff72311ef6f7be98020eacb1ed594f4edb80edec86eaad2588e7ab4a0" exitCode=0 Oct 03 12:51:57 crc kubenswrapper[4962]: I1003 12:51:57.986240 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v9wqz" event={"ID":"f791e711-ff4d-47b5-aa50-efbf71dc1ac2","Type":"ContainerDied","Data":"6dc0bd5ff72311ef6f7be98020eacb1ed594f4edb80edec86eaad2588e7ab4a0"} Oct 03 12:51:57 crc kubenswrapper[4962]: I1003 12:51:57.987759 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 12:51:57 crc kubenswrapper[4962]: I1003 12:51:57.988340 4962 generic.go:334] "Generic (PLEG): container finished" podID="a478f682-ff2f-4920-b535-24b1675ce2c7" containerID="f9d7a5e0fa6177245020e0234f9dc8d850b511ad38518ff97edcd25405891500" exitCode=0 Oct 03 12:51:57 crc kubenswrapper[4962]: I1003 12:51:57.988422 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324925-rjxv2" event={"ID":"a478f682-ff2f-4920-b535-24b1675ce2c7","Type":"ContainerDied","Data":"f9d7a5e0fa6177245020e0234f9dc8d850b511ad38518ff97edcd25405891500"} Oct 03 12:51:57 crc kubenswrapper[4962]: I1003 12:51:57.992350 4962 generic.go:334] "Generic (PLEG): container finished" podID="535af756-29c8-4753-bf86-34b327119a7d" containerID="6f6da6dde528c7b86f204d0d8ffff4c317bd461d5bac3ee09ce8eb5df0c9e8a5" exitCode=0 Oct 03 12:51:57 crc kubenswrapper[4962]: I1003 12:51:57.992403 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5k7x6" 
event={"ID":"535af756-29c8-4753-bf86-34b327119a7d","Type":"ContainerDied","Data":"6f6da6dde528c7b86f204d0d8ffff4c317bd461d5bac3ee09ce8eb5df0c9e8a5"} Oct 03 12:51:57 crc kubenswrapper[4962]: I1003 12:51:57.992438 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5k7x6" event={"ID":"535af756-29c8-4753-bf86-34b327119a7d","Type":"ContainerStarted","Data":"931acbbc05c4f5e1ec5e4d2fedfc464a08eac0a85dec5132522b9aceae64a22b"} Oct 03 12:51:57 crc kubenswrapper[4962]: I1003 12:51:57.994404 4962 generic.go:334] "Generic (PLEG): container finished" podID="74bc700b-741d-4d4c-8758-9c259caa9f4b" containerID="3e22fd709d3d76a825998f66fbff822b6f7530a60e7383723453da922f1362d7" exitCode=0 Oct 03 12:51:57 crc kubenswrapper[4962]: I1003 12:51:57.994473 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sr86x" event={"ID":"74bc700b-741d-4d4c-8758-9c259caa9f4b","Type":"ContainerDied","Data":"3e22fd709d3d76a825998f66fbff822b6f7530a60e7383723453da922f1362d7"} Oct 03 12:51:57 crc kubenswrapper[4962]: I1003 12:51:57.994500 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sr86x" event={"ID":"74bc700b-741d-4d4c-8758-9c259caa9f4b","Type":"ContainerStarted","Data":"9a3e713b316e1364cde9cf594a99f6d7fdd167183adc2978108aa566a5df4efe"} Oct 03 12:51:58 crc kubenswrapper[4962]: I1003 12:51:58.024420 4962 generic.go:334] "Generic (PLEG): container finished" podID="fdb2f075-4c59-41c9-b77e-550905415bdb" containerID="2fb5a3624121d07ac440067b6874ccb3b6b32295f5c1f61745ad205f27cc0639" exitCode=0 Oct 03 12:51:58 crc kubenswrapper[4962]: I1003 12:51:58.024549 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dg2fc" event={"ID":"fdb2f075-4c59-41c9-b77e-550905415bdb","Type":"ContainerDied","Data":"2fb5a3624121d07ac440067b6874ccb3b6b32295f5c1f61745ad205f27cc0639"} Oct 03 12:51:58 crc kubenswrapper[4962]: I1003 12:51:58.024584 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dg2fc" event={"ID":"fdb2f075-4c59-41c9-b77e-550905415bdb","Type":"ContainerStarted","Data":"f4509c3499dcec2d71b57b45d371e8462980948bd4237bc41403da692903ea0b"} Oct 03 12:51:58 crc kubenswrapper[4962]: I1003 12:51:58.036070 4962 generic.go:334] "Generic (PLEG): container finished" podID="cb8d277c-2383-4000-b828-1f3d43dc660b" containerID="92dc891c075ba345f29506e07d5cb5a91d909889bc25e04b6db3e60b957f52b5" exitCode=0 Oct 03 12:51:58 crc kubenswrapper[4962]: I1003 12:51:58.037042 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"cb8d277c-2383-4000-b828-1f3d43dc660b","Type":"ContainerDied","Data":"92dc891c075ba345f29506e07d5cb5a91d909889bc25e04b6db3e60b957f52b5"} Oct 03 12:51:58 crc kubenswrapper[4962]: I1003 12:51:58.112188 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6pkx8"] Oct 03 12:51:58 crc kubenswrapper[4962]: I1003 12:51:58.168131 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cgclz"] Oct 03 12:51:58 crc kubenswrapper[4962]: I1003 12:51:58.169701 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cgclz" Oct 03 12:51:58 crc kubenswrapper[4962]: I1003 12:51:58.172064 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 03 12:51:58 crc kubenswrapper[4962]: I1003 12:51:58.178888 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cgclz"] Oct 03 12:51:58 crc kubenswrapper[4962]: I1003 12:51:58.230557 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdnqh\" (UniqueName: \"kubernetes.io/projected/39aab071-afe0-4e3a-b33c-a758f5e1f673-kube-api-access-zdnqh\") pod \"redhat-marketplace-cgclz\" (UID: \"39aab071-afe0-4e3a-b33c-a758f5e1f673\") " pod="openshift-marketplace/redhat-marketplace-cgclz" Oct 03 12:51:58 crc kubenswrapper[4962]: I1003 12:51:58.230602 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39aab071-afe0-4e3a-b33c-a758f5e1f673-catalog-content\") pod \"redhat-marketplace-cgclz\" (UID: \"39aab071-afe0-4e3a-b33c-a758f5e1f673\") " pod="openshift-marketplace/redhat-marketplace-cgclz" Oct 03 12:51:58 crc kubenswrapper[4962]: I1003 12:51:58.230670 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39aab071-afe0-4e3a-b33c-a758f5e1f673-utilities\") pod \"redhat-marketplace-cgclz\" (UID: \"39aab071-afe0-4e3a-b33c-a758f5e1f673\") " pod="openshift-marketplace/redhat-marketplace-cgclz" Oct 03 12:51:58 crc kubenswrapper[4962]: I1003 12:51:58.232945 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 03 12:51:58 crc kubenswrapper[4962]: I1003 12:51:58.332349 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdnqh\" (UniqueName: \"kubernetes.io/projected/39aab071-afe0-4e3a-b33c-a758f5e1f673-kube-api-access-zdnqh\") pod \"redhat-marketplace-cgclz\" (UID: \"39aab071-afe0-4e3a-b33c-a758f5e1f673\") " pod="openshift-marketplace/redhat-marketplace-cgclz" Oct 03 12:51:58 crc kubenswrapper[4962]: I1003 12:51:58.332401 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39aab071-afe0-4e3a-b33c-a758f5e1f673-catalog-content\") pod \"redhat-marketplace-cgclz\" (UID: \"39aab071-afe0-4e3a-b33c-a758f5e1f673\") " pod="openshift-marketplace/redhat-marketplace-cgclz" Oct 03 12:51:58 crc kubenswrapper[4962]: I1003 12:51:58.332443 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39aab071-afe0-4e3a-b33c-a758f5e1f673-utilities\") pod \"redhat-marketplace-cgclz\" (UID: \"39aab071-afe0-4e3a-b33c-a758f5e1f673\") " pod="openshift-marketplace/redhat-marketplace-cgclz" Oct 03 12:51:58 crc kubenswrapper[4962]: I1003 12:51:58.333417 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39aab071-afe0-4e3a-b33c-a758f5e1f673-catalog-content\") pod \"redhat-marketplace-cgclz\" (UID: \"39aab071-afe0-4e3a-b33c-a758f5e1f673\") " pod="openshift-marketplace/redhat-marketplace-cgclz" Oct 03 12:51:58 crc kubenswrapper[4962]: I1003 
12:51:58.333436 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39aab071-afe0-4e3a-b33c-a758f5e1f673-utilities\") pod \"redhat-marketplace-cgclz\" (UID: \"39aab071-afe0-4e3a-b33c-a758f5e1f673\") " pod="openshift-marketplace/redhat-marketplace-cgclz" Oct 03 12:51:58 crc kubenswrapper[4962]: I1003 12:51:58.353408 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdnqh\" (UniqueName: \"kubernetes.io/projected/39aab071-afe0-4e3a-b33c-a758f5e1f673-kube-api-access-zdnqh\") pod \"redhat-marketplace-cgclz\" (UID: \"39aab071-afe0-4e3a-b33c-a758f5e1f673\") " pod="openshift-marketplace/redhat-marketplace-cgclz" Oct 03 12:51:58 crc kubenswrapper[4962]: I1003 12:51:58.454690 4962 patch_prober.go:28] interesting pod/router-default-5444994796-wlnbb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 12:51:58 crc kubenswrapper[4962]: [-]has-synced failed: reason withheld Oct 03 12:51:58 crc kubenswrapper[4962]: [+]process-running ok Oct 03 12:51:58 crc kubenswrapper[4962]: healthz check failed Oct 03 12:51:58 crc kubenswrapper[4962]: I1003 12:51:58.455081 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wlnbb" podUID="afbd04f3-97c3-46d0-8b5d-17c630f20f42" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 12:51:58 crc kubenswrapper[4962]: I1003 12:51:58.526447 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cgclz" Oct 03 12:51:58 crc kubenswrapper[4962]: I1003 12:51:58.566778 4962 patch_prober.go:28] interesting pod/downloads-7954f5f757-gtf6m container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Oct 03 12:51:58 crc kubenswrapper[4962]: I1003 12:51:58.566818 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-gtf6m" podUID="a5971c52-f20d-40a2-9e80-e1c02e83cec0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Oct 03 12:51:58 crc kubenswrapper[4962]: I1003 12:51:58.567072 4962 patch_prober.go:28] interesting pod/downloads-7954f5f757-gtf6m container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Oct 03 12:51:58 crc kubenswrapper[4962]: I1003 12:51:58.567102 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-gtf6m" podUID="a5971c52-f20d-40a2-9e80-e1c02e83cec0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Oct 03 12:51:58 crc kubenswrapper[4962]: I1003 12:51:58.570293 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-csv2p"] Oct 03 12:51:58 crc kubenswrapper[4962]: I1003 12:51:58.571286 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-csv2p" Oct 03 12:51:58 crc kubenswrapper[4962]: I1003 12:51:58.588851 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-csv2p"] Oct 03 12:51:58 crc kubenswrapper[4962]: I1003 12:51:58.636404 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh7s9\" (UniqueName: \"kubernetes.io/projected/a3f948ba-fd31-4599-a860-e2c7deb505f9-kube-api-access-sh7s9\") pod \"redhat-marketplace-csv2p\" (UID: \"a3f948ba-fd31-4599-a860-e2c7deb505f9\") " pod="openshift-marketplace/redhat-marketplace-csv2p" Oct 03 12:51:58 crc kubenswrapper[4962]: I1003 12:51:58.636453 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3f948ba-fd31-4599-a860-e2c7deb505f9-utilities\") pod \"redhat-marketplace-csv2p\" (UID: \"a3f948ba-fd31-4599-a860-e2c7deb505f9\") " pod="openshift-marketplace/redhat-marketplace-csv2p" Oct 03 12:51:58 crc kubenswrapper[4962]: I1003 12:51:58.636488 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3f948ba-fd31-4599-a860-e2c7deb505f9-catalog-content\") pod \"redhat-marketplace-csv2p\" (UID: \"a3f948ba-fd31-4599-a860-e2c7deb505f9\") " pod="openshift-marketplace/redhat-marketplace-csv2p" Oct 03 12:51:58 crc kubenswrapper[4962]: I1003 12:51:58.737566 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh7s9\" (UniqueName: \"kubernetes.io/projected/a3f948ba-fd31-4599-a860-e2c7deb505f9-kube-api-access-sh7s9\") pod \"redhat-marketplace-csv2p\" (UID: \"a3f948ba-fd31-4599-a860-e2c7deb505f9\") " pod="openshift-marketplace/redhat-marketplace-csv2p" Oct 03 12:51:58 crc kubenswrapper[4962]: I1003 12:51:58.738028 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3f948ba-fd31-4599-a860-e2c7deb505f9-utilities\") pod \"redhat-marketplace-csv2p\" (UID: \"a3f948ba-fd31-4599-a860-e2c7deb505f9\") " pod="openshift-marketplace/redhat-marketplace-csv2p" Oct 03 12:51:58 crc kubenswrapper[4962]: I1003 12:51:58.738075 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3f948ba-fd31-4599-a860-e2c7deb505f9-catalog-content\") pod \"redhat-marketplace-csv2p\" (UID: \"a3f948ba-fd31-4599-a860-e2c7deb505f9\") " pod="openshift-marketplace/redhat-marketplace-csv2p" Oct 03 12:51:58 crc kubenswrapper[4962]: I1003 12:51:58.738806 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3f948ba-fd31-4599-a860-e2c7deb505f9-catalog-content\") pod \"redhat-marketplace-csv2p\" (UID: \"a3f948ba-fd31-4599-a860-e2c7deb505f9\") " pod="openshift-marketplace/redhat-marketplace-csv2p" Oct 03 12:51:58 crc kubenswrapper[4962]: I1003 12:51:58.738916 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3f948ba-fd31-4599-a860-e2c7deb505f9-utilities\") pod \"redhat-marketplace-csv2p\" (UID: \"a3f948ba-fd31-4599-a860-e2c7deb505f9\") " pod="openshift-marketplace/redhat-marketplace-csv2p" Oct 03 12:51:58 crc kubenswrapper[4962]: I1003 12:51:58.745775 4962 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrdnq" Oct 03 12:51:58 crc kubenswrapper[4962]: I1003 12:51:58.755179 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrdnq" Oct 03 12:51:58 crc kubenswrapper[4962]: I1003 12:51:58.757750 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh7s9\" (UniqueName: \"kubernetes.io/projected/a3f948ba-fd31-4599-a860-e2c7deb505f9-kube-api-access-sh7s9\") pod \"redhat-marketplace-csv2p\" (UID: \"a3f948ba-fd31-4599-a860-e2c7deb505f9\") " pod="openshift-marketplace/redhat-marketplace-csv2p" Oct 03 12:51:58 crc kubenswrapper[4962]: I1003 12:51:58.767821 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-wgl5v" Oct 03 12:51:58 crc kubenswrapper[4962]: I1003 12:51:58.768098 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-wgl5v" Oct 03 12:51:58 crc kubenswrapper[4962]: I1003 12:51:58.771979 4962 patch_prober.go:28] interesting pod/console-f9d7485db-wgl5v container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.18:8443/health\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Oct 03 12:51:58 crc kubenswrapper[4962]: I1003 12:51:58.772053 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-wgl5v" podUID="796d19ea-1d92-4dcb-9e10-305ddbe1b283" containerName="console" probeResult="failure" output="Get \"https://10.217.0.18:8443/health\": dial tcp 10.217.0.18:8443: connect: connection refused" Oct 03 12:51:58 crc kubenswrapper[4962]: I1003 12:51:58.784355 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cgclz"] Oct 03 12:51:58 crc kubenswrapper[4962]: I1003 12:51:58.889979 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-csv2p" Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.064548 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cgclz" event={"ID":"39aab071-afe0-4e3a-b33c-a758f5e1f673","Type":"ContainerStarted","Data":"0502fa40b6848f345ca906313a192c9bbd54bb706874032f630d5b3ee1281371"} Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.064902 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cgclz" event={"ID":"39aab071-afe0-4e3a-b33c-a758f5e1f673","Type":"ContainerStarted","Data":"02ed057ea3114adeb92c6ea43ffb400900e7536fac9935c15696012b7d0aecac"} Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.068899 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" event={"ID":"2982d523-afe6-4ab4-9778-5dbe578a243b","Type":"ContainerStarted","Data":"7e277bb34bd1c9c960aa98cf63b4ccf9c529a36af5a8e4236cfa8f64eb2013e5"} Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.068951 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" event={"ID":"2982d523-afe6-4ab4-9778-5dbe578a243b","Type":"ContainerStarted","Data":"1dcc140df02def664d0e949095981c1e5d957726c6a05a2eccf4a45e70645268"} Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.155867 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" podStartSLOduration=106.155745461 podStartE2EDuration="1m46.155745461s" podCreationTimestamp="2025-10-03 12:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:51:59.151844394 +0000 UTC m=+127.555742229" watchObservedRunningTime="2025-10-03 12:51:59.155745461 +0000 UTC m=+127.559643296" Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.184597 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-96vmq"] Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.185830 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-96vmq" Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.189966 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.226384 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-96vmq"] Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.248745 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2csgd\" (UniqueName: \"kubernetes.io/projected/965c12da-c517-4aa8-b67e-ddbe916b8578-kube-api-access-2csgd\") pod \"redhat-operators-96vmq\" (UID: \"965c12da-c517-4aa8-b67e-ddbe916b8578\") " pod="openshift-marketplace/redhat-operators-96vmq" Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.248875 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/965c12da-c517-4aa8-b67e-ddbe916b8578-utilities\") pod \"redhat-operators-96vmq\" (UID: \"965c12da-c517-4aa8-b67e-ddbe916b8578\") " pod="openshift-marketplace/redhat-operators-96vmq" Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.248904 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/965c12da-c517-4aa8-b67e-ddbe916b8578-catalog-content\") pod \"redhat-operators-96vmq\" (UID: \"965c12da-c517-4aa8-b67e-ddbe916b8578\") " pod="openshift-marketplace/redhat-operators-96vmq" Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.342419 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-csv2p"] Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.350271 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/965c12da-c517-4aa8-b67e-ddbe916b8578-utilities\") pod \"redhat-operators-96vmq\" (UID: \"965c12da-c517-4aa8-b67e-ddbe916b8578\") " pod="openshift-marketplace/redhat-operators-96vmq" Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.350328 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/965c12da-c517-4aa8-b67e-ddbe916b8578-catalog-content\") pod \"redhat-operators-96vmq\" (UID: \"965c12da-c517-4aa8-b67e-ddbe916b8578\") " pod="openshift-marketplace/redhat-operators-96vmq" Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.350386 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2csgd\" (UniqueName: \"kubernetes.io/projected/965c12da-c517-4aa8-b67e-ddbe916b8578-kube-api-access-2csgd\") pod \"redhat-operators-96vmq\" (UID: \"965c12da-c517-4aa8-b67e-ddbe916b8578\") " pod="openshift-marketplace/redhat-operators-96vmq" Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.350846 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/965c12da-c517-4aa8-b67e-ddbe916b8578-utilities\") pod \"redhat-operators-96vmq\" (UID: \"965c12da-c517-4aa8-b67e-ddbe916b8578\") " pod="openshift-marketplace/redhat-operators-96vmq" Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.351170 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/965c12da-c517-4aa8-b67e-ddbe916b8578-catalog-content\") pod \"redhat-operators-96vmq\" (UID: \"965c12da-c517-4aa8-b67e-ddbe916b8578\") " pod="openshift-marketplace/redhat-operators-96vmq" Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.392743 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2csgd\" (UniqueName: \"kubernetes.io/projected/965c12da-c517-4aa8-b67e-ddbe916b8578-kube-api-access-2csgd\") pod \"redhat-operators-96vmq\" (UID: \"965c12da-c517-4aa8-b67e-ddbe916b8578\") " pod="openshift-marketplace/redhat-operators-96vmq" Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.452780 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-wlnbb" Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.459588 4962 patch_prober.go:28] interesting pod/router-default-5444994796-wlnbb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 12:51:59 crc kubenswrapper[4962]: [-]has-synced failed: reason withheld Oct 03 12:51:59 crc kubenswrapper[4962]: [+]process-running ok Oct 03 12:51:59 crc kubenswrapper[4962]: healthz check failed Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.459790 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wlnbb" podUID="afbd04f3-97c3-46d0-8b5d-17c630f20f42" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.509978 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-96vmq" Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.562026 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324925-rjxv2" Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.600918 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tmjqz"] Oct 03 12:51:59 crc kubenswrapper[4962]: E1003 12:51:59.601111 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a478f682-ff2f-4920-b535-24b1675ce2c7" containerName="collect-profiles" Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.601123 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a478f682-ff2f-4920-b535-24b1675ce2c7" containerName="collect-profiles" Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.601239 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a478f682-ff2f-4920-b535-24b1675ce2c7" containerName="collect-profiles" Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.602447 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tmjqz" Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.612438 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.617866 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tmjqz"] Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.660281 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a478f682-ff2f-4920-b535-24b1675ce2c7-config-volume\") pod \"a478f682-ff2f-4920-b535-24b1675ce2c7\" (UID: \"a478f682-ff2f-4920-b535-24b1675ce2c7\") " Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.660338 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cb8d277c-2383-4000-b828-1f3d43dc660b-kube-api-access\") pod \"cb8d277c-2383-4000-b828-1f3d43dc660b\" (UID: \"cb8d277c-2383-4000-b828-1f3d43dc660b\") " Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.660392 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cb8d277c-2383-4000-b828-1f3d43dc660b-kubelet-dir\") pod \"cb8d277c-2383-4000-b828-1f3d43dc660b\" (UID: \"cb8d277c-2383-4000-b828-1f3d43dc660b\") " Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.660416 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a478f682-ff2f-4920-b535-24b1675ce2c7-secret-volume\") pod \"a478f682-ff2f-4920-b535-24b1675ce2c7\" (UID: \"a478f682-ff2f-4920-b535-24b1675ce2c7\") " Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.660436 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdgvh\" (UniqueName: \"kubernetes.io/projected/a478f682-ff2f-4920-b535-24b1675ce2c7-kube-api-access-bdgvh\") pod \"a478f682-ff2f-4920-b535-24b1675ce2c7\" (UID: \"a478f682-ff2f-4920-b535-24b1675ce2c7\") " Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.660598 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2132e89-491f-4189-8c0f-349ace8209b4-catalog-content\") pod \"redhat-operators-tmjqz\" (UID: \"e2132e89-491f-4189-8c0f-349ace8209b4\") " pod="openshift-marketplace/redhat-operators-tmjqz" Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.660628 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvkdg\" (UniqueName: \"kubernetes.io/projected/e2132e89-491f-4189-8c0f-349ace8209b4-kube-api-access-zvkdg\") pod \"redhat-operators-tmjqz\" (UID: \"e2132e89-491f-4189-8c0f-349ace8209b4\") " pod="openshift-marketplace/redhat-operators-tmjqz" Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.660685 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2132e89-491f-4189-8c0f-349ace8209b4-utilities\") pod \"redhat-operators-tmjqz\" (UID: \"e2132e89-491f-4189-8c0f-349ace8209b4\") " pod="openshift-marketplace/redhat-operators-tmjqz" Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.661926 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cb8d277c-2383-4000-b828-1f3d43dc660b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "cb8d277c-2383-4000-b828-1f3d43dc660b" 
(UID: "cb8d277c-2383-4000-b828-1f3d43dc660b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.663879 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a478f682-ff2f-4920-b535-24b1675ce2c7-config-volume" (OuterVolumeSpecName: "config-volume") pod "a478f682-ff2f-4920-b535-24b1675ce2c7" (UID: "a478f682-ff2f-4920-b535-24b1675ce2c7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.667773 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a478f682-ff2f-4920-b535-24b1675ce2c7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a478f682-ff2f-4920-b535-24b1675ce2c7" (UID: "a478f682-ff2f-4920-b535-24b1675ce2c7"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.668626 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb8d277c-2383-4000-b828-1f3d43dc660b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "cb8d277c-2383-4000-b828-1f3d43dc660b" (UID: "cb8d277c-2383-4000-b828-1f3d43dc660b"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.677701 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a478f682-ff2f-4920-b535-24b1675ce2c7-kube-api-access-bdgvh" (OuterVolumeSpecName: "kube-api-access-bdgvh") pod "a478f682-ff2f-4920-b535-24b1675ce2c7" (UID: "a478f682-ff2f-4920-b535-24b1675ce2c7"). InnerVolumeSpecName "kube-api-access-bdgvh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.762337 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2132e89-491f-4189-8c0f-349ace8209b4-catalog-content\") pod \"redhat-operators-tmjqz\" (UID: \"e2132e89-491f-4189-8c0f-349ace8209b4\") " pod="openshift-marketplace/redhat-operators-tmjqz" Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.763072 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvkdg\" (UniqueName: \"kubernetes.io/projected/e2132e89-491f-4189-8c0f-349ace8209b4-kube-api-access-zvkdg\") pod \"redhat-operators-tmjqz\" (UID: \"e2132e89-491f-4189-8c0f-349ace8209b4\") " pod="openshift-marketplace/redhat-operators-tmjqz" Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.766133 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2132e89-491f-4189-8c0f-349ace8209b4-catalog-content\") pod \"redhat-operators-tmjqz\" (UID: \"e2132e89-491f-4189-8c0f-349ace8209b4\") " pod="openshift-marketplace/redhat-operators-tmjqz" Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.766994 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2132e89-491f-4189-8c0f-349ace8209b4-utilities\") pod \"redhat-operators-tmjqz\" (UID: \"e2132e89-491f-4189-8c0f-349ace8209b4\") " pod="openshift-marketplace/redhat-operators-tmjqz" Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.767469 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2132e89-491f-4189-8c0f-349ace8209b4-utilities\") pod \"redhat-operators-tmjqz\" (UID: \"e2132e89-491f-4189-8c0f-349ace8209b4\") " pod="openshift-marketplace/redhat-operators-tmjqz" Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.767495 4962 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a478f682-ff2f-4920-b535-24b1675ce2c7-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.767742 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cb8d277c-2383-4000-b828-1f3d43dc660b-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.767756 4962 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cb8d277c-2383-4000-b828-1f3d43dc660b-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.767768 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdgvh\" (UniqueName: \"kubernetes.io/projected/a478f682-ff2f-4920-b535-24b1675ce2c7-kube-api-access-bdgvh\") on node \"crc\" DevicePath \"\"" Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.767778 4962 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a478f682-ff2f-4920-b535-24b1675ce2c7-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.799888 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvkdg\" (UniqueName: 
\"kubernetes.io/projected/e2132e89-491f-4189-8c0f-349ace8209b4-kube-api-access-zvkdg\") pod \"redhat-operators-tmjqz\" (UID: \"e2132e89-491f-4189-8c0f-349ace8209b4\") " pod="openshift-marketplace/redhat-operators-tmjqz" Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.859574 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-wdw2z" Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.951957 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tmjqz" Oct 03 12:51:59 crc kubenswrapper[4962]: I1003 12:51:59.967919 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-96vmq"] Oct 03 12:52:00 crc kubenswrapper[4962]: W1003 12:52:00.032488 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod965c12da_c517_4aa8_b67e_ddbe916b8578.slice/crio-ee163d22a67ce5695142055c11e898228a0f09fc32b062f202cd265995a56cfc WatchSource:0}: Error finding container ee163d22a67ce5695142055c11e898228a0f09fc32b062f202cd265995a56cfc: Status 404 returned error can't find the container with id ee163d22a67ce5695142055c11e898228a0f09fc32b062f202cd265995a56cfc Oct 03 12:52:00 crc kubenswrapper[4962]: I1003 12:52:00.090347 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"cb8d277c-2383-4000-b828-1f3d43dc660b","Type":"ContainerDied","Data":"edf0d94c255f2afc7721e086cd1e6d817fbfb770e25cc9488264ddc27e49cfd4"} Oct 03 12:52:00 crc kubenswrapper[4962]: I1003 12:52:00.090989 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="edf0d94c255f2afc7721e086cd1e6d817fbfb770e25cc9488264ddc27e49cfd4" Oct 03 12:52:00 crc kubenswrapper[4962]: I1003 12:52:00.091060 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 12:52:00 crc kubenswrapper[4962]: I1003 12:52:00.117953 4962 generic.go:334] "Generic (PLEG): container finished" podID="a3f948ba-fd31-4599-a860-e2c7deb505f9" containerID="b737e562bf7be11530e11ff91579d10a6179ff33ca3a5c9eb90439171f97a0f0" exitCode=0 Oct 03 12:52:00 crc kubenswrapper[4962]: I1003 12:52:00.118079 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-csv2p" event={"ID":"a3f948ba-fd31-4599-a860-e2c7deb505f9","Type":"ContainerDied","Data":"b737e562bf7be11530e11ff91579d10a6179ff33ca3a5c9eb90439171f97a0f0"} Oct 03 12:52:00 crc kubenswrapper[4962]: I1003 12:52:00.118116 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-csv2p" event={"ID":"a3f948ba-fd31-4599-a860-e2c7deb505f9","Type":"ContainerStarted","Data":"78592bd1da4b8af2fc9bafb43d1555b357d260442215be57e5a01d61b07ed50a"} Oct 03 12:52:00 crc kubenswrapper[4962]: I1003 12:52:00.125219 4962 generic.go:334] "Generic (PLEG): container finished" podID="39aab071-afe0-4e3a-b33c-a758f5e1f673" containerID="0502fa40b6848f345ca906313a192c9bbd54bb706874032f630d5b3ee1281371" exitCode=0 Oct 03 12:52:00 crc kubenswrapper[4962]: I1003 12:52:00.125292 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cgclz" event={"ID":"39aab071-afe0-4e3a-b33c-a758f5e1f673","Type":"ContainerDied","Data":"0502fa40b6848f345ca906313a192c9bbd54bb706874032f630d5b3ee1281371"} Oct 03 12:52:00 crc kubenswrapper[4962]: I1003 12:52:00.132365 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-96vmq" event={"ID":"965c12da-c517-4aa8-b67e-ddbe916b8578","Type":"ContainerStarted","Data":"ee163d22a67ce5695142055c11e898228a0f09fc32b062f202cd265995a56cfc"} Oct 03 12:52:00 crc kubenswrapper[4962]: I1003 12:52:00.149403 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324925-rjxv2" Oct 03 12:52:00 crc kubenswrapper[4962]: I1003 12:52:00.156296 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324925-rjxv2" event={"ID":"a478f682-ff2f-4920-b535-24b1675ce2c7","Type":"ContainerDied","Data":"996d923191875e313654462af3f4d3e943018812f9bf17c9e76404af30d17251"} Oct 03 12:52:00 crc kubenswrapper[4962]: I1003 12:52:00.156377 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="996d923191875e313654462af3f4d3e943018812f9bf17c9e76404af30d17251" Oct 03 12:52:00 crc kubenswrapper[4962]: I1003 12:52:00.156408 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:52:00 crc kubenswrapper[4962]: I1003 12:52:00.372359 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tmjqz"] Oct 03 12:52:00 crc kubenswrapper[4962]: W1003 12:52:00.386312 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2132e89_491f_4189_8c0f_349ace8209b4.slice/crio-7cb8a3acc6a5beef8126e4b2ac1721d921bc7c7f7c246a6ccd3028b1022eacaa WatchSource:0}: Error finding container 7cb8a3acc6a5beef8126e4b2ac1721d921bc7c7f7c246a6ccd3028b1022eacaa: Status 404 returned error can't find the container with id 7cb8a3acc6a5beef8126e4b2ac1721d921bc7c7f7c246a6ccd3028b1022eacaa Oct 03 12:52:00 crc kubenswrapper[4962]: I1003 12:52:00.458444 4962 patch_prober.go:28] interesting pod/router-default-5444994796-wlnbb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 12:52:00 crc kubenswrapper[4962]: [-]has-synced failed: reason withheld Oct 03 12:52:00 crc kubenswrapper[4962]: [+]process-running ok Oct 03 12:52:00 crc kubenswrapper[4962]: healthz check failed Oct 03 12:52:00 crc kubenswrapper[4962]: I1003 12:52:00.458519 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wlnbb" podUID="afbd04f3-97c3-46d0-8b5d-17c630f20f42" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 12:52:01 crc kubenswrapper[4962]: I1003 12:52:01.173785 4962 generic.go:334] "Generic (PLEG): container finished" podID="965c12da-c517-4aa8-b67e-ddbe916b8578" containerID="1a112003ae6ec10c1cf1074b4b425eeb2e424f9c51c5fc3f1988f51a90f1597c" exitCode=0 Oct 03 12:52:01 crc kubenswrapper[4962]: I1003 12:52:01.173863 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-96vmq" event={"ID":"965c12da-c517-4aa8-b67e-ddbe916b8578","Type":"ContainerDied","Data":"1a112003ae6ec10c1cf1074b4b425eeb2e424f9c51c5fc3f1988f51a90f1597c"} Oct 03 12:52:01 crc kubenswrapper[4962]: I1003 12:52:01.176256 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tmjqz" event={"ID":"e2132e89-491f-4189-8c0f-349ace8209b4","Type":"ContainerStarted","Data":"b791e1c9b61331b65e8c2e65b705113d2c826304f939c85e51ea08f603f8eeb4"} Oct 03 12:52:01 crc kubenswrapper[4962]: I1003 12:52:01.176285 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tmjqz" 
event={"ID":"e2132e89-491f-4189-8c0f-349ace8209b4","Type":"ContainerStarted","Data":"7cb8a3acc6a5beef8126e4b2ac1721d921bc7c7f7c246a6ccd3028b1022eacaa"} Oct 03 12:52:01 crc kubenswrapper[4962]: I1003 12:52:01.278154 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 03 12:52:01 crc kubenswrapper[4962]: E1003 12:52:01.278370 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb8d277c-2383-4000-b828-1f3d43dc660b" containerName="pruner" Oct 03 12:52:01 crc kubenswrapper[4962]: I1003 12:52:01.278387 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb8d277c-2383-4000-b828-1f3d43dc660b" containerName="pruner" Oct 03 12:52:01 crc kubenswrapper[4962]: I1003 12:52:01.278501 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb8d277c-2383-4000-b828-1f3d43dc660b" containerName="pruner" Oct 03 12:52:01 crc kubenswrapper[4962]: I1003 12:52:01.279083 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 12:52:01 crc kubenswrapper[4962]: I1003 12:52:01.280731 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 03 12:52:01 crc kubenswrapper[4962]: I1003 12:52:01.281085 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 03 12:52:01 crc kubenswrapper[4962]: I1003 12:52:01.288903 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 03 12:52:01 crc kubenswrapper[4962]: I1003 12:52:01.304472 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a32d82a0-cd3f-4d2c-a0d5-63ec3c102dcb-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"a32d82a0-cd3f-4d2c-a0d5-63ec3c102dcb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 12:52:01 crc kubenswrapper[4962]: I1003 12:52:01.304571 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a32d82a0-cd3f-4d2c-a0d5-63ec3c102dcb-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"a32d82a0-cd3f-4d2c-a0d5-63ec3c102dcb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 12:52:01 crc kubenswrapper[4962]: I1003 12:52:01.405708 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a32d82a0-cd3f-4d2c-a0d5-63ec3c102dcb-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"a32d82a0-cd3f-4d2c-a0d5-63ec3c102dcb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 12:52:01 crc kubenswrapper[4962]: I1003 12:52:01.405811 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a32d82a0-cd3f-4d2c-a0d5-63ec3c102dcb-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"a32d82a0-cd3f-4d2c-a0d5-63ec3c102dcb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 12:52:01 crc kubenswrapper[4962]: I1003 12:52:01.405908 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a32d82a0-cd3f-4d2c-a0d5-63ec3c102dcb-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"a32d82a0-cd3f-4d2c-a0d5-63ec3c102dcb\") " 
pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 12:52:01 crc kubenswrapper[4962]: I1003 12:52:01.448537 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a32d82a0-cd3f-4d2c-a0d5-63ec3c102dcb-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"a32d82a0-cd3f-4d2c-a0d5-63ec3c102dcb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 12:52:01 crc kubenswrapper[4962]: I1003 12:52:01.455570 4962 patch_prober.go:28] interesting pod/router-default-5444994796-wlnbb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 12:52:01 crc kubenswrapper[4962]: [+]has-synced ok Oct 03 12:52:01 crc kubenswrapper[4962]: [+]process-running ok Oct 03 12:52:01 crc kubenswrapper[4962]: healthz check failed Oct 03 12:52:01 crc kubenswrapper[4962]: I1003 12:52:01.455630 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wlnbb" podUID="afbd04f3-97c3-46d0-8b5d-17c630f20f42" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 12:52:01 crc kubenswrapper[4962]: I1003 12:52:01.640712 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 12:52:02 crc kubenswrapper[4962]: I1003 12:52:02.177193 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 03 12:52:02 crc kubenswrapper[4962]: I1003 12:52:02.215219 4962 generic.go:334] "Generic (PLEG): container finished" podID="e2132e89-491f-4189-8c0f-349ace8209b4" containerID="b791e1c9b61331b65e8c2e65b705113d2c826304f939c85e51ea08f603f8eeb4" exitCode=0 Oct 03 12:52:02 crc kubenswrapper[4962]: I1003 12:52:02.216181 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tmjqz" event={"ID":"e2132e89-491f-4189-8c0f-349ace8209b4","Type":"ContainerDied","Data":"b791e1c9b61331b65e8c2e65b705113d2c826304f939c85e51ea08f603f8eeb4"} Oct 03 12:52:02 crc kubenswrapper[4962]: W1003 12:52:02.223106 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda32d82a0_cd3f_4d2c_a0d5_63ec3c102dcb.slice/crio-ea98663899bbbe7e57e2829d489d470a4f928bebe9fac7d310fc29fb354eaf9a WatchSource:0}: Error finding container ea98663899bbbe7e57e2829d489d470a4f928bebe9fac7d310fc29fb354eaf9a: Status 404 returned error can't find the container with id ea98663899bbbe7e57e2829d489d470a4f928bebe9fac7d310fc29fb354eaf9a Oct 03 12:52:02 crc kubenswrapper[4962]: I1003 12:52:02.455282 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-wlnbb" Oct 03 12:52:02 crc kubenswrapper[4962]: I1003 12:52:02.470004 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-wlnbb" Oct 03 12:52:03 crc kubenswrapper[4962]: I1003 12:52:03.238538 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a32d82a0-cd3f-4d2c-a0d5-63ec3c102dcb","Type":"ContainerStarted","Data":"f309d181b9df822917ade0fdb2767a3da25a8372a2fe4ea7b5a759c7c63092c5"} Oct 03 12:52:03 crc kubenswrapper[4962]: I1003 12:52:03.238594 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a32d82a0-cd3f-4d2c-a0d5-63ec3c102dcb","Type":"ContainerStarted","Data":"ea98663899bbbe7e57e2829d489d470a4f928bebe9fac7d310fc29fb354eaf9a"} Oct 03 12:52:03 crc kubenswrapper[4962]: I1003 12:52:03.267711 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.267688743 podStartE2EDuration="2.267688743s" podCreationTimestamp="2025-10-03 12:52:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:52:03.255073257 +0000 UTC m=+131.658971112" watchObservedRunningTime="2025-10-03 12:52:03.267688743 +0000 UTC m=+131.671586578" Oct 03 12:52:04 crc kubenswrapper[4962]: I1003 12:52:04.289877 4962 generic.go:334] "Generic (PLEG): container finished" podID="a32d82a0-cd3f-4d2c-a0d5-63ec3c102dcb" containerID="f309d181b9df822917ade0fdb2767a3da25a8372a2fe4ea7b5a759c7c63092c5" exitCode=0 Oct 03 12:52:04 crc kubenswrapper[4962]: I1003 12:52:04.289936 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a32d82a0-cd3f-4d2c-a0d5-63ec3c102dcb","Type":"ContainerDied","Data":"f309d181b9df822917ade0fdb2767a3da25a8372a2fe4ea7b5a759c7c63092c5"} Oct 03 12:52:04 crc kubenswrapper[4962]: I1003 12:52:04.651811 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-v8m25" Oct 03 12:52:05 crc kubenswrapper[4962]: I1003 12:52:05.634087 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 12:52:08 crc kubenswrapper[4962]: I1003 12:52:08.572504 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-gtf6m" Oct 03 12:52:08 crc kubenswrapper[4962]: I1003 12:52:08.774207 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-wgl5v" Oct 03 12:52:08 crc kubenswrapper[4962]: I1003 12:52:08.779359 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-wgl5v" Oct 03 12:52:13 crc kubenswrapper[4962]: I1003 12:52:13.141827 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 12:52:13 crc kubenswrapper[4962]: I1003 12:52:13.237395 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a32d82a0-cd3f-4d2c-a0d5-63ec3c102dcb-kubelet-dir\") pod \"a32d82a0-cd3f-4d2c-a0d5-63ec3c102dcb\" (UID: \"a32d82a0-cd3f-4d2c-a0d5-63ec3c102dcb\") " Oct 03 12:52:13 crc kubenswrapper[4962]: I1003 12:52:13.237510 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a32d82a0-cd3f-4d2c-a0d5-63ec3c102dcb-kube-api-access\") pod \"a32d82a0-cd3f-4d2c-a0d5-63ec3c102dcb\" (UID: \"a32d82a0-cd3f-4d2c-a0d5-63ec3c102dcb\") " Oct 03 12:52:13 crc kubenswrapper[4962]: I1003 12:52:13.237521 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a32d82a0-cd3f-4d2c-a0d5-63ec3c102dcb-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a32d82a0-cd3f-4d2c-a0d5-63ec3c102dcb" (UID: "a32d82a0-cd3f-4d2c-a0d5-63ec3c102dcb"). 
InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 12:52:13 crc kubenswrapper[4962]: I1003 12:52:13.237963 4962 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a32d82a0-cd3f-4d2c-a0d5-63ec3c102dcb-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 03 12:52:13 crc kubenswrapper[4962]: I1003 12:52:13.242056 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a32d82a0-cd3f-4d2c-a0d5-63ec3c102dcb-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a32d82a0-cd3f-4d2c-a0d5-63ec3c102dcb" (UID: "a32d82a0-cd3f-4d2c-a0d5-63ec3c102dcb"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:52:13 crc kubenswrapper[4962]: I1003 12:52:13.339029 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a32d82a0-cd3f-4d2c-a0d5-63ec3c102dcb-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 12:52:13 crc kubenswrapper[4962]: I1003 12:52:13.349445 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a32d82a0-cd3f-4d2c-a0d5-63ec3c102dcb","Type":"ContainerDied","Data":"ea98663899bbbe7e57e2829d489d470a4f928bebe9fac7d310fc29fb354eaf9a"} Oct 03 12:52:13 crc kubenswrapper[4962]: I1003 12:52:13.349490 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea98663899bbbe7e57e2829d489d470a4f928bebe9fac7d310fc29fb354eaf9a" Oct 03 12:52:13 crc kubenswrapper[4962]: I1003 12:52:13.349551 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 12:52:17 crc kubenswrapper[4962]: I1003 12:52:17.849840 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:52:20 crc kubenswrapper[4962]: I1003 12:52:20.016437 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:52:20 crc kubenswrapper[4962]: I1003 12:52:20.017816 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:52:20 crc kubenswrapper[4962]: I1003 12:52:20.018010 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:52:20 crc kubenswrapper[4962]: I1003 12:52:20.018069 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:52:20 crc kubenswrapper[4962]: I1003 12:52:20.018899 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 03 12:52:20 crc kubenswrapper[4962]: I1003 12:52:20.019980 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 03 12:52:20 crc kubenswrapper[4962]: I1003 12:52:20.020019 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 03 12:52:20 crc kubenswrapper[4962]: I1003 12:52:20.027911 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 03 12:52:20 crc kubenswrapper[4962]: I1003 12:52:20.033349 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:52:20 crc kubenswrapper[4962]: I1003 12:52:20.034088 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:52:20 crc kubenswrapper[4962]: I1003 12:52:20.039928 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:52:20 crc kubenswrapper[4962]: I1003 12:52:20.048215 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 12:52:20 crc kubenswrapper[4962]: I1003 12:52:20.056376 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:52:20 crc kubenswrapper[4962]: I1003 12:52:20.092261 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:52:20 crc kubenswrapper[4962]: I1003 12:52:20.360461 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 12:52:24 crc kubenswrapper[4962]: I1003 12:52:24.660128 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 12:52:24 crc kubenswrapper[4962]: I1003 12:52:24.661282 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 12:52:29 crc kubenswrapper[4962]: I1003 12:52:29.880225 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5csg7" Oct 03 12:52:33 crc kubenswrapper[4962]: E1003 12:52:33.768166 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 03 12:52:33 crc kubenswrapper[4962]: E1003 12:52:33.768653 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bdp4c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-5k7x6_openshift-marketplace(535af756-29c8-4753-bf86-34b327119a7d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 12:52:33 crc kubenswrapper[4962]: E1003 12:52:33.769834 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest 
list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-5k7x6" podUID="535af756-29c8-4753-bf86-34b327119a7d" Oct 03 12:52:35 crc kubenswrapper[4962]: I1003 12:52:35.516770 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f2989e38-d4e7-42c9-8959-f87168a4ac14-metrics-certs\") pod \"network-metrics-daemon-5blzz\" (UID: \"f2989e38-d4e7-42c9-8959-f87168a4ac14\") " pod="openshift-multus/network-metrics-daemon-5blzz" Oct 03 12:52:35 crc kubenswrapper[4962]: I1003 12:52:35.518939 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 03 12:52:35 crc kubenswrapper[4962]: I1003 12:52:35.532969 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f2989e38-d4e7-42c9-8959-f87168a4ac14-metrics-certs\") pod \"network-metrics-daemon-5blzz\" (UID: \"f2989e38-d4e7-42c9-8959-f87168a4ac14\") " pod="openshift-multus/network-metrics-daemon-5blzz" Oct 03 12:52:35 crc kubenswrapper[4962]: I1003 12:52:35.643443 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 03 12:52:35 crc kubenswrapper[4962]: I1003 12:52:35.652352 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5blzz" Oct 03 12:52:37 crc kubenswrapper[4962]: E1003 12:52:37.505264 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-5k7x6" podUID="535af756-29c8-4753-bf86-34b327119a7d" Oct 03 12:52:37 crc kubenswrapper[4962]: E1003 12:52:37.683539 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 03 12:52:37 crc kubenswrapper[4962]: E1003 12:52:37.683715 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zvkdg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-tmjqz_openshift-marketplace(e2132e89-491f-4189-8c0f-349ace8209b4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 12:52:37 crc kubenswrapper[4962]: E1003 12:52:37.684910 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-tmjqz" podUID="e2132e89-491f-4189-8c0f-349ace8209b4" Oct 03 12:52:38 crc kubenswrapper[4962]: E1003 12:52:38.526398 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-tmjqz" podUID="e2132e89-491f-4189-8c0f-349ace8209b4" Oct 03 12:52:38 crc kubenswrapper[4962]: E1003 12:52:38.620539 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 03 12:52:38 crc kubenswrapper[4962]: E1003 12:52:38.620982 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nj6pv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-v9wqz_openshift-marketplace(f791e711-ff4d-47b5-aa50-efbf71dc1ac2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 12:52:38 crc kubenswrapper[4962]: E1003 12:52:38.622983 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-v9wqz" podUID="f791e711-ff4d-47b5-aa50-efbf71dc1ac2" Oct 03 12:52:39 crc kubenswrapper[4962]: E1003 12:52:39.713914 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-v9wqz" podUID="f791e711-ff4d-47b5-aa50-efbf71dc1ac2" Oct 03 12:52:39 crc kubenswrapper[4962]: E1003 12:52:39.781803 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 03 12:52:39 crc kubenswrapper[4962]: E1003 12:52:39.781997 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-smrmr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-sr86x_openshift-marketplace(74bc700b-741d-4d4c-8758-9c259caa9f4b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 12:52:39 crc kubenswrapper[4962]: E1003 12:52:39.785800 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-sr86x" podUID="74bc700b-741d-4d4c-8758-9c259caa9f4b" Oct 03 12:52:39 crc kubenswrapper[4962]: E1003 12:52:39.808663 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 03 12:52:39 crc kubenswrapper[4962]: E1003 12:52:39.808820 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rrl2n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-dg2fc_openshift-marketplace(fdb2f075-4c59-41c9-b77e-550905415bdb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 12:52:39 crc kubenswrapper[4962]: E1003 12:52:39.809950 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-dg2fc" podUID="fdb2f075-4c59-41c9-b77e-550905415bdb" Oct 03 12:52:39 crc kubenswrapper[4962]: E1003 12:52:39.827744 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 03 12:52:39 crc kubenswrapper[4962]: E1003 12:52:39.827886 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2csgd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-96vmq_openshift-marketplace(965c12da-c517-4aa8-b67e-ddbe916b8578): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 12:52:39 crc kubenswrapper[4962]: E1003 12:52:39.829121 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-96vmq" podUID="965c12da-c517-4aa8-b67e-ddbe916b8578" Oct 03 12:52:42 crc kubenswrapper[4962]: E1003 12:52:42.106959 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-sr86x" podUID="74bc700b-741d-4d4c-8758-9c259caa9f4b" Oct 03 12:52:42 crc kubenswrapper[4962]: E1003 12:52:42.107300 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-dg2fc" podUID="fdb2f075-4c59-41c9-b77e-550905415bdb" Oct 03 12:52:42 crc kubenswrapper[4962]: E1003 12:52:42.711272 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 03 12:52:42 crc kubenswrapper[4962]: E1003 12:52:42.711705 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sh7s9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-csv2p_openshift-marketplace(a3f948ba-fd31-4599-a860-e2c7deb505f9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 12:52:42 crc kubenswrapper[4962]: E1003 12:52:42.713048 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-csv2p" podUID="a3f948ba-fd31-4599-a860-e2c7deb505f9" Oct 03 12:52:42 crc kubenswrapper[4962]: E1003 12:52:42.745229 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 03 12:52:42 crc kubenswrapper[4962]: E1003 12:52:42.745479 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zdnqh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-cgclz_openshift-marketplace(39aab071-afe0-4e3a-b33c-a758f5e1f673): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 12:52:42 crc kubenswrapper[4962]: E1003 12:52:42.746783 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-cgclz" podUID="39aab071-afe0-4e3a-b33c-a758f5e1f673" Oct 03 12:52:42 crc kubenswrapper[4962]: I1003 12:52:42.894789 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5blzz"] Oct 03 12:52:43 crc kubenswrapper[4962]: W1003 12:52:43.149304 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-67b7eb79199548f0de497902c0dbc65c15694e97324defed38dc41742e96ce69 WatchSource:0}: Error finding container 67b7eb79199548f0de497902c0dbc65c15694e97324defed38dc41742e96ce69: Status 404 returned error can't find the container with id 67b7eb79199548f0de497902c0dbc65c15694e97324defed38dc41742e96ce69 Oct 03 12:52:43 crc kubenswrapper[4962]: W1003 12:52:43.152496 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-6d69162dc08769756d377791e28d99e6b60db3aaefe79f9599747818979bd681 WatchSource:0}: Error finding container 6d69162dc08769756d377791e28d99e6b60db3aaefe79f9599747818979bd681: Status 404 returned error can't find the container with id 6d69162dc08769756d377791e28d99e6b60db3aaefe79f9599747818979bd681 Oct 03 12:52:43 crc kubenswrapper[4962]: I1003 12:52:43.480122 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"7213d8224643eff8c337cdfa75d36984cd5c956b4cb375d63d206164935728ab"} 
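[Editor's note: the run of records above is dominated by repeated "PullImage from image service failed" / ImagePullBackOff errors for the openshift-marketplace index images, each of which dumps the full init-container spec and is hard to scan by eye. The snippet below is a minimal triage sketch for an excerpt like this one: it tallies pull failures per image and sync errors per pod using only fields visible in these records (image="..." and pod="..."). The file name kubelet.log is an assumption for illustration, not something named in the log; the same first pass could also be done ad hoc with journalctl -u kubelet piped through grep.]

#!/usr/bin/env python3
# Tally image-pull failures in a kubelet journal excerpt.
# Assumption: the journal text above is saved verbatim, one record per
# line, to "kubelet.log"; the path and the two patterns are illustrative
# helpers, not part of the log itself.
import re
import sys
from collections import Counter

# Matches the message and image= field of log.go:32 pull-failure records.
PULL_FAIL = re.compile(r'"PullImage from image service failed".*?image="([^"]+)"')
# Matches the message and pod= field of pod_workers.go:1301 sync-error records.
SYNC_FAIL = re.compile(r'"Error syncing pod, skipping".*?pod="([^"]+)"')

def main(path="kubelet.log"):
    pulls, pods = Counter(), Counter()
    with open(path, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            m = PULL_FAIL.search(line)
            if m:
                pulls[m.group(1)] += 1
            m = SYNC_FAIL.search(line)
            if m:
                pods[m.group(1)] += 1
    print("pull failures per image:")
    for image, n in pulls.most_common():
        print(f"  {n:3d}  {image}")
    print("sync errors per pod:")
    for pod, n in pods.most_common():
        print(f"  {n:3d}  {pod}")

if __name__ == "__main__":
    main(*sys.argv[1:])

[On this excerpt, every pull failure carries the identical "rpc error: code = Canceled ... context canceled" message against registry.redhat.io images, which suggests one shared cause (registry reachability or a cancelled pull context during node start-up) rather than independent per-image problems; the later ContainerStarted records for the same pods show the pulls eventually succeeded.]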
Oct 03 12:52:43 crc kubenswrapper[4962]: I1003 12:52:43.480533 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"6d69162dc08769756d377791e28d99e6b60db3aaefe79f9599747818979bd681"}
Oct 03 12:52:43 crc kubenswrapper[4962]: I1003 12:52:43.480732 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 03 12:52:43 crc kubenswrapper[4962]: I1003 12:52:43.481560 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5blzz" event={"ID":"f2989e38-d4e7-42c9-8959-f87168a4ac14","Type":"ContainerStarted","Data":"80ad144f12477612819995de4d68ff04c69d3b73af034b85f16e4e9c1fcde74e"}
Oct 03 12:52:43 crc kubenswrapper[4962]: I1003 12:52:43.481590 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5blzz" event={"ID":"f2989e38-d4e7-42c9-8959-f87168a4ac14","Type":"ContainerStarted","Data":"19fe02ed1463bf4a508199aaae8517434dccbae3b3a6bf47916df8be31567599"}
Oct 03 12:52:43 crc kubenswrapper[4962]: I1003 12:52:43.481602 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5blzz" event={"ID":"f2989e38-d4e7-42c9-8959-f87168a4ac14","Type":"ContainerStarted","Data":"b4a6648cd3d74cd6ac1c1d3ca96a73caf7440cfaffa6b54525c813ef1c723719"}
Oct 03 12:52:43 crc kubenswrapper[4962]: I1003 12:52:43.482697 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"f987e2f047968b9f6d325ba5cfe8a9a88d1ad9aab9b6788e67837fd0ca6319cf"}
Oct 03 12:52:43 crc kubenswrapper[4962]: I1003 12:52:43.482730 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"67b7eb79199548f0de497902c0dbc65c15694e97324defed38dc41742e96ce69"}
Oct 03 12:52:43 crc kubenswrapper[4962]: I1003 12:52:43.483726 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"bd8a27a8b36c8e3050f6fa2af4ecb3e18c65552c3816defeb16497293c73c292"}
Oct 03 12:52:43 crc kubenswrapper[4962]: I1003 12:52:43.483769 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"111e3df9d37360ae02739dc0372f486a11ea08e6e148c895af273246dbed1a6e"}
Oct 03 12:52:43 crc kubenswrapper[4962]: E1003 12:52:43.485349 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-cgclz" podUID="39aab071-afe0-4e3a-b33c-a758f5e1f673"
Oct 03 12:52:43 crc kubenswrapper[4962]: E1003 12:52:43.485458 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-csv2p" podUID="a3f948ba-fd31-4599-a860-e2c7deb505f9"
Oct 03 12:52:43 crc kubenswrapper[4962]: I1003 12:52:43.528750 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-5blzz" podStartSLOduration=150.528728828 podStartE2EDuration="2m30.528728828s" podCreationTimestamp="2025-10-03 12:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:52:43.526618119 +0000 UTC m=+171.930515964" watchObservedRunningTime="2025-10-03 12:52:43.528728828 +0000 UTC m=+171.932626663"
Oct 03 12:52:49 crc kubenswrapper[4962]: I1003 12:52:49.512857 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5k7x6" event={"ID":"535af756-29c8-4753-bf86-34b327119a7d","Type":"ContainerStarted","Data":"bc18cf0c4c24d7eff8f4f70c33f83efaf22b8b0d3f3ddc8d5cf155bea356ac77"}
Oct 03 12:52:50 crc kubenswrapper[4962]: I1003 12:52:50.532960 4962 generic.go:334] "Generic (PLEG): container finished" podID="535af756-29c8-4753-bf86-34b327119a7d" containerID="bc18cf0c4c24d7eff8f4f70c33f83efaf22b8b0d3f3ddc8d5cf155bea356ac77" exitCode=0
Oct 03 12:52:50 crc kubenswrapper[4962]: I1003 12:52:50.533033 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5k7x6" event={"ID":"535af756-29c8-4753-bf86-34b327119a7d","Type":"ContainerDied","Data":"bc18cf0c4c24d7eff8f4f70c33f83efaf22b8b0d3f3ddc8d5cf155bea356ac77"}
Oct 03 12:52:51 crc kubenswrapper[4962]: I1003 12:52:51.539740 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5k7x6" event={"ID":"535af756-29c8-4753-bf86-34b327119a7d","Type":"ContainerStarted","Data":"ee6718235d5da59546034d0d2428a21d1df62a9a24aa98e8c17c95838766df2a"}
Oct 03 12:52:51 crc kubenswrapper[4962]: I1003 12:52:51.557115 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5k7x6" podStartSLOduration=2.449994831 podStartE2EDuration="55.557095245s" podCreationTimestamp="2025-10-03 12:51:56 +0000 UTC" firstStartedPulling="2025-10-03 12:51:58.000186848 +0000 UTC m=+126.404084703" lastFinishedPulling="2025-10-03 12:52:51.107287282 +0000 UTC m=+179.511185117" observedRunningTime="2025-10-03 12:52:51.552123365 +0000 UTC m=+179.956021200" watchObservedRunningTime="2025-10-03 12:52:51.557095245 +0000 UTC m=+179.960993080"
Oct 03 12:52:52 crc kubenswrapper[4962]: I1003 12:52:52.545447 4962 generic.go:334] "Generic (PLEG): container finished" podID="f791e711-ff4d-47b5-aa50-efbf71dc1ac2" containerID="77523ef451da2de35b189817c1c22c13e90c56869593a7343a6fafc1a5296e48" exitCode=0
Oct 03 12:52:52 crc kubenswrapper[4962]: I1003 12:52:52.545740 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v9wqz" event={"ID":"f791e711-ff4d-47b5-aa50-efbf71dc1ac2","Type":"ContainerDied","Data":"77523ef451da2de35b189817c1c22c13e90c56869593a7343a6fafc1a5296e48"}
Oct 03 12:52:53 crc kubenswrapper[4962]: I1003 12:52:53.553970 4962 generic.go:334] "Generic (PLEG): container finished" podID="e2132e89-491f-4189-8c0f-349ace8209b4" containerID="cee7c61e98fade73c79e30eaf929595b01632aa9f19debef64dffd82e7aff40b" exitCode=0
Oct 03 12:52:53 crc kubenswrapper[4962]: I1003 12:52:53.554184 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tmjqz" event={"ID":"e2132e89-491f-4189-8c0f-349ace8209b4","Type":"ContainerDied","Data":"cee7c61e98fade73c79e30eaf929595b01632aa9f19debef64dffd82e7aff40b"}
Oct 03 12:52:53 crc kubenswrapper[4962]: I1003 12:52:53.556968 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-96vmq" event={"ID":"965c12da-c517-4aa8-b67e-ddbe916b8578","Type":"ContainerStarted","Data":"debd82eb00613953e05a09da639913d73b096b50393c6d742396438e14ddb6a3"}
Oct 03 12:52:53 crc kubenswrapper[4962]: I1003 12:52:53.560842 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v9wqz" event={"ID":"f791e711-ff4d-47b5-aa50-efbf71dc1ac2","Type":"ContainerStarted","Data":"e84c35d2e207ae9b630a638a36337e7a1e7d4136d09b8ce59f7a233dad4cc821"}
Oct 03 12:52:53 crc kubenswrapper[4962]: I1003 12:52:53.592086 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-v9wqz" podStartSLOduration=3.617234172 podStartE2EDuration="58.59206871s" podCreationTimestamp="2025-10-03 12:51:55 +0000 UTC" firstStartedPulling="2025-10-03 12:51:57.987464268 +0000 UTC m=+126.391362103" lastFinishedPulling="2025-10-03 12:52:52.962298806 +0000 UTC m=+181.366196641" observedRunningTime="2025-10-03 12:52:53.591069512 +0000 UTC m=+181.994967347" watchObservedRunningTime="2025-10-03 12:52:53.59206871 +0000 UTC m=+181.995966545"
Oct 03 12:52:53 crc kubenswrapper[4962]: E1003 12:52:53.757448 4962 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod965c12da_c517_4aa8_b67e_ddbe916b8578.slice/crio-debd82eb00613953e05a09da639913d73b096b50393c6d742396438e14ddb6a3.scope\": RecentStats: unable to find data in memory cache]"
Oct 03 12:52:54 crc kubenswrapper[4962]: I1003 12:52:54.567657 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dg2fc" event={"ID":"fdb2f075-4c59-41c9-b77e-550905415bdb","Type":"ContainerStarted","Data":"3670b788ad502734ac6ceffee966a5ecbf4479f1fa125a82e7c6cf13640db9fb"}
Oct 03 12:52:54 crc kubenswrapper[4962]: I1003 12:52:54.568898 4962 generic.go:334] "Generic (PLEG): container finished" podID="965c12da-c517-4aa8-b67e-ddbe916b8578" containerID="debd82eb00613953e05a09da639913d73b096b50393c6d742396438e14ddb6a3" exitCode=0
Oct 03 12:52:54 crc kubenswrapper[4962]: I1003 12:52:54.568919 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-96vmq" event={"ID":"965c12da-c517-4aa8-b67e-ddbe916b8578","Type":"ContainerDied","Data":"debd82eb00613953e05a09da639913d73b096b50393c6d742396438e14ddb6a3"}
Oct 03 12:52:54 crc kubenswrapper[4962]: I1003 12:52:54.659953 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 12:52:54 crc kubenswrapper[4962]: I1003 12:52:54.660230 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 12:52:55 crc kubenswrapper[4962]: I1003 12:52:55.577894 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-96vmq" event={"ID":"965c12da-c517-4aa8-b67e-ddbe916b8578","Type":"ContainerStarted","Data":"c61aa994a99386abdc5e10d99b7825923124280cba6139019c7245c60b14b3db"}
Oct 03 12:52:55 crc kubenswrapper[4962]: I1003 12:52:55.584629 4962 generic.go:334] "Generic (PLEG): container finished" podID="fdb2f075-4c59-41c9-b77e-550905415bdb" containerID="3670b788ad502734ac6ceffee966a5ecbf4479f1fa125a82e7c6cf13640db9fb" exitCode=0
Oct 03 12:52:55 crc kubenswrapper[4962]: I1003 12:52:55.584672 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dg2fc" event={"ID":"fdb2f075-4c59-41c9-b77e-550905415bdb","Type":"ContainerDied","Data":"3670b788ad502734ac6ceffee966a5ecbf4479f1fa125a82e7c6cf13640db9fb"}
Oct 03 12:52:55 crc kubenswrapper[4962]: I1003 12:52:55.587340 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tmjqz" event={"ID":"e2132e89-491f-4189-8c0f-349ace8209b4","Type":"ContainerStarted","Data":"e26e496e28fcb97e6034bd375623b8cc88459950b93f97da9d0c89da369d1b6b"}
Oct 03 12:52:55 crc kubenswrapper[4962]: I1003 12:52:55.600914 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-96vmq" podStartSLOduration=3.528287509 podStartE2EDuration="56.600896759s" podCreationTimestamp="2025-10-03 12:51:59 +0000 UTC" firstStartedPulling="2025-10-03 12:52:02.230001674 +0000 UTC m=+130.633899509" lastFinishedPulling="2025-10-03 12:52:55.302610924 +0000 UTC m=+183.706508759" observedRunningTime="2025-10-03 12:52:55.59811029 +0000 UTC m=+184.002008135" watchObservedRunningTime="2025-10-03 12:52:55.600896759 +0000 UTC m=+184.004794594"
Oct 03 12:52:55 crc kubenswrapper[4962]: I1003 12:52:55.614457 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tmjqz" podStartSLOduration=3.8849845160000003 podStartE2EDuration="56.61444067s" podCreationTimestamp="2025-10-03 12:51:59 +0000 UTC" firstStartedPulling="2025-10-03 12:52:02.218749685 +0000 UTC m=+130.622647520" lastFinishedPulling="2025-10-03 12:52:54.948205839 +0000 UTC m=+183.352103674" observedRunningTime="2025-10-03 12:52:55.612916847 +0000 UTC m=+184.016814682" watchObservedRunningTime="2025-10-03 12:52:55.61444067 +0000 UTC m=+184.018338505"
Oct 03 12:52:56 crc kubenswrapper[4962]: I1003 12:52:56.332232 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-v9wqz"
Oct 03 12:52:56 crc kubenswrapper[4962]: I1003 12:52:56.332572 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-v9wqz"
Oct 03 12:52:56 crc kubenswrapper[4962]: I1003 12:52:56.593688 4962 generic.go:334] "Generic (PLEG): container finished" podID="74bc700b-741d-4d4c-8758-9c259caa9f4b" containerID="2927406bb8c515c8d04728f36a3c076977de51291017a9f3680dd9d392f87f82" exitCode=0
Oct 03 12:52:56 crc kubenswrapper[4962]: I1003 12:52:56.593772 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sr86x" event={"ID":"74bc700b-741d-4d4c-8758-9c259caa9f4b","Type":"ContainerDied","Data":"2927406bb8c515c8d04728f36a3c076977de51291017a9f3680dd9d392f87f82"}
Oct 03 12:52:56 crc kubenswrapper[4962]: I1003 12:52:56.600305 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dg2fc" event={"ID":"fdb2f075-4c59-41c9-b77e-550905415bdb","Type":"ContainerStarted","Data":"826312946b42a026d8f97f543fcc30e5e3d2387c23845c52a1c31a73f6b45777"}
Oct 03 12:52:56 crc kubenswrapper[4962]: I1003 12:52:56.701668 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5k7x6"
Oct 03 12:52:56 crc kubenswrapper[4962]: I1003 12:52:56.703329 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5k7x6"
Oct 03 12:52:56 crc kubenswrapper[4962]: I1003 12:52:56.963128 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-v9wqz"
Oct 03 12:52:56 crc kubenswrapper[4962]: I1003 12:52:56.963680 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5k7x6"
Oct 03 12:52:57 crc kubenswrapper[4962]: I1003 12:52:57.607096 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cgclz" event={"ID":"39aab071-afe0-4e3a-b33c-a758f5e1f673","Type":"ContainerStarted","Data":"6afa30699ba11b3d61f64acc4b35e8cd5e016d5fe94edf161ad09a2aea9522b6"}
Oct 03 12:52:57 crc kubenswrapper[4962]: I1003 12:52:57.627930 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dg2fc" podStartSLOduration=3.658877872 podStartE2EDuration="1m1.627911879s" podCreationTimestamp="2025-10-03 12:51:56 +0000 UTC" firstStartedPulling="2025-10-03 12:51:58.027564979 +0000 UTC m=+126.431462824" lastFinishedPulling="2025-10-03 12:52:55.996598996 +0000 UTC m=+184.400496831" observedRunningTime="2025-10-03 12:52:57.623094683 +0000 UTC m=+186.026992518" watchObservedRunningTime="2025-10-03 12:52:57.627911879 +0000 UTC m=+186.031809714"
Oct 03 12:52:57 crc kubenswrapper[4962]: I1003 12:52:57.647285 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5k7x6"
Oct 03 12:52:58 crc kubenswrapper[4962]: I1003 12:52:58.614361 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sr86x" event={"ID":"74bc700b-741d-4d4c-8758-9c259caa9f4b","Type":"ContainerStarted","Data":"6806ff55952f50dbe2df622b46fb17b4faf64ad7e0fcc1265342e271c899d9a0"}
Oct 03 12:52:58 crc kubenswrapper[4962]: I1003 12:52:58.616578 4962 generic.go:334] "Generic (PLEG): container finished" podID="39aab071-afe0-4e3a-b33c-a758f5e1f673" containerID="6afa30699ba11b3d61f64acc4b35e8cd5e016d5fe94edf161ad09a2aea9522b6" exitCode=0
Oct 03 12:52:58 crc kubenswrapper[4962]: I1003 12:52:58.616652 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cgclz" event={"ID":"39aab071-afe0-4e3a-b33c-a758f5e1f673","Type":"ContainerDied","Data":"6afa30699ba11b3d61f64acc4b35e8cd5e016d5fe94edf161ad09a2aea9522b6"}
Oct 03 12:52:58 crc kubenswrapper[4962]: I1003 12:52:58.636984 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sr86x" podStartSLOduration=2.895102285 podStartE2EDuration="1m2.63696974s" podCreationTimestamp="2025-10-03 12:51:56 +0000 UTC" firstStartedPulling="2025-10-03 12:51:57.996423474 +0000 UTC m=+126.400321309" lastFinishedPulling="2025-10-03 12:52:57.738290929 +0000 UTC m=+186.142188764" observedRunningTime="2025-10-03 12:52:58.635776676 +0000 UTC m=+187.039674511" watchObservedRunningTime="2025-10-03 12:52:58.63696974 +0000 UTC m=+187.040867575"
Oct 03 12:52:58 crc kubenswrapper[4962]: I1003 12:52:58.763397 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5k7x6"]
Oct 03 12:52:59 crc kubenswrapper[4962]: I1003 12:52:59.510572 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-96vmq"
Oct 03 12:52:59 crc kubenswrapper[4962]: I1003 12:52:59.510905 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-96vmq"
Oct 03 12:52:59 crc kubenswrapper[4962]: I1003 12:52:59.623254 4962 generic.go:334] "Generic (PLEG): container finished" podID="a3f948ba-fd31-4599-a860-e2c7deb505f9" containerID="4a17a218405dc1e818c20e8feb0f572b256ccf6c3320ee8eaf6c266e5b63ea63" exitCode=0
Oct 03 12:52:59 crc kubenswrapper[4962]: I1003 12:52:59.623306 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-csv2p" event={"ID":"a3f948ba-fd31-4599-a860-e2c7deb505f9","Type":"ContainerDied","Data":"4a17a218405dc1e818c20e8feb0f572b256ccf6c3320ee8eaf6c266e5b63ea63"}
Oct 03 12:52:59 crc kubenswrapper[4962]: I1003 12:52:59.625049 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5k7x6" podUID="535af756-29c8-4753-bf86-34b327119a7d" containerName="registry-server" containerID="cri-o://ee6718235d5da59546034d0d2428a21d1df62a9a24aa98e8c17c95838766df2a" gracePeriod=2
Oct 03 12:52:59 crc kubenswrapper[4962]: I1003 12:52:59.953146 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tmjqz"
Oct 03 12:52:59 crc kubenswrapper[4962]: I1003 12:52:59.953198 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tmjqz"
Oct 03 12:52:59 crc kubenswrapper[4962]: I1003 12:52:59.985603 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tmjqz"
Oct 03 12:53:00 crc kubenswrapper[4962]: I1003 12:53:00.548188 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-96vmq" podUID="965c12da-c517-4aa8-b67e-ddbe916b8578" containerName="registry-server" probeResult="failure" output=<
Oct 03 12:53:00 crc kubenswrapper[4962]: timeout: failed to connect service ":50051" within 1s
Oct 03 12:53:00 crc kubenswrapper[4962]: >
Oct 03 12:53:00 crc kubenswrapper[4962]: I1003 12:53:00.631625 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-csv2p" event={"ID":"a3f948ba-fd31-4599-a860-e2c7deb505f9","Type":"ContainerStarted","Data":"d98042ee8e0e5647c93876caeae96b4db66ac7e628030627f41f4759f481ae02"}
Oct 03 12:53:00 crc kubenswrapper[4962]: I1003 12:53:00.633294 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cgclz" event={"ID":"39aab071-afe0-4e3a-b33c-a758f5e1f673","Type":"ContainerStarted","Data":"d009286be4e4e5f2783f51df8d07f0808a417d4879efda1975e5f5e68b8c6a81"}
Oct 03 12:53:00 crc kubenswrapper[4962]: I1003 12:53:00.635333 4962 generic.go:334] "Generic (PLEG): container finished" podID="535af756-29c8-4753-bf86-34b327119a7d" containerID="ee6718235d5da59546034d0d2428a21d1df62a9a24aa98e8c17c95838766df2a" exitCode=0
Oct 03 12:53:00 crc kubenswrapper[4962]: I1003 12:53:00.635386 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5k7x6" event={"ID":"535af756-29c8-4753-bf86-34b327119a7d","Type":"ContainerDied","Data":"ee6718235d5da59546034d0d2428a21d1df62a9a24aa98e8c17c95838766df2a"}
Oct 03 12:53:00 crc kubenswrapper[4962]: I1003 12:53:00.649497 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-csv2p" podStartSLOduration=2.274917078 podStartE2EDuration="1m2.649481371s" podCreationTimestamp="2025-10-03 12:51:58 +0000 UTC" firstStartedPulling="2025-10-03 12:52:00.121929128 +0000 UTC m=+128.525826963" lastFinishedPulling="2025-10-03 12:53:00.496493431 +0000 UTC m=+188.900391256" observedRunningTime="2025-10-03 12:53:00.64729457 +0000 UTC m=+189.051192425" watchObservedRunningTime="2025-10-03 12:53:00.649481371 +0000 UTC m=+189.053379206"
Oct 03 12:53:00 crc kubenswrapper[4962]: I1003 12:53:00.666265 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cgclz" podStartSLOduration=3.298957666 podStartE2EDuration="1m2.666247534s" podCreationTimestamp="2025-10-03 12:51:58 +0000 UTC" firstStartedPulling="2025-10-03 12:52:00.129041533 +0000 UTC m=+128.532939368" lastFinishedPulling="2025-10-03 12:52:59.496331401 +0000 UTC m=+187.900229236" observedRunningTime="2025-10-03 12:53:00.664717781 +0000 UTC m=+189.068615626" watchObservedRunningTime="2025-10-03 12:53:00.666247534 +0000 UTC m=+189.070145369"
Oct 03 12:53:00 crc kubenswrapper[4962]: I1003 12:53:00.676142 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tmjqz"
Oct 03 12:53:01 crc kubenswrapper[4962]: I1003 12:53:01.010098 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5k7x6"
Oct 03 12:53:01 crc kubenswrapper[4962]: I1003 12:53:01.045797 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/535af756-29c8-4753-bf86-34b327119a7d-catalog-content\") pod \"535af756-29c8-4753-bf86-34b327119a7d\" (UID: \"535af756-29c8-4753-bf86-34b327119a7d\") "
Oct 03 12:53:01 crc kubenswrapper[4962]: I1003 12:53:01.045872 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/535af756-29c8-4753-bf86-34b327119a7d-utilities\") pod \"535af756-29c8-4753-bf86-34b327119a7d\" (UID: \"535af756-29c8-4753-bf86-34b327119a7d\") "
Oct 03 12:53:01 crc kubenswrapper[4962]: I1003 12:53:01.045913 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdp4c\" (UniqueName: \"kubernetes.io/projected/535af756-29c8-4753-bf86-34b327119a7d-kube-api-access-bdp4c\") pod \"535af756-29c8-4753-bf86-34b327119a7d\" (UID: \"535af756-29c8-4753-bf86-34b327119a7d\") "
Oct 03 12:53:01 crc kubenswrapper[4962]: I1003 12:53:01.046749 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/535af756-29c8-4753-bf86-34b327119a7d-utilities" (OuterVolumeSpecName: "utilities") pod "535af756-29c8-4753-bf86-34b327119a7d" (UID: "535af756-29c8-4753-bf86-34b327119a7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 12:53:01 crc kubenswrapper[4962]: I1003 12:53:01.055673 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/535af756-29c8-4753-bf86-34b327119a7d-kube-api-access-bdp4c" (OuterVolumeSpecName: "kube-api-access-bdp4c") pod "535af756-29c8-4753-bf86-34b327119a7d" (UID: "535af756-29c8-4753-bf86-34b327119a7d"). InnerVolumeSpecName "kube-api-access-bdp4c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 12:53:01 crc kubenswrapper[4962]: I1003 12:53:01.092657 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/535af756-29c8-4753-bf86-34b327119a7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "535af756-29c8-4753-bf86-34b327119a7d" (UID: "535af756-29c8-4753-bf86-34b327119a7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 12:53:01 crc kubenswrapper[4962]: I1003 12:53:01.148168 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/535af756-29c8-4753-bf86-34b327119a7d-utilities\") on node \"crc\" DevicePath \"\""
Oct 03 12:53:01 crc kubenswrapper[4962]: I1003 12:53:01.148205 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdp4c\" (UniqueName: \"kubernetes.io/projected/535af756-29c8-4753-bf86-34b327119a7d-kube-api-access-bdp4c\") on node \"crc\" DevicePath \"\""
Oct 03 12:53:01 crc kubenswrapper[4962]: I1003 12:53:01.148217 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/535af756-29c8-4753-bf86-34b327119a7d-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 03 12:53:01 crc kubenswrapper[4962]: I1003 12:53:01.166092 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tmjqz"]
Oct 03 12:53:01 crc kubenswrapper[4962]: I1003 12:53:01.641924 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5k7x6" event={"ID":"535af756-29c8-4753-bf86-34b327119a7d","Type":"ContainerDied","Data":"931acbbc05c4f5e1ec5e4d2fedfc464a08eac0a85dec5132522b9aceae64a22b"}
Oct 03 12:53:01 crc kubenswrapper[4962]: I1003 12:53:01.641966 4962 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/certified-operators-5k7x6" Oct 03 12:53:01 crc kubenswrapper[4962]: I1003 12:53:01.641998 4962 scope.go:117] "RemoveContainer" containerID="ee6718235d5da59546034d0d2428a21d1df62a9a24aa98e8c17c95838766df2a" Oct 03 12:53:01 crc kubenswrapper[4962]: I1003 12:53:01.659833 4962 scope.go:117] "RemoveContainer" containerID="bc18cf0c4c24d7eff8f4f70c33f83efaf22b8b0d3f3ddc8d5cf155bea356ac77" Oct 03 12:53:01 crc kubenswrapper[4962]: I1003 12:53:01.673290 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5k7x6"] Oct 03 12:53:01 crc kubenswrapper[4962]: I1003 12:53:01.675807 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5k7x6"] Oct 03 12:53:01 crc kubenswrapper[4962]: I1003 12:53:01.684317 4962 scope.go:117] "RemoveContainer" containerID="6f6da6dde528c7b86f204d0d8ffff4c317bd461d5bac3ee09ce8eb5df0c9e8a5" Oct 03 12:53:02 crc kubenswrapper[4962]: I1003 12:53:02.236103 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="535af756-29c8-4753-bf86-34b327119a7d" path="/var/lib/kubelet/pods/535af756-29c8-4753-bf86-34b327119a7d/volumes" Oct 03 12:53:02 crc kubenswrapper[4962]: I1003 12:53:02.648781 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tmjqz" podUID="e2132e89-491f-4189-8c0f-349ace8209b4" containerName="registry-server" containerID="cri-o://e26e496e28fcb97e6034bd375623b8cc88459950b93f97da9d0c89da369d1b6b" gracePeriod=2 Oct 03 12:53:05 crc kubenswrapper[4962]: I1003 12:53:05.663250 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tmjqz_e2132e89-491f-4189-8c0f-349ace8209b4/registry-server/0.log" Oct 03 12:53:05 crc kubenswrapper[4962]: I1003 12:53:05.664828 4962 generic.go:334] "Generic (PLEG): container finished" podID="e2132e89-491f-4189-8c0f-349ace8209b4" containerID="e26e496e28fcb97e6034bd375623b8cc88459950b93f97da9d0c89da369d1b6b" exitCode=137 Oct 03 12:53:05 crc kubenswrapper[4962]: I1003 12:53:05.664886 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tmjqz" event={"ID":"e2132e89-491f-4189-8c0f-349ace8209b4","Type":"ContainerDied","Data":"e26e496e28fcb97e6034bd375623b8cc88459950b93f97da9d0c89da369d1b6b"} Oct 03 12:53:06 crc kubenswrapper[4962]: I1003 12:53:06.360115 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tmjqz_e2132e89-491f-4189-8c0f-349ace8209b4/registry-server/0.log" Oct 03 12:53:06 crc kubenswrapper[4962]: I1003 12:53:06.361125 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tmjqz" Oct 03 12:53:06 crc kubenswrapper[4962]: I1003 12:53:06.382223 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-v9wqz" Oct 03 12:53:06 crc kubenswrapper[4962]: I1003 12:53:06.412539 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2132e89-491f-4189-8c0f-349ace8209b4-utilities\") pod \"e2132e89-491f-4189-8c0f-349ace8209b4\" (UID: \"e2132e89-491f-4189-8c0f-349ace8209b4\") " Oct 03 12:53:06 crc kubenswrapper[4962]: I1003 12:53:06.412615 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvkdg\" (UniqueName: \"kubernetes.io/projected/e2132e89-491f-4189-8c0f-349ace8209b4-kube-api-access-zvkdg\") pod \"e2132e89-491f-4189-8c0f-349ace8209b4\" (UID: \"e2132e89-491f-4189-8c0f-349ace8209b4\") " Oct 03 12:53:06 crc kubenswrapper[4962]: I1003 12:53:06.412672 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2132e89-491f-4189-8c0f-349ace8209b4-catalog-content\") pod \"e2132e89-491f-4189-8c0f-349ace8209b4\" (UID: \"e2132e89-491f-4189-8c0f-349ace8209b4\") " Oct 03 12:53:06 crc kubenswrapper[4962]: I1003 12:53:06.414236 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2132e89-491f-4189-8c0f-349ace8209b4-utilities" (OuterVolumeSpecName: "utilities") pod "e2132e89-491f-4189-8c0f-349ace8209b4" (UID: "e2132e89-491f-4189-8c0f-349ace8209b4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:53:06 crc kubenswrapper[4962]: I1003 12:53:06.428922 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2132e89-491f-4189-8c0f-349ace8209b4-kube-api-access-zvkdg" (OuterVolumeSpecName: "kube-api-access-zvkdg") pod "e2132e89-491f-4189-8c0f-349ace8209b4" (UID: "e2132e89-491f-4189-8c0f-349ace8209b4"). InnerVolumeSpecName "kube-api-access-zvkdg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:53:06 crc kubenswrapper[4962]: I1003 12:53:06.489194 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dg2fc" Oct 03 12:53:06 crc kubenswrapper[4962]: I1003 12:53:06.489256 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dg2fc" Oct 03 12:53:06 crc kubenswrapper[4962]: I1003 12:53:06.514163 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvkdg\" (UniqueName: \"kubernetes.io/projected/e2132e89-491f-4189-8c0f-349ace8209b4-kube-api-access-zvkdg\") on node \"crc\" DevicePath \"\"" Oct 03 12:53:06 crc kubenswrapper[4962]: I1003 12:53:06.514189 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2132e89-491f-4189-8c0f-349ace8209b4-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 12:53:06 crc kubenswrapper[4962]: I1003 12:53:06.545205 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dg2fc" Oct 03 12:53:06 crc kubenswrapper[4962]: I1003 12:53:06.671542 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tmjqz_e2132e89-491f-4189-8c0f-349ace8209b4/registry-server/0.log" Oct 03 12:53:06 crc kubenswrapper[4962]: I1003 12:53:06.673101 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tmjqz" Oct 03 12:53:06 crc kubenswrapper[4962]: I1003 12:53:06.673148 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tmjqz" event={"ID":"e2132e89-491f-4189-8c0f-349ace8209b4","Type":"ContainerDied","Data":"7cb8a3acc6a5beef8126e4b2ac1721d921bc7c7f7c246a6ccd3028b1022eacaa"} Oct 03 12:53:06 crc kubenswrapper[4962]: I1003 12:53:06.673203 4962 scope.go:117] "RemoveContainer" containerID="e26e496e28fcb97e6034bd375623b8cc88459950b93f97da9d0c89da369d1b6b" Oct 03 12:53:06 crc kubenswrapper[4962]: I1003 12:53:06.687444 4962 scope.go:117] "RemoveContainer" containerID="cee7c61e98fade73c79e30eaf929595b01632aa9f19debef64dffd82e7aff40b" Oct 03 12:53:06 crc kubenswrapper[4962]: I1003 12:53:06.699774 4962 scope.go:117] "RemoveContainer" containerID="b791e1c9b61331b65e8c2e65b705113d2c826304f939c85e51ea08f603f8eeb4" Oct 03 12:53:06 crc kubenswrapper[4962]: I1003 12:53:06.710384 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dg2fc" Oct 03 12:53:06 crc kubenswrapper[4962]: I1003 12:53:06.952859 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sr86x" Oct 03 12:53:06 crc kubenswrapper[4962]: I1003 12:53:06.953194 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sr86x" Oct 03 12:53:06 crc kubenswrapper[4962]: I1003 12:53:06.992729 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sr86x" Oct 03 12:53:07 crc kubenswrapper[4962]: I1003 12:53:07.635551 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2132e89-491f-4189-8c0f-349ace8209b4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e2132e89-491f-4189-8c0f-349ace8209b4" (UID: 
"e2132e89-491f-4189-8c0f-349ace8209b4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:53:07 crc kubenswrapper[4962]: I1003 12:53:07.721480 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sr86x" Oct 03 12:53:07 crc kubenswrapper[4962]: I1003 12:53:07.728487 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2132e89-491f-4189-8c0f-349ace8209b4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 12:53:07 crc kubenswrapper[4962]: I1003 12:53:07.917876 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tmjqz"] Oct 03 12:53:07 crc kubenswrapper[4962]: I1003 12:53:07.921818 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tmjqz"] Oct 03 12:53:08 crc kubenswrapper[4962]: I1003 12:53:08.232896 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2132e89-491f-4189-8c0f-349ace8209b4" path="/var/lib/kubelet/pods/e2132e89-491f-4189-8c0f-349ace8209b4/volumes" Oct 03 12:53:08 crc kubenswrapper[4962]: I1003 12:53:08.527298 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cgclz" Oct 03 12:53:08 crc kubenswrapper[4962]: I1003 12:53:08.527394 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cgclz" Oct 03 12:53:08 crc kubenswrapper[4962]: I1003 12:53:08.593287 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cgclz" Oct 03 12:53:08 crc kubenswrapper[4962]: I1003 12:53:08.715970 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cgclz" Oct 03 12:53:08 crc kubenswrapper[4962]: I1003 12:53:08.891785 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-csv2p" Oct 03 12:53:08 crc kubenswrapper[4962]: I1003 12:53:08.891835 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-csv2p" Oct 03 12:53:08 crc kubenswrapper[4962]: I1003 12:53:08.931180 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-csv2p" Oct 03 12:53:09 crc kubenswrapper[4962]: I1003 12:53:09.555593 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-96vmq" Oct 03 12:53:09 crc kubenswrapper[4962]: I1003 12:53:09.595265 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-96vmq" Oct 03 12:53:09 crc kubenswrapper[4962]: I1003 12:53:09.726720 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-csv2p" Oct 03 12:53:09 crc kubenswrapper[4962]: I1003 12:53:09.961367 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sr86x"] Oct 03 12:53:10 crc kubenswrapper[4962]: I1003 12:53:10.690297 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sr86x" podUID="74bc700b-741d-4d4c-8758-9c259caa9f4b" containerName="registry-server" 
containerID="cri-o://6806ff55952f50dbe2df622b46fb17b4faf64ad7e0fcc1265342e271c899d9a0" gracePeriod=2 Oct 03 12:53:12 crc kubenswrapper[4962]: I1003 12:53:12.482227 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sr86x" Oct 03 12:53:12 crc kubenswrapper[4962]: I1003 12:53:12.588316 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74bc700b-741d-4d4c-8758-9c259caa9f4b-utilities\") pod \"74bc700b-741d-4d4c-8758-9c259caa9f4b\" (UID: \"74bc700b-741d-4d4c-8758-9c259caa9f4b\") " Oct 03 12:53:12 crc kubenswrapper[4962]: I1003 12:53:12.588400 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smrmr\" (UniqueName: \"kubernetes.io/projected/74bc700b-741d-4d4c-8758-9c259caa9f4b-kube-api-access-smrmr\") pod \"74bc700b-741d-4d4c-8758-9c259caa9f4b\" (UID: \"74bc700b-741d-4d4c-8758-9c259caa9f4b\") " Oct 03 12:53:12 crc kubenswrapper[4962]: I1003 12:53:12.588442 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74bc700b-741d-4d4c-8758-9c259caa9f4b-catalog-content\") pod \"74bc700b-741d-4d4c-8758-9c259caa9f4b\" (UID: \"74bc700b-741d-4d4c-8758-9c259caa9f4b\") " Oct 03 12:53:12 crc kubenswrapper[4962]: I1003 12:53:12.589418 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74bc700b-741d-4d4c-8758-9c259caa9f4b-utilities" (OuterVolumeSpecName: "utilities") pod "74bc700b-741d-4d4c-8758-9c259caa9f4b" (UID: "74bc700b-741d-4d4c-8758-9c259caa9f4b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:53:12 crc kubenswrapper[4962]: I1003 12:53:12.596830 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74bc700b-741d-4d4c-8758-9c259caa9f4b-kube-api-access-smrmr" (OuterVolumeSpecName: "kube-api-access-smrmr") pod "74bc700b-741d-4d4c-8758-9c259caa9f4b" (UID: "74bc700b-741d-4d4c-8758-9c259caa9f4b"). InnerVolumeSpecName "kube-api-access-smrmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:53:12 crc kubenswrapper[4962]: I1003 12:53:12.632420 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74bc700b-741d-4d4c-8758-9c259caa9f4b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "74bc700b-741d-4d4c-8758-9c259caa9f4b" (UID: "74bc700b-741d-4d4c-8758-9c259caa9f4b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:53:12 crc kubenswrapper[4962]: I1003 12:53:12.689442 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74bc700b-741d-4d4c-8758-9c259caa9f4b-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 12:53:12 crc kubenswrapper[4962]: I1003 12:53:12.689474 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smrmr\" (UniqueName: \"kubernetes.io/projected/74bc700b-741d-4d4c-8758-9c259caa9f4b-kube-api-access-smrmr\") on node \"crc\" DevicePath \"\"" Oct 03 12:53:12 crc kubenswrapper[4962]: I1003 12:53:12.689487 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74bc700b-741d-4d4c-8758-9c259caa9f4b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 12:53:12 crc kubenswrapper[4962]: I1003 12:53:12.700967 4962 generic.go:334] "Generic (PLEG): container finished" podID="74bc700b-741d-4d4c-8758-9c259caa9f4b" containerID="6806ff55952f50dbe2df622b46fb17b4faf64ad7e0fcc1265342e271c899d9a0" exitCode=0 Oct 03 12:53:12 crc kubenswrapper[4962]: I1003 12:53:12.701008 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sr86x" event={"ID":"74bc700b-741d-4d4c-8758-9c259caa9f4b","Type":"ContainerDied","Data":"6806ff55952f50dbe2df622b46fb17b4faf64ad7e0fcc1265342e271c899d9a0"} Oct 03 12:53:12 crc kubenswrapper[4962]: I1003 12:53:12.701040 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sr86x" event={"ID":"74bc700b-741d-4d4c-8758-9c259caa9f4b","Type":"ContainerDied","Data":"9a3e713b316e1364cde9cf594a99f6d7fdd167183adc2978108aa566a5df4efe"} Oct 03 12:53:12 crc kubenswrapper[4962]: I1003 12:53:12.701062 4962 scope.go:117] "RemoveContainer" containerID="6806ff55952f50dbe2df622b46fb17b4faf64ad7e0fcc1265342e271c899d9a0" Oct 03 12:53:12 crc kubenswrapper[4962]: I1003 12:53:12.702410 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sr86x" Oct 03 12:53:12 crc kubenswrapper[4962]: I1003 12:53:12.714360 4962 scope.go:117] "RemoveContainer" containerID="2927406bb8c515c8d04728f36a3c076977de51291017a9f3680dd9d392f87f82" Oct 03 12:53:12 crc kubenswrapper[4962]: I1003 12:53:12.730762 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sr86x"] Oct 03 12:53:12 crc kubenswrapper[4962]: I1003 12:53:12.733814 4962 scope.go:117] "RemoveContainer" containerID="3e22fd709d3d76a825998f66fbff822b6f7530a60e7383723453da922f1362d7" Oct 03 12:53:12 crc kubenswrapper[4962]: I1003 12:53:12.734419 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sr86x"] Oct 03 12:53:12 crc kubenswrapper[4962]: I1003 12:53:12.761663 4962 scope.go:117] "RemoveContainer" containerID="6806ff55952f50dbe2df622b46fb17b4faf64ad7e0fcc1265342e271c899d9a0" Oct 03 12:53:12 crc kubenswrapper[4962]: E1003 12:53:12.762115 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6806ff55952f50dbe2df622b46fb17b4faf64ad7e0fcc1265342e271c899d9a0\": container with ID starting with 6806ff55952f50dbe2df622b46fb17b4faf64ad7e0fcc1265342e271c899d9a0 not found: ID does not exist" containerID="6806ff55952f50dbe2df622b46fb17b4faf64ad7e0fcc1265342e271c899d9a0" Oct 03 12:53:12 crc kubenswrapper[4962]: I1003 12:53:12.762140 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6806ff55952f50dbe2df622b46fb17b4faf64ad7e0fcc1265342e271c899d9a0"} err="failed to get container status \"6806ff55952f50dbe2df622b46fb17b4faf64ad7e0fcc1265342e271c899d9a0\": rpc error: code = NotFound desc = could not find container \"6806ff55952f50dbe2df622b46fb17b4faf64ad7e0fcc1265342e271c899d9a0\": container with ID starting with 6806ff55952f50dbe2df622b46fb17b4faf64ad7e0fcc1265342e271c899d9a0 not found: ID does not exist" Oct 03 12:53:12 crc kubenswrapper[4962]: I1003 12:53:12.762177 4962 scope.go:117] "RemoveContainer" containerID="2927406bb8c515c8d04728f36a3c076977de51291017a9f3680dd9d392f87f82" Oct 03 12:53:12 crc kubenswrapper[4962]: E1003 12:53:12.762407 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2927406bb8c515c8d04728f36a3c076977de51291017a9f3680dd9d392f87f82\": container with ID starting with 2927406bb8c515c8d04728f36a3c076977de51291017a9f3680dd9d392f87f82 not found: ID does not exist" containerID="2927406bb8c515c8d04728f36a3c076977de51291017a9f3680dd9d392f87f82" Oct 03 12:53:12 crc kubenswrapper[4962]: I1003 12:53:12.762432 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2927406bb8c515c8d04728f36a3c076977de51291017a9f3680dd9d392f87f82"} err="failed to get container status \"2927406bb8c515c8d04728f36a3c076977de51291017a9f3680dd9d392f87f82\": rpc error: code = NotFound desc = could not find container \"2927406bb8c515c8d04728f36a3c076977de51291017a9f3680dd9d392f87f82\": container with ID starting with 2927406bb8c515c8d04728f36a3c076977de51291017a9f3680dd9d392f87f82 not found: ID does not exist" Oct 03 12:53:12 crc kubenswrapper[4962]: I1003 12:53:12.762444 4962 scope.go:117] "RemoveContainer" containerID="3e22fd709d3d76a825998f66fbff822b6f7530a60e7383723453da922f1362d7" Oct 03 12:53:12 crc kubenswrapper[4962]: E1003 12:53:12.762858 4962 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"3e22fd709d3d76a825998f66fbff822b6f7530a60e7383723453da922f1362d7\": container with ID starting with 3e22fd709d3d76a825998f66fbff822b6f7530a60e7383723453da922f1362d7 not found: ID does not exist" containerID="3e22fd709d3d76a825998f66fbff822b6f7530a60e7383723453da922f1362d7" Oct 03 12:53:12 crc kubenswrapper[4962]: I1003 12:53:12.762878 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e22fd709d3d76a825998f66fbff822b6f7530a60e7383723453da922f1362d7"} err="failed to get container status \"3e22fd709d3d76a825998f66fbff822b6f7530a60e7383723453da922f1362d7\": rpc error: code = NotFound desc = could not find container \"3e22fd709d3d76a825998f66fbff822b6f7530a60e7383723453da922f1362d7\": container with ID starting with 3e22fd709d3d76a825998f66fbff822b6f7530a60e7383723453da922f1362d7 not found: ID does not exist" Oct 03 12:53:12 crc kubenswrapper[4962]: I1003 12:53:12.959797 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-csv2p"] Oct 03 12:53:12 crc kubenswrapper[4962]: I1003 12:53:12.960386 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-csv2p" podUID="a3f948ba-fd31-4599-a860-e2c7deb505f9" containerName="registry-server" containerID="cri-o://d98042ee8e0e5647c93876caeae96b4db66ac7e628030627f41f4759f481ae02" gracePeriod=2 Oct 03 12:53:14 crc kubenswrapper[4962]: I1003 12:53:14.241648 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74bc700b-741d-4d4c-8758-9c259caa9f4b" path="/var/lib/kubelet/pods/74bc700b-741d-4d4c-8758-9c259caa9f4b/volumes" Oct 03 12:53:14 crc kubenswrapper[4962]: I1003 12:53:14.713184 4962 generic.go:334] "Generic (PLEG): container finished" podID="a3f948ba-fd31-4599-a860-e2c7deb505f9" containerID="d98042ee8e0e5647c93876caeae96b4db66ac7e628030627f41f4759f481ae02" exitCode=0 Oct 03 12:53:14 crc kubenswrapper[4962]: I1003 12:53:14.713229 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-csv2p" event={"ID":"a3f948ba-fd31-4599-a860-e2c7deb505f9","Type":"ContainerDied","Data":"d98042ee8e0e5647c93876caeae96b4db66ac7e628030627f41f4759f481ae02"} Oct 03 12:53:14 crc kubenswrapper[4962]: I1003 12:53:14.822063 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-csv2p" Oct 03 12:53:14 crc kubenswrapper[4962]: I1003 12:53:14.911536 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sh7s9\" (UniqueName: \"kubernetes.io/projected/a3f948ba-fd31-4599-a860-e2c7deb505f9-kube-api-access-sh7s9\") pod \"a3f948ba-fd31-4599-a860-e2c7deb505f9\" (UID: \"a3f948ba-fd31-4599-a860-e2c7deb505f9\") " Oct 03 12:53:14 crc kubenswrapper[4962]: I1003 12:53:14.911706 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3f948ba-fd31-4599-a860-e2c7deb505f9-catalog-content\") pod \"a3f948ba-fd31-4599-a860-e2c7deb505f9\" (UID: \"a3f948ba-fd31-4599-a860-e2c7deb505f9\") " Oct 03 12:53:14 crc kubenswrapper[4962]: I1003 12:53:14.911736 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3f948ba-fd31-4599-a860-e2c7deb505f9-utilities\") pod \"a3f948ba-fd31-4599-a860-e2c7deb505f9\" (UID: \"a3f948ba-fd31-4599-a860-e2c7deb505f9\") " Oct 03 12:53:14 crc kubenswrapper[4962]: I1003 12:53:14.912404 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3f948ba-fd31-4599-a860-e2c7deb505f9-utilities" (OuterVolumeSpecName: "utilities") pod "a3f948ba-fd31-4599-a860-e2c7deb505f9" (UID: "a3f948ba-fd31-4599-a860-e2c7deb505f9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:53:14 crc kubenswrapper[4962]: I1003 12:53:14.916256 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3f948ba-fd31-4599-a860-e2c7deb505f9-kube-api-access-sh7s9" (OuterVolumeSpecName: "kube-api-access-sh7s9") pod "a3f948ba-fd31-4599-a860-e2c7deb505f9" (UID: "a3f948ba-fd31-4599-a860-e2c7deb505f9"). InnerVolumeSpecName "kube-api-access-sh7s9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:53:14 crc kubenswrapper[4962]: I1003 12:53:14.924785 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3f948ba-fd31-4599-a860-e2c7deb505f9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a3f948ba-fd31-4599-a860-e2c7deb505f9" (UID: "a3f948ba-fd31-4599-a860-e2c7deb505f9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:53:15 crc kubenswrapper[4962]: I1003 12:53:15.013547 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sh7s9\" (UniqueName: \"kubernetes.io/projected/a3f948ba-fd31-4599-a860-e2c7deb505f9-kube-api-access-sh7s9\") on node \"crc\" DevicePath \"\"" Oct 03 12:53:15 crc kubenswrapper[4962]: I1003 12:53:15.013590 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3f948ba-fd31-4599-a860-e2c7deb505f9-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 12:53:15 crc kubenswrapper[4962]: I1003 12:53:15.013602 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3f948ba-fd31-4599-a860-e2c7deb505f9-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 12:53:15 crc kubenswrapper[4962]: I1003 12:53:15.722559 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-csv2p" event={"ID":"a3f948ba-fd31-4599-a860-e2c7deb505f9","Type":"ContainerDied","Data":"78592bd1da4b8af2fc9bafb43d1555b357d260442215be57e5a01d61b07ed50a"} Oct 03 12:53:15 crc kubenswrapper[4962]: I1003 12:53:15.722610 4962 scope.go:117] "RemoveContainer" containerID="d98042ee8e0e5647c93876caeae96b4db66ac7e628030627f41f4759f481ae02" Oct 03 12:53:15 crc kubenswrapper[4962]: I1003 12:53:15.722625 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-csv2p" Oct 03 12:53:15 crc kubenswrapper[4962]: I1003 12:53:15.739503 4962 scope.go:117] "RemoveContainer" containerID="4a17a218405dc1e818c20e8feb0f572b256ccf6c3320ee8eaf6c266e5b63ea63" Oct 03 12:53:15 crc kubenswrapper[4962]: I1003 12:53:15.749117 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-csv2p"] Oct 03 12:53:15 crc kubenswrapper[4962]: I1003 12:53:15.751621 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-csv2p"] Oct 03 12:53:15 crc kubenswrapper[4962]: I1003 12:53:15.756871 4962 scope.go:117] "RemoveContainer" containerID="b737e562bf7be11530e11ff91579d10a6179ff33ca3a5c9eb90439171f97a0f0" Oct 03 12:53:16 crc kubenswrapper[4962]: I1003 12:53:16.232966 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3f948ba-fd31-4599-a860-e2c7deb505f9" path="/var/lib/kubelet/pods/a3f948ba-fd31-4599-a860-e2c7deb505f9/volumes" Oct 03 12:53:20 crc kubenswrapper[4962]: I1003 12:53:20.062273 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 12:53:22 crc kubenswrapper[4962]: I1003 12:53:22.238975 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-q777j"] Oct 03 12:53:24 crc kubenswrapper[4962]: I1003 12:53:24.660130 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 12:53:24 crc kubenswrapper[4962]: I1003 12:53:24.661392 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 12:53:24 crc kubenswrapper[4962]: I1003 12:53:24.661521 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-46vck" Oct 03 12:53:24 crc kubenswrapper[4962]: I1003 12:53:24.662251 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1c066a5dc6399e5df82f7425020e84a5354658a5b77b1014a74126539aa1c738"} pod="openshift-machine-config-operator/machine-config-daemon-46vck" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 12:53:24 crc kubenswrapper[4962]: I1003 12:53:24.662402 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" containerID="cri-o://1c066a5dc6399e5df82f7425020e84a5354658a5b77b1014a74126539aa1c738" gracePeriod=600 Oct 03 12:53:25 crc kubenswrapper[4962]: I1003 12:53:25.780820 4962 generic.go:334] "Generic (PLEG): container finished" podID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerID="1c066a5dc6399e5df82f7425020e84a5354658a5b77b1014a74126539aa1c738" exitCode=0 Oct 03 12:53:25 crc kubenswrapper[4962]: I1003 12:53:25.780892 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerDied","Data":"1c066a5dc6399e5df82f7425020e84a5354658a5b77b1014a74126539aa1c738"} Oct 03 12:53:25 crc kubenswrapper[4962]: I1003 12:53:25.781931 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerStarted","Data":"ff1cb175dbd5e3b73b2d393894d44cdfd3e2d2b6ac48ceeab83ef956a3efba04"} Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.262987 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-q777j" podUID="3e3f372c-8948-4d84-aee2-441d77e3201a" containerName="oauth-openshift" containerID="cri-o://506813cd13577fc65e9dc8a3c28804090538ddc490d5e5bda602d1f44da15a2e" gracePeriod=15 Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.588777 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-q777j" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.621607 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-76dcffc8cc-pq59v"] Oct 03 12:53:47 crc kubenswrapper[4962]: E1003 12:53:47.621916 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3f948ba-fd31-4599-a860-e2c7deb505f9" containerName="extract-content" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.621940 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3f948ba-fd31-4599-a860-e2c7deb505f9" containerName="extract-content" Oct 03 12:53:47 crc kubenswrapper[4962]: E1003 12:53:47.621952 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2132e89-491f-4189-8c0f-349ace8209b4" containerName="extract-utilities" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.621960 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2132e89-491f-4189-8c0f-349ace8209b4" containerName="extract-utilities" Oct 03 12:53:47 crc kubenswrapper[4962]: E1003 12:53:47.621973 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="535af756-29c8-4753-bf86-34b327119a7d" containerName="extract-utilities" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.621983 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="535af756-29c8-4753-bf86-34b327119a7d" containerName="extract-utilities" Oct 03 12:53:47 crc kubenswrapper[4962]: E1003 12:53:47.621995 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3f948ba-fd31-4599-a860-e2c7deb505f9" containerName="registry-server" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.622003 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3f948ba-fd31-4599-a860-e2c7deb505f9" containerName="registry-server" Oct 03 12:53:47 crc kubenswrapper[4962]: E1003 12:53:47.622015 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e3f372c-8948-4d84-aee2-441d77e3201a" containerName="oauth-openshift" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.622023 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e3f372c-8948-4d84-aee2-441d77e3201a" containerName="oauth-openshift" Oct 03 12:53:47 crc kubenswrapper[4962]: E1003 12:53:47.622034 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74bc700b-741d-4d4c-8758-9c259caa9f4b" containerName="extract-utilities" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.622042 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="74bc700b-741d-4d4c-8758-9c259caa9f4b" containerName="extract-utilities" Oct 03 12:53:47 crc kubenswrapper[4962]: E1003 12:53:47.622053 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74bc700b-741d-4d4c-8758-9c259caa9f4b" containerName="extract-content" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.622060 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="74bc700b-741d-4d4c-8758-9c259caa9f4b" containerName="extract-content" Oct 03 12:53:47 crc kubenswrapper[4962]: E1003 12:53:47.622071 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2132e89-491f-4189-8c0f-349ace8209b4" containerName="registry-server" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.622079 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2132e89-491f-4189-8c0f-349ace8209b4" containerName="registry-server" Oct 03 12:53:47 crc kubenswrapper[4962]: E1003 12:53:47.622091 4962 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="535af756-29c8-4753-bf86-34b327119a7d" containerName="extract-content" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.622098 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="535af756-29c8-4753-bf86-34b327119a7d" containerName="extract-content" Oct 03 12:53:47 crc kubenswrapper[4962]: E1003 12:53:47.622110 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3f948ba-fd31-4599-a860-e2c7deb505f9" containerName="extract-utilities" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.622118 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3f948ba-fd31-4599-a860-e2c7deb505f9" containerName="extract-utilities" Oct 03 12:53:47 crc kubenswrapper[4962]: E1003 12:53:47.622129 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74bc700b-741d-4d4c-8758-9c259caa9f4b" containerName="registry-server" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.622138 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="74bc700b-741d-4d4c-8758-9c259caa9f4b" containerName="registry-server" Oct 03 12:53:47 crc kubenswrapper[4962]: E1003 12:53:47.622146 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="535af756-29c8-4753-bf86-34b327119a7d" containerName="registry-server" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.622154 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="535af756-29c8-4753-bf86-34b327119a7d" containerName="registry-server" Oct 03 12:53:47 crc kubenswrapper[4962]: E1003 12:53:47.622166 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2132e89-491f-4189-8c0f-349ace8209b4" containerName="extract-content" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.622175 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2132e89-491f-4189-8c0f-349ace8209b4" containerName="extract-content" Oct 03 12:53:47 crc kubenswrapper[4962]: E1003 12:53:47.622184 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a32d82a0-cd3f-4d2c-a0d5-63ec3c102dcb" containerName="pruner" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.622192 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a32d82a0-cd3f-4d2c-a0d5-63ec3c102dcb" containerName="pruner" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.622292 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e3f372c-8948-4d84-aee2-441d77e3201a" containerName="oauth-openshift" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.622306 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="74bc700b-741d-4d4c-8758-9c259caa9f4b" containerName="registry-server" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.622317 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="535af756-29c8-4753-bf86-34b327119a7d" containerName="registry-server" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.622326 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2132e89-491f-4189-8c0f-349ace8209b4" containerName="registry-server" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.622335 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a32d82a0-cd3f-4d2c-a0d5-63ec3c102dcb" containerName="pruner" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.622345 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3f948ba-fd31-4599-a860-e2c7deb505f9" containerName="registry-server" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.627719 
4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-76dcffc8cc-pq59v" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.635748 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-76dcffc8cc-pq59v"] Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.690153 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3e3f372c-8948-4d84-aee2-441d77e3201a-audit-dir\") pod \"3e3f372c-8948-4d84-aee2-441d77e3201a\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.690200 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-system-session\") pod \"3e3f372c-8948-4d84-aee2-441d77e3201a\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.690235 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-system-service-ca\") pod \"3e3f372c-8948-4d84-aee2-441d77e3201a\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.690249 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3e3f372c-8948-4d84-aee2-441d77e3201a-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "3e3f372c-8948-4d84-aee2-441d77e3201a" (UID: "3e3f372c-8948-4d84-aee2-441d77e3201a"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.690279 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-user-template-login\") pod \"3e3f372c-8948-4d84-aee2-441d77e3201a\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.690295 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3e3f372c-8948-4d84-aee2-441d77e3201a-audit-policies\") pod \"3e3f372c-8948-4d84-aee2-441d77e3201a\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.690310 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-user-template-error\") pod \"3e3f372c-8948-4d84-aee2-441d77e3201a\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.690326 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-system-router-certs\") pod \"3e3f372c-8948-4d84-aee2-441d77e3201a\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.690359 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-system-serving-cert\") pod \"3e3f372c-8948-4d84-aee2-441d77e3201a\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.690377 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-system-ocp-branding-template\") pod \"3e3f372c-8948-4d84-aee2-441d77e3201a\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.690399 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-user-idp-0-file-data\") pod \"3e3f372c-8948-4d84-aee2-441d77e3201a\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.690417 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-system-cliconfig\") pod \"3e3f372c-8948-4d84-aee2-441d77e3201a\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.690448 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjfd2\" (UniqueName: \"kubernetes.io/projected/3e3f372c-8948-4d84-aee2-441d77e3201a-kube-api-access-tjfd2\") pod \"3e3f372c-8948-4d84-aee2-441d77e3201a\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.690474 
4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-user-template-provider-selection\") pod \"3e3f372c-8948-4d84-aee2-441d77e3201a\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.690490 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-system-trusted-ca-bundle\") pod \"3e3f372c-8948-4d84-aee2-441d77e3201a\" (UID: \"3e3f372c-8948-4d84-aee2-441d77e3201a\") " Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.690698 4962 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3e3f372c-8948-4d84-aee2-441d77e3201a-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.691154 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "3e3f372c-8948-4d84-aee2-441d77e3201a" (UID: "3e3f372c-8948-4d84-aee2-441d77e3201a"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.691680 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e3f372c-8948-4d84-aee2-441d77e3201a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "3e3f372c-8948-4d84-aee2-441d77e3201a" (UID: "3e3f372c-8948-4d84-aee2-441d77e3201a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.691728 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "3e3f372c-8948-4d84-aee2-441d77e3201a" (UID: "3e3f372c-8948-4d84-aee2-441d77e3201a"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.692154 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "3e3f372c-8948-4d84-aee2-441d77e3201a" (UID: "3e3f372c-8948-4d84-aee2-441d77e3201a"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.699828 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "3e3f372c-8948-4d84-aee2-441d77e3201a" (UID: "3e3f372c-8948-4d84-aee2-441d77e3201a"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.699854 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e3f372c-8948-4d84-aee2-441d77e3201a-kube-api-access-tjfd2" (OuterVolumeSpecName: "kube-api-access-tjfd2") pod "3e3f372c-8948-4d84-aee2-441d77e3201a" (UID: "3e3f372c-8948-4d84-aee2-441d77e3201a"). InnerVolumeSpecName "kube-api-access-tjfd2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.700433 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "3e3f372c-8948-4d84-aee2-441d77e3201a" (UID: "3e3f372c-8948-4d84-aee2-441d77e3201a"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.700613 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "3e3f372c-8948-4d84-aee2-441d77e3201a" (UID: "3e3f372c-8948-4d84-aee2-441d77e3201a"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.700941 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "3e3f372c-8948-4d84-aee2-441d77e3201a" (UID: "3e3f372c-8948-4d84-aee2-441d77e3201a"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.701127 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "3e3f372c-8948-4d84-aee2-441d77e3201a" (UID: "3e3f372c-8948-4d84-aee2-441d77e3201a"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.701557 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "3e3f372c-8948-4d84-aee2-441d77e3201a" (UID: "3e3f372c-8948-4d84-aee2-441d77e3201a"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.701749 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "3e3f372c-8948-4d84-aee2-441d77e3201a" (UID: "3e3f372c-8948-4d84-aee2-441d77e3201a"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.702032 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "3e3f372c-8948-4d84-aee2-441d77e3201a" (UID: "3e3f372c-8948-4d84-aee2-441d77e3201a"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.792083 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2ce4b637-e3d5-40a2-aab5-c70072b40cf5-v4-0-config-user-template-error\") pod \"oauth-openshift-76dcffc8cc-pq59v\" (UID: \"2ce4b637-e3d5-40a2-aab5-c70072b40cf5\") " pod="openshift-authentication/oauth-openshift-76dcffc8cc-pq59v" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.792131 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2ce4b637-e3d5-40a2-aab5-c70072b40cf5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-76dcffc8cc-pq59v\" (UID: \"2ce4b637-e3d5-40a2-aab5-c70072b40cf5\") " pod="openshift-authentication/oauth-openshift-76dcffc8cc-pq59v" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.792163 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2ce4b637-e3d5-40a2-aab5-c70072b40cf5-v4-0-config-user-template-login\") pod \"oauth-openshift-76dcffc8cc-pq59v\" (UID: \"2ce4b637-e3d5-40a2-aab5-c70072b40cf5\") " pod="openshift-authentication/oauth-openshift-76dcffc8cc-pq59v" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.792186 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2ce4b637-e3d5-40a2-aab5-c70072b40cf5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-76dcffc8cc-pq59v\" (UID: \"2ce4b637-e3d5-40a2-aab5-c70072b40cf5\") " pod="openshift-authentication/oauth-openshift-76dcffc8cc-pq59v" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.792208 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2ce4b637-e3d5-40a2-aab5-c70072b40cf5-audit-policies\") pod \"oauth-openshift-76dcffc8cc-pq59v\" (UID: \"2ce4b637-e3d5-40a2-aab5-c70072b40cf5\") " pod="openshift-authentication/oauth-openshift-76dcffc8cc-pq59v" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.792245 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2ce4b637-e3d5-40a2-aab5-c70072b40cf5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-76dcffc8cc-pq59v\" (UID: \"2ce4b637-e3d5-40a2-aab5-c70072b40cf5\") " pod="openshift-authentication/oauth-openshift-76dcffc8cc-pq59v" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.792369 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" 
(UniqueName: \"kubernetes.io/secret/2ce4b637-e3d5-40a2-aab5-c70072b40cf5-v4-0-config-system-session\") pod \"oauth-openshift-76dcffc8cc-pq59v\" (UID: \"2ce4b637-e3d5-40a2-aab5-c70072b40cf5\") " pod="openshift-authentication/oauth-openshift-76dcffc8cc-pq59v" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.792433 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ce4b637-e3d5-40a2-aab5-c70072b40cf5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-76dcffc8cc-pq59v\" (UID: \"2ce4b637-e3d5-40a2-aab5-c70072b40cf5\") " pod="openshift-authentication/oauth-openshift-76dcffc8cc-pq59v" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.792454 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2ce4b637-e3d5-40a2-aab5-c70072b40cf5-v4-0-config-system-service-ca\") pod \"oauth-openshift-76dcffc8cc-pq59v\" (UID: \"2ce4b637-e3d5-40a2-aab5-c70072b40cf5\") " pod="openshift-authentication/oauth-openshift-76dcffc8cc-pq59v" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.792560 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln5w4\" (UniqueName: \"kubernetes.io/projected/2ce4b637-e3d5-40a2-aab5-c70072b40cf5-kube-api-access-ln5w4\") pod \"oauth-openshift-76dcffc8cc-pq59v\" (UID: \"2ce4b637-e3d5-40a2-aab5-c70072b40cf5\") " pod="openshift-authentication/oauth-openshift-76dcffc8cc-pq59v" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.792677 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2ce4b637-e3d5-40a2-aab5-c70072b40cf5-v4-0-config-system-router-certs\") pod \"oauth-openshift-76dcffc8cc-pq59v\" (UID: \"2ce4b637-e3d5-40a2-aab5-c70072b40cf5\") " pod="openshift-authentication/oauth-openshift-76dcffc8cc-pq59v" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.792726 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2ce4b637-e3d5-40a2-aab5-c70072b40cf5-audit-dir\") pod \"oauth-openshift-76dcffc8cc-pq59v\" (UID: \"2ce4b637-e3d5-40a2-aab5-c70072b40cf5\") " pod="openshift-authentication/oauth-openshift-76dcffc8cc-pq59v" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.792771 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2ce4b637-e3d5-40a2-aab5-c70072b40cf5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-76dcffc8cc-pq59v\" (UID: \"2ce4b637-e3d5-40a2-aab5-c70072b40cf5\") " pod="openshift-authentication/oauth-openshift-76dcffc8cc-pq59v" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.792889 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2ce4b637-e3d5-40a2-aab5-c70072b40cf5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-76dcffc8cc-pq59v\" (UID: \"2ce4b637-e3d5-40a2-aab5-c70072b40cf5\") " pod="openshift-authentication/oauth-openshift-76dcffc8cc-pq59v" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.793048 4962 reconciler_common.go:293] "Volume 
detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.793062 4962 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3e3f372c-8948-4d84-aee2-441d77e3201a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.793075 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.793086 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.793097 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.793110 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.793123 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.793133 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.793143 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjfd2\" (UniqueName: \"kubernetes.io/projected/3e3f372c-8948-4d84-aee2-441d77e3201a-kube-api-access-tjfd2\") on node \"crc\" DevicePath \"\"" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.793155 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.793166 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.793180 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.793190 4962 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3e3f372c-8948-4d84-aee2-441d77e3201a-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.890433 4962 generic.go:334] "Generic (PLEG): container finished" podID="3e3f372c-8948-4d84-aee2-441d77e3201a" containerID="506813cd13577fc65e9dc8a3c28804090538ddc490d5e5bda602d1f44da15a2e" exitCode=0 Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.890496 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-q777j" event={"ID":"3e3f372c-8948-4d84-aee2-441d77e3201a","Type":"ContainerDied","Data":"506813cd13577fc65e9dc8a3c28804090538ddc490d5e5bda602d1f44da15a2e"} Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.890530 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-q777j" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.890586 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-q777j" event={"ID":"3e3f372c-8948-4d84-aee2-441d77e3201a","Type":"ContainerDied","Data":"381397d9330b258a906bab5f805296f4bc8d365f616c2793f30b75763fe2bf6c"} Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.890628 4962 scope.go:117] "RemoveContainer" containerID="506813cd13577fc65e9dc8a3c28804090538ddc490d5e5bda602d1f44da15a2e" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.893961 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2ce4b637-e3d5-40a2-aab5-c70072b40cf5-v4-0-config-system-router-certs\") pod \"oauth-openshift-76dcffc8cc-pq59v\" (UID: \"2ce4b637-e3d5-40a2-aab5-c70072b40cf5\") " pod="openshift-authentication/oauth-openshift-76dcffc8cc-pq59v" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.894051 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2ce4b637-e3d5-40a2-aab5-c70072b40cf5-audit-dir\") pod \"oauth-openshift-76dcffc8cc-pq59v\" (UID: \"2ce4b637-e3d5-40a2-aab5-c70072b40cf5\") " pod="openshift-authentication/oauth-openshift-76dcffc8cc-pq59v" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.894121 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2ce4b637-e3d5-40a2-aab5-c70072b40cf5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-76dcffc8cc-pq59v\" (UID: \"2ce4b637-e3d5-40a2-aab5-c70072b40cf5\") " pod="openshift-authentication/oauth-openshift-76dcffc8cc-pq59v" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.894186 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2ce4b637-e3d5-40a2-aab5-c70072b40cf5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-76dcffc8cc-pq59v\" (UID: \"2ce4b637-e3d5-40a2-aab5-c70072b40cf5\") " pod="openshift-authentication/oauth-openshift-76dcffc8cc-pq59v" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.894267 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2ce4b637-e3d5-40a2-aab5-c70072b40cf5-v4-0-config-user-template-error\") pod 
\"oauth-openshift-76dcffc8cc-pq59v\" (UID: \"2ce4b637-e3d5-40a2-aab5-c70072b40cf5\") " pod="openshift-authentication/oauth-openshift-76dcffc8cc-pq59v" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.894319 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2ce4b637-e3d5-40a2-aab5-c70072b40cf5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-76dcffc8cc-pq59v\" (UID: \"2ce4b637-e3d5-40a2-aab5-c70072b40cf5\") " pod="openshift-authentication/oauth-openshift-76dcffc8cc-pq59v" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.894393 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2ce4b637-e3d5-40a2-aab5-c70072b40cf5-v4-0-config-user-template-login\") pod \"oauth-openshift-76dcffc8cc-pq59v\" (UID: \"2ce4b637-e3d5-40a2-aab5-c70072b40cf5\") " pod="openshift-authentication/oauth-openshift-76dcffc8cc-pq59v" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.894461 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2ce4b637-e3d5-40a2-aab5-c70072b40cf5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-76dcffc8cc-pq59v\" (UID: \"2ce4b637-e3d5-40a2-aab5-c70072b40cf5\") " pod="openshift-authentication/oauth-openshift-76dcffc8cc-pq59v" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.894592 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2ce4b637-e3d5-40a2-aab5-c70072b40cf5-audit-policies\") pod \"oauth-openshift-76dcffc8cc-pq59v\" (UID: \"2ce4b637-e3d5-40a2-aab5-c70072b40cf5\") " pod="openshift-authentication/oauth-openshift-76dcffc8cc-pq59v" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.894689 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2ce4b637-e3d5-40a2-aab5-c70072b40cf5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-76dcffc8cc-pq59v\" (UID: \"2ce4b637-e3d5-40a2-aab5-c70072b40cf5\") " pod="openshift-authentication/oauth-openshift-76dcffc8cc-pq59v" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.894753 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2ce4b637-e3d5-40a2-aab5-c70072b40cf5-v4-0-config-system-session\") pod \"oauth-openshift-76dcffc8cc-pq59v\" (UID: \"2ce4b637-e3d5-40a2-aab5-c70072b40cf5\") " pod="openshift-authentication/oauth-openshift-76dcffc8cc-pq59v" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.894811 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ce4b637-e3d5-40a2-aab5-c70072b40cf5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-76dcffc8cc-pq59v\" (UID: \"2ce4b637-e3d5-40a2-aab5-c70072b40cf5\") " pod="openshift-authentication/oauth-openshift-76dcffc8cc-pq59v" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.894881 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2ce4b637-e3d5-40a2-aab5-c70072b40cf5-v4-0-config-system-service-ca\") 
pod \"oauth-openshift-76dcffc8cc-pq59v\" (UID: \"2ce4b637-e3d5-40a2-aab5-c70072b40cf5\") " pod="openshift-authentication/oauth-openshift-76dcffc8cc-pq59v" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.894947 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ln5w4\" (UniqueName: \"kubernetes.io/projected/2ce4b637-e3d5-40a2-aab5-c70072b40cf5-kube-api-access-ln5w4\") pod \"oauth-openshift-76dcffc8cc-pq59v\" (UID: \"2ce4b637-e3d5-40a2-aab5-c70072b40cf5\") " pod="openshift-authentication/oauth-openshift-76dcffc8cc-pq59v" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.895352 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2ce4b637-e3d5-40a2-aab5-c70072b40cf5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-76dcffc8cc-pq59v\" (UID: \"2ce4b637-e3d5-40a2-aab5-c70072b40cf5\") " pod="openshift-authentication/oauth-openshift-76dcffc8cc-pq59v" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.897997 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2ce4b637-e3d5-40a2-aab5-c70072b40cf5-audit-policies\") pod \"oauth-openshift-76dcffc8cc-pq59v\" (UID: \"2ce4b637-e3d5-40a2-aab5-c70072b40cf5\") " pod="openshift-authentication/oauth-openshift-76dcffc8cc-pq59v" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.898158 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2ce4b637-e3d5-40a2-aab5-c70072b40cf5-audit-dir\") pod \"oauth-openshift-76dcffc8cc-pq59v\" (UID: \"2ce4b637-e3d5-40a2-aab5-c70072b40cf5\") " pod="openshift-authentication/oauth-openshift-76dcffc8cc-pq59v" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.898794 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2ce4b637-e3d5-40a2-aab5-c70072b40cf5-v4-0-config-system-service-ca\") pod \"oauth-openshift-76dcffc8cc-pq59v\" (UID: \"2ce4b637-e3d5-40a2-aab5-c70072b40cf5\") " pod="openshift-authentication/oauth-openshift-76dcffc8cc-pq59v" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.898993 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2ce4b637-e3d5-40a2-aab5-c70072b40cf5-v4-0-config-user-template-login\") pod \"oauth-openshift-76dcffc8cc-pq59v\" (UID: \"2ce4b637-e3d5-40a2-aab5-c70072b40cf5\") " pod="openshift-authentication/oauth-openshift-76dcffc8cc-pq59v" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.899297 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2ce4b637-e3d5-40a2-aab5-c70072b40cf5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-76dcffc8cc-pq59v\" (UID: \"2ce4b637-e3d5-40a2-aab5-c70072b40cf5\") " pod="openshift-authentication/oauth-openshift-76dcffc8cc-pq59v" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.900213 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2ce4b637-e3d5-40a2-aab5-c70072b40cf5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-76dcffc8cc-pq59v\" (UID: \"2ce4b637-e3d5-40a2-aab5-c70072b40cf5\") " pod="openshift-authentication/oauth-openshift-76dcffc8cc-pq59v" 
Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.900943 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2ce4b637-e3d5-40a2-aab5-c70072b40cf5-v4-0-config-system-session\") pod \"oauth-openshift-76dcffc8cc-pq59v\" (UID: \"2ce4b637-e3d5-40a2-aab5-c70072b40cf5\") " pod="openshift-authentication/oauth-openshift-76dcffc8cc-pq59v" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.902006 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2ce4b637-e3d5-40a2-aab5-c70072b40cf5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-76dcffc8cc-pq59v\" (UID: \"2ce4b637-e3d5-40a2-aab5-c70072b40cf5\") " pod="openshift-authentication/oauth-openshift-76dcffc8cc-pq59v" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.902032 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ce4b637-e3d5-40a2-aab5-c70072b40cf5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-76dcffc8cc-pq59v\" (UID: \"2ce4b637-e3d5-40a2-aab5-c70072b40cf5\") " pod="openshift-authentication/oauth-openshift-76dcffc8cc-pq59v" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.905594 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2ce4b637-e3d5-40a2-aab5-c70072b40cf5-v4-0-config-system-router-certs\") pod \"oauth-openshift-76dcffc8cc-pq59v\" (UID: \"2ce4b637-e3d5-40a2-aab5-c70072b40cf5\") " pod="openshift-authentication/oauth-openshift-76dcffc8cc-pq59v" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.905900 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2ce4b637-e3d5-40a2-aab5-c70072b40cf5-v4-0-config-user-template-error\") pod \"oauth-openshift-76dcffc8cc-pq59v\" (UID: \"2ce4b637-e3d5-40a2-aab5-c70072b40cf5\") " pod="openshift-authentication/oauth-openshift-76dcffc8cc-pq59v" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.911017 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2ce4b637-e3d5-40a2-aab5-c70072b40cf5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-76dcffc8cc-pq59v\" (UID: \"2ce4b637-e3d5-40a2-aab5-c70072b40cf5\") " pod="openshift-authentication/oauth-openshift-76dcffc8cc-pq59v" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.915346 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln5w4\" (UniqueName: \"kubernetes.io/projected/2ce4b637-e3d5-40a2-aab5-c70072b40cf5-kube-api-access-ln5w4\") pod \"oauth-openshift-76dcffc8cc-pq59v\" (UID: \"2ce4b637-e3d5-40a2-aab5-c70072b40cf5\") " pod="openshift-authentication/oauth-openshift-76dcffc8cc-pq59v" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.932602 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-q777j"] Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.934853 4962 scope.go:117] "RemoveContainer" containerID="506813cd13577fc65e9dc8a3c28804090538ddc490d5e5bda602d1f44da15a2e" Oct 03 12:53:47 crc kubenswrapper[4962]: E1003 12:53:47.935573 4962 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"506813cd13577fc65e9dc8a3c28804090538ddc490d5e5bda602d1f44da15a2e\": container with ID starting with 506813cd13577fc65e9dc8a3c28804090538ddc490d5e5bda602d1f44da15a2e not found: ID does not exist" containerID="506813cd13577fc65e9dc8a3c28804090538ddc490d5e5bda602d1f44da15a2e" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.935648 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"506813cd13577fc65e9dc8a3c28804090538ddc490d5e5bda602d1f44da15a2e"} err="failed to get container status \"506813cd13577fc65e9dc8a3c28804090538ddc490d5e5bda602d1f44da15a2e\": rpc error: code = NotFound desc = could not find container \"506813cd13577fc65e9dc8a3c28804090538ddc490d5e5bda602d1f44da15a2e\": container with ID starting with 506813cd13577fc65e9dc8a3c28804090538ddc490d5e5bda602d1f44da15a2e not found: ID does not exist" Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.942439 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-q777j"] Oct 03 12:53:47 crc kubenswrapper[4962]: I1003 12:53:47.945276 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-76dcffc8cc-pq59v" Oct 03 12:53:48 crc kubenswrapper[4962]: I1003 12:53:48.156683 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-76dcffc8cc-pq59v"] Oct 03 12:53:48 crc kubenswrapper[4962]: I1003 12:53:48.240331 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e3f372c-8948-4d84-aee2-441d77e3201a" path="/var/lib/kubelet/pods/3e3f372c-8948-4d84-aee2-441d77e3201a/volumes" Oct 03 12:53:48 crc kubenswrapper[4962]: I1003 12:53:48.896539 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-76dcffc8cc-pq59v" event={"ID":"2ce4b637-e3d5-40a2-aab5-c70072b40cf5","Type":"ContainerStarted","Data":"dc2209ad2dcd0407218932a48f55ae69bae8b338944ebcb6d6fb2698080f1cdb"} Oct 03 12:53:48 crc kubenswrapper[4962]: I1003 12:53:48.896901 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-76dcffc8cc-pq59v" event={"ID":"2ce4b637-e3d5-40a2-aab5-c70072b40cf5","Type":"ContainerStarted","Data":"3a0e62b6dc66d9720a4d193c0b10c9dac7bd669db0ac13bc74288c2c1a797168"} Oct 03 12:53:48 crc kubenswrapper[4962]: I1003 12:53:48.896933 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-76dcffc8cc-pq59v" Oct 03 12:53:48 crc kubenswrapper[4962]: I1003 12:53:48.902342 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-76dcffc8cc-pq59v" Oct 03 12:53:48 crc kubenswrapper[4962]: I1003 12:53:48.919222 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-76dcffc8cc-pq59v" podStartSLOduration=26.919193008 podStartE2EDuration="26.919193008s" podCreationTimestamp="2025-10-03 12:53:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:53:48.912971547 +0000 UTC m=+237.316869402" watchObservedRunningTime="2025-10-03 12:53:48.919193008 +0000 UTC m=+237.323090863" Oct 03 12:55:02 crc kubenswrapper[4962]: I1003 12:55:02.988072 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-v9wqz"] Oct 03 12:55:02 crc kubenswrapper[4962]: I1003 12:55:02.992451 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-v9wqz" podUID="f791e711-ff4d-47b5-aa50-efbf71dc1ac2" containerName="registry-server" containerID="cri-o://e84c35d2e207ae9b630a638a36337e7a1e7d4136d09b8ce59f7a233dad4cc821" gracePeriod=30 Oct 03 12:55:02 crc kubenswrapper[4962]: I1003 12:55:02.995899 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dg2fc"] Oct 03 12:55:02 crc kubenswrapper[4962]: I1003 12:55:02.996131 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dg2fc" podUID="fdb2f075-4c59-41c9-b77e-550905415bdb" containerName="registry-server" containerID="cri-o://826312946b42a026d8f97f543fcc30e5e3d2387c23845c52a1c31a73f6b45777" gracePeriod=30 Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.005188 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wdw2z"] Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.005430 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-wdw2z" podUID="cb5bb7ba-79c7-4251-8025-68e5c9997447" containerName="marketplace-operator" containerID="cri-o://363465b4562fc3c04e8a0d9e73af449ae5672e4ec239659e98559860970a81f7" gracePeriod=30 Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.024040 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cgclz"] Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.024320 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cgclz" podUID="39aab071-afe0-4e3a-b33c-a758f5e1f673" containerName="registry-server" containerID="cri-o://d009286be4e4e5f2783f51df8d07f0808a417d4879efda1975e5f5e68b8c6a81" gracePeriod=30 Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.035539 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-snkcz"] Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.036305 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-snkcz" Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.041009 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-96vmq"] Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.041314 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-96vmq" podUID="965c12da-c517-4aa8-b67e-ddbe916b8578" containerName="registry-server" containerID="cri-o://c61aa994a99386abdc5e10d99b7825923124280cba6139019c7245c60b14b3db" gracePeriod=30 Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.052386 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-snkcz"] Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.115324 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9908d2b4-0cf3-4635-855c-39eb963de62c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-snkcz\" (UID: \"9908d2b4-0cf3-4635-855c-39eb963de62c\") " pod="openshift-marketplace/marketplace-operator-79b997595-snkcz" Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.115393 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9908d2b4-0cf3-4635-855c-39eb963de62c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-snkcz\" (UID: \"9908d2b4-0cf3-4635-855c-39eb963de62c\") " pod="openshift-marketplace/marketplace-operator-79b997595-snkcz" Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.115460 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqjfl\" (UniqueName: \"kubernetes.io/projected/9908d2b4-0cf3-4635-855c-39eb963de62c-kube-api-access-cqjfl\") pod \"marketplace-operator-79b997595-snkcz\" (UID: \"9908d2b4-0cf3-4635-855c-39eb963de62c\") " pod="openshift-marketplace/marketplace-operator-79b997595-snkcz" Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.216561 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9908d2b4-0cf3-4635-855c-39eb963de62c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-snkcz\" (UID: \"9908d2b4-0cf3-4635-855c-39eb963de62c\") " pod="openshift-marketplace/marketplace-operator-79b997595-snkcz" Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.216794 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9908d2b4-0cf3-4635-855c-39eb963de62c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-snkcz\" (UID: \"9908d2b4-0cf3-4635-855c-39eb963de62c\") " pod="openshift-marketplace/marketplace-operator-79b997595-snkcz" Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.216930 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqjfl\" (UniqueName: \"kubernetes.io/projected/9908d2b4-0cf3-4635-855c-39eb963de62c-kube-api-access-cqjfl\") pod \"marketplace-operator-79b997595-snkcz\" (UID: \"9908d2b4-0cf3-4635-855c-39eb963de62c\") " pod="openshift-marketplace/marketplace-operator-79b997595-snkcz" Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.218286 4962 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9908d2b4-0cf3-4635-855c-39eb963de62c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-snkcz\" (UID: \"9908d2b4-0cf3-4635-855c-39eb963de62c\") " pod="openshift-marketplace/marketplace-operator-79b997595-snkcz" Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.222117 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9908d2b4-0cf3-4635-855c-39eb963de62c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-snkcz\" (UID: \"9908d2b4-0cf3-4635-855c-39eb963de62c\") " pod="openshift-marketplace/marketplace-operator-79b997595-snkcz" Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.235015 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqjfl\" (UniqueName: \"kubernetes.io/projected/9908d2b4-0cf3-4635-855c-39eb963de62c-kube-api-access-cqjfl\") pod \"marketplace-operator-79b997595-snkcz\" (UID: \"9908d2b4-0cf3-4635-855c-39eb963de62c\") " pod="openshift-marketplace/marketplace-operator-79b997595-snkcz" Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.286171 4962 generic.go:334] "Generic (PLEG): container finished" podID="fdb2f075-4c59-41c9-b77e-550905415bdb" containerID="826312946b42a026d8f97f543fcc30e5e3d2387c23845c52a1c31a73f6b45777" exitCode=0 Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.286278 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dg2fc" event={"ID":"fdb2f075-4c59-41c9-b77e-550905415bdb","Type":"ContainerDied","Data":"826312946b42a026d8f97f543fcc30e5e3d2387c23845c52a1c31a73f6b45777"} Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.288341 4962 generic.go:334] "Generic (PLEG): container finished" podID="f791e711-ff4d-47b5-aa50-efbf71dc1ac2" containerID="e84c35d2e207ae9b630a638a36337e7a1e7d4136d09b8ce59f7a233dad4cc821" exitCode=0 Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.288416 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v9wqz" event={"ID":"f791e711-ff4d-47b5-aa50-efbf71dc1ac2","Type":"ContainerDied","Data":"e84c35d2e207ae9b630a638a36337e7a1e7d4136d09b8ce59f7a233dad4cc821"} Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.299117 4962 generic.go:334] "Generic (PLEG): container finished" podID="39aab071-afe0-4e3a-b33c-a758f5e1f673" containerID="d009286be4e4e5f2783f51df8d07f0808a417d4879efda1975e5f5e68b8c6a81" exitCode=0 Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.299190 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cgclz" event={"ID":"39aab071-afe0-4e3a-b33c-a758f5e1f673","Type":"ContainerDied","Data":"d009286be4e4e5f2783f51df8d07f0808a417d4879efda1975e5f5e68b8c6a81"} Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.303181 4962 generic.go:334] "Generic (PLEG): container finished" podID="965c12da-c517-4aa8-b67e-ddbe916b8578" containerID="c61aa994a99386abdc5e10d99b7825923124280cba6139019c7245c60b14b3db" exitCode=0 Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.303215 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-96vmq" event={"ID":"965c12da-c517-4aa8-b67e-ddbe916b8578","Type":"ContainerDied","Data":"c61aa994a99386abdc5e10d99b7825923124280cba6139019c7245c60b14b3db"} Oct 03 12:55:03 crc 
kubenswrapper[4962]: I1003 12:55:03.306473 4962 generic.go:334] "Generic (PLEG): container finished" podID="cb5bb7ba-79c7-4251-8025-68e5c9997447" containerID="363465b4562fc3c04e8a0d9e73af449ae5672e4ec239659e98559860970a81f7" exitCode=0 Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.306503 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wdw2z" event={"ID":"cb5bb7ba-79c7-4251-8025-68e5c9997447","Type":"ContainerDied","Data":"363465b4562fc3c04e8a0d9e73af449ae5672e4ec239659e98559860970a81f7"} Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.355558 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-snkcz" Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.475200 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dg2fc" Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.500563 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v9wqz" Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.509448 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-96vmq" Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.523970 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wdw2z" Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.534041 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cgclz" Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.579839 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-snkcz"] Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.625469 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nj6pv\" (UniqueName: \"kubernetes.io/projected/f791e711-ff4d-47b5-aa50-efbf71dc1ac2-kube-api-access-nj6pv\") pod \"f791e711-ff4d-47b5-aa50-efbf71dc1ac2\" (UID: \"f791e711-ff4d-47b5-aa50-efbf71dc1ac2\") " Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.625524 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/965c12da-c517-4aa8-b67e-ddbe916b8578-utilities\") pod \"965c12da-c517-4aa8-b67e-ddbe916b8578\" (UID: \"965c12da-c517-4aa8-b67e-ddbe916b8578\") " Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.625551 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdb2f075-4c59-41c9-b77e-550905415bdb-utilities\") pod \"fdb2f075-4c59-41c9-b77e-550905415bdb\" (UID: \"fdb2f075-4c59-41c9-b77e-550905415bdb\") " Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.625581 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39aab071-afe0-4e3a-b33c-a758f5e1f673-catalog-content\") pod \"39aab071-afe0-4e3a-b33c-a758f5e1f673\" (UID: \"39aab071-afe0-4e3a-b33c-a758f5e1f673\") " Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.626268 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/fdb2f075-4c59-41c9-b77e-550905415bdb-utilities" (OuterVolumeSpecName: "utilities") pod "fdb2f075-4c59-41c9-b77e-550905415bdb" (UID: "fdb2f075-4c59-41c9-b77e-550905415bdb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.626328 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cb5bb7ba-79c7-4251-8025-68e5c9997447-marketplace-trusted-ca\") pod \"cb5bb7ba-79c7-4251-8025-68e5c9997447\" (UID: \"cb5bb7ba-79c7-4251-8025-68e5c9997447\") " Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.626352 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39aab071-afe0-4e3a-b33c-a758f5e1f673-utilities\") pod \"39aab071-afe0-4e3a-b33c-a758f5e1f673\" (UID: \"39aab071-afe0-4e3a-b33c-a758f5e1f673\") " Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.626443 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/965c12da-c517-4aa8-b67e-ddbe916b8578-utilities" (OuterVolumeSpecName: "utilities") pod "965c12da-c517-4aa8-b67e-ddbe916b8578" (UID: "965c12da-c517-4aa8-b67e-ddbe916b8578"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.627021 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb5bb7ba-79c7-4251-8025-68e5c9997447-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "cb5bb7ba-79c7-4251-8025-68e5c9997447" (UID: "cb5bb7ba-79c7-4251-8025-68e5c9997447"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.627050 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/965c12da-c517-4aa8-b67e-ddbe916b8578-catalog-content\") pod \"965c12da-c517-4aa8-b67e-ddbe916b8578\" (UID: \"965c12da-c517-4aa8-b67e-ddbe916b8578\") " Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.627085 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrl2n\" (UniqueName: \"kubernetes.io/projected/fdb2f075-4c59-41c9-b77e-550905415bdb-kube-api-access-rrl2n\") pod \"fdb2f075-4c59-41c9-b77e-550905415bdb\" (UID: \"fdb2f075-4c59-41c9-b77e-550905415bdb\") " Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.627119 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwhld\" (UniqueName: \"kubernetes.io/projected/cb5bb7ba-79c7-4251-8025-68e5c9997447-kube-api-access-pwhld\") pod \"cb5bb7ba-79c7-4251-8025-68e5c9997447\" (UID: \"cb5bb7ba-79c7-4251-8025-68e5c9997447\") " Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.627434 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdb2f075-4c59-41c9-b77e-550905415bdb-catalog-content\") pod \"fdb2f075-4c59-41c9-b77e-550905415bdb\" (UID: \"fdb2f075-4c59-41c9-b77e-550905415bdb\") " Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.627457 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cb5bb7ba-79c7-4251-8025-68e5c9997447-marketplace-operator-metrics\") pod \"cb5bb7ba-79c7-4251-8025-68e5c9997447\" (UID: \"cb5bb7ba-79c7-4251-8025-68e5c9997447\") " Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.627478 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2csgd\" (UniqueName: \"kubernetes.io/projected/965c12da-c517-4aa8-b67e-ddbe916b8578-kube-api-access-2csgd\") pod \"965c12da-c517-4aa8-b67e-ddbe916b8578\" (UID: \"965c12da-c517-4aa8-b67e-ddbe916b8578\") " Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.627522 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdnqh\" (UniqueName: \"kubernetes.io/projected/39aab071-afe0-4e3a-b33c-a758f5e1f673-kube-api-access-zdnqh\") pod \"39aab071-afe0-4e3a-b33c-a758f5e1f673\" (UID: \"39aab071-afe0-4e3a-b33c-a758f5e1f673\") " Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.627539 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f791e711-ff4d-47b5-aa50-efbf71dc1ac2-utilities\") pod \"f791e711-ff4d-47b5-aa50-efbf71dc1ac2\" (UID: \"f791e711-ff4d-47b5-aa50-efbf71dc1ac2\") " Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.627557 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f791e711-ff4d-47b5-aa50-efbf71dc1ac2-catalog-content\") pod \"f791e711-ff4d-47b5-aa50-efbf71dc1ac2\" (UID: \"f791e711-ff4d-47b5-aa50-efbf71dc1ac2\") " Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.627070 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39aab071-afe0-4e3a-b33c-a758f5e1f673-utilities" (OuterVolumeSpecName: 
"utilities") pod "39aab071-afe0-4e3a-b33c-a758f5e1f673" (UID: "39aab071-afe0-4e3a-b33c-a758f5e1f673"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.628706 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/965c12da-c517-4aa8-b67e-ddbe916b8578-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.628722 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdb2f075-4c59-41c9-b77e-550905415bdb-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.628752 4962 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cb5bb7ba-79c7-4251-8025-68e5c9997447-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.628762 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39aab071-afe0-4e3a-b33c-a758f5e1f673-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.628979 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f791e711-ff4d-47b5-aa50-efbf71dc1ac2-utilities" (OuterVolumeSpecName: "utilities") pod "f791e711-ff4d-47b5-aa50-efbf71dc1ac2" (UID: "f791e711-ff4d-47b5-aa50-efbf71dc1ac2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.629958 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb5bb7ba-79c7-4251-8025-68e5c9997447-kube-api-access-pwhld" (OuterVolumeSpecName: "kube-api-access-pwhld") pod "cb5bb7ba-79c7-4251-8025-68e5c9997447" (UID: "cb5bb7ba-79c7-4251-8025-68e5c9997447"). InnerVolumeSpecName "kube-api-access-pwhld". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.630212 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f791e711-ff4d-47b5-aa50-efbf71dc1ac2-kube-api-access-nj6pv" (OuterVolumeSpecName: "kube-api-access-nj6pv") pod "f791e711-ff4d-47b5-aa50-efbf71dc1ac2" (UID: "f791e711-ff4d-47b5-aa50-efbf71dc1ac2"). InnerVolumeSpecName "kube-api-access-nj6pv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.630299 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdb2f075-4c59-41c9-b77e-550905415bdb-kube-api-access-rrl2n" (OuterVolumeSpecName: "kube-api-access-rrl2n") pod "fdb2f075-4c59-41c9-b77e-550905415bdb" (UID: "fdb2f075-4c59-41c9-b77e-550905415bdb"). InnerVolumeSpecName "kube-api-access-rrl2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.632212 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39aab071-afe0-4e3a-b33c-a758f5e1f673-kube-api-access-zdnqh" (OuterVolumeSpecName: "kube-api-access-zdnqh") pod "39aab071-afe0-4e3a-b33c-a758f5e1f673" (UID: "39aab071-afe0-4e3a-b33c-a758f5e1f673"). InnerVolumeSpecName "kube-api-access-zdnqh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.632232 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/965c12da-c517-4aa8-b67e-ddbe916b8578-kube-api-access-2csgd" (OuterVolumeSpecName: "kube-api-access-2csgd") pod "965c12da-c517-4aa8-b67e-ddbe916b8578" (UID: "965c12da-c517-4aa8-b67e-ddbe916b8578"). InnerVolumeSpecName "kube-api-access-2csgd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.633470 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb5bb7ba-79c7-4251-8025-68e5c9997447-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "cb5bb7ba-79c7-4251-8025-68e5c9997447" (UID: "cb5bb7ba-79c7-4251-8025-68e5c9997447"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.651405 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39aab071-afe0-4e3a-b33c-a758f5e1f673-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "39aab071-afe0-4e3a-b33c-a758f5e1f673" (UID: "39aab071-afe0-4e3a-b33c-a758f5e1f673"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.677678 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f791e711-ff4d-47b5-aa50-efbf71dc1ac2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f791e711-ff4d-47b5-aa50-efbf71dc1ac2" (UID: "f791e711-ff4d-47b5-aa50-efbf71dc1ac2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.701140 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdb2f075-4c59-41c9-b77e-550905415bdb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fdb2f075-4c59-41c9-b77e-550905415bdb" (UID: "fdb2f075-4c59-41c9-b77e-550905415bdb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.716602 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/965c12da-c517-4aa8-b67e-ddbe916b8578-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "965c12da-c517-4aa8-b67e-ddbe916b8578" (UID: "965c12da-c517-4aa8-b67e-ddbe916b8578"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.730446 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/965c12da-c517-4aa8-b67e-ddbe916b8578-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.731475 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrl2n\" (UniqueName: \"kubernetes.io/projected/fdb2f075-4c59-41c9-b77e-550905415bdb-kube-api-access-rrl2n\") on node \"crc\" DevicePath \"\"" Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.731591 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwhld\" (UniqueName: \"kubernetes.io/projected/cb5bb7ba-79c7-4251-8025-68e5c9997447-kube-api-access-pwhld\") on node \"crc\" DevicePath \"\"" Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.731697 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdb2f075-4c59-41c9-b77e-550905415bdb-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.731825 4962 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cb5bb7ba-79c7-4251-8025-68e5c9997447-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.731911 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2csgd\" (UniqueName: \"kubernetes.io/projected/965c12da-c517-4aa8-b67e-ddbe916b8578-kube-api-access-2csgd\") on node \"crc\" DevicePath \"\"" Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.732003 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdnqh\" (UniqueName: \"kubernetes.io/projected/39aab071-afe0-4e3a-b33c-a758f5e1f673-kube-api-access-zdnqh\") on node \"crc\" DevicePath \"\"" Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.732570 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f791e711-ff4d-47b5-aa50-efbf71dc1ac2-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.732711 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f791e711-ff4d-47b5-aa50-efbf71dc1ac2-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.732859 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nj6pv\" (UniqueName: \"kubernetes.io/projected/f791e711-ff4d-47b5-aa50-efbf71dc1ac2-kube-api-access-nj6pv\") on node \"crc\" DevicePath \"\"" Oct 03 12:55:03 crc kubenswrapper[4962]: I1003 12:55:03.732998 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39aab071-afe0-4e3a-b33c-a758f5e1f673-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 12:55:04 crc kubenswrapper[4962]: I1003 12:55:04.313405 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wdw2z" event={"ID":"cb5bb7ba-79c7-4251-8025-68e5c9997447","Type":"ContainerDied","Data":"396a3c9af848cb4a60f82bf80db8bfc3c704af491f02527a28a54a5bfac35488"} Oct 03 12:55:04 crc kubenswrapper[4962]: I1003 12:55:04.313455 4962 scope.go:117] "RemoveContainer" 
containerID="363465b4562fc3c04e8a0d9e73af449ae5672e4ec239659e98559860970a81f7" Oct 03 12:55:04 crc kubenswrapper[4962]: I1003 12:55:04.313452 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wdw2z" Oct 03 12:55:04 crc kubenswrapper[4962]: I1003 12:55:04.318211 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dg2fc" event={"ID":"fdb2f075-4c59-41c9-b77e-550905415bdb","Type":"ContainerDied","Data":"f4509c3499dcec2d71b57b45d371e8462980948bd4237bc41403da692903ea0b"} Oct 03 12:55:04 crc kubenswrapper[4962]: I1003 12:55:04.318229 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dg2fc" Oct 03 12:55:04 crc kubenswrapper[4962]: I1003 12:55:04.320980 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v9wqz" event={"ID":"f791e711-ff4d-47b5-aa50-efbf71dc1ac2","Type":"ContainerDied","Data":"f47cfbc0b5455defa626748b5c5c8a86fe98ce950dae0e4d6fe76bd6010660ec"} Oct 03 12:55:04 crc kubenswrapper[4962]: I1003 12:55:04.321023 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v9wqz" Oct 03 12:55:04 crc kubenswrapper[4962]: I1003 12:55:04.322972 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-snkcz" event={"ID":"9908d2b4-0cf3-4635-855c-39eb963de62c","Type":"ContainerStarted","Data":"6f111291e24a50f52d26c81c85c816ea54c96ec27c444717d2fd31a65d51e909"} Oct 03 12:55:04 crc kubenswrapper[4962]: I1003 12:55:04.323052 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-snkcz" Oct 03 12:55:04 crc kubenswrapper[4962]: I1003 12:55:04.323070 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-snkcz" event={"ID":"9908d2b4-0cf3-4635-855c-39eb963de62c","Type":"ContainerStarted","Data":"1a6c8b4f3f4748d62ab84cc5e09714ea6e84f5fbdcb9d8d0fe08c6543fa9798b"} Oct 03 12:55:04 crc kubenswrapper[4962]: I1003 12:55:04.325518 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cgclz" event={"ID":"39aab071-afe0-4e3a-b33c-a758f5e1f673","Type":"ContainerDied","Data":"02ed057ea3114adeb92c6ea43ffb400900e7536fac9935c15696012b7d0aecac"} Oct 03 12:55:04 crc kubenswrapper[4962]: I1003 12:55:04.325679 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cgclz" Oct 03 12:55:04 crc kubenswrapper[4962]: I1003 12:55:04.326539 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-snkcz" Oct 03 12:55:04 crc kubenswrapper[4962]: I1003 12:55:04.331923 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-96vmq" event={"ID":"965c12da-c517-4aa8-b67e-ddbe916b8578","Type":"ContainerDied","Data":"ee163d22a67ce5695142055c11e898228a0f09fc32b062f202cd265995a56cfc"} Oct 03 12:55:04 crc kubenswrapper[4962]: I1003 12:55:04.332036 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-96vmq" Oct 03 12:55:04 crc kubenswrapper[4962]: I1003 12:55:04.334395 4962 scope.go:117] "RemoveContainer" containerID="826312946b42a026d8f97f543fcc30e5e3d2387c23845c52a1c31a73f6b45777" Oct 03 12:55:04 crc kubenswrapper[4962]: I1003 12:55:04.338795 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wdw2z"] Oct 03 12:55:04 crc kubenswrapper[4962]: I1003 12:55:04.346878 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wdw2z"] Oct 03 12:55:04 crc kubenswrapper[4962]: I1003 12:55:04.351156 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dg2fc"] Oct 03 12:55:04 crc kubenswrapper[4962]: I1003 12:55:04.356288 4962 scope.go:117] "RemoveContainer" containerID="3670b788ad502734ac6ceffee966a5ecbf4479f1fa125a82e7c6cf13640db9fb" Oct 03 12:55:04 crc kubenswrapper[4962]: I1003 12:55:04.355630 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dg2fc"] Oct 03 12:55:04 crc kubenswrapper[4962]: I1003 12:55:04.360026 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-snkcz" podStartSLOduration=1.3600054560000001 podStartE2EDuration="1.360005456s" podCreationTimestamp="2025-10-03 12:55:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:55:04.358922744 +0000 UTC m=+312.762820589" watchObservedRunningTime="2025-10-03 12:55:04.360005456 +0000 UTC m=+312.763903291" Oct 03 12:55:04 crc kubenswrapper[4962]: I1003 12:55:04.422724 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v9wqz"] Oct 03 12:55:04 crc kubenswrapper[4962]: I1003 12:55:04.425507 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-v9wqz"] Oct 03 12:55:04 crc kubenswrapper[4962]: I1003 12:55:04.427545 4962 scope.go:117] "RemoveContainer" containerID="2fb5a3624121d07ac440067b6874ccb3b6b32295f5c1f61745ad205f27cc0639" Oct 03 12:55:04 crc kubenswrapper[4962]: I1003 12:55:04.438087 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-96vmq"] Oct 03 12:55:04 crc kubenswrapper[4962]: I1003 12:55:04.442315 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-96vmq"] Oct 03 12:55:04 crc kubenswrapper[4962]: I1003 12:55:04.452829 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cgclz"] Oct 03 12:55:04 crc kubenswrapper[4962]: I1003 12:55:04.453389 4962 scope.go:117] "RemoveContainer" containerID="e84c35d2e207ae9b630a638a36337e7a1e7d4136d09b8ce59f7a233dad4cc821" Oct 03 12:55:04 crc kubenswrapper[4962]: I1003 12:55:04.456823 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cgclz"] Oct 03 12:55:04 crc kubenswrapper[4962]: I1003 12:55:04.474367 4962 scope.go:117] "RemoveContainer" containerID="77523ef451da2de35b189817c1c22c13e90c56869593a7343a6fafc1a5296e48" Oct 03 12:55:04 crc kubenswrapper[4962]: I1003 12:55:04.491720 4962 scope.go:117] "RemoveContainer" containerID="6dc0bd5ff72311ef6f7be98020eacb1ed594f4edb80edec86eaad2588e7ab4a0" Oct 03 12:55:04 crc kubenswrapper[4962]: I1003 12:55:04.507037 
4962 scope.go:117] "RemoveContainer" containerID="d009286be4e4e5f2783f51df8d07f0808a417d4879efda1975e5f5e68b8c6a81" Oct 03 12:55:04 crc kubenswrapper[4962]: I1003 12:55:04.523253 4962 scope.go:117] "RemoveContainer" containerID="6afa30699ba11b3d61f64acc4b35e8cd5e016d5fe94edf161ad09a2aea9522b6" Oct 03 12:55:04 crc kubenswrapper[4962]: I1003 12:55:04.539156 4962 scope.go:117] "RemoveContainer" containerID="0502fa40b6848f345ca906313a192c9bbd54bb706874032f630d5b3ee1281371" Oct 03 12:55:04 crc kubenswrapper[4962]: I1003 12:55:04.552541 4962 scope.go:117] "RemoveContainer" containerID="c61aa994a99386abdc5e10d99b7825923124280cba6139019c7245c60b14b3db" Oct 03 12:55:04 crc kubenswrapper[4962]: I1003 12:55:04.566380 4962 scope.go:117] "RemoveContainer" containerID="debd82eb00613953e05a09da639913d73b096b50393c6d742396438e14ddb6a3" Oct 03 12:55:04 crc kubenswrapper[4962]: I1003 12:55:04.579629 4962 scope.go:117] "RemoveContainer" containerID="1a112003ae6ec10c1cf1074b4b425eeb2e424f9c51c5fc3f1988f51a90f1597c" Oct 03 12:55:05 crc kubenswrapper[4962]: I1003 12:55:05.204555 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zx6zr"] Oct 03 12:55:05 crc kubenswrapper[4962]: E1003 12:55:05.205386 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f791e711-ff4d-47b5-aa50-efbf71dc1ac2" containerName="registry-server" Oct 03 12:55:05 crc kubenswrapper[4962]: I1003 12:55:05.205405 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f791e711-ff4d-47b5-aa50-efbf71dc1ac2" containerName="registry-server" Oct 03 12:55:05 crc kubenswrapper[4962]: E1003 12:55:05.205420 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdb2f075-4c59-41c9-b77e-550905415bdb" containerName="extract-content" Oct 03 12:55:05 crc kubenswrapper[4962]: I1003 12:55:05.205428 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdb2f075-4c59-41c9-b77e-550905415bdb" containerName="extract-content" Oct 03 12:55:05 crc kubenswrapper[4962]: E1003 12:55:05.205442 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f791e711-ff4d-47b5-aa50-efbf71dc1ac2" containerName="extract-utilities" Oct 03 12:55:05 crc kubenswrapper[4962]: I1003 12:55:05.205450 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f791e711-ff4d-47b5-aa50-efbf71dc1ac2" containerName="extract-utilities" Oct 03 12:55:05 crc kubenswrapper[4962]: E1003 12:55:05.205458 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdb2f075-4c59-41c9-b77e-550905415bdb" containerName="extract-utilities" Oct 03 12:55:05 crc kubenswrapper[4962]: I1003 12:55:05.205465 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdb2f075-4c59-41c9-b77e-550905415bdb" containerName="extract-utilities" Oct 03 12:55:05 crc kubenswrapper[4962]: E1003 12:55:05.205475 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="965c12da-c517-4aa8-b67e-ddbe916b8578" containerName="extract-utilities" Oct 03 12:55:05 crc kubenswrapper[4962]: I1003 12:55:05.205487 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="965c12da-c517-4aa8-b67e-ddbe916b8578" containerName="extract-utilities" Oct 03 12:55:05 crc kubenswrapper[4962]: E1003 12:55:05.205500 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdb2f075-4c59-41c9-b77e-550905415bdb" containerName="registry-server" Oct 03 12:55:05 crc kubenswrapper[4962]: I1003 12:55:05.205508 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdb2f075-4c59-41c9-b77e-550905415bdb" 
containerName="registry-server" Oct 03 12:55:05 crc kubenswrapper[4962]: E1003 12:55:05.205520 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="965c12da-c517-4aa8-b67e-ddbe916b8578" containerName="extract-content" Oct 03 12:55:05 crc kubenswrapper[4962]: I1003 12:55:05.205527 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="965c12da-c517-4aa8-b67e-ddbe916b8578" containerName="extract-content" Oct 03 12:55:05 crc kubenswrapper[4962]: E1003 12:55:05.205537 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39aab071-afe0-4e3a-b33c-a758f5e1f673" containerName="extract-utilities" Oct 03 12:55:05 crc kubenswrapper[4962]: I1003 12:55:05.205544 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="39aab071-afe0-4e3a-b33c-a758f5e1f673" containerName="extract-utilities" Oct 03 12:55:05 crc kubenswrapper[4962]: E1003 12:55:05.205557 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="965c12da-c517-4aa8-b67e-ddbe916b8578" containerName="registry-server" Oct 03 12:55:05 crc kubenswrapper[4962]: I1003 12:55:05.205563 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="965c12da-c517-4aa8-b67e-ddbe916b8578" containerName="registry-server" Oct 03 12:55:05 crc kubenswrapper[4962]: E1003 12:55:05.205573 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39aab071-afe0-4e3a-b33c-a758f5e1f673" containerName="registry-server" Oct 03 12:55:05 crc kubenswrapper[4962]: I1003 12:55:05.205581 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="39aab071-afe0-4e3a-b33c-a758f5e1f673" containerName="registry-server" Oct 03 12:55:05 crc kubenswrapper[4962]: E1003 12:55:05.205590 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39aab071-afe0-4e3a-b33c-a758f5e1f673" containerName="extract-content" Oct 03 12:55:05 crc kubenswrapper[4962]: I1003 12:55:05.205599 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="39aab071-afe0-4e3a-b33c-a758f5e1f673" containerName="extract-content" Oct 03 12:55:05 crc kubenswrapper[4962]: E1003 12:55:05.205607 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb5bb7ba-79c7-4251-8025-68e5c9997447" containerName="marketplace-operator" Oct 03 12:55:05 crc kubenswrapper[4962]: I1003 12:55:05.205614 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb5bb7ba-79c7-4251-8025-68e5c9997447" containerName="marketplace-operator" Oct 03 12:55:05 crc kubenswrapper[4962]: E1003 12:55:05.205623 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f791e711-ff4d-47b5-aa50-efbf71dc1ac2" containerName="extract-content" Oct 03 12:55:05 crc kubenswrapper[4962]: I1003 12:55:05.205630 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f791e711-ff4d-47b5-aa50-efbf71dc1ac2" containerName="extract-content" Oct 03 12:55:05 crc kubenswrapper[4962]: I1003 12:55:05.205798 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="39aab071-afe0-4e3a-b33c-a758f5e1f673" containerName="registry-server" Oct 03 12:55:05 crc kubenswrapper[4962]: I1003 12:55:05.205810 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="965c12da-c517-4aa8-b67e-ddbe916b8578" containerName="registry-server" Oct 03 12:55:05 crc kubenswrapper[4962]: I1003 12:55:05.205818 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdb2f075-4c59-41c9-b77e-550905415bdb" containerName="registry-server" Oct 03 12:55:05 crc kubenswrapper[4962]: I1003 12:55:05.205854 4962 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="f791e711-ff4d-47b5-aa50-efbf71dc1ac2" containerName="registry-server" Oct 03 12:55:05 crc kubenswrapper[4962]: I1003 12:55:05.205865 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb5bb7ba-79c7-4251-8025-68e5c9997447" containerName="marketplace-operator" Oct 03 12:55:05 crc kubenswrapper[4962]: I1003 12:55:05.210100 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zx6zr" Oct 03 12:55:05 crc kubenswrapper[4962]: I1003 12:55:05.212416 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 03 12:55:05 crc kubenswrapper[4962]: I1003 12:55:05.214321 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zx6zr"] Oct 03 12:55:05 crc kubenswrapper[4962]: I1003 12:55:05.353859 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa2a143e-ac08-45ee-9c99-bf61bd19a9e5-catalog-content\") pod \"redhat-marketplace-zx6zr\" (UID: \"aa2a143e-ac08-45ee-9c99-bf61bd19a9e5\") " pod="openshift-marketplace/redhat-marketplace-zx6zr" Oct 03 12:55:05 crc kubenswrapper[4962]: I1003 12:55:05.353899 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqr66\" (UniqueName: \"kubernetes.io/projected/aa2a143e-ac08-45ee-9c99-bf61bd19a9e5-kube-api-access-kqr66\") pod \"redhat-marketplace-zx6zr\" (UID: \"aa2a143e-ac08-45ee-9c99-bf61bd19a9e5\") " pod="openshift-marketplace/redhat-marketplace-zx6zr" Oct 03 12:55:05 crc kubenswrapper[4962]: I1003 12:55:05.354004 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa2a143e-ac08-45ee-9c99-bf61bd19a9e5-utilities\") pod \"redhat-marketplace-zx6zr\" (UID: \"aa2a143e-ac08-45ee-9c99-bf61bd19a9e5\") " pod="openshift-marketplace/redhat-marketplace-zx6zr" Oct 03 12:55:05 crc kubenswrapper[4962]: I1003 12:55:05.401926 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bz646"] Oct 03 12:55:05 crc kubenswrapper[4962]: I1003 12:55:05.403089 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bz646" Oct 03 12:55:05 crc kubenswrapper[4962]: I1003 12:55:05.406366 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 03 12:55:05 crc kubenswrapper[4962]: I1003 12:55:05.410093 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bz646"] Oct 03 12:55:05 crc kubenswrapper[4962]: I1003 12:55:05.454889 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa2a143e-ac08-45ee-9c99-bf61bd19a9e5-utilities\") pod \"redhat-marketplace-zx6zr\" (UID: \"aa2a143e-ac08-45ee-9c99-bf61bd19a9e5\") " pod="openshift-marketplace/redhat-marketplace-zx6zr" Oct 03 12:55:05 crc kubenswrapper[4962]: I1003 12:55:05.454958 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa2a143e-ac08-45ee-9c99-bf61bd19a9e5-catalog-content\") pod \"redhat-marketplace-zx6zr\" (UID: \"aa2a143e-ac08-45ee-9c99-bf61bd19a9e5\") " pod="openshift-marketplace/redhat-marketplace-zx6zr" Oct 03 12:55:05 crc kubenswrapper[4962]: I1003 12:55:05.454989 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqr66\" (UniqueName: \"kubernetes.io/projected/aa2a143e-ac08-45ee-9c99-bf61bd19a9e5-kube-api-access-kqr66\") pod \"redhat-marketplace-zx6zr\" (UID: \"aa2a143e-ac08-45ee-9c99-bf61bd19a9e5\") " pod="openshift-marketplace/redhat-marketplace-zx6zr" Oct 03 12:55:05 crc kubenswrapper[4962]: I1003 12:55:05.456317 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa2a143e-ac08-45ee-9c99-bf61bd19a9e5-catalog-content\") pod \"redhat-marketplace-zx6zr\" (UID: \"aa2a143e-ac08-45ee-9c99-bf61bd19a9e5\") " pod="openshift-marketplace/redhat-marketplace-zx6zr" Oct 03 12:55:05 crc kubenswrapper[4962]: I1003 12:55:05.456414 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa2a143e-ac08-45ee-9c99-bf61bd19a9e5-utilities\") pod \"redhat-marketplace-zx6zr\" (UID: \"aa2a143e-ac08-45ee-9c99-bf61bd19a9e5\") " pod="openshift-marketplace/redhat-marketplace-zx6zr" Oct 03 12:55:05 crc kubenswrapper[4962]: I1003 12:55:05.489074 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqr66\" (UniqueName: \"kubernetes.io/projected/aa2a143e-ac08-45ee-9c99-bf61bd19a9e5-kube-api-access-kqr66\") pod \"redhat-marketplace-zx6zr\" (UID: \"aa2a143e-ac08-45ee-9c99-bf61bd19a9e5\") " pod="openshift-marketplace/redhat-marketplace-zx6zr" Oct 03 12:55:05 crc kubenswrapper[4962]: I1003 12:55:05.530496 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zx6zr" Oct 03 12:55:05 crc kubenswrapper[4962]: I1003 12:55:05.555984 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e82e33da-9e3c-4236-864b-ef04b7998a89-catalog-content\") pod \"community-operators-bz646\" (UID: \"e82e33da-9e3c-4236-864b-ef04b7998a89\") " pod="openshift-marketplace/community-operators-bz646" Oct 03 12:55:05 crc kubenswrapper[4962]: I1003 12:55:05.556059 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tk7g\" (UniqueName: \"kubernetes.io/projected/e82e33da-9e3c-4236-864b-ef04b7998a89-kube-api-access-8tk7g\") pod \"community-operators-bz646\" (UID: \"e82e33da-9e3c-4236-864b-ef04b7998a89\") " pod="openshift-marketplace/community-operators-bz646" Oct 03 12:55:05 crc kubenswrapper[4962]: I1003 12:55:05.556165 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e82e33da-9e3c-4236-864b-ef04b7998a89-utilities\") pod \"community-operators-bz646\" (UID: \"e82e33da-9e3c-4236-864b-ef04b7998a89\") " pod="openshift-marketplace/community-operators-bz646" Oct 03 12:55:05 crc kubenswrapper[4962]: I1003 12:55:05.656992 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e82e33da-9e3c-4236-864b-ef04b7998a89-catalog-content\") pod \"community-operators-bz646\" (UID: \"e82e33da-9e3c-4236-864b-ef04b7998a89\") " pod="openshift-marketplace/community-operators-bz646" Oct 03 12:55:05 crc kubenswrapper[4962]: I1003 12:55:05.657545 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tk7g\" (UniqueName: \"kubernetes.io/projected/e82e33da-9e3c-4236-864b-ef04b7998a89-kube-api-access-8tk7g\") pod \"community-operators-bz646\" (UID: \"e82e33da-9e3c-4236-864b-ef04b7998a89\") " pod="openshift-marketplace/community-operators-bz646" Oct 03 12:55:05 crc kubenswrapper[4962]: I1003 12:55:05.657583 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e82e33da-9e3c-4236-864b-ef04b7998a89-utilities\") pod \"community-operators-bz646\" (UID: \"e82e33da-9e3c-4236-864b-ef04b7998a89\") " pod="openshift-marketplace/community-operators-bz646" Oct 03 12:55:05 crc kubenswrapper[4962]: I1003 12:55:05.657580 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e82e33da-9e3c-4236-864b-ef04b7998a89-catalog-content\") pod \"community-operators-bz646\" (UID: \"e82e33da-9e3c-4236-864b-ef04b7998a89\") " pod="openshift-marketplace/community-operators-bz646" Oct 03 12:55:05 crc kubenswrapper[4962]: I1003 12:55:05.657948 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e82e33da-9e3c-4236-864b-ef04b7998a89-utilities\") pod \"community-operators-bz646\" (UID: \"e82e33da-9e3c-4236-864b-ef04b7998a89\") " pod="openshift-marketplace/community-operators-bz646" Oct 03 12:55:05 crc kubenswrapper[4962]: I1003 12:55:05.683877 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tk7g\" (UniqueName: \"kubernetes.io/projected/e82e33da-9e3c-4236-864b-ef04b7998a89-kube-api-access-8tk7g\") pod 
\"community-operators-bz646\" (UID: \"e82e33da-9e3c-4236-864b-ef04b7998a89\") " pod="openshift-marketplace/community-operators-bz646" Oct 03 12:55:05 crc kubenswrapper[4962]: I1003 12:55:05.686465 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zx6zr"] Oct 03 12:55:05 crc kubenswrapper[4962]: W1003 12:55:05.689122 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa2a143e_ac08_45ee_9c99_bf61bd19a9e5.slice/crio-6c275059b254b5bd89ae940c2fe04dba3e1f9eefaef88a3c31e570a9f2f5b444 WatchSource:0}: Error finding container 6c275059b254b5bd89ae940c2fe04dba3e1f9eefaef88a3c31e570a9f2f5b444: Status 404 returned error can't find the container with id 6c275059b254b5bd89ae940c2fe04dba3e1f9eefaef88a3c31e570a9f2f5b444 Oct 03 12:55:05 crc kubenswrapper[4962]: I1003 12:55:05.730762 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bz646" Oct 03 12:55:05 crc kubenswrapper[4962]: I1003 12:55:05.928414 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bz646"] Oct 03 12:55:05 crc kubenswrapper[4962]: W1003 12:55:05.933136 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode82e33da_9e3c_4236_864b_ef04b7998a89.slice/crio-3e7c5aa0672c232ced34d46758baaee87e6b8f8ebb36e9b0a1ed60d9ea55fdcd WatchSource:0}: Error finding container 3e7c5aa0672c232ced34d46758baaee87e6b8f8ebb36e9b0a1ed60d9ea55fdcd: Status 404 returned error can't find the container with id 3e7c5aa0672c232ced34d46758baaee87e6b8f8ebb36e9b0a1ed60d9ea55fdcd Oct 03 12:55:06 crc kubenswrapper[4962]: I1003 12:55:06.236156 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39aab071-afe0-4e3a-b33c-a758f5e1f673" path="/var/lib/kubelet/pods/39aab071-afe0-4e3a-b33c-a758f5e1f673/volumes" Oct 03 12:55:06 crc kubenswrapper[4962]: I1003 12:55:06.238521 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="965c12da-c517-4aa8-b67e-ddbe916b8578" path="/var/lib/kubelet/pods/965c12da-c517-4aa8-b67e-ddbe916b8578/volumes" Oct 03 12:55:06 crc kubenswrapper[4962]: I1003 12:55:06.240313 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb5bb7ba-79c7-4251-8025-68e5c9997447" path="/var/lib/kubelet/pods/cb5bb7ba-79c7-4251-8025-68e5c9997447/volumes" Oct 03 12:55:06 crc kubenswrapper[4962]: I1003 12:55:06.242851 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f791e711-ff4d-47b5-aa50-efbf71dc1ac2" path="/var/lib/kubelet/pods/f791e711-ff4d-47b5-aa50-efbf71dc1ac2/volumes" Oct 03 12:55:06 crc kubenswrapper[4962]: I1003 12:55:06.244478 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdb2f075-4c59-41c9-b77e-550905415bdb" path="/var/lib/kubelet/pods/fdb2f075-4c59-41c9-b77e-550905415bdb/volumes" Oct 03 12:55:06 crc kubenswrapper[4962]: I1003 12:55:06.347668 4962 generic.go:334] "Generic (PLEG): container finished" podID="e82e33da-9e3c-4236-864b-ef04b7998a89" containerID="a9118e8cd96430b02a82aa327b69010ccb39f43568a947ecc6a55e25f0033809" exitCode=0 Oct 03 12:55:06 crc kubenswrapper[4962]: I1003 12:55:06.347745 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bz646" 
event={"ID":"e82e33da-9e3c-4236-864b-ef04b7998a89","Type":"ContainerDied","Data":"a9118e8cd96430b02a82aa327b69010ccb39f43568a947ecc6a55e25f0033809"} Oct 03 12:55:06 crc kubenswrapper[4962]: I1003 12:55:06.347777 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bz646" event={"ID":"e82e33da-9e3c-4236-864b-ef04b7998a89","Type":"ContainerStarted","Data":"3e7c5aa0672c232ced34d46758baaee87e6b8f8ebb36e9b0a1ed60d9ea55fdcd"} Oct 03 12:55:06 crc kubenswrapper[4962]: I1003 12:55:06.350621 4962 generic.go:334] "Generic (PLEG): container finished" podID="aa2a143e-ac08-45ee-9c99-bf61bd19a9e5" containerID="d2391962bea00a51cebbc59cc8d88df00091c3fd218440f7fd1f3ccbe3dd010e" exitCode=0 Oct 03 12:55:06 crc kubenswrapper[4962]: I1003 12:55:06.350683 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zx6zr" event={"ID":"aa2a143e-ac08-45ee-9c99-bf61bd19a9e5","Type":"ContainerDied","Data":"d2391962bea00a51cebbc59cc8d88df00091c3fd218440f7fd1f3ccbe3dd010e"} Oct 03 12:55:06 crc kubenswrapper[4962]: I1003 12:55:06.350707 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zx6zr" event={"ID":"aa2a143e-ac08-45ee-9c99-bf61bd19a9e5","Type":"ContainerStarted","Data":"6c275059b254b5bd89ae940c2fe04dba3e1f9eefaef88a3c31e570a9f2f5b444"} Oct 03 12:55:07 crc kubenswrapper[4962]: I1003 12:55:07.600467 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2z5bv"] Oct 03 12:55:07 crc kubenswrapper[4962]: I1003 12:55:07.602630 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2z5bv" Oct 03 12:55:07 crc kubenswrapper[4962]: I1003 12:55:07.606711 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 03 12:55:07 crc kubenswrapper[4962]: I1003 12:55:07.621971 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2z5bv"] Oct 03 12:55:07 crc kubenswrapper[4962]: I1003 12:55:07.783110 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0f4fa9d-4551-4ba6-9180-17e8e77325fa-utilities\") pod \"certified-operators-2z5bv\" (UID: \"c0f4fa9d-4551-4ba6-9180-17e8e77325fa\") " pod="openshift-marketplace/certified-operators-2z5bv" Oct 03 12:55:07 crc kubenswrapper[4962]: I1003 12:55:07.783157 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0f4fa9d-4551-4ba6-9180-17e8e77325fa-catalog-content\") pod \"certified-operators-2z5bv\" (UID: \"c0f4fa9d-4551-4ba6-9180-17e8e77325fa\") " pod="openshift-marketplace/certified-operators-2z5bv" Oct 03 12:55:07 crc kubenswrapper[4962]: I1003 12:55:07.783192 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mp27j\" (UniqueName: \"kubernetes.io/projected/c0f4fa9d-4551-4ba6-9180-17e8e77325fa-kube-api-access-mp27j\") pod \"certified-operators-2z5bv\" (UID: \"c0f4fa9d-4551-4ba6-9180-17e8e77325fa\") " pod="openshift-marketplace/certified-operators-2z5bv" Oct 03 12:55:07 crc kubenswrapper[4962]: I1003 12:55:07.807164 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zxxpl"] Oct 03 12:55:07 crc 
kubenswrapper[4962]: I1003 12:55:07.808896 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zxxpl" Oct 03 12:55:07 crc kubenswrapper[4962]: I1003 12:55:07.810004 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zxxpl"] Oct 03 12:55:07 crc kubenswrapper[4962]: I1003 12:55:07.811072 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 03 12:55:07 crc kubenswrapper[4962]: I1003 12:55:07.884020 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0f4fa9d-4551-4ba6-9180-17e8e77325fa-catalog-content\") pod \"certified-operators-2z5bv\" (UID: \"c0f4fa9d-4551-4ba6-9180-17e8e77325fa\") " pod="openshift-marketplace/certified-operators-2z5bv" Oct 03 12:55:07 crc kubenswrapper[4962]: I1003 12:55:07.884088 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mp27j\" (UniqueName: \"kubernetes.io/projected/c0f4fa9d-4551-4ba6-9180-17e8e77325fa-kube-api-access-mp27j\") pod \"certified-operators-2z5bv\" (UID: \"c0f4fa9d-4551-4ba6-9180-17e8e77325fa\") " pod="openshift-marketplace/certified-operators-2z5bv" Oct 03 12:55:07 crc kubenswrapper[4962]: I1003 12:55:07.884195 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0f4fa9d-4551-4ba6-9180-17e8e77325fa-utilities\") pod \"certified-operators-2z5bv\" (UID: \"c0f4fa9d-4551-4ba6-9180-17e8e77325fa\") " pod="openshift-marketplace/certified-operators-2z5bv" Oct 03 12:55:07 crc kubenswrapper[4962]: I1003 12:55:07.884600 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0f4fa9d-4551-4ba6-9180-17e8e77325fa-catalog-content\") pod \"certified-operators-2z5bv\" (UID: \"c0f4fa9d-4551-4ba6-9180-17e8e77325fa\") " pod="openshift-marketplace/certified-operators-2z5bv" Oct 03 12:55:07 crc kubenswrapper[4962]: I1003 12:55:07.884834 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0f4fa9d-4551-4ba6-9180-17e8e77325fa-utilities\") pod \"certified-operators-2z5bv\" (UID: \"c0f4fa9d-4551-4ba6-9180-17e8e77325fa\") " pod="openshift-marketplace/certified-operators-2z5bv" Oct 03 12:55:07 crc kubenswrapper[4962]: I1003 12:55:07.903074 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mp27j\" (UniqueName: \"kubernetes.io/projected/c0f4fa9d-4551-4ba6-9180-17e8e77325fa-kube-api-access-mp27j\") pod \"certified-operators-2z5bv\" (UID: \"c0f4fa9d-4551-4ba6-9180-17e8e77325fa\") " pod="openshift-marketplace/certified-operators-2z5bv" Oct 03 12:55:07 crc kubenswrapper[4962]: I1003 12:55:07.925133 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2z5bv" Oct 03 12:55:07 crc kubenswrapper[4962]: I1003 12:55:07.986019 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e76b6ee-0f7c-4c25-9a2e-20e210d5ec55-catalog-content\") pod \"redhat-operators-zxxpl\" (UID: \"5e76b6ee-0f7c-4c25-9a2e-20e210d5ec55\") " pod="openshift-marketplace/redhat-operators-zxxpl" Oct 03 12:55:07 crc kubenswrapper[4962]: I1003 12:55:07.986077 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkq7b\" (UniqueName: \"kubernetes.io/projected/5e76b6ee-0f7c-4c25-9a2e-20e210d5ec55-kube-api-access-wkq7b\") pod \"redhat-operators-zxxpl\" (UID: \"5e76b6ee-0f7c-4c25-9a2e-20e210d5ec55\") " pod="openshift-marketplace/redhat-operators-zxxpl" Oct 03 12:55:07 crc kubenswrapper[4962]: I1003 12:55:07.986111 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e76b6ee-0f7c-4c25-9a2e-20e210d5ec55-utilities\") pod \"redhat-operators-zxxpl\" (UID: \"5e76b6ee-0f7c-4c25-9a2e-20e210d5ec55\") " pod="openshift-marketplace/redhat-operators-zxxpl" Oct 03 12:55:08 crc kubenswrapper[4962]: I1003 12:55:08.087878 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e76b6ee-0f7c-4c25-9a2e-20e210d5ec55-catalog-content\") pod \"redhat-operators-zxxpl\" (UID: \"5e76b6ee-0f7c-4c25-9a2e-20e210d5ec55\") " pod="openshift-marketplace/redhat-operators-zxxpl" Oct 03 12:55:08 crc kubenswrapper[4962]: I1003 12:55:08.087923 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkq7b\" (UniqueName: \"kubernetes.io/projected/5e76b6ee-0f7c-4c25-9a2e-20e210d5ec55-kube-api-access-wkq7b\") pod \"redhat-operators-zxxpl\" (UID: \"5e76b6ee-0f7c-4c25-9a2e-20e210d5ec55\") " pod="openshift-marketplace/redhat-operators-zxxpl" Oct 03 12:55:08 crc kubenswrapper[4962]: I1003 12:55:08.087956 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e76b6ee-0f7c-4c25-9a2e-20e210d5ec55-utilities\") pod \"redhat-operators-zxxpl\" (UID: \"5e76b6ee-0f7c-4c25-9a2e-20e210d5ec55\") " pod="openshift-marketplace/redhat-operators-zxxpl" Oct 03 12:55:08 crc kubenswrapper[4962]: I1003 12:55:08.089116 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e76b6ee-0f7c-4c25-9a2e-20e210d5ec55-utilities\") pod \"redhat-operators-zxxpl\" (UID: \"5e76b6ee-0f7c-4c25-9a2e-20e210d5ec55\") " pod="openshift-marketplace/redhat-operators-zxxpl" Oct 03 12:55:08 crc kubenswrapper[4962]: I1003 12:55:08.089130 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e76b6ee-0f7c-4c25-9a2e-20e210d5ec55-catalog-content\") pod \"redhat-operators-zxxpl\" (UID: \"5e76b6ee-0f7c-4c25-9a2e-20e210d5ec55\") " pod="openshift-marketplace/redhat-operators-zxxpl" Oct 03 12:55:08 crc kubenswrapper[4962]: I1003 12:55:08.113078 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkq7b\" (UniqueName: \"kubernetes.io/projected/5e76b6ee-0f7c-4c25-9a2e-20e210d5ec55-kube-api-access-wkq7b\") pod \"redhat-operators-zxxpl\" (UID: 
\"5e76b6ee-0f7c-4c25-9a2e-20e210d5ec55\") " pod="openshift-marketplace/redhat-operators-zxxpl" Oct 03 12:55:08 crc kubenswrapper[4962]: I1003 12:55:08.114535 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2z5bv"] Oct 03 12:55:08 crc kubenswrapper[4962]: I1003 12:55:08.227187 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zxxpl" Oct 03 12:55:08 crc kubenswrapper[4962]: I1003 12:55:08.376503 4962 generic.go:334] "Generic (PLEG): container finished" podID="aa2a143e-ac08-45ee-9c99-bf61bd19a9e5" containerID="3df25bf58db6b2f3abaa8ae730e2a56b5b2e3a6a95e863264bba8159b3f8bbde" exitCode=0 Oct 03 12:55:08 crc kubenswrapper[4962]: I1003 12:55:08.376687 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zx6zr" event={"ID":"aa2a143e-ac08-45ee-9c99-bf61bd19a9e5","Type":"ContainerDied","Data":"3df25bf58db6b2f3abaa8ae730e2a56b5b2e3a6a95e863264bba8159b3f8bbde"} Oct 03 12:55:08 crc kubenswrapper[4962]: I1003 12:55:08.382483 4962 generic.go:334] "Generic (PLEG): container finished" podID="c0f4fa9d-4551-4ba6-9180-17e8e77325fa" containerID="baf9e8456c00e412ca83ab464e8c70091188ef10325a242538314edfdc134f84" exitCode=0 Oct 03 12:55:08 crc kubenswrapper[4962]: I1003 12:55:08.382560 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2z5bv" event={"ID":"c0f4fa9d-4551-4ba6-9180-17e8e77325fa","Type":"ContainerDied","Data":"baf9e8456c00e412ca83ab464e8c70091188ef10325a242538314edfdc134f84"} Oct 03 12:55:08 crc kubenswrapper[4962]: I1003 12:55:08.382585 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2z5bv" event={"ID":"c0f4fa9d-4551-4ba6-9180-17e8e77325fa","Type":"ContainerStarted","Data":"ff53b410b8984499193531aac895b7613472e80c0d0f852be2dd4afbdb540128"} Oct 03 12:55:08 crc kubenswrapper[4962]: I1003 12:55:08.397253 4962 generic.go:334] "Generic (PLEG): container finished" podID="e82e33da-9e3c-4236-864b-ef04b7998a89" containerID="5c0f627ca4d88faa9798e8446115a0dc459a28fbe6f3d522d6a471f02c0baf7f" exitCode=0 Oct 03 12:55:08 crc kubenswrapper[4962]: I1003 12:55:08.397521 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bz646" event={"ID":"e82e33da-9e3c-4236-864b-ef04b7998a89","Type":"ContainerDied","Data":"5c0f627ca4d88faa9798e8446115a0dc459a28fbe6f3d522d6a471f02c0baf7f"} Oct 03 12:55:08 crc kubenswrapper[4962]: I1003 12:55:08.427901 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zxxpl"] Oct 03 12:55:08 crc kubenswrapper[4962]: W1003 12:55:08.435681 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e76b6ee_0f7c_4c25_9a2e_20e210d5ec55.slice/crio-d846ac0f8dc0f59139f6bb8fe2857e4af1961a18fea77d63f9c43e6d7d1b795b WatchSource:0}: Error finding container d846ac0f8dc0f59139f6bb8fe2857e4af1961a18fea77d63f9c43e6d7d1b795b: Status 404 returned error can't find the container with id d846ac0f8dc0f59139f6bb8fe2857e4af1961a18fea77d63f9c43e6d7d1b795b Oct 03 12:55:09 crc kubenswrapper[4962]: I1003 12:55:09.403847 4962 generic.go:334] "Generic (PLEG): container finished" podID="5e76b6ee-0f7c-4c25-9a2e-20e210d5ec55" containerID="2ae46ce5b452d144e008dfa1dbe13b9b9e19f8f016bc739afc94d598a06c19f1" exitCode=0 Oct 03 12:55:09 crc kubenswrapper[4962]: I1003 12:55:09.403894 4962 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zxxpl" event={"ID":"5e76b6ee-0f7c-4c25-9a2e-20e210d5ec55","Type":"ContainerDied","Data":"2ae46ce5b452d144e008dfa1dbe13b9b9e19f8f016bc739afc94d598a06c19f1"} Oct 03 12:55:09 crc kubenswrapper[4962]: I1003 12:55:09.404214 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zxxpl" event={"ID":"5e76b6ee-0f7c-4c25-9a2e-20e210d5ec55","Type":"ContainerStarted","Data":"d846ac0f8dc0f59139f6bb8fe2857e4af1961a18fea77d63f9c43e6d7d1b795b"} Oct 03 12:55:09 crc kubenswrapper[4962]: I1003 12:55:09.416916 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bz646" event={"ID":"e82e33da-9e3c-4236-864b-ef04b7998a89","Type":"ContainerStarted","Data":"230d7d26f476460c63d3b957f28ea25365aa628b4440ddb05b9e417f26817ac7"} Oct 03 12:55:10 crc kubenswrapper[4962]: I1003 12:55:10.424133 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zx6zr" event={"ID":"aa2a143e-ac08-45ee-9c99-bf61bd19a9e5","Type":"ContainerStarted","Data":"0ad019d3b943f6c5a659fe77713ba735fef7817f1562b88473361a496a00f1db"} Oct 03 12:55:10 crc kubenswrapper[4962]: I1003 12:55:10.427935 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2z5bv" event={"ID":"c0f4fa9d-4551-4ba6-9180-17e8e77325fa","Type":"ContainerStarted","Data":"3a01873dec59adafe357415b44c295ab8a8aaaed2d18a8e6b85a7bb8ca67509f"} Oct 03 12:55:10 crc kubenswrapper[4962]: I1003 12:55:10.452586 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bz646" podStartSLOduration=2.876866228 podStartE2EDuration="5.452565185s" podCreationTimestamp="2025-10-03 12:55:05 +0000 UTC" firstStartedPulling="2025-10-03 12:55:06.349469314 +0000 UTC m=+314.753367149" lastFinishedPulling="2025-10-03 12:55:08.925168281 +0000 UTC m=+317.329066106" observedRunningTime="2025-10-03 12:55:09.452328319 +0000 UTC m=+317.856226164" watchObservedRunningTime="2025-10-03 12:55:10.452565185 +0000 UTC m=+318.856463020" Oct 03 12:55:10 crc kubenswrapper[4962]: I1003 12:55:10.454546 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zx6zr" podStartSLOduration=2.657676357 podStartE2EDuration="5.454535114s" podCreationTimestamp="2025-10-03 12:55:05 +0000 UTC" firstStartedPulling="2025-10-03 12:55:06.351625639 +0000 UTC m=+314.755523474" lastFinishedPulling="2025-10-03 12:55:09.148484396 +0000 UTC m=+317.552382231" observedRunningTime="2025-10-03 12:55:10.450980517 +0000 UTC m=+318.854878362" watchObservedRunningTime="2025-10-03 12:55:10.454535114 +0000 UTC m=+318.858432949" Oct 03 12:55:11 crc kubenswrapper[4962]: I1003 12:55:11.435256 4962 generic.go:334] "Generic (PLEG): container finished" podID="c0f4fa9d-4551-4ba6-9180-17e8e77325fa" containerID="3a01873dec59adafe357415b44c295ab8a8aaaed2d18a8e6b85a7bb8ca67509f" exitCode=0 Oct 03 12:55:11 crc kubenswrapper[4962]: I1003 12:55:11.435524 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2z5bv" event={"ID":"c0f4fa9d-4551-4ba6-9180-17e8e77325fa","Type":"ContainerDied","Data":"3a01873dec59adafe357415b44c295ab8a8aaaed2d18a8e6b85a7bb8ca67509f"} Oct 03 12:55:12 crc kubenswrapper[4962]: I1003 12:55:12.444143 4962 generic.go:334] "Generic (PLEG): container finished" podID="5e76b6ee-0f7c-4c25-9a2e-20e210d5ec55" 
containerID="a2d6897b82a76f3cf9c2c88308627f28d1af405942db9de79ae50130f9230b79" exitCode=0 Oct 03 12:55:12 crc kubenswrapper[4962]: I1003 12:55:12.444231 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zxxpl" event={"ID":"5e76b6ee-0f7c-4c25-9a2e-20e210d5ec55","Type":"ContainerDied","Data":"a2d6897b82a76f3cf9c2c88308627f28d1af405942db9de79ae50130f9230b79"} Oct 03 12:55:12 crc kubenswrapper[4962]: I1003 12:55:12.447258 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2z5bv" event={"ID":"c0f4fa9d-4551-4ba6-9180-17e8e77325fa","Type":"ContainerStarted","Data":"2f72dfa4ad78edb0d4605f0307ac773af6cac262ab0801b34fb3adf71ef73b76"} Oct 03 12:55:12 crc kubenswrapper[4962]: I1003 12:55:12.491952 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2z5bv" podStartSLOduration=1.6979726 podStartE2EDuration="5.491929977s" podCreationTimestamp="2025-10-03 12:55:07 +0000 UTC" firstStartedPulling="2025-10-03 12:55:08.385360912 +0000 UTC m=+316.789258737" lastFinishedPulling="2025-10-03 12:55:12.179318279 +0000 UTC m=+320.583216114" observedRunningTime="2025-10-03 12:55:12.4897176 +0000 UTC m=+320.893615455" watchObservedRunningTime="2025-10-03 12:55:12.491929977 +0000 UTC m=+320.895827822" Oct 03 12:55:13 crc kubenswrapper[4962]: I1003 12:55:13.453887 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zxxpl" event={"ID":"5e76b6ee-0f7c-4c25-9a2e-20e210d5ec55","Type":"ContainerStarted","Data":"64f759f766dbf3e9a0143d218c6f203b8e104b71e500d06b48b7a0ac88d1d5d6"} Oct 03 12:55:13 crc kubenswrapper[4962]: I1003 12:55:13.477362 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zxxpl" podStartSLOduration=2.837527556 podStartE2EDuration="6.477341695s" podCreationTimestamp="2025-10-03 12:55:07 +0000 UTC" firstStartedPulling="2025-10-03 12:55:09.405200228 +0000 UTC m=+317.809098063" lastFinishedPulling="2025-10-03 12:55:13.045014367 +0000 UTC m=+321.448912202" observedRunningTime="2025-10-03 12:55:13.476214791 +0000 UTC m=+321.880112626" watchObservedRunningTime="2025-10-03 12:55:13.477341695 +0000 UTC m=+321.881239530" Oct 03 12:55:15 crc kubenswrapper[4962]: I1003 12:55:15.531201 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zx6zr" Oct 03 12:55:15 crc kubenswrapper[4962]: I1003 12:55:15.531571 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zx6zr" Oct 03 12:55:15 crc kubenswrapper[4962]: I1003 12:55:15.570931 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zx6zr" Oct 03 12:55:15 crc kubenswrapper[4962]: I1003 12:55:15.731576 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bz646" Oct 03 12:55:15 crc kubenswrapper[4962]: I1003 12:55:15.731716 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bz646" Oct 03 12:55:15 crc kubenswrapper[4962]: I1003 12:55:15.779858 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bz646" Oct 03 12:55:16 crc kubenswrapper[4962]: I1003 12:55:16.503142 4962 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zx6zr" Oct 03 12:55:16 crc kubenswrapper[4962]: I1003 12:55:16.509051 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bz646" Oct 03 12:55:17 crc kubenswrapper[4962]: I1003 12:55:17.925929 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2z5bv" Oct 03 12:55:17 crc kubenswrapper[4962]: I1003 12:55:17.926013 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2z5bv" Oct 03 12:55:17 crc kubenswrapper[4962]: I1003 12:55:17.972933 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2z5bv" Oct 03 12:55:18 crc kubenswrapper[4962]: I1003 12:55:18.232348 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zxxpl" Oct 03 12:55:18 crc kubenswrapper[4962]: I1003 12:55:18.232395 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zxxpl" Oct 03 12:55:18 crc kubenswrapper[4962]: I1003 12:55:18.272861 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zxxpl" Oct 03 12:55:18 crc kubenswrapper[4962]: I1003 12:55:18.515790 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2z5bv" Oct 03 12:55:18 crc kubenswrapper[4962]: I1003 12:55:18.519607 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zxxpl" Oct 03 12:55:24 crc kubenswrapper[4962]: I1003 12:55:24.660091 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 12:55:24 crc kubenswrapper[4962]: I1003 12:55:24.660713 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 12:55:54 crc kubenswrapper[4962]: I1003 12:55:54.660109 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 12:55:54 crc kubenswrapper[4962]: I1003 12:55:54.661007 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 12:55:56 crc kubenswrapper[4962]: I1003 12:55:56.783906 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-lvd7z"] Oct 03 12:55:56 crc kubenswrapper[4962]: I1003 12:55:56.785029 4962 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-lvd7z" Oct 03 12:55:56 crc kubenswrapper[4962]: I1003 12:55:56.806654 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-lvd7z"] Oct 03 12:55:56 crc kubenswrapper[4962]: I1003 12:55:56.986622 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3780a149-9804-47e1-8f8f-5022ba43b834-registry-tls\") pod \"image-registry-66df7c8f76-lvd7z\" (UID: \"3780a149-9804-47e1-8f8f-5022ba43b834\") " pod="openshift-image-registry/image-registry-66df7c8f76-lvd7z" Oct 03 12:55:56 crc kubenswrapper[4962]: I1003 12:55:56.986738 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjxhm\" (UniqueName: \"kubernetes.io/projected/3780a149-9804-47e1-8f8f-5022ba43b834-kube-api-access-vjxhm\") pod \"image-registry-66df7c8f76-lvd7z\" (UID: \"3780a149-9804-47e1-8f8f-5022ba43b834\") " pod="openshift-image-registry/image-registry-66df7c8f76-lvd7z" Oct 03 12:55:56 crc kubenswrapper[4962]: I1003 12:55:56.986779 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-lvd7z\" (UID: \"3780a149-9804-47e1-8f8f-5022ba43b834\") " pod="openshift-image-registry/image-registry-66df7c8f76-lvd7z" Oct 03 12:55:56 crc kubenswrapper[4962]: I1003 12:55:56.986806 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3780a149-9804-47e1-8f8f-5022ba43b834-trusted-ca\") pod \"image-registry-66df7c8f76-lvd7z\" (UID: \"3780a149-9804-47e1-8f8f-5022ba43b834\") " pod="openshift-image-registry/image-registry-66df7c8f76-lvd7z" Oct 03 12:55:56 crc kubenswrapper[4962]: I1003 12:55:56.986835 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3780a149-9804-47e1-8f8f-5022ba43b834-installation-pull-secrets\") pod \"image-registry-66df7c8f76-lvd7z\" (UID: \"3780a149-9804-47e1-8f8f-5022ba43b834\") " pod="openshift-image-registry/image-registry-66df7c8f76-lvd7z" Oct 03 12:55:56 crc kubenswrapper[4962]: I1003 12:55:56.986868 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3780a149-9804-47e1-8f8f-5022ba43b834-bound-sa-token\") pod \"image-registry-66df7c8f76-lvd7z\" (UID: \"3780a149-9804-47e1-8f8f-5022ba43b834\") " pod="openshift-image-registry/image-registry-66df7c8f76-lvd7z" Oct 03 12:55:56 crc kubenswrapper[4962]: I1003 12:55:56.986925 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3780a149-9804-47e1-8f8f-5022ba43b834-registry-certificates\") pod \"image-registry-66df7c8f76-lvd7z\" (UID: \"3780a149-9804-47e1-8f8f-5022ba43b834\") " pod="openshift-image-registry/image-registry-66df7c8f76-lvd7z" Oct 03 12:55:56 crc kubenswrapper[4962]: I1003 12:55:56.986949 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/3780a149-9804-47e1-8f8f-5022ba43b834-ca-trust-extracted\") pod \"image-registry-66df7c8f76-lvd7z\" (UID: \"3780a149-9804-47e1-8f8f-5022ba43b834\") " pod="openshift-image-registry/image-registry-66df7c8f76-lvd7z" Oct 03 12:55:57 crc kubenswrapper[4962]: I1003 12:55:57.004931 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-lvd7z\" (UID: \"3780a149-9804-47e1-8f8f-5022ba43b834\") " pod="openshift-image-registry/image-registry-66df7c8f76-lvd7z" Oct 03 12:55:57 crc kubenswrapper[4962]: I1003 12:55:57.088133 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3780a149-9804-47e1-8f8f-5022ba43b834-ca-trust-extracted\") pod \"image-registry-66df7c8f76-lvd7z\" (UID: \"3780a149-9804-47e1-8f8f-5022ba43b834\") " pod="openshift-image-registry/image-registry-66df7c8f76-lvd7z" Oct 03 12:55:57 crc kubenswrapper[4962]: I1003 12:55:57.088192 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3780a149-9804-47e1-8f8f-5022ba43b834-registry-tls\") pod \"image-registry-66df7c8f76-lvd7z\" (UID: \"3780a149-9804-47e1-8f8f-5022ba43b834\") " pod="openshift-image-registry/image-registry-66df7c8f76-lvd7z" Oct 03 12:55:57 crc kubenswrapper[4962]: I1003 12:55:57.088224 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjxhm\" (UniqueName: \"kubernetes.io/projected/3780a149-9804-47e1-8f8f-5022ba43b834-kube-api-access-vjxhm\") pod \"image-registry-66df7c8f76-lvd7z\" (UID: \"3780a149-9804-47e1-8f8f-5022ba43b834\") " pod="openshift-image-registry/image-registry-66df7c8f76-lvd7z" Oct 03 12:55:57 crc kubenswrapper[4962]: I1003 12:55:57.088254 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3780a149-9804-47e1-8f8f-5022ba43b834-trusted-ca\") pod \"image-registry-66df7c8f76-lvd7z\" (UID: \"3780a149-9804-47e1-8f8f-5022ba43b834\") " pod="openshift-image-registry/image-registry-66df7c8f76-lvd7z" Oct 03 12:55:57 crc kubenswrapper[4962]: I1003 12:55:57.088274 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3780a149-9804-47e1-8f8f-5022ba43b834-installation-pull-secrets\") pod \"image-registry-66df7c8f76-lvd7z\" (UID: \"3780a149-9804-47e1-8f8f-5022ba43b834\") " pod="openshift-image-registry/image-registry-66df7c8f76-lvd7z" Oct 03 12:55:57 crc kubenswrapper[4962]: I1003 12:55:57.088295 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3780a149-9804-47e1-8f8f-5022ba43b834-bound-sa-token\") pod \"image-registry-66df7c8f76-lvd7z\" (UID: \"3780a149-9804-47e1-8f8f-5022ba43b834\") " pod="openshift-image-registry/image-registry-66df7c8f76-lvd7z" Oct 03 12:55:57 crc kubenswrapper[4962]: I1003 12:55:57.088321 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3780a149-9804-47e1-8f8f-5022ba43b834-registry-certificates\") pod \"image-registry-66df7c8f76-lvd7z\" (UID: \"3780a149-9804-47e1-8f8f-5022ba43b834\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-lvd7z" Oct 03 12:55:57 crc kubenswrapper[4962]: I1003 12:55:57.089528 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3780a149-9804-47e1-8f8f-5022ba43b834-registry-certificates\") pod \"image-registry-66df7c8f76-lvd7z\" (UID: \"3780a149-9804-47e1-8f8f-5022ba43b834\") " pod="openshift-image-registry/image-registry-66df7c8f76-lvd7z" Oct 03 12:55:57 crc kubenswrapper[4962]: I1003 12:55:57.089528 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3780a149-9804-47e1-8f8f-5022ba43b834-ca-trust-extracted\") pod \"image-registry-66df7c8f76-lvd7z\" (UID: \"3780a149-9804-47e1-8f8f-5022ba43b834\") " pod="openshift-image-registry/image-registry-66df7c8f76-lvd7z" Oct 03 12:55:57 crc kubenswrapper[4962]: I1003 12:55:57.090000 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3780a149-9804-47e1-8f8f-5022ba43b834-trusted-ca\") pod \"image-registry-66df7c8f76-lvd7z\" (UID: \"3780a149-9804-47e1-8f8f-5022ba43b834\") " pod="openshift-image-registry/image-registry-66df7c8f76-lvd7z" Oct 03 12:55:57 crc kubenswrapper[4962]: I1003 12:55:57.095765 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3780a149-9804-47e1-8f8f-5022ba43b834-registry-tls\") pod \"image-registry-66df7c8f76-lvd7z\" (UID: \"3780a149-9804-47e1-8f8f-5022ba43b834\") " pod="openshift-image-registry/image-registry-66df7c8f76-lvd7z" Oct 03 12:55:57 crc kubenswrapper[4962]: I1003 12:55:57.097219 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3780a149-9804-47e1-8f8f-5022ba43b834-installation-pull-secrets\") pod \"image-registry-66df7c8f76-lvd7z\" (UID: \"3780a149-9804-47e1-8f8f-5022ba43b834\") " pod="openshift-image-registry/image-registry-66df7c8f76-lvd7z" Oct 03 12:55:57 crc kubenswrapper[4962]: I1003 12:55:57.106031 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3780a149-9804-47e1-8f8f-5022ba43b834-bound-sa-token\") pod \"image-registry-66df7c8f76-lvd7z\" (UID: \"3780a149-9804-47e1-8f8f-5022ba43b834\") " pod="openshift-image-registry/image-registry-66df7c8f76-lvd7z" Oct 03 12:55:57 crc kubenswrapper[4962]: I1003 12:55:57.106041 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjxhm\" (UniqueName: \"kubernetes.io/projected/3780a149-9804-47e1-8f8f-5022ba43b834-kube-api-access-vjxhm\") pod \"image-registry-66df7c8f76-lvd7z\" (UID: \"3780a149-9804-47e1-8f8f-5022ba43b834\") " pod="openshift-image-registry/image-registry-66df7c8f76-lvd7z" Oct 03 12:55:57 crc kubenswrapper[4962]: I1003 12:55:57.402771 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-lvd7z" Oct 03 12:55:57 crc kubenswrapper[4962]: I1003 12:55:57.771968 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-lvd7z"] Oct 03 12:55:57 crc kubenswrapper[4962]: W1003 12:55:57.782823 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3780a149_9804_47e1_8f8f_5022ba43b834.slice/crio-a8e13a11400a93539b9023c6bf58a9f6b9d0fd0643fd3b687c7b246b606382f4 WatchSource:0}: Error finding container a8e13a11400a93539b9023c6bf58a9f6b9d0fd0643fd3b687c7b246b606382f4: Status 404 returned error can't find the container with id a8e13a11400a93539b9023c6bf58a9f6b9d0fd0643fd3b687c7b246b606382f4 Oct 03 12:55:58 crc kubenswrapper[4962]: I1003 12:55:58.667223 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-lvd7z" event={"ID":"3780a149-9804-47e1-8f8f-5022ba43b834","Type":"ContainerStarted","Data":"2f8ed5fdf784e584194ad6e428d3d0dab041100a24c3c2a6405405e635c21f3d"} Oct 03 12:55:58 crc kubenswrapper[4962]: I1003 12:55:58.667626 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-lvd7z" event={"ID":"3780a149-9804-47e1-8f8f-5022ba43b834","Type":"ContainerStarted","Data":"a8e13a11400a93539b9023c6bf58a9f6b9d0fd0643fd3b687c7b246b606382f4"} Oct 03 12:55:58 crc kubenswrapper[4962]: I1003 12:55:58.667658 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-lvd7z" Oct 03 12:55:58 crc kubenswrapper[4962]: I1003 12:55:58.688028 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-lvd7z" podStartSLOduration=2.688006309 podStartE2EDuration="2.688006309s" podCreationTimestamp="2025-10-03 12:55:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:55:58.682711969 +0000 UTC m=+367.086609834" watchObservedRunningTime="2025-10-03 12:55:58.688006309 +0000 UTC m=+367.091904144" Oct 03 12:56:17 crc kubenswrapper[4962]: I1003 12:56:17.408048 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-lvd7z" Oct 03 12:56:17 crc kubenswrapper[4962]: I1003 12:56:17.448730 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6pkx8"] Oct 03 12:56:24 crc kubenswrapper[4962]: I1003 12:56:24.659695 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 12:56:24 crc kubenswrapper[4962]: I1003 12:56:24.660172 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 12:56:24 crc kubenswrapper[4962]: I1003 12:56:24.660213 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-46vck" Oct 03 12:56:24 crc kubenswrapper[4962]: I1003 12:56:24.660824 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ff1cb175dbd5e3b73b2d393894d44cdfd3e2d2b6ac48ceeab83ef956a3efba04"} pod="openshift-machine-config-operator/machine-config-daemon-46vck" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 12:56:24 crc kubenswrapper[4962]: I1003 12:56:24.660878 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" containerID="cri-o://ff1cb175dbd5e3b73b2d393894d44cdfd3e2d2b6ac48ceeab83ef956a3efba04" gracePeriod=600 Oct 03 12:56:24 crc kubenswrapper[4962]: I1003 12:56:24.792971 4962 generic.go:334] "Generic (PLEG): container finished" podID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerID="ff1cb175dbd5e3b73b2d393894d44cdfd3e2d2b6ac48ceeab83ef956a3efba04" exitCode=0 Oct 03 12:56:24 crc kubenswrapper[4962]: I1003 12:56:24.793011 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerDied","Data":"ff1cb175dbd5e3b73b2d393894d44cdfd3e2d2b6ac48ceeab83ef956a3efba04"} Oct 03 12:56:24 crc kubenswrapper[4962]: I1003 12:56:24.793047 4962 scope.go:117] "RemoveContainer" containerID="1c066a5dc6399e5df82f7425020e84a5354658a5b77b1014a74126539aa1c738" Oct 03 12:56:25 crc kubenswrapper[4962]: I1003 12:56:25.799134 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerStarted","Data":"33c84dde92545027ae38429c39abf78238701fc16ece7153e8e1c194f99c81ce"} Oct 03 12:56:42 crc kubenswrapper[4962]: I1003 12:56:42.483272 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" podUID="2982d523-afe6-4ab4-9778-5dbe578a243b" containerName="registry" containerID="cri-o://7e277bb34bd1c9c960aa98cf63b4ccf9c529a36af5a8e4236cfa8f64eb2013e5" gracePeriod=30 Oct 03 12:56:42 crc kubenswrapper[4962]: I1003 12:56:42.773695 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:56:42 crc kubenswrapper[4962]: I1003 12:56:42.874620 4962 generic.go:334] "Generic (PLEG): container finished" podID="2982d523-afe6-4ab4-9778-5dbe578a243b" containerID="7e277bb34bd1c9c960aa98cf63b4ccf9c529a36af5a8e4236cfa8f64eb2013e5" exitCode=0 Oct 03 12:56:42 crc kubenswrapper[4962]: I1003 12:56:42.874676 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" event={"ID":"2982d523-afe6-4ab4-9778-5dbe578a243b","Type":"ContainerDied","Data":"7e277bb34bd1c9c960aa98cf63b4ccf9c529a36af5a8e4236cfa8f64eb2013e5"} Oct 03 12:56:42 crc kubenswrapper[4962]: I1003 12:56:42.874705 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" event={"ID":"2982d523-afe6-4ab4-9778-5dbe578a243b","Type":"ContainerDied","Data":"1dcc140df02def664d0e949095981c1e5d957726c6a05a2eccf4a45e70645268"} Oct 03 12:56:42 crc kubenswrapper[4962]: I1003 12:56:42.874721 4962 scope.go:117] "RemoveContainer" containerID="7e277bb34bd1c9c960aa98cf63b4ccf9c529a36af5a8e4236cfa8f64eb2013e5" Oct 03 12:56:42 crc kubenswrapper[4962]: I1003 12:56:42.874716 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6pkx8" Oct 03 12:56:42 crc kubenswrapper[4962]: I1003 12:56:42.889473 4962 scope.go:117] "RemoveContainer" containerID="7e277bb34bd1c9c960aa98cf63b4ccf9c529a36af5a8e4236cfa8f64eb2013e5" Oct 03 12:56:42 crc kubenswrapper[4962]: E1003 12:56:42.889973 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e277bb34bd1c9c960aa98cf63b4ccf9c529a36af5a8e4236cfa8f64eb2013e5\": container with ID starting with 7e277bb34bd1c9c960aa98cf63b4ccf9c529a36af5a8e4236cfa8f64eb2013e5 not found: ID does not exist" containerID="7e277bb34bd1c9c960aa98cf63b4ccf9c529a36af5a8e4236cfa8f64eb2013e5" Oct 03 12:56:42 crc kubenswrapper[4962]: I1003 12:56:42.890046 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e277bb34bd1c9c960aa98cf63b4ccf9c529a36af5a8e4236cfa8f64eb2013e5"} err="failed to get container status \"7e277bb34bd1c9c960aa98cf63b4ccf9c529a36af5a8e4236cfa8f64eb2013e5\": rpc error: code = NotFound desc = could not find container \"7e277bb34bd1c9c960aa98cf63b4ccf9c529a36af5a8e4236cfa8f64eb2013e5\": container with ID starting with 7e277bb34bd1c9c960aa98cf63b4ccf9c529a36af5a8e4236cfa8f64eb2013e5 not found: ID does not exist" Oct 03 12:56:42 crc kubenswrapper[4962]: I1003 12:56:42.963492 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m42qn\" (UniqueName: \"kubernetes.io/projected/2982d523-afe6-4ab4-9778-5dbe578a243b-kube-api-access-m42qn\") pod \"2982d523-afe6-4ab4-9778-5dbe578a243b\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " Oct 03 12:56:42 crc kubenswrapper[4962]: I1003 12:56:42.963552 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2982d523-afe6-4ab4-9778-5dbe578a243b-bound-sa-token\") pod \"2982d523-afe6-4ab4-9778-5dbe578a243b\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " Oct 03 12:56:42 crc kubenswrapper[4962]: I1003 12:56:42.963589 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/2982d523-afe6-4ab4-9778-5dbe578a243b-registry-tls\") pod \"2982d523-afe6-4ab4-9778-5dbe578a243b\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " Oct 03 12:56:42 crc kubenswrapper[4962]: I1003 12:56:42.963612 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2982d523-afe6-4ab4-9778-5dbe578a243b-trusted-ca\") pod \"2982d523-afe6-4ab4-9778-5dbe578a243b\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " Oct 03 12:56:42 crc kubenswrapper[4962]: I1003 12:56:42.963831 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2982d523-afe6-4ab4-9778-5dbe578a243b-installation-pull-secrets\") pod \"2982d523-afe6-4ab4-9778-5dbe578a243b\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " Oct 03 12:56:42 crc kubenswrapper[4962]: I1003 12:56:42.963891 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2982d523-afe6-4ab4-9778-5dbe578a243b-ca-trust-extracted\") pod \"2982d523-afe6-4ab4-9778-5dbe578a243b\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " Oct 03 12:56:42 crc kubenswrapper[4962]: I1003 12:56:42.963960 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2982d523-afe6-4ab4-9778-5dbe578a243b-registry-certificates\") pod \"2982d523-afe6-4ab4-9778-5dbe578a243b\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " Oct 03 12:56:42 crc kubenswrapper[4962]: I1003 12:56:42.964068 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"2982d523-afe6-4ab4-9778-5dbe578a243b\" (UID: \"2982d523-afe6-4ab4-9778-5dbe578a243b\") " Oct 03 12:56:42 crc kubenswrapper[4962]: I1003 12:56:42.964786 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2982d523-afe6-4ab4-9778-5dbe578a243b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "2982d523-afe6-4ab4-9778-5dbe578a243b" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:56:42 crc kubenswrapper[4962]: I1003 12:56:42.964989 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2982d523-afe6-4ab4-9778-5dbe578a243b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "2982d523-afe6-4ab4-9778-5dbe578a243b" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:56:42 crc kubenswrapper[4962]: I1003 12:56:42.969352 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2982d523-afe6-4ab4-9778-5dbe578a243b-kube-api-access-m42qn" (OuterVolumeSpecName: "kube-api-access-m42qn") pod "2982d523-afe6-4ab4-9778-5dbe578a243b" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b"). InnerVolumeSpecName "kube-api-access-m42qn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:56:42 crc kubenswrapper[4962]: I1003 12:56:42.970042 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2982d523-afe6-4ab4-9778-5dbe578a243b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "2982d523-afe6-4ab4-9778-5dbe578a243b" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:56:42 crc kubenswrapper[4962]: I1003 12:56:42.984436 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2982d523-afe6-4ab4-9778-5dbe578a243b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "2982d523-afe6-4ab4-9778-5dbe578a243b" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:56:42 crc kubenswrapper[4962]: I1003 12:56:42.984694 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2982d523-afe6-4ab4-9778-5dbe578a243b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "2982d523-afe6-4ab4-9778-5dbe578a243b" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:56:42 crc kubenswrapper[4962]: I1003 12:56:42.984909 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2982d523-afe6-4ab4-9778-5dbe578a243b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "2982d523-afe6-4ab4-9778-5dbe578a243b" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:56:42 crc kubenswrapper[4962]: I1003 12:56:42.985187 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "2982d523-afe6-4ab4-9778-5dbe578a243b" (UID: "2982d523-afe6-4ab4-9778-5dbe578a243b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 03 12:56:43 crc kubenswrapper[4962]: I1003 12:56:43.065074 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m42qn\" (UniqueName: \"kubernetes.io/projected/2982d523-afe6-4ab4-9778-5dbe578a243b-kube-api-access-m42qn\") on node \"crc\" DevicePath \"\"" Oct 03 12:56:43 crc kubenswrapper[4962]: I1003 12:56:43.065115 4962 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2982d523-afe6-4ab4-9778-5dbe578a243b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 03 12:56:43 crc kubenswrapper[4962]: I1003 12:56:43.065127 4962 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2982d523-afe6-4ab4-9778-5dbe578a243b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 03 12:56:43 crc kubenswrapper[4962]: I1003 12:56:43.065137 4962 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2982d523-afe6-4ab4-9778-5dbe578a243b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 12:56:43 crc kubenswrapper[4962]: I1003 12:56:43.065150 4962 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2982d523-afe6-4ab4-9778-5dbe578a243b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 03 12:56:43 crc kubenswrapper[4962]: I1003 12:56:43.065161 4962 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2982d523-afe6-4ab4-9778-5dbe578a243b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 03 12:56:43 crc kubenswrapper[4962]: I1003 12:56:43.065172 4962 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2982d523-afe6-4ab4-9778-5dbe578a243b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 03 12:56:43 crc kubenswrapper[4962]: I1003 12:56:43.205657 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6pkx8"] Oct 03 12:56:43 crc kubenswrapper[4962]: I1003 12:56:43.209368 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6pkx8"] Oct 03 12:56:44 crc kubenswrapper[4962]: I1003 12:56:44.236560 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2982d523-afe6-4ab4-9778-5dbe578a243b" path="/var/lib/kubelet/pods/2982d523-afe6-4ab4-9778-5dbe578a243b/volumes" Oct 03 12:58:24 crc kubenswrapper[4962]: I1003 12:58:24.660135 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 12:58:24 crc kubenswrapper[4962]: I1003 12:58:24.660770 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 12:58:54 crc kubenswrapper[4962]: I1003 12:58:54.660013 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 12:58:54 crc kubenswrapper[4962]: I1003 12:58:54.660585 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 12:59:24 crc kubenswrapper[4962]: I1003 12:59:24.660017 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 12:59:24 crc kubenswrapper[4962]: I1003 12:59:24.660571 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 12:59:24 crc kubenswrapper[4962]: I1003 12:59:24.660615 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-46vck" Oct 03 12:59:24 crc kubenswrapper[4962]: I1003 12:59:24.661335 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"33c84dde92545027ae38429c39abf78238701fc16ece7153e8e1c194f99c81ce"} pod="openshift-machine-config-operator/machine-config-daemon-46vck" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 12:59:24 crc kubenswrapper[4962]: I1003 12:59:24.661392 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" containerID="cri-o://33c84dde92545027ae38429c39abf78238701fc16ece7153e8e1c194f99c81ce" gracePeriod=600 Oct 03 12:59:25 crc kubenswrapper[4962]: I1003 12:59:25.695152 4962 generic.go:334] "Generic (PLEG): container finished" podID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerID="33c84dde92545027ae38429c39abf78238701fc16ece7153e8e1c194f99c81ce" exitCode=0 Oct 03 12:59:25 crc kubenswrapper[4962]: I1003 12:59:25.695249 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerDied","Data":"33c84dde92545027ae38429c39abf78238701fc16ece7153e8e1c194f99c81ce"} Oct 03 12:59:25 crc kubenswrapper[4962]: I1003 12:59:25.695729 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerStarted","Data":"1f2cdc13b11da65d799c535afd6e9add75247936ceedca31f891b7fb2d791205"} Oct 03 12:59:25 crc kubenswrapper[4962]: I1003 12:59:25.695750 4962 scope.go:117] "RemoveContainer" containerID="ff1cb175dbd5e3b73b2d393894d44cdfd3e2d2b6ac48ceeab83ef956a3efba04" Oct 03 13:00:00 crc kubenswrapper[4962]: I1003 13:00:00.139883 4962 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29324940-khbtr"] Oct 03 13:00:00 crc kubenswrapper[4962]: E1003 13:00:00.140672 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2982d523-afe6-4ab4-9778-5dbe578a243b" containerName="registry" Oct 03 13:00:00 crc kubenswrapper[4962]: I1003 13:00:00.140690 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2982d523-afe6-4ab4-9778-5dbe578a243b" containerName="registry" Oct 03 13:00:00 crc kubenswrapper[4962]: I1003 13:00:00.140815 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2982d523-afe6-4ab4-9778-5dbe578a243b" containerName="registry" Oct 03 13:00:00 crc kubenswrapper[4962]: I1003 13:00:00.141429 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324940-khbtr" Oct 03 13:00:00 crc kubenswrapper[4962]: I1003 13:00:00.144392 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 13:00:00 crc kubenswrapper[4962]: I1003 13:00:00.144621 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 13:00:00 crc kubenswrapper[4962]: I1003 13:00:00.155772 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324940-khbtr"] Oct 03 13:00:00 crc kubenswrapper[4962]: I1003 13:00:00.159728 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9a581eeb-fbed-4cb2-bd7c-514596ca72df-secret-volume\") pod \"collect-profiles-29324940-khbtr\" (UID: \"9a581eeb-fbed-4cb2-bd7c-514596ca72df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324940-khbtr" Oct 03 13:00:00 crc kubenswrapper[4962]: I1003 13:00:00.159788 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9a581eeb-fbed-4cb2-bd7c-514596ca72df-config-volume\") pod \"collect-profiles-29324940-khbtr\" (UID: \"9a581eeb-fbed-4cb2-bd7c-514596ca72df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324940-khbtr" Oct 03 13:00:00 crc kubenswrapper[4962]: I1003 13:00:00.160018 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jvfd\" (UniqueName: \"kubernetes.io/projected/9a581eeb-fbed-4cb2-bd7c-514596ca72df-kube-api-access-8jvfd\") pod \"collect-profiles-29324940-khbtr\" (UID: \"9a581eeb-fbed-4cb2-bd7c-514596ca72df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324940-khbtr" Oct 03 13:00:00 crc kubenswrapper[4962]: I1003 13:00:00.261171 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jvfd\" (UniqueName: \"kubernetes.io/projected/9a581eeb-fbed-4cb2-bd7c-514596ca72df-kube-api-access-8jvfd\") pod \"collect-profiles-29324940-khbtr\" (UID: \"9a581eeb-fbed-4cb2-bd7c-514596ca72df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324940-khbtr" Oct 03 13:00:00 crc kubenswrapper[4962]: I1003 13:00:00.261312 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9a581eeb-fbed-4cb2-bd7c-514596ca72df-secret-volume\") pod \"collect-profiles-29324940-khbtr\" (UID: 
\"9a581eeb-fbed-4cb2-bd7c-514596ca72df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324940-khbtr" Oct 03 13:00:00 crc kubenswrapper[4962]: I1003 13:00:00.261356 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9a581eeb-fbed-4cb2-bd7c-514596ca72df-config-volume\") pod \"collect-profiles-29324940-khbtr\" (UID: \"9a581eeb-fbed-4cb2-bd7c-514596ca72df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324940-khbtr" Oct 03 13:00:00 crc kubenswrapper[4962]: I1003 13:00:00.262436 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9a581eeb-fbed-4cb2-bd7c-514596ca72df-config-volume\") pod \"collect-profiles-29324940-khbtr\" (UID: \"9a581eeb-fbed-4cb2-bd7c-514596ca72df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324940-khbtr" Oct 03 13:00:00 crc kubenswrapper[4962]: I1003 13:00:00.271468 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9a581eeb-fbed-4cb2-bd7c-514596ca72df-secret-volume\") pod \"collect-profiles-29324940-khbtr\" (UID: \"9a581eeb-fbed-4cb2-bd7c-514596ca72df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324940-khbtr" Oct 03 13:00:00 crc kubenswrapper[4962]: I1003 13:00:00.277507 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jvfd\" (UniqueName: \"kubernetes.io/projected/9a581eeb-fbed-4cb2-bd7c-514596ca72df-kube-api-access-8jvfd\") pod \"collect-profiles-29324940-khbtr\" (UID: \"9a581eeb-fbed-4cb2-bd7c-514596ca72df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324940-khbtr" Oct 03 13:00:00 crc kubenswrapper[4962]: I1003 13:00:00.476764 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324940-khbtr" Oct 03 13:00:00 crc kubenswrapper[4962]: I1003 13:00:00.644410 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324940-khbtr"] Oct 03 13:00:00 crc kubenswrapper[4962]: I1003 13:00:00.888902 4962 generic.go:334] "Generic (PLEG): container finished" podID="9a581eeb-fbed-4cb2-bd7c-514596ca72df" containerID="96d804cafd4e44896c85fefafec8011144b1e21e0c46c72f9957d9ef76fdf8f3" exitCode=0 Oct 03 13:00:00 crc kubenswrapper[4962]: I1003 13:00:00.888941 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324940-khbtr" event={"ID":"9a581eeb-fbed-4cb2-bd7c-514596ca72df","Type":"ContainerDied","Data":"96d804cafd4e44896c85fefafec8011144b1e21e0c46c72f9957d9ef76fdf8f3"} Oct 03 13:00:00 crc kubenswrapper[4962]: I1003 13:00:00.888967 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324940-khbtr" event={"ID":"9a581eeb-fbed-4cb2-bd7c-514596ca72df","Type":"ContainerStarted","Data":"2626df0b2cbcbd081d26a575d78903861a3b5a5c2a0eb06f83a625ae943977cc"} Oct 03 13:00:02 crc kubenswrapper[4962]: I1003 13:00:02.090811 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324940-khbtr" Oct 03 13:00:02 crc kubenswrapper[4962]: I1003 13:00:02.286005 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9a581eeb-fbed-4cb2-bd7c-514596ca72df-secret-volume\") pod \"9a581eeb-fbed-4cb2-bd7c-514596ca72df\" (UID: \"9a581eeb-fbed-4cb2-bd7c-514596ca72df\") " Oct 03 13:00:02 crc kubenswrapper[4962]: I1003 13:00:02.286321 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9a581eeb-fbed-4cb2-bd7c-514596ca72df-config-volume\") pod \"9a581eeb-fbed-4cb2-bd7c-514596ca72df\" (UID: \"9a581eeb-fbed-4cb2-bd7c-514596ca72df\") " Oct 03 13:00:02 crc kubenswrapper[4962]: I1003 13:00:02.286537 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jvfd\" (UniqueName: \"kubernetes.io/projected/9a581eeb-fbed-4cb2-bd7c-514596ca72df-kube-api-access-8jvfd\") pod \"9a581eeb-fbed-4cb2-bd7c-514596ca72df\" (UID: \"9a581eeb-fbed-4cb2-bd7c-514596ca72df\") " Oct 03 13:00:02 crc kubenswrapper[4962]: I1003 13:00:02.287289 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a581eeb-fbed-4cb2-bd7c-514596ca72df-config-volume" (OuterVolumeSpecName: "config-volume") pod "9a581eeb-fbed-4cb2-bd7c-514596ca72df" (UID: "9a581eeb-fbed-4cb2-bd7c-514596ca72df"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:00:02 crc kubenswrapper[4962]: I1003 13:00:02.287582 4962 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9a581eeb-fbed-4cb2-bd7c-514596ca72df-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 13:00:02 crc kubenswrapper[4962]: I1003 13:00:02.291593 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a581eeb-fbed-4cb2-bd7c-514596ca72df-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9a581eeb-fbed-4cb2-bd7c-514596ca72df" (UID: "9a581eeb-fbed-4cb2-bd7c-514596ca72df"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:00:02 crc kubenswrapper[4962]: I1003 13:00:02.291952 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a581eeb-fbed-4cb2-bd7c-514596ca72df-kube-api-access-8jvfd" (OuterVolumeSpecName: "kube-api-access-8jvfd") pod "9a581eeb-fbed-4cb2-bd7c-514596ca72df" (UID: "9a581eeb-fbed-4cb2-bd7c-514596ca72df"). InnerVolumeSpecName "kube-api-access-8jvfd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:00:02 crc kubenswrapper[4962]: I1003 13:00:02.388599 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jvfd\" (UniqueName: \"kubernetes.io/projected/9a581eeb-fbed-4cb2-bd7c-514596ca72df-kube-api-access-8jvfd\") on node \"crc\" DevicePath \"\"" Oct 03 13:00:02 crc kubenswrapper[4962]: I1003 13:00:02.388648 4962 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9a581eeb-fbed-4cb2-bd7c-514596ca72df-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 13:00:02 crc kubenswrapper[4962]: I1003 13:00:02.899307 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324940-khbtr" event={"ID":"9a581eeb-fbed-4cb2-bd7c-514596ca72df","Type":"ContainerDied","Data":"2626df0b2cbcbd081d26a575d78903861a3b5a5c2a0eb06f83a625ae943977cc"} Oct 03 13:00:02 crc kubenswrapper[4962]: I1003 13:00:02.899347 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2626df0b2cbcbd081d26a575d78903861a3b5a5c2a0eb06f83a625ae943977cc" Oct 03 13:00:02 crc kubenswrapper[4962]: I1003 13:00:02.899382 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324940-khbtr" Oct 03 13:01:24 crc kubenswrapper[4962]: I1003 13:01:24.659783 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 13:01:24 crc kubenswrapper[4962]: I1003 13:01:24.660556 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 13:01:53 crc kubenswrapper[4962]: I1003 13:01:53.536228 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-p5zwj"] Oct 03 13:01:53 crc kubenswrapper[4962]: I1003 13:01:53.536977 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-p5zwj" podUID="9c7cf52e-ea38-43a7-bd33-f546b4d5f57c" containerName="controller-manager" containerID="cri-o://d92c2e6777babd0fa498684e08fd9baa005c943f7cae64a55ea88f34fb85ec47" gracePeriod=30 Oct 03 13:01:53 crc kubenswrapper[4962]: I1003 13:01:53.632215 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdkb"] Oct 03 13:01:53 crc kubenswrapper[4962]: I1003 13:01:53.632708 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdkb" podUID="398018f7-8c31-40f9-bd6a-170564176a58" containerName="route-controller-manager" containerID="cri-o://42510e9ff05104233b4492ce06da77827c1819b6006c611c7f9c431e2b590979" gracePeriod=30 Oct 03 13:01:53 crc kubenswrapper[4962]: I1003 13:01:53.881625 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-p5zwj" Oct 03 13:01:53 crc kubenswrapper[4962]: I1003 13:01:53.950502 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdkb" Oct 03 13:01:53 crc kubenswrapper[4962]: I1003 13:01:53.983889 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9c7cf52e-ea38-43a7-bd33-f546b4d5f57c-proxy-ca-bundles\") pod \"9c7cf52e-ea38-43a7-bd33-f546b4d5f57c\" (UID: \"9c7cf52e-ea38-43a7-bd33-f546b4d5f57c\") " Oct 03 13:01:53 crc kubenswrapper[4962]: I1003 13:01:53.983977 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c7cf52e-ea38-43a7-bd33-f546b4d5f57c-serving-cert\") pod \"9c7cf52e-ea38-43a7-bd33-f546b4d5f57c\" (UID: \"9c7cf52e-ea38-43a7-bd33-f546b4d5f57c\") " Oct 03 13:01:53 crc kubenswrapper[4962]: I1003 13:01:53.984082 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtzcf\" (UniqueName: \"kubernetes.io/projected/9c7cf52e-ea38-43a7-bd33-f546b4d5f57c-kube-api-access-gtzcf\") pod \"9c7cf52e-ea38-43a7-bd33-f546b4d5f57c\" (UID: \"9c7cf52e-ea38-43a7-bd33-f546b4d5f57c\") " Oct 03 13:01:53 crc kubenswrapper[4962]: I1003 13:01:53.984106 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c7cf52e-ea38-43a7-bd33-f546b4d5f57c-config\") pod \"9c7cf52e-ea38-43a7-bd33-f546b4d5f57c\" (UID: \"9c7cf52e-ea38-43a7-bd33-f546b4d5f57c\") " Oct 03 13:01:53 crc kubenswrapper[4962]: I1003 13:01:53.984136 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c7cf52e-ea38-43a7-bd33-f546b4d5f57c-client-ca\") pod \"9c7cf52e-ea38-43a7-bd33-f546b4d5f57c\" (UID: \"9c7cf52e-ea38-43a7-bd33-f546b4d5f57c\") " Oct 03 13:01:53 crc kubenswrapper[4962]: I1003 13:01:53.985149 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c7cf52e-ea38-43a7-bd33-f546b4d5f57c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "9c7cf52e-ea38-43a7-bd33-f546b4d5f57c" (UID: "9c7cf52e-ea38-43a7-bd33-f546b4d5f57c"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:01:53 crc kubenswrapper[4962]: I1003 13:01:53.985181 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c7cf52e-ea38-43a7-bd33-f546b4d5f57c-client-ca" (OuterVolumeSpecName: "client-ca") pod "9c7cf52e-ea38-43a7-bd33-f546b4d5f57c" (UID: "9c7cf52e-ea38-43a7-bd33-f546b4d5f57c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:01:53 crc kubenswrapper[4962]: I1003 13:01:53.985383 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c7cf52e-ea38-43a7-bd33-f546b4d5f57c-config" (OuterVolumeSpecName: "config") pod "9c7cf52e-ea38-43a7-bd33-f546b4d5f57c" (UID: "9c7cf52e-ea38-43a7-bd33-f546b4d5f57c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:01:53 crc kubenswrapper[4962]: I1003 13:01:53.989694 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c7cf52e-ea38-43a7-bd33-f546b4d5f57c-kube-api-access-gtzcf" (OuterVolumeSpecName: "kube-api-access-gtzcf") pod "9c7cf52e-ea38-43a7-bd33-f546b4d5f57c" (UID: "9c7cf52e-ea38-43a7-bd33-f546b4d5f57c"). InnerVolumeSpecName "kube-api-access-gtzcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:01:53 crc kubenswrapper[4962]: I1003 13:01:53.989696 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c7cf52e-ea38-43a7-bd33-f546b4d5f57c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9c7cf52e-ea38-43a7-bd33-f546b4d5f57c" (UID: "9c7cf52e-ea38-43a7-bd33-f546b4d5f57c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.085155 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/398018f7-8c31-40f9-bd6a-170564176a58-serving-cert\") pod \"398018f7-8c31-40f9-bd6a-170564176a58\" (UID: \"398018f7-8c31-40f9-bd6a-170564176a58\") " Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.085266 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/398018f7-8c31-40f9-bd6a-170564176a58-config\") pod \"398018f7-8c31-40f9-bd6a-170564176a58\" (UID: \"398018f7-8c31-40f9-bd6a-170564176a58\") " Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.085306 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/398018f7-8c31-40f9-bd6a-170564176a58-client-ca\") pod \"398018f7-8c31-40f9-bd6a-170564176a58\" (UID: \"398018f7-8c31-40f9-bd6a-170564176a58\") " Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.085343 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nll5\" (UniqueName: \"kubernetes.io/projected/398018f7-8c31-40f9-bd6a-170564176a58-kube-api-access-9nll5\") pod \"398018f7-8c31-40f9-bd6a-170564176a58\" (UID: \"398018f7-8c31-40f9-bd6a-170564176a58\") " Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.085761 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c7cf52e-ea38-43a7-bd33-f546b4d5f57c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.085789 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtzcf\" (UniqueName: \"kubernetes.io/projected/9c7cf52e-ea38-43a7-bd33-f546b4d5f57c-kube-api-access-gtzcf\") on node \"crc\" DevicePath \"\"" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.085801 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c7cf52e-ea38-43a7-bd33-f546b4d5f57c-config\") on node \"crc\" DevicePath \"\"" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.085809 4962 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c7cf52e-ea38-43a7-bd33-f546b4d5f57c-client-ca\") on node \"crc\" DevicePath \"\"" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.085819 4962 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/9c7cf52e-ea38-43a7-bd33-f546b4d5f57c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.085930 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/398018f7-8c31-40f9-bd6a-170564176a58-config" (OuterVolumeSpecName: "config") pod "398018f7-8c31-40f9-bd6a-170564176a58" (UID: "398018f7-8c31-40f9-bd6a-170564176a58"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.085962 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/398018f7-8c31-40f9-bd6a-170564176a58-client-ca" (OuterVolumeSpecName: "client-ca") pod "398018f7-8c31-40f9-bd6a-170564176a58" (UID: "398018f7-8c31-40f9-bd6a-170564176a58"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.088673 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/398018f7-8c31-40f9-bd6a-170564176a58-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "398018f7-8c31-40f9-bd6a-170564176a58" (UID: "398018f7-8c31-40f9-bd6a-170564176a58"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.089205 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/398018f7-8c31-40f9-bd6a-170564176a58-kube-api-access-9nll5" (OuterVolumeSpecName: "kube-api-access-9nll5") pod "398018f7-8c31-40f9-bd6a-170564176a58" (UID: "398018f7-8c31-40f9-bd6a-170564176a58"). InnerVolumeSpecName "kube-api-access-9nll5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.148365 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7696c64667-z9tk5"] Oct 03 13:01:54 crc kubenswrapper[4962]: E1003 13:01:54.148774 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="398018f7-8c31-40f9-bd6a-170564176a58" containerName="route-controller-manager" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.148835 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="398018f7-8c31-40f9-bd6a-170564176a58" containerName="route-controller-manager" Oct 03 13:01:54 crc kubenswrapper[4962]: E1003 13:01:54.148883 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c7cf52e-ea38-43a7-bd33-f546b4d5f57c" containerName="controller-manager" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.148930 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c7cf52e-ea38-43a7-bd33-f546b4d5f57c" containerName="controller-manager" Oct 03 13:01:54 crc kubenswrapper[4962]: E1003 13:01:54.149003 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a581eeb-fbed-4cb2-bd7c-514596ca72df" containerName="collect-profiles" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.149055 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a581eeb-fbed-4cb2-bd7c-514596ca72df" containerName="collect-profiles" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.149189 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c7cf52e-ea38-43a7-bd33-f546b4d5f57c" containerName="controller-manager" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.149250 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="398018f7-8c31-40f9-bd6a-170564176a58" containerName="route-controller-manager" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.149311 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a581eeb-fbed-4cb2-bd7c-514596ca72df" containerName="collect-profiles" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.149726 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7696c64667-z9tk5" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.161083 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7696c64667-z9tk5"] Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.180136 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d479dc49-hsbzd"] Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.180996 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d479dc49-hsbzd" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.187049 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/398018f7-8c31-40f9-bd6a-170564176a58-config\") on node \"crc\" DevicePath \"\"" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.187254 4962 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/398018f7-8c31-40f9-bd6a-170564176a58-client-ca\") on node \"crc\" DevicePath \"\"" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.187309 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nll5\" (UniqueName: \"kubernetes.io/projected/398018f7-8c31-40f9-bd6a-170564176a58-kube-api-access-9nll5\") on node \"crc\" DevicePath \"\"" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.187362 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/398018f7-8c31-40f9-bd6a-170564176a58-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.200606 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d479dc49-hsbzd"] Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.288183 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ccfaf312-eda3-414d-a33b-cd5027782c32-serving-cert\") pod \"controller-manager-7696c64667-z9tk5\" (UID: \"ccfaf312-eda3-414d-a33b-cd5027782c32\") " pod="openshift-controller-manager/controller-manager-7696c64667-z9tk5" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.288224 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2s6s\" (UniqueName: \"kubernetes.io/projected/ccfaf312-eda3-414d-a33b-cd5027782c32-kube-api-access-m2s6s\") pod \"controller-manager-7696c64667-z9tk5\" (UID: \"ccfaf312-eda3-414d-a33b-cd5027782c32\") " pod="openshift-controller-manager/controller-manager-7696c64667-z9tk5" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.288262 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccfaf312-eda3-414d-a33b-cd5027782c32-config\") pod \"controller-manager-7696c64667-z9tk5\" (UID: \"ccfaf312-eda3-414d-a33b-cd5027782c32\") " pod="openshift-controller-manager/controller-manager-7696c64667-z9tk5" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.288283 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ccfaf312-eda3-414d-a33b-cd5027782c32-client-ca\") pod \"controller-manager-7696c64667-z9tk5\" (UID: \"ccfaf312-eda3-414d-a33b-cd5027782c32\") " pod="openshift-controller-manager/controller-manager-7696c64667-z9tk5" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.288305 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/161f83bf-e749-4bd1-98cb-968b4dd8224a-serving-cert\") pod \"route-controller-manager-6d479dc49-hsbzd\" (UID: \"161f83bf-e749-4bd1-98cb-968b4dd8224a\") " 
pod="openshift-route-controller-manager/route-controller-manager-6d479dc49-hsbzd" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.288319 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/161f83bf-e749-4bd1-98cb-968b4dd8224a-config\") pod \"route-controller-manager-6d479dc49-hsbzd\" (UID: \"161f83bf-e749-4bd1-98cb-968b4dd8224a\") " pod="openshift-route-controller-manager/route-controller-manager-6d479dc49-hsbzd" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.288347 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmm6g\" (UniqueName: \"kubernetes.io/projected/161f83bf-e749-4bd1-98cb-968b4dd8224a-kube-api-access-cmm6g\") pod \"route-controller-manager-6d479dc49-hsbzd\" (UID: \"161f83bf-e749-4bd1-98cb-968b4dd8224a\") " pod="openshift-route-controller-manager/route-controller-manager-6d479dc49-hsbzd" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.288364 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/161f83bf-e749-4bd1-98cb-968b4dd8224a-client-ca\") pod \"route-controller-manager-6d479dc49-hsbzd\" (UID: \"161f83bf-e749-4bd1-98cb-968b4dd8224a\") " pod="openshift-route-controller-manager/route-controller-manager-6d479dc49-hsbzd" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.288388 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ccfaf312-eda3-414d-a33b-cd5027782c32-proxy-ca-bundles\") pod \"controller-manager-7696c64667-z9tk5\" (UID: \"ccfaf312-eda3-414d-a33b-cd5027782c32\") " pod="openshift-controller-manager/controller-manager-7696c64667-z9tk5" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.389812 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmm6g\" (UniqueName: \"kubernetes.io/projected/161f83bf-e749-4bd1-98cb-968b4dd8224a-kube-api-access-cmm6g\") pod \"route-controller-manager-6d479dc49-hsbzd\" (UID: \"161f83bf-e749-4bd1-98cb-968b4dd8224a\") " pod="openshift-route-controller-manager/route-controller-manager-6d479dc49-hsbzd" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.389866 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/161f83bf-e749-4bd1-98cb-968b4dd8224a-client-ca\") pod \"route-controller-manager-6d479dc49-hsbzd\" (UID: \"161f83bf-e749-4bd1-98cb-968b4dd8224a\") " pod="openshift-route-controller-manager/route-controller-manager-6d479dc49-hsbzd" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.389892 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ccfaf312-eda3-414d-a33b-cd5027782c32-proxy-ca-bundles\") pod \"controller-manager-7696c64667-z9tk5\" (UID: \"ccfaf312-eda3-414d-a33b-cd5027782c32\") " pod="openshift-controller-manager/controller-manager-7696c64667-z9tk5" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.389919 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ccfaf312-eda3-414d-a33b-cd5027782c32-serving-cert\") pod \"controller-manager-7696c64667-z9tk5\" (UID: \"ccfaf312-eda3-414d-a33b-cd5027782c32\") " 
pod="openshift-controller-manager/controller-manager-7696c64667-z9tk5" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.389940 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2s6s\" (UniqueName: \"kubernetes.io/projected/ccfaf312-eda3-414d-a33b-cd5027782c32-kube-api-access-m2s6s\") pod \"controller-manager-7696c64667-z9tk5\" (UID: \"ccfaf312-eda3-414d-a33b-cd5027782c32\") " pod="openshift-controller-manager/controller-manager-7696c64667-z9tk5" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.389967 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccfaf312-eda3-414d-a33b-cd5027782c32-config\") pod \"controller-manager-7696c64667-z9tk5\" (UID: \"ccfaf312-eda3-414d-a33b-cd5027782c32\") " pod="openshift-controller-manager/controller-manager-7696c64667-z9tk5" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.389986 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ccfaf312-eda3-414d-a33b-cd5027782c32-client-ca\") pod \"controller-manager-7696c64667-z9tk5\" (UID: \"ccfaf312-eda3-414d-a33b-cd5027782c32\") " pod="openshift-controller-manager/controller-manager-7696c64667-z9tk5" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.390007 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/161f83bf-e749-4bd1-98cb-968b4dd8224a-serving-cert\") pod \"route-controller-manager-6d479dc49-hsbzd\" (UID: \"161f83bf-e749-4bd1-98cb-968b4dd8224a\") " pod="openshift-route-controller-manager/route-controller-manager-6d479dc49-hsbzd" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.390025 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/161f83bf-e749-4bd1-98cb-968b4dd8224a-config\") pod \"route-controller-manager-6d479dc49-hsbzd\" (UID: \"161f83bf-e749-4bd1-98cb-968b4dd8224a\") " pod="openshift-route-controller-manager/route-controller-manager-6d479dc49-hsbzd" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.391233 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/161f83bf-e749-4bd1-98cb-968b4dd8224a-config\") pod \"route-controller-manager-6d479dc49-hsbzd\" (UID: \"161f83bf-e749-4bd1-98cb-968b4dd8224a\") " pod="openshift-route-controller-manager/route-controller-manager-6d479dc49-hsbzd" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.392099 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/161f83bf-e749-4bd1-98cb-968b4dd8224a-client-ca\") pod \"route-controller-manager-6d479dc49-hsbzd\" (UID: \"161f83bf-e749-4bd1-98cb-968b4dd8224a\") " pod="openshift-route-controller-manager/route-controller-manager-6d479dc49-hsbzd" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.392988 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ccfaf312-eda3-414d-a33b-cd5027782c32-proxy-ca-bundles\") pod \"controller-manager-7696c64667-z9tk5\" (UID: \"ccfaf312-eda3-414d-a33b-cd5027782c32\") " pod="openshift-controller-manager/controller-manager-7696c64667-z9tk5" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.393499 4962 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ccfaf312-eda3-414d-a33b-cd5027782c32-client-ca\") pod \"controller-manager-7696c64667-z9tk5\" (UID: \"ccfaf312-eda3-414d-a33b-cd5027782c32\") " pod="openshift-controller-manager/controller-manager-7696c64667-z9tk5" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.393965 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccfaf312-eda3-414d-a33b-cd5027782c32-config\") pod \"controller-manager-7696c64667-z9tk5\" (UID: \"ccfaf312-eda3-414d-a33b-cd5027782c32\") " pod="openshift-controller-manager/controller-manager-7696c64667-z9tk5" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.396278 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ccfaf312-eda3-414d-a33b-cd5027782c32-serving-cert\") pod \"controller-manager-7696c64667-z9tk5\" (UID: \"ccfaf312-eda3-414d-a33b-cd5027782c32\") " pod="openshift-controller-manager/controller-manager-7696c64667-z9tk5" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.396341 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/161f83bf-e749-4bd1-98cb-968b4dd8224a-serving-cert\") pod \"route-controller-manager-6d479dc49-hsbzd\" (UID: \"161f83bf-e749-4bd1-98cb-968b4dd8224a\") " pod="openshift-route-controller-manager/route-controller-manager-6d479dc49-hsbzd" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.411344 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmm6g\" (UniqueName: \"kubernetes.io/projected/161f83bf-e749-4bd1-98cb-968b4dd8224a-kube-api-access-cmm6g\") pod \"route-controller-manager-6d479dc49-hsbzd\" (UID: \"161f83bf-e749-4bd1-98cb-968b4dd8224a\") " pod="openshift-route-controller-manager/route-controller-manager-6d479dc49-hsbzd" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.414874 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2s6s\" (UniqueName: \"kubernetes.io/projected/ccfaf312-eda3-414d-a33b-cd5027782c32-kube-api-access-m2s6s\") pod \"controller-manager-7696c64667-z9tk5\" (UID: \"ccfaf312-eda3-414d-a33b-cd5027782c32\") " pod="openshift-controller-manager/controller-manager-7696c64667-z9tk5" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.453261 4962 generic.go:334] "Generic (PLEG): container finished" podID="398018f7-8c31-40f9-bd6a-170564176a58" containerID="42510e9ff05104233b4492ce06da77827c1819b6006c611c7f9c431e2b590979" exitCode=0 Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.453334 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdkb" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.453322 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdkb" event={"ID":"398018f7-8c31-40f9-bd6a-170564176a58","Type":"ContainerDied","Data":"42510e9ff05104233b4492ce06da77827c1819b6006c611c7f9c431e2b590979"} Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.453463 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdkb" event={"ID":"398018f7-8c31-40f9-bd6a-170564176a58","Type":"ContainerDied","Data":"960510cddc8a8ebed6a70ad7dc3bc460d3790f0a06fc6157a6807cd6c945951d"} Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.453531 4962 scope.go:117] "RemoveContainer" containerID="42510e9ff05104233b4492ce06da77827c1819b6006c611c7f9c431e2b590979" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.455128 4962 generic.go:334] "Generic (PLEG): container finished" podID="9c7cf52e-ea38-43a7-bd33-f546b4d5f57c" containerID="d92c2e6777babd0fa498684e08fd9baa005c943f7cae64a55ea88f34fb85ec47" exitCode=0 Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.455165 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-p5zwj" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.455176 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-p5zwj" event={"ID":"9c7cf52e-ea38-43a7-bd33-f546b4d5f57c","Type":"ContainerDied","Data":"d92c2e6777babd0fa498684e08fd9baa005c943f7cae64a55ea88f34fb85ec47"} Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.455200 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-p5zwj" event={"ID":"9c7cf52e-ea38-43a7-bd33-f546b4d5f57c","Type":"ContainerDied","Data":"4fa2507ffc77520ae604b1481020b41432a465d5250ae855a18b722d4478f3b7"} Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.465989 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7696c64667-z9tk5" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.469155 4962 scope.go:117] "RemoveContainer" containerID="42510e9ff05104233b4492ce06da77827c1819b6006c611c7f9c431e2b590979" Oct 03 13:01:54 crc kubenswrapper[4962]: E1003 13:01:54.469491 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42510e9ff05104233b4492ce06da77827c1819b6006c611c7f9c431e2b590979\": container with ID starting with 42510e9ff05104233b4492ce06da77827c1819b6006c611c7f9c431e2b590979 not found: ID does not exist" containerID="42510e9ff05104233b4492ce06da77827c1819b6006c611c7f9c431e2b590979" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.469530 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42510e9ff05104233b4492ce06da77827c1819b6006c611c7f9c431e2b590979"} err="failed to get container status \"42510e9ff05104233b4492ce06da77827c1819b6006c611c7f9c431e2b590979\": rpc error: code = NotFound desc = could not find container \"42510e9ff05104233b4492ce06da77827c1819b6006c611c7f9c431e2b590979\": container with ID starting with 42510e9ff05104233b4492ce06da77827c1819b6006c611c7f9c431e2b590979 not found: ID does not exist" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.469563 4962 scope.go:117] "RemoveContainer" containerID="d92c2e6777babd0fa498684e08fd9baa005c943f7cae64a55ea88f34fb85ec47" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.472396 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdkb"] Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.481845 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdkb"] Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.487321 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-p5zwj"] Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.490835 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-p5zwj"] Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.492805 4962 scope.go:117] "RemoveContainer" containerID="d92c2e6777babd0fa498684e08fd9baa005c943f7cae64a55ea88f34fb85ec47" Oct 03 13:01:54 crc kubenswrapper[4962]: E1003 13:01:54.493854 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d92c2e6777babd0fa498684e08fd9baa005c943f7cae64a55ea88f34fb85ec47\": container with ID starting with d92c2e6777babd0fa498684e08fd9baa005c943f7cae64a55ea88f34fb85ec47 not found: ID does not exist" containerID="d92c2e6777babd0fa498684e08fd9baa005c943f7cae64a55ea88f34fb85ec47" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.493887 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d92c2e6777babd0fa498684e08fd9baa005c943f7cae64a55ea88f34fb85ec47"} err="failed to get container status \"d92c2e6777babd0fa498684e08fd9baa005c943f7cae64a55ea88f34fb85ec47\": rpc error: code = NotFound desc = could not find container \"d92c2e6777babd0fa498684e08fd9baa005c943f7cae64a55ea88f34fb85ec47\": container with ID starting with d92c2e6777babd0fa498684e08fd9baa005c943f7cae64a55ea88f34fb85ec47 not found: ID does not exist" Oct 03 13:01:54 crc 
kubenswrapper[4962]: I1003 13:01:54.496003 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d479dc49-hsbzd" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.660965 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.661317 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.676300 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7696c64667-z9tk5"] Oct 03 13:01:54 crc kubenswrapper[4962]: I1003 13:01:54.739325 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d479dc49-hsbzd"] Oct 03 13:01:54 crc kubenswrapper[4962]: W1003 13:01:54.745853 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod161f83bf_e749_4bd1_98cb_968b4dd8224a.slice/crio-fca478c9962dfb0b962e4236fd1cf0b1a6257958bed5d1b3d4c4941c8021238f WatchSource:0}: Error finding container fca478c9962dfb0b962e4236fd1cf0b1a6257958bed5d1b3d4c4941c8021238f: Status 404 returned error can't find the container with id fca478c9962dfb0b962e4236fd1cf0b1a6257958bed5d1b3d4c4941c8021238f Oct 03 13:01:55 crc kubenswrapper[4962]: I1003 13:01:55.463454 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7696c64667-z9tk5" event={"ID":"ccfaf312-eda3-414d-a33b-cd5027782c32","Type":"ContainerStarted","Data":"cfb8eb20e9001cf59e44ccf447c87d51c896f65a5631b425a4666c385030abe4"} Oct 03 13:01:55 crc kubenswrapper[4962]: I1003 13:01:55.463850 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7696c64667-z9tk5" Oct 03 13:01:55 crc kubenswrapper[4962]: I1003 13:01:55.463869 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7696c64667-z9tk5" event={"ID":"ccfaf312-eda3-414d-a33b-cd5027782c32","Type":"ContainerStarted","Data":"4fc7ccfe483a2e212a1580a5134cf6d1ba784975222b72880c5ef4e921a2a1e8"} Oct 03 13:01:55 crc kubenswrapper[4962]: I1003 13:01:55.465718 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d479dc49-hsbzd" event={"ID":"161f83bf-e749-4bd1-98cb-968b4dd8224a","Type":"ContainerStarted","Data":"8aef83651a8efc8827a3aa45abccc9d607204f3ca49b0beffc304c35b85ee5d8"} Oct 03 13:01:55 crc kubenswrapper[4962]: I1003 13:01:55.465752 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d479dc49-hsbzd" event={"ID":"161f83bf-e749-4bd1-98cb-968b4dd8224a","Type":"ContainerStarted","Data":"fca478c9962dfb0b962e4236fd1cf0b1a6257958bed5d1b3d4c4941c8021238f"} Oct 03 13:01:55 crc kubenswrapper[4962]: I1003 13:01:55.465944 4962 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6d479dc49-hsbzd" Oct 03 13:01:55 crc kubenswrapper[4962]: I1003 13:01:55.482250 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7696c64667-z9tk5" Oct 03 13:01:55 crc kubenswrapper[4962]: I1003 13:01:55.492928 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6d479dc49-hsbzd" Oct 03 13:01:55 crc kubenswrapper[4962]: I1003 13:01:55.498250 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7696c64667-z9tk5" podStartSLOduration=1.498238444 podStartE2EDuration="1.498238444s" podCreationTimestamp="2025-10-03 13:01:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:01:55.495293768 +0000 UTC m=+723.899191603" watchObservedRunningTime="2025-10-03 13:01:55.498238444 +0000 UTC m=+723.902136279" Oct 03 13:01:55 crc kubenswrapper[4962]: I1003 13:01:55.512796 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6d479dc49-hsbzd" podStartSLOduration=1.512776207 podStartE2EDuration="1.512776207s" podCreationTimestamp="2025-10-03 13:01:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:01:55.512207172 +0000 UTC m=+723.916105017" watchObservedRunningTime="2025-10-03 13:01:55.512776207 +0000 UTC m=+723.916674052" Oct 03 13:01:56 crc kubenswrapper[4962]: I1003 13:01:56.233145 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="398018f7-8c31-40f9-bd6a-170564176a58" path="/var/lib/kubelet/pods/398018f7-8c31-40f9-bd6a-170564176a58/volumes" Oct 03 13:01:56 crc kubenswrapper[4962]: I1003 13:01:56.233925 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c7cf52e-ea38-43a7-bd33-f546b4d5f57c" path="/var/lib/kubelet/pods/9c7cf52e-ea38-43a7-bd33-f546b4d5f57c/volumes" Oct 03 13:02:04 crc kubenswrapper[4962]: I1003 13:02:04.335159 4962 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 03 13:02:16 crc kubenswrapper[4962]: I1003 13:02:16.162237 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9zccd"] Oct 03 13:02:16 crc kubenswrapper[4962]: I1003 13:02:16.166301 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9zccd" Oct 03 13:02:16 crc kubenswrapper[4962]: I1003 13:02:16.184490 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9zccd"] Oct 03 13:02:16 crc kubenswrapper[4962]: I1003 13:02:16.279142 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8fdbf9f-0043-4acd-9221-4aafad299c55-catalog-content\") pod \"certified-operators-9zccd\" (UID: \"d8fdbf9f-0043-4acd-9221-4aafad299c55\") " pod="openshift-marketplace/certified-operators-9zccd" Oct 03 13:02:16 crc kubenswrapper[4962]: I1003 13:02:16.279202 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8fdbf9f-0043-4acd-9221-4aafad299c55-utilities\") pod \"certified-operators-9zccd\" (UID: \"d8fdbf9f-0043-4acd-9221-4aafad299c55\") " pod="openshift-marketplace/certified-operators-9zccd" Oct 03 13:02:16 crc kubenswrapper[4962]: I1003 13:02:16.279252 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68tdg\" (UniqueName: \"kubernetes.io/projected/d8fdbf9f-0043-4acd-9221-4aafad299c55-kube-api-access-68tdg\") pod \"certified-operators-9zccd\" (UID: \"d8fdbf9f-0043-4acd-9221-4aafad299c55\") " pod="openshift-marketplace/certified-operators-9zccd" Oct 03 13:02:16 crc kubenswrapper[4962]: I1003 13:02:16.380436 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8fdbf9f-0043-4acd-9221-4aafad299c55-catalog-content\") pod \"certified-operators-9zccd\" (UID: \"d8fdbf9f-0043-4acd-9221-4aafad299c55\") " pod="openshift-marketplace/certified-operators-9zccd" Oct 03 13:02:16 crc kubenswrapper[4962]: I1003 13:02:16.380491 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8fdbf9f-0043-4acd-9221-4aafad299c55-utilities\") pod \"certified-operators-9zccd\" (UID: \"d8fdbf9f-0043-4acd-9221-4aafad299c55\") " pod="openshift-marketplace/certified-operators-9zccd" Oct 03 13:02:16 crc kubenswrapper[4962]: I1003 13:02:16.380517 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68tdg\" (UniqueName: \"kubernetes.io/projected/d8fdbf9f-0043-4acd-9221-4aafad299c55-kube-api-access-68tdg\") pod \"certified-operators-9zccd\" (UID: \"d8fdbf9f-0043-4acd-9221-4aafad299c55\") " pod="openshift-marketplace/certified-operators-9zccd" Oct 03 13:02:16 crc kubenswrapper[4962]: I1003 13:02:16.381362 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8fdbf9f-0043-4acd-9221-4aafad299c55-catalog-content\") pod \"certified-operators-9zccd\" (UID: \"d8fdbf9f-0043-4acd-9221-4aafad299c55\") " pod="openshift-marketplace/certified-operators-9zccd" Oct 03 13:02:16 crc kubenswrapper[4962]: I1003 13:02:16.381571 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8fdbf9f-0043-4acd-9221-4aafad299c55-utilities\") pod \"certified-operators-9zccd\" (UID: \"d8fdbf9f-0043-4acd-9221-4aafad299c55\") " pod="openshift-marketplace/certified-operators-9zccd" Oct 03 13:02:16 crc kubenswrapper[4962]: I1003 13:02:16.403520 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-68tdg\" (UniqueName: \"kubernetes.io/projected/d8fdbf9f-0043-4acd-9221-4aafad299c55-kube-api-access-68tdg\") pod \"certified-operators-9zccd\" (UID: \"d8fdbf9f-0043-4acd-9221-4aafad299c55\") " pod="openshift-marketplace/certified-operators-9zccd" Oct 03 13:02:16 crc kubenswrapper[4962]: I1003 13:02:16.534722 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9zccd" Oct 03 13:02:16 crc kubenswrapper[4962]: I1003 13:02:16.963872 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9zccd"] Oct 03 13:02:17 crc kubenswrapper[4962]: I1003 13:02:17.586252 4962 generic.go:334] "Generic (PLEG): container finished" podID="d8fdbf9f-0043-4acd-9221-4aafad299c55" containerID="aa4fd5508f4befe21fee0d88fda42dff310619f0bfbff42bf037ccf6ba2845d3" exitCode=0 Oct 03 13:02:17 crc kubenswrapper[4962]: I1003 13:02:17.586290 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9zccd" event={"ID":"d8fdbf9f-0043-4acd-9221-4aafad299c55","Type":"ContainerDied","Data":"aa4fd5508f4befe21fee0d88fda42dff310619f0bfbff42bf037ccf6ba2845d3"} Oct 03 13:02:17 crc kubenswrapper[4962]: I1003 13:02:17.586312 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9zccd" event={"ID":"d8fdbf9f-0043-4acd-9221-4aafad299c55","Type":"ContainerStarted","Data":"1c27fe2df496e6c114ee40d4d427e886c15ee1e346be30578b656a74eb50ff21"} Oct 03 13:02:17 crc kubenswrapper[4962]: I1003 13:02:17.589523 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 13:02:21 crc kubenswrapper[4962]: I1003 13:02:21.613999 4962 generic.go:334] "Generic (PLEG): container finished" podID="d8fdbf9f-0043-4acd-9221-4aafad299c55" containerID="3ce837a1e67bb0b1e5a63ba740e085c9682e9e787e73d1fd4cf55f31628de9ce" exitCode=0 Oct 03 13:02:21 crc kubenswrapper[4962]: I1003 13:02:21.614102 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9zccd" event={"ID":"d8fdbf9f-0043-4acd-9221-4aafad299c55","Type":"ContainerDied","Data":"3ce837a1e67bb0b1e5a63ba740e085c9682e9e787e73d1fd4cf55f31628de9ce"} Oct 03 13:02:23 crc kubenswrapper[4962]: I1003 13:02:23.625615 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9zccd" event={"ID":"d8fdbf9f-0043-4acd-9221-4aafad299c55","Type":"ContainerStarted","Data":"58af617a712edd3b2da056969ba19ac4f79fafde65527e59bfdf391963f1e661"} Oct 03 13:02:23 crc kubenswrapper[4962]: I1003 13:02:23.643987 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9zccd" podStartSLOduration=2.976939792 podStartE2EDuration="7.643771341s" podCreationTimestamp="2025-10-03 13:02:16 +0000 UTC" firstStartedPulling="2025-10-03 13:02:17.589222297 +0000 UTC m=+745.993120132" lastFinishedPulling="2025-10-03 13:02:22.256053846 +0000 UTC m=+750.659951681" observedRunningTime="2025-10-03 13:02:23.640019194 +0000 UTC m=+752.043917029" watchObservedRunningTime="2025-10-03 13:02:23.643771341 +0000 UTC m=+752.047669176" Oct 03 13:02:24 crc kubenswrapper[4962]: I1003 13:02:24.660258 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 13:02:24 crc kubenswrapper[4962]: I1003 13:02:24.660331 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 13:02:24 crc kubenswrapper[4962]: I1003 13:02:24.660373 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-46vck" Oct 03 13:02:24 crc kubenswrapper[4962]: I1003 13:02:24.660982 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1f2cdc13b11da65d799c535afd6e9add75247936ceedca31f891b7fb2d791205"} pod="openshift-machine-config-operator/machine-config-daemon-46vck" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 13:02:24 crc kubenswrapper[4962]: I1003 13:02:24.661026 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" containerID="cri-o://1f2cdc13b11da65d799c535afd6e9add75247936ceedca31f891b7fb2d791205" gracePeriod=600 Oct 03 13:02:25 crc kubenswrapper[4962]: I1003 13:02:25.639470 4962 generic.go:334] "Generic (PLEG): container finished" podID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerID="1f2cdc13b11da65d799c535afd6e9add75247936ceedca31f891b7fb2d791205" exitCode=0 Oct 03 13:02:25 crc kubenswrapper[4962]: I1003 13:02:25.639552 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerDied","Data":"1f2cdc13b11da65d799c535afd6e9add75247936ceedca31f891b7fb2d791205"} Oct 03 13:02:25 crc kubenswrapper[4962]: I1003 13:02:25.640093 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerStarted","Data":"076795ece51777028d4903d02e01c23aad08fd0c510374d97a2d753df68d0eea"} Oct 03 13:02:25 crc kubenswrapper[4962]: I1003 13:02:25.640116 4962 scope.go:117] "RemoveContainer" containerID="33c84dde92545027ae38429c39abf78238701fc16ece7153e8e1c194f99c81ce" Oct 03 13:02:26 crc kubenswrapper[4962]: I1003 13:02:26.535681 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9zccd" Oct 03 13:02:26 crc kubenswrapper[4962]: I1003 13:02:26.536827 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9zccd" Oct 03 13:02:26 crc kubenswrapper[4962]: I1003 13:02:26.576349 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9zccd" Oct 03 13:02:29 crc kubenswrapper[4962]: I1003 13:02:29.984341 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-w5xcn"] Oct 03 13:02:29 crc kubenswrapper[4962]: I1003 13:02:29.985653 4962 util.go:30] "No sandbox for pod can be found. 
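
The machine-config-daemon liveness probe keeps failing with connection refused, and just above the kubelet reacts: "Container machine-config-daemon failed liveness probe, will be restarted", then "Killing container with a grace period ... gracePeriod=600". That implies the standard two-phase shutdown, SIGTERM first and SIGKILL only if the grace period expires. A sketch of the pattern for an ordinary child process, not the CRI code path itself:

```go
package main

import (
	"fmt"
	"os/exec"
	"syscall"
	"time"
)

// killWithGrace sends SIGTERM, waits up to grace for the process to
// exit, then escalates to SIGKILL: the same two-phase shutdown the
// "Killing container with a grace period" entry describes.
func killWithGrace(cmd *exec.Cmd, grace time.Duration) error {
	if err := cmd.Process.Signal(syscall.SIGTERM); err != nil {
		return err
	}
	done := make(chan error, 1)
	go func() { done <- cmd.Wait() }()
	select {
	case <-done:
		return nil // exited within the grace period
	case <-time.After(grace):
		return cmd.Process.Kill() // grace expired: SIGKILL
	}
}

func main() {
	cmd := exec.Command("sleep", "600")
	if err := cmd.Start(); err != nil {
		panic(err)
	}
	fmt.Println(killWithGrace(cmd, 2*time.Second))
}
```
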
Need to start a new one" pod="openshift-marketplace/redhat-operators-w5xcn" Oct 03 13:02:29 crc kubenswrapper[4962]: I1003 13:02:29.996931 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w5xcn"] Oct 03 13:02:30 crc kubenswrapper[4962]: I1003 13:02:30.044204 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e89189cb-39da-4eac-a4be-c303b37b8447-utilities\") pod \"redhat-operators-w5xcn\" (UID: \"e89189cb-39da-4eac-a4be-c303b37b8447\") " pod="openshift-marketplace/redhat-operators-w5xcn" Oct 03 13:02:30 crc kubenswrapper[4962]: I1003 13:02:30.044282 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwkjd\" (UniqueName: \"kubernetes.io/projected/e89189cb-39da-4eac-a4be-c303b37b8447-kube-api-access-kwkjd\") pod \"redhat-operators-w5xcn\" (UID: \"e89189cb-39da-4eac-a4be-c303b37b8447\") " pod="openshift-marketplace/redhat-operators-w5xcn" Oct 03 13:02:30 crc kubenswrapper[4962]: I1003 13:02:30.044323 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e89189cb-39da-4eac-a4be-c303b37b8447-catalog-content\") pod \"redhat-operators-w5xcn\" (UID: \"e89189cb-39da-4eac-a4be-c303b37b8447\") " pod="openshift-marketplace/redhat-operators-w5xcn" Oct 03 13:02:30 crc kubenswrapper[4962]: I1003 13:02:30.145505 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e89189cb-39da-4eac-a4be-c303b37b8447-catalog-content\") pod \"redhat-operators-w5xcn\" (UID: \"e89189cb-39da-4eac-a4be-c303b37b8447\") " pod="openshift-marketplace/redhat-operators-w5xcn" Oct 03 13:02:30 crc kubenswrapper[4962]: I1003 13:02:30.145677 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e89189cb-39da-4eac-a4be-c303b37b8447-utilities\") pod \"redhat-operators-w5xcn\" (UID: \"e89189cb-39da-4eac-a4be-c303b37b8447\") " pod="openshift-marketplace/redhat-operators-w5xcn" Oct 03 13:02:30 crc kubenswrapper[4962]: I1003 13:02:30.145714 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwkjd\" (UniqueName: \"kubernetes.io/projected/e89189cb-39da-4eac-a4be-c303b37b8447-kube-api-access-kwkjd\") pod \"redhat-operators-w5xcn\" (UID: \"e89189cb-39da-4eac-a4be-c303b37b8447\") " pod="openshift-marketplace/redhat-operators-w5xcn" Oct 03 13:02:30 crc kubenswrapper[4962]: I1003 13:02:30.146376 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e89189cb-39da-4eac-a4be-c303b37b8447-catalog-content\") pod \"redhat-operators-w5xcn\" (UID: \"e89189cb-39da-4eac-a4be-c303b37b8447\") " pod="openshift-marketplace/redhat-operators-w5xcn" Oct 03 13:02:30 crc kubenswrapper[4962]: I1003 13:02:30.146580 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e89189cb-39da-4eac-a4be-c303b37b8447-utilities\") pod \"redhat-operators-w5xcn\" (UID: \"e89189cb-39da-4eac-a4be-c303b37b8447\") " pod="openshift-marketplace/redhat-operators-w5xcn" Oct 03 13:02:30 crc kubenswrapper[4962]: I1003 13:02:30.165648 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-kwkjd\" (UniqueName: \"kubernetes.io/projected/e89189cb-39da-4eac-a4be-c303b37b8447-kube-api-access-kwkjd\") pod \"redhat-operators-w5xcn\" (UID: \"e89189cb-39da-4eac-a4be-c303b37b8447\") " pod="openshift-marketplace/redhat-operators-w5xcn" Oct 03 13:02:30 crc kubenswrapper[4962]: I1003 13:02:30.301548 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w5xcn" Oct 03 13:02:30 crc kubenswrapper[4962]: I1003 13:02:30.716800 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w5xcn"] Oct 03 13:02:31 crc kubenswrapper[4962]: I1003 13:02:31.671530 4962 generic.go:334] "Generic (PLEG): container finished" podID="e89189cb-39da-4eac-a4be-c303b37b8447" containerID="a673b5aa68dbe083755447e966b180f4c5fee3768c5eb30788682f62e12307bf" exitCode=0 Oct 03 13:02:31 crc kubenswrapper[4962]: I1003 13:02:31.671571 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5xcn" event={"ID":"e89189cb-39da-4eac-a4be-c303b37b8447","Type":"ContainerDied","Data":"a673b5aa68dbe083755447e966b180f4c5fee3768c5eb30788682f62e12307bf"} Oct 03 13:02:31 crc kubenswrapper[4962]: I1003 13:02:31.671606 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5xcn" event={"ID":"e89189cb-39da-4eac-a4be-c303b37b8447","Type":"ContainerStarted","Data":"0e769749ebab05aea9e36c7ee193d834b40c2328c11630b3c06bb538b8db7c04"} Oct 03 13:02:32 crc kubenswrapper[4962]: I1003 13:02:32.677841 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5xcn" event={"ID":"e89189cb-39da-4eac-a4be-c303b37b8447","Type":"ContainerStarted","Data":"19a8c1ce06dd54c347763b71dc7fdc512ccca20a3f6687252a3e39ce75b2f43b"} Oct 03 13:02:33 crc kubenswrapper[4962]: I1003 13:02:33.683726 4962 generic.go:334] "Generic (PLEG): container finished" podID="e89189cb-39da-4eac-a4be-c303b37b8447" containerID="19a8c1ce06dd54c347763b71dc7fdc512ccca20a3f6687252a3e39ce75b2f43b" exitCode=0 Oct 03 13:02:33 crc kubenswrapper[4962]: I1003 13:02:33.683776 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5xcn" event={"ID":"e89189cb-39da-4eac-a4be-c303b37b8447","Type":"ContainerDied","Data":"19a8c1ce06dd54c347763b71dc7fdc512ccca20a3f6687252a3e39ce75b2f43b"} Oct 03 13:02:34 crc kubenswrapper[4962]: I1003 13:02:34.691108 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5xcn" event={"ID":"e89189cb-39da-4eac-a4be-c303b37b8447","Type":"ContainerStarted","Data":"61667a069844d85558c9a327efc65687f36671287e1c0e838248c8d0f6f6559b"} Oct 03 13:02:34 crc kubenswrapper[4962]: I1003 13:02:34.709743 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-w5xcn" podStartSLOduration=3.172354058 podStartE2EDuration="5.70972525s" podCreationTimestamp="2025-10-03 13:02:29 +0000 UTC" firstStartedPulling="2025-10-03 13:02:31.674317206 +0000 UTC m=+760.078215041" lastFinishedPulling="2025-10-03 13:02:34.211688398 +0000 UTC m=+762.615586233" observedRunningTime="2025-10-03 13:02:34.70661226 +0000 UTC m=+763.110510105" watchObservedRunningTime="2025-10-03 13:02:34.70972525 +0000 UTC m=+763.113623085" Oct 03 13:02:36 crc kubenswrapper[4962]: I1003 13:02:36.373564 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6cv8l"] Oct 03 13:02:36 crc 
kubenswrapper[4962]: I1003 13:02:36.375066 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6cv8l" Oct 03 13:02:36 crc kubenswrapper[4962]: I1003 13:02:36.385799 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6cv8l"] Oct 03 13:02:36 crc kubenswrapper[4962]: I1003 13:02:36.419062 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55525ea2-d968-432b-b6fc-d4630a9e843a-utilities\") pod \"redhat-marketplace-6cv8l\" (UID: \"55525ea2-d968-432b-b6fc-d4630a9e843a\") " pod="openshift-marketplace/redhat-marketplace-6cv8l" Oct 03 13:02:36 crc kubenswrapper[4962]: I1003 13:02:36.419266 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpkf2\" (UniqueName: \"kubernetes.io/projected/55525ea2-d968-432b-b6fc-d4630a9e843a-kube-api-access-cpkf2\") pod \"redhat-marketplace-6cv8l\" (UID: \"55525ea2-d968-432b-b6fc-d4630a9e843a\") " pod="openshift-marketplace/redhat-marketplace-6cv8l" Oct 03 13:02:36 crc kubenswrapper[4962]: I1003 13:02:36.419319 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55525ea2-d968-432b-b6fc-d4630a9e843a-catalog-content\") pod \"redhat-marketplace-6cv8l\" (UID: \"55525ea2-d968-432b-b6fc-d4630a9e843a\") " pod="openshift-marketplace/redhat-marketplace-6cv8l" Oct 03 13:02:36 crc kubenswrapper[4962]: I1003 13:02:36.521018 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpkf2\" (UniqueName: \"kubernetes.io/projected/55525ea2-d968-432b-b6fc-d4630a9e843a-kube-api-access-cpkf2\") pod \"redhat-marketplace-6cv8l\" (UID: \"55525ea2-d968-432b-b6fc-d4630a9e843a\") " pod="openshift-marketplace/redhat-marketplace-6cv8l" Oct 03 13:02:36 crc kubenswrapper[4962]: I1003 13:02:36.521081 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55525ea2-d968-432b-b6fc-d4630a9e843a-catalog-content\") pod \"redhat-marketplace-6cv8l\" (UID: \"55525ea2-d968-432b-b6fc-d4630a9e843a\") " pod="openshift-marketplace/redhat-marketplace-6cv8l" Oct 03 13:02:36 crc kubenswrapper[4962]: I1003 13:02:36.521135 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55525ea2-d968-432b-b6fc-d4630a9e843a-utilities\") pod \"redhat-marketplace-6cv8l\" (UID: \"55525ea2-d968-432b-b6fc-d4630a9e843a\") " pod="openshift-marketplace/redhat-marketplace-6cv8l" Oct 03 13:02:36 crc kubenswrapper[4962]: I1003 13:02:36.521801 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55525ea2-d968-432b-b6fc-d4630a9e843a-catalog-content\") pod \"redhat-marketplace-6cv8l\" (UID: \"55525ea2-d968-432b-b6fc-d4630a9e843a\") " pod="openshift-marketplace/redhat-marketplace-6cv8l" Oct 03 13:02:36 crc kubenswrapper[4962]: I1003 13:02:36.521862 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55525ea2-d968-432b-b6fc-d4630a9e843a-utilities\") pod \"redhat-marketplace-6cv8l\" (UID: \"55525ea2-d968-432b-b6fc-d4630a9e843a\") " pod="openshift-marketplace/redhat-marketplace-6cv8l" Oct 03 13:02:36 crc 
kubenswrapper[4962]: I1003 13:02:36.542712 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpkf2\" (UniqueName: \"kubernetes.io/projected/55525ea2-d968-432b-b6fc-d4630a9e843a-kube-api-access-cpkf2\") pod \"redhat-marketplace-6cv8l\" (UID: \"55525ea2-d968-432b-b6fc-d4630a9e843a\") " pod="openshift-marketplace/redhat-marketplace-6cv8l" Oct 03 13:02:36 crc kubenswrapper[4962]: I1003 13:02:36.587761 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9zccd" Oct 03 13:02:36 crc kubenswrapper[4962]: I1003 13:02:36.688436 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6cv8l" Oct 03 13:02:36 crc kubenswrapper[4962]: I1003 13:02:36.874907 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6cv8l"] Oct 03 13:02:36 crc kubenswrapper[4962]: W1003 13:02:36.897756 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55525ea2_d968_432b_b6fc_d4630a9e843a.slice/crio-bacede4c6aa394286d1ee451d21e8c0662b3c88e15ef81ebf1d3b77fc111a52b WatchSource:0}: Error finding container bacede4c6aa394286d1ee451d21e8c0662b3c88e15ef81ebf1d3b77fc111a52b: Status 404 returned error can't find the container with id bacede4c6aa394286d1ee451d21e8c0662b3c88e15ef81ebf1d3b77fc111a52b Oct 03 13:02:37 crc kubenswrapper[4962]: I1003 13:02:37.706192 4962 generic.go:334] "Generic (PLEG): container finished" podID="55525ea2-d968-432b-b6fc-d4630a9e843a" containerID="f256517557cf2fabc0d1b7053ddfc7a84a2e8447c724de622af9c3c9b815af93" exitCode=0 Oct 03 13:02:37 crc kubenswrapper[4962]: I1003 13:02:37.706291 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6cv8l" event={"ID":"55525ea2-d968-432b-b6fc-d4630a9e843a","Type":"ContainerDied","Data":"f256517557cf2fabc0d1b7053ddfc7a84a2e8447c724de622af9c3c9b815af93"} Oct 03 13:02:37 crc kubenswrapper[4962]: I1003 13:02:37.706525 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6cv8l" event={"ID":"55525ea2-d968-432b-b6fc-d4630a9e843a","Type":"ContainerStarted","Data":"bacede4c6aa394286d1ee451d21e8c0662b3c88e15ef81ebf1d3b77fc111a52b"} Oct 03 13:02:38 crc kubenswrapper[4962]: I1003 13:02:38.626258 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9zccd"] Oct 03 13:02:38 crc kubenswrapper[4962]: I1003 13:02:38.713595 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6cv8l" event={"ID":"55525ea2-d968-432b-b6fc-d4630a9e843a","Type":"ContainerStarted","Data":"602b8d2d3759a06d07fb04559eebb7a1afbe09e16d7bcc43a020aa958aec8758"} Oct 03 13:02:38 crc kubenswrapper[4962]: I1003 13:02:38.965758 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2z5bv"] Oct 03 13:02:38 crc kubenswrapper[4962]: I1003 13:02:38.966009 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2z5bv" podUID="c0f4fa9d-4551-4ba6-9180-17e8e77325fa" containerName="registry-server" containerID="cri-o://2f72dfa4ad78edb0d4605f0307ac773af6cac262ab0801b34fb3adf71ef73b76" gracePeriod=2 Oct 03 13:02:40 crc kubenswrapper[4962]: I1003 13:02:40.302605 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-w5xcn" Oct 03 13:02:40 crc kubenswrapper[4962]: I1003 13:02:40.302980 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-w5xcn" Oct 03 13:02:40 crc kubenswrapper[4962]: I1003 13:02:40.338007 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-w5xcn" Oct 03 13:02:40 crc kubenswrapper[4962]: I1003 13:02:40.724360 4962 generic.go:334] "Generic (PLEG): container finished" podID="55525ea2-d968-432b-b6fc-d4630a9e843a" containerID="602b8d2d3759a06d07fb04559eebb7a1afbe09e16d7bcc43a020aa958aec8758" exitCode=0 Oct 03 13:02:40 crc kubenswrapper[4962]: I1003 13:02:40.725005 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6cv8l" event={"ID":"55525ea2-d968-432b-b6fc-d4630a9e843a","Type":"ContainerDied","Data":"602b8d2d3759a06d07fb04559eebb7a1afbe09e16d7bcc43a020aa958aec8758"} Oct 03 13:02:40 crc kubenswrapper[4962]: I1003 13:02:40.764886 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-w5xcn" Oct 03 13:02:41 crc kubenswrapper[4962]: I1003 13:02:41.733308 4962 generic.go:334] "Generic (PLEG): container finished" podID="c0f4fa9d-4551-4ba6-9180-17e8e77325fa" containerID="2f72dfa4ad78edb0d4605f0307ac773af6cac262ab0801b34fb3adf71ef73b76" exitCode=0 Oct 03 13:02:41 crc kubenswrapper[4962]: I1003 13:02:41.733393 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2z5bv" event={"ID":"c0f4fa9d-4551-4ba6-9180-17e8e77325fa","Type":"ContainerDied","Data":"2f72dfa4ad78edb0d4605f0307ac773af6cac262ab0801b34fb3adf71ef73b76"} Oct 03 13:02:42 crc kubenswrapper[4962]: I1003 13:02:42.200618 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2z5bv" Oct 03 13:02:42 crc kubenswrapper[4962]: I1003 13:02:42.286569 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0f4fa9d-4551-4ba6-9180-17e8e77325fa-utilities\") pod \"c0f4fa9d-4551-4ba6-9180-17e8e77325fa\" (UID: \"c0f4fa9d-4551-4ba6-9180-17e8e77325fa\") " Oct 03 13:02:42 crc kubenswrapper[4962]: I1003 13:02:42.286682 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mp27j\" (UniqueName: \"kubernetes.io/projected/c0f4fa9d-4551-4ba6-9180-17e8e77325fa-kube-api-access-mp27j\") pod \"c0f4fa9d-4551-4ba6-9180-17e8e77325fa\" (UID: \"c0f4fa9d-4551-4ba6-9180-17e8e77325fa\") " Oct 03 13:02:42 crc kubenswrapper[4962]: I1003 13:02:42.286755 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0f4fa9d-4551-4ba6-9180-17e8e77325fa-catalog-content\") pod \"c0f4fa9d-4551-4ba6-9180-17e8e77325fa\" (UID: \"c0f4fa9d-4551-4ba6-9180-17e8e77325fa\") " Oct 03 13:02:42 crc kubenswrapper[4962]: I1003 13:02:42.287749 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0f4fa9d-4551-4ba6-9180-17e8e77325fa-utilities" (OuterVolumeSpecName: "utilities") pod "c0f4fa9d-4551-4ba6-9180-17e8e77325fa" (UID: "c0f4fa9d-4551-4ba6-9180-17e8e77325fa"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:02:42 crc kubenswrapper[4962]: I1003 13:02:42.291301 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0f4fa9d-4551-4ba6-9180-17e8e77325fa-kube-api-access-mp27j" (OuterVolumeSpecName: "kube-api-access-mp27j") pod "c0f4fa9d-4551-4ba6-9180-17e8e77325fa" (UID: "c0f4fa9d-4551-4ba6-9180-17e8e77325fa"). InnerVolumeSpecName "kube-api-access-mp27j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:02:42 crc kubenswrapper[4962]: I1003 13:02:42.329848 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0f4fa9d-4551-4ba6-9180-17e8e77325fa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c0f4fa9d-4551-4ba6-9180-17e8e77325fa" (UID: "c0f4fa9d-4551-4ba6-9180-17e8e77325fa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:02:42 crc kubenswrapper[4962]: I1003 13:02:42.387723 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0f4fa9d-4551-4ba6-9180-17e8e77325fa-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 13:02:42 crc kubenswrapper[4962]: I1003 13:02:42.387766 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mp27j\" (UniqueName: \"kubernetes.io/projected/c0f4fa9d-4551-4ba6-9180-17e8e77325fa-kube-api-access-mp27j\") on node \"crc\" DevicePath \"\"" Oct 03 13:02:42 crc kubenswrapper[4962]: I1003 13:02:42.387781 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0f4fa9d-4551-4ba6-9180-17e8e77325fa-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 13:02:42 crc kubenswrapper[4962]: I1003 13:02:42.738429 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2z5bv" event={"ID":"c0f4fa9d-4551-4ba6-9180-17e8e77325fa","Type":"ContainerDied","Data":"ff53b410b8984499193531aac895b7613472e80c0d0f852be2dd4afbdb540128"} Oct 03 13:02:42 crc kubenswrapper[4962]: I1003 13:02:42.738823 4962 scope.go:117] "RemoveContainer" containerID="2f72dfa4ad78edb0d4605f0307ac773af6cac262ab0801b34fb3adf71ef73b76" Oct 03 13:02:42 crc kubenswrapper[4962]: I1003 13:02:42.738459 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2z5bv" Oct 03 13:02:42 crc kubenswrapper[4962]: I1003 13:02:42.740534 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6cv8l" event={"ID":"55525ea2-d968-432b-b6fc-d4630a9e843a","Type":"ContainerStarted","Data":"05830daf287ec33fbdae537c44feea0ab7dfa00b3d19848faebe35ee1855c984"} Oct 03 13:02:42 crc kubenswrapper[4962]: I1003 13:02:42.761022 4962 scope.go:117] "RemoveContainer" containerID="3a01873dec59adafe357415b44c295ab8a8aaaed2d18a8e6b85a7bb8ca67509f" Oct 03 13:02:42 crc kubenswrapper[4962]: I1003 13:02:42.760623 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6cv8l" podStartSLOduration=2.860697579 podStartE2EDuration="6.76060667s" podCreationTimestamp="2025-10-03 13:02:36 +0000 UTC" firstStartedPulling="2025-10-03 13:02:37.708026582 +0000 UTC m=+766.111924417" lastFinishedPulling="2025-10-03 13:02:41.607935673 +0000 UTC m=+770.011833508" observedRunningTime="2025-10-03 13:02:42.756987327 +0000 UTC m=+771.160885162" watchObservedRunningTime="2025-10-03 13:02:42.76060667 +0000 UTC m=+771.164504505" Oct 03 13:02:42 crc kubenswrapper[4962]: I1003 13:02:42.773711 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2z5bv"] Oct 03 13:02:42 crc kubenswrapper[4962]: I1003 13:02:42.775796 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2z5bv"] Oct 03 13:02:42 crc kubenswrapper[4962]: I1003 13:02:42.798927 4962 scope.go:117] "RemoveContainer" containerID="baf9e8456c00e412ca83ab464e8c70091188ef10325a242538314edfdc134f84" Oct 03 13:02:43 crc kubenswrapper[4962]: I1003 13:02:43.366044 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w5xcn"] Oct 03 13:02:43 crc kubenswrapper[4962]: I1003 13:02:43.366277 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-w5xcn" podUID="e89189cb-39da-4eac-a4be-c303b37b8447" containerName="registry-server" containerID="cri-o://61667a069844d85558c9a327efc65687f36671287e1c0e838248c8d0f6f6559b" gracePeriod=2 Oct 03 13:02:43 crc kubenswrapper[4962]: I1003 13:02:43.716391 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w5xcn" Oct 03 13:02:43 crc kubenswrapper[4962]: I1003 13:02:43.747482 4962 generic.go:334] "Generic (PLEG): container finished" podID="e89189cb-39da-4eac-a4be-c303b37b8447" containerID="61667a069844d85558c9a327efc65687f36671287e1c0e838248c8d0f6f6559b" exitCode=0 Oct 03 13:02:43 crc kubenswrapper[4962]: I1003 13:02:43.747577 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5xcn" event={"ID":"e89189cb-39da-4eac-a4be-c303b37b8447","Type":"ContainerDied","Data":"61667a069844d85558c9a327efc65687f36671287e1c0e838248c8d0f6f6559b"} Oct 03 13:02:43 crc kubenswrapper[4962]: I1003 13:02:43.747601 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5xcn" event={"ID":"e89189cb-39da-4eac-a4be-c303b37b8447","Type":"ContainerDied","Data":"0e769749ebab05aea9e36c7ee193d834b40c2328c11630b3c06bb538b8db7c04"} Oct 03 13:02:43 crc kubenswrapper[4962]: I1003 13:02:43.747619 4962 scope.go:117] "RemoveContainer" containerID="61667a069844d85558c9a327efc65687f36671287e1c0e838248c8d0f6f6559b" Oct 03 13:02:43 crc kubenswrapper[4962]: I1003 13:02:43.747716 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w5xcn" Oct 03 13:02:43 crc kubenswrapper[4962]: I1003 13:02:43.767748 4962 scope.go:117] "RemoveContainer" containerID="19a8c1ce06dd54c347763b71dc7fdc512ccca20a3f6687252a3e39ce75b2f43b" Oct 03 13:02:43 crc kubenswrapper[4962]: I1003 13:02:43.782605 4962 scope.go:117] "RemoveContainer" containerID="a673b5aa68dbe083755447e966b180f4c5fee3768c5eb30788682f62e12307bf" Oct 03 13:02:43 crc kubenswrapper[4962]: I1003 13:02:43.802426 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e89189cb-39da-4eac-a4be-c303b37b8447-catalog-content\") pod \"e89189cb-39da-4eac-a4be-c303b37b8447\" (UID: \"e89189cb-39da-4eac-a4be-c303b37b8447\") " Oct 03 13:02:43 crc kubenswrapper[4962]: I1003 13:02:43.802567 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e89189cb-39da-4eac-a4be-c303b37b8447-utilities\") pod \"e89189cb-39da-4eac-a4be-c303b37b8447\" (UID: \"e89189cb-39da-4eac-a4be-c303b37b8447\") " Oct 03 13:02:43 crc kubenswrapper[4962]: I1003 13:02:43.802590 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwkjd\" (UniqueName: \"kubernetes.io/projected/e89189cb-39da-4eac-a4be-c303b37b8447-kube-api-access-kwkjd\") pod \"e89189cb-39da-4eac-a4be-c303b37b8447\" (UID: \"e89189cb-39da-4eac-a4be-c303b37b8447\") " Oct 03 13:02:43 crc kubenswrapper[4962]: I1003 13:02:43.803482 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e89189cb-39da-4eac-a4be-c303b37b8447-utilities" (OuterVolumeSpecName: "utilities") pod "e89189cb-39da-4eac-a4be-c303b37b8447" (UID: "e89189cb-39da-4eac-a4be-c303b37b8447"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:02:43 crc kubenswrapper[4962]: I1003 13:02:43.803693 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e89189cb-39da-4eac-a4be-c303b37b8447-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 13:02:43 crc kubenswrapper[4962]: I1003 13:02:43.806219 4962 scope.go:117] "RemoveContainer" containerID="61667a069844d85558c9a327efc65687f36671287e1c0e838248c8d0f6f6559b" Oct 03 13:02:43 crc kubenswrapper[4962]: I1003 13:02:43.806710 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e89189cb-39da-4eac-a4be-c303b37b8447-kube-api-access-kwkjd" (OuterVolumeSpecName: "kube-api-access-kwkjd") pod "e89189cb-39da-4eac-a4be-c303b37b8447" (UID: "e89189cb-39da-4eac-a4be-c303b37b8447"). InnerVolumeSpecName "kube-api-access-kwkjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:02:43 crc kubenswrapper[4962]: E1003 13:02:43.806801 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61667a069844d85558c9a327efc65687f36671287e1c0e838248c8d0f6f6559b\": container with ID starting with 61667a069844d85558c9a327efc65687f36671287e1c0e838248c8d0f6f6559b not found: ID does not exist" containerID="61667a069844d85558c9a327efc65687f36671287e1c0e838248c8d0f6f6559b" Oct 03 13:02:43 crc kubenswrapper[4962]: I1003 13:02:43.806829 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61667a069844d85558c9a327efc65687f36671287e1c0e838248c8d0f6f6559b"} err="failed to get container status \"61667a069844d85558c9a327efc65687f36671287e1c0e838248c8d0f6f6559b\": rpc error: code = NotFound desc = could not find container \"61667a069844d85558c9a327efc65687f36671287e1c0e838248c8d0f6f6559b\": container with ID starting with 61667a069844d85558c9a327efc65687f36671287e1c0e838248c8d0f6f6559b not found: ID does not exist" Oct 03 13:02:43 crc kubenswrapper[4962]: I1003 13:02:43.806849 4962 scope.go:117] "RemoveContainer" containerID="19a8c1ce06dd54c347763b71dc7fdc512ccca20a3f6687252a3e39ce75b2f43b" Oct 03 13:02:43 crc kubenswrapper[4962]: E1003 13:02:43.807281 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19a8c1ce06dd54c347763b71dc7fdc512ccca20a3f6687252a3e39ce75b2f43b\": container with ID starting with 19a8c1ce06dd54c347763b71dc7fdc512ccca20a3f6687252a3e39ce75b2f43b not found: ID does not exist" containerID="19a8c1ce06dd54c347763b71dc7fdc512ccca20a3f6687252a3e39ce75b2f43b" Oct 03 13:02:43 crc kubenswrapper[4962]: I1003 13:02:43.807303 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19a8c1ce06dd54c347763b71dc7fdc512ccca20a3f6687252a3e39ce75b2f43b"} err="failed to get container status \"19a8c1ce06dd54c347763b71dc7fdc512ccca20a3f6687252a3e39ce75b2f43b\": rpc error: code = NotFound desc = could not find container \"19a8c1ce06dd54c347763b71dc7fdc512ccca20a3f6687252a3e39ce75b2f43b\": container with ID starting with 19a8c1ce06dd54c347763b71dc7fdc512ccca20a3f6687252a3e39ce75b2f43b not found: ID does not exist" Oct 03 13:02:43 crc kubenswrapper[4962]: I1003 13:02:43.807316 4962 scope.go:117] "RemoveContainer" containerID="a673b5aa68dbe083755447e966b180f4c5fee3768c5eb30788682f62e12307bf" Oct 03 13:02:43 crc kubenswrapper[4962]: E1003 13:02:43.807559 4962 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"a673b5aa68dbe083755447e966b180f4c5fee3768c5eb30788682f62e12307bf\": container with ID starting with a673b5aa68dbe083755447e966b180f4c5fee3768c5eb30788682f62e12307bf not found: ID does not exist" containerID="a673b5aa68dbe083755447e966b180f4c5fee3768c5eb30788682f62e12307bf" Oct 03 13:02:43 crc kubenswrapper[4962]: I1003 13:02:43.807630 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a673b5aa68dbe083755447e966b180f4c5fee3768c5eb30788682f62e12307bf"} err="failed to get container status \"a673b5aa68dbe083755447e966b180f4c5fee3768c5eb30788682f62e12307bf\": rpc error: code = NotFound desc = could not find container \"a673b5aa68dbe083755447e966b180f4c5fee3768c5eb30788682f62e12307bf\": container with ID starting with a673b5aa68dbe083755447e966b180f4c5fee3768c5eb30788682f62e12307bf not found: ID does not exist" Oct 03 13:02:43 crc kubenswrapper[4962]: I1003 13:02:43.886579 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e89189cb-39da-4eac-a4be-c303b37b8447-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e89189cb-39da-4eac-a4be-c303b37b8447" (UID: "e89189cb-39da-4eac-a4be-c303b37b8447"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:02:43 crc kubenswrapper[4962]: I1003 13:02:43.904611 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwkjd\" (UniqueName: \"kubernetes.io/projected/e89189cb-39da-4eac-a4be-c303b37b8447-kube-api-access-kwkjd\") on node \"crc\" DevicePath \"\"" Oct 03 13:02:43 crc kubenswrapper[4962]: I1003 13:02:43.904668 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e89189cb-39da-4eac-a4be-c303b37b8447-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 13:02:44 crc kubenswrapper[4962]: I1003 13:02:44.079535 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w5xcn"] Oct 03 13:02:44 crc kubenswrapper[4962]: I1003 13:02:44.083348 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-w5xcn"] Oct 03 13:02:44 crc kubenswrapper[4962]: I1003 13:02:44.233740 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0f4fa9d-4551-4ba6-9180-17e8e77325fa" path="/var/lib/kubelet/pods/c0f4fa9d-4551-4ba6-9180-17e8e77325fa/volumes" Oct 03 13:02:44 crc kubenswrapper[4962]: I1003 13:02:44.234294 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e89189cb-39da-4eac-a4be-c303b37b8447" path="/var/lib/kubelet/pods/e89189cb-39da-4eac-a4be-c303b37b8447/volumes" Oct 03 13:02:46 crc kubenswrapper[4962]: I1003 13:02:46.689465 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6cv8l" Oct 03 13:02:46 crc kubenswrapper[4962]: I1003 13:02:46.689803 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6cv8l" Oct 03 13:02:46 crc kubenswrapper[4962]: I1003 13:02:46.726473 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6cv8l" Oct 03 13:02:46 crc kubenswrapper[4962]: I1003 13:02:46.795710 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6cv8l" Oct 03 13:02:48 crc 
kubenswrapper[4962]: I1003 13:02:48.386445 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lhnm9"] Oct 03 13:02:48 crc kubenswrapper[4962]: E1003 13:02:48.387043 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0f4fa9d-4551-4ba6-9180-17e8e77325fa" containerName="registry-server" Oct 03 13:02:48 crc kubenswrapper[4962]: I1003 13:02:48.387084 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0f4fa9d-4551-4ba6-9180-17e8e77325fa" containerName="registry-server" Oct 03 13:02:48 crc kubenswrapper[4962]: E1003 13:02:48.387104 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e89189cb-39da-4eac-a4be-c303b37b8447" containerName="registry-server" Oct 03 13:02:48 crc kubenswrapper[4962]: I1003 13:02:48.387113 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="e89189cb-39da-4eac-a4be-c303b37b8447" containerName="registry-server" Oct 03 13:02:48 crc kubenswrapper[4962]: E1003 13:02:48.387124 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0f4fa9d-4551-4ba6-9180-17e8e77325fa" containerName="extract-content" Oct 03 13:02:48 crc kubenswrapper[4962]: I1003 13:02:48.387132 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0f4fa9d-4551-4ba6-9180-17e8e77325fa" containerName="extract-content" Oct 03 13:02:48 crc kubenswrapper[4962]: E1003 13:02:48.387168 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e89189cb-39da-4eac-a4be-c303b37b8447" containerName="extract-utilities" Oct 03 13:02:48 crc kubenswrapper[4962]: I1003 13:02:48.387177 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="e89189cb-39da-4eac-a4be-c303b37b8447" containerName="extract-utilities" Oct 03 13:02:48 crc kubenswrapper[4962]: E1003 13:02:48.387193 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e89189cb-39da-4eac-a4be-c303b37b8447" containerName="extract-content" Oct 03 13:02:48 crc kubenswrapper[4962]: I1003 13:02:48.387202 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="e89189cb-39da-4eac-a4be-c303b37b8447" containerName="extract-content" Oct 03 13:02:48 crc kubenswrapper[4962]: E1003 13:02:48.387242 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0f4fa9d-4551-4ba6-9180-17e8e77325fa" containerName="extract-utilities" Oct 03 13:02:48 crc kubenswrapper[4962]: I1003 13:02:48.387253 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0f4fa9d-4551-4ba6-9180-17e8e77325fa" containerName="extract-utilities" Oct 03 13:02:48 crc kubenswrapper[4962]: I1003 13:02:48.387427 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0f4fa9d-4551-4ba6-9180-17e8e77325fa" containerName="registry-server" Oct 03 13:02:48 crc kubenswrapper[4962]: I1003 13:02:48.387447 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="e89189cb-39da-4eac-a4be-c303b37b8447" containerName="registry-server" Oct 03 13:02:48 crc kubenswrapper[4962]: I1003 13:02:48.388518 4962 util.go:30] "No sandbox for pod can be found. 
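
The cpu_manager/memory_manager burst above runs while admitting community-operators-lhnm9: per-container resource state left behind by the two catalog pods deleted earlier (c0f4fa9d…, e89189cb…) is pruned, one "RemoveStaleState: removing container" / "Deleted CPUSet assignment" pair per stale container. A sketch of that pruning, assuming a plain map rather than the kubelet's checkpointed state files:

```go
package main

import "fmt"

type key struct{ podUID, container string }

// removeStaleState drops per-container resource assignments whose pod
// is no longer active, mirroring the RemoveStaleState /
// "Deleted CPUSet assignment" pairs in the log above.
func removeStaleState(assignments map[key]string, active map[string]bool) {
	for k := range assignments {
		if active[k.podUID] {
			continue
		}
		fmt.Printf("RemoveStaleState: removing container pod=%s name=%s\n", k.podUID, k.container)
		delete(assignments, k)
		fmt.Printf("Deleted CPUSet assignment pod=%s name=%s\n", k.podUID, k.container)
	}
}

func main() {
	assignments := map[key]string{
		{"c0f4fa9d", "registry-server"}: "0-1",
		{"e89189cb", "extract-content"}: "2-3",
	}
	removeStaleState(assignments, map[string]bool{"43d7f543": true})
}
```
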
Need to start a new one" pod="openshift-marketplace/community-operators-lhnm9" Oct 03 13:02:48 crc kubenswrapper[4962]: I1003 13:02:48.396055 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lhnm9"] Oct 03 13:02:48 crc kubenswrapper[4962]: I1003 13:02:48.456826 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43d7f543-4d3a-4dad-b525-7326386fd9ef-catalog-content\") pod \"community-operators-lhnm9\" (UID: \"43d7f543-4d3a-4dad-b525-7326386fd9ef\") " pod="openshift-marketplace/community-operators-lhnm9" Oct 03 13:02:48 crc kubenswrapper[4962]: I1003 13:02:48.457033 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43d7f543-4d3a-4dad-b525-7326386fd9ef-utilities\") pod \"community-operators-lhnm9\" (UID: \"43d7f543-4d3a-4dad-b525-7326386fd9ef\") " pod="openshift-marketplace/community-operators-lhnm9" Oct 03 13:02:48 crc kubenswrapper[4962]: I1003 13:02:48.457137 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sm8w\" (UniqueName: \"kubernetes.io/projected/43d7f543-4d3a-4dad-b525-7326386fd9ef-kube-api-access-2sm8w\") pod \"community-operators-lhnm9\" (UID: \"43d7f543-4d3a-4dad-b525-7326386fd9ef\") " pod="openshift-marketplace/community-operators-lhnm9" Oct 03 13:02:48 crc kubenswrapper[4962]: I1003 13:02:48.558193 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43d7f543-4d3a-4dad-b525-7326386fd9ef-utilities\") pod \"community-operators-lhnm9\" (UID: \"43d7f543-4d3a-4dad-b525-7326386fd9ef\") " pod="openshift-marketplace/community-operators-lhnm9" Oct 03 13:02:48 crc kubenswrapper[4962]: I1003 13:02:48.558260 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sm8w\" (UniqueName: \"kubernetes.io/projected/43d7f543-4d3a-4dad-b525-7326386fd9ef-kube-api-access-2sm8w\") pod \"community-operators-lhnm9\" (UID: \"43d7f543-4d3a-4dad-b525-7326386fd9ef\") " pod="openshift-marketplace/community-operators-lhnm9" Oct 03 13:02:48 crc kubenswrapper[4962]: I1003 13:02:48.558298 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43d7f543-4d3a-4dad-b525-7326386fd9ef-catalog-content\") pod \"community-operators-lhnm9\" (UID: \"43d7f543-4d3a-4dad-b525-7326386fd9ef\") " pod="openshift-marketplace/community-operators-lhnm9" Oct 03 13:02:48 crc kubenswrapper[4962]: I1003 13:02:48.558783 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43d7f543-4d3a-4dad-b525-7326386fd9ef-catalog-content\") pod \"community-operators-lhnm9\" (UID: \"43d7f543-4d3a-4dad-b525-7326386fd9ef\") " pod="openshift-marketplace/community-operators-lhnm9" Oct 03 13:02:48 crc kubenswrapper[4962]: I1003 13:02:48.559045 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43d7f543-4d3a-4dad-b525-7326386fd9ef-utilities\") pod \"community-operators-lhnm9\" (UID: \"43d7f543-4d3a-4dad-b525-7326386fd9ef\") " pod="openshift-marketplace/community-operators-lhnm9" Oct 03 13:02:48 crc kubenswrapper[4962]: I1003 13:02:48.577126 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-2sm8w\" (UniqueName: \"kubernetes.io/projected/43d7f543-4d3a-4dad-b525-7326386fd9ef-kube-api-access-2sm8w\") pod \"community-operators-lhnm9\" (UID: \"43d7f543-4d3a-4dad-b525-7326386fd9ef\") " pod="openshift-marketplace/community-operators-lhnm9" Oct 03 13:02:48 crc kubenswrapper[4962]: I1003 13:02:48.709562 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lhnm9" Oct 03 13:02:49 crc kubenswrapper[4962]: I1003 13:02:49.236555 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lhnm9"] Oct 03 13:02:49 crc kubenswrapper[4962]: W1003 13:02:49.243020 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43d7f543_4d3a_4dad_b525_7326386fd9ef.slice/crio-f0e1c6233847451ac459d4e6e1744b796f340e3567c8a83a2787f37174a32c5d WatchSource:0}: Error finding container f0e1c6233847451ac459d4e6e1744b796f340e3567c8a83a2787f37174a32c5d: Status 404 returned error can't find the container with id f0e1c6233847451ac459d4e6e1744b796f340e3567c8a83a2787f37174a32c5d Oct 03 13:02:49 crc kubenswrapper[4962]: I1003 13:02:49.777200 4962 generic.go:334] "Generic (PLEG): container finished" podID="43d7f543-4d3a-4dad-b525-7326386fd9ef" containerID="4c54d5526ca07e973b5c50454a95cea5e3891f0e1bc3ffab37b1658467c49036" exitCode=0 Oct 03 13:02:49 crc kubenswrapper[4962]: I1003 13:02:49.777255 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lhnm9" event={"ID":"43d7f543-4d3a-4dad-b525-7326386fd9ef","Type":"ContainerDied","Data":"4c54d5526ca07e973b5c50454a95cea5e3891f0e1bc3ffab37b1658467c49036"} Oct 03 13:02:49 crc kubenswrapper[4962]: I1003 13:02:49.777279 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lhnm9" event={"ID":"43d7f543-4d3a-4dad-b525-7326386fd9ef","Type":"ContainerStarted","Data":"f0e1c6233847451ac459d4e6e1744b796f340e3567c8a83a2787f37174a32c5d"} Oct 03 13:02:50 crc kubenswrapper[4962]: I1003 13:02:50.765657 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6cv8l"] Oct 03 13:02:50 crc kubenswrapper[4962]: I1003 13:02:50.766159 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6cv8l" podUID="55525ea2-d968-432b-b6fc-d4630a9e843a" containerName="registry-server" containerID="cri-o://05830daf287ec33fbdae537c44feea0ab7dfa00b3d19848faebe35ee1855c984" gracePeriod=2 Oct 03 13:02:50 crc kubenswrapper[4962]: I1003 13:02:50.785516 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lhnm9" event={"ID":"43d7f543-4d3a-4dad-b525-7326386fd9ef","Type":"ContainerStarted","Data":"e0e3ae7a07abd2e275f68372d691cd40775cae711b4503607462e2d25aa16bb2"} Oct 03 13:02:51 crc kubenswrapper[4962]: I1003 13:02:51.126916 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6cv8l" Oct 03 13:02:51 crc kubenswrapper[4962]: I1003 13:02:51.188384 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55525ea2-d968-432b-b6fc-d4630a9e843a-utilities\") pod \"55525ea2-d968-432b-b6fc-d4630a9e843a\" (UID: \"55525ea2-d968-432b-b6fc-d4630a9e843a\") " Oct 03 13:02:51 crc kubenswrapper[4962]: I1003 13:02:51.188510 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpkf2\" (UniqueName: \"kubernetes.io/projected/55525ea2-d968-432b-b6fc-d4630a9e843a-kube-api-access-cpkf2\") pod \"55525ea2-d968-432b-b6fc-d4630a9e843a\" (UID: \"55525ea2-d968-432b-b6fc-d4630a9e843a\") " Oct 03 13:02:51 crc kubenswrapper[4962]: I1003 13:02:51.188551 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55525ea2-d968-432b-b6fc-d4630a9e843a-catalog-content\") pod \"55525ea2-d968-432b-b6fc-d4630a9e843a\" (UID: \"55525ea2-d968-432b-b6fc-d4630a9e843a\") " Oct 03 13:02:51 crc kubenswrapper[4962]: I1003 13:02:51.203555 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55525ea2-d968-432b-b6fc-d4630a9e843a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "55525ea2-d968-432b-b6fc-d4630a9e843a" (UID: "55525ea2-d968-432b-b6fc-d4630a9e843a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:02:51 crc kubenswrapper[4962]: I1003 13:02:51.204493 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55525ea2-d968-432b-b6fc-d4630a9e843a-kube-api-access-cpkf2" (OuterVolumeSpecName: "kube-api-access-cpkf2") pod "55525ea2-d968-432b-b6fc-d4630a9e843a" (UID: "55525ea2-d968-432b-b6fc-d4630a9e843a"). InnerVolumeSpecName "kube-api-access-cpkf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:02:51 crc kubenswrapper[4962]: I1003 13:02:51.206694 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55525ea2-d968-432b-b6fc-d4630a9e843a-utilities" (OuterVolumeSpecName: "utilities") pod "55525ea2-d968-432b-b6fc-d4630a9e843a" (UID: "55525ea2-d968-432b-b6fc-d4630a9e843a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:02:51 crc kubenswrapper[4962]: I1003 13:02:51.292616 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55525ea2-d968-432b-b6fc-d4630a9e843a-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 13:02:51 crc kubenswrapper[4962]: I1003 13:02:51.292686 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpkf2\" (UniqueName: \"kubernetes.io/projected/55525ea2-d968-432b-b6fc-d4630a9e843a-kube-api-access-cpkf2\") on node \"crc\" DevicePath \"\"" Oct 03 13:02:51 crc kubenswrapper[4962]: I1003 13:02:51.292698 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55525ea2-d968-432b-b6fc-d4630a9e843a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 13:02:51 crc kubenswrapper[4962]: I1003 13:02:51.792591 4962 generic.go:334] "Generic (PLEG): container finished" podID="55525ea2-d968-432b-b6fc-d4630a9e843a" containerID="05830daf287ec33fbdae537c44feea0ab7dfa00b3d19848faebe35ee1855c984" exitCode=0 Oct 03 13:02:51 crc kubenswrapper[4962]: I1003 13:02:51.792692 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6cv8l" Oct 03 13:02:51 crc kubenswrapper[4962]: I1003 13:02:51.792745 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6cv8l" event={"ID":"55525ea2-d968-432b-b6fc-d4630a9e843a","Type":"ContainerDied","Data":"05830daf287ec33fbdae537c44feea0ab7dfa00b3d19848faebe35ee1855c984"} Oct 03 13:02:51 crc kubenswrapper[4962]: I1003 13:02:51.792820 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6cv8l" event={"ID":"55525ea2-d968-432b-b6fc-d4630a9e843a","Type":"ContainerDied","Data":"bacede4c6aa394286d1ee451d21e8c0662b3c88e15ef81ebf1d3b77fc111a52b"} Oct 03 13:02:51 crc kubenswrapper[4962]: I1003 13:02:51.792856 4962 scope.go:117] "RemoveContainer" containerID="05830daf287ec33fbdae537c44feea0ab7dfa00b3d19848faebe35ee1855c984" Oct 03 13:02:51 crc kubenswrapper[4962]: I1003 13:02:51.795438 4962 generic.go:334] "Generic (PLEG): container finished" podID="43d7f543-4d3a-4dad-b525-7326386fd9ef" containerID="e0e3ae7a07abd2e275f68372d691cd40775cae711b4503607462e2d25aa16bb2" exitCode=0 Oct 03 13:02:51 crc kubenswrapper[4962]: I1003 13:02:51.795473 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lhnm9" event={"ID":"43d7f543-4d3a-4dad-b525-7326386fd9ef","Type":"ContainerDied","Data":"e0e3ae7a07abd2e275f68372d691cd40775cae711b4503607462e2d25aa16bb2"} Oct 03 13:02:51 crc kubenswrapper[4962]: I1003 13:02:51.814372 4962 scope.go:117] "RemoveContainer" containerID="602b8d2d3759a06d07fb04559eebb7a1afbe09e16d7bcc43a020aa958aec8758" Oct 03 13:02:51 crc kubenswrapper[4962]: I1003 13:02:51.832290 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6cv8l"] Oct 03 13:02:51 crc kubenswrapper[4962]: I1003 13:02:51.835947 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6cv8l"] Oct 03 13:02:51 crc kubenswrapper[4962]: I1003 13:02:51.850596 4962 scope.go:117] "RemoveContainer" containerID="f256517557cf2fabc0d1b7053ddfc7a84a2e8447c724de622af9c3c9b815af93" Oct 03 13:02:51 crc kubenswrapper[4962]: I1003 13:02:51.863035 4962 scope.go:117] "RemoveContainer" 
containerID="05830daf287ec33fbdae537c44feea0ab7dfa00b3d19848faebe35ee1855c984" Oct 03 13:02:51 crc kubenswrapper[4962]: E1003 13:02:51.863438 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05830daf287ec33fbdae537c44feea0ab7dfa00b3d19848faebe35ee1855c984\": container with ID starting with 05830daf287ec33fbdae537c44feea0ab7dfa00b3d19848faebe35ee1855c984 not found: ID does not exist" containerID="05830daf287ec33fbdae537c44feea0ab7dfa00b3d19848faebe35ee1855c984" Oct 03 13:02:51 crc kubenswrapper[4962]: I1003 13:02:51.863486 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05830daf287ec33fbdae537c44feea0ab7dfa00b3d19848faebe35ee1855c984"} err="failed to get container status \"05830daf287ec33fbdae537c44feea0ab7dfa00b3d19848faebe35ee1855c984\": rpc error: code = NotFound desc = could not find container \"05830daf287ec33fbdae537c44feea0ab7dfa00b3d19848faebe35ee1855c984\": container with ID starting with 05830daf287ec33fbdae537c44feea0ab7dfa00b3d19848faebe35ee1855c984 not found: ID does not exist" Oct 03 13:02:51 crc kubenswrapper[4962]: I1003 13:02:51.863516 4962 scope.go:117] "RemoveContainer" containerID="602b8d2d3759a06d07fb04559eebb7a1afbe09e16d7bcc43a020aa958aec8758" Oct 03 13:02:51 crc kubenswrapper[4962]: E1003 13:02:51.864008 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"602b8d2d3759a06d07fb04559eebb7a1afbe09e16d7bcc43a020aa958aec8758\": container with ID starting with 602b8d2d3759a06d07fb04559eebb7a1afbe09e16d7bcc43a020aa958aec8758 not found: ID does not exist" containerID="602b8d2d3759a06d07fb04559eebb7a1afbe09e16d7bcc43a020aa958aec8758" Oct 03 13:02:51 crc kubenswrapper[4962]: I1003 13:02:51.864062 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"602b8d2d3759a06d07fb04559eebb7a1afbe09e16d7bcc43a020aa958aec8758"} err="failed to get container status \"602b8d2d3759a06d07fb04559eebb7a1afbe09e16d7bcc43a020aa958aec8758\": rpc error: code = NotFound desc = could not find container \"602b8d2d3759a06d07fb04559eebb7a1afbe09e16d7bcc43a020aa958aec8758\": container with ID starting with 602b8d2d3759a06d07fb04559eebb7a1afbe09e16d7bcc43a020aa958aec8758 not found: ID does not exist" Oct 03 13:02:51 crc kubenswrapper[4962]: I1003 13:02:51.864086 4962 scope.go:117] "RemoveContainer" containerID="f256517557cf2fabc0d1b7053ddfc7a84a2e8447c724de622af9c3c9b815af93" Oct 03 13:02:51 crc kubenswrapper[4962]: E1003 13:02:51.864399 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f256517557cf2fabc0d1b7053ddfc7a84a2e8447c724de622af9c3c9b815af93\": container with ID starting with f256517557cf2fabc0d1b7053ddfc7a84a2e8447c724de622af9c3c9b815af93 not found: ID does not exist" containerID="f256517557cf2fabc0d1b7053ddfc7a84a2e8447c724de622af9c3c9b815af93" Oct 03 13:02:51 crc kubenswrapper[4962]: I1003 13:02:51.864419 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f256517557cf2fabc0d1b7053ddfc7a84a2e8447c724de622af9c3c9b815af93"} err="failed to get container status \"f256517557cf2fabc0d1b7053ddfc7a84a2e8447c724de622af9c3c9b815af93\": rpc error: code = NotFound desc = could not find container \"f256517557cf2fabc0d1b7053ddfc7a84a2e8447c724de622af9c3c9b815af93\": container with ID starting with 
Oct 03 13:02:52 crc kubenswrapper[4962]: I1003 13:02:52.234266 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55525ea2-d968-432b-b6fc-d4630a9e843a" path="/var/lib/kubelet/pods/55525ea2-d968-432b-b6fc-d4630a9e843a/volumes" Oct 03 13:02:52 crc kubenswrapper[4962]: I1003 13:02:52.803721 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lhnm9" event={"ID":"43d7f543-4d3a-4dad-b525-7326386fd9ef","Type":"ContainerStarted","Data":"ef971a9e7e48e13bc9533cad4c0fc6f1c04dc768f8f9d3ecaf0ec55d5b4434e5"} Oct 03 13:02:52 crc kubenswrapper[4962]: I1003 13:02:52.819316 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lhnm9" podStartSLOduration=2.2677069530000002 podStartE2EDuration="4.819296511s" podCreationTimestamp="2025-10-03 13:02:48 +0000 UTC" firstStartedPulling="2025-10-03 13:02:49.778727904 +0000 UTC m=+778.182625739" lastFinishedPulling="2025-10-03 13:02:52.330317462 +0000 UTC m=+780.734215297" observedRunningTime="2025-10-03 13:02:52.816722285 +0000 UTC m=+781.220620120" watchObservedRunningTime="2025-10-03 13:02:52.819296511 +0000 UTC m=+781.223194346" Oct 03 13:02:58 crc kubenswrapper[4962]: I1003 13:02:58.710057 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lhnm9" Oct 03 13:02:58 crc kubenswrapper[4962]: I1003 13:02:58.711084 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lhnm9" Oct 03 13:02:58 crc kubenswrapper[4962]: I1003 13:02:58.746411 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lhnm9" Oct 03 13:02:58 crc kubenswrapper[4962]: I1003 13:02:58.861737 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lhnm9" Oct 03 13:02:58 crc kubenswrapper[4962]: I1003 13:02:58.973618 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lhnm9"] Oct 03 13:03:00 crc kubenswrapper[4962]: I1003 13:03:00.838565 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lhnm9" podUID="43d7f543-4d3a-4dad-b525-7326386fd9ef" containerName="registry-server" containerID="cri-o://ef971a9e7e48e13bc9533cad4c0fc6f1c04dc768f8f9d3ecaf0ec55d5b4434e5" gracePeriod=2 Oct 03 13:03:01 crc kubenswrapper[4962]: I1003 13:03:01.185336 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lhnm9"
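
The pod_startup_latency_tracker entry above is internally consistent: podStartE2EDuration is the observed running time minus podCreationTimestamp (13:02:52.819296511 - 13:02:48 = 4.819296511s), and podStartSLOduration additionally excludes the image-pull window lastFinishedPulling - firstStartedPulling (2.551589558s), leaving 2.267706953s. A quick check of that arithmetic, dropping the monotonic " m=+..." suffixes from the logged timestamps:

    package main

    import (
        "fmt"
        "time"
    )

    // Reproduces the arithmetic behind the "Observed pod startup duration"
    // entry: the SLO duration is end-to-end startup time minus the image
    // pull window, which is not counted against the pod.
    func main() {
        const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
        parse := func(s string) time.Time {
            t, err := time.Parse(layout, s)
            if err != nil {
                panic(err)
            }
            return t
        }
        created := parse("2025-10-03 13:02:48 +0000 UTC")
        firstPull := parse("2025-10-03 13:02:49.778727904 +0000 UTC")
        lastPull := parse("2025-10-03 13:02:52.330317462 +0000 UTC")
        running := parse("2025-10-03 13:02:52.819296511 +0000 UTC")

        e2e := running.Sub(created)
        slo := e2e - lastPull.Sub(firstPull)
        fmt.Println("podStartE2EDuration:", e2e) // 4.819296511s
        fmt.Println("podStartSLOduration:", slo) // 2.267706953s
    }
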
Oct 03 13:03:01 crc kubenswrapper[4962]: I1003 13:03:01.321867 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43d7f543-4d3a-4dad-b525-7326386fd9ef-utilities\") pod \"43d7f543-4d3a-4dad-b525-7326386fd9ef\" (UID: \"43d7f543-4d3a-4dad-b525-7326386fd9ef\") " Oct 03 13:03:01 crc kubenswrapper[4962]: I1003 13:03:01.321913 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43d7f543-4d3a-4dad-b525-7326386fd9ef-catalog-content\") pod \"43d7f543-4d3a-4dad-b525-7326386fd9ef\" (UID: \"43d7f543-4d3a-4dad-b525-7326386fd9ef\") " Oct 03 13:03:01 crc kubenswrapper[4962]: I1003 13:03:01.321965 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2sm8w\" (UniqueName: \"kubernetes.io/projected/43d7f543-4d3a-4dad-b525-7326386fd9ef-kube-api-access-2sm8w\") pod \"43d7f543-4d3a-4dad-b525-7326386fd9ef\" (UID: \"43d7f543-4d3a-4dad-b525-7326386fd9ef\") " Oct 03 13:03:01 crc kubenswrapper[4962]: I1003 13:03:01.322611 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43d7f543-4d3a-4dad-b525-7326386fd9ef-utilities" (OuterVolumeSpecName: "utilities") pod "43d7f543-4d3a-4dad-b525-7326386fd9ef" (UID: "43d7f543-4d3a-4dad-b525-7326386fd9ef"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:03:01 crc kubenswrapper[4962]: I1003 13:03:01.329677 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43d7f543-4d3a-4dad-b525-7326386fd9ef-kube-api-access-2sm8w" (OuterVolumeSpecName: "kube-api-access-2sm8w") pod "43d7f543-4d3a-4dad-b525-7326386fd9ef" (UID: "43d7f543-4d3a-4dad-b525-7326386fd9ef"). InnerVolumeSpecName "kube-api-access-2sm8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:03:01 crc kubenswrapper[4962]: I1003 13:03:01.380888 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43d7f543-4d3a-4dad-b525-7326386fd9ef-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "43d7f543-4d3a-4dad-b525-7326386fd9ef" (UID: "43d7f543-4d3a-4dad-b525-7326386fd9ef"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:03:01 crc kubenswrapper[4962]: I1003 13:03:01.423771 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43d7f543-4d3a-4dad-b525-7326386fd9ef-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 13:03:01 crc kubenswrapper[4962]: I1003 13:03:01.423807 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43d7f543-4d3a-4dad-b525-7326386fd9ef-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 13:03:01 crc kubenswrapper[4962]: I1003 13:03:01.423823 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2sm8w\" (UniqueName: \"kubernetes.io/projected/43d7f543-4d3a-4dad-b525-7326386fd9ef-kube-api-access-2sm8w\") on node \"crc\" DevicePath \"\"" Oct 03 13:03:01 crc kubenswrapper[4962]: I1003 13:03:01.844717 4962 generic.go:334] "Generic (PLEG): container finished" podID="43d7f543-4d3a-4dad-b525-7326386fd9ef" containerID="ef971a9e7e48e13bc9533cad4c0fc6f1c04dc768f8f9d3ecaf0ec55d5b4434e5" exitCode=0 Oct 03 13:03:01 crc kubenswrapper[4962]: I1003 13:03:01.844767 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lhnm9" event={"ID":"43d7f543-4d3a-4dad-b525-7326386fd9ef","Type":"ContainerDied","Data":"ef971a9e7e48e13bc9533cad4c0fc6f1c04dc768f8f9d3ecaf0ec55d5b4434e5"} Oct 03 13:03:01 crc kubenswrapper[4962]: I1003 13:03:01.844803 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lhnm9" event={"ID":"43d7f543-4d3a-4dad-b525-7326386fd9ef","Type":"ContainerDied","Data":"f0e1c6233847451ac459d4e6e1744b796f340e3567c8a83a2787f37174a32c5d"} Oct 03 13:03:01 crc kubenswrapper[4962]: I1003 13:03:01.844820 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lhnm9" Oct 03 13:03:01 crc kubenswrapper[4962]: I1003 13:03:01.844824 4962 scope.go:117] "RemoveContainer" containerID="ef971a9e7e48e13bc9533cad4c0fc6f1c04dc768f8f9d3ecaf0ec55d5b4434e5" Oct 03 13:03:01 crc kubenswrapper[4962]: I1003 13:03:01.869353 4962 scope.go:117] "RemoveContainer" containerID="e0e3ae7a07abd2e275f68372d691cd40775cae711b4503607462e2d25aa16bb2" Oct 03 13:03:01 crc kubenswrapper[4962]: I1003 13:03:01.871435 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lhnm9"] Oct 03 13:03:01 crc kubenswrapper[4962]: I1003 13:03:01.873989 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lhnm9"] Oct 03 13:03:01 crc kubenswrapper[4962]: I1003 13:03:01.896370 4962 scope.go:117] "RemoveContainer" containerID="4c54d5526ca07e973b5c50454a95cea5e3891f0e1bc3ffab37b1658467c49036" Oct 03 13:03:01 crc kubenswrapper[4962]: I1003 13:03:01.909872 4962 scope.go:117] "RemoveContainer" containerID="ef971a9e7e48e13bc9533cad4c0fc6f1c04dc768f8f9d3ecaf0ec55d5b4434e5" Oct 03 13:03:01 crc kubenswrapper[4962]: E1003 13:03:01.910171 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef971a9e7e48e13bc9533cad4c0fc6f1c04dc768f8f9d3ecaf0ec55d5b4434e5\": container with ID starting with ef971a9e7e48e13bc9533cad4c0fc6f1c04dc768f8f9d3ecaf0ec55d5b4434e5 not found: ID does not exist" containerID="ef971a9e7e48e13bc9533cad4c0fc6f1c04dc768f8f9d3ecaf0ec55d5b4434e5" Oct 03 13:03:01 crc kubenswrapper[4962]: I1003 13:03:01.910201 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef971a9e7e48e13bc9533cad4c0fc6f1c04dc768f8f9d3ecaf0ec55d5b4434e5"} err="failed to get container status \"ef971a9e7e48e13bc9533cad4c0fc6f1c04dc768f8f9d3ecaf0ec55d5b4434e5\": rpc error: code = NotFound desc = could not find container \"ef971a9e7e48e13bc9533cad4c0fc6f1c04dc768f8f9d3ecaf0ec55d5b4434e5\": container with ID starting with ef971a9e7e48e13bc9533cad4c0fc6f1c04dc768f8f9d3ecaf0ec55d5b4434e5 not found: ID does not exist" Oct 03 13:03:01 crc kubenswrapper[4962]: I1003 13:03:01.910222 4962 scope.go:117] "RemoveContainer" containerID="e0e3ae7a07abd2e275f68372d691cd40775cae711b4503607462e2d25aa16bb2" Oct 03 13:03:01 crc kubenswrapper[4962]: E1003 13:03:01.910378 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0e3ae7a07abd2e275f68372d691cd40775cae711b4503607462e2d25aa16bb2\": container with ID starting with e0e3ae7a07abd2e275f68372d691cd40775cae711b4503607462e2d25aa16bb2 not found: ID does not exist" containerID="e0e3ae7a07abd2e275f68372d691cd40775cae711b4503607462e2d25aa16bb2" Oct 03 13:03:01 crc kubenswrapper[4962]: I1003 13:03:01.910400 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0e3ae7a07abd2e275f68372d691cd40775cae711b4503607462e2d25aa16bb2"} err="failed to get container status \"e0e3ae7a07abd2e275f68372d691cd40775cae711b4503607462e2d25aa16bb2\": rpc error: code = NotFound desc = could not find container \"e0e3ae7a07abd2e275f68372d691cd40775cae711b4503607462e2d25aa16bb2\": container with ID starting with e0e3ae7a07abd2e275f68372d691cd40775cae711b4503607462e2d25aa16bb2 not found: ID does not exist" Oct 03 13:03:01 crc kubenswrapper[4962]: I1003 13:03:01.910417 4962 scope.go:117] "RemoveContainer" 
containerID="4c54d5526ca07e973b5c50454a95cea5e3891f0e1bc3ffab37b1658467c49036" Oct 03 13:03:01 crc kubenswrapper[4962]: E1003 13:03:01.910792 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c54d5526ca07e973b5c50454a95cea5e3891f0e1bc3ffab37b1658467c49036\": container with ID starting with 4c54d5526ca07e973b5c50454a95cea5e3891f0e1bc3ffab37b1658467c49036 not found: ID does not exist" containerID="4c54d5526ca07e973b5c50454a95cea5e3891f0e1bc3ffab37b1658467c49036" Oct 03 13:03:01 crc kubenswrapper[4962]: I1003 13:03:01.910814 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c54d5526ca07e973b5c50454a95cea5e3891f0e1bc3ffab37b1658467c49036"} err="failed to get container status \"4c54d5526ca07e973b5c50454a95cea5e3891f0e1bc3ffab37b1658467c49036\": rpc error: code = NotFound desc = could not find container \"4c54d5526ca07e973b5c50454a95cea5e3891f0e1bc3ffab37b1658467c49036\": container with ID starting with 4c54d5526ca07e973b5c50454a95cea5e3891f0e1bc3ffab37b1658467c49036 not found: ID does not exist" Oct 03 13:03:02 crc kubenswrapper[4962]: I1003 13:03:02.237190 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43d7f543-4d3a-4dad-b525-7326386fd9ef" path="/var/lib/kubelet/pods/43d7f543-4d3a-4dad-b525-7326386fd9ef/volumes" Oct 03 13:04:24 crc kubenswrapper[4962]: I1003 13:04:24.659426 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 13:04:24 crc kubenswrapper[4962]: I1003 13:04:24.660209 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 13:04:54 crc kubenswrapper[4962]: I1003 13:04:54.660529 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 13:04:54 crc kubenswrapper[4962]: I1003 13:04:54.661034 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 13:05:24 crc kubenswrapper[4962]: I1003 13:05:24.659299 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 13:05:24 crc kubenswrapper[4962]: I1003 13:05:24.659845 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 13:05:24 crc kubenswrapper[4962]: I1003 13:05:24.659885 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-46vck" Oct 03 13:05:24 crc kubenswrapper[4962]: I1003 13:05:24.660312 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"076795ece51777028d4903d02e01c23aad08fd0c510374d97a2d753df68d0eea"} pod="openshift-machine-config-operator/machine-config-daemon-46vck" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 13:05:24 crc kubenswrapper[4962]: I1003 13:05:24.660358 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" containerID="cri-o://076795ece51777028d4903d02e01c23aad08fd0c510374d97a2d753df68d0eea" gracePeriod=600 Oct 03 13:05:25 crc kubenswrapper[4962]: I1003 13:05:25.547418 4962 generic.go:334] "Generic (PLEG): container finished" podID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerID="076795ece51777028d4903d02e01c23aad08fd0c510374d97a2d753df68d0eea" exitCode=0 Oct 03 13:05:25 crc kubenswrapper[4962]: I1003 13:05:25.547465 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerDied","Data":"076795ece51777028d4903d02e01c23aad08fd0c510374d97a2d753df68d0eea"} Oct 03 13:05:25 crc kubenswrapper[4962]: I1003 13:05:25.547740 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerStarted","Data":"a8662442e8f36173a3b3425f41847fc665cbcd80d634980f74f9a3c41a264cea"} Oct 03 13:05:25 crc kubenswrapper[4962]: I1003 13:05:25.547763 4962 scope.go:117] "RemoveContainer" containerID="1f2cdc13b11da65d799c535afd6e9add75247936ceedca31f891b7fb2d791205" Oct 03 13:05:33 crc kubenswrapper[4962]: I1003 13:05:33.880425 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ksp7d"] Oct 03 13:05:33 crc kubenswrapper[4962]: I1003 13:05:33.881400 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerName="ovn-controller" containerID="cri-o://5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866" gracePeriod=30 Oct 03 13:05:33 crc kubenswrapper[4962]: I1003 13:05:33.881451 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3" gracePeriod=30 Oct 03 13:05:33 crc kubenswrapper[4962]: I1003 13:05:33.881451 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerName="kube-rbac-proxy-node" containerID="cri-o://31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c" gracePeriod=30 Oct 03 13:05:33 crc 
Oct 03 13:05:33 crc kubenswrapper[4962]: I1003 13:05:33.880425 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ksp7d"] Oct 03 13:05:33 crc kubenswrapper[4962]: I1003 13:05:33.881400 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerName="ovn-controller" containerID="cri-o://5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866" gracePeriod=30 Oct 03 13:05:33 crc kubenswrapper[4962]: I1003 13:05:33.881451 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3" gracePeriod=30 Oct 03 13:05:33 crc kubenswrapper[4962]: I1003 13:05:33.881451 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerName="kube-rbac-proxy-node" containerID="cri-o://31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c" gracePeriod=30 Oct 03 13:05:33 crc kubenswrapper[4962]: I1003 13:05:33.881491 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerName="sbdb" containerID="cri-o://daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1" gracePeriod=30 Oct 03 13:05:33 crc kubenswrapper[4962]: I1003 13:05:33.881539 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerName="ovn-acl-logging" containerID="cri-o://f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1" gracePeriod=30 Oct 03 13:05:33 crc kubenswrapper[4962]: I1003 13:05:33.881436 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerName="nbdb" containerID="cri-o://2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47" gracePeriod=30 Oct 03 13:05:33 crc kubenswrapper[4962]: I1003 13:05:33.881571 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerName="northd" containerID="cri-o://0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56" gracePeriod=30 Oct 03 13:05:33 crc kubenswrapper[4962]: I1003 13:05:33.908512 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerName="ovnkube-controller" containerID="cri-o://5fb6150ea04e5f5b54cf103491ae188a7ea9c636e9761537cac84914cba53b6a" gracePeriod=30 Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.283109 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ksp7d_90186d9d-0ac4-4959-9fd8-b044098dc6ae/ovnkube-controller/3.log" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.285512 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ksp7d_90186d9d-0ac4-4959-9fd8-b044098dc6ae/ovn-acl-logging/0.log" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.286082 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ksp7d_90186d9d-0ac4-4959-9fd8-b044098dc6ae/ovn-controller/0.log" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.286626 4962 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.335441 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-tlkr9"] Oct 03 13:05:34 crc kubenswrapper[4962]: E1003 13:05:34.335624 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerName="kube-rbac-proxy-ovn-metrics" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.335649 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerName="kube-rbac-proxy-ovn-metrics" Oct 03 13:05:34 crc kubenswrapper[4962]: E1003 13:05:34.335660 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerName="ovnkube-controller" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.335665 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerName="ovnkube-controller" Oct 03 13:05:34 crc kubenswrapper[4962]: E1003 13:05:34.335674 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerName="ovnkube-controller" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.335680 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerName="ovnkube-controller" Oct 03 13:05:34 crc kubenswrapper[4962]: E1003 13:05:34.335686 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerName="ovn-controller" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.335691 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerName="ovn-controller" Oct 03 13:05:34 crc kubenswrapper[4962]: E1003 13:05:34.335702 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerName="ovnkube-controller" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.335707 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerName="ovnkube-controller" Oct 03 13:05:34 crc kubenswrapper[4962]: E1003 13:05:34.335718 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerName="ovn-acl-logging" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.335726 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerName="ovn-acl-logging" Oct 03 13:05:34 crc kubenswrapper[4962]: E1003 13:05:34.335742 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerName="sbdb" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.335749 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerName="sbdb" Oct 03 13:05:34 crc kubenswrapper[4962]: E1003 13:05:34.335757 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43d7f543-4d3a-4dad-b525-7326386fd9ef" containerName="extract-utilities" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.335765 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="43d7f543-4d3a-4dad-b525-7326386fd9ef" containerName="extract-utilities" Oct 03 13:05:34 crc kubenswrapper[4962]: E1003 13:05:34.335776 4962 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerName="kube-rbac-proxy-node" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.335783 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerName="kube-rbac-proxy-node" Oct 03 13:05:34 crc kubenswrapper[4962]: E1003 13:05:34.335791 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerName="kubecfg-setup" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.335814 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerName="kubecfg-setup" Oct 03 13:05:34 crc kubenswrapper[4962]: E1003 13:05:34.335824 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55525ea2-d968-432b-b6fc-d4630a9e843a" containerName="extract-utilities" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.335831 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="55525ea2-d968-432b-b6fc-d4630a9e843a" containerName="extract-utilities" Oct 03 13:05:34 crc kubenswrapper[4962]: E1003 13:05:34.335840 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55525ea2-d968-432b-b6fc-d4630a9e843a" containerName="extract-content" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.335846 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="55525ea2-d968-432b-b6fc-d4630a9e843a" containerName="extract-content" Oct 03 13:05:34 crc kubenswrapper[4962]: E1003 13:05:34.335853 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55525ea2-d968-432b-b6fc-d4630a9e843a" containerName="registry-server" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.335862 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="55525ea2-d968-432b-b6fc-d4630a9e843a" containerName="registry-server" Oct 03 13:05:34 crc kubenswrapper[4962]: E1003 13:05:34.335875 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerName="nbdb" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.335882 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerName="nbdb" Oct 03 13:05:34 crc kubenswrapper[4962]: E1003 13:05:34.335891 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43d7f543-4d3a-4dad-b525-7326386fd9ef" containerName="registry-server" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.335899 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="43d7f543-4d3a-4dad-b525-7326386fd9ef" containerName="registry-server" Oct 03 13:05:34 crc kubenswrapper[4962]: E1003 13:05:34.335911 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43d7f543-4d3a-4dad-b525-7326386fd9ef" containerName="extract-content" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.335919 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="43d7f543-4d3a-4dad-b525-7326386fd9ef" containerName="extract-content" Oct 03 13:05:34 crc kubenswrapper[4962]: E1003 13:05:34.335928 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerName="northd" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.335933 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerName="northd" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.336019 4962 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerName="ovnkube-controller" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.336028 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerName="ovnkube-controller" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.336036 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerName="nbdb" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.336044 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerName="sbdb" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.336052 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerName="ovn-controller" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.336059 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerName="ovnkube-controller" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.336066 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerName="kube-rbac-proxy-node" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.336073 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerName="northd" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.336078 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="43d7f543-4d3a-4dad-b525-7326386fd9ef" containerName="registry-server" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.336087 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerName="ovnkube-controller" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.336093 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="55525ea2-d968-432b-b6fc-d4630a9e843a" containerName="registry-server" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.336101 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerName="kube-rbac-proxy-ovn-metrics" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.336108 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerName="ovn-acl-logging" Oct 03 13:05:34 crc kubenswrapper[4962]: E1003 13:05:34.336182 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerName="ovnkube-controller" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.336189 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerName="ovnkube-controller" Oct 03 13:05:34 crc kubenswrapper[4962]: E1003 13:05:34.336198 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerName="ovnkube-controller" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.336207 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerName="ovnkube-controller" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.336315 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerName="ovnkube-controller" Oct 03 13:05:34 crc 
kubenswrapper[4962]: I1003 13:05:34.337767 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.464778 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-systemd-units\") pod \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.464817 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-host-cni-bin\") pod \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.464848 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/90186d9d-0ac4-4959-9fd8-b044098dc6ae-ovnkube-config\") pod \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.464879 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/90186d9d-0ac4-4959-9fd8-b044098dc6ae-ovn-node-metrics-cert\") pod \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.464881 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "90186d9d-0ac4-4959-9fd8-b044098dc6ae" (UID: "90186d9d-0ac4-4959-9fd8-b044098dc6ae"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.464914 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-host-run-netns\") pod \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.464933 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-host-run-ovn-kubernetes\") pod \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.464960 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-host-kubelet\") pod \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.464977 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-var-lib-openvswitch\") pod \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.464995 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/90186d9d-0ac4-4959-9fd8-b044098dc6ae-env-overrides\") pod \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.465012 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-run-ovn\") pod \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.465027 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-run-systemd\") pod \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.465046 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-run-openvswitch\") pod \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.465079 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/90186d9d-0ac4-4959-9fd8-b044098dc6ae-ovnkube-script-lib\") pod \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.465107 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-log-socket\") pod \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\" 
(UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.465126 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-host-slash\") pod \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.465146 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "90186d9d-0ac4-4959-9fd8-b044098dc6ae" (UID: "90186d9d-0ac4-4959-9fd8-b044098dc6ae"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.465155 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-host-var-lib-cni-networks-ovn-kubernetes\") pod \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.465170 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "90186d9d-0ac4-4959-9fd8-b044098dc6ae" (UID: "90186d9d-0ac4-4959-9fd8-b044098dc6ae"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.465185 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "90186d9d-0ac4-4959-9fd8-b044098dc6ae" (UID: "90186d9d-0ac4-4959-9fd8-b044098dc6ae"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.465211 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "90186d9d-0ac4-4959-9fd8-b044098dc6ae" (UID: "90186d9d-0ac4-4959-9fd8-b044098dc6ae"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.465212 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "90186d9d-0ac4-4959-9fd8-b044098dc6ae" (UID: "90186d9d-0ac4-4959-9fd8-b044098dc6ae"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.465213 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdwhf\" (UniqueName: \"kubernetes.io/projected/90186d9d-0ac4-4959-9fd8-b044098dc6ae-kube-api-access-qdwhf\") pod \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.465304 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-host-cni-netd\") pod \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.465352 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-etc-openvswitch\") pod \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.465377 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-node-log\") pod \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\" (UID: \"90186d9d-0ac4-4959-9fd8-b044098dc6ae\") " Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.465500 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90186d9d-0ac4-4959-9fd8-b044098dc6ae-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "90186d9d-0ac4-4959-9fd8-b044098dc6ae" (UID: "90186d9d-0ac4-4959-9fd8-b044098dc6ae"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.465529 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90186d9d-0ac4-4959-9fd8-b044098dc6ae-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "90186d9d-0ac4-4959-9fd8-b044098dc6ae" (UID: "90186d9d-0ac4-4959-9fd8-b044098dc6ae"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.465539 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "90186d9d-0ac4-4959-9fd8-b044098dc6ae" (UID: "90186d9d-0ac4-4959-9fd8-b044098dc6ae"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.465571 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "90186d9d-0ac4-4959-9fd8-b044098dc6ae" (UID: "90186d9d-0ac4-4959-9fd8-b044098dc6ae"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.465590 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90186d9d-0ac4-4959-9fd8-b044098dc6ae-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "90186d9d-0ac4-4959-9fd8-b044098dc6ae" (UID: "90186d9d-0ac4-4959-9fd8-b044098dc6ae"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.465595 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "90186d9d-0ac4-4959-9fd8-b044098dc6ae" (UID: "90186d9d-0ac4-4959-9fd8-b044098dc6ae"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.465627 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-host-slash" (OuterVolumeSpecName: "host-slash") pod "90186d9d-0ac4-4959-9fd8-b044098dc6ae" (UID: "90186d9d-0ac4-4959-9fd8-b044098dc6ae"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.465658 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "90186d9d-0ac4-4959-9fd8-b044098dc6ae" (UID: "90186d9d-0ac4-4959-9fd8-b044098dc6ae"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.465705 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-log-socket" (OuterVolumeSpecName: "log-socket") pod "90186d9d-0ac4-4959-9fd8-b044098dc6ae" (UID: "90186d9d-0ac4-4959-9fd8-b044098dc6ae"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.465626 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-node-log\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.465760 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-run-systemd\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.465735 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-node-log" (OuterVolumeSpecName: "node-log") pod "90186d9d-0ac4-4959-9fd8-b044098dc6ae" (UID: "90186d9d-0ac4-4959-9fd8-b044098dc6ae"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.465792 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-env-overrides\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.465864 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "90186d9d-0ac4-4959-9fd8-b044098dc6ae" (UID: "90186d9d-0ac4-4959-9fd8-b044098dc6ae"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.465878 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-host-run-ovn-kubernetes\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.465910 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-host-kubelet\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.465945 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-host-cni-bin\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.465983 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-host-run-netns\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.466052 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-systemd-units\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.466071 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-var-lib-openvswitch\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.466091 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-log-socket\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.466109 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-etc-openvswitch\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.466177 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-run-openvswitch\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.466206 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-host-slash\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.466242 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-ovnkube-script-lib\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.466262 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xrhj\" (UniqueName: \"kubernetes.io/projected/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-kube-api-access-7xrhj\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.466283 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-host-cni-netd\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.466316 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-ovnkube-config\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.466353 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-ovn-node-metrics-cert\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.466380 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-run-ovn\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.466413 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.466463 4962 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.466476 4962 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/90186d9d-0ac4-4959-9fd8-b044098dc6ae-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.466489 4962 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-host-slash\") on node \"crc\" DevicePath \"\"" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.466499 4962 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-log-socket\") on node \"crc\" DevicePath \"\"" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.466511 4962 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.466524 4962 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.466536 4962 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.466546 4962 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-node-log\") on node \"crc\" DevicePath \"\"" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.466557 4962 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.466597 4962 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.466609 4962 reconciler_common.go:293] 
"Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/90186d9d-0ac4-4959-9fd8-b044098dc6ae-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.466620 4962 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.466658 4962 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.466672 4962 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.466684 4962 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.466695 4962 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/90186d9d-0ac4-4959-9fd8-b044098dc6ae-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.466707 4962 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.471341 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90186d9d-0ac4-4959-9fd8-b044098dc6ae-kube-api-access-qdwhf" (OuterVolumeSpecName: "kube-api-access-qdwhf") pod "90186d9d-0ac4-4959-9fd8-b044098dc6ae" (UID: "90186d9d-0ac4-4959-9fd8-b044098dc6ae"). InnerVolumeSpecName "kube-api-access-qdwhf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.471526 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90186d9d-0ac4-4959-9fd8-b044098dc6ae-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "90186d9d-0ac4-4959-9fd8-b044098dc6ae" (UID: "90186d9d-0ac4-4959-9fd8-b044098dc6ae"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.478909 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "90186d9d-0ac4-4959-9fd8-b044098dc6ae" (UID: "90186d9d-0ac4-4959-9fd8-b044098dc6ae"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.567550 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-run-openvswitch\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.567598 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-host-slash\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.567619 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-ovnkube-script-lib\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.567723 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-host-slash\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.567769 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xrhj\" (UniqueName: \"kubernetes.io/projected/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-kube-api-access-7xrhj\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.567836 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-host-cni-netd\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.567728 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-run-openvswitch\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.567894 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-ovnkube-config\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.567940 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-host-cni-netd\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 
13:05:34.567946 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-ovn-node-metrics-cert\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.568011 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-run-ovn\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.568051 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.568091 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-node-log\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.568098 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-run-ovn\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.568123 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-run-systemd\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.568151 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-env-overrides\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.568202 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-host-run-ovn-kubernetes\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.568214 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-run-systemd\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.568224 4962 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-host-kubelet\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.568240 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-node-log\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.568274 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-host-run-ovn-kubernetes\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.568256 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-host-cni-bin\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.568283 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-host-cni-bin\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.568250 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.568308 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-host-kubelet\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.568358 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-host-run-netns\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.568365 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-ovnkube-script-lib\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.568410 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-host-run-netns\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.568474 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-var-lib-openvswitch\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.568504 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-systemd-units\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.568523 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-var-lib-openvswitch\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.568524 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-etc-openvswitch\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.568549 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-etc-openvswitch\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.568552 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-log-socket\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.568568 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-log-socket\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.568597 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-systemd-units\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.568650 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdwhf\" (UniqueName: \"kubernetes.io/projected/90186d9d-0ac4-4959-9fd8-b044098dc6ae-kube-api-access-qdwhf\") on node \"crc\" DevicePath \"\"" Oct 03 13:05:34 crc 
kubenswrapper[4962]: I1003 13:05:34.568663 4962 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/90186d9d-0ac4-4959-9fd8-b044098dc6ae-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.568672 4962 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/90186d9d-0ac4-4959-9fd8-b044098dc6ae-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.568676 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-ovnkube-config\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.568854 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-env-overrides\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.571456 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-ovn-node-metrics-cert\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.584121 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xrhj\" (UniqueName: \"kubernetes.io/projected/4d3d7ba9-5706-4928-80d9-103ac7aa9e52-kube-api-access-7xrhj\") pod \"ovnkube-node-tlkr9\" (UID: \"4d3d7ba9-5706-4928-80d9-103ac7aa9e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.594346 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-sdd6t_fbc64268-3e78-44a2-8116-b62b5c13f005/kube-multus/1.log" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.594740 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-sdd6t_fbc64268-3e78-44a2-8116-b62b5c13f005/kube-multus/0.log" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.594778 4962 generic.go:334] "Generic (PLEG): container finished" podID="fbc64268-3e78-44a2-8116-b62b5c13f005" containerID="55381b2122f7d63231ef917dc3901e367b66dd7e75eb6fb7c3d049b81113c77f" exitCode=2 Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.594831 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sdd6t" event={"ID":"fbc64268-3e78-44a2-8116-b62b5c13f005","Type":"ContainerDied","Data":"55381b2122f7d63231ef917dc3901e367b66dd7e75eb6fb7c3d049b81113c77f"} Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.594864 4962 scope.go:117] "RemoveContainer" containerID="087c5eaff8dd2b9ccb73982ae2022d139077f3a93f750a8d7f7fe8bb4a8f643e" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.595213 4962 scope.go:117] "RemoveContainer" containerID="55381b2122f7d63231ef917dc3901e367b66dd7e75eb6fb7c3d049b81113c77f" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.598811 4962 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ksp7d_90186d9d-0ac4-4959-9fd8-b044098dc6ae/ovnkube-controller/3.log" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.607556 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ksp7d_90186d9d-0ac4-4959-9fd8-b044098dc6ae/ovn-acl-logging/0.log" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.609460 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ksp7d_90186d9d-0ac4-4959-9fd8-b044098dc6ae/ovn-controller/0.log" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.611015 4962 generic.go:334] "Generic (PLEG): container finished" podID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerID="5fb6150ea04e5f5b54cf103491ae188a7ea9c636e9761537cac84914cba53b6a" exitCode=0 Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.611045 4962 generic.go:334] "Generic (PLEG): container finished" podID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerID="daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1" exitCode=0 Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.611056 4962 generic.go:334] "Generic (PLEG): container finished" podID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerID="2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47" exitCode=0 Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.611065 4962 generic.go:334] "Generic (PLEG): container finished" podID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerID="0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56" exitCode=0 Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.611073 4962 generic.go:334] "Generic (PLEG): container finished" podID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerID="fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3" exitCode=0 Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.611083 4962 generic.go:334] "Generic (PLEG): container finished" podID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerID="31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c" exitCode=0 Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.611091 4962 generic.go:334] "Generic (PLEG): container finished" podID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerID="f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1" exitCode=143 Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.611099 4962 generic.go:334] "Generic (PLEG): container finished" podID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" containerID="5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866" exitCode=143 Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.611124 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" event={"ID":"90186d9d-0ac4-4959-9fd8-b044098dc6ae","Type":"ContainerDied","Data":"5fb6150ea04e5f5b54cf103491ae188a7ea9c636e9761537cac84914cba53b6a"} Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.611128 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.611157 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" event={"ID":"90186d9d-0ac4-4959-9fd8-b044098dc6ae","Type":"ContainerDied","Data":"daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1"} Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.611933 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" event={"ID":"90186d9d-0ac4-4959-9fd8-b044098dc6ae","Type":"ContainerDied","Data":"2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47"} Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.611988 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" event={"ID":"90186d9d-0ac4-4959-9fd8-b044098dc6ae","Type":"ContainerDied","Data":"0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56"} Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.612009 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" event={"ID":"90186d9d-0ac4-4959-9fd8-b044098dc6ae","Type":"ContainerDied","Data":"fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3"} Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.612024 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" event={"ID":"90186d9d-0ac4-4959-9fd8-b044098dc6ae","Type":"ContainerDied","Data":"31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c"} Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.612038 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5fb6150ea04e5f5b54cf103491ae188a7ea9c636e9761537cac84914cba53b6a"} Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.612052 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"57fcf06043f46f40efa58819dcfaaaa37c0426f8b80a9b322df4bdfb3ddbcddd"} Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.612060 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1"} Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.612070 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47"} Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.612078 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56"} Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.612084 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3"} Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.612091 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c"} Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.612098 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1"} Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.612104 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866"} Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.612111 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2"} Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.612121 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" event={"ID":"90186d9d-0ac4-4959-9fd8-b044098dc6ae","Type":"ContainerDied","Data":"f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1"} Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.612132 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5fb6150ea04e5f5b54cf103491ae188a7ea9c636e9761537cac84914cba53b6a"} Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.612140 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"57fcf06043f46f40efa58819dcfaaaa37c0426f8b80a9b322df4bdfb3ddbcddd"} Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.612151 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1"} Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.612160 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47"} Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.612166 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56"} Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.612174 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3"} Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.612180 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c"} Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.612186 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1"} Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.612195 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866"} Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.612201 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2"} Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.612210 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" event={"ID":"90186d9d-0ac4-4959-9fd8-b044098dc6ae","Type":"ContainerDied","Data":"5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866"} Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.612221 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5fb6150ea04e5f5b54cf103491ae188a7ea9c636e9761537cac84914cba53b6a"} Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.612229 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"57fcf06043f46f40efa58819dcfaaaa37c0426f8b80a9b322df4bdfb3ddbcddd"} Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.612235 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1"} Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.612242 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47"} Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.612248 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56"} Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.612255 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3"} Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.612261 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c"} Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.612268 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1"} Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.612275 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866"} Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.612281 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2"} Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.612290 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ksp7d" event={"ID":"90186d9d-0ac4-4959-9fd8-b044098dc6ae","Type":"ContainerDied","Data":"9730df0aa7f1039b437ee08c7070e4944ec76d68d0870e8e1bf954355236280d"} Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.612303 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5fb6150ea04e5f5b54cf103491ae188a7ea9c636e9761537cac84914cba53b6a"} Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.612311 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"57fcf06043f46f40efa58819dcfaaaa37c0426f8b80a9b322df4bdfb3ddbcddd"} Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 
13:05:34.612319 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1"} Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.612326 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47"} Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.612332 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56"} Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.612339 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3"} Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.612345 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c"} Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.612352 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1"} Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.612358 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866"} Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.612364 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2"} Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.636674 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ksp7d"] Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.640286 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ksp7d"] Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.651505 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:34 crc kubenswrapper[4962]: W1003 13:05:34.684138 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d3d7ba9_5706_4928_80d9_103ac7aa9e52.slice/crio-4c40968d343b65b919f26dd54ba953efe12ea8e545aaa8c4eb9f7c16bc78419a WatchSource:0}: Error finding container 4c40968d343b65b919f26dd54ba953efe12ea8e545aaa8c4eb9f7c16bc78419a: Status 404 returned error can't find the container with id 4c40968d343b65b919f26dd54ba953efe12ea8e545aaa8c4eb9f7c16bc78419a Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.726829 4962 scope.go:117] "RemoveContainer" containerID="5fb6150ea04e5f5b54cf103491ae188a7ea9c636e9761537cac84914cba53b6a" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.742719 4962 scope.go:117] "RemoveContainer" containerID="57fcf06043f46f40efa58819dcfaaaa37c0426f8b80a9b322df4bdfb3ddbcddd" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.755506 4962 scope.go:117] "RemoveContainer" containerID="daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.767502 4962 scope.go:117] "RemoveContainer" containerID="2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.780080 4962 scope.go:117] "RemoveContainer" containerID="0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.792456 4962 scope.go:117] "RemoveContainer" containerID="fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.813595 4962 scope.go:117] "RemoveContainer" containerID="31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.827569 4962 scope.go:117] "RemoveContainer" containerID="f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.903609 4962 scope.go:117] "RemoveContainer" containerID="5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.919145 4962 scope.go:117] "RemoveContainer" containerID="857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.930524 4962 scope.go:117] "RemoveContainer" containerID="5fb6150ea04e5f5b54cf103491ae188a7ea9c636e9761537cac84914cba53b6a" Oct 03 13:05:34 crc kubenswrapper[4962]: E1003 13:05:34.931025 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fb6150ea04e5f5b54cf103491ae188a7ea9c636e9761537cac84914cba53b6a\": container with ID starting with 5fb6150ea04e5f5b54cf103491ae188a7ea9c636e9761537cac84914cba53b6a not found: ID does not exist" containerID="5fb6150ea04e5f5b54cf103491ae188a7ea9c636e9761537cac84914cba53b6a" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.931145 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fb6150ea04e5f5b54cf103491ae188a7ea9c636e9761537cac84914cba53b6a"} err="failed to get container status \"5fb6150ea04e5f5b54cf103491ae188a7ea9c636e9761537cac84914cba53b6a\": rpc error: code = NotFound desc = could not find container \"5fb6150ea04e5f5b54cf103491ae188a7ea9c636e9761537cac84914cba53b6a\": container with ID starting with 
5fb6150ea04e5f5b54cf103491ae188a7ea9c636e9761537cac84914cba53b6a not found: ID does not exist" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.931238 4962 scope.go:117] "RemoveContainer" containerID="57fcf06043f46f40efa58819dcfaaaa37c0426f8b80a9b322df4bdfb3ddbcddd" Oct 03 13:05:34 crc kubenswrapper[4962]: E1003 13:05:34.931600 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57fcf06043f46f40efa58819dcfaaaa37c0426f8b80a9b322df4bdfb3ddbcddd\": container with ID starting with 57fcf06043f46f40efa58819dcfaaaa37c0426f8b80a9b322df4bdfb3ddbcddd not found: ID does not exist" containerID="57fcf06043f46f40efa58819dcfaaaa37c0426f8b80a9b322df4bdfb3ddbcddd" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.931724 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57fcf06043f46f40efa58819dcfaaaa37c0426f8b80a9b322df4bdfb3ddbcddd"} err="failed to get container status \"57fcf06043f46f40efa58819dcfaaaa37c0426f8b80a9b322df4bdfb3ddbcddd\": rpc error: code = NotFound desc = could not find container \"57fcf06043f46f40efa58819dcfaaaa37c0426f8b80a9b322df4bdfb3ddbcddd\": container with ID starting with 57fcf06043f46f40efa58819dcfaaaa37c0426f8b80a9b322df4bdfb3ddbcddd not found: ID does not exist" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.931815 4962 scope.go:117] "RemoveContainer" containerID="daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1" Oct 03 13:05:34 crc kubenswrapper[4962]: E1003 13:05:34.932559 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1\": container with ID starting with daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1 not found: ID does not exist" containerID="daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.932740 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1"} err="failed to get container status \"daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1\": rpc error: code = NotFound desc = could not find container \"daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1\": container with ID starting with daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1 not found: ID does not exist" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.932839 4962 scope.go:117] "RemoveContainer" containerID="2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47" Oct 03 13:05:34 crc kubenswrapper[4962]: E1003 13:05:34.933880 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47\": container with ID starting with 2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47 not found: ID does not exist" containerID="2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.933993 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47"} err="failed to get container status \"2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47\": rpc 
error: code = NotFound desc = could not find container \"2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47\": container with ID starting with 2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47 not found: ID does not exist" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.934089 4962 scope.go:117] "RemoveContainer" containerID="0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56" Oct 03 13:05:34 crc kubenswrapper[4962]: E1003 13:05:34.934416 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56\": container with ID starting with 0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56 not found: ID does not exist" containerID="0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.934521 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56"} err="failed to get container status \"0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56\": rpc error: code = NotFound desc = could not find container \"0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56\": container with ID starting with 0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56 not found: ID does not exist" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.934617 4962 scope.go:117] "RemoveContainer" containerID="fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3" Oct 03 13:05:34 crc kubenswrapper[4962]: E1003 13:05:34.934960 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3\": container with ID starting with fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3 not found: ID does not exist" containerID="fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.935074 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3"} err="failed to get container status \"fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3\": rpc error: code = NotFound desc = could not find container \"fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3\": container with ID starting with fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3 not found: ID does not exist" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.935174 4962 scope.go:117] "RemoveContainer" containerID="31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c" Oct 03 13:05:34 crc kubenswrapper[4962]: E1003 13:05:34.935457 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c\": container with ID starting with 31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c not found: ID does not exist" containerID="31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.935554 4962 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c"} err="failed to get container status \"31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c\": rpc error: code = NotFound desc = could not find container \"31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c\": container with ID starting with 31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c not found: ID does not exist" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.935661 4962 scope.go:117] "RemoveContainer" containerID="f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1" Oct 03 13:05:34 crc kubenswrapper[4962]: E1003 13:05:34.936040 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1\": container with ID starting with f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1 not found: ID does not exist" containerID="f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.936178 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1"} err="failed to get container status \"f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1\": rpc error: code = NotFound desc = could not find container \"f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1\": container with ID starting with f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1 not found: ID does not exist" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.936267 4962 scope.go:117] "RemoveContainer" containerID="5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866" Oct 03 13:05:34 crc kubenswrapper[4962]: E1003 13:05:34.936577 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866\": container with ID starting with 5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866 not found: ID does not exist" containerID="5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.936697 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866"} err="failed to get container status \"5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866\": rpc error: code = NotFound desc = could not find container \"5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866\": container with ID starting with 5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866 not found: ID does not exist" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.936789 4962 scope.go:117] "RemoveContainer" containerID="857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2" Oct 03 13:05:34 crc kubenswrapper[4962]: E1003 13:05:34.937353 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\": container with ID starting with 857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2 not found: ID does not exist" 
containerID="857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.937454 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2"} err="failed to get container status \"857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\": rpc error: code = NotFound desc = could not find container \"857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\": container with ID starting with 857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2 not found: ID does not exist" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.937550 4962 scope.go:117] "RemoveContainer" containerID="5fb6150ea04e5f5b54cf103491ae188a7ea9c636e9761537cac84914cba53b6a" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.937960 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fb6150ea04e5f5b54cf103491ae188a7ea9c636e9761537cac84914cba53b6a"} err="failed to get container status \"5fb6150ea04e5f5b54cf103491ae188a7ea9c636e9761537cac84914cba53b6a\": rpc error: code = NotFound desc = could not find container \"5fb6150ea04e5f5b54cf103491ae188a7ea9c636e9761537cac84914cba53b6a\": container with ID starting with 5fb6150ea04e5f5b54cf103491ae188a7ea9c636e9761537cac84914cba53b6a not found: ID does not exist" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.938103 4962 scope.go:117] "RemoveContainer" containerID="57fcf06043f46f40efa58819dcfaaaa37c0426f8b80a9b322df4bdfb3ddbcddd" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.938499 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57fcf06043f46f40efa58819dcfaaaa37c0426f8b80a9b322df4bdfb3ddbcddd"} err="failed to get container status \"57fcf06043f46f40efa58819dcfaaaa37c0426f8b80a9b322df4bdfb3ddbcddd\": rpc error: code = NotFound desc = could not find container \"57fcf06043f46f40efa58819dcfaaaa37c0426f8b80a9b322df4bdfb3ddbcddd\": container with ID starting with 57fcf06043f46f40efa58819dcfaaaa37c0426f8b80a9b322df4bdfb3ddbcddd not found: ID does not exist" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.938622 4962 scope.go:117] "RemoveContainer" containerID="daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.939047 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1"} err="failed to get container status \"daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1\": rpc error: code = NotFound desc = could not find container \"daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1\": container with ID starting with daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1 not found: ID does not exist" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.939193 4962 scope.go:117] "RemoveContainer" containerID="2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.939721 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47"} err="failed to get container status \"2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47\": rpc error: code = NotFound desc = could not find 
container \"2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47\": container with ID starting with 2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47 not found: ID does not exist" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.939923 4962 scope.go:117] "RemoveContainer" containerID="0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.940587 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56"} err="failed to get container status \"0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56\": rpc error: code = NotFound desc = could not find container \"0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56\": container with ID starting with 0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56 not found: ID does not exist" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.940822 4962 scope.go:117] "RemoveContainer" containerID="fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.941230 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3"} err="failed to get container status \"fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3\": rpc error: code = NotFound desc = could not find container \"fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3\": container with ID starting with fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3 not found: ID does not exist" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.941360 4962 scope.go:117] "RemoveContainer" containerID="31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.941724 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c"} err="failed to get container status \"31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c\": rpc error: code = NotFound desc = could not find container \"31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c\": container with ID starting with 31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c not found: ID does not exist" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.941883 4962 scope.go:117] "RemoveContainer" containerID="f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.942311 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1"} err="failed to get container status \"f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1\": rpc error: code = NotFound desc = could not find container \"f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1\": container with ID starting with f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1 not found: ID does not exist" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.942462 4962 scope.go:117] "RemoveContainer" containerID="5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.942887 4962 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866"} err="failed to get container status \"5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866\": rpc error: code = NotFound desc = could not find container \"5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866\": container with ID starting with 5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866 not found: ID does not exist" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.943027 4962 scope.go:117] "RemoveContainer" containerID="857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.943345 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2"} err="failed to get container status \"857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\": rpc error: code = NotFound desc = could not find container \"857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\": container with ID starting with 857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2 not found: ID does not exist" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.943451 4962 scope.go:117] "RemoveContainer" containerID="5fb6150ea04e5f5b54cf103491ae188a7ea9c636e9761537cac84914cba53b6a" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.943808 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fb6150ea04e5f5b54cf103491ae188a7ea9c636e9761537cac84914cba53b6a"} err="failed to get container status \"5fb6150ea04e5f5b54cf103491ae188a7ea9c636e9761537cac84914cba53b6a\": rpc error: code = NotFound desc = could not find container \"5fb6150ea04e5f5b54cf103491ae188a7ea9c636e9761537cac84914cba53b6a\": container with ID starting with 5fb6150ea04e5f5b54cf103491ae188a7ea9c636e9761537cac84914cba53b6a not found: ID does not exist" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.943900 4962 scope.go:117] "RemoveContainer" containerID="57fcf06043f46f40efa58819dcfaaaa37c0426f8b80a9b322df4bdfb3ddbcddd" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.944208 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57fcf06043f46f40efa58819dcfaaaa37c0426f8b80a9b322df4bdfb3ddbcddd"} err="failed to get container status \"57fcf06043f46f40efa58819dcfaaaa37c0426f8b80a9b322df4bdfb3ddbcddd\": rpc error: code = NotFound desc = could not find container \"57fcf06043f46f40efa58819dcfaaaa37c0426f8b80a9b322df4bdfb3ddbcddd\": container with ID starting with 57fcf06043f46f40efa58819dcfaaaa37c0426f8b80a9b322df4bdfb3ddbcddd not found: ID does not exist" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.944327 4962 scope.go:117] "RemoveContainer" containerID="daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.944623 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1"} err="failed to get container status \"daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1\": rpc error: code = NotFound desc = could not find container \"daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1\": container with ID starting with 
daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1 not found: ID does not exist" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.944744 4962 scope.go:117] "RemoveContainer" containerID="2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.945031 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47"} err="failed to get container status \"2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47\": rpc error: code = NotFound desc = could not find container \"2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47\": container with ID starting with 2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47 not found: ID does not exist" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.945127 4962 scope.go:117] "RemoveContainer" containerID="0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.945386 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56"} err="failed to get container status \"0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56\": rpc error: code = NotFound desc = could not find container \"0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56\": container with ID starting with 0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56 not found: ID does not exist" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.945485 4962 scope.go:117] "RemoveContainer" containerID="fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.945770 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3"} err="failed to get container status \"fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3\": rpc error: code = NotFound desc = could not find container \"fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3\": container with ID starting with fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3 not found: ID does not exist" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.945894 4962 scope.go:117] "RemoveContainer" containerID="31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.946186 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c"} err="failed to get container status \"31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c\": rpc error: code = NotFound desc = could not find container \"31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c\": container with ID starting with 31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c not found: ID does not exist" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.946311 4962 scope.go:117] "RemoveContainer" containerID="f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.946590 4962 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1"} err="failed to get container status \"f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1\": rpc error: code = NotFound desc = could not find container \"f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1\": container with ID starting with f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1 not found: ID does not exist" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.946698 4962 scope.go:117] "RemoveContainer" containerID="5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.947323 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866"} err="failed to get container status \"5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866\": rpc error: code = NotFound desc = could not find container \"5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866\": container with ID starting with 5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866 not found: ID does not exist" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.947414 4962 scope.go:117] "RemoveContainer" containerID="857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.947690 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2"} err="failed to get container status \"857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\": rpc error: code = NotFound desc = could not find container \"857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\": container with ID starting with 857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2 not found: ID does not exist" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.947796 4962 scope.go:117] "RemoveContainer" containerID="5fb6150ea04e5f5b54cf103491ae188a7ea9c636e9761537cac84914cba53b6a" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.948102 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fb6150ea04e5f5b54cf103491ae188a7ea9c636e9761537cac84914cba53b6a"} err="failed to get container status \"5fb6150ea04e5f5b54cf103491ae188a7ea9c636e9761537cac84914cba53b6a\": rpc error: code = NotFound desc = could not find container \"5fb6150ea04e5f5b54cf103491ae188a7ea9c636e9761537cac84914cba53b6a\": container with ID starting with 5fb6150ea04e5f5b54cf103491ae188a7ea9c636e9761537cac84914cba53b6a not found: ID does not exist" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.948204 4962 scope.go:117] "RemoveContainer" containerID="57fcf06043f46f40efa58819dcfaaaa37c0426f8b80a9b322df4bdfb3ddbcddd" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.948583 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57fcf06043f46f40efa58819dcfaaaa37c0426f8b80a9b322df4bdfb3ddbcddd"} err="failed to get container status \"57fcf06043f46f40efa58819dcfaaaa37c0426f8b80a9b322df4bdfb3ddbcddd\": rpc error: code = NotFound desc = could not find container \"57fcf06043f46f40efa58819dcfaaaa37c0426f8b80a9b322df4bdfb3ddbcddd\": container with ID starting with 57fcf06043f46f40efa58819dcfaaaa37c0426f8b80a9b322df4bdfb3ddbcddd not found: ID does not exist" Oct 
03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.948716 4962 scope.go:117] "RemoveContainer" containerID="daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.948994 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1"} err="failed to get container status \"daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1\": rpc error: code = NotFound desc = could not find container \"daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1\": container with ID starting with daa14c2a4cd54fd8836e10307d314fd9dd641d18674d2d2114208d2e618caac1 not found: ID does not exist" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.949012 4962 scope.go:117] "RemoveContainer" containerID="2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.949189 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47"} err="failed to get container status \"2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47\": rpc error: code = NotFound desc = could not find container \"2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47\": container with ID starting with 2630c19d86979311dc63153c947a4e849ffe1402108079c6699f3df92ebb7e47 not found: ID does not exist" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.949327 4962 scope.go:117] "RemoveContainer" containerID="0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.949591 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56"} err="failed to get container status \"0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56\": rpc error: code = NotFound desc = could not find container \"0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56\": container with ID starting with 0c65d4562ab0f5155301d49bd67fbf71cdd75f1386929829c5dd4f17dca05f56 not found: ID does not exist" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.949611 4962 scope.go:117] "RemoveContainer" containerID="fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.949808 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3"} err="failed to get container status \"fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3\": rpc error: code = NotFound desc = could not find container \"fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3\": container with ID starting with fa11336dddae56db41c306de6b4a86138903af2d8b52858c5bf1242e62420ea3 not found: ID does not exist" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.949912 4962 scope.go:117] "RemoveContainer" containerID="31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.950167 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c"} err="failed to get container status 
\"31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c\": rpc error: code = NotFound desc = could not find container \"31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c\": container with ID starting with 31a1e7dc31aba5625ca65f47d8c4fa9ab39b2d80e678691f10101c2bb621641c not found: ID does not exist" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.950260 4962 scope.go:117] "RemoveContainer" containerID="f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.950539 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1"} err="failed to get container status \"f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1\": rpc error: code = NotFound desc = could not find container \"f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1\": container with ID starting with f3baa5cc01690b4d72b64bc6db1d11693a6429e4d7a7a3a4861159d2e3fb73d1 not found: ID does not exist" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.950662 4962 scope.go:117] "RemoveContainer" containerID="5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.951035 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866"} err="failed to get container status \"5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866\": rpc error: code = NotFound desc = could not find container \"5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866\": container with ID starting with 5da63aff55d8c330078d3457fe3847bbfe32ae2a11fc74d906f10c82299c7866 not found: ID does not exist" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.951052 4962 scope.go:117] "RemoveContainer" containerID="857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2" Oct 03 13:05:34 crc kubenswrapper[4962]: I1003 13:05:34.951230 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2"} err="failed to get container status \"857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\": rpc error: code = NotFound desc = could not find container \"857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2\": container with ID starting with 857083ce6bb779c116562cbac25b8e2e992c758c7ae0bd1441a168f5803b31e2 not found: ID does not exist" Oct 03 13:05:35 crc kubenswrapper[4962]: I1003 13:05:35.617285 4962 generic.go:334] "Generic (PLEG): container finished" podID="4d3d7ba9-5706-4928-80d9-103ac7aa9e52" containerID="14a5037a377e5a8723395024a9288cae3e69363af7b75434970951d709d55fc1" exitCode=0 Oct 03 13:05:35 crc kubenswrapper[4962]: I1003 13:05:35.617428 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" event={"ID":"4d3d7ba9-5706-4928-80d9-103ac7aa9e52","Type":"ContainerDied","Data":"14a5037a377e5a8723395024a9288cae3e69363af7b75434970951d709d55fc1"} Oct 03 13:05:35 crc kubenswrapper[4962]: I1003 13:05:35.617592 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" event={"ID":"4d3d7ba9-5706-4928-80d9-103ac7aa9e52","Type":"ContainerStarted","Data":"4c40968d343b65b919f26dd54ba953efe12ea8e545aaa8c4eb9f7c16bc78419a"} Oct 03 13:05:35 crc 
Oct 03 13:05:35 crc kubenswrapper[4962]: I1003 13:05:35.620112 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-sdd6t_fbc64268-3e78-44a2-8116-b62b5c13f005/kube-multus/1.log"
Oct 03 13:05:35 crc kubenswrapper[4962]: I1003 13:05:35.620194 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sdd6t" event={"ID":"fbc64268-3e78-44a2-8116-b62b5c13f005","Type":"ContainerStarted","Data":"879238289cc6f8e68afd16f3f8b2cfeafabbcb208d40388e0d821f382d7f99a0"}
Oct 03 13:05:36 crc kubenswrapper[4962]: I1003 13:05:36.234442 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90186d9d-0ac4-4959-9fd8-b044098dc6ae" path="/var/lib/kubelet/pods/90186d9d-0ac4-4959-9fd8-b044098dc6ae/volumes"
Oct 03 13:05:36 crc kubenswrapper[4962]: I1003 13:05:36.631083 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" event={"ID":"4d3d7ba9-5706-4928-80d9-103ac7aa9e52","Type":"ContainerStarted","Data":"c3aafaa37bb7f3b115b2ce087fb0fae658ab40c25515d40ecda86154cddd7506"}
Oct 03 13:05:36 crc kubenswrapper[4962]: I1003 13:05:36.631359 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" event={"ID":"4d3d7ba9-5706-4928-80d9-103ac7aa9e52","Type":"ContainerStarted","Data":"946066419c6dfab21ba8ba9e64b77f9fedf3540acd2dc90cc7d136a41cf5dc4e"}
Oct 03 13:05:36 crc kubenswrapper[4962]: I1003 13:05:36.631370 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" event={"ID":"4d3d7ba9-5706-4928-80d9-103ac7aa9e52","Type":"ContainerStarted","Data":"383416e9f24ff222c6f3f327722f8619bb68927c2bbc8a0fbb4f6d7e25b509c1"}
Oct 03 13:05:36 crc kubenswrapper[4962]: I1003 13:05:36.631379 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" event={"ID":"4d3d7ba9-5706-4928-80d9-103ac7aa9e52","Type":"ContainerStarted","Data":"62bb6bb255fd6341f24bda703f649c31cb2072950900bc8d81e87eb4eecf1490"}
Oct 03 13:05:36 crc kubenswrapper[4962]: I1003 13:05:36.631387 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" event={"ID":"4d3d7ba9-5706-4928-80d9-103ac7aa9e52","Type":"ContainerStarted","Data":"247b8ecc450b4509f28d2789ccf65eb03ad0c0562cffd3888cb5b35ef3bbc27b"}
Oct 03 13:05:36 crc kubenswrapper[4962]: I1003 13:05:36.631399 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" event={"ID":"4d3d7ba9-5706-4928-80d9-103ac7aa9e52","Type":"ContainerStarted","Data":"e4570dd763f38f399202a24d59ce91533c11fae7727c6ce1704f1d63edd8714a"}
Oct 03 13:05:38 crc kubenswrapper[4962]: I1003 13:05:38.647050 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" event={"ID":"4d3d7ba9-5706-4928-80d9-103ac7aa9e52","Type":"ContainerStarted","Data":"63a25f44db8ad31cb9118da19a594f5da25ffeb5be2a2b78f0cc4898fa84c554"}
Oct 03 13:05:40 crc kubenswrapper[4962]: I1003 13:05:40.661196 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" event={"ID":"4d3d7ba9-5706-4928-80d9-103ac7aa9e52","Type":"ContainerStarted","Data":"8ff73c4abd564bbaea8c1c442b4b20ed87f37a5a41f702f81b7c6037bde9d6bf"}
Oct 03 13:05:40 crc kubenswrapper[4962]: I1003 13:05:40.661446 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9"
Oct 03 13:05:40 crc kubenswrapper[4962]: I1003 13:05:40.661461 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9"
Oct 03 13:05:40 crc kubenswrapper[4962]: I1003 13:05:40.690607 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" podStartSLOduration=6.690593088 podStartE2EDuration="6.690593088s" podCreationTimestamp="2025-10-03 13:05:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:05:40.687773473 +0000 UTC m=+949.091671308" watchObservedRunningTime="2025-10-03 13:05:40.690593088 +0000 UTC m=+949.094490923"
Oct 03 13:05:40 crc kubenswrapper[4962]: I1003 13:05:40.695495 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9"
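The pod_startup_latency_tracker entry above reports podStartSLOduration=6.690593088 for ovnkube-node-tlkr9; since both pull timestamps are the zero value (no image pull counted), that is just watchObservedRunningTime (13:05:40.690593088) minus podCreationTimestamp (13:05:34). A quick check of the arithmetic:

from datetime import datetime, timezone

created  = datetime(2025, 10, 3, 13, 5, 34, tzinfo=timezone.utc)
# watchObservedRunningTime, truncated to microsecond precision
observed = datetime(2025, 10, 3, 13, 5, 40, 690593, tzinfo=timezone.utc)
print((observed - created).total_seconds())  # -> 6.690593, matching the log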
"operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/ddc5c3ec-d428-4310-8e25-cc8e5416fb08-node-mnt\") pod \"crc-storage-crc-5j546\" (UID: \"ddc5c3ec-d428-4310-8e25-cc8e5416fb08\") " pod="crc-storage/crc-storage-crc-5j546" Oct 03 13:05:41 crc kubenswrapper[4962]: I1003 13:05:41.150114 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4nzb\" (UniqueName: \"kubernetes.io/projected/ddc5c3ec-d428-4310-8e25-cc8e5416fb08-kube-api-access-n4nzb\") pod \"crc-storage-crc-5j546\" (UID: \"ddc5c3ec-d428-4310-8e25-cc8e5416fb08\") " pod="crc-storage/crc-storage-crc-5j546" Oct 03 13:05:41 crc kubenswrapper[4962]: I1003 13:05:41.150438 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/ddc5c3ec-d428-4310-8e25-cc8e5416fb08-node-mnt\") pod \"crc-storage-crc-5j546\" (UID: \"ddc5c3ec-d428-4310-8e25-cc8e5416fb08\") " pod="crc-storage/crc-storage-crc-5j546" Oct 03 13:05:41 crc kubenswrapper[4962]: I1003 13:05:41.150854 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/ddc5c3ec-d428-4310-8e25-cc8e5416fb08-crc-storage\") pod \"crc-storage-crc-5j546\" (UID: \"ddc5c3ec-d428-4310-8e25-cc8e5416fb08\") " pod="crc-storage/crc-storage-crc-5j546" Oct 03 13:05:41 crc kubenswrapper[4962]: I1003 13:05:41.166093 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4nzb\" (UniqueName: \"kubernetes.io/projected/ddc5c3ec-d428-4310-8e25-cc8e5416fb08-kube-api-access-n4nzb\") pod \"crc-storage-crc-5j546\" (UID: \"ddc5c3ec-d428-4310-8e25-cc8e5416fb08\") " pod="crc-storage/crc-storage-crc-5j546" Oct 03 13:05:41 crc kubenswrapper[4962]: I1003 13:05:41.205828 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-5j546" Oct 03 13:05:41 crc kubenswrapper[4962]: E1003 13:05:41.229546 4962 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-5j546_crc-storage_ddc5c3ec-d428-4310-8e25-cc8e5416fb08_0(6598a59bbb5d0658d2f7faf20282118a63a00a77e2fde87a7201043e3ccfa972): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 03 13:05:41 crc kubenswrapper[4962]: E1003 13:05:41.229618 4962 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-5j546_crc-storage_ddc5c3ec-d428-4310-8e25-cc8e5416fb08_0(6598a59bbb5d0658d2f7faf20282118a63a00a77e2fde87a7201043e3ccfa972): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-5j546" Oct 03 13:05:41 crc kubenswrapper[4962]: E1003 13:05:41.229659 4962 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-5j546_crc-storage_ddc5c3ec-d428-4310-8e25-cc8e5416fb08_0(6598a59bbb5d0658d2f7faf20282118a63a00a77e2fde87a7201043e3ccfa972): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-5j546" Oct 03 13:05:41 crc kubenswrapper[4962]: E1003 13:05:41.229711 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-5j546_crc-storage(ddc5c3ec-d428-4310-8e25-cc8e5416fb08)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-5j546_crc-storage(ddc5c3ec-d428-4310-8e25-cc8e5416fb08)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-5j546_crc-storage_ddc5c3ec-d428-4310-8e25-cc8e5416fb08_0(6598a59bbb5d0658d2f7faf20282118a63a00a77e2fde87a7201043e3ccfa972): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-5j546" podUID="ddc5c3ec-d428-4310-8e25-cc8e5416fb08" Oct 03 13:05:41 crc kubenswrapper[4962]: I1003 13:05:41.665971 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:41 crc kubenswrapper[4962]: I1003 13:05:41.692351 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:05:42 crc kubenswrapper[4962]: I1003 13:05:42.137466 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-5j546"] Oct 03 13:05:42 crc kubenswrapper[4962]: I1003 13:05:42.137597 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-5j546" Oct 03 13:05:42 crc kubenswrapper[4962]: I1003 13:05:42.138129 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-5j546" Oct 03 13:05:42 crc kubenswrapper[4962]: E1003 13:05:42.162316 4962 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-5j546_crc-storage_ddc5c3ec-d428-4310-8e25-cc8e5416fb08_0(ea37402d9795591fafe344c68f982d586565001266c7267d99daa572b2c868f1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 03 13:05:42 crc kubenswrapper[4962]: E1003 13:05:42.162373 4962 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-5j546_crc-storage_ddc5c3ec-d428-4310-8e25-cc8e5416fb08_0(ea37402d9795591fafe344c68f982d586565001266c7267d99daa572b2c868f1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-5j546" Oct 03 13:05:42 crc kubenswrapper[4962]: E1003 13:05:42.162393 4962 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-5j546_crc-storage_ddc5c3ec-d428-4310-8e25-cc8e5416fb08_0(ea37402d9795591fafe344c68f982d586565001266c7267d99daa572b2c868f1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-5j546" Oct 03 13:05:42 crc kubenswrapper[4962]: E1003 13:05:42.162443 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-5j546_crc-storage(ddc5c3ec-d428-4310-8e25-cc8e5416fb08)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-5j546_crc-storage(ddc5c3ec-d428-4310-8e25-cc8e5416fb08)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-5j546_crc-storage_ddc5c3ec-d428-4310-8e25-cc8e5416fb08_0(ea37402d9795591fafe344c68f982d586565001266c7267d99daa572b2c868f1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-5j546" podUID="ddc5c3ec-d428-4310-8e25-cc8e5416fb08" Oct 03 13:05:57 crc kubenswrapper[4962]: I1003 13:05:57.226282 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-5j546" Oct 03 13:05:57 crc kubenswrapper[4962]: I1003 13:05:57.227165 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-5j546" Oct 03 13:05:57 crc kubenswrapper[4962]: I1003 13:05:57.402074 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-5j546"] Oct 03 13:05:57 crc kubenswrapper[4962]: W1003 13:05:57.411461 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podddc5c3ec_d428_4310_8e25_cc8e5416fb08.slice/crio-378740c2e8ecec5b825f5dac58178d85165209cce71fa758ceb2c2355844f6f4 WatchSource:0}: Error finding container 378740c2e8ecec5b825f5dac58178d85165209cce71fa758ceb2c2355844f6f4: Status 404 returned error can't find the container with id 378740c2e8ecec5b825f5dac58178d85165209cce71fa758ceb2c2355844f6f4 Oct 03 13:05:57 crc kubenswrapper[4962]: I1003 13:05:57.743115 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-5j546" event={"ID":"ddc5c3ec-d428-4310-8e25-cc8e5416fb08","Type":"ContainerStarted","Data":"378740c2e8ecec5b825f5dac58178d85165209cce71fa758ceb2c2355844f6f4"} Oct 03 13:05:58 crc kubenswrapper[4962]: I1003 13:05:58.750765 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-5j546" event={"ID":"ddc5c3ec-d428-4310-8e25-cc8e5416fb08","Type":"ContainerStarted","Data":"8eafc8a9744a26bbc6dc88f8a3287d7e2a69ed60f17c4123a7b6f28fbceed0b3"} Oct 03 13:05:59 crc kubenswrapper[4962]: I1003 13:05:59.756750 4962 generic.go:334] "Generic (PLEG): container finished" podID="ddc5c3ec-d428-4310-8e25-cc8e5416fb08" containerID="8eafc8a9744a26bbc6dc88f8a3287d7e2a69ed60f17c4123a7b6f28fbceed0b3" exitCode=0 Oct 03 13:05:59 crc kubenswrapper[4962]: I1003 13:05:59.756795 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-5j546" event={"ID":"ddc5c3ec-d428-4310-8e25-cc8e5416fb08","Type":"ContainerDied","Data":"8eafc8a9744a26bbc6dc88f8a3287d7e2a69ed60f17c4123a7b6f28fbceed0b3"} Oct 03 13:05:59 crc kubenswrapper[4962]: I1003 13:05:59.967513 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-5j546" Oct 03 13:06:00 crc kubenswrapper[4962]: I1003 13:06:00.087339 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4nzb\" (UniqueName: \"kubernetes.io/projected/ddc5c3ec-d428-4310-8e25-cc8e5416fb08-kube-api-access-n4nzb\") pod \"ddc5c3ec-d428-4310-8e25-cc8e5416fb08\" (UID: \"ddc5c3ec-d428-4310-8e25-cc8e5416fb08\") " Oct 03 13:06:00 crc kubenswrapper[4962]: I1003 13:06:00.087405 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/ddc5c3ec-d428-4310-8e25-cc8e5416fb08-crc-storage\") pod \"ddc5c3ec-d428-4310-8e25-cc8e5416fb08\" (UID: \"ddc5c3ec-d428-4310-8e25-cc8e5416fb08\") " Oct 03 13:06:00 crc kubenswrapper[4962]: I1003 13:06:00.087507 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/ddc5c3ec-d428-4310-8e25-cc8e5416fb08-node-mnt\") pod \"ddc5c3ec-d428-4310-8e25-cc8e5416fb08\" (UID: \"ddc5c3ec-d428-4310-8e25-cc8e5416fb08\") " Oct 03 13:06:00 crc kubenswrapper[4962]: I1003 13:06:00.087761 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ddc5c3ec-d428-4310-8e25-cc8e5416fb08-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "ddc5c3ec-d428-4310-8e25-cc8e5416fb08" (UID: "ddc5c3ec-d428-4310-8e25-cc8e5416fb08"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 13:06:00 crc kubenswrapper[4962]: I1003 13:06:00.099574 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddc5c3ec-d428-4310-8e25-cc8e5416fb08-kube-api-access-n4nzb" (OuterVolumeSpecName: "kube-api-access-n4nzb") pod "ddc5c3ec-d428-4310-8e25-cc8e5416fb08" (UID: "ddc5c3ec-d428-4310-8e25-cc8e5416fb08"). InnerVolumeSpecName "kube-api-access-n4nzb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:06:00 crc kubenswrapper[4962]: I1003 13:06:00.102265 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddc5c3ec-d428-4310-8e25-cc8e5416fb08-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "ddc5c3ec-d428-4310-8e25-cc8e5416fb08" (UID: "ddc5c3ec-d428-4310-8e25-cc8e5416fb08"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:06:00 crc kubenswrapper[4962]: I1003 13:06:00.189798 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4nzb\" (UniqueName: \"kubernetes.io/projected/ddc5c3ec-d428-4310-8e25-cc8e5416fb08-kube-api-access-n4nzb\") on node \"crc\" DevicePath \"\"" Oct 03 13:06:00 crc kubenswrapper[4962]: I1003 13:06:00.189870 4962 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/ddc5c3ec-d428-4310-8e25-cc8e5416fb08-crc-storage\") on node \"crc\" DevicePath \"\"" Oct 03 13:06:00 crc kubenswrapper[4962]: I1003 13:06:00.189882 4962 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/ddc5c3ec-d428-4310-8e25-cc8e5416fb08-node-mnt\") on node \"crc\" DevicePath \"\"" Oct 03 13:06:00 crc kubenswrapper[4962]: I1003 13:06:00.762881 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-5j546" event={"ID":"ddc5c3ec-d428-4310-8e25-cc8e5416fb08","Type":"ContainerDied","Data":"378740c2e8ecec5b825f5dac58178d85165209cce71fa758ceb2c2355844f6f4"} Oct 03 13:06:00 crc kubenswrapper[4962]: I1003 13:06:00.762926 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="378740c2e8ecec5b825f5dac58178d85165209cce71fa758ceb2c2355844f6f4" Oct 03 13:06:00 crc kubenswrapper[4962]: I1003 13:06:00.762933 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-5j546" Oct 03 13:06:04 crc kubenswrapper[4962]: I1003 13:06:04.674280 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tlkr9" Oct 03 13:06:08 crc kubenswrapper[4962]: I1003 13:06:08.104897 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjv8jt"] Oct 03 13:06:08 crc kubenswrapper[4962]: E1003 13:06:08.106182 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddc5c3ec-d428-4310-8e25-cc8e5416fb08" containerName="storage" Oct 03 13:06:08 crc kubenswrapper[4962]: I1003 13:06:08.106301 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddc5c3ec-d428-4310-8e25-cc8e5416fb08" containerName="storage" Oct 03 13:06:08 crc kubenswrapper[4962]: I1003 13:06:08.106716 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddc5c3ec-d428-4310-8e25-cc8e5416fb08" containerName="storage" Oct 03 13:06:08 crc kubenswrapper[4962]: I1003 13:06:08.108431 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjv8jt" Oct 03 13:06:08 crc kubenswrapper[4962]: I1003 13:06:08.110190 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjv8jt"] Oct 03 13:06:08 crc kubenswrapper[4962]: I1003 13:06:08.110623 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 03 13:06:08 crc kubenswrapper[4962]: I1003 13:06:08.180820 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwnr5\" (UniqueName: \"kubernetes.io/projected/f4861596-c6c9-4979-b4d8-fe9858724265-kube-api-access-vwnr5\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjv8jt\" (UID: \"f4861596-c6c9-4979-b4d8-fe9858724265\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjv8jt" Oct 03 13:06:08 crc kubenswrapper[4962]: I1003 13:06:08.180929 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f4861596-c6c9-4979-b4d8-fe9858724265-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjv8jt\" (UID: \"f4861596-c6c9-4979-b4d8-fe9858724265\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjv8jt" Oct 03 13:06:08 crc kubenswrapper[4962]: I1003 13:06:08.181009 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f4861596-c6c9-4979-b4d8-fe9858724265-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjv8jt\" (UID: \"f4861596-c6c9-4979-b4d8-fe9858724265\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjv8jt" Oct 03 13:06:08 crc kubenswrapper[4962]: I1003 13:06:08.282589 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwnr5\" (UniqueName: \"kubernetes.io/projected/f4861596-c6c9-4979-b4d8-fe9858724265-kube-api-access-vwnr5\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjv8jt\" (UID: \"f4861596-c6c9-4979-b4d8-fe9858724265\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjv8jt" Oct 03 13:06:08 crc kubenswrapper[4962]: I1003 13:06:08.282745 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f4861596-c6c9-4979-b4d8-fe9858724265-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjv8jt\" (UID: \"f4861596-c6c9-4979-b4d8-fe9858724265\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjv8jt" Oct 03 13:06:08 crc kubenswrapper[4962]: I1003 13:06:08.282816 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f4861596-c6c9-4979-b4d8-fe9858724265-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjv8jt\" (UID: \"f4861596-c6c9-4979-b4d8-fe9858724265\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjv8jt" Oct 03 13:06:08 crc kubenswrapper[4962]: I1003 13:06:08.283371 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/f4861596-c6c9-4979-b4d8-fe9858724265-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjv8jt\" (UID: \"f4861596-c6c9-4979-b4d8-fe9858724265\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjv8jt" Oct 03 13:06:08 crc kubenswrapper[4962]: I1003 13:06:08.283450 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f4861596-c6c9-4979-b4d8-fe9858724265-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjv8jt\" (UID: \"f4861596-c6c9-4979-b4d8-fe9858724265\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjv8jt" Oct 03 13:06:08 crc kubenswrapper[4962]: I1003 13:06:08.304434 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwnr5\" (UniqueName: \"kubernetes.io/projected/f4861596-c6c9-4979-b4d8-fe9858724265-kube-api-access-vwnr5\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjv8jt\" (UID: \"f4861596-c6c9-4979-b4d8-fe9858724265\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjv8jt" Oct 03 13:06:08 crc kubenswrapper[4962]: I1003 13:06:08.426802 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjv8jt" Oct 03 13:06:08 crc kubenswrapper[4962]: I1003 13:06:08.593055 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjv8jt"] Oct 03 13:06:08 crc kubenswrapper[4962]: I1003 13:06:08.801029 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjv8jt" event={"ID":"f4861596-c6c9-4979-b4d8-fe9858724265","Type":"ContainerStarted","Data":"c7ac119b3c0dd312e91ff7222fc753aae179ecc355acdab65f64e69cbba345d7"} Oct 03 13:06:08 crc kubenswrapper[4962]: I1003 13:06:08.801069 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjv8jt" event={"ID":"f4861596-c6c9-4979-b4d8-fe9858724265","Type":"ContainerStarted","Data":"d2059c4a3dfe1f573e61a14fad9da8af72c065ebc6776ba90fc79d7f180d7a6f"} Oct 03 13:06:09 crc kubenswrapper[4962]: I1003 13:06:09.809597 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4861596-c6c9-4979-b4d8-fe9858724265" containerID="c7ac119b3c0dd312e91ff7222fc753aae179ecc355acdab65f64e69cbba345d7" exitCode=0 Oct 03 13:06:09 crc kubenswrapper[4962]: I1003 13:06:09.809663 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjv8jt" event={"ID":"f4861596-c6c9-4979-b4d8-fe9858724265","Type":"ContainerDied","Data":"c7ac119b3c0dd312e91ff7222fc753aae179ecc355acdab65f64e69cbba345d7"} Oct 03 13:06:11 crc kubenswrapper[4962]: I1003 13:06:11.820631 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4861596-c6c9-4979-b4d8-fe9858724265" containerID="593b0b57021ef3b653a99720dce30279f591996ac399cca953b938bbbdde2465" exitCode=0 Oct 03 13:06:11 crc kubenswrapper[4962]: I1003 13:06:11.820752 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjv8jt" 
event={"ID":"f4861596-c6c9-4979-b4d8-fe9858724265","Type":"ContainerDied","Data":"593b0b57021ef3b653a99720dce30279f591996ac399cca953b938bbbdde2465"} Oct 03 13:06:12 crc kubenswrapper[4962]: I1003 13:06:12.828252 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4861596-c6c9-4979-b4d8-fe9858724265" containerID="5a847058280dab86cac230e6673111619e9065f0c76d15ad53cd5c8476796590" exitCode=0 Oct 03 13:06:12 crc kubenswrapper[4962]: I1003 13:06:12.828296 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjv8jt" event={"ID":"f4861596-c6c9-4979-b4d8-fe9858724265","Type":"ContainerDied","Data":"5a847058280dab86cac230e6673111619e9065f0c76d15ad53cd5c8476796590"} Oct 03 13:06:14 crc kubenswrapper[4962]: I1003 13:06:14.072284 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjv8jt" Oct 03 13:06:14 crc kubenswrapper[4962]: I1003 13:06:14.149369 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f4861596-c6c9-4979-b4d8-fe9858724265-bundle\") pod \"f4861596-c6c9-4979-b4d8-fe9858724265\" (UID: \"f4861596-c6c9-4979-b4d8-fe9858724265\") " Oct 03 13:06:14 crc kubenswrapper[4962]: I1003 13:06:14.149488 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwnr5\" (UniqueName: \"kubernetes.io/projected/f4861596-c6c9-4979-b4d8-fe9858724265-kube-api-access-vwnr5\") pod \"f4861596-c6c9-4979-b4d8-fe9858724265\" (UID: \"f4861596-c6c9-4979-b4d8-fe9858724265\") " Oct 03 13:06:14 crc kubenswrapper[4962]: I1003 13:06:14.149505 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f4861596-c6c9-4979-b4d8-fe9858724265-util\") pod \"f4861596-c6c9-4979-b4d8-fe9858724265\" (UID: \"f4861596-c6c9-4979-b4d8-fe9858724265\") " Oct 03 13:06:14 crc kubenswrapper[4962]: I1003 13:06:14.150616 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4861596-c6c9-4979-b4d8-fe9858724265-bundle" (OuterVolumeSpecName: "bundle") pod "f4861596-c6c9-4979-b4d8-fe9858724265" (UID: "f4861596-c6c9-4979-b4d8-fe9858724265"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:06:14 crc kubenswrapper[4962]: I1003 13:06:14.154807 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4861596-c6c9-4979-b4d8-fe9858724265-kube-api-access-vwnr5" (OuterVolumeSpecName: "kube-api-access-vwnr5") pod "f4861596-c6c9-4979-b4d8-fe9858724265" (UID: "f4861596-c6c9-4979-b4d8-fe9858724265"). InnerVolumeSpecName "kube-api-access-vwnr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:06:14 crc kubenswrapper[4962]: I1003 13:06:14.163334 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4861596-c6c9-4979-b4d8-fe9858724265-util" (OuterVolumeSpecName: "util") pod "f4861596-c6c9-4979-b4d8-fe9858724265" (UID: "f4861596-c6c9-4979-b4d8-fe9858724265"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:06:14 crc kubenswrapper[4962]: I1003 13:06:14.250463 4962 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f4861596-c6c9-4979-b4d8-fe9858724265-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:06:14 crc kubenswrapper[4962]: I1003 13:06:14.250492 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwnr5\" (UniqueName: \"kubernetes.io/projected/f4861596-c6c9-4979-b4d8-fe9858724265-kube-api-access-vwnr5\") on node \"crc\" DevicePath \"\"" Oct 03 13:06:14 crc kubenswrapper[4962]: I1003 13:06:14.250502 4962 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f4861596-c6c9-4979-b4d8-fe9858724265-util\") on node \"crc\" DevicePath \"\"" Oct 03 13:06:14 crc kubenswrapper[4962]: I1003 13:06:14.839330 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjv8jt" event={"ID":"f4861596-c6c9-4979-b4d8-fe9858724265","Type":"ContainerDied","Data":"d2059c4a3dfe1f573e61a14fad9da8af72c065ebc6776ba90fc79d7f180d7a6f"} Oct 03 13:06:14 crc kubenswrapper[4962]: I1003 13:06:14.839753 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2059c4a3dfe1f573e61a14fad9da8af72c065ebc6776ba90fc79d7f180d7a6f" Oct 03 13:06:14 crc kubenswrapper[4962]: I1003 13:06:14.839856 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjv8jt" Oct 03 13:06:19 crc kubenswrapper[4962]: I1003 13:06:19.660846 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-klzjc"] Oct 03 13:06:19 crc kubenswrapper[4962]: E1003 13:06:19.661355 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4861596-c6c9-4979-b4d8-fe9858724265" containerName="pull" Oct 03 13:06:19 crc kubenswrapper[4962]: I1003 13:06:19.661366 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4861596-c6c9-4979-b4d8-fe9858724265" containerName="pull" Oct 03 13:06:19 crc kubenswrapper[4962]: E1003 13:06:19.661376 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4861596-c6c9-4979-b4d8-fe9858724265" containerName="extract" Oct 03 13:06:19 crc kubenswrapper[4962]: I1003 13:06:19.661382 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4861596-c6c9-4979-b4d8-fe9858724265" containerName="extract" Oct 03 13:06:19 crc kubenswrapper[4962]: E1003 13:06:19.661405 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4861596-c6c9-4979-b4d8-fe9858724265" containerName="util" Oct 03 13:06:19 crc kubenswrapper[4962]: I1003 13:06:19.661413 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4861596-c6c9-4979-b4d8-fe9858724265" containerName="util" Oct 03 13:06:19 crc kubenswrapper[4962]: I1003 13:06:19.661500 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4861596-c6c9-4979-b4d8-fe9858724265" containerName="extract" Oct 03 13:06:19 crc kubenswrapper[4962]: I1003 13:06:19.661857 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-klzjc" Oct 03 13:06:19 crc kubenswrapper[4962]: I1003 13:06:19.663784 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 03 13:06:19 crc kubenswrapper[4962]: I1003 13:06:19.663863 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-gfwtt" Oct 03 13:06:19 crc kubenswrapper[4962]: I1003 13:06:19.663978 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 03 13:06:19 crc kubenswrapper[4962]: I1003 13:06:19.671306 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-klzjc"] Oct 03 13:06:19 crc kubenswrapper[4962]: I1003 13:06:19.730369 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4df4r\" (UniqueName: \"kubernetes.io/projected/608a97cf-8aef-44fd-aca4-57c6a896b7c8-kube-api-access-4df4r\") pod \"nmstate-operator-858ddd8f98-klzjc\" (UID: \"608a97cf-8aef-44fd-aca4-57c6a896b7c8\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-klzjc" Oct 03 13:06:19 crc kubenswrapper[4962]: I1003 13:06:19.832015 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4df4r\" (UniqueName: \"kubernetes.io/projected/608a97cf-8aef-44fd-aca4-57c6a896b7c8-kube-api-access-4df4r\") pod \"nmstate-operator-858ddd8f98-klzjc\" (UID: \"608a97cf-8aef-44fd-aca4-57c6a896b7c8\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-klzjc" Oct 03 13:06:19 crc kubenswrapper[4962]: I1003 13:06:19.850334 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4df4r\" (UniqueName: \"kubernetes.io/projected/608a97cf-8aef-44fd-aca4-57c6a896b7c8-kube-api-access-4df4r\") pod \"nmstate-operator-858ddd8f98-klzjc\" (UID: \"608a97cf-8aef-44fd-aca4-57c6a896b7c8\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-klzjc" Oct 03 13:06:19 crc kubenswrapper[4962]: I1003 13:06:19.981821 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-klzjc" Oct 03 13:06:20 crc kubenswrapper[4962]: I1003 13:06:20.166399 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-klzjc"] Oct 03 13:06:20 crc kubenswrapper[4962]: I1003 13:06:20.871363 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-klzjc" event={"ID":"608a97cf-8aef-44fd-aca4-57c6a896b7c8","Type":"ContainerStarted","Data":"efbe75ba6393be26ab19b599849b3d142eeff0f9fea3bc1b3bb4665b407aa8c0"} Oct 03 13:06:22 crc kubenswrapper[4962]: I1003 13:06:22.881601 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-klzjc" event={"ID":"608a97cf-8aef-44fd-aca4-57c6a896b7c8","Type":"ContainerStarted","Data":"6d9e4540dc319c640bdd15293fdac7eb1025a154db62f7329d546ddcee80a903"} Oct 03 13:06:22 crc kubenswrapper[4962]: I1003 13:06:22.898183 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-klzjc" podStartSLOduration=2.098860459 podStartE2EDuration="3.898169475s" podCreationTimestamp="2025-10-03 13:06:19 +0000 UTC" firstStartedPulling="2025-10-03 13:06:20.179840113 +0000 UTC m=+988.583737948" lastFinishedPulling="2025-10-03 13:06:21.979149129 +0000 UTC m=+990.383046964" observedRunningTime="2025-10-03 13:06:22.896594753 +0000 UTC m=+991.300492588" watchObservedRunningTime="2025-10-03 13:06:22.898169475 +0000 UTC m=+991.302067310" Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.481786 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-rrlzt"] Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.483190 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-rrlzt" Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.486232 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-qc2kv" Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.492965 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-rrlzt"] Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.511679 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-xmr88"] Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.512530 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-xmr88" Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.515911 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.527244 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-7wnjl"] Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.528524 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-7wnjl" Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.543570 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ce2302c0-a558-4b5e-bf69-40c48e9c4e31-dbus-socket\") pod \"nmstate-handler-7wnjl\" (UID: \"ce2302c0-a558-4b5e-bf69-40c48e9c4e31\") " pod="openshift-nmstate/nmstate-handler-7wnjl" Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.543609 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dv49\" (UniqueName: \"kubernetes.io/projected/08ef24b1-9d00-4fa5-92e7-b28d3c1795c8-kube-api-access-7dv49\") pod \"nmstate-metrics-fdff9cb8d-rrlzt\" (UID: \"08ef24b1-9d00-4fa5-92e7-b28d3c1795c8\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-rrlzt" Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.543680 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ce2302c0-a558-4b5e-bf69-40c48e9c4e31-nmstate-lock\") pod \"nmstate-handler-7wnjl\" (UID: \"ce2302c0-a558-4b5e-bf69-40c48e9c4e31\") " pod="openshift-nmstate/nmstate-handler-7wnjl" Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.543755 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ce2302c0-a558-4b5e-bf69-40c48e9c4e31-ovs-socket\") pod \"nmstate-handler-7wnjl\" (UID: \"ce2302c0-a558-4b5e-bf69-40c48e9c4e31\") " pod="openshift-nmstate/nmstate-handler-7wnjl" Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.543809 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbkm5\" (UniqueName: \"kubernetes.io/projected/b7a87281-61a2-478d-8ddf-faa13fcd21e4-kube-api-access-gbkm5\") pod \"nmstate-webhook-6cdbc54649-xmr88\" (UID: \"b7a87281-61a2-478d-8ddf-faa13fcd21e4\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-xmr88" Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.543874 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/b7a87281-61a2-478d-8ddf-faa13fcd21e4-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-xmr88\" (UID: \"b7a87281-61a2-478d-8ddf-faa13fcd21e4\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-xmr88" Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.543896 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8p4vw\" (UniqueName: \"kubernetes.io/projected/ce2302c0-a558-4b5e-bf69-40c48e9c4e31-kube-api-access-8p4vw\") pod \"nmstate-handler-7wnjl\" (UID: \"ce2302c0-a558-4b5e-bf69-40c48e9c4e31\") " pod="openshift-nmstate/nmstate-handler-7wnjl" Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.578524 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-xmr88"] Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.645388 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/b7a87281-61a2-478d-8ddf-faa13fcd21e4-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-xmr88\" (UID: \"b7a87281-61a2-478d-8ddf-faa13fcd21e4\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-xmr88" Oct 03 
13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.645448 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8p4vw\" (UniqueName: \"kubernetes.io/projected/ce2302c0-a558-4b5e-bf69-40c48e9c4e31-kube-api-access-8p4vw\") pod \"nmstate-handler-7wnjl\" (UID: \"ce2302c0-a558-4b5e-bf69-40c48e9c4e31\") " pod="openshift-nmstate/nmstate-handler-7wnjl" Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.645509 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ce2302c0-a558-4b5e-bf69-40c48e9c4e31-dbus-socket\") pod \"nmstate-handler-7wnjl\" (UID: \"ce2302c0-a558-4b5e-bf69-40c48e9c4e31\") " pod="openshift-nmstate/nmstate-handler-7wnjl" Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.645540 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dv49\" (UniqueName: \"kubernetes.io/projected/08ef24b1-9d00-4fa5-92e7-b28d3c1795c8-kube-api-access-7dv49\") pod \"nmstate-metrics-fdff9cb8d-rrlzt\" (UID: \"08ef24b1-9d00-4fa5-92e7-b28d3c1795c8\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-rrlzt" Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.645587 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ce2302c0-a558-4b5e-bf69-40c48e9c4e31-nmstate-lock\") pod \"nmstate-handler-7wnjl\" (UID: \"ce2302c0-a558-4b5e-bf69-40c48e9c4e31\") " pod="openshift-nmstate/nmstate-handler-7wnjl" Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.645612 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ce2302c0-a558-4b5e-bf69-40c48e9c4e31-ovs-socket\") pod \"nmstate-handler-7wnjl\" (UID: \"ce2302c0-a558-4b5e-bf69-40c48e9c4e31\") " pod="openshift-nmstate/nmstate-handler-7wnjl" Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.645663 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbkm5\" (UniqueName: \"kubernetes.io/projected/b7a87281-61a2-478d-8ddf-faa13fcd21e4-kube-api-access-gbkm5\") pod \"nmstate-webhook-6cdbc54649-xmr88\" (UID: \"b7a87281-61a2-478d-8ddf-faa13fcd21e4\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-xmr88" Oct 03 13:06:28 crc kubenswrapper[4962]: E1003 13:06:28.645727 4962 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.645758 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ce2302c0-a558-4b5e-bf69-40c48e9c4e31-nmstate-lock\") pod \"nmstate-handler-7wnjl\" (UID: \"ce2302c0-a558-4b5e-bf69-40c48e9c4e31\") " pod="openshift-nmstate/nmstate-handler-7wnjl" Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.645776 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ce2302c0-a558-4b5e-bf69-40c48e9c4e31-ovs-socket\") pod \"nmstate-handler-7wnjl\" (UID: \"ce2302c0-a558-4b5e-bf69-40c48e9c4e31\") " pod="openshift-nmstate/nmstate-handler-7wnjl" Oct 03 13:06:28 crc kubenswrapper[4962]: E1003 13:06:28.645869 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7a87281-61a2-478d-8ddf-faa13fcd21e4-tls-key-pair podName:b7a87281-61a2-478d-8ddf-faa13fcd21e4 nodeName:}" failed. 
No retries permitted until 2025-10-03 13:06:29.145830416 +0000 UTC m=+997.549728421 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/b7a87281-61a2-478d-8ddf-faa13fcd21e4-tls-key-pair") pod "nmstate-webhook-6cdbc54649-xmr88" (UID: "b7a87281-61a2-478d-8ddf-faa13fcd21e4") : secret "openshift-nmstate-webhook" not found Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.645872 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ce2302c0-a558-4b5e-bf69-40c48e9c4e31-dbus-socket\") pod \"nmstate-handler-7wnjl\" (UID: \"ce2302c0-a558-4b5e-bf69-40c48e9c4e31\") " pod="openshift-nmstate/nmstate-handler-7wnjl" Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.670534 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-fzscf"] Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.672475 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-fzscf" Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.674814 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8p4vw\" (UniqueName: \"kubernetes.io/projected/ce2302c0-a558-4b5e-bf69-40c48e9c4e31-kube-api-access-8p4vw\") pod \"nmstate-handler-7wnjl\" (UID: \"ce2302c0-a558-4b5e-bf69-40c48e9c4e31\") " pod="openshift-nmstate/nmstate-handler-7wnjl" Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.675110 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dv49\" (UniqueName: \"kubernetes.io/projected/08ef24b1-9d00-4fa5-92e7-b28d3c1795c8-kube-api-access-7dv49\") pod \"nmstate-metrics-fdff9cb8d-rrlzt\" (UID: \"08ef24b1-9d00-4fa5-92e7-b28d3c1795c8\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-rrlzt" Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.676177 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-fzscf"] Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.677907 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbkm5\" (UniqueName: \"kubernetes.io/projected/b7a87281-61a2-478d-8ddf-faa13fcd21e4-kube-api-access-gbkm5\") pod \"nmstate-webhook-6cdbc54649-xmr88\" (UID: \"b7a87281-61a2-478d-8ddf-faa13fcd21e4\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-xmr88" Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.680319 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.680470 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-sz55n" Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.680537 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.747093 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbjvd\" (UniqueName: \"kubernetes.io/projected/f87d275d-37cc-496a-bb07-d50b2905a494-kube-api-access-zbjvd\") pod \"nmstate-console-plugin-6b874cbd85-fzscf\" (UID: \"f87d275d-37cc-496a-bb07-d50b2905a494\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-fzscf" Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 
13:06:28.747169 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f87d275d-37cc-496a-bb07-d50b2905a494-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-fzscf\" (UID: \"f87d275d-37cc-496a-bb07-d50b2905a494\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-fzscf" Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.747195 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f87d275d-37cc-496a-bb07-d50b2905a494-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-fzscf\" (UID: \"f87d275d-37cc-496a-bb07-d50b2905a494\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-fzscf" Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.799304 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-rrlzt" Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.848589 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-7wnjl" Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.848605 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbjvd\" (UniqueName: \"kubernetes.io/projected/f87d275d-37cc-496a-bb07-d50b2905a494-kube-api-access-zbjvd\") pod \"nmstate-console-plugin-6b874cbd85-fzscf\" (UID: \"f87d275d-37cc-496a-bb07-d50b2905a494\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-fzscf" Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.848726 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f87d275d-37cc-496a-bb07-d50b2905a494-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-fzscf\" (UID: \"f87d275d-37cc-496a-bb07-d50b2905a494\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-fzscf" Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.848751 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f87d275d-37cc-496a-bb07-d50b2905a494-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-fzscf\" (UID: \"f87d275d-37cc-496a-bb07-d50b2905a494\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-fzscf" Oct 03 13:06:28 crc kubenswrapper[4962]: E1003 13:06:28.848939 4962 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Oct 03 13:06:28 crc kubenswrapper[4962]: E1003 13:06:28.849025 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f87d275d-37cc-496a-bb07-d50b2905a494-plugin-serving-cert podName:f87d275d-37cc-496a-bb07-d50b2905a494 nodeName:}" failed. No retries permitted until 2025-10-03 13:06:29.349004623 +0000 UTC m=+997.752902458 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/f87d275d-37cc-496a-bb07-d50b2905a494-plugin-serving-cert") pod "nmstate-console-plugin-6b874cbd85-fzscf" (UID: "f87d275d-37cc-496a-bb07-d50b2905a494") : secret "plugin-serving-cert" not found Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.850536 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f87d275d-37cc-496a-bb07-d50b2905a494-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-fzscf\" (UID: \"f87d275d-37cc-496a-bb07-d50b2905a494\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-fzscf" Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.864892 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6fd7864499-w5mxx"] Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.865920 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6fd7864499-w5mxx" Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.882080 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6fd7864499-w5mxx"] Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.889425 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbjvd\" (UniqueName: \"kubernetes.io/projected/f87d275d-37cc-496a-bb07-d50b2905a494-kube-api-access-zbjvd\") pod \"nmstate-console-plugin-6b874cbd85-fzscf\" (UID: \"f87d275d-37cc-496a-bb07-d50b2905a494\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-fzscf" Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.950353 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a75ac5ce-b1df-4ed3-8edd-935dec71ee95-oauth-serving-cert\") pod \"console-6fd7864499-w5mxx\" (UID: \"a75ac5ce-b1df-4ed3-8edd-935dec71ee95\") " pod="openshift-console/console-6fd7864499-w5mxx" Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.950423 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a75ac5ce-b1df-4ed3-8edd-935dec71ee95-console-config\") pod \"console-6fd7864499-w5mxx\" (UID: \"a75ac5ce-b1df-4ed3-8edd-935dec71ee95\") " pod="openshift-console/console-6fd7864499-w5mxx" Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.950466 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a75ac5ce-b1df-4ed3-8edd-935dec71ee95-console-serving-cert\") pod \"console-6fd7864499-w5mxx\" (UID: \"a75ac5ce-b1df-4ed3-8edd-935dec71ee95\") " pod="openshift-console/console-6fd7864499-w5mxx" Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.950492 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlwkp\" (UniqueName: \"kubernetes.io/projected/a75ac5ce-b1df-4ed3-8edd-935dec71ee95-kube-api-access-mlwkp\") pod \"console-6fd7864499-w5mxx\" (UID: \"a75ac5ce-b1df-4ed3-8edd-935dec71ee95\") " pod="openshift-console/console-6fd7864499-w5mxx" Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.950515 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/a75ac5ce-b1df-4ed3-8edd-935dec71ee95-console-oauth-config\") pod \"console-6fd7864499-w5mxx\" (UID: \"a75ac5ce-b1df-4ed3-8edd-935dec71ee95\") " pod="openshift-console/console-6fd7864499-w5mxx" Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.950600 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a75ac5ce-b1df-4ed3-8edd-935dec71ee95-trusted-ca-bundle\") pod \"console-6fd7864499-w5mxx\" (UID: \"a75ac5ce-b1df-4ed3-8edd-935dec71ee95\") " pod="openshift-console/console-6fd7864499-w5mxx" Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.950709 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a75ac5ce-b1df-4ed3-8edd-935dec71ee95-service-ca\") pod \"console-6fd7864499-w5mxx\" (UID: \"a75ac5ce-b1df-4ed3-8edd-935dec71ee95\") " pod="openshift-console/console-6fd7864499-w5mxx" Oct 03 13:06:28 crc kubenswrapper[4962]: I1003 13:06:28.951182 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-7wnjl" event={"ID":"ce2302c0-a558-4b5e-bf69-40c48e9c4e31","Type":"ContainerStarted","Data":"b90767570ab2292353e16c670fa2691f1e04e74b671f9e36c9f3e03a71ea0552"} Oct 03 13:06:29 crc kubenswrapper[4962]: I1003 13:06:29.051984 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a75ac5ce-b1df-4ed3-8edd-935dec71ee95-oauth-serving-cert\") pod \"console-6fd7864499-w5mxx\" (UID: \"a75ac5ce-b1df-4ed3-8edd-935dec71ee95\") " pod="openshift-console/console-6fd7864499-w5mxx" Oct 03 13:06:29 crc kubenswrapper[4962]: I1003 13:06:29.052057 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a75ac5ce-b1df-4ed3-8edd-935dec71ee95-console-config\") pod \"console-6fd7864499-w5mxx\" (UID: \"a75ac5ce-b1df-4ed3-8edd-935dec71ee95\") " pod="openshift-console/console-6fd7864499-w5mxx" Oct 03 13:06:29 crc kubenswrapper[4962]: I1003 13:06:29.052085 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a75ac5ce-b1df-4ed3-8edd-935dec71ee95-console-serving-cert\") pod \"console-6fd7864499-w5mxx\" (UID: \"a75ac5ce-b1df-4ed3-8edd-935dec71ee95\") " pod="openshift-console/console-6fd7864499-w5mxx" Oct 03 13:06:29 crc kubenswrapper[4962]: I1003 13:06:29.052105 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlwkp\" (UniqueName: \"kubernetes.io/projected/a75ac5ce-b1df-4ed3-8edd-935dec71ee95-kube-api-access-mlwkp\") pod \"console-6fd7864499-w5mxx\" (UID: \"a75ac5ce-b1df-4ed3-8edd-935dec71ee95\") " pod="openshift-console/console-6fd7864499-w5mxx" Oct 03 13:06:29 crc kubenswrapper[4962]: I1003 13:06:29.052131 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a75ac5ce-b1df-4ed3-8edd-935dec71ee95-console-oauth-config\") pod \"console-6fd7864499-w5mxx\" (UID: \"a75ac5ce-b1df-4ed3-8edd-935dec71ee95\") " pod="openshift-console/console-6fd7864499-w5mxx" Oct 03 13:06:29 crc kubenswrapper[4962]: I1003 13:06:29.052156 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/a75ac5ce-b1df-4ed3-8edd-935dec71ee95-trusted-ca-bundle\") pod \"console-6fd7864499-w5mxx\" (UID: \"a75ac5ce-b1df-4ed3-8edd-935dec71ee95\") " pod="openshift-console/console-6fd7864499-w5mxx" Oct 03 13:06:29 crc kubenswrapper[4962]: I1003 13:06:29.052195 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a75ac5ce-b1df-4ed3-8edd-935dec71ee95-service-ca\") pod \"console-6fd7864499-w5mxx\" (UID: \"a75ac5ce-b1df-4ed3-8edd-935dec71ee95\") " pod="openshift-console/console-6fd7864499-w5mxx" Oct 03 13:06:29 crc kubenswrapper[4962]: I1003 13:06:29.053068 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a75ac5ce-b1df-4ed3-8edd-935dec71ee95-oauth-serving-cert\") pod \"console-6fd7864499-w5mxx\" (UID: \"a75ac5ce-b1df-4ed3-8edd-935dec71ee95\") " pod="openshift-console/console-6fd7864499-w5mxx" Oct 03 13:06:29 crc kubenswrapper[4962]: I1003 13:06:29.053112 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a75ac5ce-b1df-4ed3-8edd-935dec71ee95-service-ca\") pod \"console-6fd7864499-w5mxx\" (UID: \"a75ac5ce-b1df-4ed3-8edd-935dec71ee95\") " pod="openshift-console/console-6fd7864499-w5mxx" Oct 03 13:06:29 crc kubenswrapper[4962]: I1003 13:06:29.053890 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a75ac5ce-b1df-4ed3-8edd-935dec71ee95-console-config\") pod \"console-6fd7864499-w5mxx\" (UID: \"a75ac5ce-b1df-4ed3-8edd-935dec71ee95\") " pod="openshift-console/console-6fd7864499-w5mxx" Oct 03 13:06:29 crc kubenswrapper[4962]: I1003 13:06:29.054468 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-rrlzt"] Oct 03 13:06:29 crc kubenswrapper[4962]: I1003 13:06:29.055127 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a75ac5ce-b1df-4ed3-8edd-935dec71ee95-trusted-ca-bundle\") pod \"console-6fd7864499-w5mxx\" (UID: \"a75ac5ce-b1df-4ed3-8edd-935dec71ee95\") " pod="openshift-console/console-6fd7864499-w5mxx" Oct 03 13:06:29 crc kubenswrapper[4962]: I1003 13:06:29.058216 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a75ac5ce-b1df-4ed3-8edd-935dec71ee95-console-oauth-config\") pod \"console-6fd7864499-w5mxx\" (UID: \"a75ac5ce-b1df-4ed3-8edd-935dec71ee95\") " pod="openshift-console/console-6fd7864499-w5mxx" Oct 03 13:06:29 crc kubenswrapper[4962]: I1003 13:06:29.058298 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a75ac5ce-b1df-4ed3-8edd-935dec71ee95-console-serving-cert\") pod \"console-6fd7864499-w5mxx\" (UID: \"a75ac5ce-b1df-4ed3-8edd-935dec71ee95\") " pod="openshift-console/console-6fd7864499-w5mxx" Oct 03 13:06:29 crc kubenswrapper[4962]: W1003 13:06:29.064302 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08ef24b1_9d00_4fa5_92e7_b28d3c1795c8.slice/crio-8e50f6ac23d8f2cad7d0082f32ed46b6e99c9c4bdac283fb307432d83e30ce9b WatchSource:0}: Error finding container 8e50f6ac23d8f2cad7d0082f32ed46b6e99c9c4bdac283fb307432d83e30ce9b: Status 404 returned error can't find the container with 
id 8e50f6ac23d8f2cad7d0082f32ed46b6e99c9c4bdac283fb307432d83e30ce9b Oct 03 13:06:29 crc kubenswrapper[4962]: I1003 13:06:29.072514 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlwkp\" (UniqueName: \"kubernetes.io/projected/a75ac5ce-b1df-4ed3-8edd-935dec71ee95-kube-api-access-mlwkp\") pod \"console-6fd7864499-w5mxx\" (UID: \"a75ac5ce-b1df-4ed3-8edd-935dec71ee95\") " pod="openshift-console/console-6fd7864499-w5mxx" Oct 03 13:06:29 crc kubenswrapper[4962]: I1003 13:06:29.153215 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/b7a87281-61a2-478d-8ddf-faa13fcd21e4-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-xmr88\" (UID: \"b7a87281-61a2-478d-8ddf-faa13fcd21e4\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-xmr88" Oct 03 13:06:29 crc kubenswrapper[4962]: I1003 13:06:29.156291 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/b7a87281-61a2-478d-8ddf-faa13fcd21e4-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-xmr88\" (UID: \"b7a87281-61a2-478d-8ddf-faa13fcd21e4\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-xmr88" Oct 03 13:06:29 crc kubenswrapper[4962]: I1003 13:06:29.215163 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6fd7864499-w5mxx" Oct 03 13:06:29 crc kubenswrapper[4962]: I1003 13:06:29.355609 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f87d275d-37cc-496a-bb07-d50b2905a494-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-fzscf\" (UID: \"f87d275d-37cc-496a-bb07-d50b2905a494\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-fzscf" Oct 03 13:06:29 crc kubenswrapper[4962]: I1003 13:06:29.361329 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f87d275d-37cc-496a-bb07-d50b2905a494-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-fzscf\" (UID: \"f87d275d-37cc-496a-bb07-d50b2905a494\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-fzscf" Oct 03 13:06:29 crc kubenswrapper[4962]: I1003 13:06:29.441030 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-xmr88" Oct 03 13:06:29 crc kubenswrapper[4962]: I1003 13:06:29.618275 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-xmr88"] Oct 03 13:06:29 crc kubenswrapper[4962]: I1003 13:06:29.618582 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-fzscf" Oct 03 13:06:29 crc kubenswrapper[4962]: W1003 13:06:29.622316 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7a87281_61a2_478d_8ddf_faa13fcd21e4.slice/crio-bb8dd4f60826993169b25788cdd2cbeb1fc96d81d58afcb93a62f79e5c85d4fe WatchSource:0}: Error finding container bb8dd4f60826993169b25788cdd2cbeb1fc96d81d58afcb93a62f79e5c85d4fe: Status 404 returned error can't find the container with id bb8dd4f60826993169b25788cdd2cbeb1fc96d81d58afcb93a62f79e5c85d4fe Oct 03 13:06:29 crc kubenswrapper[4962]: I1003 13:06:29.682972 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6fd7864499-w5mxx"] Oct 03 13:06:29 crc kubenswrapper[4962]: W1003 13:06:29.683774 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda75ac5ce_b1df_4ed3_8edd_935dec71ee95.slice/crio-bd9082345a1b622df855d230676b9b5417c935fcc1223feb7ea2b03e49fbe3aa WatchSource:0}: Error finding container bd9082345a1b622df855d230676b9b5417c935fcc1223feb7ea2b03e49fbe3aa: Status 404 returned error can't find the container with id bd9082345a1b622df855d230676b9b5417c935fcc1223feb7ea2b03e49fbe3aa Oct 03 13:06:29 crc kubenswrapper[4962]: I1003 13:06:29.966474 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-xmr88" event={"ID":"b7a87281-61a2-478d-8ddf-faa13fcd21e4","Type":"ContainerStarted","Data":"bb8dd4f60826993169b25788cdd2cbeb1fc96d81d58afcb93a62f79e5c85d4fe"} Oct 03 13:06:29 crc kubenswrapper[4962]: I1003 13:06:29.968466 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-rrlzt" event={"ID":"08ef24b1-9d00-4fa5-92e7-b28d3c1795c8","Type":"ContainerStarted","Data":"8e50f6ac23d8f2cad7d0082f32ed46b6e99c9c4bdac283fb307432d83e30ce9b"} Oct 03 13:06:29 crc kubenswrapper[4962]: I1003 13:06:29.970189 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fd7864499-w5mxx" event={"ID":"a75ac5ce-b1df-4ed3-8edd-935dec71ee95","Type":"ContainerStarted","Data":"189abe78ce2264ac3ba1a0938449aac4788a415a80e6bea37b6b53374f86e636"} Oct 03 13:06:29 crc kubenswrapper[4962]: I1003 13:06:29.970215 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fd7864499-w5mxx" event={"ID":"a75ac5ce-b1df-4ed3-8edd-935dec71ee95","Type":"ContainerStarted","Data":"bd9082345a1b622df855d230676b9b5417c935fcc1223feb7ea2b03e49fbe3aa"} Oct 03 13:06:29 crc kubenswrapper[4962]: I1003 13:06:29.990111 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6fd7864499-w5mxx" podStartSLOduration=1.9900973899999999 podStartE2EDuration="1.99009739s" podCreationTimestamp="2025-10-03 13:06:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:06:29.989588696 +0000 UTC m=+998.393486541" watchObservedRunningTime="2025-10-03 13:06:29.99009739 +0000 UTC m=+998.393995225" Oct 03 13:06:30 crc kubenswrapper[4962]: I1003 13:06:30.036902 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-fzscf"] Oct 03 13:06:30 crc kubenswrapper[4962]: I1003 13:06:30.980580 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-fzscf" event={"ID":"f87d275d-37cc-496a-bb07-d50b2905a494","Type":"ContainerStarted","Data":"fa2a81752d3557d9d322dcb935ad5f6cc5bd225926afdc79a36efe80c69544e5"} Oct 03 13:06:31 crc kubenswrapper[4962]: I1003 13:06:31.988239 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-xmr88" event={"ID":"b7a87281-61a2-478d-8ddf-faa13fcd21e4","Type":"ContainerStarted","Data":"9e6c700fa657581c7fee10ca17343bd29961017157e7b05fb3a40afda870394f"} Oct 03 13:06:31 crc kubenswrapper[4962]: I1003 13:06:31.989114 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-xmr88" Oct 03 13:06:31 crc kubenswrapper[4962]: I1003 13:06:31.991671 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-7wnjl" event={"ID":"ce2302c0-a558-4b5e-bf69-40c48e9c4e31","Type":"ContainerStarted","Data":"ed0ca86ccf3502b0eb419aab4fefb5858ef45926b46a46f32c8f1b02ebde5cd0"} Oct 03 13:06:31 crc kubenswrapper[4962]: I1003 13:06:31.991759 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-7wnjl" Oct 03 13:06:31 crc kubenswrapper[4962]: I1003 13:06:31.993111 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-rrlzt" event={"ID":"08ef24b1-9d00-4fa5-92e7-b28d3c1795c8","Type":"ContainerStarted","Data":"acdacd75e4b6ca59e95e3f974b6cf1be3c81e1bfd4e8cd78b962fb2437dc95e9"} Oct 03 13:06:32 crc kubenswrapper[4962]: I1003 13:06:32.013289 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-xmr88" podStartSLOduration=2.255821016 podStartE2EDuration="4.013256715s" podCreationTimestamp="2025-10-03 13:06:28 +0000 UTC" firstStartedPulling="2025-10-03 13:06:29.628013839 +0000 UTC m=+998.031911674" lastFinishedPulling="2025-10-03 13:06:31.385449538 +0000 UTC m=+999.789347373" observedRunningTime="2025-10-03 13:06:32.006218937 +0000 UTC m=+1000.410116792" watchObservedRunningTime="2025-10-03 13:06:32.013256715 +0000 UTC m=+1000.417154550" Oct 03 13:06:32 crc kubenswrapper[4962]: I1003 13:06:32.028665 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-7wnjl" podStartSLOduration=1.67076732 podStartE2EDuration="4.028629605s" podCreationTimestamp="2025-10-03 13:06:28 +0000 UTC" firstStartedPulling="2025-10-03 13:06:28.947216376 +0000 UTC m=+997.351114211" lastFinishedPulling="2025-10-03 13:06:31.305078621 +0000 UTC m=+999.708976496" observedRunningTime="2025-10-03 13:06:32.025338808 +0000 UTC m=+1000.429236643" watchObservedRunningTime="2025-10-03 13:06:32.028629605 +0000 UTC m=+1000.432527440" Oct 03 13:06:33 crc kubenswrapper[4962]: I1003 13:06:33.008338 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-fzscf" event={"ID":"f87d275d-37cc-496a-bb07-d50b2905a494","Type":"ContainerStarted","Data":"1ba9deb771685701155586311ec1e4ec8af9a5daa5806a2f3116c895b3abc80c"} Oct 03 13:06:33 crc kubenswrapper[4962]: I1003 13:06:33.028729 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-fzscf" podStartSLOduration=2.858559965 podStartE2EDuration="5.028705576s" podCreationTimestamp="2025-10-03 13:06:28 +0000 UTC" firstStartedPulling="2025-10-03 13:06:30.050449122 +0000 UTC m=+998.454346967" 
lastFinishedPulling="2025-10-03 13:06:32.220594743 +0000 UTC m=+1000.624492578" observedRunningTime="2025-10-03 13:06:33.022444469 +0000 UTC m=+1001.426342304" watchObservedRunningTime="2025-10-03 13:06:33.028705576 +0000 UTC m=+1001.432603411" Oct 03 13:06:34 crc kubenswrapper[4962]: I1003 13:06:34.015383 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-rrlzt" event={"ID":"08ef24b1-9d00-4fa5-92e7-b28d3c1795c8","Type":"ContainerStarted","Data":"9a4be584e6aea3fa2908ca7dc3afc6d51457501cf86121c2b9a60a23114b76ed"} Oct 03 13:06:34 crc kubenswrapper[4962]: I1003 13:06:34.035160 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-rrlzt" podStartSLOduration=1.636633379 podStartE2EDuration="6.035141237s" podCreationTimestamp="2025-10-03 13:06:28 +0000 UTC" firstStartedPulling="2025-10-03 13:06:29.067115648 +0000 UTC m=+997.471013483" lastFinishedPulling="2025-10-03 13:06:33.465623506 +0000 UTC m=+1001.869521341" observedRunningTime="2025-10-03 13:06:34.031349056 +0000 UTC m=+1002.435246911" watchObservedRunningTime="2025-10-03 13:06:34.035141237 +0000 UTC m=+1002.439039082" Oct 03 13:06:38 crc kubenswrapper[4962]: I1003 13:06:38.875820 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-7wnjl" Oct 03 13:06:39 crc kubenswrapper[4962]: I1003 13:06:39.215908 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6fd7864499-w5mxx" Oct 03 13:06:39 crc kubenswrapper[4962]: I1003 13:06:39.215967 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6fd7864499-w5mxx" Oct 03 13:06:39 crc kubenswrapper[4962]: I1003 13:06:39.220733 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6fd7864499-w5mxx" Oct 03 13:06:40 crc kubenswrapper[4962]: I1003 13:06:40.056266 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6fd7864499-w5mxx" Oct 03 13:06:40 crc kubenswrapper[4962]: I1003 13:06:40.112497 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-wgl5v"] Oct 03 13:06:49 crc kubenswrapper[4962]: I1003 13:06:49.446988 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-xmr88" Oct 03 13:07:03 crc kubenswrapper[4962]: I1003 13:07:03.453405 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2twpl4"] Oct 03 13:07:03 crc kubenswrapper[4962]: I1003 13:07:03.455137 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2twpl4" Oct 03 13:07:03 crc kubenswrapper[4962]: I1003 13:07:03.457510 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 03 13:07:03 crc kubenswrapper[4962]: I1003 13:07:03.461628 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2twpl4"] Oct 03 13:07:03 crc kubenswrapper[4962]: I1003 13:07:03.580030 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfgv6\" (UniqueName: \"kubernetes.io/projected/eba30427-49e1-469f-8d80-bacb89f94b5a-kube-api-access-tfgv6\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2twpl4\" (UID: \"eba30427-49e1-469f-8d80-bacb89f94b5a\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2twpl4" Oct 03 13:07:03 crc kubenswrapper[4962]: I1003 13:07:03.580101 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eba30427-49e1-469f-8d80-bacb89f94b5a-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2twpl4\" (UID: \"eba30427-49e1-469f-8d80-bacb89f94b5a\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2twpl4" Oct 03 13:07:03 crc kubenswrapper[4962]: I1003 13:07:03.580124 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eba30427-49e1-469f-8d80-bacb89f94b5a-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2twpl4\" (UID: \"eba30427-49e1-469f-8d80-bacb89f94b5a\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2twpl4" Oct 03 13:07:03 crc kubenswrapper[4962]: I1003 13:07:03.680952 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfgv6\" (UniqueName: \"kubernetes.io/projected/eba30427-49e1-469f-8d80-bacb89f94b5a-kube-api-access-tfgv6\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2twpl4\" (UID: \"eba30427-49e1-469f-8d80-bacb89f94b5a\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2twpl4" Oct 03 13:07:03 crc kubenswrapper[4962]: I1003 13:07:03.681013 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eba30427-49e1-469f-8d80-bacb89f94b5a-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2twpl4\" (UID: \"eba30427-49e1-469f-8d80-bacb89f94b5a\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2twpl4" Oct 03 13:07:03 crc kubenswrapper[4962]: I1003 13:07:03.681036 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eba30427-49e1-469f-8d80-bacb89f94b5a-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2twpl4\" (UID: \"eba30427-49e1-469f-8d80-bacb89f94b5a\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2twpl4" Oct 03 13:07:03 crc kubenswrapper[4962]: I1003 13:07:03.681489 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/eba30427-49e1-469f-8d80-bacb89f94b5a-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2twpl4\" (UID: \"eba30427-49e1-469f-8d80-bacb89f94b5a\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2twpl4" Oct 03 13:07:03 crc kubenswrapper[4962]: I1003 13:07:03.681593 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eba30427-49e1-469f-8d80-bacb89f94b5a-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2twpl4\" (UID: \"eba30427-49e1-469f-8d80-bacb89f94b5a\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2twpl4" Oct 03 13:07:03 crc kubenswrapper[4962]: I1003 13:07:03.708730 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfgv6\" (UniqueName: \"kubernetes.io/projected/eba30427-49e1-469f-8d80-bacb89f94b5a-kube-api-access-tfgv6\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2twpl4\" (UID: \"eba30427-49e1-469f-8d80-bacb89f94b5a\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2twpl4" Oct 03 13:07:03 crc kubenswrapper[4962]: I1003 13:07:03.775727 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2twpl4" Oct 03 13:07:03 crc kubenswrapper[4962]: I1003 13:07:03.966233 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2twpl4"] Oct 03 13:07:04 crc kubenswrapper[4962]: I1003 13:07:04.199328 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2twpl4" event={"ID":"eba30427-49e1-469f-8d80-bacb89f94b5a","Type":"ContainerStarted","Data":"584839b0e6b73aaf0bb9bb438e9bde097321535b79adb8dd8c1fa444c7a6fa9c"} Oct 03 13:07:04 crc kubenswrapper[4962]: I1003 13:07:04.199380 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2twpl4" event={"ID":"eba30427-49e1-469f-8d80-bacb89f94b5a","Type":"ContainerStarted","Data":"f70543ccc1f969f5720ef759e9552243c3246528d3e9d729e4bd698e736111e5"} Oct 03 13:07:05 crc kubenswrapper[4962]: I1003 13:07:05.154089 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-wgl5v" podUID="796d19ea-1d92-4dcb-9e10-305ddbe1b283" containerName="console" containerID="cri-o://0e0828f0e2c4ceb4ef205cd46fcd15c3cf85f7e0bdacfdafa36921b76ec186d8" gracePeriod=15 Oct 03 13:07:05 crc kubenswrapper[4962]: I1003 13:07:05.206033 4962 generic.go:334] "Generic (PLEG): container finished" podID="eba30427-49e1-469f-8d80-bacb89f94b5a" containerID="584839b0e6b73aaf0bb9bb438e9bde097321535b79adb8dd8c1fa444c7a6fa9c" exitCode=0 Oct 03 13:07:05 crc kubenswrapper[4962]: I1003 13:07:05.206077 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2twpl4" event={"ID":"eba30427-49e1-469f-8d80-bacb89f94b5a","Type":"ContainerDied","Data":"584839b0e6b73aaf0bb9bb438e9bde097321535b79adb8dd8c1fa444c7a6fa9c"} Oct 03 13:07:05 crc kubenswrapper[4962]: I1003 13:07:05.479872 4962 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-f9d7485db-wgl5v_796d19ea-1d92-4dcb-9e10-305ddbe1b283/console/0.log" Oct 03 13:07:05 crc kubenswrapper[4962]: I1003 13:07:05.480172 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-wgl5v" Oct 03 13:07:05 crc kubenswrapper[4962]: I1003 13:07:05.605220 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mskz\" (UniqueName: \"kubernetes.io/projected/796d19ea-1d92-4dcb-9e10-305ddbe1b283-kube-api-access-7mskz\") pod \"796d19ea-1d92-4dcb-9e10-305ddbe1b283\" (UID: \"796d19ea-1d92-4dcb-9e10-305ddbe1b283\") " Oct 03 13:07:05 crc kubenswrapper[4962]: I1003 13:07:05.605583 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/796d19ea-1d92-4dcb-9e10-305ddbe1b283-console-serving-cert\") pod \"796d19ea-1d92-4dcb-9e10-305ddbe1b283\" (UID: \"796d19ea-1d92-4dcb-9e10-305ddbe1b283\") " Oct 03 13:07:05 crc kubenswrapper[4962]: I1003 13:07:05.606366 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/796d19ea-1d92-4dcb-9e10-305ddbe1b283-console-oauth-config\") pod \"796d19ea-1d92-4dcb-9e10-305ddbe1b283\" (UID: \"796d19ea-1d92-4dcb-9e10-305ddbe1b283\") " Oct 03 13:07:05 crc kubenswrapper[4962]: I1003 13:07:05.606456 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/796d19ea-1d92-4dcb-9e10-305ddbe1b283-trusted-ca-bundle\") pod \"796d19ea-1d92-4dcb-9e10-305ddbe1b283\" (UID: \"796d19ea-1d92-4dcb-9e10-305ddbe1b283\") " Oct 03 13:07:05 crc kubenswrapper[4962]: I1003 13:07:05.606487 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/796d19ea-1d92-4dcb-9e10-305ddbe1b283-console-config\") pod \"796d19ea-1d92-4dcb-9e10-305ddbe1b283\" (UID: \"796d19ea-1d92-4dcb-9e10-305ddbe1b283\") " Oct 03 13:07:05 crc kubenswrapper[4962]: I1003 13:07:05.606515 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/796d19ea-1d92-4dcb-9e10-305ddbe1b283-oauth-serving-cert\") pod \"796d19ea-1d92-4dcb-9e10-305ddbe1b283\" (UID: \"796d19ea-1d92-4dcb-9e10-305ddbe1b283\") " Oct 03 13:07:05 crc kubenswrapper[4962]: I1003 13:07:05.606542 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/796d19ea-1d92-4dcb-9e10-305ddbe1b283-service-ca\") pod \"796d19ea-1d92-4dcb-9e10-305ddbe1b283\" (UID: \"796d19ea-1d92-4dcb-9e10-305ddbe1b283\") " Oct 03 13:07:05 crc kubenswrapper[4962]: I1003 13:07:05.607223 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/796d19ea-1d92-4dcb-9e10-305ddbe1b283-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "796d19ea-1d92-4dcb-9e10-305ddbe1b283" (UID: "796d19ea-1d92-4dcb-9e10-305ddbe1b283"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:07:05 crc kubenswrapper[4962]: I1003 13:07:05.607256 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/796d19ea-1d92-4dcb-9e10-305ddbe1b283-console-config" (OuterVolumeSpecName: "console-config") pod "796d19ea-1d92-4dcb-9e10-305ddbe1b283" (UID: "796d19ea-1d92-4dcb-9e10-305ddbe1b283"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:07:05 crc kubenswrapper[4962]: I1003 13:07:05.607281 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/796d19ea-1d92-4dcb-9e10-305ddbe1b283-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "796d19ea-1d92-4dcb-9e10-305ddbe1b283" (UID: "796d19ea-1d92-4dcb-9e10-305ddbe1b283"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:07:05 crc kubenswrapper[4962]: I1003 13:07:05.607314 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/796d19ea-1d92-4dcb-9e10-305ddbe1b283-service-ca" (OuterVolumeSpecName: "service-ca") pod "796d19ea-1d92-4dcb-9e10-305ddbe1b283" (UID: "796d19ea-1d92-4dcb-9e10-305ddbe1b283"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:07:05 crc kubenswrapper[4962]: I1003 13:07:05.611810 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/796d19ea-1d92-4dcb-9e10-305ddbe1b283-kube-api-access-7mskz" (OuterVolumeSpecName: "kube-api-access-7mskz") pod "796d19ea-1d92-4dcb-9e10-305ddbe1b283" (UID: "796d19ea-1d92-4dcb-9e10-305ddbe1b283"). InnerVolumeSpecName "kube-api-access-7mskz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:07:05 crc kubenswrapper[4962]: I1003 13:07:05.612300 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/796d19ea-1d92-4dcb-9e10-305ddbe1b283-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "796d19ea-1d92-4dcb-9e10-305ddbe1b283" (UID: "796d19ea-1d92-4dcb-9e10-305ddbe1b283"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:07:05 crc kubenswrapper[4962]: I1003 13:07:05.618121 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/796d19ea-1d92-4dcb-9e10-305ddbe1b283-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "796d19ea-1d92-4dcb-9e10-305ddbe1b283" (UID: "796d19ea-1d92-4dcb-9e10-305ddbe1b283"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:07:05 crc kubenswrapper[4962]: I1003 13:07:05.708011 4962 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/796d19ea-1d92-4dcb-9e10-305ddbe1b283-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:07:05 crc kubenswrapper[4962]: I1003 13:07:05.708049 4962 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/796d19ea-1d92-4dcb-9e10-305ddbe1b283-console-config\") on node \"crc\" DevicePath \"\"" Oct 03 13:07:05 crc kubenswrapper[4962]: I1003 13:07:05.708058 4962 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/796d19ea-1d92-4dcb-9e10-305ddbe1b283-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 13:07:05 crc kubenswrapper[4962]: I1003 13:07:05.708065 4962 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/796d19ea-1d92-4dcb-9e10-305ddbe1b283-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 13:07:05 crc kubenswrapper[4962]: I1003 13:07:05.708074 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mskz\" (UniqueName: \"kubernetes.io/projected/796d19ea-1d92-4dcb-9e10-305ddbe1b283-kube-api-access-7mskz\") on node \"crc\" DevicePath \"\"" Oct 03 13:07:05 crc kubenswrapper[4962]: I1003 13:07:05.708084 4962 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/796d19ea-1d92-4dcb-9e10-305ddbe1b283-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 13:07:05 crc kubenswrapper[4962]: I1003 13:07:05.708092 4962 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/796d19ea-1d92-4dcb-9e10-305ddbe1b283-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 03 13:07:06 crc kubenswrapper[4962]: I1003 13:07:06.212802 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-wgl5v_796d19ea-1d92-4dcb-9e10-305ddbe1b283/console/0.log" Oct 03 13:07:06 crc kubenswrapper[4962]: I1003 13:07:06.212862 4962 generic.go:334] "Generic (PLEG): container finished" podID="796d19ea-1d92-4dcb-9e10-305ddbe1b283" containerID="0e0828f0e2c4ceb4ef205cd46fcd15c3cf85f7e0bdacfdafa36921b76ec186d8" exitCode=2 Oct 03 13:07:06 crc kubenswrapper[4962]: I1003 13:07:06.212899 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-wgl5v" event={"ID":"796d19ea-1d92-4dcb-9e10-305ddbe1b283","Type":"ContainerDied","Data":"0e0828f0e2c4ceb4ef205cd46fcd15c3cf85f7e0bdacfdafa36921b76ec186d8"} Oct 03 13:07:06 crc kubenswrapper[4962]: I1003 13:07:06.212939 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-wgl5v" event={"ID":"796d19ea-1d92-4dcb-9e10-305ddbe1b283","Type":"ContainerDied","Data":"b39998ea4c7a9f15afd415f99d1994c5c178eb3bce843b09b94dc9f71581ed88"} Oct 03 13:07:06 crc kubenswrapper[4962]: I1003 13:07:06.212949 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-wgl5v" Oct 03 13:07:06 crc kubenswrapper[4962]: I1003 13:07:06.212963 4962 scope.go:117] "RemoveContainer" containerID="0e0828f0e2c4ceb4ef205cd46fcd15c3cf85f7e0bdacfdafa36921b76ec186d8" Oct 03 13:07:06 crc kubenswrapper[4962]: I1003 13:07:06.230994 4962 scope.go:117] "RemoveContainer" containerID="0e0828f0e2c4ceb4ef205cd46fcd15c3cf85f7e0bdacfdafa36921b76ec186d8" Oct 03 13:07:06 crc kubenswrapper[4962]: E1003 13:07:06.231724 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e0828f0e2c4ceb4ef205cd46fcd15c3cf85f7e0bdacfdafa36921b76ec186d8\": container with ID starting with 0e0828f0e2c4ceb4ef205cd46fcd15c3cf85f7e0bdacfdafa36921b76ec186d8 not found: ID does not exist" containerID="0e0828f0e2c4ceb4ef205cd46fcd15c3cf85f7e0bdacfdafa36921b76ec186d8" Oct 03 13:07:06 crc kubenswrapper[4962]: I1003 13:07:06.231866 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e0828f0e2c4ceb4ef205cd46fcd15c3cf85f7e0bdacfdafa36921b76ec186d8"} err="failed to get container status \"0e0828f0e2c4ceb4ef205cd46fcd15c3cf85f7e0bdacfdafa36921b76ec186d8\": rpc error: code = NotFound desc = could not find container \"0e0828f0e2c4ceb4ef205cd46fcd15c3cf85f7e0bdacfdafa36921b76ec186d8\": container with ID starting with 0e0828f0e2c4ceb4ef205cd46fcd15c3cf85f7e0bdacfdafa36921b76ec186d8 not found: ID does not exist" Oct 03 13:07:06 crc kubenswrapper[4962]: I1003 13:07:06.245423 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-wgl5v"] Oct 03 13:07:06 crc kubenswrapper[4962]: I1003 13:07:06.247009 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-wgl5v"] Oct 03 13:07:08 crc kubenswrapper[4962]: I1003 13:07:08.229817 4962 generic.go:334] "Generic (PLEG): container finished" podID="eba30427-49e1-469f-8d80-bacb89f94b5a" containerID="71572c2fb69338e84b1543e4f6c28c9638b1f238b583bdfb5bdbcc09ca682607" exitCode=0 Oct 03 13:07:08 crc kubenswrapper[4962]: I1003 13:07:08.233471 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="796d19ea-1d92-4dcb-9e10-305ddbe1b283" path="/var/lib/kubelet/pods/796d19ea-1d92-4dcb-9e10-305ddbe1b283/volumes" Oct 03 13:07:08 crc kubenswrapper[4962]: I1003 13:07:08.234124 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2twpl4" event={"ID":"eba30427-49e1-469f-8d80-bacb89f94b5a","Type":"ContainerDied","Data":"71572c2fb69338e84b1543e4f6c28c9638b1f238b583bdfb5bdbcc09ca682607"} Oct 03 13:07:09 crc kubenswrapper[4962]: I1003 13:07:09.237838 4962 generic.go:334] "Generic (PLEG): container finished" podID="eba30427-49e1-469f-8d80-bacb89f94b5a" containerID="c18bf275298bd7430ff0d6f9493bcce985ca3b6e712692d6ff3791bfec8bc2e6" exitCode=0 Oct 03 13:07:09 crc kubenswrapper[4962]: I1003 13:07:09.237962 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2twpl4" event={"ID":"eba30427-49e1-469f-8d80-bacb89f94b5a","Type":"ContainerDied","Data":"c18bf275298bd7430ff0d6f9493bcce985ca3b6e712692d6ff3791bfec8bc2e6"} Oct 03 13:07:10 crc kubenswrapper[4962]: I1003 13:07:10.479036 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2twpl4" Oct 03 13:07:10 crc kubenswrapper[4962]: I1003 13:07:10.562122 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eba30427-49e1-469f-8d80-bacb89f94b5a-bundle\") pod \"eba30427-49e1-469f-8d80-bacb89f94b5a\" (UID: \"eba30427-49e1-469f-8d80-bacb89f94b5a\") " Oct 03 13:07:10 crc kubenswrapper[4962]: I1003 13:07:10.562231 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eba30427-49e1-469f-8d80-bacb89f94b5a-util\") pod \"eba30427-49e1-469f-8d80-bacb89f94b5a\" (UID: \"eba30427-49e1-469f-8d80-bacb89f94b5a\") " Oct 03 13:07:10 crc kubenswrapper[4962]: I1003 13:07:10.562259 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfgv6\" (UniqueName: \"kubernetes.io/projected/eba30427-49e1-469f-8d80-bacb89f94b5a-kube-api-access-tfgv6\") pod \"eba30427-49e1-469f-8d80-bacb89f94b5a\" (UID: \"eba30427-49e1-469f-8d80-bacb89f94b5a\") " Oct 03 13:07:10 crc kubenswrapper[4962]: I1003 13:07:10.563437 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eba30427-49e1-469f-8d80-bacb89f94b5a-bundle" (OuterVolumeSpecName: "bundle") pod "eba30427-49e1-469f-8d80-bacb89f94b5a" (UID: "eba30427-49e1-469f-8d80-bacb89f94b5a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:07:10 crc kubenswrapper[4962]: I1003 13:07:10.567628 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eba30427-49e1-469f-8d80-bacb89f94b5a-kube-api-access-tfgv6" (OuterVolumeSpecName: "kube-api-access-tfgv6") pod "eba30427-49e1-469f-8d80-bacb89f94b5a" (UID: "eba30427-49e1-469f-8d80-bacb89f94b5a"). InnerVolumeSpecName "kube-api-access-tfgv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:07:10 crc kubenswrapper[4962]: I1003 13:07:10.572524 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eba30427-49e1-469f-8d80-bacb89f94b5a-util" (OuterVolumeSpecName: "util") pod "eba30427-49e1-469f-8d80-bacb89f94b5a" (UID: "eba30427-49e1-469f-8d80-bacb89f94b5a"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:07:10 crc kubenswrapper[4962]: I1003 13:07:10.663861 4962 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eba30427-49e1-469f-8d80-bacb89f94b5a-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:07:10 crc kubenswrapper[4962]: I1003 13:07:10.663924 4962 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eba30427-49e1-469f-8d80-bacb89f94b5a-util\") on node \"crc\" DevicePath \"\"" Oct 03 13:07:10 crc kubenswrapper[4962]: I1003 13:07:10.663934 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfgv6\" (UniqueName: \"kubernetes.io/projected/eba30427-49e1-469f-8d80-bacb89f94b5a-kube-api-access-tfgv6\") on node \"crc\" DevicePath \"\"" Oct 03 13:07:11 crc kubenswrapper[4962]: I1003 13:07:11.248131 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2twpl4" event={"ID":"eba30427-49e1-469f-8d80-bacb89f94b5a","Type":"ContainerDied","Data":"f70543ccc1f969f5720ef759e9552243c3246528d3e9d729e4bd698e736111e5"} Oct 03 13:07:11 crc kubenswrapper[4962]: I1003 13:07:11.248167 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f70543ccc1f969f5720ef759e9552243c3246528d3e9d729e4bd698e736111e5" Oct 03 13:07:11 crc kubenswrapper[4962]: I1003 13:07:11.248249 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2twpl4" Oct 03 13:07:21 crc kubenswrapper[4962]: I1003 13:07:21.083294 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6b44475b5f-pk5nl"] Oct 03 13:07:21 crc kubenswrapper[4962]: E1003 13:07:21.084098 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="796d19ea-1d92-4dcb-9e10-305ddbe1b283" containerName="console" Oct 03 13:07:21 crc kubenswrapper[4962]: I1003 13:07:21.084109 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="796d19ea-1d92-4dcb-9e10-305ddbe1b283" containerName="console" Oct 03 13:07:21 crc kubenswrapper[4962]: E1003 13:07:21.084128 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eba30427-49e1-469f-8d80-bacb89f94b5a" containerName="pull" Oct 03 13:07:21 crc kubenswrapper[4962]: I1003 13:07:21.084133 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="eba30427-49e1-469f-8d80-bacb89f94b5a" containerName="pull" Oct 03 13:07:21 crc kubenswrapper[4962]: E1003 13:07:21.084143 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eba30427-49e1-469f-8d80-bacb89f94b5a" containerName="extract" Oct 03 13:07:21 crc kubenswrapper[4962]: I1003 13:07:21.084149 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="eba30427-49e1-469f-8d80-bacb89f94b5a" containerName="extract" Oct 03 13:07:21 crc kubenswrapper[4962]: E1003 13:07:21.084158 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eba30427-49e1-469f-8d80-bacb89f94b5a" containerName="util" Oct 03 13:07:21 crc kubenswrapper[4962]: I1003 13:07:21.084164 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="eba30427-49e1-469f-8d80-bacb89f94b5a" containerName="util" Oct 03 13:07:21 crc kubenswrapper[4962]: I1003 13:07:21.084254 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="796d19ea-1d92-4dcb-9e10-305ddbe1b283" containerName="console" Oct 
03 13:07:21 crc kubenswrapper[4962]: I1003 13:07:21.084267 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="eba30427-49e1-469f-8d80-bacb89f94b5a" containerName="extract" Oct 03 13:07:21 crc kubenswrapper[4962]: I1003 13:07:21.084662 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6b44475b5f-pk5nl" Oct 03 13:07:21 crc kubenswrapper[4962]: I1003 13:07:21.087283 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 03 13:07:21 crc kubenswrapper[4962]: I1003 13:07:21.087329 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 03 13:07:21 crc kubenswrapper[4962]: I1003 13:07:21.089473 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 03 13:07:21 crc kubenswrapper[4962]: I1003 13:07:21.089596 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-89kds" Oct 03 13:07:21 crc kubenswrapper[4962]: I1003 13:07:21.089801 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 03 13:07:21 crc kubenswrapper[4962]: I1003 13:07:21.105274 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6b44475b5f-pk5nl"] Oct 03 13:07:21 crc kubenswrapper[4962]: I1003 13:07:21.188609 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/513e7194-e391-4d8d-bb6a-520346bc5aa1-apiservice-cert\") pod \"metallb-operator-controller-manager-6b44475b5f-pk5nl\" (UID: \"513e7194-e391-4d8d-bb6a-520346bc5aa1\") " pod="metallb-system/metallb-operator-controller-manager-6b44475b5f-pk5nl" Oct 03 13:07:21 crc kubenswrapper[4962]: I1003 13:07:21.188697 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmzpf\" (UniqueName: \"kubernetes.io/projected/513e7194-e391-4d8d-bb6a-520346bc5aa1-kube-api-access-pmzpf\") pod \"metallb-operator-controller-manager-6b44475b5f-pk5nl\" (UID: \"513e7194-e391-4d8d-bb6a-520346bc5aa1\") " pod="metallb-system/metallb-operator-controller-manager-6b44475b5f-pk5nl" Oct 03 13:07:21 crc kubenswrapper[4962]: I1003 13:07:21.188744 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/513e7194-e391-4d8d-bb6a-520346bc5aa1-webhook-cert\") pod \"metallb-operator-controller-manager-6b44475b5f-pk5nl\" (UID: \"513e7194-e391-4d8d-bb6a-520346bc5aa1\") " pod="metallb-system/metallb-operator-controller-manager-6b44475b5f-pk5nl" Oct 03 13:07:21 crc kubenswrapper[4962]: I1003 13:07:21.290006 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/513e7194-e391-4d8d-bb6a-520346bc5aa1-webhook-cert\") pod \"metallb-operator-controller-manager-6b44475b5f-pk5nl\" (UID: \"513e7194-e391-4d8d-bb6a-520346bc5aa1\") " pod="metallb-system/metallb-operator-controller-manager-6b44475b5f-pk5nl" Oct 03 13:07:21 crc kubenswrapper[4962]: I1003 13:07:21.290109 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/513e7194-e391-4d8d-bb6a-520346bc5aa1-apiservice-cert\") pod \"metallb-operator-controller-manager-6b44475b5f-pk5nl\" (UID: \"513e7194-e391-4d8d-bb6a-520346bc5aa1\") " pod="metallb-system/metallb-operator-controller-manager-6b44475b5f-pk5nl" Oct 03 13:07:21 crc kubenswrapper[4962]: I1003 13:07:21.290170 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmzpf\" (UniqueName: \"kubernetes.io/projected/513e7194-e391-4d8d-bb6a-520346bc5aa1-kube-api-access-pmzpf\") pod \"metallb-operator-controller-manager-6b44475b5f-pk5nl\" (UID: \"513e7194-e391-4d8d-bb6a-520346bc5aa1\") " pod="metallb-system/metallb-operator-controller-manager-6b44475b5f-pk5nl" Oct 03 13:07:21 crc kubenswrapper[4962]: I1003 13:07:21.298606 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/513e7194-e391-4d8d-bb6a-520346bc5aa1-webhook-cert\") pod \"metallb-operator-controller-manager-6b44475b5f-pk5nl\" (UID: \"513e7194-e391-4d8d-bb6a-520346bc5aa1\") " pod="metallb-system/metallb-operator-controller-manager-6b44475b5f-pk5nl" Oct 03 13:07:21 crc kubenswrapper[4962]: I1003 13:07:21.298769 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/513e7194-e391-4d8d-bb6a-520346bc5aa1-apiservice-cert\") pod \"metallb-operator-controller-manager-6b44475b5f-pk5nl\" (UID: \"513e7194-e391-4d8d-bb6a-520346bc5aa1\") " pod="metallb-system/metallb-operator-controller-manager-6b44475b5f-pk5nl" Oct 03 13:07:21 crc kubenswrapper[4962]: I1003 13:07:21.309271 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmzpf\" (UniqueName: \"kubernetes.io/projected/513e7194-e391-4d8d-bb6a-520346bc5aa1-kube-api-access-pmzpf\") pod \"metallb-operator-controller-manager-6b44475b5f-pk5nl\" (UID: \"513e7194-e391-4d8d-bb6a-520346bc5aa1\") " pod="metallb-system/metallb-operator-controller-manager-6b44475b5f-pk5nl" Oct 03 13:07:21 crc kubenswrapper[4962]: I1003 13:07:21.329927 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-69d7579867-lwlnc"] Oct 03 13:07:21 crc kubenswrapper[4962]: I1003 13:07:21.330934 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-69d7579867-lwlnc" Oct 03 13:07:21 crc kubenswrapper[4962]: I1003 13:07:21.332656 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 03 13:07:21 crc kubenswrapper[4962]: I1003 13:07:21.332868 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-k894l" Oct 03 13:07:21 crc kubenswrapper[4962]: I1003 13:07:21.332931 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 03 13:07:21 crc kubenswrapper[4962]: I1003 13:07:21.343789 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-69d7579867-lwlnc"] Oct 03 13:07:21 crc kubenswrapper[4962]: I1003 13:07:21.393397 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c9f2a4e9-25bc-4236-a860-96878c0dddd3-apiservice-cert\") pod \"metallb-operator-webhook-server-69d7579867-lwlnc\" (UID: \"c9f2a4e9-25bc-4236-a860-96878c0dddd3\") " pod="metallb-system/metallb-operator-webhook-server-69d7579867-lwlnc" Oct 03 13:07:21 crc kubenswrapper[4962]: I1003 13:07:21.393479 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clgn8\" (UniqueName: \"kubernetes.io/projected/c9f2a4e9-25bc-4236-a860-96878c0dddd3-kube-api-access-clgn8\") pod \"metallb-operator-webhook-server-69d7579867-lwlnc\" (UID: \"c9f2a4e9-25bc-4236-a860-96878c0dddd3\") " pod="metallb-system/metallb-operator-webhook-server-69d7579867-lwlnc" Oct 03 13:07:21 crc kubenswrapper[4962]: I1003 13:07:21.393533 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c9f2a4e9-25bc-4236-a860-96878c0dddd3-webhook-cert\") pod \"metallb-operator-webhook-server-69d7579867-lwlnc\" (UID: \"c9f2a4e9-25bc-4236-a860-96878c0dddd3\") " pod="metallb-system/metallb-operator-webhook-server-69d7579867-lwlnc" Oct 03 13:07:21 crc kubenswrapper[4962]: I1003 13:07:21.403775 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6b44475b5f-pk5nl" Oct 03 13:07:21 crc kubenswrapper[4962]: I1003 13:07:21.495032 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c9f2a4e9-25bc-4236-a860-96878c0dddd3-apiservice-cert\") pod \"metallb-operator-webhook-server-69d7579867-lwlnc\" (UID: \"c9f2a4e9-25bc-4236-a860-96878c0dddd3\") " pod="metallb-system/metallb-operator-webhook-server-69d7579867-lwlnc" Oct 03 13:07:21 crc kubenswrapper[4962]: I1003 13:07:21.495105 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clgn8\" (UniqueName: \"kubernetes.io/projected/c9f2a4e9-25bc-4236-a860-96878c0dddd3-kube-api-access-clgn8\") pod \"metallb-operator-webhook-server-69d7579867-lwlnc\" (UID: \"c9f2a4e9-25bc-4236-a860-96878c0dddd3\") " pod="metallb-system/metallb-operator-webhook-server-69d7579867-lwlnc" Oct 03 13:07:21 crc kubenswrapper[4962]: I1003 13:07:21.495144 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c9f2a4e9-25bc-4236-a860-96878c0dddd3-webhook-cert\") pod \"metallb-operator-webhook-server-69d7579867-lwlnc\" (UID: \"c9f2a4e9-25bc-4236-a860-96878c0dddd3\") " pod="metallb-system/metallb-operator-webhook-server-69d7579867-lwlnc" Oct 03 13:07:21 crc kubenswrapper[4962]: I1003 13:07:21.503114 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c9f2a4e9-25bc-4236-a860-96878c0dddd3-apiservice-cert\") pod \"metallb-operator-webhook-server-69d7579867-lwlnc\" (UID: \"c9f2a4e9-25bc-4236-a860-96878c0dddd3\") " pod="metallb-system/metallb-operator-webhook-server-69d7579867-lwlnc" Oct 03 13:07:21 crc kubenswrapper[4962]: I1003 13:07:21.511622 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c9f2a4e9-25bc-4236-a860-96878c0dddd3-webhook-cert\") pod \"metallb-operator-webhook-server-69d7579867-lwlnc\" (UID: \"c9f2a4e9-25bc-4236-a860-96878c0dddd3\") " pod="metallb-system/metallb-operator-webhook-server-69d7579867-lwlnc" Oct 03 13:07:21 crc kubenswrapper[4962]: I1003 13:07:21.523735 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clgn8\" (UniqueName: \"kubernetes.io/projected/c9f2a4e9-25bc-4236-a860-96878c0dddd3-kube-api-access-clgn8\") pod \"metallb-operator-webhook-server-69d7579867-lwlnc\" (UID: \"c9f2a4e9-25bc-4236-a860-96878c0dddd3\") " pod="metallb-system/metallb-operator-webhook-server-69d7579867-lwlnc" Oct 03 13:07:21 crc kubenswrapper[4962]: I1003 13:07:21.654520 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-69d7579867-lwlnc" Oct 03 13:07:21 crc kubenswrapper[4962]: I1003 13:07:21.740001 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6b44475b5f-pk5nl"] Oct 03 13:07:21 crc kubenswrapper[4962]: W1003 13:07:21.760524 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod513e7194_e391_4d8d_bb6a_520346bc5aa1.slice/crio-f9f31deaf14a74a648fe8dd348469960477a358e20910e7d615ffa7ef2a66fe3 WatchSource:0}: Error finding container f9f31deaf14a74a648fe8dd348469960477a358e20910e7d615ffa7ef2a66fe3: Status 404 returned error can't find the container with id f9f31deaf14a74a648fe8dd348469960477a358e20910e7d615ffa7ef2a66fe3 Oct 03 13:07:21 crc kubenswrapper[4962]: I1003 13:07:21.761157 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 13:07:21 crc kubenswrapper[4962]: I1003 13:07:21.882334 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-69d7579867-lwlnc"] Oct 03 13:07:21 crc kubenswrapper[4962]: W1003 13:07:21.891404 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9f2a4e9_25bc_4236_a860_96878c0dddd3.slice/crio-3c4f23ab9e2f3d4a34a4bd5ca7ea07609ea6bda50588880b26c4882ff275eeaf WatchSource:0}: Error finding container 3c4f23ab9e2f3d4a34a4bd5ca7ea07609ea6bda50588880b26c4882ff275eeaf: Status 404 returned error can't find the container with id 3c4f23ab9e2f3d4a34a4bd5ca7ea07609ea6bda50588880b26c4882ff275eeaf Oct 03 13:07:22 crc kubenswrapper[4962]: I1003 13:07:22.317434 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6b44475b5f-pk5nl" event={"ID":"513e7194-e391-4d8d-bb6a-520346bc5aa1","Type":"ContainerStarted","Data":"f9f31deaf14a74a648fe8dd348469960477a358e20910e7d615ffa7ef2a66fe3"} Oct 03 13:07:22 crc kubenswrapper[4962]: I1003 13:07:22.324803 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-69d7579867-lwlnc" event={"ID":"c9f2a4e9-25bc-4236-a860-96878c0dddd3","Type":"ContainerStarted","Data":"3c4f23ab9e2f3d4a34a4bd5ca7ea07609ea6bda50588880b26c4882ff275eeaf"} Oct 03 13:07:24 crc kubenswrapper[4962]: I1003 13:07:24.660338 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 13:07:24 crc kubenswrapper[4962]: I1003 13:07:24.660706 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 13:07:26 crc kubenswrapper[4962]: I1003 13:07:26.345464 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6b44475b5f-pk5nl" event={"ID":"513e7194-e391-4d8d-bb6a-520346bc5aa1","Type":"ContainerStarted","Data":"eafd4996f8ea99eeb8fc0cf9880b817cae0687dfaa2bafb1fafb1641a1209d62"} Oct 03 13:07:26 crc kubenswrapper[4962]: I1003 
13:07:26.345892 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6b44475b5f-pk5nl" Oct 03 13:07:26 crc kubenswrapper[4962]: I1003 13:07:26.363750 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6b44475b5f-pk5nl" podStartSLOduration=0.957435207 podStartE2EDuration="5.363728971s" podCreationTimestamp="2025-10-03 13:07:21 +0000 UTC" firstStartedPulling="2025-10-03 13:07:21.760946849 +0000 UTC m=+1050.164844684" lastFinishedPulling="2025-10-03 13:07:26.167240623 +0000 UTC m=+1054.571138448" observedRunningTime="2025-10-03 13:07:26.36109519 +0000 UTC m=+1054.764993025" watchObservedRunningTime="2025-10-03 13:07:26.363728971 +0000 UTC m=+1054.767626816" Oct 03 13:07:37 crc kubenswrapper[4962]: I1003 13:07:37.419460 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-69d7579867-lwlnc" event={"ID":"c9f2a4e9-25bc-4236-a860-96878c0dddd3","Type":"ContainerStarted","Data":"44997f910b0f3359fcbd0168b3a83946a7f76aaed850eacf0c3e76763b96f4f1"} Oct 03 13:07:37 crc kubenswrapper[4962]: I1003 13:07:37.420225 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-69d7579867-lwlnc" Oct 03 13:07:51 crc kubenswrapper[4962]: I1003 13:07:51.687277 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-69d7579867-lwlnc" Oct 03 13:07:51 crc kubenswrapper[4962]: I1003 13:07:51.711126 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-69d7579867-lwlnc" podStartSLOduration=15.403091507 podStartE2EDuration="30.711109489s" podCreationTimestamp="2025-10-03 13:07:21 +0000 UTC" firstStartedPulling="2025-10-03 13:07:21.89467959 +0000 UTC m=+1050.298577425" lastFinishedPulling="2025-10-03 13:07:37.202697572 +0000 UTC m=+1065.606595407" observedRunningTime="2025-10-03 13:07:37.460360234 +0000 UTC m=+1065.864258069" watchObservedRunningTime="2025-10-03 13:07:51.711109489 +0000 UTC m=+1080.115007324" Oct 03 13:07:54 crc kubenswrapper[4962]: I1003 13:07:54.660257 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 13:07:54 crc kubenswrapper[4962]: I1003 13:07:54.660676 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 13:08:01 crc kubenswrapper[4962]: I1003 13:08:01.411084 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6b44475b5f-pk5nl" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.146854 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-ftmjr"] Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.149969 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-ftmjr" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.162077 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.163596 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.163939 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-b567z" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.179483 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-7t4l5"] Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.180378 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-7t4l5" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.182897 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.205832 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-7t4l5"] Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.209209 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp5jv\" (UniqueName: \"kubernetes.io/projected/ae7d2439-9f77-42b9-8b22-ca0980f9650d-kube-api-access-pp5jv\") pod \"frr-k8s-ftmjr\" (UID: \"ae7d2439-9f77-42b9-8b22-ca0980f9650d\") " pod="metallb-system/frr-k8s-ftmjr" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.209277 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ae7d2439-9f77-42b9-8b22-ca0980f9650d-frr-startup\") pod \"frr-k8s-ftmjr\" (UID: \"ae7d2439-9f77-42b9-8b22-ca0980f9650d\") " pod="metallb-system/frr-k8s-ftmjr" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.209360 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ae7d2439-9f77-42b9-8b22-ca0980f9650d-metrics\") pod \"frr-k8s-ftmjr\" (UID: \"ae7d2439-9f77-42b9-8b22-ca0980f9650d\") " pod="metallb-system/frr-k8s-ftmjr" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.209406 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae7d2439-9f77-42b9-8b22-ca0980f9650d-metrics-certs\") pod \"frr-k8s-ftmjr\" (UID: \"ae7d2439-9f77-42b9-8b22-ca0980f9650d\") " pod="metallb-system/frr-k8s-ftmjr" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.209431 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ae7d2439-9f77-42b9-8b22-ca0980f9650d-frr-sockets\") pod \"frr-k8s-ftmjr\" (UID: \"ae7d2439-9f77-42b9-8b22-ca0980f9650d\") " pod="metallb-system/frr-k8s-ftmjr" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.209460 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ae7d2439-9f77-42b9-8b22-ca0980f9650d-reloader\") pod \"frr-k8s-ftmjr\" (UID: \"ae7d2439-9f77-42b9-8b22-ca0980f9650d\") " 
pod="metallb-system/frr-k8s-ftmjr" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.209547 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ae7d2439-9f77-42b9-8b22-ca0980f9650d-frr-conf\") pod \"frr-k8s-ftmjr\" (UID: \"ae7d2439-9f77-42b9-8b22-ca0980f9650d\") " pod="metallb-system/frr-k8s-ftmjr" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.209585 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa1a4478-7484-4891-9130-3454b87cb614-cert\") pod \"frr-k8s-webhook-server-64bf5d555-7t4l5\" (UID: \"aa1a4478-7484-4891-9130-3454b87cb614\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-7t4l5" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.209623 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk94f\" (UniqueName: \"kubernetes.io/projected/aa1a4478-7484-4891-9130-3454b87cb614-kube-api-access-kk94f\") pod \"frr-k8s-webhook-server-64bf5d555-7t4l5\" (UID: \"aa1a4478-7484-4891-9130-3454b87cb614\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-7t4l5" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.310433 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pp5jv\" (UniqueName: \"kubernetes.io/projected/ae7d2439-9f77-42b9-8b22-ca0980f9650d-kube-api-access-pp5jv\") pod \"frr-k8s-ftmjr\" (UID: \"ae7d2439-9f77-42b9-8b22-ca0980f9650d\") " pod="metallb-system/frr-k8s-ftmjr" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.310492 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ae7d2439-9f77-42b9-8b22-ca0980f9650d-frr-startup\") pod \"frr-k8s-ftmjr\" (UID: \"ae7d2439-9f77-42b9-8b22-ca0980f9650d\") " pod="metallb-system/frr-k8s-ftmjr" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.310560 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ae7d2439-9f77-42b9-8b22-ca0980f9650d-metrics\") pod \"frr-k8s-ftmjr\" (UID: \"ae7d2439-9f77-42b9-8b22-ca0980f9650d\") " pod="metallb-system/frr-k8s-ftmjr" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.310583 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae7d2439-9f77-42b9-8b22-ca0980f9650d-metrics-certs\") pod \"frr-k8s-ftmjr\" (UID: \"ae7d2439-9f77-42b9-8b22-ca0980f9650d\") " pod="metallb-system/frr-k8s-ftmjr" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.310603 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ae7d2439-9f77-42b9-8b22-ca0980f9650d-frr-sockets\") pod \"frr-k8s-ftmjr\" (UID: \"ae7d2439-9f77-42b9-8b22-ca0980f9650d\") " pod="metallb-system/frr-k8s-ftmjr" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.310623 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ae7d2439-9f77-42b9-8b22-ca0980f9650d-reloader\") pod \"frr-k8s-ftmjr\" (UID: \"ae7d2439-9f77-42b9-8b22-ca0980f9650d\") " pod="metallb-system/frr-k8s-ftmjr" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.310670 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ae7d2439-9f77-42b9-8b22-ca0980f9650d-frr-conf\") pod \"frr-k8s-ftmjr\" (UID: \"ae7d2439-9f77-42b9-8b22-ca0980f9650d\") " pod="metallb-system/frr-k8s-ftmjr" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.310687 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa1a4478-7484-4891-9130-3454b87cb614-cert\") pod \"frr-k8s-webhook-server-64bf5d555-7t4l5\" (UID: \"aa1a4478-7484-4891-9130-3454b87cb614\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-7t4l5" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.310708 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kk94f\" (UniqueName: \"kubernetes.io/projected/aa1a4478-7484-4891-9130-3454b87cb614-kube-api-access-kk94f\") pod \"frr-k8s-webhook-server-64bf5d555-7t4l5\" (UID: \"aa1a4478-7484-4891-9130-3454b87cb614\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-7t4l5" Oct 03 13:08:02 crc kubenswrapper[4962]: E1003 13:08:02.310750 4962 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Oct 03 13:08:02 crc kubenswrapper[4962]: E1003 13:08:02.310829 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae7d2439-9f77-42b9-8b22-ca0980f9650d-metrics-certs podName:ae7d2439-9f77-42b9-8b22-ca0980f9650d nodeName:}" failed. No retries permitted until 2025-10-03 13:08:02.810804241 +0000 UTC m=+1091.214702076 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ae7d2439-9f77-42b9-8b22-ca0980f9650d-metrics-certs") pod "frr-k8s-ftmjr" (UID: "ae7d2439-9f77-42b9-8b22-ca0980f9650d") : secret "frr-k8s-certs-secret" not found Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.311181 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ae7d2439-9f77-42b9-8b22-ca0980f9650d-frr-conf\") pod \"frr-k8s-ftmjr\" (UID: \"ae7d2439-9f77-42b9-8b22-ca0980f9650d\") " pod="metallb-system/frr-k8s-ftmjr" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.311263 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ae7d2439-9f77-42b9-8b22-ca0980f9650d-reloader\") pod \"frr-k8s-ftmjr\" (UID: \"ae7d2439-9f77-42b9-8b22-ca0980f9650d\") " pod="metallb-system/frr-k8s-ftmjr" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.311300 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ae7d2439-9f77-42b9-8b22-ca0980f9650d-metrics\") pod \"frr-k8s-ftmjr\" (UID: \"ae7d2439-9f77-42b9-8b22-ca0980f9650d\") " pod="metallb-system/frr-k8s-ftmjr" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.311390 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ae7d2439-9f77-42b9-8b22-ca0980f9650d-frr-sockets\") pod \"frr-k8s-ftmjr\" (UID: \"ae7d2439-9f77-42b9-8b22-ca0980f9650d\") " pod="metallb-system/frr-k8s-ftmjr" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.311760 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ae7d2439-9f77-42b9-8b22-ca0980f9650d-frr-startup\") pod \"frr-k8s-ftmjr\" (UID: \"ae7d2439-9f77-42b9-8b22-ca0980f9650d\") " 
pod="metallb-system/frr-k8s-ftmjr" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.333747 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa1a4478-7484-4891-9130-3454b87cb614-cert\") pod \"frr-k8s-webhook-server-64bf5d555-7t4l5\" (UID: \"aa1a4478-7484-4891-9130-3454b87cb614\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-7t4l5" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.351005 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-t5hhv"] Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.351839 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-t5hhv" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.357003 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk94f\" (UniqueName: \"kubernetes.io/projected/aa1a4478-7484-4891-9130-3454b87cb614-kube-api-access-kk94f\") pod \"frr-k8s-webhook-server-64bf5d555-7t4l5\" (UID: \"aa1a4478-7484-4891-9130-3454b87cb614\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-7t4l5" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.358378 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.360934 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.361213 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.364250 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-g62lr" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.365350 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp5jv\" (UniqueName: \"kubernetes.io/projected/ae7d2439-9f77-42b9-8b22-ca0980f9650d-kube-api-access-pp5jv\") pod \"frr-k8s-ftmjr\" (UID: \"ae7d2439-9f77-42b9-8b22-ca0980f9650d\") " pod="metallb-system/frr-k8s-ftmjr" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.372223 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-n7d8j"] Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.373293 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-n7d8j" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.377248 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.395612 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-n7d8j"] Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.411372 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b4f8f5c2-fdb9-4905-8173-c6c709d8565f-memberlist\") pod \"speaker-t5hhv\" (UID: \"b4f8f5c2-fdb9-4905-8173-c6c709d8565f\") " pod="metallb-system/speaker-t5hhv" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.411430 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/25aa504c-2ffc-462d-b138-c89c0f3083ce-cert\") pod \"controller-68d546b9d8-n7d8j\" (UID: \"25aa504c-2ffc-462d-b138-c89c0f3083ce\") " pod="metallb-system/controller-68d546b9d8-n7d8j" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.411510 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/25aa504c-2ffc-462d-b138-c89c0f3083ce-metrics-certs\") pod \"controller-68d546b9d8-n7d8j\" (UID: \"25aa504c-2ffc-462d-b138-c89c0f3083ce\") " pod="metallb-system/controller-68d546b9d8-n7d8j" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.411540 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-642v7\" (UniqueName: \"kubernetes.io/projected/25aa504c-2ffc-462d-b138-c89c0f3083ce-kube-api-access-642v7\") pod \"controller-68d546b9d8-n7d8j\" (UID: \"25aa504c-2ffc-462d-b138-c89c0f3083ce\") " pod="metallb-system/controller-68d546b9d8-n7d8j" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.411596 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4f8f5c2-fdb9-4905-8173-c6c709d8565f-metrics-certs\") pod \"speaker-t5hhv\" (UID: \"b4f8f5c2-fdb9-4905-8173-c6c709d8565f\") " pod="metallb-system/speaker-t5hhv" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.411703 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/b4f8f5c2-fdb9-4905-8173-c6c709d8565f-metallb-excludel2\") pod \"speaker-t5hhv\" (UID: \"b4f8f5c2-fdb9-4905-8173-c6c709d8565f\") " pod="metallb-system/speaker-t5hhv" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.411797 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjlkk\" (UniqueName: \"kubernetes.io/projected/b4f8f5c2-fdb9-4905-8173-c6c709d8565f-kube-api-access-xjlkk\") pod \"speaker-t5hhv\" (UID: \"b4f8f5c2-fdb9-4905-8173-c6c709d8565f\") " pod="metallb-system/speaker-t5hhv" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.493904 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-7t4l5" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.513070 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b4f8f5c2-fdb9-4905-8173-c6c709d8565f-memberlist\") pod \"speaker-t5hhv\" (UID: \"b4f8f5c2-fdb9-4905-8173-c6c709d8565f\") " pod="metallb-system/speaker-t5hhv" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.513115 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/25aa504c-2ffc-462d-b138-c89c0f3083ce-cert\") pod \"controller-68d546b9d8-n7d8j\" (UID: \"25aa504c-2ffc-462d-b138-c89c0f3083ce\") " pod="metallb-system/controller-68d546b9d8-n7d8j" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.513174 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/25aa504c-2ffc-462d-b138-c89c0f3083ce-metrics-certs\") pod \"controller-68d546b9d8-n7d8j\" (UID: \"25aa504c-2ffc-462d-b138-c89c0f3083ce\") " pod="metallb-system/controller-68d546b9d8-n7d8j" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.513207 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-642v7\" (UniqueName: \"kubernetes.io/projected/25aa504c-2ffc-462d-b138-c89c0f3083ce-kube-api-access-642v7\") pod \"controller-68d546b9d8-n7d8j\" (UID: \"25aa504c-2ffc-462d-b138-c89c0f3083ce\") " pod="metallb-system/controller-68d546b9d8-n7d8j" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.513232 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4f8f5c2-fdb9-4905-8173-c6c709d8565f-metrics-certs\") pod \"speaker-t5hhv\" (UID: \"b4f8f5c2-fdb9-4905-8173-c6c709d8565f\") " pod="metallb-system/speaker-t5hhv" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.513268 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/b4f8f5c2-fdb9-4905-8173-c6c709d8565f-metallb-excludel2\") pod \"speaker-t5hhv\" (UID: \"b4f8f5c2-fdb9-4905-8173-c6c709d8565f\") " pod="metallb-system/speaker-t5hhv" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.513325 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjlkk\" (UniqueName: \"kubernetes.io/projected/b4f8f5c2-fdb9-4905-8173-c6c709d8565f-kube-api-access-xjlkk\") pod \"speaker-t5hhv\" (UID: \"b4f8f5c2-fdb9-4905-8173-c6c709d8565f\") " pod="metallb-system/speaker-t5hhv" Oct 03 13:08:02 crc kubenswrapper[4962]: E1003 13:08:02.513715 4962 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 03 13:08:02 crc kubenswrapper[4962]: E1003 13:08:02.513763 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4f8f5c2-fdb9-4905-8173-c6c709d8565f-memberlist podName:b4f8f5c2-fdb9-4905-8173-c6c709d8565f nodeName:}" failed. No retries permitted until 2025-10-03 13:08:03.013743581 +0000 UTC m=+1091.417641416 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/b4f8f5c2-fdb9-4905-8173-c6c709d8565f-memberlist") pod "speaker-t5hhv" (UID: "b4f8f5c2-fdb9-4905-8173-c6c709d8565f") : secret "metallb-memberlist" not found Oct 03 13:08:02 crc kubenswrapper[4962]: E1003 13:08:02.513972 4962 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Oct 03 13:08:02 crc kubenswrapper[4962]: E1003 13:08:02.514010 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25aa504c-2ffc-462d-b138-c89c0f3083ce-metrics-certs podName:25aa504c-2ffc-462d-b138-c89c0f3083ce nodeName:}" failed. No retries permitted until 2025-10-03 13:08:03.013999918 +0000 UTC m=+1091.417897753 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/25aa504c-2ffc-462d-b138-c89c0f3083ce-metrics-certs") pod "controller-68d546b9d8-n7d8j" (UID: "25aa504c-2ffc-462d-b138-c89c0f3083ce") : secret "controller-certs-secret" not found Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.514628 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/b4f8f5c2-fdb9-4905-8173-c6c709d8565f-metallb-excludel2\") pod \"speaker-t5hhv\" (UID: \"b4f8f5c2-fdb9-4905-8173-c6c709d8565f\") " pod="metallb-system/speaker-t5hhv" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.516816 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4f8f5c2-fdb9-4905-8173-c6c709d8565f-metrics-certs\") pod \"speaker-t5hhv\" (UID: \"b4f8f5c2-fdb9-4905-8173-c6c709d8565f\") " pod="metallb-system/speaker-t5hhv" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.518555 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.527753 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/25aa504c-2ffc-462d-b138-c89c0f3083ce-cert\") pod \"controller-68d546b9d8-n7d8j\" (UID: \"25aa504c-2ffc-462d-b138-c89c0f3083ce\") " pod="metallb-system/controller-68d546b9d8-n7d8j" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.535551 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjlkk\" (UniqueName: \"kubernetes.io/projected/b4f8f5c2-fdb9-4905-8173-c6c709d8565f-kube-api-access-xjlkk\") pod \"speaker-t5hhv\" (UID: \"b4f8f5c2-fdb9-4905-8173-c6c709d8565f\") " pod="metallb-system/speaker-t5hhv" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.541221 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-642v7\" (UniqueName: \"kubernetes.io/projected/25aa504c-2ffc-462d-b138-c89c0f3083ce-kube-api-access-642v7\") pod \"controller-68d546b9d8-n7d8j\" (UID: \"25aa504c-2ffc-462d-b138-c89c0f3083ce\") " pod="metallb-system/controller-68d546b9d8-n7d8j" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.703545 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-7t4l5"] Oct 03 13:08:02 crc kubenswrapper[4962]: W1003 13:08:02.712784 4962 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa1a4478_7484_4891_9130_3454b87cb614.slice/crio-3c30ff798b0c3c9c47f741bd4e3ececebb2de02365d2862e8dc87d57e7c4998f WatchSource:0}: Error finding container 3c30ff798b0c3c9c47f741bd4e3ececebb2de02365d2862e8dc87d57e7c4998f: Status 404 returned error can't find the container with id 3c30ff798b0c3c9c47f741bd4e3ececebb2de02365d2862e8dc87d57e7c4998f Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.816310 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae7d2439-9f77-42b9-8b22-ca0980f9650d-metrics-certs\") pod \"frr-k8s-ftmjr\" (UID: \"ae7d2439-9f77-42b9-8b22-ca0980f9650d\") " pod="metallb-system/frr-k8s-ftmjr" Oct 03 13:08:02 crc kubenswrapper[4962]: I1003 13:08:02.820147 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae7d2439-9f77-42b9-8b22-ca0980f9650d-metrics-certs\") pod \"frr-k8s-ftmjr\" (UID: \"ae7d2439-9f77-42b9-8b22-ca0980f9650d\") " pod="metallb-system/frr-k8s-ftmjr" Oct 03 13:08:03 crc kubenswrapper[4962]: I1003 13:08:03.019141 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b4f8f5c2-fdb9-4905-8173-c6c709d8565f-memberlist\") pod \"speaker-t5hhv\" (UID: \"b4f8f5c2-fdb9-4905-8173-c6c709d8565f\") " pod="metallb-system/speaker-t5hhv" Oct 03 13:08:03 crc kubenswrapper[4962]: I1003 13:08:03.019201 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/25aa504c-2ffc-462d-b138-c89c0f3083ce-metrics-certs\") pod \"controller-68d546b9d8-n7d8j\" (UID: \"25aa504c-2ffc-462d-b138-c89c0f3083ce\") " pod="metallb-system/controller-68d546b9d8-n7d8j" Oct 03 13:08:03 crc kubenswrapper[4962]: E1003 13:08:03.019300 4962 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 03 13:08:03 crc kubenswrapper[4962]: E1003 13:08:03.019359 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4f8f5c2-fdb9-4905-8173-c6c709d8565f-memberlist podName:b4f8f5c2-fdb9-4905-8173-c6c709d8565f nodeName:}" failed. No retries permitted until 2025-10-03 13:08:04.019342565 +0000 UTC m=+1092.423240400 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/b4f8f5c2-fdb9-4905-8173-c6c709d8565f-memberlist") pod "speaker-t5hhv" (UID: "b4f8f5c2-fdb9-4905-8173-c6c709d8565f") : secret "metallb-memberlist" not found Oct 03 13:08:03 crc kubenswrapper[4962]: I1003 13:08:03.022945 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/25aa504c-2ffc-462d-b138-c89c0f3083ce-metrics-certs\") pod \"controller-68d546b9d8-n7d8j\" (UID: \"25aa504c-2ffc-462d-b138-c89c0f3083ce\") " pod="metallb-system/controller-68d546b9d8-n7d8j" Oct 03 13:08:03 crc kubenswrapper[4962]: I1003 13:08:03.024404 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-68d546b9d8-n7d8j" Oct 03 13:08:03 crc kubenswrapper[4962]: I1003 13:08:03.072454 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-ftmjr" Oct 03 13:08:03 crc kubenswrapper[4962]: I1003 13:08:03.198842 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-n7d8j"] Oct 03 13:08:03 crc kubenswrapper[4962]: W1003 13:08:03.201003 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25aa504c_2ffc_462d_b138_c89c0f3083ce.slice/crio-d017e47a7fc8cf3ce52071456cf14923a1c963654778e6d4ee8c388a96d343ae WatchSource:0}: Error finding container d017e47a7fc8cf3ce52071456cf14923a1c963654778e6d4ee8c388a96d343ae: Status 404 returned error can't find the container with id d017e47a7fc8cf3ce52071456cf14923a1c963654778e6d4ee8c388a96d343ae Oct 03 13:08:03 crc kubenswrapper[4962]: I1003 13:08:03.561285 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-n7d8j" event={"ID":"25aa504c-2ffc-462d-b138-c89c0f3083ce","Type":"ContainerStarted","Data":"a9f251098c6b63204dfd44b54cf2ef4683bbaa7245a5b4ff1bc0b474a61d7f2f"} Oct 03 13:08:03 crc kubenswrapper[4962]: I1003 13:08:03.561333 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-n7d8j" event={"ID":"25aa504c-2ffc-462d-b138-c89c0f3083ce","Type":"ContainerStarted","Data":"d017e47a7fc8cf3ce52071456cf14923a1c963654778e6d4ee8c388a96d343ae"} Oct 03 13:08:03 crc kubenswrapper[4962]: I1003 13:08:03.562329 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ftmjr" event={"ID":"ae7d2439-9f77-42b9-8b22-ca0980f9650d","Type":"ContainerStarted","Data":"87e8193c0eb058663a86ad71ebceff0990ac242b5898f2616207b131628a6f5d"} Oct 03 13:08:03 crc kubenswrapper[4962]: I1003 13:08:03.563535 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-7t4l5" event={"ID":"aa1a4478-7484-4891-9130-3454b87cb614","Type":"ContainerStarted","Data":"3c30ff798b0c3c9c47f741bd4e3ececebb2de02365d2862e8dc87d57e7c4998f"} Oct 03 13:08:04 crc kubenswrapper[4962]: I1003 13:08:04.031873 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b4f8f5c2-fdb9-4905-8173-c6c709d8565f-memberlist\") pod \"speaker-t5hhv\" (UID: \"b4f8f5c2-fdb9-4905-8173-c6c709d8565f\") " pod="metallb-system/speaker-t5hhv" Oct 03 13:08:04 crc kubenswrapper[4962]: I1003 13:08:04.040161 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b4f8f5c2-fdb9-4905-8173-c6c709d8565f-memberlist\") pod \"speaker-t5hhv\" (UID: \"b4f8f5c2-fdb9-4905-8173-c6c709d8565f\") " pod="metallb-system/speaker-t5hhv" Oct 03 13:08:04 crc kubenswrapper[4962]: I1003 13:08:04.218463 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-t5hhv" Oct 03 13:08:04 crc kubenswrapper[4962]: I1003 13:08:04.572683 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-t5hhv" event={"ID":"b4f8f5c2-fdb9-4905-8173-c6c709d8565f","Type":"ContainerStarted","Data":"dd842dbcf9d74e54c5f4debd534cde327670a5cf493ae4fd1bd9e3e7238566b3"} Oct 03 13:08:04 crc kubenswrapper[4962]: I1003 13:08:04.572753 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-t5hhv" event={"ID":"b4f8f5c2-fdb9-4905-8173-c6c709d8565f","Type":"ContainerStarted","Data":"1755c35ad60ae8e5fc9ea7499b719ee57c76198570831c6c1c6e7468e7f5283f"} Oct 03 13:08:04 crc kubenswrapper[4962]: I1003 13:08:04.575135 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-n7d8j" event={"ID":"25aa504c-2ffc-462d-b138-c89c0f3083ce","Type":"ContainerStarted","Data":"ba3e1d749cbe2be9971fc496d3b8db9493e523954cd492b9c00ddd3d2a34aede"} Oct 03 13:08:04 crc kubenswrapper[4962]: I1003 13:08:04.575230 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-n7d8j" Oct 03 13:08:04 crc kubenswrapper[4962]: I1003 13:08:04.594321 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-n7d8j" podStartSLOduration=2.594306261 podStartE2EDuration="2.594306261s" podCreationTimestamp="2025-10-03 13:08:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:08:04.593204811 +0000 UTC m=+1092.997102646" watchObservedRunningTime="2025-10-03 13:08:04.594306261 +0000 UTC m=+1092.998204096" Oct 03 13:08:05 crc kubenswrapper[4962]: I1003 13:08:05.582785 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-t5hhv" event={"ID":"b4f8f5c2-fdb9-4905-8173-c6c709d8565f","Type":"ContainerStarted","Data":"76404bdb854ac091fbe5c2fb6b07a97a4363cb192a1edf4c1f3cbc7d684825d8"} Oct 03 13:08:05 crc kubenswrapper[4962]: I1003 13:08:05.600579 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-t5hhv" podStartSLOduration=3.600561845 podStartE2EDuration="3.600561845s" podCreationTimestamp="2025-10-03 13:08:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:08:05.59997731 +0000 UTC m=+1094.003875155" watchObservedRunningTime="2025-10-03 13:08:05.600561845 +0000 UTC m=+1094.004459680" Oct 03 13:08:06 crc kubenswrapper[4962]: I1003 13:08:06.589470 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-t5hhv" Oct 03 13:08:10 crc kubenswrapper[4962]: I1003 13:08:10.614297 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-7t4l5" event={"ID":"aa1a4478-7484-4891-9130-3454b87cb614","Type":"ContainerStarted","Data":"3e53b090432939bba1fcd68648357d5c02f499e3b6a01ca3b5ef69353e81311c"} Oct 03 13:08:10 crc kubenswrapper[4962]: I1003 13:08:10.614616 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-7t4l5" Oct 03 13:08:10 crc kubenswrapper[4962]: I1003 13:08:10.615595 4962 generic.go:334] "Generic (PLEG): container finished" podID="ae7d2439-9f77-42b9-8b22-ca0980f9650d" containerID="3bbc60e8f9700327b548e7f2a2b312c176619e6c0efe424363655e4de3d588ff" exitCode=0 
Oct 03 13:08:10 crc kubenswrapper[4962]: I1003 13:08:10.615621 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ftmjr" event={"ID":"ae7d2439-9f77-42b9-8b22-ca0980f9650d","Type":"ContainerDied","Data":"3bbc60e8f9700327b548e7f2a2b312c176619e6c0efe424363655e4de3d588ff"} Oct 03 13:08:10 crc kubenswrapper[4962]: I1003 13:08:10.628934 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-7t4l5" podStartSLOduration=1.839027848 podStartE2EDuration="8.628917975s" podCreationTimestamp="2025-10-03 13:08:02 +0000 UTC" firstStartedPulling="2025-10-03 13:08:02.714160514 +0000 UTC m=+1091.118058349" lastFinishedPulling="2025-10-03 13:08:09.504050641 +0000 UTC m=+1097.907948476" observedRunningTime="2025-10-03 13:08:10.625964816 +0000 UTC m=+1099.029862641" watchObservedRunningTime="2025-10-03 13:08:10.628917975 +0000 UTC m=+1099.032815810" Oct 03 13:08:12 crc kubenswrapper[4962]: I1003 13:08:12.627887 4962 generic.go:334] "Generic (PLEG): container finished" podID="ae7d2439-9f77-42b9-8b22-ca0980f9650d" containerID="5ff6cb9def13743ad328ed9fe4755595bcefe2154c803e7eaf34b2dd6dd018ef" exitCode=0 Oct 03 13:08:12 crc kubenswrapper[4962]: I1003 13:08:12.627922 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ftmjr" event={"ID":"ae7d2439-9f77-42b9-8b22-ca0980f9650d","Type":"ContainerDied","Data":"5ff6cb9def13743ad328ed9fe4755595bcefe2154c803e7eaf34b2dd6dd018ef"} Oct 03 13:08:13 crc kubenswrapper[4962]: I1003 13:08:13.030113 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-n7d8j" Oct 03 13:08:13 crc kubenswrapper[4962]: I1003 13:08:13.636299 4962 generic.go:334] "Generic (PLEG): container finished" podID="ae7d2439-9f77-42b9-8b22-ca0980f9650d" containerID="24d95598ec05f721860880bcb284bae064d67c590a8f7fd40a69eea8fc66bc24" exitCode=0 Oct 03 13:08:13 crc kubenswrapper[4962]: I1003 13:08:13.636340 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ftmjr" event={"ID":"ae7d2439-9f77-42b9-8b22-ca0980f9650d","Type":"ContainerDied","Data":"24d95598ec05f721860880bcb284bae064d67c590a8f7fd40a69eea8fc66bc24"} Oct 03 13:08:14 crc kubenswrapper[4962]: I1003 13:08:14.222174 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-t5hhv" Oct 03 13:08:14 crc kubenswrapper[4962]: I1003 13:08:14.653812 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ftmjr" event={"ID":"ae7d2439-9f77-42b9-8b22-ca0980f9650d","Type":"ContainerStarted","Data":"8a0fba3e37d8baabfe819262971ad13854964a23216d6ceccb9c86f0fee52f36"} Oct 03 13:08:14 crc kubenswrapper[4962]: I1003 13:08:14.654165 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ftmjr" event={"ID":"ae7d2439-9f77-42b9-8b22-ca0980f9650d","Type":"ContainerStarted","Data":"d871ec57b5b2207c4f55dbd489a3c1e0f17770e46ec642e15889ecb2b537bfed"} Oct 03 13:08:14 crc kubenswrapper[4962]: I1003 13:08:14.654185 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ftmjr" event={"ID":"ae7d2439-9f77-42b9-8b22-ca0980f9650d","Type":"ContainerStarted","Data":"12b9d2ad639cf43637806e911b24bc7c8c111bffda0115c458af3ea9263c6aff"} Oct 03 13:08:14 crc kubenswrapper[4962]: I1003 13:08:14.654197 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ftmjr" 
event={"ID":"ae7d2439-9f77-42b9-8b22-ca0980f9650d","Type":"ContainerStarted","Data":"ce25aa4f47f221788a78fbd7f9c5344bd102470224e91b15a23294b841d75b11"} Oct 03 13:08:14 crc kubenswrapper[4962]: I1003 13:08:14.654208 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ftmjr" event={"ID":"ae7d2439-9f77-42b9-8b22-ca0980f9650d","Type":"ContainerStarted","Data":"933dcd5defaf42365d3543f52f9646d8323733e481dfce8173c00746ad459f1a"} Oct 03 13:08:14 crc kubenswrapper[4962]: I1003 13:08:14.654219 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ftmjr" event={"ID":"ae7d2439-9f77-42b9-8b22-ca0980f9650d","Type":"ContainerStarted","Data":"2691cf0cdc49bbbedeb552539c813335744ce9f99ac091ff13196dc6037987ee"} Oct 03 13:08:14 crc kubenswrapper[4962]: I1003 13:08:14.654856 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-ftmjr" Oct 03 13:08:14 crc kubenswrapper[4962]: I1003 13:08:14.686809 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-ftmjr" podStartSLOduration=6.3399571120000004 podStartE2EDuration="12.686795885s" podCreationTimestamp="2025-10-03 13:08:02 +0000 UTC" firstStartedPulling="2025-10-03 13:08:03.184507487 +0000 UTC m=+1091.588405322" lastFinishedPulling="2025-10-03 13:08:09.53134626 +0000 UTC m=+1097.935244095" observedRunningTime="2025-10-03 13:08:14.685905101 +0000 UTC m=+1103.089802946" watchObservedRunningTime="2025-10-03 13:08:14.686795885 +0000 UTC m=+1103.090693720" Oct 03 13:08:15 crc kubenswrapper[4962]: I1003 13:08:15.679733 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69tq6zz"] Oct 03 13:08:15 crc kubenswrapper[4962]: I1003 13:08:15.682105 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69tq6zz" Oct 03 13:08:15 crc kubenswrapper[4962]: I1003 13:08:15.683937 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 03 13:08:15 crc kubenswrapper[4962]: I1003 13:08:15.695836 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69tq6zz"] Oct 03 13:08:15 crc kubenswrapper[4962]: I1003 13:08:15.705198 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1559ba35-102a-4157-8499-10dcbc2241d8-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69tq6zz\" (UID: \"1559ba35-102a-4157-8499-10dcbc2241d8\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69tq6zz" Oct 03 13:08:15 crc kubenswrapper[4962]: I1003 13:08:15.705278 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1559ba35-102a-4157-8499-10dcbc2241d8-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69tq6zz\" (UID: \"1559ba35-102a-4157-8499-10dcbc2241d8\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69tq6zz" Oct 03 13:08:15 crc kubenswrapper[4962]: I1003 13:08:15.705324 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khlq2\" (UniqueName: \"kubernetes.io/projected/1559ba35-102a-4157-8499-10dcbc2241d8-kube-api-access-khlq2\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69tq6zz\" (UID: \"1559ba35-102a-4157-8499-10dcbc2241d8\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69tq6zz" Oct 03 13:08:15 crc kubenswrapper[4962]: I1003 13:08:15.806221 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1559ba35-102a-4157-8499-10dcbc2241d8-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69tq6zz\" (UID: \"1559ba35-102a-4157-8499-10dcbc2241d8\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69tq6zz" Oct 03 13:08:15 crc kubenswrapper[4962]: I1003 13:08:15.806557 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1559ba35-102a-4157-8499-10dcbc2241d8-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69tq6zz\" (UID: \"1559ba35-102a-4157-8499-10dcbc2241d8\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69tq6zz" Oct 03 13:08:15 crc kubenswrapper[4962]: I1003 13:08:15.806690 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khlq2\" (UniqueName: \"kubernetes.io/projected/1559ba35-102a-4157-8499-10dcbc2241d8-kube-api-access-khlq2\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69tq6zz\" (UID: \"1559ba35-102a-4157-8499-10dcbc2241d8\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69tq6zz" Oct 03 13:08:15 crc kubenswrapper[4962]: I1003 13:08:15.807243 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/1559ba35-102a-4157-8499-10dcbc2241d8-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69tq6zz\" (UID: \"1559ba35-102a-4157-8499-10dcbc2241d8\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69tq6zz" Oct 03 13:08:15 crc kubenswrapper[4962]: I1003 13:08:15.807503 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1559ba35-102a-4157-8499-10dcbc2241d8-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69tq6zz\" (UID: \"1559ba35-102a-4157-8499-10dcbc2241d8\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69tq6zz" Oct 03 13:08:15 crc kubenswrapper[4962]: I1003 13:08:15.835973 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khlq2\" (UniqueName: \"kubernetes.io/projected/1559ba35-102a-4157-8499-10dcbc2241d8-kube-api-access-khlq2\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69tq6zz\" (UID: \"1559ba35-102a-4157-8499-10dcbc2241d8\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69tq6zz" Oct 03 13:08:16 crc kubenswrapper[4962]: I1003 13:08:16.037316 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69tq6zz" Oct 03 13:08:16 crc kubenswrapper[4962]: I1003 13:08:16.425256 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69tq6zz"] Oct 03 13:08:16 crc kubenswrapper[4962]: I1003 13:08:16.666958 4962 generic.go:334] "Generic (PLEG): container finished" podID="1559ba35-102a-4157-8499-10dcbc2241d8" containerID="08762d7890d538d9516ccc800bfc599067947d1dc7bb09d0d3c4ea501987ded1" exitCode=0 Oct 03 13:08:16 crc kubenswrapper[4962]: I1003 13:08:16.667011 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69tq6zz" event={"ID":"1559ba35-102a-4157-8499-10dcbc2241d8","Type":"ContainerDied","Data":"08762d7890d538d9516ccc800bfc599067947d1dc7bb09d0d3c4ea501987ded1"} Oct 03 13:08:16 crc kubenswrapper[4962]: I1003 13:08:16.667077 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69tq6zz" event={"ID":"1559ba35-102a-4157-8499-10dcbc2241d8","Type":"ContainerStarted","Data":"885d859817dc8f506e59fb324d0eb018225b3de5603fcd1b707e33124c169d8a"} Oct 03 13:08:18 crc kubenswrapper[4962]: I1003 13:08:18.073656 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-ftmjr" Oct 03 13:08:18 crc kubenswrapper[4962]: I1003 13:08:18.113758 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-ftmjr" Oct 03 13:08:20 crc kubenswrapper[4962]: I1003 13:08:20.706752 4962 generic.go:334] "Generic (PLEG): container finished" podID="1559ba35-102a-4157-8499-10dcbc2241d8" containerID="15a9374bb510a31202d780f127449347920b6a85bed9756a63c6a9560702b70b" exitCode=0 Oct 03 13:08:20 crc kubenswrapper[4962]: I1003 13:08:20.706856 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69tq6zz" 
event={"ID":"1559ba35-102a-4157-8499-10dcbc2241d8","Type":"ContainerDied","Data":"15a9374bb510a31202d780f127449347920b6a85bed9756a63c6a9560702b70b"} Oct 03 13:08:21 crc kubenswrapper[4962]: I1003 13:08:21.713479 4962 generic.go:334] "Generic (PLEG): container finished" podID="1559ba35-102a-4157-8499-10dcbc2241d8" containerID="60f9f42bdfd58759424e0dab6ff43a91e93768e302cfc7ffb56daa9e8977eb78" exitCode=0 Oct 03 13:08:21 crc kubenswrapper[4962]: I1003 13:08:21.713617 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69tq6zz" event={"ID":"1559ba35-102a-4157-8499-10dcbc2241d8","Type":"ContainerDied","Data":"60f9f42bdfd58759424e0dab6ff43a91e93768e302cfc7ffb56daa9e8977eb78"} Oct 03 13:08:22 crc kubenswrapper[4962]: I1003 13:08:22.500929 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-7t4l5" Oct 03 13:08:22 crc kubenswrapper[4962]: I1003 13:08:22.953992 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69tq6zz" Oct 03 13:08:22 crc kubenswrapper[4962]: I1003 13:08:22.992699 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1559ba35-102a-4157-8499-10dcbc2241d8-util\") pod \"1559ba35-102a-4157-8499-10dcbc2241d8\" (UID: \"1559ba35-102a-4157-8499-10dcbc2241d8\") " Oct 03 13:08:22 crc kubenswrapper[4962]: I1003 13:08:22.992840 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1559ba35-102a-4157-8499-10dcbc2241d8-bundle\") pod \"1559ba35-102a-4157-8499-10dcbc2241d8\" (UID: \"1559ba35-102a-4157-8499-10dcbc2241d8\") " Oct 03 13:08:22 crc kubenswrapper[4962]: I1003 13:08:22.992881 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khlq2\" (UniqueName: \"kubernetes.io/projected/1559ba35-102a-4157-8499-10dcbc2241d8-kube-api-access-khlq2\") pod \"1559ba35-102a-4157-8499-10dcbc2241d8\" (UID: \"1559ba35-102a-4157-8499-10dcbc2241d8\") " Oct 03 13:08:22 crc kubenswrapper[4962]: I1003 13:08:22.994880 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1559ba35-102a-4157-8499-10dcbc2241d8-bundle" (OuterVolumeSpecName: "bundle") pod "1559ba35-102a-4157-8499-10dcbc2241d8" (UID: "1559ba35-102a-4157-8499-10dcbc2241d8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:08:22 crc kubenswrapper[4962]: I1003 13:08:22.998011 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1559ba35-102a-4157-8499-10dcbc2241d8-kube-api-access-khlq2" (OuterVolumeSpecName: "kube-api-access-khlq2") pod "1559ba35-102a-4157-8499-10dcbc2241d8" (UID: "1559ba35-102a-4157-8499-10dcbc2241d8"). InnerVolumeSpecName "kube-api-access-khlq2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:08:23 crc kubenswrapper[4962]: I1003 13:08:23.002496 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1559ba35-102a-4157-8499-10dcbc2241d8-util" (OuterVolumeSpecName: "util") pod "1559ba35-102a-4157-8499-10dcbc2241d8" (UID: "1559ba35-102a-4157-8499-10dcbc2241d8"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:08:23 crc kubenswrapper[4962]: I1003 13:08:23.075170 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-ftmjr" Oct 03 13:08:23 crc kubenswrapper[4962]: I1003 13:08:23.097288 4962 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1559ba35-102a-4157-8499-10dcbc2241d8-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:08:23 crc kubenswrapper[4962]: I1003 13:08:23.097532 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khlq2\" (UniqueName: \"kubernetes.io/projected/1559ba35-102a-4157-8499-10dcbc2241d8-kube-api-access-khlq2\") on node \"crc\" DevicePath \"\"" Oct 03 13:08:23 crc kubenswrapper[4962]: I1003 13:08:23.097598 4962 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1559ba35-102a-4157-8499-10dcbc2241d8-util\") on node \"crc\" DevicePath \"\"" Oct 03 13:08:23 crc kubenswrapper[4962]: I1003 13:08:23.726573 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69tq6zz" event={"ID":"1559ba35-102a-4157-8499-10dcbc2241d8","Type":"ContainerDied","Data":"885d859817dc8f506e59fb324d0eb018225b3de5603fcd1b707e33124c169d8a"} Oct 03 13:08:23 crc kubenswrapper[4962]: I1003 13:08:23.726669 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="885d859817dc8f506e59fb324d0eb018225b3de5603fcd1b707e33124c169d8a" Oct 03 13:08:23 crc kubenswrapper[4962]: I1003 13:08:23.726965 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69tq6zz" Oct 03 13:08:24 crc kubenswrapper[4962]: I1003 13:08:24.660155 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 13:08:24 crc kubenswrapper[4962]: I1003 13:08:24.660206 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 13:08:24 crc kubenswrapper[4962]: I1003 13:08:24.660253 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-46vck" Oct 03 13:08:24 crc kubenswrapper[4962]: I1003 13:08:24.660910 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a8662442e8f36173a3b3425f41847fc665cbcd80d634980f74f9a3c41a264cea"} pod="openshift-machine-config-operator/machine-config-daemon-46vck" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 13:08:24 crc kubenswrapper[4962]: I1003 13:08:24.660968 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" 
containerID="cri-o://a8662442e8f36173a3b3425f41847fc665cbcd80d634980f74f9a3c41a264cea" gracePeriod=600 Oct 03 13:08:25 crc kubenswrapper[4962]: I1003 13:08:25.745454 4962 generic.go:334] "Generic (PLEG): container finished" podID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerID="a8662442e8f36173a3b3425f41847fc665cbcd80d634980f74f9a3c41a264cea" exitCode=0 Oct 03 13:08:25 crc kubenswrapper[4962]: I1003 13:08:25.745518 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerDied","Data":"a8662442e8f36173a3b3425f41847fc665cbcd80d634980f74f9a3c41a264cea"} Oct 03 13:08:25 crc kubenswrapper[4962]: I1003 13:08:25.746097 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerStarted","Data":"ca8ebb170cb5bf8155325e7ea7c1aa3487bc412d9472e208bb48495e13806d06"} Oct 03 13:08:25 crc kubenswrapper[4962]: I1003 13:08:25.746118 4962 scope.go:117] "RemoveContainer" containerID="076795ece51777028d4903d02e01c23aad08fd0c510374d97a2d753df68d0eea" Oct 03 13:08:34 crc kubenswrapper[4962]: I1003 13:08:34.186646 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2bnl4"] Oct 03 13:08:34 crc kubenswrapper[4962]: E1003 13:08:34.187831 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1559ba35-102a-4157-8499-10dcbc2241d8" containerName="extract" Oct 03 13:08:34 crc kubenswrapper[4962]: I1003 13:08:34.187849 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="1559ba35-102a-4157-8499-10dcbc2241d8" containerName="extract" Oct 03 13:08:34 crc kubenswrapper[4962]: E1003 13:08:34.187865 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1559ba35-102a-4157-8499-10dcbc2241d8" containerName="pull" Oct 03 13:08:34 crc kubenswrapper[4962]: I1003 13:08:34.187873 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="1559ba35-102a-4157-8499-10dcbc2241d8" containerName="pull" Oct 03 13:08:34 crc kubenswrapper[4962]: E1003 13:08:34.187888 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1559ba35-102a-4157-8499-10dcbc2241d8" containerName="util" Oct 03 13:08:34 crc kubenswrapper[4962]: I1003 13:08:34.187896 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="1559ba35-102a-4157-8499-10dcbc2241d8" containerName="util" Oct 03 13:08:34 crc kubenswrapper[4962]: I1003 13:08:34.188016 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="1559ba35-102a-4157-8499-10dcbc2241d8" containerName="extract" Oct 03 13:08:34 crc kubenswrapper[4962]: I1003 13:08:34.188584 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2bnl4" Oct 03 13:08:34 crc kubenswrapper[4962]: I1003 13:08:34.190717 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Oct 03 13:08:34 crc kubenswrapper[4962]: I1003 13:08:34.190777 4962 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-bksdj" Oct 03 13:08:34 crc kubenswrapper[4962]: I1003 13:08:34.190844 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Oct 03 13:08:34 crc kubenswrapper[4962]: I1003 13:08:34.204813 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2bnl4"] Oct 03 13:08:34 crc kubenswrapper[4962]: I1003 13:08:34.231682 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7h2t\" (UniqueName: \"kubernetes.io/projected/1588dc5b-09fc-4311-b5c9-2397f0f1ffd7-kube-api-access-l7h2t\") pod \"cert-manager-operator-controller-manager-57cd46d6d-2bnl4\" (UID: \"1588dc5b-09fc-4311-b5c9-2397f0f1ffd7\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2bnl4" Oct 03 13:08:34 crc kubenswrapper[4962]: I1003 13:08:34.333161 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7h2t\" (UniqueName: \"kubernetes.io/projected/1588dc5b-09fc-4311-b5c9-2397f0f1ffd7-kube-api-access-l7h2t\") pod \"cert-manager-operator-controller-manager-57cd46d6d-2bnl4\" (UID: \"1588dc5b-09fc-4311-b5c9-2397f0f1ffd7\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2bnl4" Oct 03 13:08:34 crc kubenswrapper[4962]: I1003 13:08:34.354916 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7h2t\" (UniqueName: \"kubernetes.io/projected/1588dc5b-09fc-4311-b5c9-2397f0f1ffd7-kube-api-access-l7h2t\") pod \"cert-manager-operator-controller-manager-57cd46d6d-2bnl4\" (UID: \"1588dc5b-09fc-4311-b5c9-2397f0f1ffd7\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2bnl4" Oct 03 13:08:34 crc kubenswrapper[4962]: I1003 13:08:34.510820 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2bnl4" Oct 03 13:08:34 crc kubenswrapper[4962]: I1003 13:08:34.913013 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2bnl4"] Oct 03 13:08:35 crc kubenswrapper[4962]: I1003 13:08:35.806402 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2bnl4" event={"ID":"1588dc5b-09fc-4311-b5c9-2397f0f1ffd7","Type":"ContainerStarted","Data":"9d43cb66ca335b9bfdf498fd036292b7df83753f38d85d16636facdc1d631f0f"} Oct 03 13:08:40 crc kubenswrapper[4962]: I1003 13:08:40.835036 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2bnl4" event={"ID":"1588dc5b-09fc-4311-b5c9-2397f0f1ffd7","Type":"ContainerStarted","Data":"aebc807b35a88048a284403f0dc8ab61aa96ec59dbd145b5a022d6e768407721"} Oct 03 13:08:44 crc kubenswrapper[4962]: I1003 13:08:44.797730 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2bnl4" podStartSLOduration=5.153795663 podStartE2EDuration="10.797712441s" podCreationTimestamp="2025-10-03 13:08:34 +0000 UTC" firstStartedPulling="2025-10-03 13:08:34.927233922 +0000 UTC m=+1123.331131757" lastFinishedPulling="2025-10-03 13:08:40.57115071 +0000 UTC m=+1128.975048535" observedRunningTime="2025-10-03 13:08:40.852842727 +0000 UTC m=+1129.256740582" watchObservedRunningTime="2025-10-03 13:08:44.797712441 +0000 UTC m=+1133.201610286" Oct 03 13:08:44 crc kubenswrapper[4962]: I1003 13:08:44.798577 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-sfwbb"] Oct 03 13:08:44 crc kubenswrapper[4962]: I1003 13:08:44.799238 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-d969966f-sfwbb" Oct 03 13:08:44 crc kubenswrapper[4962]: I1003 13:08:44.802627 4962 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-978gw" Oct 03 13:08:44 crc kubenswrapper[4962]: I1003 13:08:44.803263 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 03 13:08:44 crc kubenswrapper[4962]: I1003 13:08:44.803730 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 03 13:08:44 crc kubenswrapper[4962]: I1003 13:08:44.818855 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-sfwbb"] Oct 03 13:08:44 crc kubenswrapper[4962]: I1003 13:08:44.966705 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3429db32-6f0c-4b9c-8c3f-fd257b45d362-bound-sa-token\") pod \"cert-manager-webhook-d969966f-sfwbb\" (UID: \"3429db32-6f0c-4b9c-8c3f-fd257b45d362\") " pod="cert-manager/cert-manager-webhook-d969966f-sfwbb" Oct 03 13:08:44 crc kubenswrapper[4962]: I1003 13:08:44.966749 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmg6n\" (UniqueName: \"kubernetes.io/projected/3429db32-6f0c-4b9c-8c3f-fd257b45d362-kube-api-access-hmg6n\") pod \"cert-manager-webhook-d969966f-sfwbb\" (UID: \"3429db32-6f0c-4b9c-8c3f-fd257b45d362\") " pod="cert-manager/cert-manager-webhook-d969966f-sfwbb" Oct 03 13:08:45 crc kubenswrapper[4962]: I1003 13:08:45.068225 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3429db32-6f0c-4b9c-8c3f-fd257b45d362-bound-sa-token\") pod \"cert-manager-webhook-d969966f-sfwbb\" (UID: \"3429db32-6f0c-4b9c-8c3f-fd257b45d362\") " pod="cert-manager/cert-manager-webhook-d969966f-sfwbb" Oct 03 13:08:45 crc kubenswrapper[4962]: I1003 13:08:45.068282 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmg6n\" (UniqueName: \"kubernetes.io/projected/3429db32-6f0c-4b9c-8c3f-fd257b45d362-kube-api-access-hmg6n\") pod \"cert-manager-webhook-d969966f-sfwbb\" (UID: \"3429db32-6f0c-4b9c-8c3f-fd257b45d362\") " pod="cert-manager/cert-manager-webhook-d969966f-sfwbb" Oct 03 13:08:45 crc kubenswrapper[4962]: I1003 13:08:45.088455 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3429db32-6f0c-4b9c-8c3f-fd257b45d362-bound-sa-token\") pod \"cert-manager-webhook-d969966f-sfwbb\" (UID: \"3429db32-6f0c-4b9c-8c3f-fd257b45d362\") " pod="cert-manager/cert-manager-webhook-d969966f-sfwbb" Oct 03 13:08:45 crc kubenswrapper[4962]: I1003 13:08:45.088704 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmg6n\" (UniqueName: \"kubernetes.io/projected/3429db32-6f0c-4b9c-8c3f-fd257b45d362-kube-api-access-hmg6n\") pod \"cert-manager-webhook-d969966f-sfwbb\" (UID: \"3429db32-6f0c-4b9c-8c3f-fd257b45d362\") " pod="cert-manager/cert-manager-webhook-d969966f-sfwbb" Oct 03 13:08:45 crc kubenswrapper[4962]: I1003 13:08:45.114326 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-d969966f-sfwbb" Oct 03 13:08:45 crc kubenswrapper[4962]: I1003 13:08:45.350698 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-sfwbb"] Oct 03 13:08:45 crc kubenswrapper[4962]: W1003 13:08:45.360904 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3429db32_6f0c_4b9c_8c3f_fd257b45d362.slice/crio-f4cbf952b566de5d17d304277d0feed87818bee3e58851bc221e878841292501 WatchSource:0}: Error finding container f4cbf952b566de5d17d304277d0feed87818bee3e58851bc221e878841292501: Status 404 returned error can't find the container with id f4cbf952b566de5d17d304277d0feed87818bee3e58851bc221e878841292501 Oct 03 13:08:45 crc kubenswrapper[4962]: I1003 13:08:45.860900 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-d969966f-sfwbb" event={"ID":"3429db32-6f0c-4b9c-8c3f-fd257b45d362","Type":"ContainerStarted","Data":"f4cbf952b566de5d17d304277d0feed87818bee3e58851bc221e878841292501"} Oct 03 13:08:46 crc kubenswrapper[4962]: I1003 13:08:46.406517 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-q25jk"] Oct 03 13:08:46 crc kubenswrapper[4962]: I1003 13:08:46.407399 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-q25jk" Oct 03 13:08:46 crc kubenswrapper[4962]: I1003 13:08:46.410672 4962 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-ggzk6" Oct 03 13:08:46 crc kubenswrapper[4962]: I1003 13:08:46.448184 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-q25jk"] Oct 03 13:08:46 crc kubenswrapper[4962]: I1003 13:08:46.586986 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngb9q\" (UniqueName: \"kubernetes.io/projected/fcf33643-9678-4f5e-958a-a1df723d0497-kube-api-access-ngb9q\") pod \"cert-manager-cainjector-7d9f95dbf-q25jk\" (UID: \"fcf33643-9678-4f5e-958a-a1df723d0497\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-q25jk" Oct 03 13:08:46 crc kubenswrapper[4962]: I1003 13:08:46.587030 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fcf33643-9678-4f5e-958a-a1df723d0497-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-q25jk\" (UID: \"fcf33643-9678-4f5e-958a-a1df723d0497\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-q25jk" Oct 03 13:08:46 crc kubenswrapper[4962]: I1003 13:08:46.688417 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngb9q\" (UniqueName: \"kubernetes.io/projected/fcf33643-9678-4f5e-958a-a1df723d0497-kube-api-access-ngb9q\") pod \"cert-manager-cainjector-7d9f95dbf-q25jk\" (UID: \"fcf33643-9678-4f5e-958a-a1df723d0497\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-q25jk" Oct 03 13:08:46 crc kubenswrapper[4962]: I1003 13:08:46.688479 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fcf33643-9678-4f5e-958a-a1df723d0497-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-q25jk\" (UID: \"fcf33643-9678-4f5e-958a-a1df723d0497\") " 
pod="cert-manager/cert-manager-cainjector-7d9f95dbf-q25jk" Oct 03 13:08:46 crc kubenswrapper[4962]: I1003 13:08:46.708807 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngb9q\" (UniqueName: \"kubernetes.io/projected/fcf33643-9678-4f5e-958a-a1df723d0497-kube-api-access-ngb9q\") pod \"cert-manager-cainjector-7d9f95dbf-q25jk\" (UID: \"fcf33643-9678-4f5e-958a-a1df723d0497\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-q25jk" Oct 03 13:08:46 crc kubenswrapper[4962]: I1003 13:08:46.719491 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fcf33643-9678-4f5e-958a-a1df723d0497-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-q25jk\" (UID: \"fcf33643-9678-4f5e-958a-a1df723d0497\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-q25jk" Oct 03 13:08:46 crc kubenswrapper[4962]: I1003 13:08:46.723454 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-q25jk" Oct 03 13:08:47 crc kubenswrapper[4962]: I1003 13:08:47.198967 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-q25jk"] Oct 03 13:08:47 crc kubenswrapper[4962]: I1003 13:08:47.887198 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-q25jk" event={"ID":"fcf33643-9678-4f5e-958a-a1df723d0497","Type":"ContainerStarted","Data":"fcc4d377976d6d019a0651a076b1010d7caaf1367df9f586fa2ee9c394a88dc7"} Oct 03 13:08:49 crc kubenswrapper[4962]: I1003 13:08:49.899714 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-q25jk" event={"ID":"fcf33643-9678-4f5e-958a-a1df723d0497","Type":"ContainerStarted","Data":"edcf3ac9f95baec0574cf6c9aae276a6a43cd8b2692cd3691746d43ee6b220ca"} Oct 03 13:08:49 crc kubenswrapper[4962]: I1003 13:08:49.901178 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-d969966f-sfwbb" event={"ID":"3429db32-6f0c-4b9c-8c3f-fd257b45d362","Type":"ContainerStarted","Data":"31d56e40725137f011bb95d80daf1d7cdbc17c04845b32ae6caab5606beaad2f"} Oct 03 13:08:49 crc kubenswrapper[4962]: I1003 13:08:49.901291 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-d969966f-sfwbb" Oct 03 13:08:49 crc kubenswrapper[4962]: I1003 13:08:49.918030 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-q25jk" podStartSLOduration=1.7680770030000001 podStartE2EDuration="3.918009717s" podCreationTimestamp="2025-10-03 13:08:46 +0000 UTC" firstStartedPulling="2025-10-03 13:08:47.210770732 +0000 UTC m=+1135.614668567" lastFinishedPulling="2025-10-03 13:08:49.360703446 +0000 UTC m=+1137.764601281" observedRunningTime="2025-10-03 13:08:49.913557406 +0000 UTC m=+1138.317455241" watchObservedRunningTime="2025-10-03 13:08:49.918009717 +0000 UTC m=+1138.321907552" Oct 03 13:08:49 crc kubenswrapper[4962]: I1003 13:08:49.938296 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-d969966f-sfwbb" podStartSLOduration=1.919648721 podStartE2EDuration="5.938261543s" podCreationTimestamp="2025-10-03 13:08:44 +0000 UTC" firstStartedPulling="2025-10-03 13:08:45.363432039 +0000 UTC m=+1133.767329874" lastFinishedPulling="2025-10-03 13:08:49.382044861 +0000 UTC m=+1137.785942696" 
observedRunningTime="2025-10-03 13:08:49.932538628 +0000 UTC m=+1138.336436463" watchObservedRunningTime="2025-10-03 13:08:49.938261543 +0000 UTC m=+1138.342159388" Oct 03 13:08:55 crc kubenswrapper[4962]: I1003 13:08:55.117586 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-d969966f-sfwbb" Oct 03 13:09:03 crc kubenswrapper[4962]: I1003 13:09:03.563175 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-pkf2x"] Oct 03 13:09:03 crc kubenswrapper[4962]: I1003 13:09:03.565155 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-7d4cc89fcb-pkf2x" Oct 03 13:09:03 crc kubenswrapper[4962]: I1003 13:09:03.566943 4962 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-nwb7j" Oct 03 13:09:03 crc kubenswrapper[4962]: I1003 13:09:03.571252 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-pkf2x"] Oct 03 13:09:03 crc kubenswrapper[4962]: I1003 13:09:03.715739 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/32c9743f-647d-44e9-8aa4-5e88b24f6452-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-pkf2x\" (UID: \"32c9743f-647d-44e9-8aa4-5e88b24f6452\") " pod="cert-manager/cert-manager-7d4cc89fcb-pkf2x" Oct 03 13:09:03 crc kubenswrapper[4962]: I1003 13:09:03.716024 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7phjm\" (UniqueName: \"kubernetes.io/projected/32c9743f-647d-44e9-8aa4-5e88b24f6452-kube-api-access-7phjm\") pod \"cert-manager-7d4cc89fcb-pkf2x\" (UID: \"32c9743f-647d-44e9-8aa4-5e88b24f6452\") " pod="cert-manager/cert-manager-7d4cc89fcb-pkf2x" Oct 03 13:09:03 crc kubenswrapper[4962]: I1003 13:09:03.817179 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/32c9743f-647d-44e9-8aa4-5e88b24f6452-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-pkf2x\" (UID: \"32c9743f-647d-44e9-8aa4-5e88b24f6452\") " pod="cert-manager/cert-manager-7d4cc89fcb-pkf2x" Oct 03 13:09:03 crc kubenswrapper[4962]: I1003 13:09:03.817222 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7phjm\" (UniqueName: \"kubernetes.io/projected/32c9743f-647d-44e9-8aa4-5e88b24f6452-kube-api-access-7phjm\") pod \"cert-manager-7d4cc89fcb-pkf2x\" (UID: \"32c9743f-647d-44e9-8aa4-5e88b24f6452\") " pod="cert-manager/cert-manager-7d4cc89fcb-pkf2x" Oct 03 13:09:03 crc kubenswrapper[4962]: I1003 13:09:03.834589 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/32c9743f-647d-44e9-8aa4-5e88b24f6452-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-pkf2x\" (UID: \"32c9743f-647d-44e9-8aa4-5e88b24f6452\") " pod="cert-manager/cert-manager-7d4cc89fcb-pkf2x" Oct 03 13:09:03 crc kubenswrapper[4962]: I1003 13:09:03.835624 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7phjm\" (UniqueName: \"kubernetes.io/projected/32c9743f-647d-44e9-8aa4-5e88b24f6452-kube-api-access-7phjm\") pod \"cert-manager-7d4cc89fcb-pkf2x\" (UID: \"32c9743f-647d-44e9-8aa4-5e88b24f6452\") " pod="cert-manager/cert-manager-7d4cc89fcb-pkf2x" Oct 03 13:09:03 crc kubenswrapper[4962]: I1003 13:09:03.888429 4962 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-7d4cc89fcb-pkf2x" Oct 03 13:09:04 crc kubenswrapper[4962]: I1003 13:09:04.269244 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-pkf2x"] Oct 03 13:09:04 crc kubenswrapper[4962]: W1003 13:09:04.273721 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32c9743f_647d_44e9_8aa4_5e88b24f6452.slice/crio-22f306a713f8bbd2effbff86eef0714c73f0881500593e4418b2eb41bb5c21c6 WatchSource:0}: Error finding container 22f306a713f8bbd2effbff86eef0714c73f0881500593e4418b2eb41bb5c21c6: Status 404 returned error can't find the container with id 22f306a713f8bbd2effbff86eef0714c73f0881500593e4418b2eb41bb5c21c6 Oct 03 13:09:04 crc kubenswrapper[4962]: I1003 13:09:04.984500 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-7d4cc89fcb-pkf2x" event={"ID":"32c9743f-647d-44e9-8aa4-5e88b24f6452","Type":"ContainerStarted","Data":"22f306a713f8bbd2effbff86eef0714c73f0881500593e4418b2eb41bb5c21c6"} Oct 03 13:09:05 crc kubenswrapper[4962]: I1003 13:09:05.991977 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-7d4cc89fcb-pkf2x" event={"ID":"32c9743f-647d-44e9-8aa4-5e88b24f6452","Type":"ContainerStarted","Data":"ee504a9656d44d9ff13e3116e43e95d84992845eb5cd220430ee247965400deb"} Oct 03 13:09:06 crc kubenswrapper[4962]: I1003 13:09:06.013853 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-7d4cc89fcb-pkf2x" podStartSLOduration=3.013824053 podStartE2EDuration="3.013824053s" podCreationTimestamp="2025-10-03 13:09:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:09:06.006260419 +0000 UTC m=+1154.410158244" watchObservedRunningTime="2025-10-03 13:09:06.013824053 +0000 UTC m=+1154.417721938" Oct 03 13:09:09 crc kubenswrapper[4962]: I1003 13:09:09.264703 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-bsx6q"] Oct 03 13:09:09 crc kubenswrapper[4962]: I1003 13:09:09.265950 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-bsx6q" Oct 03 13:09:09 crc kubenswrapper[4962]: I1003 13:09:09.268205 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-9qgrw" Oct 03 13:09:09 crc kubenswrapper[4962]: I1003 13:09:09.268339 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 03 13:09:09 crc kubenswrapper[4962]: I1003 13:09:09.268495 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 03 13:09:09 crc kubenswrapper[4962]: I1003 13:09:09.280288 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-bsx6q"] Oct 03 13:09:09 crc kubenswrapper[4962]: I1003 13:09:09.387441 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7vff\" (UniqueName: \"kubernetes.io/projected/6916b4b7-2e8e-4d73-80dc-73c9226869b6-kube-api-access-x7vff\") pod \"openstack-operator-index-bsx6q\" (UID: \"6916b4b7-2e8e-4d73-80dc-73c9226869b6\") " pod="openstack-operators/openstack-operator-index-bsx6q" Oct 03 13:09:09 crc kubenswrapper[4962]: I1003 13:09:09.489101 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7vff\" (UniqueName: \"kubernetes.io/projected/6916b4b7-2e8e-4d73-80dc-73c9226869b6-kube-api-access-x7vff\") pod \"openstack-operator-index-bsx6q\" (UID: \"6916b4b7-2e8e-4d73-80dc-73c9226869b6\") " pod="openstack-operators/openstack-operator-index-bsx6q" Oct 03 13:09:09 crc kubenswrapper[4962]: I1003 13:09:09.507091 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7vff\" (UniqueName: \"kubernetes.io/projected/6916b4b7-2e8e-4d73-80dc-73c9226869b6-kube-api-access-x7vff\") pod \"openstack-operator-index-bsx6q\" (UID: \"6916b4b7-2e8e-4d73-80dc-73c9226869b6\") " pod="openstack-operators/openstack-operator-index-bsx6q" Oct 03 13:09:09 crc kubenswrapper[4962]: I1003 13:09:09.591232 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-bsx6q" Oct 03 13:09:10 crc kubenswrapper[4962]: I1003 13:09:10.015885 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-bsx6q"] Oct 03 13:09:11 crc kubenswrapper[4962]: I1003 13:09:11.022964 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-bsx6q" event={"ID":"6916b4b7-2e8e-4d73-80dc-73c9226869b6","Type":"ContainerStarted","Data":"f2eed8a97b6ac80dc31c1bc8b287323656027e76ca27c736a1e56c872cbeb4c7"} Oct 03 13:09:11 crc kubenswrapper[4962]: I1003 13:09:11.433597 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-bsx6q"] Oct 03 13:09:11 crc kubenswrapper[4962]: I1003 13:09:11.838094 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-mbh5p"] Oct 03 13:09:11 crc kubenswrapper[4962]: I1003 13:09:11.839088 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-mbh5p" Oct 03 13:09:11 crc kubenswrapper[4962]: I1003 13:09:11.848569 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-mbh5p"] Oct 03 13:09:11 crc kubenswrapper[4962]: I1003 13:09:11.922928 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8h57\" (UniqueName: \"kubernetes.io/projected/703db32b-fa73-48ac-834d-addaf46d6293-kube-api-access-p8h57\") pod \"openstack-operator-index-mbh5p\" (UID: \"703db32b-fa73-48ac-834d-addaf46d6293\") " pod="openstack-operators/openstack-operator-index-mbh5p" Oct 03 13:09:12 crc kubenswrapper[4962]: I1003 13:09:12.024722 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8h57\" (UniqueName: \"kubernetes.io/projected/703db32b-fa73-48ac-834d-addaf46d6293-kube-api-access-p8h57\") pod \"openstack-operator-index-mbh5p\" (UID: \"703db32b-fa73-48ac-834d-addaf46d6293\") " pod="openstack-operators/openstack-operator-index-mbh5p" Oct 03 13:09:12 crc kubenswrapper[4962]: I1003 13:09:12.057496 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8h57\" (UniqueName: \"kubernetes.io/projected/703db32b-fa73-48ac-834d-addaf46d6293-kube-api-access-p8h57\") pod \"openstack-operator-index-mbh5p\" (UID: \"703db32b-fa73-48ac-834d-addaf46d6293\") " pod="openstack-operators/openstack-operator-index-mbh5p" Oct 03 13:09:12 crc kubenswrapper[4962]: I1003 13:09:12.162027 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-mbh5p" Oct 03 13:09:12 crc kubenswrapper[4962]: I1003 13:09:12.606328 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-mbh5p"] Oct 03 13:09:12 crc kubenswrapper[4962]: W1003 13:09:12.621495 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod703db32b_fa73_48ac_834d_addaf46d6293.slice/crio-f5e6a48992f683cc4d27fc99b404b86a0f32a4e7e0de0a14bfd7faa432fa34d1 WatchSource:0}: Error finding container f5e6a48992f683cc4d27fc99b404b86a0f32a4e7e0de0a14bfd7faa432fa34d1: Status 404 returned error can't find the container with id f5e6a48992f683cc4d27fc99b404b86a0f32a4e7e0de0a14bfd7faa432fa34d1 Oct 03 13:09:13 crc kubenswrapper[4962]: I1003 13:09:13.035903 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mbh5p" event={"ID":"703db32b-fa73-48ac-834d-addaf46d6293","Type":"ContainerStarted","Data":"86edcf8d082b6934041a30f4a93b472bb6e306f9708f86203b903b826b093a25"} Oct 03 13:09:13 crc kubenswrapper[4962]: I1003 13:09:13.035943 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mbh5p" event={"ID":"703db32b-fa73-48ac-834d-addaf46d6293","Type":"ContainerStarted","Data":"f5e6a48992f683cc4d27fc99b404b86a0f32a4e7e0de0a14bfd7faa432fa34d1"} Oct 03 13:09:13 crc kubenswrapper[4962]: I1003 13:09:13.037394 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-bsx6q" event={"ID":"6916b4b7-2e8e-4d73-80dc-73c9226869b6","Type":"ContainerStarted","Data":"f169c0aabaa91dae53ebb256226ac9fd73f89b0464c63f9b8e58e79892e2a411"} Oct 03 13:09:13 crc kubenswrapper[4962]: I1003 13:09:13.037494 4962 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-operators/openstack-operator-index-bsx6q" podUID="6916b4b7-2e8e-4d73-80dc-73c9226869b6" containerName="registry-server" containerID="cri-o://f169c0aabaa91dae53ebb256226ac9fd73f89b0464c63f9b8e58e79892e2a411" gracePeriod=2 Oct 03 13:09:13 crc kubenswrapper[4962]: I1003 13:09:13.064155 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-mbh5p" podStartSLOduration=2.016064914 podStartE2EDuration="2.064136351s" podCreationTimestamp="2025-10-03 13:09:11 +0000 UTC" firstStartedPulling="2025-10-03 13:09:12.624922196 +0000 UTC m=+1161.028820031" lastFinishedPulling="2025-10-03 13:09:12.672993633 +0000 UTC m=+1161.076891468" observedRunningTime="2025-10-03 13:09:13.060184294 +0000 UTC m=+1161.464082149" watchObservedRunningTime="2025-10-03 13:09:13.064136351 +0000 UTC m=+1161.468034186" Oct 03 13:09:13 crc kubenswrapper[4962]: I1003 13:09:13.075906 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-bsx6q" podStartSLOduration=1.976071635 podStartE2EDuration="4.075884078s" podCreationTimestamp="2025-10-03 13:09:09 +0000 UTC" firstStartedPulling="2025-10-03 13:09:10.027837622 +0000 UTC m=+1158.431735457" lastFinishedPulling="2025-10-03 13:09:12.127650075 +0000 UTC m=+1160.531547900" observedRunningTime="2025-10-03 13:09:13.070912804 +0000 UTC m=+1161.474810639" watchObservedRunningTime="2025-10-03 13:09:13.075884078 +0000 UTC m=+1161.479781933" Oct 03 13:09:13 crc kubenswrapper[4962]: I1003 13:09:13.439795 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-bsx6q" Oct 03 13:09:13 crc kubenswrapper[4962]: I1003 13:09:13.551038 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7vff\" (UniqueName: \"kubernetes.io/projected/6916b4b7-2e8e-4d73-80dc-73c9226869b6-kube-api-access-x7vff\") pod \"6916b4b7-2e8e-4d73-80dc-73c9226869b6\" (UID: \"6916b4b7-2e8e-4d73-80dc-73c9226869b6\") " Oct 03 13:09:13 crc kubenswrapper[4962]: I1003 13:09:13.556596 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6916b4b7-2e8e-4d73-80dc-73c9226869b6-kube-api-access-x7vff" (OuterVolumeSpecName: "kube-api-access-x7vff") pod "6916b4b7-2e8e-4d73-80dc-73c9226869b6" (UID: "6916b4b7-2e8e-4d73-80dc-73c9226869b6"). InnerVolumeSpecName "kube-api-access-x7vff". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:09:13 crc kubenswrapper[4962]: I1003 13:09:13.652747 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7vff\" (UniqueName: \"kubernetes.io/projected/6916b4b7-2e8e-4d73-80dc-73c9226869b6-kube-api-access-x7vff\") on node \"crc\" DevicePath \"\"" Oct 03 13:09:14 crc kubenswrapper[4962]: I1003 13:09:14.044691 4962 generic.go:334] "Generic (PLEG): container finished" podID="6916b4b7-2e8e-4d73-80dc-73c9226869b6" containerID="f169c0aabaa91dae53ebb256226ac9fd73f89b0464c63f9b8e58e79892e2a411" exitCode=0 Oct 03 13:09:14 crc kubenswrapper[4962]: I1003 13:09:14.044753 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-bsx6q" event={"ID":"6916b4b7-2e8e-4d73-80dc-73c9226869b6","Type":"ContainerDied","Data":"f169c0aabaa91dae53ebb256226ac9fd73f89b0464c63f9b8e58e79892e2a411"} Oct 03 13:09:14 crc kubenswrapper[4962]: I1003 13:09:14.044815 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-bsx6q" event={"ID":"6916b4b7-2e8e-4d73-80dc-73c9226869b6","Type":"ContainerDied","Data":"f2eed8a97b6ac80dc31c1bc8b287323656027e76ca27c736a1e56c872cbeb4c7"} Oct 03 13:09:14 crc kubenswrapper[4962]: I1003 13:09:14.044835 4962 scope.go:117] "RemoveContainer" containerID="f169c0aabaa91dae53ebb256226ac9fd73f89b0464c63f9b8e58e79892e2a411" Oct 03 13:09:14 crc kubenswrapper[4962]: I1003 13:09:14.046685 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-bsx6q" Oct 03 13:09:14 crc kubenswrapper[4962]: I1003 13:09:14.067006 4962 scope.go:117] "RemoveContainer" containerID="f169c0aabaa91dae53ebb256226ac9fd73f89b0464c63f9b8e58e79892e2a411" Oct 03 13:09:14 crc kubenswrapper[4962]: E1003 13:09:14.067718 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f169c0aabaa91dae53ebb256226ac9fd73f89b0464c63f9b8e58e79892e2a411\": container with ID starting with f169c0aabaa91dae53ebb256226ac9fd73f89b0464c63f9b8e58e79892e2a411 not found: ID does not exist" containerID="f169c0aabaa91dae53ebb256226ac9fd73f89b0464c63f9b8e58e79892e2a411" Oct 03 13:09:14 crc kubenswrapper[4962]: I1003 13:09:14.067875 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f169c0aabaa91dae53ebb256226ac9fd73f89b0464c63f9b8e58e79892e2a411"} err="failed to get container status \"f169c0aabaa91dae53ebb256226ac9fd73f89b0464c63f9b8e58e79892e2a411\": rpc error: code = NotFound desc = could not find container \"f169c0aabaa91dae53ebb256226ac9fd73f89b0464c63f9b8e58e79892e2a411\": container with ID starting with f169c0aabaa91dae53ebb256226ac9fd73f89b0464c63f9b8e58e79892e2a411 not found: ID does not exist" Oct 03 13:09:14 crc kubenswrapper[4962]: I1003 13:09:14.084531 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-bsx6q"] Oct 03 13:09:14 crc kubenswrapper[4962]: I1003 13:09:14.087903 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-bsx6q"] Oct 03 13:09:14 crc kubenswrapper[4962]: I1003 13:09:14.234357 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6916b4b7-2e8e-4d73-80dc-73c9226869b6" path="/var/lib/kubelet/pods/6916b4b7-2e8e-4d73-80dc-73c9226869b6/volumes" Oct 03 13:09:22 crc kubenswrapper[4962]: I1003 13:09:22.165284 4962 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-mbh5p" Oct 03 13:09:22 crc kubenswrapper[4962]: I1003 13:09:22.166016 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-mbh5p" Oct 03 13:09:22 crc kubenswrapper[4962]: I1003 13:09:22.192261 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-mbh5p" Oct 03 13:09:23 crc kubenswrapper[4962]: I1003 13:09:23.113086 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-mbh5p" Oct 03 13:09:35 crc kubenswrapper[4962]: I1003 13:09:35.471936 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/6866606fa3a289e0b44cd13ac7038d6356f0a6aa62e0445808c76e969akjxg7"] Oct 03 13:09:35 crc kubenswrapper[4962]: E1003 13:09:35.472868 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6916b4b7-2e8e-4d73-80dc-73c9226869b6" containerName="registry-server" Oct 03 13:09:35 crc kubenswrapper[4962]: I1003 13:09:35.472884 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="6916b4b7-2e8e-4d73-80dc-73c9226869b6" containerName="registry-server" Oct 03 13:09:35 crc kubenswrapper[4962]: I1003 13:09:35.473001 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="6916b4b7-2e8e-4d73-80dc-73c9226869b6" containerName="registry-server" Oct 03 13:09:35 crc kubenswrapper[4962]: I1003 13:09:35.473827 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/6866606fa3a289e0b44cd13ac7038d6356f0a6aa62e0445808c76e969akjxg7" Oct 03 13:09:35 crc kubenswrapper[4962]: I1003 13:09:35.476824 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-g6tct" Oct 03 13:09:35 crc kubenswrapper[4962]: I1003 13:09:35.489353 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/6866606fa3a289e0b44cd13ac7038d6356f0a6aa62e0445808c76e969akjxg7"] Oct 03 13:09:35 crc kubenswrapper[4962]: I1003 13:09:35.635553 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb-bundle\") pod \"6866606fa3a289e0b44cd13ac7038d6356f0a6aa62e0445808c76e969akjxg7\" (UID: \"963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb\") " pod="openstack-operators/6866606fa3a289e0b44cd13ac7038d6356f0a6aa62e0445808c76e969akjxg7" Oct 03 13:09:35 crc kubenswrapper[4962]: I1003 13:09:35.635593 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb-util\") pod \"6866606fa3a289e0b44cd13ac7038d6356f0a6aa62e0445808c76e969akjxg7\" (UID: \"963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb\") " pod="openstack-operators/6866606fa3a289e0b44cd13ac7038d6356f0a6aa62e0445808c76e969akjxg7" Oct 03 13:09:35 crc kubenswrapper[4962]: I1003 13:09:35.635623 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xvsz\" (UniqueName: \"kubernetes.io/projected/963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb-kube-api-access-4xvsz\") pod \"6866606fa3a289e0b44cd13ac7038d6356f0a6aa62e0445808c76e969akjxg7\" (UID: \"963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb\") " pod="openstack-operators/6866606fa3a289e0b44cd13ac7038d6356f0a6aa62e0445808c76e969akjxg7" 
Oct 03 13:09:35 crc kubenswrapper[4962]: I1003 13:09:35.736390 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xvsz\" (UniqueName: \"kubernetes.io/projected/963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb-kube-api-access-4xvsz\") pod \"6866606fa3a289e0b44cd13ac7038d6356f0a6aa62e0445808c76e969akjxg7\" (UID: \"963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb\") " pod="openstack-operators/6866606fa3a289e0b44cd13ac7038d6356f0a6aa62e0445808c76e969akjxg7" Oct 03 13:09:35 crc kubenswrapper[4962]: I1003 13:09:35.736864 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb-bundle\") pod \"6866606fa3a289e0b44cd13ac7038d6356f0a6aa62e0445808c76e969akjxg7\" (UID: \"963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb\") " pod="openstack-operators/6866606fa3a289e0b44cd13ac7038d6356f0a6aa62e0445808c76e969akjxg7" Oct 03 13:09:35 crc kubenswrapper[4962]: I1003 13:09:35.737022 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb-util\") pod \"6866606fa3a289e0b44cd13ac7038d6356f0a6aa62e0445808c76e969akjxg7\" (UID: \"963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb\") " pod="openstack-operators/6866606fa3a289e0b44cd13ac7038d6356f0a6aa62e0445808c76e969akjxg7" Oct 03 13:09:35 crc kubenswrapper[4962]: I1003 13:09:35.737316 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb-bundle\") pod \"6866606fa3a289e0b44cd13ac7038d6356f0a6aa62e0445808c76e969akjxg7\" (UID: \"963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb\") " pod="openstack-operators/6866606fa3a289e0b44cd13ac7038d6356f0a6aa62e0445808c76e969akjxg7" Oct 03 13:09:35 crc kubenswrapper[4962]: I1003 13:09:35.737463 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb-util\") pod \"6866606fa3a289e0b44cd13ac7038d6356f0a6aa62e0445808c76e969akjxg7\" (UID: \"963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb\") " pod="openstack-operators/6866606fa3a289e0b44cd13ac7038d6356f0a6aa62e0445808c76e969akjxg7" Oct 03 13:09:35 crc kubenswrapper[4962]: I1003 13:09:35.757620 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xvsz\" (UniqueName: \"kubernetes.io/projected/963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb-kube-api-access-4xvsz\") pod \"6866606fa3a289e0b44cd13ac7038d6356f0a6aa62e0445808c76e969akjxg7\" (UID: \"963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb\") " pod="openstack-operators/6866606fa3a289e0b44cd13ac7038d6356f0a6aa62e0445808c76e969akjxg7" Oct 03 13:09:35 crc kubenswrapper[4962]: I1003 13:09:35.840161 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/6866606fa3a289e0b44cd13ac7038d6356f0a6aa62e0445808c76e969akjxg7" Oct 03 13:09:36 crc kubenswrapper[4962]: I1003 13:09:36.241686 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/6866606fa3a289e0b44cd13ac7038d6356f0a6aa62e0445808c76e969akjxg7"] Oct 03 13:09:37 crc kubenswrapper[4962]: I1003 13:09:37.181060 4962 generic.go:334] "Generic (PLEG): container finished" podID="963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb" containerID="bc7b219b35fee6b007a2647855899d91d5cda25b2d893df80cd9a4c6c36644b7" exitCode=0 Oct 03 13:09:37 crc kubenswrapper[4962]: I1003 13:09:37.181284 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/6866606fa3a289e0b44cd13ac7038d6356f0a6aa62e0445808c76e969akjxg7" event={"ID":"963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb","Type":"ContainerDied","Data":"bc7b219b35fee6b007a2647855899d91d5cda25b2d893df80cd9a4c6c36644b7"} Oct 03 13:09:37 crc kubenswrapper[4962]: I1003 13:09:37.181573 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/6866606fa3a289e0b44cd13ac7038d6356f0a6aa62e0445808c76e969akjxg7" event={"ID":"963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb","Type":"ContainerStarted","Data":"4c8a666b8e3841343f7695d01d7f4d7ed5028f606c67c82dae04de507483bdd2"} Oct 03 13:09:38 crc kubenswrapper[4962]: I1003 13:09:38.197067 4962 generic.go:334] "Generic (PLEG): container finished" podID="963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb" containerID="9763dfe74f210cc535a9d9b508e46fabed02e54c5188e219e3bb982d173524bc" exitCode=0 Oct 03 13:09:38 crc kubenswrapper[4962]: I1003 13:09:38.197220 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/6866606fa3a289e0b44cd13ac7038d6356f0a6aa62e0445808c76e969akjxg7" event={"ID":"963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb","Type":"ContainerDied","Data":"9763dfe74f210cc535a9d9b508e46fabed02e54c5188e219e3bb982d173524bc"} Oct 03 13:09:39 crc kubenswrapper[4962]: I1003 13:09:39.204813 4962 generic.go:334] "Generic (PLEG): container finished" podID="963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb" containerID="15c7d9fa5a4781912aa851f5fd469713f6a1c551f47882fa3098250dafd54b40" exitCode=0 Oct 03 13:09:39 crc kubenswrapper[4962]: I1003 13:09:39.204873 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/6866606fa3a289e0b44cd13ac7038d6356f0a6aa62e0445808c76e969akjxg7" event={"ID":"963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb","Type":"ContainerDied","Data":"15c7d9fa5a4781912aa851f5fd469713f6a1c551f47882fa3098250dafd54b40"} Oct 03 13:09:40 crc kubenswrapper[4962]: I1003 13:09:40.463033 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/6866606fa3a289e0b44cd13ac7038d6356f0a6aa62e0445808c76e969akjxg7" Oct 03 13:09:40 crc kubenswrapper[4962]: I1003 13:09:40.603206 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xvsz\" (UniqueName: \"kubernetes.io/projected/963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb-kube-api-access-4xvsz\") pod \"963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb\" (UID: \"963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb\") " Oct 03 13:09:40 crc kubenswrapper[4962]: I1003 13:09:40.603488 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb-bundle\") pod \"963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb\" (UID: \"963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb\") " Oct 03 13:09:40 crc kubenswrapper[4962]: I1003 13:09:40.603665 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb-util\") pod \"963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb\" (UID: \"963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb\") " Oct 03 13:09:40 crc kubenswrapper[4962]: I1003 13:09:40.604125 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb-bundle" (OuterVolumeSpecName: "bundle") pod "963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb" (UID: "963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:09:40 crc kubenswrapper[4962]: I1003 13:09:40.609033 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb-kube-api-access-4xvsz" (OuterVolumeSpecName: "kube-api-access-4xvsz") pod "963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb" (UID: "963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb"). InnerVolumeSpecName "kube-api-access-4xvsz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:09:40 crc kubenswrapper[4962]: I1003 13:09:40.616476 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb-util" (OuterVolumeSpecName: "util") pod "963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb" (UID: "963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:09:40 crc kubenswrapper[4962]: I1003 13:09:40.705486 4962 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb-util\") on node \"crc\" DevicePath \"\"" Oct 03 13:09:40 crc kubenswrapper[4962]: I1003 13:09:40.705524 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xvsz\" (UniqueName: \"kubernetes.io/projected/963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb-kube-api-access-4xvsz\") on node \"crc\" DevicePath \"\"" Oct 03 13:09:40 crc kubenswrapper[4962]: I1003 13:09:40.705536 4962 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:09:41 crc kubenswrapper[4962]: I1003 13:09:41.218557 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/6866606fa3a289e0b44cd13ac7038d6356f0a6aa62e0445808c76e969akjxg7" event={"ID":"963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb","Type":"ContainerDied","Data":"4c8a666b8e3841343f7695d01d7f4d7ed5028f606c67c82dae04de507483bdd2"} Oct 03 13:09:41 crc kubenswrapper[4962]: I1003 13:09:41.218597 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c8a666b8e3841343f7695d01d7f4d7ed5028f606c67c82dae04de507483bdd2" Oct 03 13:09:41 crc kubenswrapper[4962]: I1003 13:09:41.218989 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/6866606fa3a289e0b44cd13ac7038d6356f0a6aa62e0445808c76e969akjxg7" Oct 03 13:09:46 crc kubenswrapper[4962]: I1003 13:09:46.551925 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-764f84468b-skw2k"] Oct 03 13:09:46 crc kubenswrapper[4962]: E1003 13:09:46.552596 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb" containerName="util" Oct 03 13:09:46 crc kubenswrapper[4962]: I1003 13:09:46.552608 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb" containerName="util" Oct 03 13:09:46 crc kubenswrapper[4962]: E1003 13:09:46.552621 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb" containerName="pull" Oct 03 13:09:46 crc kubenswrapper[4962]: I1003 13:09:46.552627 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb" containerName="pull" Oct 03 13:09:46 crc kubenswrapper[4962]: E1003 13:09:46.552645 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb" containerName="extract" Oct 03 13:09:46 crc kubenswrapper[4962]: I1003 13:09:46.552664 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb" containerName="extract" Oct 03 13:09:46 crc kubenswrapper[4962]: I1003 13:09:46.552779 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb" containerName="extract" Oct 03 13:09:46 crc kubenswrapper[4962]: I1003 13:09:46.553345 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-764f84468b-skw2k" Oct 03 13:09:46 crc kubenswrapper[4962]: I1003 13:09:46.556240 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-hptdm" Oct 03 13:09:46 crc kubenswrapper[4962]: I1003 13:09:46.563559 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-764f84468b-skw2k"] Oct 03 13:09:46 crc kubenswrapper[4962]: I1003 13:09:46.682305 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn6px\" (UniqueName: \"kubernetes.io/projected/1b037b6e-0364-4f71-852b-b10f28b37915-kube-api-access-dn6px\") pod \"openstack-operator-controller-operator-764f84468b-skw2k\" (UID: \"1b037b6e-0364-4f71-852b-b10f28b37915\") " pod="openstack-operators/openstack-operator-controller-operator-764f84468b-skw2k" Oct 03 13:09:46 crc kubenswrapper[4962]: I1003 13:09:46.783153 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dn6px\" (UniqueName: \"kubernetes.io/projected/1b037b6e-0364-4f71-852b-b10f28b37915-kube-api-access-dn6px\") pod \"openstack-operator-controller-operator-764f84468b-skw2k\" (UID: \"1b037b6e-0364-4f71-852b-b10f28b37915\") " pod="openstack-operators/openstack-operator-controller-operator-764f84468b-skw2k" Oct 03 13:09:46 crc kubenswrapper[4962]: I1003 13:09:46.804107 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn6px\" (UniqueName: \"kubernetes.io/projected/1b037b6e-0364-4f71-852b-b10f28b37915-kube-api-access-dn6px\") pod \"openstack-operator-controller-operator-764f84468b-skw2k\" (UID: \"1b037b6e-0364-4f71-852b-b10f28b37915\") " pod="openstack-operators/openstack-operator-controller-operator-764f84468b-skw2k" Oct 03 13:09:46 crc kubenswrapper[4962]: I1003 13:09:46.882545 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-764f84468b-skw2k" Oct 03 13:09:47 crc kubenswrapper[4962]: I1003 13:09:47.086323 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-764f84468b-skw2k"] Oct 03 13:09:47 crc kubenswrapper[4962]: I1003 13:09:47.260422 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-764f84468b-skw2k" event={"ID":"1b037b6e-0364-4f71-852b-b10f28b37915","Type":"ContainerStarted","Data":"f7a70badece411447af99b781d4a2ca10c050612045d0c015a28d42e5dcf1546"} Oct 03 13:09:51 crc kubenswrapper[4962]: I1003 13:09:51.283568 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-764f84468b-skw2k" event={"ID":"1b037b6e-0364-4f71-852b-b10f28b37915","Type":"ContainerStarted","Data":"5514e6e4087738fb1a728084feba8426d5b0155c2a1dd1b9deeda2dd6b5da664"} Oct 03 13:09:53 crc kubenswrapper[4962]: I1003 13:09:53.323660 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-764f84468b-skw2k" event={"ID":"1b037b6e-0364-4f71-852b-b10f28b37915","Type":"ContainerStarted","Data":"f32c18153be028c5cbfd2417eb64ae15859ee55edfbf13dc497656813f89efb4"} Oct 03 13:09:53 crc kubenswrapper[4962]: I1003 13:09:53.325303 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-764f84468b-skw2k" Oct 03 13:09:53 crc kubenswrapper[4962]: I1003 13:09:53.357913 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-764f84468b-skw2k" podStartSLOduration=1.641374052 podStartE2EDuration="7.357896497s" podCreationTimestamp="2025-10-03 13:09:46 +0000 UTC" firstStartedPulling="2025-10-03 13:09:47.095451638 +0000 UTC m=+1195.499349473" lastFinishedPulling="2025-10-03 13:09:52.811974083 +0000 UTC m=+1201.215871918" observedRunningTime="2025-10-03 13:09:53.356696865 +0000 UTC m=+1201.760594700" watchObservedRunningTime="2025-10-03 13:09:53.357896497 +0000 UTC m=+1201.761794332" Oct 03 13:09:55 crc kubenswrapper[4962]: I1003 13:09:55.340593 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-764f84468b-skw2k" Oct 03 13:10:24 crc kubenswrapper[4962]: I1003 13:10:24.660324 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 13:10:24 crc kubenswrapper[4962]: I1003 13:10:24.660954 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.113940 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-79d68d6c85-hmld4"] Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.116410 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-hmld4" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.117564 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6c675fb79f-np2sg"] Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.124728 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-np2sg" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.125669 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-5w8xj" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.138676 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-ts4qp" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.151693 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6c675fb79f-np2sg"] Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.166584 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-79d68d6c85-hmld4"] Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.175901 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-7gwjc"] Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.177256 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-7gwjc" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.181378 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-55kn6" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.197501 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-846dff85b5-pf6wd"] Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.198586 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-pf6wd" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.206150 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw2cl\" (UniqueName: \"kubernetes.io/projected/e78401d0-e96e-41ff-b4f6-97d72553280b-kube-api-access-jw2cl\") pod \"cinder-operator-controller-manager-79d68d6c85-hmld4\" (UID: \"e78401d0-e96e-41ff-b4f6-97d72553280b\") " pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-hmld4" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.206188 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksvxc\" (UniqueName: \"kubernetes.io/projected/141599a7-33ff-4db6-b265-1f0e3407fdf5-kube-api-access-ksvxc\") pod \"barbican-operator-controller-manager-6c675fb79f-np2sg\" (UID: \"141599a7-33ff-4db6-b265-1f0e3407fdf5\") " pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-np2sg" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.206214 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxmj5\" (UniqueName: \"kubernetes.io/projected/9e4441f1-c997-4e2e-b2b9-ff6e05718dfd-kube-api-access-nxmj5\") pod \"designate-operator-controller-manager-75dfd9b554-7gwjc\" (UID: \"9e4441f1-c997-4e2e-b2b9-ff6e05718dfd\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-7gwjc" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.206281 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbw2z\" (UniqueName: \"kubernetes.io/projected/b7991083-22d5-42c8-8df0-6e93acee716b-kube-api-access-pbw2z\") pod \"glance-operator-controller-manager-846dff85b5-pf6wd\" (UID: \"b7991083-22d5-42c8-8df0-6e93acee716b\") " pod="openstack-operators/glance-operator-controller-manager-846dff85b5-pf6wd" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.207144 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-6v8qx" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.215016 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-7gwjc"] Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.225269 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-846dff85b5-pf6wd"] Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.253700 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-599898f689-82zrm"] Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.254709 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-599898f689-82zrm" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.268056 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-v5xdg" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.269452 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6769b867d9-6tjz7"] Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.270582 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-6tjz7" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.277429 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-6vbgl" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.283323 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-599898f689-82zrm"] Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.297645 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6769b867d9-6tjz7"] Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.311422 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbw2z\" (UniqueName: \"kubernetes.io/projected/b7991083-22d5-42c8-8df0-6e93acee716b-kube-api-access-pbw2z\") pod \"glance-operator-controller-manager-846dff85b5-pf6wd\" (UID: \"b7991083-22d5-42c8-8df0-6e93acee716b\") " pod="openstack-operators/glance-operator-controller-manager-846dff85b5-pf6wd" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.311473 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jw2cl\" (UniqueName: \"kubernetes.io/projected/e78401d0-e96e-41ff-b4f6-97d72553280b-kube-api-access-jw2cl\") pod \"cinder-operator-controller-manager-79d68d6c85-hmld4\" (UID: \"e78401d0-e96e-41ff-b4f6-97d72553280b\") " pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-hmld4" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.311497 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksvxc\" (UniqueName: \"kubernetes.io/projected/141599a7-33ff-4db6-b265-1f0e3407fdf5-kube-api-access-ksvxc\") pod \"barbican-operator-controller-manager-6c675fb79f-np2sg\" (UID: \"141599a7-33ff-4db6-b265-1f0e3407fdf5\") " pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-np2sg" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.311727 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxmj5\" (UniqueName: \"kubernetes.io/projected/9e4441f1-c997-4e2e-b2b9-ff6e05718dfd-kube-api-access-nxmj5\") pod \"designate-operator-controller-manager-75dfd9b554-7gwjc\" (UID: \"9e4441f1-c997-4e2e-b2b9-ff6e05718dfd\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-7gwjc" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.319995 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-84bc9db6cc-tnwbc"] Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.321380 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-tnwbc" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.324556 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-zkbw7" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.333919 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-5fbf469cd7-n2xlz"] Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.334908 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-n2xlz" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.342736 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-84bc9db6cc-tnwbc"] Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.344794 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.344907 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-td5lz" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.346251 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksvxc\" (UniqueName: \"kubernetes.io/projected/141599a7-33ff-4db6-b265-1f0e3407fdf5-kube-api-access-ksvxc\") pod \"barbican-operator-controller-manager-6c675fb79f-np2sg\" (UID: \"141599a7-33ff-4db6-b265-1f0e3407fdf5\") " pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-np2sg" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.347744 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5fbf469cd7-n2xlz"] Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.347860 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbw2z\" (UniqueName: \"kubernetes.io/projected/b7991083-22d5-42c8-8df0-6e93acee716b-kube-api-access-pbw2z\") pod \"glance-operator-controller-manager-846dff85b5-pf6wd\" (UID: \"b7991083-22d5-42c8-8df0-6e93acee716b\") " pod="openstack-operators/glance-operator-controller-manager-846dff85b5-pf6wd" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.352350 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxmj5\" (UniqueName: \"kubernetes.io/projected/9e4441f1-c997-4e2e-b2b9-ff6e05718dfd-kube-api-access-nxmj5\") pod \"designate-operator-controller-manager-75dfd9b554-7gwjc\" (UID: \"9e4441f1-c997-4e2e-b2b9-ff6e05718dfd\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-7gwjc" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.356927 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw2cl\" (UniqueName: \"kubernetes.io/projected/e78401d0-e96e-41ff-b4f6-97d72553280b-kube-api-access-jw2cl\") pod \"cinder-operator-controller-manager-79d68d6c85-hmld4\" (UID: \"e78401d0-e96e-41ff-b4f6-97d72553280b\") " pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-hmld4" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.372089 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7f55849f88-nhcsm"] Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.373290 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-nhcsm" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.380682 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-6fd6854b49-9lpjm"] Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.381912 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-9lpjm" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.391358 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7f55849f88-nhcsm"] Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.393079 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-fwthv" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.393337 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-xjw5h" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.404934 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6fd6854b49-9lpjm"] Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.413203 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzrmg\" (UniqueName: \"kubernetes.io/projected/df253268-0372-4cff-a9a0-ff9d4d8eac7b-kube-api-access-nzrmg\") pod \"horizon-operator-controller-manager-6769b867d9-6tjz7\" (UID: \"df253268-0372-4cff-a9a0-ff9d4d8eac7b\") " pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-6tjz7" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.413340 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr44m\" (UniqueName: \"kubernetes.io/projected/2cf61ca5-fa4e-4f11-bfed-0a81aa7140a6-kube-api-access-hr44m\") pod \"heat-operator-controller-manager-599898f689-82zrm\" (UID: \"2cf61ca5-fa4e-4f11-bfed-0a81aa7140a6\") " pod="openstack-operators/heat-operator-controller-manager-599898f689-82zrm" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.421238 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-gm9lt"] Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.424618 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-gm9lt" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.428760 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-gm9lt"] Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.429475 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-rwff8" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.442695 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6574bf987d-7f4r6"] Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.443917 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-7f4r6" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.447512 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-hmld4" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.468945 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6574bf987d-7f4r6"] Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.472517 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-vqr5l" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.473667 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-np2sg" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.487438 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-555c7456bd-r52nf"] Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.488592 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-r52nf" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.499427 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-5pmmk" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.514721 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-555c7456bd-r52nf"] Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.514888 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zhbs\" (UniqueName: \"kubernetes.io/projected/e7bd048c-266e-45eb-8755-1bac673b02cc-kube-api-access-8zhbs\") pod \"infra-operator-controller-manager-5fbf469cd7-n2xlz\" (UID: \"e7bd048c-266e-45eb-8755-1bac673b02cc\") " pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-n2xlz" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.514943 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr44m\" (UniqueName: \"kubernetes.io/projected/2cf61ca5-fa4e-4f11-bfed-0a81aa7140a6-kube-api-access-hr44m\") pod \"heat-operator-controller-manager-599898f689-82zrm\" (UID: \"2cf61ca5-fa4e-4f11-bfed-0a81aa7140a6\") " pod="openstack-operators/heat-operator-controller-manager-599898f689-82zrm" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.514965 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e7bd048c-266e-45eb-8755-1bac673b02cc-cert\") pod \"infra-operator-controller-manager-5fbf469cd7-n2xlz\" (UID: \"e7bd048c-266e-45eb-8755-1bac673b02cc\") " pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-n2xlz" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.514982 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m984x\" (UniqueName: \"kubernetes.io/projected/51054362-5c8b-4866-bb58-c0cff476b726-kube-api-access-m984x\") pod \"ironic-operator-controller-manager-84bc9db6cc-tnwbc\" (UID: \"51054362-5c8b-4866-bb58-c0cff476b726\") " pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-tnwbc" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.515016 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzrmg\" 
(UniqueName: \"kubernetes.io/projected/df253268-0372-4cff-a9a0-ff9d4d8eac7b-kube-api-access-nzrmg\") pod \"horizon-operator-controller-manager-6769b867d9-6tjz7\" (UID: \"df253268-0372-4cff-a9a0-ff9d4d8eac7b\") " pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-6tjz7" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.515068 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cpmr\" (UniqueName: \"kubernetes.io/projected/592a09ab-92d9-447f-9978-e9da27cc4df9-kube-api-access-7cpmr\") pod \"manila-operator-controller-manager-6fd6854b49-9lpjm\" (UID: \"592a09ab-92d9-447f-9978-e9da27cc4df9\") " pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-9lpjm" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.515089 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5h7b\" (UniqueName: \"kubernetes.io/projected/564d1ea9-035c-4ac3-8692-907bf54a2d01-kube-api-access-x5h7b\") pod \"keystone-operator-controller-manager-7f55849f88-nhcsm\" (UID: \"564d1ea9-035c-4ac3-8692-907bf54a2d01\") " pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-nhcsm" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.515247 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-7gwjc" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.534229 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-pf6wd" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.543823 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr44m\" (UniqueName: \"kubernetes.io/projected/2cf61ca5-fa4e-4f11-bfed-0a81aa7140a6-kube-api-access-hr44m\") pod \"heat-operator-controller-manager-599898f689-82zrm\" (UID: \"2cf61ca5-fa4e-4f11-bfed-0a81aa7140a6\") " pod="openstack-operators/heat-operator-controller-manager-599898f689-82zrm" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.554284 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzrmg\" (UniqueName: \"kubernetes.io/projected/df253268-0372-4cff-a9a0-ff9d4d8eac7b-kube-api-access-nzrmg\") pod \"horizon-operator-controller-manager-6769b867d9-6tjz7\" (UID: \"df253268-0372-4cff-a9a0-ff9d4d8eac7b\") " pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-6tjz7" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.572682 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-59d6cfdf45-rdflf"] Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.573736 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-rdflf" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.577669 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-djpwk" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.592052 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-599898f689-82zrm" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.608005 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-59d6cfdf45-rdflf"] Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.626497 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-6tjz7" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.655931 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e7bd048c-266e-45eb-8755-1bac673b02cc-cert\") pod \"infra-operator-controller-manager-5fbf469cd7-n2xlz\" (UID: \"e7bd048c-266e-45eb-8755-1bac673b02cc\") " pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-n2xlz" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.656016 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m984x\" (UniqueName: \"kubernetes.io/projected/51054362-5c8b-4866-bb58-c0cff476b726-kube-api-access-m984x\") pod \"ironic-operator-controller-manager-84bc9db6cc-tnwbc\" (UID: \"51054362-5c8b-4866-bb58-c0cff476b726\") " pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-tnwbc" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.656164 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5hpm\" (UniqueName: \"kubernetes.io/projected/a062596b-8e37-4963-bf83-e37ed388bf83-kube-api-access-z5hpm\") pod \"neutron-operator-controller-manager-6574bf987d-7f4r6\" (UID: \"a062596b-8e37-4963-bf83-e37ed388bf83\") " pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-7f4r6" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.656193 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9vvk\" (UniqueName: \"kubernetes.io/projected/fbda4a74-6131-47cb-9098-f23870f67916-kube-api-access-q9vvk\") pod \"mariadb-operator-controller-manager-5c468bf4d4-gm9lt\" (UID: \"fbda4a74-6131-47cb-9098-f23870f67916\") " pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-gm9lt" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.656245 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cpmr\" (UniqueName: \"kubernetes.io/projected/592a09ab-92d9-447f-9978-e9da27cc4df9-kube-api-access-7cpmr\") pod \"manila-operator-controller-manager-6fd6854b49-9lpjm\" (UID: \"592a09ab-92d9-447f-9978-e9da27cc4df9\") " pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-9lpjm" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.656282 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5h7b\" (UniqueName: \"kubernetes.io/projected/564d1ea9-035c-4ac3-8692-907bf54a2d01-kube-api-access-x5h7b\") pod \"keystone-operator-controller-manager-7f55849f88-nhcsm\" (UID: \"564d1ea9-035c-4ac3-8692-907bf54a2d01\") " pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-nhcsm" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.656358 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hsxg\" (UniqueName: 
\"kubernetes.io/projected/9a84d9fa-a8e7-4280-b500-d4f68500e13e-kube-api-access-4hsxg\") pod \"octavia-operator-controller-manager-59d6cfdf45-rdflf\" (UID: \"9a84d9fa-a8e7-4280-b500-d4f68500e13e\") " pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-rdflf" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.656404 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zhbs\" (UniqueName: \"kubernetes.io/projected/e7bd048c-266e-45eb-8755-1bac673b02cc-kube-api-access-8zhbs\") pod \"infra-operator-controller-manager-5fbf469cd7-n2xlz\" (UID: \"e7bd048c-266e-45eb-8755-1bac673b02cc\") " pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-n2xlz" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.656438 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gnvj\" (UniqueName: \"kubernetes.io/projected/49eb417d-aaa0-4aab-b107-becad30c4185-kube-api-access-6gnvj\") pod \"nova-operator-controller-manager-555c7456bd-r52nf\" (UID: \"49eb417d-aaa0-4aab-b107-becad30c4185\") " pod="openstack-operators/nova-operator-controller-manager-555c7456bd-r52nf" Oct 03 13:10:28 crc kubenswrapper[4962]: E1003 13:10:28.656664 4962 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 03 13:10:28 crc kubenswrapper[4962]: E1003 13:10:28.658462 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7bd048c-266e-45eb-8755-1bac673b02cc-cert podName:e7bd048c-266e-45eb-8755-1bac673b02cc nodeName:}" failed. No retries permitted until 2025-10-03 13:10:29.156692745 +0000 UTC m=+1237.560590580 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e7bd048c-266e-45eb-8755-1bac673b02cc-cert") pod "infra-operator-controller-manager-5fbf469cd7-n2xlz" (UID: "e7bd048c-266e-45eb-8755-1bac673b02cc") : secret "infra-operator-webhook-server-cert" not found Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.662786 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-688db7b6c7-q95dg"] Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.675115 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-q95dg" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.679804 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678jlpx7"] Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.683387 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678jlpx7" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.690399 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-x956p" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.694181 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.702219 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zhbs\" (UniqueName: \"kubernetes.io/projected/e7bd048c-266e-45eb-8755-1bac673b02cc-kube-api-access-8zhbs\") pod \"infra-operator-controller-manager-5fbf469cd7-n2xlz\" (UID: \"e7bd048c-266e-45eb-8755-1bac673b02cc\") " pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-n2xlz" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.705042 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5h7b\" (UniqueName: \"kubernetes.io/projected/564d1ea9-035c-4ac3-8692-907bf54a2d01-kube-api-access-x5h7b\") pod \"keystone-operator-controller-manager-7f55849f88-nhcsm\" (UID: \"564d1ea9-035c-4ac3-8692-907bf54a2d01\") " pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-nhcsm" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.706953 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-fttmn" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.707331 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m984x\" (UniqueName: \"kubernetes.io/projected/51054362-5c8b-4866-bb58-c0cff476b726-kube-api-access-m984x\") pod \"ironic-operator-controller-manager-84bc9db6cc-tnwbc\" (UID: \"51054362-5c8b-4866-bb58-c0cff476b726\") " pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-tnwbc" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.710997 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-7d8bb7f44c-jq2cl"] Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.726388 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-jq2cl" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.727392 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cpmr\" (UniqueName: \"kubernetes.io/projected/592a09ab-92d9-447f-9978-e9da27cc4df9-kube-api-access-7cpmr\") pod \"manila-operator-controller-manager-6fd6854b49-9lpjm\" (UID: \"592a09ab-92d9-447f-9978-e9da27cc4df9\") " pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-9lpjm" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.733815 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-rdrx9" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.764962 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-nhcsm" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.765874 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-9lpjm" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.788277 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hsxg\" (UniqueName: \"kubernetes.io/projected/9a84d9fa-a8e7-4280-b500-d4f68500e13e-kube-api-access-4hsxg\") pod \"octavia-operator-controller-manager-59d6cfdf45-rdflf\" (UID: \"9a84d9fa-a8e7-4280-b500-d4f68500e13e\") " pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-rdflf" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.788939 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89sws\" (UniqueName: \"kubernetes.io/projected/7ada00d0-e4a6-46a9-872b-e554866a03c6-kube-api-access-89sws\") pod \"ovn-operator-controller-manager-688db7b6c7-q95dg\" (UID: \"7ada00d0-e4a6-46a9-872b-e554866a03c6\") " pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-q95dg" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.789151 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gnvj\" (UniqueName: \"kubernetes.io/projected/49eb417d-aaa0-4aab-b107-becad30c4185-kube-api-access-6gnvj\") pod \"nova-operator-controller-manager-555c7456bd-r52nf\" (UID: \"49eb417d-aaa0-4aab-b107-becad30c4185\") " pod="openstack-operators/nova-operator-controller-manager-555c7456bd-r52nf" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.789251 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2lmc\" (UniqueName: \"kubernetes.io/projected/22736612-9f52-42a5-a773-4389ae1473d0-kube-api-access-b2lmc\") pod \"placement-operator-controller-manager-7d8bb7f44c-jq2cl\" (UID: \"22736612-9f52-42a5-a773-4389ae1473d0\") " pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-jq2cl" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.789393 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5hpm\" (UniqueName: \"kubernetes.io/projected/a062596b-8e37-4963-bf83-e37ed388bf83-kube-api-access-z5hpm\") pod \"neutron-operator-controller-manager-6574bf987d-7f4r6\" (UID: \"a062596b-8e37-4963-bf83-e37ed388bf83\") " pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-7f4r6" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.789445 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-tgrxn"] Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.789466 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9vvk\" (UniqueName: \"kubernetes.io/projected/fbda4a74-6131-47cb-9098-f23870f67916-kube-api-access-q9vvk\") pod \"mariadb-operator-controller-manager-5c468bf4d4-gm9lt\" (UID: \"fbda4a74-6131-47cb-9098-f23870f67916\") " pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-gm9lt" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.791102 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-tgrxn" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.800852 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-688db7b6c7-q95dg"] Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.810404 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-rvhgh" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.817883 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678jlpx7"] Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.822745 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-7d8bb7f44c-jq2cl"] Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.823509 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gnvj\" (UniqueName: \"kubernetes.io/projected/49eb417d-aaa0-4aab-b107-becad30c4185-kube-api-access-6gnvj\") pod \"nova-operator-controller-manager-555c7456bd-r52nf\" (UID: \"49eb417d-aaa0-4aab-b107-becad30c4185\") " pod="openstack-operators/nova-operator-controller-manager-555c7456bd-r52nf" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.838205 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-tgrxn"] Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.842919 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9vvk\" (UniqueName: \"kubernetes.io/projected/fbda4a74-6131-47cb-9098-f23870f67916-kube-api-access-q9vvk\") pod \"mariadb-operator-controller-manager-5c468bf4d4-gm9lt\" (UID: \"fbda4a74-6131-47cb-9098-f23870f67916\") " pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-gm9lt" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.852895 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5db5cf686f-pgf8s"] Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.858911 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hsxg\" (UniqueName: \"kubernetes.io/projected/9a84d9fa-a8e7-4280-b500-d4f68500e13e-kube-api-access-4hsxg\") pod \"octavia-operator-controller-manager-59d6cfdf45-rdflf\" (UID: \"9a84d9fa-a8e7-4280-b500-d4f68500e13e\") " pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-rdflf" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.859489 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5hpm\" (UniqueName: \"kubernetes.io/projected/a062596b-8e37-4963-bf83-e37ed388bf83-kube-api-access-z5hpm\") pod \"neutron-operator-controller-manager-6574bf987d-7f4r6\" (UID: \"a062596b-8e37-4963-bf83-e37ed388bf83\") " pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-7f4r6" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.866741 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-pgf8s" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.872004 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-m8t5d" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.872216 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5db5cf686f-pgf8s"] Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.872871 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-r52nf" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.890993 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c0193fb9-bb01-4730-a986-7f03c3b61887-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64c4d678jlpx7\" (UID: \"c0193fb9-bb01-4730-a986-7f03c3b61887\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678jlpx7" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.891055 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89sws\" (UniqueName: \"kubernetes.io/projected/7ada00d0-e4a6-46a9-872b-e554866a03c6-kube-api-access-89sws\") pod \"ovn-operator-controller-manager-688db7b6c7-q95dg\" (UID: \"7ada00d0-e4a6-46a9-872b-e554866a03c6\") " pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-q95dg" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.891103 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2lmc\" (UniqueName: \"kubernetes.io/projected/22736612-9f52-42a5-a773-4389ae1473d0-kube-api-access-b2lmc\") pod \"placement-operator-controller-manager-7d8bb7f44c-jq2cl\" (UID: \"22736612-9f52-42a5-a773-4389ae1473d0\") " pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-jq2cl" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.891134 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht5dr\" (UniqueName: \"kubernetes.io/projected/c0193fb9-bb01-4730-a986-7f03c3b61887-kube-api-access-ht5dr\") pod \"openstack-baremetal-operator-controller-manager-6f64c4d678jlpx7\" (UID: \"c0193fb9-bb01-4730-a986-7f03c3b61887\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678jlpx7" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.901773 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-rdflf" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.922541 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89sws\" (UniqueName: \"kubernetes.io/projected/7ada00d0-e4a6-46a9-872b-e554866a03c6-kube-api-access-89sws\") pod \"ovn-operator-controller-manager-688db7b6c7-q95dg\" (UID: \"7ada00d0-e4a6-46a9-872b-e554866a03c6\") " pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-q95dg" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.927836 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-2k5gq"] Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.929142 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-2k5gq" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.931096 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2lmc\" (UniqueName: \"kubernetes.io/projected/22736612-9f52-42a5-a773-4389ae1473d0-kube-api-access-b2lmc\") pod \"placement-operator-controller-manager-7d8bb7f44c-jq2cl\" (UID: \"22736612-9f52-42a5-a773-4389ae1473d0\") " pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-jq2cl" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.935433 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-rdk98" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.952667 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-2k5gq"] Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.986812 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-fcd7d9895-2fvpg"] Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.989965 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-2fvpg" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.993511 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-6gjgk" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.995920 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6478\" (UniqueName: \"kubernetes.io/projected/da3763da-ed79-40cc-bf74-10729759437e-kube-api-access-c6478\") pod \"swift-operator-controller-manager-6859f9b676-tgrxn\" (UID: \"da3763da-ed79-40cc-bf74-10729759437e\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-tgrxn" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.996058 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht5dr\" (UniqueName: \"kubernetes.io/projected/c0193fb9-bb01-4730-a986-7f03c3b61887-kube-api-access-ht5dr\") pod \"openstack-baremetal-operator-controller-manager-6f64c4d678jlpx7\" (UID: \"c0193fb9-bb01-4730-a986-7f03c3b61887\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678jlpx7" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.996166 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dv2j\" (UniqueName: \"kubernetes.io/projected/74fa1f6e-ee12-4c7c-97f0-23586ec9983c-kube-api-access-5dv2j\") pod \"telemetry-operator-controller-manager-5db5cf686f-pgf8s\" (UID: \"74fa1f6e-ee12-4c7c-97f0-23586ec9983c\") " pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-pgf8s" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.996284 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8hsc\" (UniqueName: \"kubernetes.io/projected/0f094302-eaab-4160-b549-530131588472-kube-api-access-d8hsc\") pod \"watcher-operator-controller-manager-fcd7d9895-2fvpg\" (UID: \"0f094302-eaab-4160-b549-530131588472\") " pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-2fvpg" Oct 03 13:10:28 crc kubenswrapper[4962]: I1003 13:10:28.996388 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c0193fb9-bb01-4730-a986-7f03c3b61887-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64c4d678jlpx7\" (UID: \"c0193fb9-bb01-4730-a986-7f03c3b61887\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678jlpx7" Oct 03 13:10:28 crc kubenswrapper[4962]: E1003 13:10:28.997555 4962 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 03 13:10:28 crc kubenswrapper[4962]: E1003 13:10:28.998961 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0193fb9-bb01-4730-a986-7f03c3b61887-cert podName:c0193fb9-bb01-4730-a986-7f03c3b61887 nodeName:}" failed. No retries permitted until 2025-10-03 13:10:29.498934399 +0000 UTC m=+1237.902832234 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c0193fb9-bb01-4730-a986-7f03c3b61887-cert") pod "openstack-baremetal-operator-controller-manager-6f64c4d678jlpx7" (UID: "c0193fb9-bb01-4730-a986-7f03c3b61887") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:28.999214 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-tnwbc" Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.009327 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-fcd7d9895-2fvpg"] Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.009552 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-q95dg" Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.022891 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht5dr\" (UniqueName: \"kubernetes.io/projected/c0193fb9-bb01-4730-a986-7f03c3b61887-kube-api-access-ht5dr\") pod \"openstack-baremetal-operator-controller-manager-6f64c4d678jlpx7\" (UID: \"c0193fb9-bb01-4730-a986-7f03c3b61887\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678jlpx7" Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.042745 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5c4446bf96-p5f6q"] Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.044907 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5c4446bf96-p5f6q" Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.049223 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.049597 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-hvbgh" Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.081755 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5c4446bf96-p5f6q"] Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.096557 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-j758z"] Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.098996 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-j758z" Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.101513 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-5b2l6" Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.111420 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8hsc\" (UniqueName: \"kubernetes.io/projected/0f094302-eaab-4160-b549-530131588472-kube-api-access-d8hsc\") pod \"watcher-operator-controller-manager-fcd7d9895-2fvpg\" (UID: \"0f094302-eaab-4160-b549-530131588472\") " pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-2fvpg" Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.111470 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztbfd\" (UniqueName: \"kubernetes.io/projected/0708bcfe-3c0e-44d8-b849-fa4464ea3387-kube-api-access-ztbfd\") pod \"test-operator-controller-manager-5cd5cb47d7-2k5gq\" (UID: \"0708bcfe-3c0e-44d8-b849-fa4464ea3387\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-2k5gq" Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.111514 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0786c124-4dcc-437d-8f1b-80021feb3553-cert\") pod \"openstack-operator-controller-manager-5c4446bf96-p5f6q\" (UID: \"0786c124-4dcc-437d-8f1b-80021feb3553\") " pod="openstack-operators/openstack-operator-controller-manager-5c4446bf96-p5f6q" Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.111587 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6478\" (UniqueName: \"kubernetes.io/projected/da3763da-ed79-40cc-bf74-10729759437e-kube-api-access-c6478\") pod \"swift-operator-controller-manager-6859f9b676-tgrxn\" (UID: \"da3763da-ed79-40cc-bf74-10729759437e\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-tgrxn" Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.111645 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dv2j\" (UniqueName: \"kubernetes.io/projected/74fa1f6e-ee12-4c7c-97f0-23586ec9983c-kube-api-access-5dv2j\") pod \"telemetry-operator-controller-manager-5db5cf686f-pgf8s\" (UID: \"74fa1f6e-ee12-4c7c-97f0-23586ec9983c\") " pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-pgf8s" Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.111683 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9w5k\" (UniqueName: \"kubernetes.io/projected/0786c124-4dcc-437d-8f1b-80021feb3553-kube-api-access-n9w5k\") pod \"openstack-operator-controller-manager-5c4446bf96-p5f6q\" (UID: \"0786c124-4dcc-437d-8f1b-80021feb3553\") " pod="openstack-operators/openstack-operator-controller-manager-5c4446bf96-p5f6q" Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.119220 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-jq2cl" Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.129707 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-j758z"] Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.137051 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-gm9lt" Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.143459 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6478\" (UniqueName: \"kubernetes.io/projected/da3763da-ed79-40cc-bf74-10729759437e-kube-api-access-c6478\") pod \"swift-operator-controller-manager-6859f9b676-tgrxn\" (UID: \"da3763da-ed79-40cc-bf74-10729759437e\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-tgrxn" Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.153016 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8hsc\" (UniqueName: \"kubernetes.io/projected/0f094302-eaab-4160-b549-530131588472-kube-api-access-d8hsc\") pod \"watcher-operator-controller-manager-fcd7d9895-2fvpg\" (UID: \"0f094302-eaab-4160-b549-530131588472\") " pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-2fvpg" Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.153591 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-tgrxn" Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.155117 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-7f4r6" Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.159198 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dv2j\" (UniqueName: \"kubernetes.io/projected/74fa1f6e-ee12-4c7c-97f0-23586ec9983c-kube-api-access-5dv2j\") pod \"telemetry-operator-controller-manager-5db5cf686f-pgf8s\" (UID: \"74fa1f6e-ee12-4c7c-97f0-23586ec9983c\") " pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-pgf8s" Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.214450 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e7bd048c-266e-45eb-8755-1bac673b02cc-cert\") pod \"infra-operator-controller-manager-5fbf469cd7-n2xlz\" (UID: \"e7bd048c-266e-45eb-8755-1bac673b02cc\") " pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-n2xlz" Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.214601 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9w5k\" (UniqueName: \"kubernetes.io/projected/0786c124-4dcc-437d-8f1b-80021feb3553-kube-api-access-n9w5k\") pod \"openstack-operator-controller-manager-5c4446bf96-p5f6q\" (UID: \"0786c124-4dcc-437d-8f1b-80021feb3553\") " pod="openstack-operators/openstack-operator-controller-manager-5c4446bf96-p5f6q" Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.214739 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztbfd\" (UniqueName: \"kubernetes.io/projected/0708bcfe-3c0e-44d8-b849-fa4464ea3387-kube-api-access-ztbfd\") pod \"test-operator-controller-manager-5cd5cb47d7-2k5gq\" (UID: 
\"0708bcfe-3c0e-44d8-b849-fa4464ea3387\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-2k5gq" Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.214936 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz5wf\" (UniqueName: \"kubernetes.io/projected/908550fb-fd0f-4363-aaab-3434aae03751-kube-api-access-qz5wf\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-j758z\" (UID: \"908550fb-fd0f-4363-aaab-3434aae03751\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-j758z" Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.221305 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-7gwjc"] Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.221498 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0786c124-4dcc-437d-8f1b-80021feb3553-cert\") pod \"openstack-operator-controller-manager-5c4446bf96-p5f6q\" (UID: \"0786c124-4dcc-437d-8f1b-80021feb3553\") " pod="openstack-operators/openstack-operator-controller-manager-5c4446bf96-p5f6q" Oct 03 13:10:29 crc kubenswrapper[4962]: E1003 13:10:29.221775 4962 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 03 13:10:29 crc kubenswrapper[4962]: E1003 13:10:29.221861 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0786c124-4dcc-437d-8f1b-80021feb3553-cert podName:0786c124-4dcc-437d-8f1b-80021feb3553 nodeName:}" failed. No retries permitted until 2025-10-03 13:10:29.72184211 +0000 UTC m=+1238.125740035 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0786c124-4dcc-437d-8f1b-80021feb3553-cert") pod "openstack-operator-controller-manager-5c4446bf96-p5f6q" (UID: "0786c124-4dcc-437d-8f1b-80021feb3553") : secret "webhook-server-cert" not found Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.237348 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e7bd048c-266e-45eb-8755-1bac673b02cc-cert\") pod \"infra-operator-controller-manager-5fbf469cd7-n2xlz\" (UID: \"e7bd048c-266e-45eb-8755-1bac673b02cc\") " pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-n2xlz" Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.258459 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztbfd\" (UniqueName: \"kubernetes.io/projected/0708bcfe-3c0e-44d8-b849-fa4464ea3387-kube-api-access-ztbfd\") pod \"test-operator-controller-manager-5cd5cb47d7-2k5gq\" (UID: \"0708bcfe-3c0e-44d8-b849-fa4464ea3387\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-2k5gq" Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.262306 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9w5k\" (UniqueName: \"kubernetes.io/projected/0786c124-4dcc-437d-8f1b-80021feb3553-kube-api-access-n9w5k\") pod \"openstack-operator-controller-manager-5c4446bf96-p5f6q\" (UID: \"0786c124-4dcc-437d-8f1b-80021feb3553\") " pod="openstack-operators/openstack-operator-controller-manager-5c4446bf96-p5f6q" Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.267447 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/barbican-operator-controller-manager-6c675fb79f-np2sg"] Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.297484 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-pgf8s" Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.321188 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-n2xlz" Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.325225 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qz5wf\" (UniqueName: \"kubernetes.io/projected/908550fb-fd0f-4363-aaab-3434aae03751-kube-api-access-qz5wf\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-j758z\" (UID: \"908550fb-fd0f-4363-aaab-3434aae03751\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-j758z" Oct 03 13:10:29 crc kubenswrapper[4962]: W1003 13:10:29.341539 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e4441f1_c997_4e2e_b2b9_ff6e05718dfd.slice/crio-a02156c13037ead4f69cb4f140a98f016a66ae078caf1a82033c62c2ad2218f8 WatchSource:0}: Error finding container a02156c13037ead4f69cb4f140a98f016a66ae078caf1a82033c62c2ad2218f8: Status 404 returned error can't find the container with id a02156c13037ead4f69cb4f140a98f016a66ae078caf1a82033c62c2ad2218f8 Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.347873 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qz5wf\" (UniqueName: \"kubernetes.io/projected/908550fb-fd0f-4363-aaab-3434aae03751-kube-api-access-qz5wf\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-j758z\" (UID: \"908550fb-fd0f-4363-aaab-3434aae03751\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-j758z" Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.356328 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-2k5gq" Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.381679 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-2fvpg" Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.429039 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-j758z" Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.534155 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c0193fb9-bb01-4730-a986-7f03c3b61887-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64c4d678jlpx7\" (UID: \"c0193fb9-bb01-4730-a986-7f03c3b61887\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678jlpx7" Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.538233 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c0193fb9-bb01-4730-a986-7f03c3b61887-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64c4d678jlpx7\" (UID: \"c0193fb9-bb01-4730-a986-7f03c3b61887\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678jlpx7" Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.556353 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-79d68d6c85-hmld4"] Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.589402 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-846dff85b5-pf6wd"] Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.590752 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-7gwjc" event={"ID":"9e4441f1-c997-4e2e-b2b9-ff6e05718dfd","Type":"ContainerStarted","Data":"a02156c13037ead4f69cb4f140a98f016a66ae078caf1a82033c62c2ad2218f8"} Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.598530 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-np2sg" event={"ID":"141599a7-33ff-4db6-b265-1f0e3407fdf5","Type":"ContainerStarted","Data":"5858208625a271528d1fc81982136523017deda49e5ecb8e13abfd1f4eb05f7e"} Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.680971 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678jlpx7" Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.721126 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6769b867d9-6tjz7"] Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.733089 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7f55849f88-nhcsm"] Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.742721 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0786c124-4dcc-437d-8f1b-80021feb3553-cert\") pod \"openstack-operator-controller-manager-5c4446bf96-p5f6q\" (UID: \"0786c124-4dcc-437d-8f1b-80021feb3553\") " pod="openstack-operators/openstack-operator-controller-manager-5c4446bf96-p5f6q" Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.752706 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0786c124-4dcc-437d-8f1b-80021feb3553-cert\") pod \"openstack-operator-controller-manager-5c4446bf96-p5f6q\" (UID: \"0786c124-4dcc-437d-8f1b-80021feb3553\") " pod="openstack-operators/openstack-operator-controller-manager-5c4446bf96-p5f6q" Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.759836 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-599898f689-82zrm"] Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.768173 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-84bc9db6cc-tnwbc"] Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.862274 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-555c7456bd-r52nf"] Oct 03 13:10:29 crc kubenswrapper[4962]: W1003 13:10:29.867933 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49eb417d_aaa0_4aab_b107_becad30c4185.slice/crio-52caf6705c4766fa047bd833ff4cc7eb7f5c9e12deafc4c6abb51351d8613965 WatchSource:0}: Error finding container 52caf6705c4766fa047bd833ff4cc7eb7f5c9e12deafc4c6abb51351d8613965: Status 404 returned error can't find the container with id 52caf6705c4766fa047bd833ff4cc7eb7f5c9e12deafc4c6abb51351d8613965 Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.868624 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6fd6854b49-9lpjm"] Oct 03 13:10:29 crc kubenswrapper[4962]: W1003 13:10:29.875514 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a84d9fa_a8e7_4280_b500_d4f68500e13e.slice/crio-62ab4c779af435c2da0da11c33a90b0ca085d4d646fda28c538ce2cf44964fb5 WatchSource:0}: Error finding container 62ab4c779af435c2da0da11c33a90b0ca085d4d646fda28c538ce2cf44964fb5: Status 404 returned error can't find the container with id 62ab4c779af435c2da0da11c33a90b0ca085d4d646fda28c538ce2cf44964fb5 Oct 03 13:10:29 crc kubenswrapper[4962]: I1003 13:10:29.875863 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-59d6cfdf45-rdflf"] Oct 03 13:10:30 crc kubenswrapper[4962]: I1003 13:10:30.005534 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5c4446bf96-p5f6q" Oct 03 13:10:30 crc kubenswrapper[4962]: I1003 13:10:30.052716 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-7d8bb7f44c-jq2cl"] Oct 03 13:10:30 crc kubenswrapper[4962]: I1003 13:10:30.057495 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-gm9lt"] Oct 03 13:10:30 crc kubenswrapper[4962]: I1003 13:10:30.062208 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-tgrxn"] Oct 03 13:10:30 crc kubenswrapper[4962]: I1003 13:10:30.066240 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-688db7b6c7-q95dg"] Oct 03 13:10:30 crc kubenswrapper[4962]: W1003 13:10:30.070657 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22736612_9f52_42a5_a773_4389ae1473d0.slice/crio-67f98e3117b52f5cfc15baa622a204913d4b2b4674977e721139984b4055b7b4 WatchSource:0}: Error finding container 67f98e3117b52f5cfc15baa622a204913d4b2b4674977e721139984b4055b7b4: Status 404 returned error can't find the container with id 67f98e3117b52f5cfc15baa622a204913d4b2b4674977e721139984b4055b7b4 Oct 03 13:10:30 crc kubenswrapper[4962]: W1003 13:10:30.075115 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda3763da_ed79_40cc_bf74_10729759437e.slice/crio-edd05314e4abba1fb2ff3adc417d0b1ab60f1bd43b9a3315c5993d1b56c7cec8 WatchSource:0}: Error finding container edd05314e4abba1fb2ff3adc417d0b1ab60f1bd43b9a3315c5993d1b56c7cec8: Status 404 returned error can't find the container with id edd05314e4abba1fb2ff3adc417d0b1ab60f1bd43b9a3315c5993d1b56c7cec8 Oct 03 13:10:30 crc kubenswrapper[4962]: W1003 13:10:30.077027 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ada00d0_e4a6_46a9_872b_e554866a03c6.slice/crio-a0bb1803ca0476337c60e3299d9a970a15265a61eb23e9b6c7afef8ac4c269b1 WatchSource:0}: Error finding container a0bb1803ca0476337c60e3299d9a970a15265a61eb23e9b6c7afef8ac4c269b1: Status 404 returned error can't find the container with id a0bb1803ca0476337c60e3299d9a970a15265a61eb23e9b6c7afef8ac4c269b1 Oct 03 13:10:30 crc kubenswrapper[4962]: I1003 13:10:30.079746 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6574bf987d-7f4r6"] Oct 03 13:10:30 crc kubenswrapper[4962]: E1003 13:10:30.082194 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:5c6ab93b78bd20eb7f1736751a59c1eb33fb06351339563dbefe49ccaaff6e94,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-89sws,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-688db7b6c7-q95dg_openstack-operators(7ada00d0-e4a6-46a9-872b-e554866a03c6): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 13:10:30 crc kubenswrapper[4962]: E1003 13:10:30.082654 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:110b885fe640ffdd8536e7da2a613677a6777e3d902e2ff15fa4d5968fe06c54,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q9vvk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-5c468bf4d4-gm9lt_openstack-operators(fbda4a74-6131-47cb-9098-f23870f67916): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 13:10:30 crc kubenswrapper[4962]: W1003 13:10:30.083952 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda062596b_8e37_4963_bf83_e37ed388bf83.slice/crio-18733d02b5d2ab2e3d35a6306f7f6a1f542b3f86b6a5db74892d60f5be7aa934 WatchSource:0}: Error finding container 18733d02b5d2ab2e3d35a6306f7f6a1f542b3f86b6a5db74892d60f5be7aa934: Status 404 returned error can't find the container with id 18733d02b5d2ab2e3d35a6306f7f6a1f542b3f86b6a5db74892d60f5be7aa934 Oct 03 13:10:30 crc kubenswrapper[4962]: E1003 13:10:30.086735 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:570e59f91d7dd66c9abcec1e54889a44c65d676d3fff6802be101fe5215bc988,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z5hpm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-6574bf987d-7f4r6_openstack-operators(a062596b-8e37-4963-bf83-e37ed388bf83): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 13:10:30 crc kubenswrapper[4962]: I1003 13:10:30.182165 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5db5cf686f-pgf8s"] Oct 03 13:10:30 crc kubenswrapper[4962]: I1003 13:10:30.193507 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5fbf469cd7-n2xlz"] Oct 03 13:10:30 crc kubenswrapper[4962]: I1003 13:10:30.197923 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-fcd7d9895-2fvpg"] Oct 03 13:10:30 crc kubenswrapper[4962]: W1003 13:10:30.202162 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74fa1f6e_ee12_4c7c_97f0_23586ec9983c.slice/crio-e5792662600feb065c2f21e97caf36845510304a248afb79928e11c4d7d652f1 WatchSource:0}: Error finding container e5792662600feb065c2f21e97caf36845510304a248afb79928e11c4d7d652f1: Status 404 returned error can't find the container with id e5792662600feb065c2f21e97caf36845510304a248afb79928e11c4d7d652f1 Oct 03 13:10:30 crc kubenswrapper[4962]: E1003 13:10:30.207774 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:8f5eee2eb7b77432ef1a88ed693ff981514359dfc808581f393bcef252de5cfa,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5dv2j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-5db5cf686f-pgf8s_openstack-operators(74fa1f6e-ee12-4c7c-97f0-23586ec9983c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 13:10:30 crc kubenswrapper[4962]: E1003 13:10:30.211949 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:018151bd5ff830ec03c6b8e3d53cfb9456ca6e1e34793bdd4f7edd39a0146fa6,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-d8hsc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-fcd7d9895-2fvpg_openstack-operators(0f094302-eaab-4160-b549-530131588472): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 13:10:30 crc kubenswrapper[4962]: I1003 13:10:30.308920 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-j758z"] Oct 03 13:10:30 crc kubenswrapper[4962]: E1003 13:10:30.308973 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qz5wf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-j758z_openstack-operators(908550fb-fd0f-4363-aaab-3434aae03751): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 13:10:30 crc kubenswrapper[4962]: E1003 13:10:30.309244 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" 
pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-7f4r6" podUID="a062596b-8e37-4963-bf83-e37ed388bf83" Oct 03 13:10:30 crc kubenswrapper[4962]: E1003 13:10:30.309351 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-gm9lt" podUID="fbda4a74-6131-47cb-9098-f23870f67916" Oct 03 13:10:30 crc kubenswrapper[4962]: E1003 13:10:30.310379 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-j758z" podUID="908550fb-fd0f-4363-aaab-3434aae03751" Oct 03 13:10:30 crc kubenswrapper[4962]: E1003 13:10:30.310597 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-q95dg" podUID="7ada00d0-e4a6-46a9-872b-e554866a03c6" Oct 03 13:10:30 crc kubenswrapper[4962]: I1003 13:10:30.322764 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678jlpx7"] Oct 03 13:10:30 crc kubenswrapper[4962]: I1003 13:10:30.328740 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-2k5gq"] Oct 03 13:10:30 crc kubenswrapper[4962]: E1003 13:10:30.339059 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ztbfd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5cd5cb47d7-2k5gq_openstack-operators(0708bcfe-3c0e-44d8-b849-fa4464ea3387): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 13:10:30 crc kubenswrapper[4962]: E1003 13:10:30.349727 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:f50229c8a33fd581bccbe5f34bbaf3936c1b454802e755c9b48b40b76a8239ee,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-
antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIO
V_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-podified,Va
lueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account
:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ht5dr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-6f64c4d678jlpx7_openstack-operators(c0193fb9-bb01-4730-a986-7f03c3b61887): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 13:10:30 crc kubenswrapper[4962]: E1003 13:10:30.460284 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-2fvpg" podUID="0f094302-eaab-4160-b549-530131588472" Oct 03 13:10:30 crc kubenswrapper[4962]: E1003 13:10:30.465834 4962 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-pgf8s" podUID="74fa1f6e-ee12-4c7c-97f0-23586ec9983c" Oct 03 13:10:30 crc kubenswrapper[4962]: I1003 13:10:30.512302 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5c4446bf96-p5f6q"] Oct 03 13:10:30 crc kubenswrapper[4962]: W1003 13:10:30.552479 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0786c124_4dcc_437d_8f1b_80021feb3553.slice/crio-910cc6b2b11685a3b0dc9438a1d93ebc7f6e76d60292c8d1b2a0568d05051b96 WatchSource:0}: Error finding container 910cc6b2b11685a3b0dc9438a1d93ebc7f6e76d60292c8d1b2a0568d05051b96: Status 404 returned error can't find the container with id 910cc6b2b11685a3b0dc9438a1d93ebc7f6e76d60292c8d1b2a0568d05051b96 Oct 03 13:10:30 crc kubenswrapper[4962]: I1003 13:10:30.609342 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-2fvpg" event={"ID":"0f094302-eaab-4160-b549-530131588472","Type":"ContainerStarted","Data":"7e3bd1bf6486057ecf171e11e4618332c3d8d6d8e528a6f58df4bb84e5640c32"} Oct 03 13:10:30 crc kubenswrapper[4962]: I1003 13:10:30.609395 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-2fvpg" event={"ID":"0f094302-eaab-4160-b549-530131588472","Type":"ContainerStarted","Data":"250084ba7791a5a902579b7d0b5c4e21ac829e3a1b4090ce1929c1013419eef8"} Oct 03 13:10:30 crc kubenswrapper[4962]: E1003 13:10:30.611113 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:018151bd5ff830ec03c6b8e3d53cfb9456ca6e1e34793bdd4f7edd39a0146fa6\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-2fvpg" podUID="0f094302-eaab-4160-b549-530131588472" Oct 03 13:10:30 crc kubenswrapper[4962]: I1003 13:10:30.615481 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-pf6wd" event={"ID":"b7991083-22d5-42c8-8df0-6e93acee716b","Type":"ContainerStarted","Data":"ff47d5719598e27322f7923c906a67f4fcf912f72ff50bb779ebe114b5b5545c"} Oct 03 13:10:30 crc kubenswrapper[4962]: E1003 13:10:30.616598 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-2k5gq" podUID="0708bcfe-3c0e-44d8-b849-fa4464ea3387" Oct 03 13:10:30 crc kubenswrapper[4962]: I1003 13:10:30.619971 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-2k5gq" event={"ID":"0708bcfe-3c0e-44d8-b849-fa4464ea3387","Type":"ContainerStarted","Data":"2bd71e7801f15a8aa22184efff33ba0b99f989456761d018c2bb11e30ee37c93"} Oct 03 13:10:30 crc kubenswrapper[4962]: I1003 13:10:30.622327 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5c4446bf96-p5f6q" 
event={"ID":"0786c124-4dcc-437d-8f1b-80021feb3553","Type":"ContainerStarted","Data":"910cc6b2b11685a3b0dc9438a1d93ebc7f6e76d60292c8d1b2a0568d05051b96"} Oct 03 13:10:30 crc kubenswrapper[4962]: I1003 13:10:30.625248 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678jlpx7" event={"ID":"c0193fb9-bb01-4730-a986-7f03c3b61887","Type":"ContainerStarted","Data":"b7c56377e91575f06eb37eedb92fc159d592f8adecddb4a1c3381588a0cac1e3"} Oct 03 13:10:30 crc kubenswrapper[4962]: I1003 13:10:30.631804 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-tnwbc" event={"ID":"51054362-5c8b-4866-bb58-c0cff476b726","Type":"ContainerStarted","Data":"710c496c4393e7f828f4314848c77b47adb75472a3fa5e020b52b66b340a6cec"} Oct 03 13:10:30 crc kubenswrapper[4962]: I1003 13:10:30.636239 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-9lpjm" event={"ID":"592a09ab-92d9-447f-9978-e9da27cc4df9","Type":"ContainerStarted","Data":"3c8f01f92ed36f80c0473016edda033ec7c472885f8f179ba91f134b4da523f5"} Oct 03 13:10:30 crc kubenswrapper[4962]: I1003 13:10:30.639564 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-hmld4" event={"ID":"e78401d0-e96e-41ff-b4f6-97d72553280b","Type":"ContainerStarted","Data":"1752f40c779a09237ef62ffd009d045e68d9f20da7f53d222840d29c5d31e11a"} Oct 03 13:10:30 crc kubenswrapper[4962]: I1003 13:10:30.641094 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-tgrxn" event={"ID":"da3763da-ed79-40cc-bf74-10729759437e","Type":"ContainerStarted","Data":"edd05314e4abba1fb2ff3adc417d0b1ab60f1bd43b9a3315c5993d1b56c7cec8"} Oct 03 13:10:30 crc kubenswrapper[4962]: I1003 13:10:30.643391 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-gm9lt" event={"ID":"fbda4a74-6131-47cb-9098-f23870f67916","Type":"ContainerStarted","Data":"67bd1ab8e25656ddbf2df9db377991a57a635e5c6e179c16cf1a2f2a0246bbfe"} Oct 03 13:10:30 crc kubenswrapper[4962]: I1003 13:10:30.643424 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-gm9lt" event={"ID":"fbda4a74-6131-47cb-9098-f23870f67916","Type":"ContainerStarted","Data":"e01e6c9d325941ae097c4e9779c4d3c4b2ca1b9deda65f0324514c410374a86a"} Oct 03 13:10:30 crc kubenswrapper[4962]: E1003 13:10:30.645227 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:110b885fe640ffdd8536e7da2a613677a6777e3d902e2ff15fa4d5968fe06c54\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-gm9lt" podUID="fbda4a74-6131-47cb-9098-f23870f67916" Oct 03 13:10:30 crc kubenswrapper[4962]: I1003 13:10:30.646362 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-jq2cl" event={"ID":"22736612-9f52-42a5-a773-4389ae1473d0","Type":"ContainerStarted","Data":"67f98e3117b52f5cfc15baa622a204913d4b2b4674977e721139984b4055b7b4"} Oct 03 13:10:30 crc kubenswrapper[4962]: E1003 13:10:30.659617 4962 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678jlpx7" podUID="c0193fb9-bb01-4730-a986-7f03c3b61887" Oct 03 13:10:30 crc kubenswrapper[4962]: I1003 13:10:30.682885 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-q95dg" event={"ID":"7ada00d0-e4a6-46a9-872b-e554866a03c6","Type":"ContainerStarted","Data":"717d185350312f85593995c9b60709a4053b66ef661b8b5a77944808b1e75f0c"} Oct 03 13:10:30 crc kubenswrapper[4962]: I1003 13:10:30.682932 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-q95dg" event={"ID":"7ada00d0-e4a6-46a9-872b-e554866a03c6","Type":"ContainerStarted","Data":"a0bb1803ca0476337c60e3299d9a970a15265a61eb23e9b6c7afef8ac4c269b1"} Oct 03 13:10:30 crc kubenswrapper[4962]: E1003 13:10:30.687796 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:5c6ab93b78bd20eb7f1736751a59c1eb33fb06351339563dbefe49ccaaff6e94\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-q95dg" podUID="7ada00d0-e4a6-46a9-872b-e554866a03c6" Oct 03 13:10:30 crc kubenswrapper[4962]: I1003 13:10:30.697363 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-rdflf" event={"ID":"9a84d9fa-a8e7-4280-b500-d4f68500e13e","Type":"ContainerStarted","Data":"62ab4c779af435c2da0da11c33a90b0ca085d4d646fda28c538ce2cf44964fb5"} Oct 03 13:10:30 crc kubenswrapper[4962]: I1003 13:10:30.730898 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-nhcsm" event={"ID":"564d1ea9-035c-4ac3-8692-907bf54a2d01","Type":"ContainerStarted","Data":"751c5f77e0ca5bea6c16fcbdfc10d6ce905f86f204a5d56cf07d29b557528891"} Oct 03 13:10:30 crc kubenswrapper[4962]: I1003 13:10:30.738077 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-6tjz7" event={"ID":"df253268-0372-4cff-a9a0-ff9d4d8eac7b","Type":"ContainerStarted","Data":"da3bc4cb3ab6bc543697a1e4f91302b106caad8a045c35b0b16bb14d3a6afda1"} Oct 03 13:10:30 crc kubenswrapper[4962]: I1003 13:10:30.742714 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-n2xlz" event={"ID":"e7bd048c-266e-45eb-8755-1bac673b02cc","Type":"ContainerStarted","Data":"b61bc207c45632489c042173f5a31b127c8fb1b6880ffe7311107272f23d9fed"} Oct 03 13:10:30 crc kubenswrapper[4962]: I1003 13:10:30.745725 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-r52nf" event={"ID":"49eb417d-aaa0-4aab-b107-becad30c4185","Type":"ContainerStarted","Data":"52caf6705c4766fa047bd833ff4cc7eb7f5c9e12deafc4c6abb51351d8613965"} Oct 03 13:10:30 crc kubenswrapper[4962]: I1003 13:10:30.748476 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-7f4r6" event={"ID":"a062596b-8e37-4963-bf83-e37ed388bf83","Type":"ContainerStarted","Data":"4ea7b3699e3c064d46d2d6d1d709b87bf335599855754b6f7476623cbb5782a3"} Oct 03 13:10:30 crc kubenswrapper[4962]: I1003 13:10:30.748543 4962 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-7f4r6" event={"ID":"a062596b-8e37-4963-bf83-e37ed388bf83","Type":"ContainerStarted","Data":"18733d02b5d2ab2e3d35a6306f7f6a1f542b3f86b6a5db74892d60f5be7aa934"} Oct 03 13:10:30 crc kubenswrapper[4962]: E1003 13:10:30.749844 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:570e59f91d7dd66c9abcec1e54889a44c65d676d3fff6802be101fe5215bc988\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-7f4r6" podUID="a062596b-8e37-4963-bf83-e37ed388bf83" Oct 03 13:10:30 crc kubenswrapper[4962]: I1003 13:10:30.762485 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-j758z" event={"ID":"908550fb-fd0f-4363-aaab-3434aae03751","Type":"ContainerStarted","Data":"a87828088fcfbd31c2ee0473f8ba89be9ea2f1a8e8780e27ca9b501e4b30da69"} Oct 03 13:10:30 crc kubenswrapper[4962]: E1003 13:10:30.779105 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-j758z" podUID="908550fb-fd0f-4363-aaab-3434aae03751" Oct 03 13:10:30 crc kubenswrapper[4962]: I1003 13:10:30.797401 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-599898f689-82zrm" event={"ID":"2cf61ca5-fa4e-4f11-bfed-0a81aa7140a6","Type":"ContainerStarted","Data":"98246a02ba7d75af3a1dab52a850bd6b04cfa086c744cc04ed21eec69e5b3771"} Oct 03 13:10:30 crc kubenswrapper[4962]: I1003 13:10:30.809426 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-pgf8s" event={"ID":"74fa1f6e-ee12-4c7c-97f0-23586ec9983c","Type":"ContainerStarted","Data":"81a4a576e05d5e4ead43fe860fca38f494f28575ea433f322b3d2c4e6b4566a0"} Oct 03 13:10:30 crc kubenswrapper[4962]: I1003 13:10:30.809479 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-pgf8s" event={"ID":"74fa1f6e-ee12-4c7c-97f0-23586ec9983c","Type":"ContainerStarted","Data":"e5792662600feb065c2f21e97caf36845510304a248afb79928e11c4d7d652f1"} Oct 03 13:10:30 crc kubenswrapper[4962]: E1003 13:10:30.814297 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:8f5eee2eb7b77432ef1a88ed693ff981514359dfc808581f393bcef252de5cfa\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-pgf8s" podUID="74fa1f6e-ee12-4c7c-97f0-23586ec9983c" Oct 03 13:10:31 crc kubenswrapper[4962]: I1003 13:10:31.827325 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-2k5gq" event={"ID":"0708bcfe-3c0e-44d8-b849-fa4464ea3387","Type":"ContainerStarted","Data":"406da126ae7f6b1bb978fd08b3a2ef31000fc5a65bfb022c3fac163bc0cd3d73"} Oct 03 13:10:31 crc kubenswrapper[4962]: E1003 13:10:31.829146 4962 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb\\\"\"" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-2k5gq" podUID="0708bcfe-3c0e-44d8-b849-fa4464ea3387" Oct 03 13:10:31 crc kubenswrapper[4962]: I1003 13:10:31.839874 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5c4446bf96-p5f6q" event={"ID":"0786c124-4dcc-437d-8f1b-80021feb3553","Type":"ContainerStarted","Data":"86ad2137f9b1a5cdfe2e290e2639a87c765c84657b0ef706f9968bdd7bdbac8c"} Oct 03 13:10:31 crc kubenswrapper[4962]: I1003 13:10:31.839990 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5c4446bf96-p5f6q" event={"ID":"0786c124-4dcc-437d-8f1b-80021feb3553","Type":"ContainerStarted","Data":"c5b759c2f0ac333fedc05f14d5a417ab0f6ebaa4a0b5a19db5ce378b2531d82e"} Oct 03 13:10:31 crc kubenswrapper[4962]: I1003 13:10:31.840008 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5c4446bf96-p5f6q" Oct 03 13:10:31 crc kubenswrapper[4962]: I1003 13:10:31.846260 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678jlpx7" event={"ID":"c0193fb9-bb01-4730-a986-7f03c3b61887","Type":"ContainerStarted","Data":"fb0b2124bdd023192092b069b1594af19c1ca8bca2c297eb0ddba29c55f16230"} Oct 03 13:10:31 crc kubenswrapper[4962]: E1003 13:10:31.871822 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:110b885fe640ffdd8536e7da2a613677a6777e3d902e2ff15fa4d5968fe06c54\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-gm9lt" podUID="fbda4a74-6131-47cb-9098-f23870f67916" Oct 03 13:10:31 crc kubenswrapper[4962]: E1003 13:10:31.874137 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:5c6ab93b78bd20eb7f1736751a59c1eb33fb06351339563dbefe49ccaaff6e94\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-q95dg" podUID="7ada00d0-e4a6-46a9-872b-e554866a03c6" Oct 03 13:10:31 crc kubenswrapper[4962]: E1003 13:10:31.874529 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-j758z" podUID="908550fb-fd0f-4363-aaab-3434aae03751" Oct 03 13:10:31 crc kubenswrapper[4962]: E1003 13:10:31.874592 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:570e59f91d7dd66c9abcec1e54889a44c65d676d3fff6802be101fe5215bc988\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-7f4r6" podUID="a062596b-8e37-4963-bf83-e37ed388bf83" Oct 03 
13:10:31 crc kubenswrapper[4962]: E1003 13:10:31.875042 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:018151bd5ff830ec03c6b8e3d53cfb9456ca6e1e34793bdd4f7edd39a0146fa6\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-2fvpg" podUID="0f094302-eaab-4160-b549-530131588472" Oct 03 13:10:31 crc kubenswrapper[4962]: E1003 13:10:31.875359 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:8f5eee2eb7b77432ef1a88ed693ff981514359dfc808581f393bcef252de5cfa\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-pgf8s" podUID="74fa1f6e-ee12-4c7c-97f0-23586ec9983c" Oct 03 13:10:31 crc kubenswrapper[4962]: E1003 13:10:31.878561 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:f50229c8a33fd581bccbe5f34bbaf3936c1b454802e755c9b48b40b76a8239ee\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678jlpx7" podUID="c0193fb9-bb01-4730-a986-7f03c3b61887" Oct 03 13:10:31 crc kubenswrapper[4962]: I1003 13:10:31.945264 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5c4446bf96-p5f6q" podStartSLOduration=3.945197275 podStartE2EDuration="3.945197275s" podCreationTimestamp="2025-10-03 13:10:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:10:31.944929888 +0000 UTC m=+1240.348827793" watchObservedRunningTime="2025-10-03 13:10:31.945197275 +0000 UTC m=+1240.349095110" Oct 03 13:10:32 crc kubenswrapper[4962]: E1003 13:10:32.852824 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:f50229c8a33fd581bccbe5f34bbaf3936c1b454802e755c9b48b40b76a8239ee\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678jlpx7" podUID="c0193fb9-bb01-4730-a986-7f03c3b61887" Oct 03 13:10:32 crc kubenswrapper[4962]: E1003 13:10:32.852953 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb\\\"\"" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-2k5gq" podUID="0708bcfe-3c0e-44d8-b849-fa4464ea3387" Oct 03 13:10:39 crc kubenswrapper[4962]: I1003 13:10:39.916381 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-pf6wd" event={"ID":"b7991083-22d5-42c8-8df0-6e93acee716b","Type":"ContainerStarted","Data":"06e1527c91c1b788f67a3076620e4226063ac1e452aa9aa9098bf1264fd6c726"} Oct 03 13:10:39 crc kubenswrapper[4962]: I1003 13:10:39.924789 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-6tjz7" event={"ID":"df253268-0372-4cff-a9a0-ff9d4d8eac7b","Type":"ContainerStarted","Data":"cfb35b1cbfb412b7027c99f8847ff9599e2aaf69b24a5d2165625e5df3425cc0"} Oct 03 13:10:39 crc kubenswrapper[4962]: I1003 13:10:39.939810 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-r52nf" event={"ID":"49eb417d-aaa0-4aab-b107-becad30c4185","Type":"ContainerStarted","Data":"cac7e033039f005b8dee27f127fe6e1709d2d368f5f96b2a13df9bc729d2bbad"} Oct 03 13:10:39 crc kubenswrapper[4962]: I1003 13:10:39.953610 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-9lpjm" event={"ID":"592a09ab-92d9-447f-9978-e9da27cc4df9","Type":"ContainerStarted","Data":"9164baaa37ab28cf24112f1d351c23322b390fd6d4156b8e4f5a878d25ec7b95"} Oct 03 13:10:39 crc kubenswrapper[4962]: I1003 13:10:39.953671 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-9lpjm" event={"ID":"592a09ab-92d9-447f-9978-e9da27cc4df9","Type":"ContainerStarted","Data":"455a9bcfcc8b52529ef76ad8d7963d4f0c94a8d951c113ccae63827c5556a991"} Oct 03 13:10:39 crc kubenswrapper[4962]: I1003 13:10:39.954593 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-9lpjm" Oct 03 13:10:39 crc kubenswrapper[4962]: I1003 13:10:39.974907 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-nhcsm" event={"ID":"564d1ea9-035c-4ac3-8692-907bf54a2d01","Type":"ContainerStarted","Data":"735553418fb1a091a3cc6a07db97e79da29a435be7dbc7360bf52da146902c4f"} Oct 03 13:10:40 crc kubenswrapper[4962]: I1003 13:10:40.002962 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-rdflf" event={"ID":"9a84d9fa-a8e7-4280-b500-d4f68500e13e","Type":"ContainerStarted","Data":"077ec61833cf9e38cdabe4506dd609bb17d8801e286f37fcfc1240bc5556fa22"} Oct 03 13:10:40 crc kubenswrapper[4962]: I1003 13:10:40.003752 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-rdflf" Oct 03 13:10:40 crc kubenswrapper[4962]: I1003 13:10:40.016410 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5c4446bf96-p5f6q" Oct 03 13:10:40 crc kubenswrapper[4962]: I1003 13:10:40.030142 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-599898f689-82zrm" event={"ID":"2cf61ca5-fa4e-4f11-bfed-0a81aa7140a6","Type":"ContainerStarted","Data":"0f78d2ca6d83991d0fd27f68126170a217ff902397c01fd5d8352267e572e86e"} Oct 03 13:10:40 crc kubenswrapper[4962]: I1003 13:10:40.042138 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-hmld4" event={"ID":"e78401d0-e96e-41ff-b4f6-97d72553280b","Type":"ContainerStarted","Data":"496758d136c01fa9b7d7409cd6c2a07d4a7c61bb05837bc8546fbd7a1cc922c6"} Oct 03 13:10:40 crc kubenswrapper[4962]: I1003 13:10:40.064040 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-n2xlz" 
event={"ID":"e7bd048c-266e-45eb-8755-1bac673b02cc","Type":"ContainerStarted","Data":"1a4ed886ecafdeb1c838aa266353d50c55da4d56831afc3703586bfeb44d35e1"} Oct 03 13:10:40 crc kubenswrapper[4962]: I1003 13:10:40.064089 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-n2xlz" event={"ID":"e7bd048c-266e-45eb-8755-1bac673b02cc","Type":"ContainerStarted","Data":"d7f839c71ee4c2e40aab1b02193a8b26f0f9ef3d0ab64a5c37160f9fc87335f1"} Oct 03 13:10:40 crc kubenswrapper[4962]: I1003 13:10:40.064693 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-n2xlz" Oct 03 13:10:40 crc kubenswrapper[4962]: I1003 13:10:40.088376 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-9lpjm" podStartSLOduration=3.07317123 podStartE2EDuration="12.08836019s" podCreationTimestamp="2025-10-03 13:10:28 +0000 UTC" firstStartedPulling="2025-10-03 13:10:29.878474857 +0000 UTC m=+1238.282372692" lastFinishedPulling="2025-10-03 13:10:38.893663817 +0000 UTC m=+1247.297561652" observedRunningTime="2025-10-03 13:10:39.980938138 +0000 UTC m=+1248.384835973" watchObservedRunningTime="2025-10-03 13:10:40.08836019 +0000 UTC m=+1248.492258025" Oct 03 13:10:40 crc kubenswrapper[4962]: I1003 13:10:40.088957 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-7gwjc" event={"ID":"9e4441f1-c997-4e2e-b2b9-ff6e05718dfd","Type":"ContainerStarted","Data":"7fecf93cf8f8739360e63bf794ea336e670d290774c709e793ec135642eeeae4"} Oct 03 13:10:40 crc kubenswrapper[4962]: I1003 13:10:40.089005 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-7gwjc" event={"ID":"9e4441f1-c997-4e2e-b2b9-ff6e05718dfd","Type":"ContainerStarted","Data":"d5494bd077fe7e91f6dc1a2a74441d7083a4af5f14953bdb52c34d64cfccc1ea"} Oct 03 13:10:40 crc kubenswrapper[4962]: I1003 13:10:40.089658 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-7gwjc" Oct 03 13:10:40 crc kubenswrapper[4962]: I1003 13:10:40.119517 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-np2sg" event={"ID":"141599a7-33ff-4db6-b265-1f0e3407fdf5","Type":"ContainerStarted","Data":"c0b4994958fa311ed0332a92e72c938bd023c67c67da87568acec6cd80ba4efc"} Oct 03 13:10:40 crc kubenswrapper[4962]: I1003 13:10:40.145949 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-tnwbc" event={"ID":"51054362-5c8b-4866-bb58-c0cff476b726","Type":"ContainerStarted","Data":"f1bc445ae82aa4c3cf274e801104199bad9abbbfd865a799eb86750445e65c6c"} Oct 03 13:10:40 crc kubenswrapper[4962]: I1003 13:10:40.149054 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-rdflf" podStartSLOduration=3.118293895 podStartE2EDuration="12.149041953s" podCreationTimestamp="2025-10-03 13:10:28 +0000 UTC" firstStartedPulling="2025-10-03 13:10:29.881111228 +0000 UTC m=+1238.285009063" lastFinishedPulling="2025-10-03 13:10:38.911859286 +0000 UTC m=+1247.315757121" observedRunningTime="2025-10-03 13:10:40.090014454 +0000 UTC m=+1248.493912289" 
watchObservedRunningTime="2025-10-03 13:10:40.149041953 +0000 UTC m=+1248.552939788" Oct 03 13:10:40 crc kubenswrapper[4962]: I1003 13:10:40.150426 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-7gwjc" podStartSLOduration=2.646026171 podStartE2EDuration="12.15041841s" podCreationTimestamp="2025-10-03 13:10:28 +0000 UTC" firstStartedPulling="2025-10-03 13:10:29.358232292 +0000 UTC m=+1237.762130117" lastFinishedPulling="2025-10-03 13:10:38.862624521 +0000 UTC m=+1247.266522356" observedRunningTime="2025-10-03 13:10:40.14633165 +0000 UTC m=+1248.550229485" watchObservedRunningTime="2025-10-03 13:10:40.15041841 +0000 UTC m=+1248.554316245" Oct 03 13:10:40 crc kubenswrapper[4962]: I1003 13:10:40.164954 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-tgrxn" event={"ID":"da3763da-ed79-40cc-bf74-10729759437e","Type":"ContainerStarted","Data":"b933e05173a735a53848698d207269b49b1566b81fc168e2767e33c4c620aa4e"} Oct 03 13:10:40 crc kubenswrapper[4962]: I1003 13:10:40.187021 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-jq2cl" event={"ID":"22736612-9f52-42a5-a773-4389ae1473d0","Type":"ContainerStarted","Data":"2b40870c634ea0c5d80ce72b3481b05c7664bdda19d1053e8f0d93cd0e41878f"} Oct 03 13:10:40 crc kubenswrapper[4962]: I1003 13:10:40.221420 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-n2xlz" podStartSLOduration=3.534921421 podStartE2EDuration="12.221398381s" podCreationTimestamp="2025-10-03 13:10:28 +0000 UTC" firstStartedPulling="2025-10-03 13:10:30.202898351 +0000 UTC m=+1238.606796186" lastFinishedPulling="2025-10-03 13:10:38.889375311 +0000 UTC m=+1247.293273146" observedRunningTime="2025-10-03 13:10:40.217956729 +0000 UTC m=+1248.621854584" watchObservedRunningTime="2025-10-03 13:10:40.221398381 +0000 UTC m=+1248.625296216" Oct 03 13:10:41 crc kubenswrapper[4962]: I1003 13:10:41.194164 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-rdflf" event={"ID":"9a84d9fa-a8e7-4280-b500-d4f68500e13e","Type":"ContainerStarted","Data":"bc44bc8ae41e1529c0cd6c3a33f6fdfefa8a1452d17a3837f898e4aed701e3eb"} Oct 03 13:10:41 crc kubenswrapper[4962]: I1003 13:10:41.196728 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-nhcsm" event={"ID":"564d1ea9-035c-4ac3-8692-907bf54a2d01","Type":"ContainerStarted","Data":"5a6ccef5386716adc1e41ec825b79afdb31873e53bce765afcbddfb8748b25b3"} Oct 03 13:10:41 crc kubenswrapper[4962]: I1003 13:10:41.197470 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-nhcsm" Oct 03 13:10:41 crc kubenswrapper[4962]: I1003 13:10:41.199365 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-pf6wd" event={"ID":"b7991083-22d5-42c8-8df0-6e93acee716b","Type":"ContainerStarted","Data":"a4b01e2608b32452aeafe938394368bf33dd5cbefd5088e3a8e16ea6dd5ee49e"} Oct 03 13:10:41 crc kubenswrapper[4962]: I1003 13:10:41.199884 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/glance-operator-controller-manager-846dff85b5-pf6wd" Oct 03 13:10:41 crc kubenswrapper[4962]: I1003 13:10:41.201660 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-599898f689-82zrm" event={"ID":"2cf61ca5-fa4e-4f11-bfed-0a81aa7140a6","Type":"ContainerStarted","Data":"6991300c464ff700d94c19658e6ff25a670e4deea2da85d9f67934315366fc31"} Oct 03 13:10:41 crc kubenswrapper[4962]: I1003 13:10:41.202015 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-599898f689-82zrm" Oct 03 13:10:41 crc kubenswrapper[4962]: I1003 13:10:41.203740 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-hmld4" event={"ID":"e78401d0-e96e-41ff-b4f6-97d72553280b","Type":"ContainerStarted","Data":"aabe15aec16fe2b85d9b6bdaf849552a21d7fe0ab2a990579d89a7996ec0ce16"} Oct 03 13:10:41 crc kubenswrapper[4962]: I1003 13:10:41.204172 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-hmld4" Oct 03 13:10:41 crc kubenswrapper[4962]: I1003 13:10:41.205823 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-r52nf" event={"ID":"49eb417d-aaa0-4aab-b107-becad30c4185","Type":"ContainerStarted","Data":"dcf0c73c83bdcf48a3954664fafedfd5044bee0885b9d2796e3e48527e0eb0fd"} Oct 03 13:10:41 crc kubenswrapper[4962]: I1003 13:10:41.206207 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-r52nf" Oct 03 13:10:41 crc kubenswrapper[4962]: I1003 13:10:41.208267 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-np2sg" event={"ID":"141599a7-33ff-4db6-b265-1f0e3407fdf5","Type":"ContainerStarted","Data":"9dc84a0ee003fc5e49fb2a4f89302fae84dd17fd4c0085dc759384c1e9327365"} Oct 03 13:10:41 crc kubenswrapper[4962]: I1003 13:10:41.208663 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-np2sg" Oct 03 13:10:41 crc kubenswrapper[4962]: I1003 13:10:41.212021 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-6tjz7" event={"ID":"df253268-0372-4cff-a9a0-ff9d4d8eac7b","Type":"ContainerStarted","Data":"eb8933ad33da5215c3ffebb480be7f7f63315b5cf54fcdfdfcd0545ccff5a881"} Oct 03 13:10:41 crc kubenswrapper[4962]: I1003 13:10:41.212718 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-6tjz7" Oct 03 13:10:41 crc kubenswrapper[4962]: I1003 13:10:41.213481 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-nhcsm" podStartSLOduration=4.041655443 podStartE2EDuration="13.213462979s" podCreationTimestamp="2025-10-03 13:10:28 +0000 UTC" firstStartedPulling="2025-10-03 13:10:29.73967125 +0000 UTC m=+1238.143569085" lastFinishedPulling="2025-10-03 13:10:38.911478786 +0000 UTC m=+1247.315376621" observedRunningTime="2025-10-03 13:10:41.211718452 +0000 UTC m=+1249.615616297" watchObservedRunningTime="2025-10-03 13:10:41.213462979 +0000 UTC m=+1249.617360814" Oct 03 13:10:41 crc 
kubenswrapper[4962]: I1003 13:10:41.214487 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-tnwbc" event={"ID":"51054362-5c8b-4866-bb58-c0cff476b726","Type":"ContainerStarted","Data":"a880b538677edd93a7aa8954dbcbdb95b7626593967f1b32d6e81aae7ef5fb4b"} Oct 03 13:10:41 crc kubenswrapper[4962]: I1003 13:10:41.214886 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-tnwbc" Oct 03 13:10:41 crc kubenswrapper[4962]: I1003 13:10:41.217343 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-tgrxn" event={"ID":"da3763da-ed79-40cc-bf74-10729759437e","Type":"ContainerStarted","Data":"aae81375665a4a6e5490042621085fdaba6f97c2ed367a545e7209f029de77b9"} Oct 03 13:10:41 crc kubenswrapper[4962]: I1003 13:10:41.217498 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-tgrxn" Oct 03 13:10:41 crc kubenswrapper[4962]: I1003 13:10:41.220013 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-jq2cl" event={"ID":"22736612-9f52-42a5-a773-4389ae1473d0","Type":"ContainerStarted","Data":"ca65483062cd8b911386276d1838e12afe3d9b06d658d2e10a382369bebbc75f"} Oct 03 13:10:41 crc kubenswrapper[4962]: I1003 13:10:41.220048 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-jq2cl" Oct 03 13:10:41 crc kubenswrapper[4962]: I1003 13:10:41.229328 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-599898f689-82zrm" podStartSLOduration=4.062026132 podStartE2EDuration="13.229312346s" podCreationTimestamp="2025-10-03 13:10:28 +0000 UTC" firstStartedPulling="2025-10-03 13:10:29.747606844 +0000 UTC m=+1238.151504679" lastFinishedPulling="2025-10-03 13:10:38.914893068 +0000 UTC m=+1247.318790893" observedRunningTime="2025-10-03 13:10:41.228459643 +0000 UTC m=+1249.632357468" watchObservedRunningTime="2025-10-03 13:10:41.229312346 +0000 UTC m=+1249.633210181" Oct 03 13:10:41 crc kubenswrapper[4962]: I1003 13:10:41.244052 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-r52nf" podStartSLOduration=4.212914234 podStartE2EDuration="13.244033102s" podCreationTimestamp="2025-10-03 13:10:28 +0000 UTC" firstStartedPulling="2025-10-03 13:10:29.869955448 +0000 UTC m=+1238.273853283" lastFinishedPulling="2025-10-03 13:10:38.901074316 +0000 UTC m=+1247.304972151" observedRunningTime="2025-10-03 13:10:41.243054016 +0000 UTC m=+1249.646951861" watchObservedRunningTime="2025-10-03 13:10:41.244033102 +0000 UTC m=+1249.647930937" Oct 03 13:10:41 crc kubenswrapper[4962]: I1003 13:10:41.262594 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-pf6wd" podStartSLOduration=3.9509649319999998 podStartE2EDuration="13.262574121s" podCreationTimestamp="2025-10-03 13:10:28 +0000 UTC" firstStartedPulling="2025-10-03 13:10:29.629112284 +0000 UTC m=+1238.033010129" lastFinishedPulling="2025-10-03 13:10:38.940721483 +0000 UTC m=+1247.344619318" observedRunningTime="2025-10-03 13:10:41.258554173 +0000 UTC 
m=+1249.662452018" watchObservedRunningTime="2025-10-03 13:10:41.262574121 +0000 UTC m=+1249.666471956" Oct 03 13:10:41 crc kubenswrapper[4962]: I1003 13:10:41.287663 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-np2sg" podStartSLOduration=3.756825776 podStartE2EDuration="13.287623296s" podCreationTimestamp="2025-10-03 13:10:28 +0000 UTC" firstStartedPulling="2025-10-03 13:10:29.35853675 +0000 UTC m=+1237.762434585" lastFinishedPulling="2025-10-03 13:10:38.88933426 +0000 UTC m=+1247.293232105" observedRunningTime="2025-10-03 13:10:41.283785272 +0000 UTC m=+1249.687683117" watchObservedRunningTime="2025-10-03 13:10:41.287623296 +0000 UTC m=+1249.691521121" Oct 03 13:10:41 crc kubenswrapper[4962]: I1003 13:10:41.311227 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-hmld4" podStartSLOduration=4.025843148 podStartE2EDuration="13.311205521s" podCreationTimestamp="2025-10-03 13:10:28 +0000 UTC" firstStartedPulling="2025-10-03 13:10:29.628784925 +0000 UTC m=+1238.032682750" lastFinishedPulling="2025-10-03 13:10:38.914147288 +0000 UTC m=+1247.318045123" observedRunningTime="2025-10-03 13:10:41.311113178 +0000 UTC m=+1249.715011033" watchObservedRunningTime="2025-10-03 13:10:41.311205521 +0000 UTC m=+1249.715103366" Oct 03 13:10:41 crc kubenswrapper[4962]: I1003 13:10:41.335404 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-6tjz7" podStartSLOduration=4.167856871 podStartE2EDuration="13.335386832s" podCreationTimestamp="2025-10-03 13:10:28 +0000 UTC" firstStartedPulling="2025-10-03 13:10:29.721800689 +0000 UTC m=+1238.125698524" lastFinishedPulling="2025-10-03 13:10:38.88933064 +0000 UTC m=+1247.293228485" observedRunningTime="2025-10-03 13:10:41.333393578 +0000 UTC m=+1249.737291423" watchObservedRunningTime="2025-10-03 13:10:41.335386832 +0000 UTC m=+1249.739284667" Oct 03 13:10:41 crc kubenswrapper[4962]: I1003 13:10:41.369366 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-tgrxn" podStartSLOduration=4.530400241 podStartE2EDuration="13.369340666s" podCreationTimestamp="2025-10-03 13:10:28 +0000 UTC" firstStartedPulling="2025-10-03 13:10:30.077473364 +0000 UTC m=+1238.481371189" lastFinishedPulling="2025-10-03 13:10:38.916413789 +0000 UTC m=+1247.320311614" observedRunningTime="2025-10-03 13:10:41.348818083 +0000 UTC m=+1249.752715938" watchObservedRunningTime="2025-10-03 13:10:41.369340666 +0000 UTC m=+1249.773238501" Oct 03 13:10:41 crc kubenswrapper[4962]: I1003 13:10:41.371282 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-jq2cl" podStartSLOduration=4.531673305 podStartE2EDuration="13.371270258s" podCreationTimestamp="2025-10-03 13:10:28 +0000 UTC" firstStartedPulling="2025-10-03 13:10:30.073159338 +0000 UTC m=+1238.477057163" lastFinishedPulling="2025-10-03 13:10:38.912756291 +0000 UTC m=+1247.316654116" observedRunningTime="2025-10-03 13:10:41.365743909 +0000 UTC m=+1249.769641764" watchObservedRunningTime="2025-10-03 13:10:41.371270258 +0000 UTC m=+1249.775168093" Oct 03 13:10:41 crc kubenswrapper[4962]: I1003 13:10:41.384559 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-tnwbc" podStartSLOduration=4.196809911 podStartE2EDuration="13.384543375s" podCreationTimestamp="2025-10-03 13:10:28 +0000 UTC" firstStartedPulling="2025-10-03 13:10:29.752310241 +0000 UTC m=+1238.156208076" lastFinishedPulling="2025-10-03 13:10:38.940043705 +0000 UTC m=+1247.343941540" observedRunningTime="2025-10-03 13:10:41.382074579 +0000 UTC m=+1249.785972424" watchObservedRunningTime="2025-10-03 13:10:41.384543375 +0000 UTC m=+1249.788441210" Oct 03 13:10:46 crc kubenswrapper[4962]: I1003 13:10:46.272657 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-j758z" event={"ID":"908550fb-fd0f-4363-aaab-3434aae03751","Type":"ContainerStarted","Data":"3bab36701d5fba296de6177010b3a03cad3dcf2e246cfb9baf6a69a706912a9a"} Oct 03 13:10:46 crc kubenswrapper[4962]: I1003 13:10:46.276498 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-gm9lt" event={"ID":"fbda4a74-6131-47cb-9098-f23870f67916","Type":"ContainerStarted","Data":"3607a4675751832d91ae011f0ac417c538b40bf5c95dd18d2481369681250493"} Oct 03 13:10:46 crc kubenswrapper[4962]: I1003 13:10:46.276674 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-gm9lt" Oct 03 13:10:46 crc kubenswrapper[4962]: I1003 13:10:46.290736 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-j758z" podStartSLOduration=3.381614913 podStartE2EDuration="18.290718325s" podCreationTimestamp="2025-10-03 13:10:28 +0000 UTC" firstStartedPulling="2025-10-03 13:10:30.308845163 +0000 UTC m=+1238.712742998" lastFinishedPulling="2025-10-03 13:10:45.217948575 +0000 UTC m=+1253.621846410" observedRunningTime="2025-10-03 13:10:46.286305706 +0000 UTC m=+1254.690203541" watchObservedRunningTime="2025-10-03 13:10:46.290718325 +0000 UTC m=+1254.694616160" Oct 03 13:10:46 crc kubenswrapper[4962]: I1003 13:10:46.310268 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-gm9lt" podStartSLOduration=3.180357195 podStartE2EDuration="18.31025179s" podCreationTimestamp="2025-10-03 13:10:28 +0000 UTC" firstStartedPulling="2025-10-03 13:10:30.082532651 +0000 UTC m=+1238.486430486" lastFinishedPulling="2025-10-03 13:10:45.212427246 +0000 UTC m=+1253.616325081" observedRunningTime="2025-10-03 13:10:46.30466176 +0000 UTC m=+1254.708559605" watchObservedRunningTime="2025-10-03 13:10:46.31025179 +0000 UTC m=+1254.714149625" Oct 03 13:10:47 crc kubenswrapper[4962]: I1003 13:10:47.286139 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678jlpx7" event={"ID":"c0193fb9-bb01-4730-a986-7f03c3b61887","Type":"ContainerStarted","Data":"e4377be9c6a68b54f5974a551fa498e1055a89ca2fb5be095ee24e4d4c3686ed"} Oct 03 13:10:47 crc kubenswrapper[4962]: I1003 13:10:47.286438 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678jlpx7" Oct 03 13:10:47 crc kubenswrapper[4962]: I1003 13:10:47.311166 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678jlpx7" podStartSLOduration=3.470306141 podStartE2EDuration="19.311147086s" podCreationTimestamp="2025-10-03 13:10:28 +0000 UTC" firstStartedPulling="2025-10-03 13:10:30.349158378 +0000 UTC m=+1238.753056213" lastFinishedPulling="2025-10-03 13:10:46.189999323 +0000 UTC m=+1254.593897158" observedRunningTime="2025-10-03 13:10:47.308197487 +0000 UTC m=+1255.712095322" watchObservedRunningTime="2025-10-03 13:10:47.311147086 +0000 UTC m=+1255.715044941" Oct 03 13:10:48 crc kubenswrapper[4962]: I1003 13:10:48.450391 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-hmld4" Oct 03 13:10:48 crc kubenswrapper[4962]: I1003 13:10:48.476851 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-np2sg" Oct 03 13:10:48 crc kubenswrapper[4962]: I1003 13:10:48.519161 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-7gwjc" Oct 03 13:10:48 crc kubenswrapper[4962]: I1003 13:10:48.541910 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-pf6wd" Oct 03 13:10:48 crc kubenswrapper[4962]: I1003 13:10:48.600184 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-599898f689-82zrm" Oct 03 13:10:48 crc kubenswrapper[4962]: I1003 13:10:48.629439 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-6tjz7" Oct 03 13:10:48 crc kubenswrapper[4962]: I1003 13:10:48.768667 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-9lpjm" Oct 03 13:10:48 crc kubenswrapper[4962]: I1003 13:10:48.771264 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-nhcsm" Oct 03 13:10:48 crc kubenswrapper[4962]: I1003 13:10:48.885063 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-r52nf" Oct 03 13:10:48 crc kubenswrapper[4962]: I1003 13:10:48.906000 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-rdflf" Oct 03 13:10:49 crc kubenswrapper[4962]: I1003 13:10:49.001262 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-tnwbc" Oct 03 13:10:49 crc kubenswrapper[4962]: I1003 13:10:49.123295 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-jq2cl" Oct 03 13:10:49 crc kubenswrapper[4962]: I1003 13:10:49.157068 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-tgrxn" Oct 03 13:10:49 crc kubenswrapper[4962]: I1003 13:10:49.301276 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-pgf8s" 
event={"ID":"74fa1f6e-ee12-4c7c-97f0-23586ec9983c","Type":"ContainerStarted","Data":"4ae96cd5313baff6f14830b1ebb7f603e5255d7f7678ae3d4b7c556563ef5776"} Oct 03 13:10:49 crc kubenswrapper[4962]: I1003 13:10:49.301822 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-pgf8s" Oct 03 13:10:49 crc kubenswrapper[4962]: I1003 13:10:49.303505 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-q95dg" event={"ID":"7ada00d0-e4a6-46a9-872b-e554866a03c6","Type":"ContainerStarted","Data":"f0472d4606b254d187417a50889215a6d015e527b0e7b5a2efd91e7d95bae054"} Oct 03 13:10:49 crc kubenswrapper[4962]: I1003 13:10:49.303670 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-q95dg" Oct 03 13:10:49 crc kubenswrapper[4962]: I1003 13:10:49.305287 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-2fvpg" event={"ID":"0f094302-eaab-4160-b549-530131588472","Type":"ContainerStarted","Data":"4a00de32e65ef48f266558947d3cd29193dbd754a9d2246e7986582cadda1ad0"} Oct 03 13:10:49 crc kubenswrapper[4962]: I1003 13:10:49.305459 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-2fvpg" Oct 03 13:10:49 crc kubenswrapper[4962]: I1003 13:10:49.307180 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-2k5gq" event={"ID":"0708bcfe-3c0e-44d8-b849-fa4464ea3387","Type":"ContainerStarted","Data":"df70d79ff66075c6c9947896a40ab5a44f0ff88564081cdc718573d0010d1105"} Oct 03 13:10:49 crc kubenswrapper[4962]: I1003 13:10:49.307348 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-2k5gq" Oct 03 13:10:49 crc kubenswrapper[4962]: I1003 13:10:49.309008 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-7f4r6" event={"ID":"a062596b-8e37-4963-bf83-e37ed388bf83","Type":"ContainerStarted","Data":"a2852cb428eb0236413c08954b5f4d7b5859b7e0d6f18f92a8f83fd3c2b1d65b"} Oct 03 13:10:49 crc kubenswrapper[4962]: I1003 13:10:49.309143 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-7f4r6" Oct 03 13:10:49 crc kubenswrapper[4962]: I1003 13:10:49.319824 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-pgf8s" podStartSLOduration=3.04529294 podStartE2EDuration="21.319805792s" podCreationTimestamp="2025-10-03 13:10:28 +0000 UTC" firstStartedPulling="2025-10-03 13:10:30.207488875 +0000 UTC m=+1238.611386710" lastFinishedPulling="2025-10-03 13:10:48.482001737 +0000 UTC m=+1256.885899562" observedRunningTime="2025-10-03 13:10:49.317662255 +0000 UTC m=+1257.721560110" watchObservedRunningTime="2025-10-03 13:10:49.319805792 +0000 UTC m=+1257.723703627" Oct 03 13:10:49 crc kubenswrapper[4962]: I1003 13:10:49.326959 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-n2xlz" Oct 03 13:10:49 crc kubenswrapper[4962]: I1003 13:10:49.333775 4962 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-q95dg" podStartSLOduration=2.933532481 podStartE2EDuration="21.333759178s" podCreationTimestamp="2025-10-03 13:10:28 +0000 UTC" firstStartedPulling="2025-10-03 13:10:30.082073238 +0000 UTC m=+1238.485971073" lastFinishedPulling="2025-10-03 13:10:48.482299935 +0000 UTC m=+1256.886197770" observedRunningTime="2025-10-03 13:10:49.330571822 +0000 UTC m=+1257.734469677" watchObservedRunningTime="2025-10-03 13:10:49.333759178 +0000 UTC m=+1257.737657013" Oct 03 13:10:49 crc kubenswrapper[4962]: I1003 13:10:49.347304 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-2k5gq" podStartSLOduration=3.22247203 podStartE2EDuration="21.347286662s" podCreationTimestamp="2025-10-03 13:10:28 +0000 UTC" firstStartedPulling="2025-10-03 13:10:30.338949434 +0000 UTC m=+1238.742847259" lastFinishedPulling="2025-10-03 13:10:48.463764056 +0000 UTC m=+1256.867661891" observedRunningTime="2025-10-03 13:10:49.342267197 +0000 UTC m=+1257.746165032" watchObservedRunningTime="2025-10-03 13:10:49.347286662 +0000 UTC m=+1257.751184497" Oct 03 13:10:49 crc kubenswrapper[4962]: I1003 13:10:49.379096 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-7f4r6" podStartSLOduration=2.993110375 podStartE2EDuration="21.379077948s" podCreationTimestamp="2025-10-03 13:10:28 +0000 UTC" firstStartedPulling="2025-10-03 13:10:30.08659766 +0000 UTC m=+1238.490495495" lastFinishedPulling="2025-10-03 13:10:48.472565223 +0000 UTC m=+1256.876463068" observedRunningTime="2025-10-03 13:10:49.355265737 +0000 UTC m=+1257.759163572" watchObservedRunningTime="2025-10-03 13:10:49.379077948 +0000 UTC m=+1257.782975783" Oct 03 13:10:49 crc kubenswrapper[4962]: I1003 13:10:49.383938 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-2fvpg" podStartSLOduration=3.069497581 podStartE2EDuration="21.383925218s" podCreationTimestamp="2025-10-03 13:10:28 +0000 UTC" firstStartedPulling="2025-10-03 13:10:30.211856022 +0000 UTC m=+1238.615753857" lastFinishedPulling="2025-10-03 13:10:48.526283659 +0000 UTC m=+1256.930181494" observedRunningTime="2025-10-03 13:10:49.376328544 +0000 UTC m=+1257.780226389" watchObservedRunningTime="2025-10-03 13:10:49.383925218 +0000 UTC m=+1257.787823053" Oct 03 13:10:54 crc kubenswrapper[4962]: I1003 13:10:54.659855 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 13:10:54 crc kubenswrapper[4962]: I1003 13:10:54.661409 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 13:10:59 crc kubenswrapper[4962]: I1003 13:10:59.013341 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-q95dg" Oct 03 13:10:59 crc 
kubenswrapper[4962]: I1003 13:10:59.141454 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-gm9lt" Oct 03 13:10:59 crc kubenswrapper[4962]: I1003 13:10:59.158129 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-7f4r6" Oct 03 13:10:59 crc kubenswrapper[4962]: I1003 13:10:59.301117 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-pgf8s" Oct 03 13:10:59 crc kubenswrapper[4962]: I1003 13:10:59.358837 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-2k5gq" Oct 03 13:10:59 crc kubenswrapper[4962]: I1003 13:10:59.389897 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-fcd7d9895-2fvpg" Oct 03 13:10:59 crc kubenswrapper[4962]: I1003 13:10:59.687479 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678jlpx7" Oct 03 13:11:13 crc kubenswrapper[4962]: I1003 13:11:13.485750 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-r59c8"] Oct 03 13:11:13 crc kubenswrapper[4962]: I1003 13:11:13.487482 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-r59c8" Oct 03 13:11:13 crc kubenswrapper[4962]: I1003 13:11:13.489689 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 03 13:11:13 crc kubenswrapper[4962]: I1003 13:11:13.490300 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 03 13:11:13 crc kubenswrapper[4962]: I1003 13:11:13.492451 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 03 13:11:13 crc kubenswrapper[4962]: I1003 13:11:13.493098 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-gm9kw" Oct 03 13:11:13 crc kubenswrapper[4962]: I1003 13:11:13.493252 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-r59c8"] Oct 03 13:11:13 crc kubenswrapper[4962]: I1003 13:11:13.560268 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-6lh7p"] Oct 03 13:11:13 crc kubenswrapper[4962]: I1003 13:11:13.561788 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-6lh7p" Oct 03 13:11:13 crc kubenswrapper[4962]: I1003 13:11:13.564176 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 03 13:11:13 crc kubenswrapper[4962]: I1003 13:11:13.578031 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-6lh7p"] Oct 03 13:11:13 crc kubenswrapper[4962]: I1003 13:11:13.674493 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4c317bd-17b1-43c7-b60d-04008bff1111-config\") pod \"dnsmasq-dns-78dd6ddcc-6lh7p\" (UID: \"d4c317bd-17b1-43c7-b60d-04008bff1111\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6lh7p" Oct 03 13:11:13 crc kubenswrapper[4962]: I1003 13:11:13.674952 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11343823-c040-4e3f-b36c-33daeb2bc53c-config\") pod \"dnsmasq-dns-675f4bcbfc-r59c8\" (UID: \"11343823-c040-4e3f-b36c-33daeb2bc53c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-r59c8" Oct 03 13:11:13 crc kubenswrapper[4962]: I1003 13:11:13.675062 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4c317bd-17b1-43c7-b60d-04008bff1111-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-6lh7p\" (UID: \"d4c317bd-17b1-43c7-b60d-04008bff1111\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6lh7p" Oct 03 13:11:13 crc kubenswrapper[4962]: I1003 13:11:13.675137 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptfjw\" (UniqueName: \"kubernetes.io/projected/11343823-c040-4e3f-b36c-33daeb2bc53c-kube-api-access-ptfjw\") pod \"dnsmasq-dns-675f4bcbfc-r59c8\" (UID: \"11343823-c040-4e3f-b36c-33daeb2bc53c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-r59c8" Oct 03 13:11:13 crc kubenswrapper[4962]: I1003 13:11:13.675354 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp2j4\" (UniqueName: \"kubernetes.io/projected/d4c317bd-17b1-43c7-b60d-04008bff1111-kube-api-access-cp2j4\") pod \"dnsmasq-dns-78dd6ddcc-6lh7p\" (UID: \"d4c317bd-17b1-43c7-b60d-04008bff1111\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6lh7p" Oct 03 13:11:13 crc kubenswrapper[4962]: I1003 13:11:13.776644 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4c317bd-17b1-43c7-b60d-04008bff1111-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-6lh7p\" (UID: \"d4c317bd-17b1-43c7-b60d-04008bff1111\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6lh7p" Oct 03 13:11:13 crc kubenswrapper[4962]: I1003 13:11:13.776696 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptfjw\" (UniqueName: \"kubernetes.io/projected/11343823-c040-4e3f-b36c-33daeb2bc53c-kube-api-access-ptfjw\") pod \"dnsmasq-dns-675f4bcbfc-r59c8\" (UID: \"11343823-c040-4e3f-b36c-33daeb2bc53c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-r59c8" Oct 03 13:11:13 crc kubenswrapper[4962]: I1003 13:11:13.776772 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cp2j4\" (UniqueName: \"kubernetes.io/projected/d4c317bd-17b1-43c7-b60d-04008bff1111-kube-api-access-cp2j4\") pod \"dnsmasq-dns-78dd6ddcc-6lh7p\" (UID: \"d4c317bd-17b1-43c7-b60d-04008bff1111\") " 
pod="openstack/dnsmasq-dns-78dd6ddcc-6lh7p" Oct 03 13:11:13 crc kubenswrapper[4962]: I1003 13:11:13.776801 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4c317bd-17b1-43c7-b60d-04008bff1111-config\") pod \"dnsmasq-dns-78dd6ddcc-6lh7p\" (UID: \"d4c317bd-17b1-43c7-b60d-04008bff1111\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6lh7p" Oct 03 13:11:13 crc kubenswrapper[4962]: I1003 13:11:13.776820 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11343823-c040-4e3f-b36c-33daeb2bc53c-config\") pod \"dnsmasq-dns-675f4bcbfc-r59c8\" (UID: \"11343823-c040-4e3f-b36c-33daeb2bc53c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-r59c8" Oct 03 13:11:13 crc kubenswrapper[4962]: I1003 13:11:13.777909 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11343823-c040-4e3f-b36c-33daeb2bc53c-config\") pod \"dnsmasq-dns-675f4bcbfc-r59c8\" (UID: \"11343823-c040-4e3f-b36c-33daeb2bc53c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-r59c8" Oct 03 13:11:13 crc kubenswrapper[4962]: I1003 13:11:13.778158 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4c317bd-17b1-43c7-b60d-04008bff1111-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-6lh7p\" (UID: \"d4c317bd-17b1-43c7-b60d-04008bff1111\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6lh7p" Oct 03 13:11:13 crc kubenswrapper[4962]: I1003 13:11:13.778243 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4c317bd-17b1-43c7-b60d-04008bff1111-config\") pod \"dnsmasq-dns-78dd6ddcc-6lh7p\" (UID: \"d4c317bd-17b1-43c7-b60d-04008bff1111\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6lh7p" Oct 03 13:11:13 crc kubenswrapper[4962]: I1003 13:11:13.795520 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp2j4\" (UniqueName: \"kubernetes.io/projected/d4c317bd-17b1-43c7-b60d-04008bff1111-kube-api-access-cp2j4\") pod \"dnsmasq-dns-78dd6ddcc-6lh7p\" (UID: \"d4c317bd-17b1-43c7-b60d-04008bff1111\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6lh7p" Oct 03 13:11:13 crc kubenswrapper[4962]: I1003 13:11:13.802921 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptfjw\" (UniqueName: \"kubernetes.io/projected/11343823-c040-4e3f-b36c-33daeb2bc53c-kube-api-access-ptfjw\") pod \"dnsmasq-dns-675f4bcbfc-r59c8\" (UID: \"11343823-c040-4e3f-b36c-33daeb2bc53c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-r59c8" Oct 03 13:11:13 crc kubenswrapper[4962]: I1003 13:11:13.805450 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-r59c8" Oct 03 13:11:13 crc kubenswrapper[4962]: I1003 13:11:13.877777 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-6lh7p" Oct 03 13:11:14 crc kubenswrapper[4962]: I1003 13:11:14.251155 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-r59c8"] Oct 03 13:11:14 crc kubenswrapper[4962]: I1003 13:11:14.366863 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-6lh7p"] Oct 03 13:11:14 crc kubenswrapper[4962]: W1003 13:11:14.371508 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4c317bd_17b1_43c7_b60d_04008bff1111.slice/crio-c3ff2847f7b793afc4250272099934a30f858b5f04a366c6186067d9349c7ac5 WatchSource:0}: Error finding container c3ff2847f7b793afc4250272099934a30f858b5f04a366c6186067d9349c7ac5: Status 404 returned error can't find the container with id c3ff2847f7b793afc4250272099934a30f858b5f04a366c6186067d9349c7ac5 Oct 03 13:11:14 crc kubenswrapper[4962]: I1003 13:11:14.480912 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-6lh7p" event={"ID":"d4c317bd-17b1-43c7-b60d-04008bff1111","Type":"ContainerStarted","Data":"c3ff2847f7b793afc4250272099934a30f858b5f04a366c6186067d9349c7ac5"} Oct 03 13:11:14 crc kubenswrapper[4962]: I1003 13:11:14.481744 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-r59c8" event={"ID":"11343823-c040-4e3f-b36c-33daeb2bc53c","Type":"ContainerStarted","Data":"69f4116a4e1bfaaaa5ec3906a047fad00cc1f889b4c0c1bf9dd3abc0a954e061"} Oct 03 13:11:16 crc kubenswrapper[4962]: I1003 13:11:16.453822 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-r59c8"] Oct 03 13:11:16 crc kubenswrapper[4962]: I1003 13:11:16.496973 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-mzgnf"] Oct 03 13:11:16 crc kubenswrapper[4962]: I1003 13:11:16.498320 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-mzgnf" Oct 03 13:11:16 crc kubenswrapper[4962]: I1003 13:11:16.509520 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-mzgnf"] Oct 03 13:11:16 crc kubenswrapper[4962]: I1003 13:11:16.632673 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v8fg\" (UniqueName: \"kubernetes.io/projected/a5fef104-2524-4ce3-9404-b4bbbdf139bd-kube-api-access-9v8fg\") pod \"dnsmasq-dns-666b6646f7-mzgnf\" (UID: \"a5fef104-2524-4ce3-9404-b4bbbdf139bd\") " pod="openstack/dnsmasq-dns-666b6646f7-mzgnf" Oct 03 13:11:16 crc kubenswrapper[4962]: I1003 13:11:16.632800 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5fef104-2524-4ce3-9404-b4bbbdf139bd-config\") pod \"dnsmasq-dns-666b6646f7-mzgnf\" (UID: \"a5fef104-2524-4ce3-9404-b4bbbdf139bd\") " pod="openstack/dnsmasq-dns-666b6646f7-mzgnf" Oct 03 13:11:16 crc kubenswrapper[4962]: I1003 13:11:16.632882 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a5fef104-2524-4ce3-9404-b4bbbdf139bd-dns-svc\") pod \"dnsmasq-dns-666b6646f7-mzgnf\" (UID: \"a5fef104-2524-4ce3-9404-b4bbbdf139bd\") " pod="openstack/dnsmasq-dns-666b6646f7-mzgnf" Oct 03 13:11:16 crc kubenswrapper[4962]: I1003 13:11:16.735532 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5fef104-2524-4ce3-9404-b4bbbdf139bd-config\") pod \"dnsmasq-dns-666b6646f7-mzgnf\" (UID: \"a5fef104-2524-4ce3-9404-b4bbbdf139bd\") " pod="openstack/dnsmasq-dns-666b6646f7-mzgnf" Oct 03 13:11:16 crc kubenswrapper[4962]: I1003 13:11:16.735612 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a5fef104-2524-4ce3-9404-b4bbbdf139bd-dns-svc\") pod \"dnsmasq-dns-666b6646f7-mzgnf\" (UID: \"a5fef104-2524-4ce3-9404-b4bbbdf139bd\") " pod="openstack/dnsmasq-dns-666b6646f7-mzgnf" Oct 03 13:11:16 crc kubenswrapper[4962]: I1003 13:11:16.735716 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v8fg\" (UniqueName: \"kubernetes.io/projected/a5fef104-2524-4ce3-9404-b4bbbdf139bd-kube-api-access-9v8fg\") pod \"dnsmasq-dns-666b6646f7-mzgnf\" (UID: \"a5fef104-2524-4ce3-9404-b4bbbdf139bd\") " pod="openstack/dnsmasq-dns-666b6646f7-mzgnf" Oct 03 13:11:16 crc kubenswrapper[4962]: I1003 13:11:16.736558 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5fef104-2524-4ce3-9404-b4bbbdf139bd-config\") pod \"dnsmasq-dns-666b6646f7-mzgnf\" (UID: \"a5fef104-2524-4ce3-9404-b4bbbdf139bd\") " pod="openstack/dnsmasq-dns-666b6646f7-mzgnf" Oct 03 13:11:16 crc kubenswrapper[4962]: I1003 13:11:16.736928 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a5fef104-2524-4ce3-9404-b4bbbdf139bd-dns-svc\") pod \"dnsmasq-dns-666b6646f7-mzgnf\" (UID: \"a5fef104-2524-4ce3-9404-b4bbbdf139bd\") " pod="openstack/dnsmasq-dns-666b6646f7-mzgnf" Oct 03 13:11:16 crc kubenswrapper[4962]: I1003 13:11:16.751692 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-6lh7p"] Oct 03 13:11:16 crc kubenswrapper[4962]: I1003 13:11:16.780819 
4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v8fg\" (UniqueName: \"kubernetes.io/projected/a5fef104-2524-4ce3-9404-b4bbbdf139bd-kube-api-access-9v8fg\") pod \"dnsmasq-dns-666b6646f7-mzgnf\" (UID: \"a5fef104-2524-4ce3-9404-b4bbbdf139bd\") " pod="openstack/dnsmasq-dns-666b6646f7-mzgnf" Oct 03 13:11:16 crc kubenswrapper[4962]: I1003 13:11:16.800856 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-kfh8n"] Oct 03 13:11:16 crc kubenswrapper[4962]: I1003 13:11:16.802091 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-kfh8n" Oct 03 13:11:16 crc kubenswrapper[4962]: I1003 13:11:16.816425 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-kfh8n"] Oct 03 13:11:16 crc kubenswrapper[4962]: I1003 13:11:16.833073 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-mzgnf" Oct 03 13:11:16 crc kubenswrapper[4962]: I1003 13:11:16.943556 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpjbp\" (UniqueName: \"kubernetes.io/projected/f9733291-12c0-4d8d-9bd4-66b98b55b3ed-kube-api-access-dpjbp\") pod \"dnsmasq-dns-57d769cc4f-kfh8n\" (UID: \"f9733291-12c0-4d8d-9bd4-66b98b55b3ed\") " pod="openstack/dnsmasq-dns-57d769cc4f-kfh8n" Oct 03 13:11:16 crc kubenswrapper[4962]: I1003 13:11:16.943613 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9733291-12c0-4d8d-9bd4-66b98b55b3ed-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-kfh8n\" (UID: \"f9733291-12c0-4d8d-9bd4-66b98b55b3ed\") " pod="openstack/dnsmasq-dns-57d769cc4f-kfh8n" Oct 03 13:11:16 crc kubenswrapper[4962]: I1003 13:11:16.943688 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9733291-12c0-4d8d-9bd4-66b98b55b3ed-config\") pod \"dnsmasq-dns-57d769cc4f-kfh8n\" (UID: \"f9733291-12c0-4d8d-9bd4-66b98b55b3ed\") " pod="openstack/dnsmasq-dns-57d769cc4f-kfh8n" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.046652 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpjbp\" (UniqueName: \"kubernetes.io/projected/f9733291-12c0-4d8d-9bd4-66b98b55b3ed-kube-api-access-dpjbp\") pod \"dnsmasq-dns-57d769cc4f-kfh8n\" (UID: \"f9733291-12c0-4d8d-9bd4-66b98b55b3ed\") " pod="openstack/dnsmasq-dns-57d769cc4f-kfh8n" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.046694 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9733291-12c0-4d8d-9bd4-66b98b55b3ed-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-kfh8n\" (UID: \"f9733291-12c0-4d8d-9bd4-66b98b55b3ed\") " pod="openstack/dnsmasq-dns-57d769cc4f-kfh8n" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.046736 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9733291-12c0-4d8d-9bd4-66b98b55b3ed-config\") pod \"dnsmasq-dns-57d769cc4f-kfh8n\" (UID: \"f9733291-12c0-4d8d-9bd4-66b98b55b3ed\") " pod="openstack/dnsmasq-dns-57d769cc4f-kfh8n" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.048267 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f9733291-12c0-4d8d-9bd4-66b98b55b3ed-config\") pod \"dnsmasq-dns-57d769cc4f-kfh8n\" (UID: \"f9733291-12c0-4d8d-9bd4-66b98b55b3ed\") " pod="openstack/dnsmasq-dns-57d769cc4f-kfh8n" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.049175 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9733291-12c0-4d8d-9bd4-66b98b55b3ed-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-kfh8n\" (UID: \"f9733291-12c0-4d8d-9bd4-66b98b55b3ed\") " pod="openstack/dnsmasq-dns-57d769cc4f-kfh8n" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.077310 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpjbp\" (UniqueName: \"kubernetes.io/projected/f9733291-12c0-4d8d-9bd4-66b98b55b3ed-kube-api-access-dpjbp\") pod \"dnsmasq-dns-57d769cc4f-kfh8n\" (UID: \"f9733291-12c0-4d8d-9bd4-66b98b55b3ed\") " pod="openstack/dnsmasq-dns-57d769cc4f-kfh8n" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.136312 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-kfh8n" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.430787 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-mzgnf"] Oct 03 13:11:17 crc kubenswrapper[4962]: W1003 13:11:17.447958 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5fef104_2524_4ce3_9404_b4bbbdf139bd.slice/crio-990e67fb0bb4d1cd445a48ae63f401ded0b315a39bbe6a427895272ba2e1869c WatchSource:0}: Error finding container 990e67fb0bb4d1cd445a48ae63f401ded0b315a39bbe6a427895272ba2e1869c: Status 404 returned error can't find the container with id 990e67fb0bb4d1cd445a48ae63f401ded0b315a39bbe6a427895272ba2e1869c Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.510505 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-mzgnf" event={"ID":"a5fef104-2524-4ce3-9404-b4bbbdf139bd","Type":"ContainerStarted","Data":"990e67fb0bb4d1cd445a48ae63f401ded0b315a39bbe6a427895272ba2e1869c"} Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.670012 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.671431 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.674316 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-bnqk2" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.674610 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.674659 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.674686 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.676204 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 03 13:11:17 crc kubenswrapper[4962]: W1003 13:11:17.677196 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9733291_12c0_4d8d_9bd4_66b98b55b3ed.slice/crio-d3ead2e630d8c46d6290b28b24ff945b0a1a8b20cca32fd89363bfb55788d8b4 WatchSource:0}: Error finding container d3ead2e630d8c46d6290b28b24ff945b0a1a8b20cca32fd89363bfb55788d8b4: Status 404 returned error can't find the container with id d3ead2e630d8c46d6290b28b24ff945b0a1a8b20cca32fd89363bfb55788d8b4 Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.677979 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.680205 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.681317 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-kfh8n"] Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.692753 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.762403 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb9gx\" (UniqueName: \"kubernetes.io/projected/862ad9df-af58-4304-9ad5-7faba334e2d9-kube-api-access-wb9gx\") pod \"rabbitmq-server-0\" (UID: \"862ad9df-af58-4304-9ad5-7faba334e2d9\") " pod="openstack/rabbitmq-server-0" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.762507 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/862ad9df-af58-4304-9ad5-7faba334e2d9-config-data\") pod \"rabbitmq-server-0\" (UID: \"862ad9df-af58-4304-9ad5-7faba334e2d9\") " pod="openstack/rabbitmq-server-0" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.762581 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/862ad9df-af58-4304-9ad5-7faba334e2d9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"862ad9df-af58-4304-9ad5-7faba334e2d9\") " pod="openstack/rabbitmq-server-0" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.762683 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/862ad9df-af58-4304-9ad5-7faba334e2d9-plugins-conf\") pod 
\"rabbitmq-server-0\" (UID: \"862ad9df-af58-4304-9ad5-7faba334e2d9\") " pod="openstack/rabbitmq-server-0" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.762712 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/862ad9df-af58-4304-9ad5-7faba334e2d9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"862ad9df-af58-4304-9ad5-7faba334e2d9\") " pod="openstack/rabbitmq-server-0" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.762735 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/862ad9df-af58-4304-9ad5-7faba334e2d9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"862ad9df-af58-4304-9ad5-7faba334e2d9\") " pod="openstack/rabbitmq-server-0" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.762780 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/862ad9df-af58-4304-9ad5-7faba334e2d9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"862ad9df-af58-4304-9ad5-7faba334e2d9\") " pod="openstack/rabbitmq-server-0" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.762817 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/862ad9df-af58-4304-9ad5-7faba334e2d9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"862ad9df-af58-4304-9ad5-7faba334e2d9\") " pod="openstack/rabbitmq-server-0" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.762833 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/862ad9df-af58-4304-9ad5-7faba334e2d9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"862ad9df-af58-4304-9ad5-7faba334e2d9\") " pod="openstack/rabbitmq-server-0" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.762854 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"862ad9df-af58-4304-9ad5-7faba334e2d9\") " pod="openstack/rabbitmq-server-0" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.762900 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/862ad9df-af58-4304-9ad5-7faba334e2d9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"862ad9df-af58-4304-9ad5-7faba334e2d9\") " pod="openstack/rabbitmq-server-0" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.864736 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/862ad9df-af58-4304-9ad5-7faba334e2d9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"862ad9df-af58-4304-9ad5-7faba334e2d9\") " pod="openstack/rabbitmq-server-0" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.864801 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/862ad9df-af58-4304-9ad5-7faba334e2d9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"862ad9df-af58-4304-9ad5-7faba334e2d9\") " pod="openstack/rabbitmq-server-0" Oct 03 13:11:17 crc 
kubenswrapper[4962]: I1003 13:11:17.864843 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/862ad9df-af58-4304-9ad5-7faba334e2d9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"862ad9df-af58-4304-9ad5-7faba334e2d9\") " pod="openstack/rabbitmq-server-0" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.864869 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/862ad9df-af58-4304-9ad5-7faba334e2d9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"862ad9df-af58-4304-9ad5-7faba334e2d9\") " pod="openstack/rabbitmq-server-0" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.864909 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/862ad9df-af58-4304-9ad5-7faba334e2d9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"862ad9df-af58-4304-9ad5-7faba334e2d9\") " pod="openstack/rabbitmq-server-0" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.864944 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/862ad9df-af58-4304-9ad5-7faba334e2d9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"862ad9df-af58-4304-9ad5-7faba334e2d9\") " pod="openstack/rabbitmq-server-0" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.864964 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/862ad9df-af58-4304-9ad5-7faba334e2d9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"862ad9df-af58-4304-9ad5-7faba334e2d9\") " pod="openstack/rabbitmq-server-0" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.864986 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"862ad9df-af58-4304-9ad5-7faba334e2d9\") " pod="openstack/rabbitmq-server-0" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.865010 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/862ad9df-af58-4304-9ad5-7faba334e2d9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"862ad9df-af58-4304-9ad5-7faba334e2d9\") " pod="openstack/rabbitmq-server-0" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.865037 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wb9gx\" (UniqueName: \"kubernetes.io/projected/862ad9df-af58-4304-9ad5-7faba334e2d9-kube-api-access-wb9gx\") pod \"rabbitmq-server-0\" (UID: \"862ad9df-af58-4304-9ad5-7faba334e2d9\") " pod="openstack/rabbitmq-server-0" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.865082 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/862ad9df-af58-4304-9ad5-7faba334e2d9-config-data\") pod \"rabbitmq-server-0\" (UID: \"862ad9df-af58-4304-9ad5-7faba334e2d9\") " pod="openstack/rabbitmq-server-0" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.866136 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/862ad9df-af58-4304-9ad5-7faba334e2d9-rabbitmq-plugins\") pod 
\"rabbitmq-server-0\" (UID: \"862ad9df-af58-4304-9ad5-7faba334e2d9\") " pod="openstack/rabbitmq-server-0" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.866130 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"862ad9df-af58-4304-9ad5-7faba334e2d9\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.866244 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/862ad9df-af58-4304-9ad5-7faba334e2d9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"862ad9df-af58-4304-9ad5-7faba334e2d9\") " pod="openstack/rabbitmq-server-0" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.866257 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/862ad9df-af58-4304-9ad5-7faba334e2d9-config-data\") pod \"rabbitmq-server-0\" (UID: \"862ad9df-af58-4304-9ad5-7faba334e2d9\") " pod="openstack/rabbitmq-server-0" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.866875 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/862ad9df-af58-4304-9ad5-7faba334e2d9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"862ad9df-af58-4304-9ad5-7faba334e2d9\") " pod="openstack/rabbitmq-server-0" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.870341 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/862ad9df-af58-4304-9ad5-7faba334e2d9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"862ad9df-af58-4304-9ad5-7faba334e2d9\") " pod="openstack/rabbitmq-server-0" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.870496 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/862ad9df-af58-4304-9ad5-7faba334e2d9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"862ad9df-af58-4304-9ad5-7faba334e2d9\") " pod="openstack/rabbitmq-server-0" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.873338 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/862ad9df-af58-4304-9ad5-7faba334e2d9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"862ad9df-af58-4304-9ad5-7faba334e2d9\") " pod="openstack/rabbitmq-server-0" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.876211 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/862ad9df-af58-4304-9ad5-7faba334e2d9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"862ad9df-af58-4304-9ad5-7faba334e2d9\") " pod="openstack/rabbitmq-server-0" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.881621 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/862ad9df-af58-4304-9ad5-7faba334e2d9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"862ad9df-af58-4304-9ad5-7faba334e2d9\") " pod="openstack/rabbitmq-server-0" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.903936 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wb9gx\" (UniqueName: 
\"kubernetes.io/projected/862ad9df-af58-4304-9ad5-7faba334e2d9-kube-api-access-wb9gx\") pod \"rabbitmq-server-0\" (UID: \"862ad9df-af58-4304-9ad5-7faba334e2d9\") " pod="openstack/rabbitmq-server-0" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.926862 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"862ad9df-af58-4304-9ad5-7faba334e2d9\") " pod="openstack/rabbitmq-server-0" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.950675 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.952608 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.954970 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.955063 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.954975 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.954991 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.955410 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.955790 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.956594 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-frhnj" Oct 03 13:11:17 crc kubenswrapper[4962]: I1003 13:11:17.966811 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 13:11:18 crc kubenswrapper[4962]: I1003 13:11:18.016685 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 13:11:18 crc kubenswrapper[4962]: I1003 13:11:18.068518 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/221bdd26-0fec-49e5-86ec-c2aefe7a5902-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 13:11:18 crc kubenswrapper[4962]: I1003 13:11:18.068598 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/221bdd26-0fec-49e5-86ec-c2aefe7a5902-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 13:11:18 crc kubenswrapper[4962]: I1003 13:11:18.068651 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/221bdd26-0fec-49e5-86ec-c2aefe7a5902-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 13:11:18 crc kubenswrapper[4962]: I1003 13:11:18.068683 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 13:11:18 crc kubenswrapper[4962]: I1003 13:11:18.068706 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/221bdd26-0fec-49e5-86ec-c2aefe7a5902-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 13:11:18 crc kubenswrapper[4962]: I1003 13:11:18.068730 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/221bdd26-0fec-49e5-86ec-c2aefe7a5902-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 13:11:18 crc kubenswrapper[4962]: I1003 13:11:18.068755 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/221bdd26-0fec-49e5-86ec-c2aefe7a5902-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 13:11:18 crc kubenswrapper[4962]: I1003 13:11:18.068783 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/221bdd26-0fec-49e5-86ec-c2aefe7a5902-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 13:11:18 crc kubenswrapper[4962]: I1003 13:11:18.068802 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/221bdd26-0fec-49e5-86ec-c2aefe7a5902-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"221bdd26-0fec-49e5-86ec-c2aefe7a5902\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 13:11:18 crc kubenswrapper[4962]: I1003 13:11:18.068833 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqtcv\" (UniqueName: \"kubernetes.io/projected/221bdd26-0fec-49e5-86ec-c2aefe7a5902-kube-api-access-lqtcv\") pod \"rabbitmq-cell1-server-0\" (UID: \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 13:11:18 crc kubenswrapper[4962]: I1003 13:11:18.068857 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/221bdd26-0fec-49e5-86ec-c2aefe7a5902-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 13:11:18 crc kubenswrapper[4962]: I1003 13:11:18.169950 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 13:11:18 crc kubenswrapper[4962]: I1003 13:11:18.169992 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/221bdd26-0fec-49e5-86ec-c2aefe7a5902-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 13:11:18 crc kubenswrapper[4962]: I1003 13:11:18.170015 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/221bdd26-0fec-49e5-86ec-c2aefe7a5902-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 13:11:18 crc kubenswrapper[4962]: I1003 13:11:18.170040 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/221bdd26-0fec-49e5-86ec-c2aefe7a5902-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 13:11:18 crc kubenswrapper[4962]: I1003 13:11:18.170066 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/221bdd26-0fec-49e5-86ec-c2aefe7a5902-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 13:11:18 crc kubenswrapper[4962]: I1003 13:11:18.170085 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/221bdd26-0fec-49e5-86ec-c2aefe7a5902-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 13:11:18 crc kubenswrapper[4962]: I1003 13:11:18.170105 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqtcv\" (UniqueName: \"kubernetes.io/projected/221bdd26-0fec-49e5-86ec-c2aefe7a5902-kube-api-access-lqtcv\") pod \"rabbitmq-cell1-server-0\" (UID: \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 13:11:18 crc 
kubenswrapper[4962]: I1003 13:11:18.170126 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/221bdd26-0fec-49e5-86ec-c2aefe7a5902-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 13:11:18 crc kubenswrapper[4962]: I1003 13:11:18.170185 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/221bdd26-0fec-49e5-86ec-c2aefe7a5902-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 13:11:18 crc kubenswrapper[4962]: I1003 13:11:18.170205 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/221bdd26-0fec-49e5-86ec-c2aefe7a5902-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 13:11:18 crc kubenswrapper[4962]: I1003 13:11:18.170222 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/221bdd26-0fec-49e5-86ec-c2aefe7a5902-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 13:11:18 crc kubenswrapper[4962]: I1003 13:11:18.170603 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/221bdd26-0fec-49e5-86ec-c2aefe7a5902-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 13:11:18 crc kubenswrapper[4962]: I1003 13:11:18.170710 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Oct 03 13:11:18 crc kubenswrapper[4962]: I1003 13:11:18.171732 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/221bdd26-0fec-49e5-86ec-c2aefe7a5902-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 13:11:18 crc kubenswrapper[4962]: I1003 13:11:18.171952 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/221bdd26-0fec-49e5-86ec-c2aefe7a5902-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 13:11:18 crc kubenswrapper[4962]: I1003 13:11:18.177957 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/221bdd26-0fec-49e5-86ec-c2aefe7a5902-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 13:11:18 crc kubenswrapper[4962]: I1003 13:11:18.178363 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" 
(UniqueName: \"kubernetes.io/empty-dir/221bdd26-0fec-49e5-86ec-c2aefe7a5902-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 13:11:18 crc kubenswrapper[4962]: I1003 13:11:18.179183 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/221bdd26-0fec-49e5-86ec-c2aefe7a5902-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 13:11:18 crc kubenswrapper[4962]: I1003 13:11:18.180267 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/221bdd26-0fec-49e5-86ec-c2aefe7a5902-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 13:11:18 crc kubenswrapper[4962]: I1003 13:11:18.190017 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/221bdd26-0fec-49e5-86ec-c2aefe7a5902-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 13:11:18 crc kubenswrapper[4962]: I1003 13:11:18.190289 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/221bdd26-0fec-49e5-86ec-c2aefe7a5902-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 13:11:18 crc kubenswrapper[4962]: I1003 13:11:18.192047 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqtcv\" (UniqueName: \"kubernetes.io/projected/221bdd26-0fec-49e5-86ec-c2aefe7a5902-kube-api-access-lqtcv\") pod \"rabbitmq-cell1-server-0\" (UID: \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 13:11:18 crc kubenswrapper[4962]: I1003 13:11:18.194915 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 13:11:18 crc kubenswrapper[4962]: I1003 13:11:18.289523 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 13:11:18 crc kubenswrapper[4962]: I1003 13:11:18.593323 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-kfh8n" event={"ID":"f9733291-12c0-4d8d-9bd4-66b98b55b3ed","Type":"ContainerStarted","Data":"d3ead2e630d8c46d6290b28b24ff945b0a1a8b20cca32fd89363bfb55788d8b4"} Oct 03 13:11:18 crc kubenswrapper[4962]: I1003 13:11:18.606568 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 13:11:18 crc kubenswrapper[4962]: I1003 13:11:18.785909 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 13:11:18 crc kubenswrapper[4962]: W1003 13:11:18.823651 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod221bdd26_0fec_49e5_86ec_c2aefe7a5902.slice/crio-8cd05986b0cc552f2360426afb7b5dcd08e94c8b1adb6a4e2542cae4b84a372b WatchSource:0}: Error finding container 8cd05986b0cc552f2360426afb7b5dcd08e94c8b1adb6a4e2542cae4b84a372b: Status 404 returned error can't find the container with id 8cd05986b0cc552f2360426afb7b5dcd08e94c8b1adb6a4e2542cae4b84a372b Oct 03 13:11:19 crc kubenswrapper[4962]: I1003 13:11:19.607332 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"862ad9df-af58-4304-9ad5-7faba334e2d9","Type":"ContainerStarted","Data":"3e73ccf3edd6cc4ebb1304265e6f6697b7c09c7e42268054a0ce04cc42476b09"} Oct 03 13:11:19 crc kubenswrapper[4962]: I1003 13:11:19.608465 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"221bdd26-0fec-49e5-86ec-c2aefe7a5902","Type":"ContainerStarted","Data":"8cd05986b0cc552f2360426afb7b5dcd08e94c8b1adb6a4e2542cae4b84a372b"} Oct 03 13:11:19 crc kubenswrapper[4962]: I1003 13:11:19.648104 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 03 13:11:19 crc kubenswrapper[4962]: I1003 13:11:19.650679 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 03 13:11:19 crc kubenswrapper[4962]: I1003 13:11:19.652823 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-lk4g2" Oct 03 13:11:19 crc kubenswrapper[4962]: I1003 13:11:19.652881 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 03 13:11:19 crc kubenswrapper[4962]: I1003 13:11:19.652938 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 03 13:11:19 crc kubenswrapper[4962]: I1003 13:11:19.653144 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 03 13:11:19 crc kubenswrapper[4962]: I1003 13:11:19.653981 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 03 13:11:19 crc kubenswrapper[4962]: I1003 13:11:19.663138 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 03 13:11:19 crc kubenswrapper[4962]: I1003 13:11:19.676403 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 03 13:11:19 crc kubenswrapper[4962]: I1003 13:11:19.800252 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3fb0456-394e-4041-829b-57c162966b2b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a3fb0456-394e-4041-829b-57c162966b2b\") " pod="openstack/openstack-galera-0" Oct 03 13:11:19 crc kubenswrapper[4962]: I1003 13:11:19.800318 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwc6r\" (UniqueName: \"kubernetes.io/projected/a3fb0456-394e-4041-829b-57c162966b2b-kube-api-access-dwc6r\") pod \"openstack-galera-0\" (UID: \"a3fb0456-394e-4041-829b-57c162966b2b\") " pod="openstack/openstack-galera-0" Oct 03 13:11:19 crc kubenswrapper[4962]: I1003 13:11:19.800393 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a3fb0456-394e-4041-829b-57c162966b2b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"a3fb0456-394e-4041-829b-57c162966b2b\") " pod="openstack/openstack-galera-0" Oct 03 13:11:19 crc kubenswrapper[4962]: I1003 13:11:19.800444 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a3fb0456-394e-4041-829b-57c162966b2b-config-data-default\") pod \"openstack-galera-0\" (UID: \"a3fb0456-394e-4041-829b-57c162966b2b\") " pod="openstack/openstack-galera-0" Oct 03 13:11:19 crc kubenswrapper[4962]: I1003 13:11:19.800474 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3fb0456-394e-4041-829b-57c162966b2b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"a3fb0456-394e-4041-829b-57c162966b2b\") " pod="openstack/openstack-galera-0" Oct 03 13:11:19 crc kubenswrapper[4962]: I1003 13:11:19.800504 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3fb0456-394e-4041-829b-57c162966b2b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"a3fb0456-394e-4041-829b-57c162966b2b\") " 
pod="openstack/openstack-galera-0" Oct 03 13:11:19 crc kubenswrapper[4962]: I1003 13:11:19.800538 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/a3fb0456-394e-4041-829b-57c162966b2b-secrets\") pod \"openstack-galera-0\" (UID: \"a3fb0456-394e-4041-829b-57c162966b2b\") " pod="openstack/openstack-galera-0" Oct 03 13:11:19 crc kubenswrapper[4962]: I1003 13:11:19.800560 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"a3fb0456-394e-4041-829b-57c162966b2b\") " pod="openstack/openstack-galera-0" Oct 03 13:11:19 crc kubenswrapper[4962]: I1003 13:11:19.800656 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a3fb0456-394e-4041-829b-57c162966b2b-kolla-config\") pod \"openstack-galera-0\" (UID: \"a3fb0456-394e-4041-829b-57c162966b2b\") " pod="openstack/openstack-galera-0" Oct 03 13:11:19 crc kubenswrapper[4962]: I1003 13:11:19.902795 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3fb0456-394e-4041-829b-57c162966b2b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a3fb0456-394e-4041-829b-57c162966b2b\") " pod="openstack/openstack-galera-0" Oct 03 13:11:19 crc kubenswrapper[4962]: I1003 13:11:19.903155 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwc6r\" (UniqueName: \"kubernetes.io/projected/a3fb0456-394e-4041-829b-57c162966b2b-kube-api-access-dwc6r\") pod \"openstack-galera-0\" (UID: \"a3fb0456-394e-4041-829b-57c162966b2b\") " pod="openstack/openstack-galera-0" Oct 03 13:11:19 crc kubenswrapper[4962]: I1003 13:11:19.903188 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a3fb0456-394e-4041-829b-57c162966b2b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"a3fb0456-394e-4041-829b-57c162966b2b\") " pod="openstack/openstack-galera-0" Oct 03 13:11:19 crc kubenswrapper[4962]: I1003 13:11:19.903207 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a3fb0456-394e-4041-829b-57c162966b2b-config-data-default\") pod \"openstack-galera-0\" (UID: \"a3fb0456-394e-4041-829b-57c162966b2b\") " pod="openstack/openstack-galera-0" Oct 03 13:11:19 crc kubenswrapper[4962]: I1003 13:11:19.903237 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3fb0456-394e-4041-829b-57c162966b2b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"a3fb0456-394e-4041-829b-57c162966b2b\") " pod="openstack/openstack-galera-0" Oct 03 13:11:19 crc kubenswrapper[4962]: I1003 13:11:19.903264 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3fb0456-394e-4041-829b-57c162966b2b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"a3fb0456-394e-4041-829b-57c162966b2b\") " pod="openstack/openstack-galera-0" Oct 03 13:11:19 crc kubenswrapper[4962]: I1003 13:11:19.903308 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"secrets\" (UniqueName: \"kubernetes.io/secret/a3fb0456-394e-4041-829b-57c162966b2b-secrets\") pod \"openstack-galera-0\" (UID: \"a3fb0456-394e-4041-829b-57c162966b2b\") " pod="openstack/openstack-galera-0" Oct 03 13:11:19 crc kubenswrapper[4962]: I1003 13:11:19.903343 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"a3fb0456-394e-4041-829b-57c162966b2b\") " pod="openstack/openstack-galera-0" Oct 03 13:11:19 crc kubenswrapper[4962]: I1003 13:11:19.903385 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a3fb0456-394e-4041-829b-57c162966b2b-kolla-config\") pod \"openstack-galera-0\" (UID: \"a3fb0456-394e-4041-829b-57c162966b2b\") " pod="openstack/openstack-galera-0" Oct 03 13:11:19 crc kubenswrapper[4962]: I1003 13:11:19.904500 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"a3fb0456-394e-4041-829b-57c162966b2b\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/openstack-galera-0" Oct 03 13:11:19 crc kubenswrapper[4962]: I1003 13:11:19.904833 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a3fb0456-394e-4041-829b-57c162966b2b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"a3fb0456-394e-4041-829b-57c162966b2b\") " pod="openstack/openstack-galera-0" Oct 03 13:11:19 crc kubenswrapper[4962]: I1003 13:11:19.904960 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a3fb0456-394e-4041-829b-57c162966b2b-config-data-default\") pod \"openstack-galera-0\" (UID: \"a3fb0456-394e-4041-829b-57c162966b2b\") " pod="openstack/openstack-galera-0" Oct 03 13:11:19 crc kubenswrapper[4962]: I1003 13:11:19.905093 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a3fb0456-394e-4041-829b-57c162966b2b-kolla-config\") pod \"openstack-galera-0\" (UID: \"a3fb0456-394e-4041-829b-57c162966b2b\") " pod="openstack/openstack-galera-0" Oct 03 13:11:19 crc kubenswrapper[4962]: I1003 13:11:19.905299 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3fb0456-394e-4041-829b-57c162966b2b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a3fb0456-394e-4041-829b-57c162966b2b\") " pod="openstack/openstack-galera-0" Oct 03 13:11:19 crc kubenswrapper[4962]: I1003 13:11:19.908964 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3fb0456-394e-4041-829b-57c162966b2b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"a3fb0456-394e-4041-829b-57c162966b2b\") " pod="openstack/openstack-galera-0" Oct 03 13:11:19 crc kubenswrapper[4962]: I1003 13:11:19.908966 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/a3fb0456-394e-4041-829b-57c162966b2b-secrets\") pod \"openstack-galera-0\" (UID: \"a3fb0456-394e-4041-829b-57c162966b2b\") " pod="openstack/openstack-galera-0" Oct 03 13:11:19 crc kubenswrapper[4962]: I1003 13:11:19.913715 4962 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3fb0456-394e-4041-829b-57c162966b2b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"a3fb0456-394e-4041-829b-57c162966b2b\") " pod="openstack/openstack-galera-0" Oct 03 13:11:19 crc kubenswrapper[4962]: I1003 13:11:19.925420 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwc6r\" (UniqueName: \"kubernetes.io/projected/a3fb0456-394e-4041-829b-57c162966b2b-kube-api-access-dwc6r\") pod \"openstack-galera-0\" (UID: \"a3fb0456-394e-4041-829b-57c162966b2b\") " pod="openstack/openstack-galera-0" Oct 03 13:11:19 crc kubenswrapper[4962]: I1003 13:11:19.930254 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"a3fb0456-394e-4041-829b-57c162966b2b\") " pod="openstack/openstack-galera-0" Oct 03 13:11:19 crc kubenswrapper[4962]: I1003 13:11:19.991415 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.410584 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.418522 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.422858 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.423426 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.423571 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.424374 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.434565 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-ch7gn" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.513067 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/438da193-7b02-4101-a45c-9e0f83c41051-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"438da193-7b02-4101-a45c-9e0f83c41051\") " pod="openstack/openstack-cell1-galera-0" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.513147 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/438da193-7b02-4101-a45c-9e0f83c41051-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"438da193-7b02-4101-a45c-9e0f83c41051\") " pod="openstack/openstack-cell1-galera-0" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.513170 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/438da193-7b02-4101-a45c-9e0f83c41051-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"438da193-7b02-4101-a45c-9e0f83c41051\") " 
pod="openstack/openstack-cell1-galera-0" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.513209 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/438da193-7b02-4101-a45c-9e0f83c41051-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"438da193-7b02-4101-a45c-9e0f83c41051\") " pod="openstack/openstack-cell1-galera-0" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.513231 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"438da193-7b02-4101-a45c-9e0f83c41051\") " pod="openstack/openstack-cell1-galera-0" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.513247 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/438da193-7b02-4101-a45c-9e0f83c41051-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"438da193-7b02-4101-a45c-9e0f83c41051\") " pod="openstack/openstack-cell1-galera-0" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.513290 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438da193-7b02-4101-a45c-9e0f83c41051-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"438da193-7b02-4101-a45c-9e0f83c41051\") " pod="openstack/openstack-cell1-galera-0" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.513310 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/438da193-7b02-4101-a45c-9e0f83c41051-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"438da193-7b02-4101-a45c-9e0f83c41051\") " pod="openstack/openstack-cell1-galera-0" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.513329 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvjbg\" (UniqueName: \"kubernetes.io/projected/438da193-7b02-4101-a45c-9e0f83c41051-kube-api-access-vvjbg\") pod \"openstack-cell1-galera-0\" (UID: \"438da193-7b02-4101-a45c-9e0f83c41051\") " pod="openstack/openstack-cell1-galera-0" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.615170 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438da193-7b02-4101-a45c-9e0f83c41051-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"438da193-7b02-4101-a45c-9e0f83c41051\") " pod="openstack/openstack-cell1-galera-0" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.615230 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/438da193-7b02-4101-a45c-9e0f83c41051-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"438da193-7b02-4101-a45c-9e0f83c41051\") " pod="openstack/openstack-cell1-galera-0" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.615261 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvjbg\" (UniqueName: \"kubernetes.io/projected/438da193-7b02-4101-a45c-9e0f83c41051-kube-api-access-vvjbg\") pod \"openstack-cell1-galera-0\" (UID: 
\"438da193-7b02-4101-a45c-9e0f83c41051\") " pod="openstack/openstack-cell1-galera-0" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.615304 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/438da193-7b02-4101-a45c-9e0f83c41051-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"438da193-7b02-4101-a45c-9e0f83c41051\") " pod="openstack/openstack-cell1-galera-0" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.615374 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/438da193-7b02-4101-a45c-9e0f83c41051-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"438da193-7b02-4101-a45c-9e0f83c41051\") " pod="openstack/openstack-cell1-galera-0" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.615395 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/438da193-7b02-4101-a45c-9e0f83c41051-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"438da193-7b02-4101-a45c-9e0f83c41051\") " pod="openstack/openstack-cell1-galera-0" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.615414 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/438da193-7b02-4101-a45c-9e0f83c41051-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"438da193-7b02-4101-a45c-9e0f83c41051\") " pod="openstack/openstack-cell1-galera-0" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.615437 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"438da193-7b02-4101-a45c-9e0f83c41051\") " pod="openstack/openstack-cell1-galera-0" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.615502 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/438da193-7b02-4101-a45c-9e0f83c41051-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"438da193-7b02-4101-a45c-9e0f83c41051\") " pod="openstack/openstack-cell1-galera-0" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.615980 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/438da193-7b02-4101-a45c-9e0f83c41051-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"438da193-7b02-4101-a45c-9e0f83c41051\") " pod="openstack/openstack-cell1-galera-0" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.616982 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/438da193-7b02-4101-a45c-9e0f83c41051-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"438da193-7b02-4101-a45c-9e0f83c41051\") " pod="openstack/openstack-cell1-galera-0" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.616990 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/438da193-7b02-4101-a45c-9e0f83c41051-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"438da193-7b02-4101-a45c-9e0f83c41051\") " pod="openstack/openstack-cell1-galera-0" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.617145 4962 
operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"438da193-7b02-4101-a45c-9e0f83c41051\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/openstack-cell1-galera-0" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.619306 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/438da193-7b02-4101-a45c-9e0f83c41051-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"438da193-7b02-4101-a45c-9e0f83c41051\") " pod="openstack/openstack-cell1-galera-0" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.621256 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438da193-7b02-4101-a45c-9e0f83c41051-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"438da193-7b02-4101-a45c-9e0f83c41051\") " pod="openstack/openstack-cell1-galera-0" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.637856 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/438da193-7b02-4101-a45c-9e0f83c41051-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"438da193-7b02-4101-a45c-9e0f83c41051\") " pod="openstack/openstack-cell1-galera-0" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.646322 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/438da193-7b02-4101-a45c-9e0f83c41051-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"438da193-7b02-4101-a45c-9e0f83c41051\") " pod="openstack/openstack-cell1-galera-0" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.670204 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvjbg\" (UniqueName: \"kubernetes.io/projected/438da193-7b02-4101-a45c-9e0f83c41051-kube-api-access-vvjbg\") pod \"openstack-cell1-galera-0\" (UID: \"438da193-7b02-4101-a45c-9e0f83c41051\") " pod="openstack/openstack-cell1-galera-0" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.679517 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"438da193-7b02-4101-a45c-9e0f83c41051\") " pod="openstack/openstack-cell1-galera-0" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.766100 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.773009 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.774250 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.783287 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.785058 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.785244 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.785415 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-rkwq4" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.825562 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/32e6592a-d206-4931-aa99-a84e041b05e4-config-data\") pod \"memcached-0\" (UID: \"32e6592a-d206-4931-aa99-a84e041b05e4\") " pod="openstack/memcached-0" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.825631 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/32e6592a-d206-4931-aa99-a84e041b05e4-kolla-config\") pod \"memcached-0\" (UID: \"32e6592a-d206-4931-aa99-a84e041b05e4\") " pod="openstack/memcached-0" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.825658 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/32e6592a-d206-4931-aa99-a84e041b05e4-memcached-tls-certs\") pod \"memcached-0\" (UID: \"32e6592a-d206-4931-aa99-a84e041b05e4\") " pod="openstack/memcached-0" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.825728 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32e6592a-d206-4931-aa99-a84e041b05e4-combined-ca-bundle\") pod \"memcached-0\" (UID: \"32e6592a-d206-4931-aa99-a84e041b05e4\") " pod="openstack/memcached-0" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.825777 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tv7c\" (UniqueName: \"kubernetes.io/projected/32e6592a-d206-4931-aa99-a84e041b05e4-kube-api-access-8tv7c\") pod \"memcached-0\" (UID: \"32e6592a-d206-4931-aa99-a84e041b05e4\") " pod="openstack/memcached-0" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.930370 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tv7c\" (UniqueName: \"kubernetes.io/projected/32e6592a-d206-4931-aa99-a84e041b05e4-kube-api-access-8tv7c\") pod \"memcached-0\" (UID: \"32e6592a-d206-4931-aa99-a84e041b05e4\") " pod="openstack/memcached-0" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.930440 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/32e6592a-d206-4931-aa99-a84e041b05e4-config-data\") pod \"memcached-0\" (UID: \"32e6592a-d206-4931-aa99-a84e041b05e4\") " pod="openstack/memcached-0" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.930506 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/32e6592a-d206-4931-aa99-a84e041b05e4-kolla-config\") pod \"memcached-0\" (UID: \"32e6592a-d206-4931-aa99-a84e041b05e4\") " pod="openstack/memcached-0" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.930525 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/32e6592a-d206-4931-aa99-a84e041b05e4-memcached-tls-certs\") pod \"memcached-0\" (UID: \"32e6592a-d206-4931-aa99-a84e041b05e4\") " pod="openstack/memcached-0" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.930661 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32e6592a-d206-4931-aa99-a84e041b05e4-combined-ca-bundle\") pod \"memcached-0\" (UID: \"32e6592a-d206-4931-aa99-a84e041b05e4\") " pod="openstack/memcached-0" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.931838 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/32e6592a-d206-4931-aa99-a84e041b05e4-kolla-config\") pod \"memcached-0\" (UID: \"32e6592a-d206-4931-aa99-a84e041b05e4\") " pod="openstack/memcached-0" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.932082 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/32e6592a-d206-4931-aa99-a84e041b05e4-config-data\") pod \"memcached-0\" (UID: \"32e6592a-d206-4931-aa99-a84e041b05e4\") " pod="openstack/memcached-0" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.944797 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/32e6592a-d206-4931-aa99-a84e041b05e4-memcached-tls-certs\") pod \"memcached-0\" (UID: \"32e6592a-d206-4931-aa99-a84e041b05e4\") " pod="openstack/memcached-0" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.958091 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32e6592a-d206-4931-aa99-a84e041b05e4-combined-ca-bundle\") pod \"memcached-0\" (UID: \"32e6592a-d206-4931-aa99-a84e041b05e4\") " pod="openstack/memcached-0" Oct 03 13:11:20 crc kubenswrapper[4962]: I1003 13:11:20.959323 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tv7c\" (UniqueName: \"kubernetes.io/projected/32e6592a-d206-4931-aa99-a84e041b05e4-kube-api-access-8tv7c\") pod \"memcached-0\" (UID: \"32e6592a-d206-4931-aa99-a84e041b05e4\") " pod="openstack/memcached-0" Oct 03 13:11:21 crc kubenswrapper[4962]: I1003 13:11:21.111049 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 03 13:11:22 crc kubenswrapper[4962]: I1003 13:11:22.423352 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 13:11:22 crc kubenswrapper[4962]: I1003 13:11:22.425075 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 13:11:22 crc kubenswrapper[4962]: I1003 13:11:22.428833 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-dxpls" Oct 03 13:11:22 crc kubenswrapper[4962]: I1003 13:11:22.438731 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 13:11:22 crc kubenswrapper[4962]: I1003 13:11:22.569854 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzrnq\" (UniqueName: \"kubernetes.io/projected/dcbfa307-0a06-44c4-97f5-e25b6fdc50d5-kube-api-access-mzrnq\") pod \"kube-state-metrics-0\" (UID: \"dcbfa307-0a06-44c4-97f5-e25b6fdc50d5\") " pod="openstack/kube-state-metrics-0" Oct 03 13:11:22 crc kubenswrapper[4962]: I1003 13:11:22.671550 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzrnq\" (UniqueName: \"kubernetes.io/projected/dcbfa307-0a06-44c4-97f5-e25b6fdc50d5-kube-api-access-mzrnq\") pod \"kube-state-metrics-0\" (UID: \"dcbfa307-0a06-44c4-97f5-e25b6fdc50d5\") " pod="openstack/kube-state-metrics-0" Oct 03 13:11:22 crc kubenswrapper[4962]: I1003 13:11:22.689299 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzrnq\" (UniqueName: \"kubernetes.io/projected/dcbfa307-0a06-44c4-97f5-e25b6fdc50d5-kube-api-access-mzrnq\") pod \"kube-state-metrics-0\" (UID: \"dcbfa307-0a06-44c4-97f5-e25b6fdc50d5\") " pod="openstack/kube-state-metrics-0" Oct 03 13:11:22 crc kubenswrapper[4962]: I1003 13:11:22.755414 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 13:11:24 crc kubenswrapper[4962]: I1003 13:11:24.659880 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 13:11:24 crc kubenswrapper[4962]: I1003 13:11:24.660252 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 13:11:24 crc kubenswrapper[4962]: I1003 13:11:24.660307 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-46vck" Oct 03 13:11:24 crc kubenswrapper[4962]: I1003 13:11:24.661193 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ca8ebb170cb5bf8155325e7ea7c1aa3487bc412d9472e208bb48495e13806d06"} pod="openshift-machine-config-operator/machine-config-daemon-46vck" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 13:11:24 crc kubenswrapper[4962]: I1003 13:11:24.661253 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" containerID="cri-o://ca8ebb170cb5bf8155325e7ea7c1aa3487bc412d9472e208bb48495e13806d06" gracePeriod=600 Oct 03 13:11:25 crc 
kubenswrapper[4962]: I1003 13:11:25.707361 4962 generic.go:334] "Generic (PLEG): container finished" podID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerID="ca8ebb170cb5bf8155325e7ea7c1aa3487bc412d9472e208bb48495e13806d06" exitCode=0 Oct 03 13:11:25 crc kubenswrapper[4962]: I1003 13:11:25.707588 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerDied","Data":"ca8ebb170cb5bf8155325e7ea7c1aa3487bc412d9472e208bb48495e13806d06"} Oct 03 13:11:25 crc kubenswrapper[4962]: I1003 13:11:25.707685 4962 scope.go:117] "RemoveContainer" containerID="a8662442e8f36173a3b3425f41847fc665cbcd80d634980f74f9a3c41a264cea" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.001601 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-6sqdm"] Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.002831 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6sqdm" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.005127 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-v9zds" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.006440 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.007652 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.012513 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6sqdm"] Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.048023 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-wvjpm"] Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.049780 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-wvjpm" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.064551 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-wvjpm"] Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.141148 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7cb4dab0-1ffc-49d4-a229-1862a33d4caa-var-log\") pod \"ovn-controller-ovs-wvjpm\" (UID: \"7cb4dab0-1ffc-49d4-a229-1862a33d4caa\") " pod="openstack/ovn-controller-ovs-wvjpm" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.141193 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d6f62dd-0720-46b6-b0a8-497490f052a8-combined-ca-bundle\") pod \"ovn-controller-6sqdm\" (UID: \"6d6f62dd-0720-46b6-b0a8-497490f052a8\") " pod="openstack/ovn-controller-6sqdm" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.141213 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw7hs\" (UniqueName: \"kubernetes.io/projected/6d6f62dd-0720-46b6-b0a8-497490f052a8-kube-api-access-nw7hs\") pod \"ovn-controller-6sqdm\" (UID: \"6d6f62dd-0720-46b6-b0a8-497490f052a8\") " pod="openstack/ovn-controller-6sqdm" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.141248 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d6f62dd-0720-46b6-b0a8-497490f052a8-ovn-controller-tls-certs\") pod \"ovn-controller-6sqdm\" (UID: \"6d6f62dd-0720-46b6-b0a8-497490f052a8\") " pod="openstack/ovn-controller-6sqdm" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.141309 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6d6f62dd-0720-46b6-b0a8-497490f052a8-var-log-ovn\") pod \"ovn-controller-6sqdm\" (UID: \"6d6f62dd-0720-46b6-b0a8-497490f052a8\") " pod="openstack/ovn-controller-6sqdm" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.141328 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/7cb4dab0-1ffc-49d4-a229-1862a33d4caa-var-lib\") pod \"ovn-controller-ovs-wvjpm\" (UID: \"7cb4dab0-1ffc-49d4-a229-1862a33d4caa\") " pod="openstack/ovn-controller-ovs-wvjpm" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.141346 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6d6f62dd-0720-46b6-b0a8-497490f052a8-var-run\") pod \"ovn-controller-6sqdm\" (UID: \"6d6f62dd-0720-46b6-b0a8-497490f052a8\") " pod="openstack/ovn-controller-6sqdm" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.141362 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7cb4dab0-1ffc-49d4-a229-1862a33d4caa-var-run\") pod \"ovn-controller-ovs-wvjpm\" (UID: \"7cb4dab0-1ffc-49d4-a229-1862a33d4caa\") " pod="openstack/ovn-controller-ovs-wvjpm" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.141400 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/6d6f62dd-0720-46b6-b0a8-497490f052a8-scripts\") pod \"ovn-controller-6sqdm\" (UID: \"6d6f62dd-0720-46b6-b0a8-497490f052a8\") " pod="openstack/ovn-controller-6sqdm" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.141423 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/7cb4dab0-1ffc-49d4-a229-1862a33d4caa-etc-ovs\") pod \"ovn-controller-ovs-wvjpm\" (UID: \"7cb4dab0-1ffc-49d4-a229-1862a33d4caa\") " pod="openstack/ovn-controller-ovs-wvjpm" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.141445 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmq2z\" (UniqueName: \"kubernetes.io/projected/7cb4dab0-1ffc-49d4-a229-1862a33d4caa-kube-api-access-jmq2z\") pod \"ovn-controller-ovs-wvjpm\" (UID: \"7cb4dab0-1ffc-49d4-a229-1862a33d4caa\") " pod="openstack/ovn-controller-ovs-wvjpm" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.141496 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7cb4dab0-1ffc-49d4-a229-1862a33d4caa-scripts\") pod \"ovn-controller-ovs-wvjpm\" (UID: \"7cb4dab0-1ffc-49d4-a229-1862a33d4caa\") " pod="openstack/ovn-controller-ovs-wvjpm" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.141513 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6d6f62dd-0720-46b6-b0a8-497490f052a8-var-run-ovn\") pod \"ovn-controller-6sqdm\" (UID: \"6d6f62dd-0720-46b6-b0a8-497490f052a8\") " pod="openstack/ovn-controller-6sqdm" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.242943 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/7cb4dab0-1ffc-49d4-a229-1862a33d4caa-var-lib\") pod \"ovn-controller-ovs-wvjpm\" (UID: \"7cb4dab0-1ffc-49d4-a229-1862a33d4caa\") " pod="openstack/ovn-controller-ovs-wvjpm" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.242983 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6d6f62dd-0720-46b6-b0a8-497490f052a8-var-run\") pod \"ovn-controller-6sqdm\" (UID: \"6d6f62dd-0720-46b6-b0a8-497490f052a8\") " pod="openstack/ovn-controller-6sqdm" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.243002 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7cb4dab0-1ffc-49d4-a229-1862a33d4caa-var-run\") pod \"ovn-controller-ovs-wvjpm\" (UID: \"7cb4dab0-1ffc-49d4-a229-1862a33d4caa\") " pod="openstack/ovn-controller-ovs-wvjpm" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.243041 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d6f62dd-0720-46b6-b0a8-497490f052a8-scripts\") pod \"ovn-controller-6sqdm\" (UID: \"6d6f62dd-0720-46b6-b0a8-497490f052a8\") " pod="openstack/ovn-controller-6sqdm" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.243066 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/7cb4dab0-1ffc-49d4-a229-1862a33d4caa-etc-ovs\") pod \"ovn-controller-ovs-wvjpm\" (UID: \"7cb4dab0-1ffc-49d4-a229-1862a33d4caa\") " 
pod="openstack/ovn-controller-ovs-wvjpm" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.243090 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmq2z\" (UniqueName: \"kubernetes.io/projected/7cb4dab0-1ffc-49d4-a229-1862a33d4caa-kube-api-access-jmq2z\") pod \"ovn-controller-ovs-wvjpm\" (UID: \"7cb4dab0-1ffc-49d4-a229-1862a33d4caa\") " pod="openstack/ovn-controller-ovs-wvjpm" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.243113 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7cb4dab0-1ffc-49d4-a229-1862a33d4caa-scripts\") pod \"ovn-controller-ovs-wvjpm\" (UID: \"7cb4dab0-1ffc-49d4-a229-1862a33d4caa\") " pod="openstack/ovn-controller-ovs-wvjpm" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.243129 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6d6f62dd-0720-46b6-b0a8-497490f052a8-var-run-ovn\") pod \"ovn-controller-6sqdm\" (UID: \"6d6f62dd-0720-46b6-b0a8-497490f052a8\") " pod="openstack/ovn-controller-6sqdm" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.243151 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7cb4dab0-1ffc-49d4-a229-1862a33d4caa-var-log\") pod \"ovn-controller-ovs-wvjpm\" (UID: \"7cb4dab0-1ffc-49d4-a229-1862a33d4caa\") " pod="openstack/ovn-controller-ovs-wvjpm" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.243171 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d6f62dd-0720-46b6-b0a8-497490f052a8-combined-ca-bundle\") pod \"ovn-controller-6sqdm\" (UID: \"6d6f62dd-0720-46b6-b0a8-497490f052a8\") " pod="openstack/ovn-controller-6sqdm" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.243188 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw7hs\" (UniqueName: \"kubernetes.io/projected/6d6f62dd-0720-46b6-b0a8-497490f052a8-kube-api-access-nw7hs\") pod \"ovn-controller-6sqdm\" (UID: \"6d6f62dd-0720-46b6-b0a8-497490f052a8\") " pod="openstack/ovn-controller-6sqdm" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.243216 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d6f62dd-0720-46b6-b0a8-497490f052a8-ovn-controller-tls-certs\") pod \"ovn-controller-6sqdm\" (UID: \"6d6f62dd-0720-46b6-b0a8-497490f052a8\") " pod="openstack/ovn-controller-6sqdm" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.243243 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6d6f62dd-0720-46b6-b0a8-497490f052a8-var-log-ovn\") pod \"ovn-controller-6sqdm\" (UID: \"6d6f62dd-0720-46b6-b0a8-497490f052a8\") " pod="openstack/ovn-controller-6sqdm" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.243461 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6d6f62dd-0720-46b6-b0a8-497490f052a8-var-run\") pod \"ovn-controller-6sqdm\" (UID: \"6d6f62dd-0720-46b6-b0a8-497490f052a8\") " pod="openstack/ovn-controller-6sqdm" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.243573 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"var-lib\" (UniqueName: \"kubernetes.io/host-path/7cb4dab0-1ffc-49d4-a229-1862a33d4caa-var-lib\") pod \"ovn-controller-ovs-wvjpm\" (UID: \"7cb4dab0-1ffc-49d4-a229-1862a33d4caa\") " pod="openstack/ovn-controller-ovs-wvjpm" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.243598 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6d6f62dd-0720-46b6-b0a8-497490f052a8-var-log-ovn\") pod \"ovn-controller-6sqdm\" (UID: \"6d6f62dd-0720-46b6-b0a8-497490f052a8\") " pod="openstack/ovn-controller-6sqdm" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.244078 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7cb4dab0-1ffc-49d4-a229-1862a33d4caa-var-log\") pod \"ovn-controller-ovs-wvjpm\" (UID: \"7cb4dab0-1ffc-49d4-a229-1862a33d4caa\") " pod="openstack/ovn-controller-ovs-wvjpm" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.244135 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/7cb4dab0-1ffc-49d4-a229-1862a33d4caa-etc-ovs\") pod \"ovn-controller-ovs-wvjpm\" (UID: \"7cb4dab0-1ffc-49d4-a229-1862a33d4caa\") " pod="openstack/ovn-controller-ovs-wvjpm" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.244205 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7cb4dab0-1ffc-49d4-a229-1862a33d4caa-var-run\") pod \"ovn-controller-ovs-wvjpm\" (UID: \"7cb4dab0-1ffc-49d4-a229-1862a33d4caa\") " pod="openstack/ovn-controller-ovs-wvjpm" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.244535 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6d6f62dd-0720-46b6-b0a8-497490f052a8-var-run-ovn\") pod \"ovn-controller-6sqdm\" (UID: \"6d6f62dd-0720-46b6-b0a8-497490f052a8\") " pod="openstack/ovn-controller-6sqdm" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.246325 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d6f62dd-0720-46b6-b0a8-497490f052a8-scripts\") pod \"ovn-controller-6sqdm\" (UID: \"6d6f62dd-0720-46b6-b0a8-497490f052a8\") " pod="openstack/ovn-controller-6sqdm" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.248324 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7cb4dab0-1ffc-49d4-a229-1862a33d4caa-scripts\") pod \"ovn-controller-ovs-wvjpm\" (UID: \"7cb4dab0-1ffc-49d4-a229-1862a33d4caa\") " pod="openstack/ovn-controller-ovs-wvjpm" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.248818 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d6f62dd-0720-46b6-b0a8-497490f052a8-ovn-controller-tls-certs\") pod \"ovn-controller-6sqdm\" (UID: \"6d6f62dd-0720-46b6-b0a8-497490f052a8\") " pod="openstack/ovn-controller-6sqdm" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.260987 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmq2z\" (UniqueName: \"kubernetes.io/projected/7cb4dab0-1ffc-49d4-a229-1862a33d4caa-kube-api-access-jmq2z\") pod \"ovn-controller-ovs-wvjpm\" (UID: \"7cb4dab0-1ffc-49d4-a229-1862a33d4caa\") " pod="openstack/ovn-controller-ovs-wvjpm" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 
13:11:26.261346 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d6f62dd-0720-46b6-b0a8-497490f052a8-combined-ca-bundle\") pod \"ovn-controller-6sqdm\" (UID: \"6d6f62dd-0720-46b6-b0a8-497490f052a8\") " pod="openstack/ovn-controller-6sqdm" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.268270 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw7hs\" (UniqueName: \"kubernetes.io/projected/6d6f62dd-0720-46b6-b0a8-497490f052a8-kube-api-access-nw7hs\") pod \"ovn-controller-6sqdm\" (UID: \"6d6f62dd-0720-46b6-b0a8-497490f052a8\") " pod="openstack/ovn-controller-6sqdm" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.323846 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6sqdm" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.376113 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-wvjpm" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.881359 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.883332 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.886492 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.886847 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.886989 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.887099 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-cwxcj" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.887993 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.900814 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.955337 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4slpf\" (UniqueName: \"kubernetes.io/projected/2af174c7-cf23-452c-bc13-ecda2775d58d-kube-api-access-4slpf\") pod \"ovsdbserver-nb-0\" (UID: \"2af174c7-cf23-452c-bc13-ecda2775d58d\") " pod="openstack/ovsdbserver-nb-0" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.955400 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2af174c7-cf23-452c-bc13-ecda2775d58d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2af174c7-cf23-452c-bc13-ecda2775d58d\") " pod="openstack/ovsdbserver-nb-0" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.955432 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2af174c7-cf23-452c-bc13-ecda2775d58d-config\") pod \"ovsdbserver-nb-0\" (UID: \"2af174c7-cf23-452c-bc13-ecda2775d58d\") " 
pod="openstack/ovsdbserver-nb-0" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.955453 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2af174c7-cf23-452c-bc13-ecda2775d58d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2af174c7-cf23-452c-bc13-ecda2775d58d\") " pod="openstack/ovsdbserver-nb-0" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.955513 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2af174c7-cf23-452c-bc13-ecda2775d58d\") " pod="openstack/ovsdbserver-nb-0" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.955591 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2af174c7-cf23-452c-bc13-ecda2775d58d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2af174c7-cf23-452c-bc13-ecda2775d58d\") " pod="openstack/ovsdbserver-nb-0" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.955665 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2af174c7-cf23-452c-bc13-ecda2775d58d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2af174c7-cf23-452c-bc13-ecda2775d58d\") " pod="openstack/ovsdbserver-nb-0" Oct 03 13:11:26 crc kubenswrapper[4962]: I1003 13:11:26.955822 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2af174c7-cf23-452c-bc13-ecda2775d58d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2af174c7-cf23-452c-bc13-ecda2775d58d\") " pod="openstack/ovsdbserver-nb-0" Oct 03 13:11:27 crc kubenswrapper[4962]: I1003 13:11:27.057354 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2af174c7-cf23-452c-bc13-ecda2775d58d\") " pod="openstack/ovsdbserver-nb-0" Oct 03 13:11:27 crc kubenswrapper[4962]: I1003 13:11:27.057849 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2af174c7-cf23-452c-bc13-ecda2775d58d\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-nb-0" Oct 03 13:11:27 crc kubenswrapper[4962]: I1003 13:11:27.065032 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2af174c7-cf23-452c-bc13-ecda2775d58d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2af174c7-cf23-452c-bc13-ecda2775d58d\") " pod="openstack/ovsdbserver-nb-0" Oct 03 13:11:27 crc kubenswrapper[4962]: I1003 13:11:27.065209 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2af174c7-cf23-452c-bc13-ecda2775d58d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2af174c7-cf23-452c-bc13-ecda2775d58d\") " pod="openstack/ovsdbserver-nb-0" Oct 03 13:11:27 crc kubenswrapper[4962]: I1003 13:11:27.065252 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2af174c7-cf23-452c-bc13-ecda2775d58d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2af174c7-cf23-452c-bc13-ecda2775d58d\") " pod="openstack/ovsdbserver-nb-0" Oct 03 13:11:27 crc kubenswrapper[4962]: I1003 13:11:27.065306 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4slpf\" (UniqueName: \"kubernetes.io/projected/2af174c7-cf23-452c-bc13-ecda2775d58d-kube-api-access-4slpf\") pod \"ovsdbserver-nb-0\" (UID: \"2af174c7-cf23-452c-bc13-ecda2775d58d\") " pod="openstack/ovsdbserver-nb-0" Oct 03 13:11:27 crc kubenswrapper[4962]: I1003 13:11:27.065345 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2af174c7-cf23-452c-bc13-ecda2775d58d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2af174c7-cf23-452c-bc13-ecda2775d58d\") " pod="openstack/ovsdbserver-nb-0" Oct 03 13:11:27 crc kubenswrapper[4962]: I1003 13:11:27.065381 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2af174c7-cf23-452c-bc13-ecda2775d58d-config\") pod \"ovsdbserver-nb-0\" (UID: \"2af174c7-cf23-452c-bc13-ecda2775d58d\") " pod="openstack/ovsdbserver-nb-0" Oct 03 13:11:27 crc kubenswrapper[4962]: I1003 13:11:27.065406 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2af174c7-cf23-452c-bc13-ecda2775d58d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2af174c7-cf23-452c-bc13-ecda2775d58d\") " pod="openstack/ovsdbserver-nb-0" Oct 03 13:11:27 crc kubenswrapper[4962]: I1003 13:11:27.066059 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2af174c7-cf23-452c-bc13-ecda2775d58d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2af174c7-cf23-452c-bc13-ecda2775d58d\") " pod="openstack/ovsdbserver-nb-0" Oct 03 13:11:27 crc kubenswrapper[4962]: I1003 13:11:27.066511 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2af174c7-cf23-452c-bc13-ecda2775d58d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2af174c7-cf23-452c-bc13-ecda2775d58d\") " pod="openstack/ovsdbserver-nb-0" Oct 03 13:11:27 crc kubenswrapper[4962]: I1003 13:11:27.067100 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2af174c7-cf23-452c-bc13-ecda2775d58d-config\") pod \"ovsdbserver-nb-0\" (UID: \"2af174c7-cf23-452c-bc13-ecda2775d58d\") " pod="openstack/ovsdbserver-nb-0" Oct 03 13:11:27 crc kubenswrapper[4962]: I1003 13:11:27.072278 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2af174c7-cf23-452c-bc13-ecda2775d58d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2af174c7-cf23-452c-bc13-ecda2775d58d\") " pod="openstack/ovsdbserver-nb-0" Oct 03 13:11:27 crc kubenswrapper[4962]: I1003 13:11:27.073265 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2af174c7-cf23-452c-bc13-ecda2775d58d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2af174c7-cf23-452c-bc13-ecda2775d58d\") " pod="openstack/ovsdbserver-nb-0" Oct 03 13:11:27 crc kubenswrapper[4962]: I1003 13:11:27.074896 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2af174c7-cf23-452c-bc13-ecda2775d58d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2af174c7-cf23-452c-bc13-ecda2775d58d\") " pod="openstack/ovsdbserver-nb-0" Oct 03 13:11:27 crc kubenswrapper[4962]: I1003 13:11:27.084828 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4slpf\" (UniqueName: \"kubernetes.io/projected/2af174c7-cf23-452c-bc13-ecda2775d58d-kube-api-access-4slpf\") pod \"ovsdbserver-nb-0\" (UID: \"2af174c7-cf23-452c-bc13-ecda2775d58d\") " pod="openstack/ovsdbserver-nb-0" Oct 03 13:11:27 crc kubenswrapper[4962]: I1003 13:11:27.094010 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2af174c7-cf23-452c-bc13-ecda2775d58d\") " pod="openstack/ovsdbserver-nb-0" Oct 03 13:11:27 crc kubenswrapper[4962]: I1003 13:11:27.205047 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 03 13:11:27 crc kubenswrapper[4962]: I1003 13:11:27.418474 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 03 13:11:30 crc kubenswrapper[4962]: I1003 13:11:30.438174 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 03 13:11:30 crc kubenswrapper[4962]: I1003 13:11:30.444578 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 03 13:11:30 crc kubenswrapper[4962]: I1003 13:11:30.444719 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 03 13:11:30 crc kubenswrapper[4962]: I1003 13:11:30.451470 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-6mlth" Oct 03 13:11:30 crc kubenswrapper[4962]: I1003 13:11:30.451551 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 03 13:11:30 crc kubenswrapper[4962]: I1003 13:11:30.453005 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 03 13:11:30 crc kubenswrapper[4962]: I1003 13:11:30.453043 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 03 13:11:30 crc kubenswrapper[4962]: I1003 13:11:30.542846 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6313803e-1bf1-4a99-8af7-cb80c0e6321c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6313803e-1bf1-4a99-8af7-cb80c0e6321c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 13:11:30 crc kubenswrapper[4962]: I1003 13:11:30.542896 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"6313803e-1bf1-4a99-8af7-cb80c0e6321c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 13:11:30 crc kubenswrapper[4962]: I1003 13:11:30.542940 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6313803e-1bf1-4a99-8af7-cb80c0e6321c-config\") pod \"ovsdbserver-sb-0\" (UID: 
\"6313803e-1bf1-4a99-8af7-cb80c0e6321c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 13:11:30 crc kubenswrapper[4962]: I1003 13:11:30.542977 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6313803e-1bf1-4a99-8af7-cb80c0e6321c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6313803e-1bf1-4a99-8af7-cb80c0e6321c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 13:11:30 crc kubenswrapper[4962]: I1003 13:11:30.543035 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6313803e-1bf1-4a99-8af7-cb80c0e6321c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"6313803e-1bf1-4a99-8af7-cb80c0e6321c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 13:11:30 crc kubenswrapper[4962]: I1003 13:11:30.543062 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2c7fm\" (UniqueName: \"kubernetes.io/projected/6313803e-1bf1-4a99-8af7-cb80c0e6321c-kube-api-access-2c7fm\") pod \"ovsdbserver-sb-0\" (UID: \"6313803e-1bf1-4a99-8af7-cb80c0e6321c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 13:11:30 crc kubenswrapper[4962]: I1003 13:11:30.543084 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6313803e-1bf1-4a99-8af7-cb80c0e6321c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"6313803e-1bf1-4a99-8af7-cb80c0e6321c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 13:11:30 crc kubenswrapper[4962]: I1003 13:11:30.543116 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6313803e-1bf1-4a99-8af7-cb80c0e6321c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"6313803e-1bf1-4a99-8af7-cb80c0e6321c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 13:11:30 crc kubenswrapper[4962]: I1003 13:11:30.645044 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6313803e-1bf1-4a99-8af7-cb80c0e6321c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6313803e-1bf1-4a99-8af7-cb80c0e6321c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 13:11:30 crc kubenswrapper[4962]: I1003 13:11:30.645089 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"6313803e-1bf1-4a99-8af7-cb80c0e6321c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 13:11:30 crc kubenswrapper[4962]: I1003 13:11:30.645136 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6313803e-1bf1-4a99-8af7-cb80c0e6321c-config\") pod \"ovsdbserver-sb-0\" (UID: \"6313803e-1bf1-4a99-8af7-cb80c0e6321c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 13:11:30 crc kubenswrapper[4962]: I1003 13:11:30.645163 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6313803e-1bf1-4a99-8af7-cb80c0e6321c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6313803e-1bf1-4a99-8af7-cb80c0e6321c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 13:11:30 crc kubenswrapper[4962]: I1003 13:11:30.645206 4962 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6313803e-1bf1-4a99-8af7-cb80c0e6321c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"6313803e-1bf1-4a99-8af7-cb80c0e6321c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 13:11:30 crc kubenswrapper[4962]: I1003 13:11:30.645224 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2c7fm\" (UniqueName: \"kubernetes.io/projected/6313803e-1bf1-4a99-8af7-cb80c0e6321c-kube-api-access-2c7fm\") pod \"ovsdbserver-sb-0\" (UID: \"6313803e-1bf1-4a99-8af7-cb80c0e6321c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 13:11:30 crc kubenswrapper[4962]: I1003 13:11:30.645239 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6313803e-1bf1-4a99-8af7-cb80c0e6321c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"6313803e-1bf1-4a99-8af7-cb80c0e6321c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 13:11:30 crc kubenswrapper[4962]: I1003 13:11:30.645266 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6313803e-1bf1-4a99-8af7-cb80c0e6321c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"6313803e-1bf1-4a99-8af7-cb80c0e6321c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 13:11:30 crc kubenswrapper[4962]: I1003 13:11:30.645477 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"6313803e-1bf1-4a99-8af7-cb80c0e6321c\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-sb-0" Oct 03 13:11:30 crc kubenswrapper[4962]: I1003 13:11:30.647270 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6313803e-1bf1-4a99-8af7-cb80c0e6321c-config\") pod \"ovsdbserver-sb-0\" (UID: \"6313803e-1bf1-4a99-8af7-cb80c0e6321c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 13:11:30 crc kubenswrapper[4962]: I1003 13:11:30.647322 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6313803e-1bf1-4a99-8af7-cb80c0e6321c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"6313803e-1bf1-4a99-8af7-cb80c0e6321c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 13:11:30 crc kubenswrapper[4962]: I1003 13:11:30.647464 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6313803e-1bf1-4a99-8af7-cb80c0e6321c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"6313803e-1bf1-4a99-8af7-cb80c0e6321c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 13:11:30 crc kubenswrapper[4962]: I1003 13:11:30.656592 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6313803e-1bf1-4a99-8af7-cb80c0e6321c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"6313803e-1bf1-4a99-8af7-cb80c0e6321c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 13:11:30 crc kubenswrapper[4962]: I1003 13:11:30.656684 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6313803e-1bf1-4a99-8af7-cb80c0e6321c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6313803e-1bf1-4a99-8af7-cb80c0e6321c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 
13:11:30 crc kubenswrapper[4962]: I1003 13:11:30.656693 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6313803e-1bf1-4a99-8af7-cb80c0e6321c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6313803e-1bf1-4a99-8af7-cb80c0e6321c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 13:11:30 crc kubenswrapper[4962]: I1003 13:11:30.667103 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"6313803e-1bf1-4a99-8af7-cb80c0e6321c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 13:11:30 crc kubenswrapper[4962]: I1003 13:11:30.669446 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2c7fm\" (UniqueName: \"kubernetes.io/projected/6313803e-1bf1-4a99-8af7-cb80c0e6321c-kube-api-access-2c7fm\") pod \"ovsdbserver-sb-0\" (UID: \"6313803e-1bf1-4a99-8af7-cb80c0e6321c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 13:11:30 crc kubenswrapper[4962]: I1003 13:11:30.768116 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 03 13:11:33 crc kubenswrapper[4962]: W1003 13:11:33.384629 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32e6592a_d206_4931_aa99_a84e041b05e4.slice/crio-1a79561749c5439f97032e42cf2641ba2b025f913febe3f8c59942dab013b58b WatchSource:0}: Error finding container 1a79561749c5439f97032e42cf2641ba2b025f913febe3f8c59942dab013b58b: Status 404 returned error can't find the container with id 1a79561749c5439f97032e42cf2641ba2b025f913febe3f8c59942dab013b58b Oct 03 13:11:33 crc kubenswrapper[4962]: I1003 13:11:33.760408 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"32e6592a-d206-4931-aa99-a84e041b05e4","Type":"ContainerStarted","Data":"1a79561749c5439f97032e42cf2641ba2b025f913febe3f8c59942dab013b58b"} Oct 03 13:11:34 crc kubenswrapper[4962]: E1003 13:11:34.293509 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 03 13:11:34 crc kubenswrapper[4962]: E1003 13:11:34.294090 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
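The reflector.go:368 "Caches populated" lines mark kubelet's namespace-scoped watch caches becoming ready for the Secrets and ConfigMaps this pod mounts. The same cache-then-list pattern is what client-go shared informers provide; a hedged sketch of that pattern (this is not kubelet's internal wiring):

```go
package main

import (
	"context"
	"fmt"

	"k8s.io/apimachinery/pkg/labels"
	"k8s.io/client-go/informers"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/cache"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs := kubernetes.NewForConfigOrDie(cfg)

	// Watch only the "openstack" namespace, analogous to the
	// object-"openstack"/... reflectors in the log above.
	f := informers.NewSharedInformerFactoryWithOptions(cs, 0,
		informers.WithNamespace("openstack"))
	secrets := f.Core().V1().Secrets()
	informer := secrets.Informer() // must be instantiated before Start

	ctx := context.Background()
	f.Start(ctx.Done())
	// Roughly the moment the log calls "Caches populated".
	cache.WaitForCacheSync(ctx.Done(), informer.HasSynced)

	list, _ := secrets.Lister().Secrets("openstack").List(labels.Everything())
	for _, s := range list {
		fmt.Println(s.Name)
	}
}
```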
Oct 03 13:11:33 crc kubenswrapper[4962]: W1003 13:11:33.384629 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32e6592a_d206_4931_aa99_a84e041b05e4.slice/crio-1a79561749c5439f97032e42cf2641ba2b025f913febe3f8c59942dab013b58b WatchSource:0}: Error finding container 1a79561749c5439f97032e42cf2641ba2b025f913febe3f8c59942dab013b58b: Status 404 returned error can't find the container with id 1a79561749c5439f97032e42cf2641ba2b025f913febe3f8c59942dab013b58b
Oct 03 13:11:33 crc kubenswrapper[4962]: I1003 13:11:33.760408 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"32e6592a-d206-4931-aa99-a84e041b05e4","Type":"ContainerStarted","Data":"1a79561749c5439f97032e42cf2641ba2b025f913febe3f8c59942dab013b58b"}
Oct 03 13:11:34 crc kubenswrapper[4962]: E1003 13:11:34.293509 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified"
Oct 03 13:11:34 crc kubenswrapper[4962]: E1003 13:11:34.294090 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dpjbp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-kfh8n_openstack(f9733291-12c0-4d8d-9bd4-66b98b55b3ed): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Oct 03 13:11:34 crc kubenswrapper[4962]: E1003 13:11:34.295383 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-kfh8n" podUID="f9733291-12c0-4d8d-9bd4-66b98b55b3ed"
Oct 03 13:11:34 crc kubenswrapper[4962]: E1003 13:11:34.318063 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified"
Oct 03 13:11:34 crc kubenswrapper[4962]: E1003 13:11:34.318248 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9v8fg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-mzgnf_openstack(a5fef104-2524-4ce3-9404-b4bbbdf139bd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Oct 03 13:11:34 crc kubenswrapper[4962]: E1003 13:11:34.320114 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-mzgnf" podUID="a5fef104-2524-4ce3-9404-b4bbbdf139bd"
Oct 03 13:11:34 crc kubenswrapper[4962]: E1003 13:11:34.371266 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified"
Oct 03 13:11:34 crc kubenswrapper[4962]: E1003 13:11:34.371443 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cp2j4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-6lh7p_openstack(d4c317bd-17b1-43c7-b60d-04008bff1111): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Oct 03 13:11:34 crc kubenswrapper[4962]: E1003 13:11:34.373189 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-6lh7p" podUID="d4c317bd-17b1-43c7-b60d-04008bff1111"
Oct 03 13:11:34 crc kubenswrapper[4962]: E1003 13:11:34.403628 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified"
Oct 03 13:11:34 crc kubenswrapper[4962]: E1003 13:11:34.404053 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ptfjw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-r59c8_openstack(11343823-c040-4e3f-b36c-33daeb2bc53c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Oct 03 13:11:34 crc kubenswrapper[4962]: E1003 13:11:34.405975 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-r59c8" podUID="11343823-c040-4e3f-b36c-33daeb2bc53c"
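All four dnsmasq-dns pods fail identically: the CRI pull of openstack-neutron-server:current-podified is canceled while copying the image config, surfacing as ErrImagePull, and subsequent syncs below report ImagePullBackOff instead. Kubelet spaces out pull retries with an exponential backoff; the sketch below assumes the commonly cited 10-second initial delay doubling to a 5-minute cap, which should be treated as an assumption rather than a value read from this cluster:

```go
package main

import (
	"fmt"
	"time"
)

// nextDelay doubles the previous delay up to a ceiling, which is the shape
// of kubelet's image-pull backoff (initial/ceiling values assumed here).
func nextDelay(prev, initial, ceiling time.Duration) time.Duration {
	if prev == 0 {
		return initial
	}
	d := 2 * prev
	if d > ceiling {
		return ceiling
	}
	return d
}

func main() {
	var d time.Duration
	for attempt := 1; attempt <= 7; attempt++ {
		d = nextDelay(d, 10*time.Second, 5*time.Minute)
		fmt.Printf("pull attempt %d: back off %v before retrying\n", attempt, d)
	}
}
```

While a pod is inside that window, sync attempts log "Back-off pulling image" rather than issuing a new pull, which is exactly the transition visible in the entries that follow.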
Oct 03 13:11:34 crc kubenswrapper[4962]: I1003 13:11:34.765463 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Oct 03 13:11:34 crc kubenswrapper[4962]: W1003 13:11:34.784301 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod438da193_7b02_4101_a45c_9e0f83c41051.slice/crio-c9f6d0d07695070d1b3ee0e356d4ee2c319299ae22d961918c739d6bca0a5f50 WatchSource:0}: Error finding container c9f6d0d07695070d1b3ee0e356d4ee2c319299ae22d961918c739d6bca0a5f50: Status 404 returned error can't find the container with id c9f6d0d07695070d1b3ee0e356d4ee2c319299ae22d961918c739d6bca0a5f50
Oct 03 13:11:34 crc kubenswrapper[4962]: I1003 13:11:34.794757 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerStarted","Data":"a5bec38aec5fb9d5911cd2ecc57792eefe10d43cef3b74d0517cf307d832bc5c"}
Oct 03 13:11:34 crc kubenswrapper[4962]: E1003 13:11:34.796603 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-mzgnf" podUID="a5fef104-2524-4ce3-9404-b4bbbdf139bd"
Oct 03 13:11:34 crc kubenswrapper[4962]: E1003 13:11:34.797358 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-kfh8n" podUID="f9733291-12c0-4d8d-9bd4-66b98b55b3ed"
Oct 03 13:11:34 crc kubenswrapper[4962]: I1003 13:11:34.907471 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 03 13:11:35 crc kubenswrapper[4962]: I1003 13:11:35.039506 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Oct 03 13:11:35 crc kubenswrapper[4962]: I1003 13:11:35.265365 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Oct 03 13:11:35 crc kubenswrapper[4962]: I1003 13:11:35.286197 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6sqdm"]
Oct 03 13:11:35 crc kubenswrapper[4962]: I1003 13:11:35.359889 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Oct 03 13:11:35 crc kubenswrapper[4962]: W1003 13:11:35.373590 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3fb0456_394e_4041_829b_57c162966b2b.slice/crio-36d786ae84adb0baff5f0e6b75cba07aa00cfbafa3a40d1b7db2376679095f6c WatchSource:0}: Error finding container 36d786ae84adb0baff5f0e6b75cba07aa00cfbafa3a40d1b7db2376679095f6c: Status 404 returned error can't find the container with id 36d786ae84adb0baff5f0e6b75cba07aa00cfbafa3a40d1b7db2376679095f6c
Oct 03 13:11:35 crc kubenswrapper[4962]: W1003 13:11:35.376620 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d6f62dd_0720_46b6_b0a8_497490f052a8.slice/crio-2323181f6574e10b42ffdba92dc7bb0bae9b27ba783b4d5f5147922b762c9b9b WatchSource:0}: Error finding container 2323181f6574e10b42ffdba92dc7bb0bae9b27ba783b4d5f5147922b762c9b9b: Status 404 returned error can't find the container with id 2323181f6574e10b42ffdba92dc7bb0bae9b27ba783b4d5f5147922b762c9b9b
Oct 03 13:11:35 crc kubenswrapper[4962]: I1003 13:11:35.384012 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-6lh7p"
Oct 03 13:11:35 crc kubenswrapper[4962]: W1003 13:11:35.384271 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6313803e_1bf1_4a99_8af7_cb80c0e6321c.slice/crio-b9a76d889ebb424a0ed720a6e50124c449e1a1c1dd26e18bc04643ed5691d0e6 WatchSource:0}: Error finding container b9a76d889ebb424a0ed720a6e50124c449e1a1c1dd26e18bc04643ed5691d0e6: Status 404 returned error can't find the container with id b9a76d889ebb424a0ed720a6e50124c449e1a1c1dd26e18bc04643ed5691d0e6
Oct 03 13:11:35 crc kubenswrapper[4962]: I1003 13:11:35.388910 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-r59c8"
Oct 03 13:11:35 crc kubenswrapper[4962]: I1003 13:11:35.433442 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4c317bd-17b1-43c7-b60d-04008bff1111-dns-svc\") pod \"d4c317bd-17b1-43c7-b60d-04008bff1111\" (UID: \"d4c317bd-17b1-43c7-b60d-04008bff1111\") "
Oct 03 13:11:35 crc kubenswrapper[4962]: I1003 13:11:35.433522 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4c317bd-17b1-43c7-b60d-04008bff1111-config\") pod \"d4c317bd-17b1-43c7-b60d-04008bff1111\" (UID: \"d4c317bd-17b1-43c7-b60d-04008bff1111\") "
Oct 03 13:11:35 crc kubenswrapper[4962]: I1003 13:11:35.433589 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cp2j4\" (UniqueName: \"kubernetes.io/projected/d4c317bd-17b1-43c7-b60d-04008bff1111-kube-api-access-cp2j4\") pod \"d4c317bd-17b1-43c7-b60d-04008bff1111\" (UID: \"d4c317bd-17b1-43c7-b60d-04008bff1111\") "
Oct 03 13:11:35 crc kubenswrapper[4962]: I1003 13:11:35.433966 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4c317bd-17b1-43c7-b60d-04008bff1111-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d4c317bd-17b1-43c7-b60d-04008bff1111" (UID: "d4c317bd-17b1-43c7-b60d-04008bff1111"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 13:11:35 crc kubenswrapper[4962]: I1003 13:11:35.434393 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4c317bd-17b1-43c7-b60d-04008bff1111-config" (OuterVolumeSpecName: "config") pod "d4c317bd-17b1-43c7-b60d-04008bff1111" (UID: "d4c317bd-17b1-43c7-b60d-04008bff1111"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 13:11:35 crc kubenswrapper[4962]: I1003 13:11:35.441860 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4c317bd-17b1-43c7-b60d-04008bff1111-kube-api-access-cp2j4" (OuterVolumeSpecName: "kube-api-access-cp2j4") pod "d4c317bd-17b1-43c7-b60d-04008bff1111" (UID: "d4c317bd-17b1-43c7-b60d-04008bff1111"). InnerVolumeSpecName "kube-api-access-cp2j4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 13:11:35 crc kubenswrapper[4962]: I1003 13:11:35.534836 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11343823-c040-4e3f-b36c-33daeb2bc53c-config\") pod \"11343823-c040-4e3f-b36c-33daeb2bc53c\" (UID: \"11343823-c040-4e3f-b36c-33daeb2bc53c\") "
Oct 03 13:11:35 crc kubenswrapper[4962]: I1003 13:11:35.534910 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptfjw\" (UniqueName: \"kubernetes.io/projected/11343823-c040-4e3f-b36c-33daeb2bc53c-kube-api-access-ptfjw\") pod \"11343823-c040-4e3f-b36c-33daeb2bc53c\" (UID: \"11343823-c040-4e3f-b36c-33daeb2bc53c\") "
Oct 03 13:11:35 crc kubenswrapper[4962]: I1003 13:11:35.535316 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4c317bd-17b1-43c7-b60d-04008bff1111-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 03 13:11:35 crc kubenswrapper[4962]: I1003 13:11:35.535336 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4c317bd-17b1-43c7-b60d-04008bff1111-config\") on node \"crc\" DevicePath \"\""
Oct 03 13:11:35 crc kubenswrapper[4962]: I1003 13:11:35.535351 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cp2j4\" (UniqueName: \"kubernetes.io/projected/d4c317bd-17b1-43c7-b60d-04008bff1111-kube-api-access-cp2j4\") on node \"crc\" DevicePath \"\""
Oct 03 13:11:35 crc kubenswrapper[4962]: I1003 13:11:35.535418 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11343823-c040-4e3f-b36c-33daeb2bc53c-config" (OuterVolumeSpecName: "config") pod "11343823-c040-4e3f-b36c-33daeb2bc53c" (UID: "11343823-c040-4e3f-b36c-33daeb2bc53c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 13:11:35 crc kubenswrapper[4962]: I1003 13:11:35.567069 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11343823-c040-4e3f-b36c-33daeb2bc53c-kube-api-access-ptfjw" (OuterVolumeSpecName: "kube-api-access-ptfjw") pod "11343823-c040-4e3f-b36c-33daeb2bc53c" (UID: "11343823-c040-4e3f-b36c-33daeb2bc53c"). InnerVolumeSpecName "kube-api-access-ptfjw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 13:11:35 crc kubenswrapper[4962]: I1003 13:11:35.636883 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11343823-c040-4e3f-b36c-33daeb2bc53c-config\") on node \"crc\" DevicePath \"\""
Oct 03 13:11:35 crc kubenswrapper[4962]: I1003 13:11:35.636922 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptfjw\" (UniqueName: \"kubernetes.io/projected/11343823-c040-4e3f-b36c-33daeb2bc53c-kube-api-access-ptfjw\") on node \"crc\" DevicePath \"\""
Oct 03 13:11:35 crc kubenswrapper[4962]: I1003 13:11:35.804813 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"862ad9df-af58-4304-9ad5-7faba334e2d9","Type":"ContainerStarted","Data":"db6803eb436ec4541ae5c4083b22b4f72e755e3175f11d228ed92e5f3fa9bc0c"}
Oct 03 13:11:35 crc kubenswrapper[4962]: I1003 13:11:35.806314 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"dcbfa307-0a06-44c4-97f5-e25b6fdc50d5","Type":"ContainerStarted","Data":"a78da3cf57fe279ef38d3f8756941cadfb67487891686452ed9deb81f7028826"}
Oct 03 13:11:35 crc kubenswrapper[4962]: I1003 13:11:35.807482 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2af174c7-cf23-452c-bc13-ecda2775d58d","Type":"ContainerStarted","Data":"f7edaafcf712e9901b44555a1a8056a3daf3444e4bf017a4a0cf844d35572fdb"}
Oct 03 13:11:35 crc kubenswrapper[4962]: I1003 13:11:35.808715 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-6lh7p"
Oct 03 13:11:35 crc kubenswrapper[4962]: I1003 13:11:35.808733 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-6lh7p" event={"ID":"d4c317bd-17b1-43c7-b60d-04008bff1111","Type":"ContainerDied","Data":"c3ff2847f7b793afc4250272099934a30f858b5f04a366c6186067d9349c7ac5"}
Oct 03 13:11:35 crc kubenswrapper[4962]: I1003 13:11:35.811195 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"6313803e-1bf1-4a99-8af7-cb80c0e6321c","Type":"ContainerStarted","Data":"b9a76d889ebb424a0ed720a6e50124c449e1a1c1dd26e18bc04643ed5691d0e6"}
Oct 03 13:11:35 crc kubenswrapper[4962]: I1003 13:11:35.812743 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"438da193-7b02-4101-a45c-9e0f83c41051","Type":"ContainerStarted","Data":"c9f6d0d07695070d1b3ee0e356d4ee2c319299ae22d961918c739d6bca0a5f50"}
Oct 03 13:11:35 crc kubenswrapper[4962]: I1003 13:11:35.814418 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6sqdm" event={"ID":"6d6f62dd-0720-46b6-b0a8-497490f052a8","Type":"ContainerStarted","Data":"2323181f6574e10b42ffdba92dc7bb0bae9b27ba783b4d5f5147922b762c9b9b"}
Oct 03 13:11:35 crc kubenswrapper[4962]: I1003 13:11:35.816360 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a3fb0456-394e-4041-829b-57c162966b2b","Type":"ContainerStarted","Data":"36d786ae84adb0baff5f0e6b75cba07aa00cfbafa3a40d1b7db2376679095f6c"}
Oct 03 13:11:35 crc kubenswrapper[4962]: I1003 13:11:35.817932 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"221bdd26-0fec-49e5-86ec-c2aefe7a5902","Type":"ContainerStarted","Data":"d83453301b65612ecfcc0cbeb8e61c9a2152a509e6b989048522d0b4d9e6955b"}
Oct 03 13:11:35 crc kubenswrapper[4962]: I1003 13:11:35.819780 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-r59c8" event={"ID":"11343823-c040-4e3f-b36c-33daeb2bc53c","Type":"ContainerDied","Data":"69f4116a4e1bfaaaa5ec3906a047fad00cc1f889b4c0c1bf9dd3abc0a954e061"}
Oct 03 13:11:35 crc kubenswrapper[4962]: I1003 13:11:35.819834 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-r59c8"
Oct 03 13:11:35 crc kubenswrapper[4962]: I1003 13:11:35.917324 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-6lh7p"]
Oct 03 13:11:35 crc kubenswrapper[4962]: I1003 13:11:35.928133 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-6lh7p"]
Oct 03 13:11:35 crc kubenswrapper[4962]: I1003 13:11:35.942889 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-r59c8"]
Oct 03 13:11:35 crc kubenswrapper[4962]: I1003 13:11:35.958336 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-wvjpm"]
Oct 03 13:11:35 crc kubenswrapper[4962]: I1003 13:11:35.975063 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-r59c8"]
Oct 03 13:11:36 crc kubenswrapper[4962]: I1003 13:11:36.239345 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11343823-c040-4e3f-b36c-33daeb2bc53c" path="/var/lib/kubelet/pods/11343823-c040-4e3f-b36c-33daeb2bc53c/volumes"
Oct 03 13:11:36 crc kubenswrapper[4962]: I1003 13:11:36.240242 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4c317bd-17b1-43c7-b60d-04008bff1111" path="/var/lib/kubelet/pods/d4c317bd-17b1-43c7-b60d-04008bff1111/volumes"
Oct 03 13:11:36 crc kubenswrapper[4962]: I1003 13:11:36.842690 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-wvjpm" event={"ID":"7cb4dab0-1ffc-49d4-a229-1862a33d4caa","Type":"ContainerStarted","Data":"1e3bc9bb11b62b93d85f27f8127b612b45f887313c177783796dc37ea59289b2"}
Oct 03 13:11:37 crc kubenswrapper[4962]: I1003 13:11:37.853399 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"32e6592a-d206-4931-aa99-a84e041b05e4","Type":"ContainerStarted","Data":"016f210b8c91d8e9c5f93eeed437c69115693182129858f3582cdcd912c4dd79"}
Oct 03 13:11:37 crc kubenswrapper[4962]: I1003 13:11:37.853753 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Oct 03 13:11:37 crc kubenswrapper[4962]: I1003 13:11:37.869506 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=14.843394052 podStartE2EDuration="17.869487938s" podCreationTimestamp="2025-10-03 13:11:20 +0000 UTC" firstStartedPulling="2025-10-03 13:11:33.414804812 +0000 UTC m=+1301.818702667" lastFinishedPulling="2025-10-03 13:11:36.440898718 +0000 UTC m=+1304.844796553" observedRunningTime="2025-10-03 13:11:37.867957797 +0000 UTC m=+1306.271855652" watchObservedRunningTime="2025-10-03 13:11:37.869487938 +0000 UTC m=+1306.273385783"
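The startup-latency record above is internally consistent: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp (13:11:37.869487938 − 13:11:20 = 17.869487938s), and podStartSLOduration subtracts the image-pull window (lastFinishedPulling − firstStartedPulling). Using the monotonic m=+ offsets, that window is 1304.844796553 − 1301.818702667 = 3.026093886s, and 17.869487938 − 3.026093886 = 14.843394052s, exactly the logged value. A quick stdlib check with the wall-clock timestamps, which land within about 20ns of the monotonic result:

```go
package main

import (
	"fmt"
	"time"
)

const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

func mustParse(s string) time.Time {
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	// Timestamps copied from the memcached-0 startup-latency entry above.
	created := mustParse("2025-10-03 13:11:20 +0000 UTC")
	firstPull := mustParse("2025-10-03 13:11:33.414804812 +0000 UTC")
	lastPull := mustParse("2025-10-03 13:11:36.440898718 +0000 UTC")
	observed := mustParse("2025-10-03 13:11:37.869487938 +0000 UTC")

	e2e := observed.Sub(created)
	slo := e2e - lastPull.Sub(firstPull)
	fmt.Println("podStartE2EDuration:", e2e) // 17.869487938s, matching the log
	fmt.Println("podStartSLOduration:", slo) // 14.843394032s; the logged
	// 14.843394052 comes from the monotonic m=+ offsets rather than wall time
}
```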
Oct 03 13:11:42 crc kubenswrapper[4962]: I1003 13:11:42.887316 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2af174c7-cf23-452c-bc13-ecda2775d58d","Type":"ContainerStarted","Data":"cc095d6f6d8b5824a32ad688e66fd5a34700bd68a8048d5bb8c9727930860221"}
Oct 03 13:11:42 crc kubenswrapper[4962]: I1003 13:11:42.889779 4962 generic.go:334] "Generic (PLEG): container finished" podID="7cb4dab0-1ffc-49d4-a229-1862a33d4caa" containerID="66c05cecc27c12885de706425bd5259b80d56485b4bcf61ebe00d509c6d9c1b6" exitCode=0
Oct 03 13:11:42 crc kubenswrapper[4962]: I1003 13:11:42.889921 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-wvjpm" event={"ID":"7cb4dab0-1ffc-49d4-a229-1862a33d4caa","Type":"ContainerDied","Data":"66c05cecc27c12885de706425bd5259b80d56485b4bcf61ebe00d509c6d9c1b6"}
Oct 03 13:11:42 crc kubenswrapper[4962]: I1003 13:11:42.893202 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"6313803e-1bf1-4a99-8af7-cb80c0e6321c","Type":"ContainerStarted","Data":"40661f3cc34a0a76e5aee737f7eb31eca4d4e1e703df5b2bf55b0cef327c7f85"}
Oct 03 13:11:42 crc kubenswrapper[4962]: I1003 13:11:42.894927 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"438da193-7b02-4101-a45c-9e0f83c41051","Type":"ContainerStarted","Data":"d85522767c6e244b2b918d7bc1d422287f82dde9ac20d24a98eadf70a906aa02"}
Oct 03 13:11:42 crc kubenswrapper[4962]: I1003 13:11:42.897740 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"dcbfa307-0a06-44c4-97f5-e25b6fdc50d5","Type":"ContainerStarted","Data":"0f9f90f31810e9383bb8a842268d8dbcd02ea5d95f5c3bb8342e9bd294e72801"}
Oct 03 13:11:42 crc kubenswrapper[4962]: I1003 13:11:42.897856 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Oct 03 13:11:42 crc kubenswrapper[4962]: I1003 13:11:42.900292 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6sqdm" event={"ID":"6d6f62dd-0720-46b6-b0a8-497490f052a8","Type":"ContainerStarted","Data":"16fa61add98dec43a09d289dd22332e5367c8fb9a1453e95321b160b326d1d12"}
Oct 03 13:11:42 crc kubenswrapper[4962]: I1003 13:11:42.900405 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-6sqdm"
Oct 03 13:11:42 crc kubenswrapper[4962]: I1003 13:11:42.902539 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a3fb0456-394e-4041-829b-57c162966b2b","Type":"ContainerStarted","Data":"7ca803b6428733e9933bbbae21d5493fabe1c8a4711ed2063ab6748e8356642e"}
Oct 03 13:11:42 crc kubenswrapper[4962]: I1003 13:11:42.927290 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-6sqdm" podStartSLOduration=11.549712929 podStartE2EDuration="17.92727234s" podCreationTimestamp="2025-10-03 13:11:25 +0000 UTC" firstStartedPulling="2025-10-03 13:11:35.380607405 +0000 UTC m=+1303.784505240" lastFinishedPulling="2025-10-03 13:11:41.758166806 +0000 UTC m=+1310.162064651" observedRunningTime="2025-10-03 13:11:42.921030892 +0000 UTC m=+1311.324928727" watchObservedRunningTime="2025-10-03 13:11:42.92727234 +0000 UTC m=+1311.331170175"
Oct 03 13:11:42 crc kubenswrapper[4962]: I1003 13:11:42.986574 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=14.090084575 podStartE2EDuration="20.986557057s" podCreationTimestamp="2025-10-03 13:11:22 +0000 UTC" firstStartedPulling="2025-10-03 13:11:34.946145598 +0000 UTC m=+1303.350043443" lastFinishedPulling="2025-10-03 13:11:41.84261808 +0000 UTC m=+1310.246515925" observedRunningTime="2025-10-03 13:11:42.978594332 +0000 UTC m=+1311.382492167" watchObservedRunningTime="2025-10-03 13:11:42.986557057 +0000 UTC m=+1311.390454882"
Oct 03 13:11:43 crc kubenswrapper[4962]: I1003 13:11:43.915133 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-wvjpm" event={"ID":"7cb4dab0-1ffc-49d4-a229-1862a33d4caa","Type":"ContainerStarted","Data":"923838580de863a9a099f0b6d817f70b752e252d832275cca5df8e77ca46e3ac"}
Oct 03 13:11:43 crc kubenswrapper[4962]: I1003 13:11:43.915454 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-wvjpm" event={"ID":"7cb4dab0-1ffc-49d4-a229-1862a33d4caa","Type":"ContainerStarted","Data":"34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9"}
Oct 03 13:11:43 crc kubenswrapper[4962]: I1003 13:11:43.915626 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-wvjpm"
Oct 03 13:11:43 crc kubenswrapper[4962]: I1003 13:11:43.915663 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-wvjpm"
Oct 03 13:11:43 crc kubenswrapper[4962]: I1003 13:11:43.938721 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-wvjpm" podStartSLOduration=12.890457084 podStartE2EDuration="17.938698289s" podCreationTimestamp="2025-10-03 13:11:26 +0000 UTC" firstStartedPulling="2025-10-03 13:11:36.294910548 +0000 UTC m=+1304.698808383" lastFinishedPulling="2025-10-03 13:11:41.343151753 +0000 UTC m=+1309.747049588" observedRunningTime="2025-10-03 13:11:43.932673426 +0000 UTC m=+1312.336571261" watchObservedRunningTime="2025-10-03 13:11:43.938698289 +0000 UTC m=+1312.342596144"
Oct 03 13:11:46 crc kubenswrapper[4962]: I1003 13:11:46.112150 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Oct 03 13:11:47 crc kubenswrapper[4962]: I1003 13:11:47.946688 4962 generic.go:334] "Generic (PLEG): container finished" podID="438da193-7b02-4101-a45c-9e0f83c41051" containerID="d85522767c6e244b2b918d7bc1d422287f82dde9ac20d24a98eadf70a906aa02" exitCode=0
Oct 03 13:11:47 crc kubenswrapper[4962]: I1003 13:11:47.946738 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"438da193-7b02-4101-a45c-9e0f83c41051","Type":"ContainerDied","Data":"d85522767c6e244b2b918d7bc1d422287f82dde9ac20d24a98eadf70a906aa02"}
Oct 03 13:11:48 crc kubenswrapper[4962]: I1003 13:11:48.955301 4962 generic.go:334] "Generic (PLEG): container finished" podID="a3fb0456-394e-4041-829b-57c162966b2b" containerID="7ca803b6428733e9933bbbae21d5493fabe1c8a4711ed2063ab6748e8356642e" exitCode=0
Oct 03 13:11:48 crc kubenswrapper[4962]: I1003 13:11:48.955395 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a3fb0456-394e-4041-829b-57c162966b2b","Type":"ContainerDied","Data":"7ca803b6428733e9933bbbae21d5493fabe1c8a4711ed2063ab6748e8356642e"}
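The "container finished ... exitCode=0" / ContainerDied pairs for both galera pods are init containers completing normally; the database containers start at 13:11:50 below, after which readiness probes move from status="" to "ready". A hedged corev1 sketch of that pod shape (the names, images, and commands are invented for illustration and are not taken from this cluster):

```go
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func main() {
	pod := corev1.PodSpec{
		// Runs once to exit code 0 before the main container starts,
		// mirroring the ContainerDied/exitCode=0 entries above.
		InitContainers: []corev1.Container{{
			Name:    "init",                                        // hypothetical
			Image:   "example/galera:init",                         // hypothetical
			Command: []string{"/bin/sh", "-c", "render-config.sh"}, // hypothetical
		}},
		Containers: []corev1.Container{{
			Name:  "galera",                // hypothetical
			Image: "example/galera:latest", // hypothetical
			// Until this probe succeeds, kubelet logs probe="readiness" status="".
			ReadinessProbe: &corev1.Probe{
				ProbeHandler: corev1.ProbeHandler{
					Exec: &corev1.ExecAction{
						Command: []string{"/bin/sh", "-c", "mysqladmin ping"}, // hypothetical
					},
				},
				InitialDelaySeconds: 5,
				PeriodSeconds:       10,
			},
		}},
	}
	fmt.Printf("init containers: %d, main containers: %d\n",
		len(pod.InitContainers), len(pod.Containers))
}
```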
event={"ID":"6313803e-1bf1-4a99-8af7-cb80c0e6321c","Type":"ContainerStarted","Data":"d2fb6e730baadf5cce5c8d3a7e70507b921c0b75296b297850b5803cf0722b8a"} Oct 03 13:11:50 crc kubenswrapper[4962]: I1003 13:11:50.993712 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-mzgnf" event={"ID":"a5fef104-2524-4ce3-9404-b4bbbdf139bd","Type":"ContainerStarted","Data":"f20e7ffa63f452cd42edd332c4abf4b978045b34475cf44ea9b6cf173113f047"} Oct 03 13:11:50 crc kubenswrapper[4962]: I1003 13:11:50.996725 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"438da193-7b02-4101-a45c-9e0f83c41051","Type":"ContainerStarted","Data":"975c9c39028f01f58f2aea68725568502425600e3e03782630767e28394af41f"} Oct 03 13:11:50 crc kubenswrapper[4962]: I1003 13:11:50.998400 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-kfh8n" event={"ID":"f9733291-12c0-4d8d-9bd4-66b98b55b3ed","Type":"ContainerStarted","Data":"017758015dead298005e3bebf5264174708251db7e324fc90ff5a0fbd3867e37"} Oct 03 13:11:51 crc kubenswrapper[4962]: I1003 13:11:51.007746 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a3fb0456-394e-4041-829b-57c162966b2b","Type":"ContainerStarted","Data":"b8c45a2afce07209a956e277c7796fab60709250ab0d2e737f207d8293e6abac"} Oct 03 13:11:51 crc kubenswrapper[4962]: I1003 13:11:51.008328 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=10.297007164 podStartE2EDuration="26.008317872s" podCreationTimestamp="2025-10-03 13:11:25 +0000 UTC" firstStartedPulling="2025-10-03 13:11:35.058505913 +0000 UTC m=+1303.462403748" lastFinishedPulling="2025-10-03 13:11:50.769816621 +0000 UTC m=+1319.173714456" observedRunningTime="2025-10-03 13:11:51.002511736 +0000 UTC m=+1319.406409571" watchObservedRunningTime="2025-10-03 13:11:51.008317872 +0000 UTC m=+1319.412215707" Oct 03 13:11:51 crc kubenswrapper[4962]: I1003 13:11:51.043970 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=6.664135058 podStartE2EDuration="22.043955162s" podCreationTimestamp="2025-10-03 13:11:29 +0000 UTC" firstStartedPulling="2025-10-03 13:11:35.390383278 +0000 UTC m=+1303.794281113" lastFinishedPulling="2025-10-03 13:11:50.770203382 +0000 UTC m=+1319.174101217" observedRunningTime="2025-10-03 13:11:51.037085417 +0000 UTC m=+1319.440983252" watchObservedRunningTime="2025-10-03 13:11:51.043955162 +0000 UTC m=+1319.447852987" Oct 03 13:11:51 crc kubenswrapper[4962]: I1003 13:11:51.078757 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=25.107819212 podStartE2EDuration="32.078737098s" podCreationTimestamp="2025-10-03 13:11:19 +0000 UTC" firstStartedPulling="2025-10-03 13:11:34.786915241 +0000 UTC m=+1303.190813076" lastFinishedPulling="2025-10-03 13:11:41.757833127 +0000 UTC m=+1310.161730962" observedRunningTime="2025-10-03 13:11:51.073144507 +0000 UTC m=+1319.477042342" watchObservedRunningTime="2025-10-03 13:11:51.078737098 +0000 UTC m=+1319.482634933" Oct 03 13:11:51 crc kubenswrapper[4962]: I1003 13:11:51.103823 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=27.077983101 podStartE2EDuration="33.103807263s" podCreationTimestamp="2025-10-03 13:11:18 +0000 UTC" 
firstStartedPulling="2025-10-03 13:11:35.378271982 +0000 UTC m=+1303.782169817" lastFinishedPulling="2025-10-03 13:11:41.404096144 +0000 UTC m=+1309.807993979" observedRunningTime="2025-10-03 13:11:51.10220287 +0000 UTC m=+1319.506100705" watchObservedRunningTime="2025-10-03 13:11:51.103807263 +0000 UTC m=+1319.507705088" Oct 03 13:11:51 crc kubenswrapper[4962]: I1003 13:11:51.206077 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 03 13:11:51 crc kubenswrapper[4962]: I1003 13:11:51.238292 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 03 13:11:51 crc kubenswrapper[4962]: I1003 13:11:51.768885 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 03 13:11:51 crc kubenswrapper[4962]: I1003 13:11:51.811962 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.016324 4962 generic.go:334] "Generic (PLEG): container finished" podID="f9733291-12c0-4d8d-9bd4-66b98b55b3ed" containerID="017758015dead298005e3bebf5264174708251db7e324fc90ff5a0fbd3867e37" exitCode=0 Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.016403 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-kfh8n" event={"ID":"f9733291-12c0-4d8d-9bd4-66b98b55b3ed","Type":"ContainerDied","Data":"017758015dead298005e3bebf5264174708251db7e324fc90ff5a0fbd3867e37"} Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.020044 4962 generic.go:334] "Generic (PLEG): container finished" podID="a5fef104-2524-4ce3-9404-b4bbbdf139bd" containerID="f20e7ffa63f452cd42edd332c4abf4b978045b34475cf44ea9b6cf173113f047" exitCode=0 Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.020146 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-mzgnf" event={"ID":"a5fef104-2524-4ce3-9404-b4bbbdf139bd","Type":"ContainerDied","Data":"f20e7ffa63f452cd42edd332c4abf4b978045b34475cf44ea9b6cf173113f047"} Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.020194 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-mzgnf" event={"ID":"a5fef104-2524-4ce3-9404-b4bbbdf139bd","Type":"ContainerStarted","Data":"1eac4b6e086b001f57628da9764b27e29f94995f0ab101f2246e0c6f73bdd76d"} Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.020512 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.020782 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.064894 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-mzgnf" podStartSLOduration=2.740495332 podStartE2EDuration="36.064875626s" podCreationTimestamp="2025-10-03 13:11:16 +0000 UTC" firstStartedPulling="2025-10-03 13:11:17.450071212 +0000 UTC m=+1285.853969047" lastFinishedPulling="2025-10-03 13:11:50.774451516 +0000 UTC m=+1319.178349341" observedRunningTime="2025-10-03 13:11:52.059417059 +0000 UTC m=+1320.463314904" watchObservedRunningTime="2025-10-03 13:11:52.064875626 +0000 UTC m=+1320.468773461" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.075293 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/ovsdbserver-sb-0" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.077777 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.348813 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-kfh8n"] Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.409250 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-9h8s9"] Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.410554 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-9h8s9" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.412016 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.417648 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-l2mcn"] Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.418903 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-l2mcn" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.421077 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.431283 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-9h8s9"] Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.453231 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-l2mcn"] Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.529433 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/67b77bc9-27ae-4994-86c2-614e48ad33c6-ovs-rundir\") pod \"ovn-controller-metrics-9h8s9\" (UID: \"67b77bc9-27ae-4994-86c2-614e48ad33c6\") " pod="openstack/ovn-controller-metrics-9h8s9" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.529777 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/688c3104-6239-4ef4-923d-b169172a07e3-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-l2mcn\" (UID: \"688c3104-6239-4ef4-923d-b169172a07e3\") " pod="openstack/dnsmasq-dns-7f896c8c65-l2mcn" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.529804 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/688c3104-6239-4ef4-923d-b169172a07e3-config\") pod \"dnsmasq-dns-7f896c8c65-l2mcn\" (UID: \"688c3104-6239-4ef4-923d-b169172a07e3\") " pod="openstack/dnsmasq-dns-7f896c8c65-l2mcn" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.529835 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67b77bc9-27ae-4994-86c2-614e48ad33c6-config\") pod \"ovn-controller-metrics-9h8s9\" (UID: \"67b77bc9-27ae-4994-86c2-614e48ad33c6\") " pod="openstack/ovn-controller-metrics-9h8s9" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.529859 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jldg6\" (UniqueName: 
\"kubernetes.io/projected/67b77bc9-27ae-4994-86c2-614e48ad33c6-kube-api-access-jldg6\") pod \"ovn-controller-metrics-9h8s9\" (UID: \"67b77bc9-27ae-4994-86c2-614e48ad33c6\") " pod="openstack/ovn-controller-metrics-9h8s9" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.529913 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/67b77bc9-27ae-4994-86c2-614e48ad33c6-ovn-rundir\") pod \"ovn-controller-metrics-9h8s9\" (UID: \"67b77bc9-27ae-4994-86c2-614e48ad33c6\") " pod="openstack/ovn-controller-metrics-9h8s9" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.529935 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/67b77bc9-27ae-4994-86c2-614e48ad33c6-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-9h8s9\" (UID: \"67b77bc9-27ae-4994-86c2-614e48ad33c6\") " pod="openstack/ovn-controller-metrics-9h8s9" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.529957 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4j4kl\" (UniqueName: \"kubernetes.io/projected/688c3104-6239-4ef4-923d-b169172a07e3-kube-api-access-4j4kl\") pod \"dnsmasq-dns-7f896c8c65-l2mcn\" (UID: \"688c3104-6239-4ef4-923d-b169172a07e3\") " pod="openstack/dnsmasq-dns-7f896c8c65-l2mcn" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.529973 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67b77bc9-27ae-4994-86c2-614e48ad33c6-combined-ca-bundle\") pod \"ovn-controller-metrics-9h8s9\" (UID: \"67b77bc9-27ae-4994-86c2-614e48ad33c6\") " pod="openstack/ovn-controller-metrics-9h8s9" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.529994 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/688c3104-6239-4ef4-923d-b169172a07e3-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-l2mcn\" (UID: \"688c3104-6239-4ef4-923d-b169172a07e3\") " pod="openstack/dnsmasq-dns-7f896c8c65-l2mcn" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.541110 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-mzgnf"] Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.570279 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.572042 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.577787 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.577818 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.577836 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-5vqpt" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.578009 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.588689 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.600987 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-mwsmg"] Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.602347 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-mwsmg" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.605133 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.609700 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-mwsmg"] Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.634426 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/67b77bc9-27ae-4994-86c2-614e48ad33c6-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-9h8s9\" (UID: \"67b77bc9-27ae-4994-86c2-614e48ad33c6\") " pod="openstack/ovn-controller-metrics-9h8s9" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.634481 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c876ef6-c8ab-44d1-9ba4-07f0b5e07695-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"5c876ef6-c8ab-44d1-9ba4-07f0b5e07695\") " pod="openstack/ovn-northd-0" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.634506 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4j4kl\" (UniqueName: \"kubernetes.io/projected/688c3104-6239-4ef4-923d-b169172a07e3-kube-api-access-4j4kl\") pod \"dnsmasq-dns-7f896c8c65-l2mcn\" (UID: \"688c3104-6239-4ef4-923d-b169172a07e3\") " pod="openstack/dnsmasq-dns-7f896c8c65-l2mcn" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.634527 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67b77bc9-27ae-4994-86c2-614e48ad33c6-combined-ca-bundle\") pod \"ovn-controller-metrics-9h8s9\" (UID: \"67b77bc9-27ae-4994-86c2-614e48ad33c6\") " pod="openstack/ovn-controller-metrics-9h8s9" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.634542 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c876ef6-c8ab-44d1-9ba4-07f0b5e07695-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"5c876ef6-c8ab-44d1-9ba4-07f0b5e07695\") " pod="openstack/ovn-northd-0" Oct 03 13:11:52 crc 
kubenswrapper[4962]: I1003 13:11:52.634564 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/688c3104-6239-4ef4-923d-b169172a07e3-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-l2mcn\" (UID: \"688c3104-6239-4ef4-923d-b169172a07e3\") " pod="openstack/dnsmasq-dns-7f896c8c65-l2mcn" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.634599 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/67b77bc9-27ae-4994-86c2-614e48ad33c6-ovs-rundir\") pod \"ovn-controller-metrics-9h8s9\" (UID: \"67b77bc9-27ae-4994-86c2-614e48ad33c6\") " pod="openstack/ovn-controller-metrics-9h8s9" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.634682 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/688c3104-6239-4ef4-923d-b169172a07e3-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-l2mcn\" (UID: \"688c3104-6239-4ef4-923d-b169172a07e3\") " pod="openstack/dnsmasq-dns-7f896c8c65-l2mcn" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.634703 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/688c3104-6239-4ef4-923d-b169172a07e3-config\") pod \"dnsmasq-dns-7f896c8c65-l2mcn\" (UID: \"688c3104-6239-4ef4-923d-b169172a07e3\") " pod="openstack/dnsmasq-dns-7f896c8c65-l2mcn" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.634721 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5c876ef6-c8ab-44d1-9ba4-07f0b5e07695-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"5c876ef6-c8ab-44d1-9ba4-07f0b5e07695\") " pod="openstack/ovn-northd-0" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.634748 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67b77bc9-27ae-4994-86c2-614e48ad33c6-config\") pod \"ovn-controller-metrics-9h8s9\" (UID: \"67b77bc9-27ae-4994-86c2-614e48ad33c6\") " pod="openstack/ovn-controller-metrics-9h8s9" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.634770 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jldg6\" (UniqueName: \"kubernetes.io/projected/67b77bc9-27ae-4994-86c2-614e48ad33c6-kube-api-access-jldg6\") pod \"ovn-controller-metrics-9h8s9\" (UID: \"67b77bc9-27ae-4994-86c2-614e48ad33c6\") " pod="openstack/ovn-controller-metrics-9h8s9" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.634794 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c876ef6-c8ab-44d1-9ba4-07f0b5e07695-config\") pod \"ovn-northd-0\" (UID: \"5c876ef6-c8ab-44d1-9ba4-07f0b5e07695\") " pod="openstack/ovn-northd-0" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.634810 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c876ef6-c8ab-44d1-9ba4-07f0b5e07695-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"5c876ef6-c8ab-44d1-9ba4-07f0b5e07695\") " pod="openstack/ovn-northd-0" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.634865 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-92txr\" (UniqueName: \"kubernetes.io/projected/5c876ef6-c8ab-44d1-9ba4-07f0b5e07695-kube-api-access-92txr\") pod \"ovn-northd-0\" (UID: \"5c876ef6-c8ab-44d1-9ba4-07f0b5e07695\") " pod="openstack/ovn-northd-0" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.634881 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5c876ef6-c8ab-44d1-9ba4-07f0b5e07695-scripts\") pod \"ovn-northd-0\" (UID: \"5c876ef6-c8ab-44d1-9ba4-07f0b5e07695\") " pod="openstack/ovn-northd-0" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.634910 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/67b77bc9-27ae-4994-86c2-614e48ad33c6-ovn-rundir\") pod \"ovn-controller-metrics-9h8s9\" (UID: \"67b77bc9-27ae-4994-86c2-614e48ad33c6\") " pod="openstack/ovn-controller-metrics-9h8s9" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.635200 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/67b77bc9-27ae-4994-86c2-614e48ad33c6-ovn-rundir\") pod \"ovn-controller-metrics-9h8s9\" (UID: \"67b77bc9-27ae-4994-86c2-614e48ad33c6\") " pod="openstack/ovn-controller-metrics-9h8s9" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.637842 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/688c3104-6239-4ef4-923d-b169172a07e3-config\") pod \"dnsmasq-dns-7f896c8c65-l2mcn\" (UID: \"688c3104-6239-4ef4-923d-b169172a07e3\") " pod="openstack/dnsmasq-dns-7f896c8c65-l2mcn" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.640417 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67b77bc9-27ae-4994-86c2-614e48ad33c6-config\") pod \"ovn-controller-metrics-9h8s9\" (UID: \"67b77bc9-27ae-4994-86c2-614e48ad33c6\") " pod="openstack/ovn-controller-metrics-9h8s9" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.641084 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/67b77bc9-27ae-4994-86c2-614e48ad33c6-ovs-rundir\") pod \"ovn-controller-metrics-9h8s9\" (UID: \"67b77bc9-27ae-4994-86c2-614e48ad33c6\") " pod="openstack/ovn-controller-metrics-9h8s9" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.641380 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/688c3104-6239-4ef4-923d-b169172a07e3-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-l2mcn\" (UID: \"688c3104-6239-4ef4-923d-b169172a07e3\") " pod="openstack/dnsmasq-dns-7f896c8c65-l2mcn" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.641752 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/688c3104-6239-4ef4-923d-b169172a07e3-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-l2mcn\" (UID: \"688c3104-6239-4ef4-923d-b169172a07e3\") " pod="openstack/dnsmasq-dns-7f896c8c65-l2mcn" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.642422 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67b77bc9-27ae-4994-86c2-614e48ad33c6-combined-ca-bundle\") pod \"ovn-controller-metrics-9h8s9\" (UID: \"67b77bc9-27ae-4994-86c2-614e48ad33c6\") " 
pod="openstack/ovn-controller-metrics-9h8s9" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.657423 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4j4kl\" (UniqueName: \"kubernetes.io/projected/688c3104-6239-4ef4-923d-b169172a07e3-kube-api-access-4j4kl\") pod \"dnsmasq-dns-7f896c8c65-l2mcn\" (UID: \"688c3104-6239-4ef4-923d-b169172a07e3\") " pod="openstack/dnsmasq-dns-7f896c8c65-l2mcn" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.661575 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jldg6\" (UniqueName: \"kubernetes.io/projected/67b77bc9-27ae-4994-86c2-614e48ad33c6-kube-api-access-jldg6\") pod \"ovn-controller-metrics-9h8s9\" (UID: \"67b77bc9-27ae-4994-86c2-614e48ad33c6\") " pod="openstack/ovn-controller-metrics-9h8s9" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.662159 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/67b77bc9-27ae-4994-86c2-614e48ad33c6-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-9h8s9\" (UID: \"67b77bc9-27ae-4994-86c2-614e48ad33c6\") " pod="openstack/ovn-controller-metrics-9h8s9" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.731854 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-9h8s9" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.738539 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cde9ffcd-ddb9-4114-bf06-11c230347438-config\") pod \"dnsmasq-dns-86db49b7ff-mwsmg\" (UID: \"cde9ffcd-ddb9-4114-bf06-11c230347438\") " pod="openstack/dnsmasq-dns-86db49b7ff-mwsmg" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.738610 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5c876ef6-c8ab-44d1-9ba4-07f0b5e07695-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"5c876ef6-c8ab-44d1-9ba4-07f0b5e07695\") " pod="openstack/ovn-northd-0" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.738742 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c876ef6-c8ab-44d1-9ba4-07f0b5e07695-config\") pod \"ovn-northd-0\" (UID: \"5c876ef6-c8ab-44d1-9ba4-07f0b5e07695\") " pod="openstack/ovn-northd-0" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.738761 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c876ef6-c8ab-44d1-9ba4-07f0b5e07695-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"5c876ef6-c8ab-44d1-9ba4-07f0b5e07695\") " pod="openstack/ovn-northd-0" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.738791 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92txr\" (UniqueName: \"kubernetes.io/projected/5c876ef6-c8ab-44d1-9ba4-07f0b5e07695-kube-api-access-92txr\") pod \"ovn-northd-0\" (UID: \"5c876ef6-c8ab-44d1-9ba4-07f0b5e07695\") " pod="openstack/ovn-northd-0" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.738810 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5c876ef6-c8ab-44d1-9ba4-07f0b5e07695-scripts\") pod \"ovn-northd-0\" (UID: \"5c876ef6-c8ab-44d1-9ba4-07f0b5e07695\") " 
pod="openstack/ovn-northd-0" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.738839 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cde9ffcd-ddb9-4114-bf06-11c230347438-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-mwsmg\" (UID: \"cde9ffcd-ddb9-4114-bf06-11c230347438\") " pod="openstack/dnsmasq-dns-86db49b7ff-mwsmg" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.738865 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cde9ffcd-ddb9-4114-bf06-11c230347438-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-mwsmg\" (UID: \"cde9ffcd-ddb9-4114-bf06-11c230347438\") " pod="openstack/dnsmasq-dns-86db49b7ff-mwsmg" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.738895 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c876ef6-c8ab-44d1-9ba4-07f0b5e07695-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"5c876ef6-c8ab-44d1-9ba4-07f0b5e07695\") " pod="openstack/ovn-northd-0" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.738951 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c876ef6-c8ab-44d1-9ba4-07f0b5e07695-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"5c876ef6-c8ab-44d1-9ba4-07f0b5e07695\") " pod="openstack/ovn-northd-0" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.738978 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dphb\" (UniqueName: \"kubernetes.io/projected/cde9ffcd-ddb9-4114-bf06-11c230347438-kube-api-access-9dphb\") pod \"dnsmasq-dns-86db49b7ff-mwsmg\" (UID: \"cde9ffcd-ddb9-4114-bf06-11c230347438\") " pod="openstack/dnsmasq-dns-86db49b7ff-mwsmg" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.739005 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cde9ffcd-ddb9-4114-bf06-11c230347438-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-mwsmg\" (UID: \"cde9ffcd-ddb9-4114-bf06-11c230347438\") " pod="openstack/dnsmasq-dns-86db49b7ff-mwsmg" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.739513 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5c876ef6-c8ab-44d1-9ba4-07f0b5e07695-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"5c876ef6-c8ab-44d1-9ba4-07f0b5e07695\") " pod="openstack/ovn-northd-0" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.740490 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c876ef6-c8ab-44d1-9ba4-07f0b5e07695-config\") pod \"ovn-northd-0\" (UID: \"5c876ef6-c8ab-44d1-9ba4-07f0b5e07695\") " pod="openstack/ovn-northd-0" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.744678 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5c876ef6-c8ab-44d1-9ba4-07f0b5e07695-scripts\") pod \"ovn-northd-0\" (UID: \"5c876ef6-c8ab-44d1-9ba4-07f0b5e07695\") " pod="openstack/ovn-northd-0" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.750279 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-l2mcn" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.752323 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c876ef6-c8ab-44d1-9ba4-07f0b5e07695-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"5c876ef6-c8ab-44d1-9ba4-07f0b5e07695\") " pod="openstack/ovn-northd-0" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.758347 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c876ef6-c8ab-44d1-9ba4-07f0b5e07695-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"5c876ef6-c8ab-44d1-9ba4-07f0b5e07695\") " pod="openstack/ovn-northd-0" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.760335 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c876ef6-c8ab-44d1-9ba4-07f0b5e07695-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"5c876ef6-c8ab-44d1-9ba4-07f0b5e07695\") " pod="openstack/ovn-northd-0" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.762227 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.775190 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92txr\" (UniqueName: \"kubernetes.io/projected/5c876ef6-c8ab-44d1-9ba4-07f0b5e07695-kube-api-access-92txr\") pod \"ovn-northd-0\" (UID: \"5c876ef6-c8ab-44d1-9ba4-07f0b5e07695\") " pod="openstack/ovn-northd-0" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.840881 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dphb\" (UniqueName: \"kubernetes.io/projected/cde9ffcd-ddb9-4114-bf06-11c230347438-kube-api-access-9dphb\") pod \"dnsmasq-dns-86db49b7ff-mwsmg\" (UID: \"cde9ffcd-ddb9-4114-bf06-11c230347438\") " pod="openstack/dnsmasq-dns-86db49b7ff-mwsmg" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.841182 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cde9ffcd-ddb9-4114-bf06-11c230347438-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-mwsmg\" (UID: \"cde9ffcd-ddb9-4114-bf06-11c230347438\") " pod="openstack/dnsmasq-dns-86db49b7ff-mwsmg" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.841211 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cde9ffcd-ddb9-4114-bf06-11c230347438-config\") pod \"dnsmasq-dns-86db49b7ff-mwsmg\" (UID: \"cde9ffcd-ddb9-4114-bf06-11c230347438\") " pod="openstack/dnsmasq-dns-86db49b7ff-mwsmg" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.841292 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cde9ffcd-ddb9-4114-bf06-11c230347438-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-mwsmg\" (UID: \"cde9ffcd-ddb9-4114-bf06-11c230347438\") " pod="openstack/dnsmasq-dns-86db49b7ff-mwsmg" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.841313 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cde9ffcd-ddb9-4114-bf06-11c230347438-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-mwsmg\" (UID: 
\"cde9ffcd-ddb9-4114-bf06-11c230347438\") " pod="openstack/dnsmasq-dns-86db49b7ff-mwsmg" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.842021 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cde9ffcd-ddb9-4114-bf06-11c230347438-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-mwsmg\" (UID: \"cde9ffcd-ddb9-4114-bf06-11c230347438\") " pod="openstack/dnsmasq-dns-86db49b7ff-mwsmg" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.842377 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cde9ffcd-ddb9-4114-bf06-11c230347438-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-mwsmg\" (UID: \"cde9ffcd-ddb9-4114-bf06-11c230347438\") " pod="openstack/dnsmasq-dns-86db49b7ff-mwsmg" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.842516 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cde9ffcd-ddb9-4114-bf06-11c230347438-config\") pod \"dnsmasq-dns-86db49b7ff-mwsmg\" (UID: \"cde9ffcd-ddb9-4114-bf06-11c230347438\") " pod="openstack/dnsmasq-dns-86db49b7ff-mwsmg" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.843041 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cde9ffcd-ddb9-4114-bf06-11c230347438-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-mwsmg\" (UID: \"cde9ffcd-ddb9-4114-bf06-11c230347438\") " pod="openstack/dnsmasq-dns-86db49b7ff-mwsmg" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.870439 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dphb\" (UniqueName: \"kubernetes.io/projected/cde9ffcd-ddb9-4114-bf06-11c230347438-kube-api-access-9dphb\") pod \"dnsmasq-dns-86db49b7ff-mwsmg\" (UID: \"cde9ffcd-ddb9-4114-bf06-11c230347438\") " pod="openstack/dnsmasq-dns-86db49b7ff-mwsmg" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.881865 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-l2mcn"] Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.898084 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.905692 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-8rz95"] Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.910506 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-8rz95" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.922100 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-8rz95"] Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.926289 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-mwsmg" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.957141 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed812fda-dbe5-48ff-bea4-3795a87d716f-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-8rz95\" (UID: \"ed812fda-dbe5-48ff-bea4-3795a87d716f\") " pod="openstack/dnsmasq-dns-698758b865-8rz95" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.957188 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed812fda-dbe5-48ff-bea4-3795a87d716f-config\") pod \"dnsmasq-dns-698758b865-8rz95\" (UID: \"ed812fda-dbe5-48ff-bea4-3795a87d716f\") " pod="openstack/dnsmasq-dns-698758b865-8rz95" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.957258 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed812fda-dbe5-48ff-bea4-3795a87d716f-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-8rz95\" (UID: \"ed812fda-dbe5-48ff-bea4-3795a87d716f\") " pod="openstack/dnsmasq-dns-698758b865-8rz95" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.957289 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbq6m\" (UniqueName: \"kubernetes.io/projected/ed812fda-dbe5-48ff-bea4-3795a87d716f-kube-api-access-kbq6m\") pod \"dnsmasq-dns-698758b865-8rz95\" (UID: \"ed812fda-dbe5-48ff-bea4-3795a87d716f\") " pod="openstack/dnsmasq-dns-698758b865-8rz95" Oct 03 13:11:52 crc kubenswrapper[4962]: I1003 13:11:52.957324 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed812fda-dbe5-48ff-bea4-3795a87d716f-dns-svc\") pod \"dnsmasq-dns-698758b865-8rz95\" (UID: \"ed812fda-dbe5-48ff-bea4-3795a87d716f\") " pod="openstack/dnsmasq-dns-698758b865-8rz95" Oct 03 13:11:53 crc kubenswrapper[4962]: I1003 13:11:53.052100 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-kfh8n" event={"ID":"f9733291-12c0-4d8d-9bd4-66b98b55b3ed","Type":"ContainerStarted","Data":"5e2d60f774b4e55c3f7eb0040c9616b0ebd29d3f36ee2922c3cbd19ddc0c9338"} Oct 03 13:11:53 crc kubenswrapper[4962]: I1003 13:11:53.052211 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-kfh8n" podUID="f9733291-12c0-4d8d-9bd4-66b98b55b3ed" containerName="dnsmasq-dns" containerID="cri-o://5e2d60f774b4e55c3f7eb0040c9616b0ebd29d3f36ee2922c3cbd19ddc0c9338" gracePeriod=10 Oct 03 13:11:53 crc kubenswrapper[4962]: I1003 13:11:53.052473 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-kfh8n" Oct 03 13:11:53 crc kubenswrapper[4962]: I1003 13:11:53.052771 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-mzgnf" podUID="a5fef104-2524-4ce3-9404-b4bbbdf139bd" containerName="dnsmasq-dns" containerID="cri-o://1eac4b6e086b001f57628da9764b27e29f94995f0ab101f2246e0c6f73bdd76d" gracePeriod=10 Oct 03 13:11:53 crc kubenswrapper[4962]: I1003 13:11:53.052826 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-mzgnf" Oct 03 13:11:53 crc kubenswrapper[4962]: I1003 13:11:53.058142 4962 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed812fda-dbe5-48ff-bea4-3795a87d716f-dns-svc\") pod \"dnsmasq-dns-698758b865-8rz95\" (UID: \"ed812fda-dbe5-48ff-bea4-3795a87d716f\") " pod="openstack/dnsmasq-dns-698758b865-8rz95" Oct 03 13:11:53 crc kubenswrapper[4962]: I1003 13:11:53.058408 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed812fda-dbe5-48ff-bea4-3795a87d716f-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-8rz95\" (UID: \"ed812fda-dbe5-48ff-bea4-3795a87d716f\") " pod="openstack/dnsmasq-dns-698758b865-8rz95" Oct 03 13:11:53 crc kubenswrapper[4962]: I1003 13:11:53.058436 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed812fda-dbe5-48ff-bea4-3795a87d716f-config\") pod \"dnsmasq-dns-698758b865-8rz95\" (UID: \"ed812fda-dbe5-48ff-bea4-3795a87d716f\") " pod="openstack/dnsmasq-dns-698758b865-8rz95" Oct 03 13:11:53 crc kubenswrapper[4962]: I1003 13:11:53.058505 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed812fda-dbe5-48ff-bea4-3795a87d716f-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-8rz95\" (UID: \"ed812fda-dbe5-48ff-bea4-3795a87d716f\") " pod="openstack/dnsmasq-dns-698758b865-8rz95" Oct 03 13:11:53 crc kubenswrapper[4962]: I1003 13:11:53.058536 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbq6m\" (UniqueName: \"kubernetes.io/projected/ed812fda-dbe5-48ff-bea4-3795a87d716f-kube-api-access-kbq6m\") pod \"dnsmasq-dns-698758b865-8rz95\" (UID: \"ed812fda-dbe5-48ff-bea4-3795a87d716f\") " pod="openstack/dnsmasq-dns-698758b865-8rz95" Oct 03 13:11:53 crc kubenswrapper[4962]: I1003 13:11:53.059617 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed812fda-dbe5-48ff-bea4-3795a87d716f-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-8rz95\" (UID: \"ed812fda-dbe5-48ff-bea4-3795a87d716f\") " pod="openstack/dnsmasq-dns-698758b865-8rz95" Oct 03 13:11:53 crc kubenswrapper[4962]: I1003 13:11:53.060155 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed812fda-dbe5-48ff-bea4-3795a87d716f-dns-svc\") pod \"dnsmasq-dns-698758b865-8rz95\" (UID: \"ed812fda-dbe5-48ff-bea4-3795a87d716f\") " pod="openstack/dnsmasq-dns-698758b865-8rz95" Oct 03 13:11:53 crc kubenswrapper[4962]: I1003 13:11:53.060452 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed812fda-dbe5-48ff-bea4-3795a87d716f-config\") pod \"dnsmasq-dns-698758b865-8rz95\" (UID: \"ed812fda-dbe5-48ff-bea4-3795a87d716f\") " pod="openstack/dnsmasq-dns-698758b865-8rz95" Oct 03 13:11:53 crc kubenswrapper[4962]: I1003 13:11:53.060793 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed812fda-dbe5-48ff-bea4-3795a87d716f-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-8rz95\" (UID: \"ed812fda-dbe5-48ff-bea4-3795a87d716f\") " pod="openstack/dnsmasq-dns-698758b865-8rz95" Oct 03 13:11:53 crc kubenswrapper[4962]: I1003 13:11:53.078291 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-kfh8n" 
podStartSLOduration=3.989635322 podStartE2EDuration="37.078271309s" podCreationTimestamp="2025-10-03 13:11:16 +0000 UTC" firstStartedPulling="2025-10-03 13:11:17.681890373 +0000 UTC m=+1286.085788228" lastFinishedPulling="2025-10-03 13:11:50.77052639 +0000 UTC m=+1319.174424215" observedRunningTime="2025-10-03 13:11:53.075943076 +0000 UTC m=+1321.479840911" watchObservedRunningTime="2025-10-03 13:11:53.078271309 +0000 UTC m=+1321.482169144" Oct 03 13:11:53 crc kubenswrapper[4962]: I1003 13:11:53.115830 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbq6m\" (UniqueName: \"kubernetes.io/projected/ed812fda-dbe5-48ff-bea4-3795a87d716f-kube-api-access-kbq6m\") pod \"dnsmasq-dns-698758b865-8rz95\" (UID: \"ed812fda-dbe5-48ff-bea4-3795a87d716f\") " pod="openstack/dnsmasq-dns-698758b865-8rz95" Oct 03 13:11:53 crc kubenswrapper[4962]: I1003 13:11:53.246085 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-8rz95" Oct 03 13:11:53 crc kubenswrapper[4962]: I1003 13:11:53.407214 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-9h8s9"] Oct 03 13:11:53 crc kubenswrapper[4962]: I1003 13:11:53.602045 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-mzgnf" Oct 03 13:11:53 crc kubenswrapper[4962]: I1003 13:11:53.683622 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a5fef104-2524-4ce3-9404-b4bbbdf139bd-dns-svc\") pod \"a5fef104-2524-4ce3-9404-b4bbbdf139bd\" (UID: \"a5fef104-2524-4ce3-9404-b4bbbdf139bd\") " Oct 03 13:11:53 crc kubenswrapper[4962]: I1003 13:11:53.684048 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5fef104-2524-4ce3-9404-b4bbbdf139bd-config\") pod \"a5fef104-2524-4ce3-9404-b4bbbdf139bd\" (UID: \"a5fef104-2524-4ce3-9404-b4bbbdf139bd\") " Oct 03 13:11:53 crc kubenswrapper[4962]: I1003 13:11:53.684078 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9v8fg\" (UniqueName: \"kubernetes.io/projected/a5fef104-2524-4ce3-9404-b4bbbdf139bd-kube-api-access-9v8fg\") pod \"a5fef104-2524-4ce3-9404-b4bbbdf139bd\" (UID: \"a5fef104-2524-4ce3-9404-b4bbbdf139bd\") " Oct 03 13:11:53 crc kubenswrapper[4962]: I1003 13:11:53.710941 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5fef104-2524-4ce3-9404-b4bbbdf139bd-kube-api-access-9v8fg" (OuterVolumeSpecName: "kube-api-access-9v8fg") pod "a5fef104-2524-4ce3-9404-b4bbbdf139bd" (UID: "a5fef104-2524-4ce3-9404-b4bbbdf139bd"). InnerVolumeSpecName "kube-api-access-9v8fg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:11:53 crc kubenswrapper[4962]: I1003 13:11:53.797545 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9v8fg\" (UniqueName: \"kubernetes.io/projected/a5fef104-2524-4ce3-9404-b4bbbdf139bd-kube-api-access-9v8fg\") on node \"crc\" DevicePath \"\"" Oct 03 13:11:53 crc kubenswrapper[4962]: I1003 13:11:53.838499 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5fef104-2524-4ce3-9404-b4bbbdf139bd-config" (OuterVolumeSpecName: "config") pod "a5fef104-2524-4ce3-9404-b4bbbdf139bd" (UID: "a5fef104-2524-4ce3-9404-b4bbbdf139bd"). InnerVolumeSpecName "config". 
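Teardown of the deleted dnsmasq-dns-666b6646f7-mzgnf pod mirrors the mount path seen earlier, again in three records per volume: operationExecutor.UnmountVolume started (reconciler_common.go:159), UnmountVolume.TearDown succeeded (operation_generator.go:803), and "Volume detached" (reconciler_common.go:293). A sketch that flags volumes whose unmount never completed; volume names such as "config" repeat across pods, so a stricter version would key on the UniqueName as in the mount-side sketch:

    import re, sys

    # Unmount path: "UnmountVolume started" -> "UnmountVolume.TearDown
    # succeeded" -> "Volume detached ... DevicePath \"\"".
    NAME = re.compile(r'volume \\"([^\\"]+)\\"')

    started, detached = set(), set()
    for line in sys.stdin:
        m = NAME.search(line)
        if not m:
            continue
        if "UnmountVolume started for volume" in line:
            started.add(m.group(1))
        elif "Volume detached for volume" in line:
            detached.add(m.group(1))
    print("unmount started but not detached:", sorted(started - detached))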
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:11:53 crc kubenswrapper[4962]: I1003 13:11:53.838561 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 03 13:11:53 crc kubenswrapper[4962]: I1003 13:11:53.847500 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5fef104-2524-4ce3-9404-b4bbbdf139bd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a5fef104-2524-4ce3-9404-b4bbbdf139bd" (UID: "a5fef104-2524-4ce3-9404-b4bbbdf139bd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:11:53 crc kubenswrapper[4962]: I1003 13:11:53.851245 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-l2mcn"] Oct 03 13:11:53 crc kubenswrapper[4962]: I1003 13:11:53.888858 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-mwsmg"] Oct 03 13:11:53 crc kubenswrapper[4962]: I1003 13:11:53.900946 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a5fef104-2524-4ce3-9404-b4bbbdf139bd-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 13:11:53 crc kubenswrapper[4962]: I1003 13:11:53.900984 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5fef104-2524-4ce3-9404-b4bbbdf139bd-config\") on node \"crc\" DevicePath \"\"" Oct 03 13:11:53 crc kubenswrapper[4962]: I1003 13:11:53.932868 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-kfh8n" Oct 03 13:11:53 crc kubenswrapper[4962]: I1003 13:11:53.974706 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Oct 03 13:11:53 crc kubenswrapper[4962]: E1003 13:11:53.975078 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9733291-12c0-4d8d-9bd4-66b98b55b3ed" containerName="dnsmasq-dns" Oct 03 13:11:53 crc kubenswrapper[4962]: I1003 13:11:53.975095 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9733291-12c0-4d8d-9bd4-66b98b55b3ed" containerName="dnsmasq-dns" Oct 03 13:11:53 crc kubenswrapper[4962]: E1003 13:11:53.975119 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5fef104-2524-4ce3-9404-b4bbbdf139bd" containerName="dnsmasq-dns" Oct 03 13:11:53 crc kubenswrapper[4962]: I1003 13:11:53.975126 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5fef104-2524-4ce3-9404-b4bbbdf139bd" containerName="dnsmasq-dns" Oct 03 13:11:53 crc kubenswrapper[4962]: E1003 13:11:53.975138 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5fef104-2524-4ce3-9404-b4bbbdf139bd" containerName="init" Oct 03 13:11:53 crc kubenswrapper[4962]: I1003 13:11:53.975145 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5fef104-2524-4ce3-9404-b4bbbdf139bd" containerName="init" Oct 03 13:11:53 crc kubenswrapper[4962]: E1003 13:11:53.975153 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9733291-12c0-4d8d-9bd4-66b98b55b3ed" containerName="init" Oct 03 13:11:53 crc kubenswrapper[4962]: I1003 13:11:53.975159 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9733291-12c0-4d8d-9bd4-66b98b55b3ed" containerName="init" Oct 03 13:11:53 crc kubenswrapper[4962]: I1003 13:11:53.975326 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9733291-12c0-4d8d-9bd4-66b98b55b3ed" containerName="dnsmasq-dns" Oct 03 13:11:53 crc 
kubenswrapper[4962]: I1003 13:11:53.975340 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5fef104-2524-4ce3-9404-b4bbbdf139bd" containerName="dnsmasq-dns" Oct 03 13:11:53 crc kubenswrapper[4962]: I1003 13:11:53.979908 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 03 13:11:53 crc kubenswrapper[4962]: I1003 13:11:53.994703 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.009944 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9733291-12c0-4d8d-9bd4-66b98b55b3ed-config\") pod \"f9733291-12c0-4d8d-9bd4-66b98b55b3ed\" (UID: \"f9733291-12c0-4d8d-9bd4-66b98b55b3ed\") " Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.010128 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9733291-12c0-4d8d-9bd4-66b98b55b3ed-dns-svc\") pod \"f9733291-12c0-4d8d-9bd4-66b98b55b3ed\" (UID: \"f9733291-12c0-4d8d-9bd4-66b98b55b3ed\") " Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.010176 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpjbp\" (UniqueName: \"kubernetes.io/projected/f9733291-12c0-4d8d-9bd4-66b98b55b3ed-kube-api-access-dpjbp\") pod \"f9733291-12c0-4d8d-9bd4-66b98b55b3ed\" (UID: \"f9733291-12c0-4d8d-9bd4-66b98b55b3ed\") " Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.011244 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.011335 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.011469 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.011626 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-fqfjj" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.016059 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9733291-12c0-4d8d-9bd4-66b98b55b3ed-kube-api-access-dpjbp" (OuterVolumeSpecName: "kube-api-access-dpjbp") pod "f9733291-12c0-4d8d-9bd4-66b98b55b3ed" (UID: "f9733291-12c0-4d8d-9bd4-66b98b55b3ed"). InnerVolumeSpecName "kube-api-access-dpjbp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:11:54 crc kubenswrapper[4962]: W1003 13:11:54.061074 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded812fda_dbe5_48ff_bea4_3795a87d716f.slice/crio-42352cc4ebf550ac11dddb148b8cec0f43011b77402fd829ad69da3b4d0693e2 WatchSource:0}: Error finding container 42352cc4ebf550ac11dddb148b8cec0f43011b77402fd829ad69da3b4d0693e2: Status 404 returned error can't find the container with id 42352cc4ebf550ac11dddb148b8cec0f43011b77402fd829ad69da3b4d0693e2 Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.070517 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9733291-12c0-4d8d-9bd4-66b98b55b3ed-config" (OuterVolumeSpecName: "config") pod "f9733291-12c0-4d8d-9bd4-66b98b55b3ed" (UID: "f9733291-12c0-4d8d-9bd4-66b98b55b3ed"). 
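The mixed I/E/W prefixes in the records above are klog headers: a severity letter plus MMDD (I1003 = Info on Oct 03), wall time, PID (4962, matching kubenswrapper[4962]), and the file:line of the call site, so the E1003 cpu_manager.go:410 and W1003 manager.go:1169 entries are error- and warning-level records from those exact source lines. A sketch that tallies records by severity and call site:

    import re, sys
    from collections import Counter

    # klog header: <I|W|E|F><MMDD> <HH:MM:SS.frac> <pid> <file:line>]
    HDR = re.compile(r'\b([IWEF])(\d{4}) \d{2}:\d{2}:\d{2}\.\d+ +\d+ '
                     r'([\w.]+:\d+)\]')

    tally = Counter()
    for line in sys.stdin:
        for sev, _mmdd, site in HDR.findall(line):
            tally[(sev, site)] += 1
    for (sev, site), n in tally.most_common():
        print(f"{sev} {site:35} x{n}")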
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.087787 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-8rz95"] Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.094060 4962 generic.go:334] "Generic (PLEG): container finished" podID="a5fef104-2524-4ce3-9404-b4bbbdf139bd" containerID="1eac4b6e086b001f57628da9764b27e29f94995f0ab101f2246e0c6f73bdd76d" exitCode=0 Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.094154 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-mzgnf" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.094157 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-mzgnf" event={"ID":"a5fef104-2524-4ce3-9404-b4bbbdf139bd","Type":"ContainerDied","Data":"1eac4b6e086b001f57628da9764b27e29f94995f0ab101f2246e0c6f73bdd76d"} Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.094206 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-mzgnf" event={"ID":"a5fef104-2524-4ce3-9404-b4bbbdf139bd","Type":"ContainerDied","Data":"990e67fb0bb4d1cd445a48ae63f401ded0b315a39bbe6a427895272ba2e1869c"} Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.094230 4962 scope.go:117] "RemoveContainer" containerID="1eac4b6e086b001f57628da9764b27e29f94995f0ab101f2246e0c6f73bdd76d" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.097340 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-l2mcn" event={"ID":"688c3104-6239-4ef4-923d-b169172a07e3","Type":"ContainerStarted","Data":"da5d04b91cdb3e00efa1d083e3d63d40e8d4d46f0902ca33a864f482d9e6f956"} Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.099400 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-mwsmg" event={"ID":"cde9ffcd-ddb9-4114-bf06-11c230347438","Type":"ContainerStarted","Data":"bcc8dc7b753a565c977ea5497aef21bf93e9f2d5fc23000366773f866cc6e1db"} Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.101198 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-9h8s9" event={"ID":"67b77bc9-27ae-4994-86c2-614e48ad33c6","Type":"ContainerStarted","Data":"2733a21e59053a9fe777da7da96a6b1c13acb88fa95f66b9a9cc3889e027399a"} Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.101673 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-9h8s9" event={"ID":"67b77bc9-27ae-4994-86c2-614e48ad33c6","Type":"ContainerStarted","Data":"af26d22b1028d0209d5f2b95d69442c9d5417dbdbc47dd91ddd4362dee014076"} Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.107555 4962 generic.go:334] "Generic (PLEG): container finished" podID="f9733291-12c0-4d8d-9bd4-66b98b55b3ed" containerID="5e2d60f774b4e55c3f7eb0040c9616b0ebd29d3f36ee2922c3cbd19ddc0c9338" exitCode=0 Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.107630 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-kfh8n" event={"ID":"f9733291-12c0-4d8d-9bd4-66b98b55b3ed","Type":"ContainerDied","Data":"5e2d60f774b4e55c3f7eb0040c9616b0ebd29d3f36ee2922c3cbd19ddc0c9338"} Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.107683 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-kfh8n" 
event={"ID":"f9733291-12c0-4d8d-9bd4-66b98b55b3ed","Type":"ContainerDied","Data":"d3ead2e630d8c46d6290b28b24ff945b0a1a8b20cca32fd89363bfb55788d8b4"} Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.107768 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-kfh8n" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.109748 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9733291-12c0-4d8d-9bd4-66b98b55b3ed-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f9733291-12c0-4d8d-9bd4-66b98b55b3ed" (UID: "f9733291-12c0-4d8d-9bd4-66b98b55b3ed"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.111021 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"5c876ef6-c8ab-44d1-9ba4-07f0b5e07695","Type":"ContainerStarted","Data":"d084860c719d4d830c5f8eaa5e004185d65925304682ec6cc942fe2e264fc4f9"} Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.112941 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"b4b582ce-b618-4911-b554-f5cae9bcee91\") " pod="openstack/swift-storage-0" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.113045 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b4b582ce-b618-4911-b554-f5cae9bcee91-cache\") pod \"swift-storage-0\" (UID: \"b4b582ce-b618-4911-b554-f5cae9bcee91\") " pod="openstack/swift-storage-0" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.113110 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b4b582ce-b618-4911-b554-f5cae9bcee91-lock\") pod \"swift-storage-0\" (UID: \"b4b582ce-b618-4911-b554-f5cae9bcee91\") " pod="openstack/swift-storage-0" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.113171 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b4b582ce-b618-4911-b554-f5cae9bcee91-etc-swift\") pod \"swift-storage-0\" (UID: \"b4b582ce-b618-4911-b554-f5cae9bcee91\") " pod="openstack/swift-storage-0" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.113222 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh65s\" (UniqueName: \"kubernetes.io/projected/b4b582ce-b618-4911-b554-f5cae9bcee91-kube-api-access-vh65s\") pod \"swift-storage-0\" (UID: \"b4b582ce-b618-4911-b554-f5cae9bcee91\") " pod="openstack/swift-storage-0" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.113289 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9733291-12c0-4d8d-9bd4-66b98b55b3ed-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.113306 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpjbp\" (UniqueName: \"kubernetes.io/projected/f9733291-12c0-4d8d-9bd4-66b98b55b3ed-kube-api-access-dpjbp\") on node \"crc\" DevicePath \"\"" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.113321 4962 reconciler_common.go:293] "Volume 
detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9733291-12c0-4d8d-9bd4-66b98b55b3ed-config\") on node \"crc\" DevicePath \"\"" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.123793 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-9h8s9" podStartSLOduration=2.123773984 podStartE2EDuration="2.123773984s" podCreationTimestamp="2025-10-03 13:11:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:11:54.119792317 +0000 UTC m=+1322.523690152" watchObservedRunningTime="2025-10-03 13:11:54.123773984 +0000 UTC m=+1322.527671819" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.132553 4962 scope.go:117] "RemoveContainer" containerID="f20e7ffa63f452cd42edd332c4abf4b978045b34475cf44ea9b6cf173113f047" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.140857 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-mzgnf"] Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.147194 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-mzgnf"] Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.173201 4962 scope.go:117] "RemoveContainer" containerID="1eac4b6e086b001f57628da9764b27e29f94995f0ab101f2246e0c6f73bdd76d" Oct 03 13:11:54 crc kubenswrapper[4962]: E1003 13:11:54.174700 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1eac4b6e086b001f57628da9764b27e29f94995f0ab101f2246e0c6f73bdd76d\": container with ID starting with 1eac4b6e086b001f57628da9764b27e29f94995f0ab101f2246e0c6f73bdd76d not found: ID does not exist" containerID="1eac4b6e086b001f57628da9764b27e29f94995f0ab101f2246e0c6f73bdd76d" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.174735 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1eac4b6e086b001f57628da9764b27e29f94995f0ab101f2246e0c6f73bdd76d"} err="failed to get container status \"1eac4b6e086b001f57628da9764b27e29f94995f0ab101f2246e0c6f73bdd76d\": rpc error: code = NotFound desc = could not find container \"1eac4b6e086b001f57628da9764b27e29f94995f0ab101f2246e0c6f73bdd76d\": container with ID starting with 1eac4b6e086b001f57628da9764b27e29f94995f0ab101f2246e0c6f73bdd76d not found: ID does not exist" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.174757 4962 scope.go:117] "RemoveContainer" containerID="f20e7ffa63f452cd42edd332c4abf4b978045b34475cf44ea9b6cf173113f047" Oct 03 13:11:54 crc kubenswrapper[4962]: E1003 13:11:54.175207 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f20e7ffa63f452cd42edd332c4abf4b978045b34475cf44ea9b6cf173113f047\": container with ID starting with f20e7ffa63f452cd42edd332c4abf4b978045b34475cf44ea9b6cf173113f047 not found: ID does not exist" containerID="f20e7ffa63f452cd42edd332c4abf4b978045b34475cf44ea9b6cf173113f047" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.175251 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f20e7ffa63f452cd42edd332c4abf4b978045b34475cf44ea9b6cf173113f047"} err="failed to get container status \"f20e7ffa63f452cd42edd332c4abf4b978045b34475cf44ea9b6cf173113f047\": rpc error: code = NotFound desc = could not find container 
\"f20e7ffa63f452cd42edd332c4abf4b978045b34475cf44ea9b6cf173113f047\": container with ID starting with f20e7ffa63f452cd42edd332c4abf4b978045b34475cf44ea9b6cf173113f047 not found: ID does not exist" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.175413 4962 scope.go:117] "RemoveContainer" containerID="5e2d60f774b4e55c3f7eb0040c9616b0ebd29d3f36ee2922c3cbd19ddc0c9338" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.214996 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"b4b582ce-b618-4911-b554-f5cae9bcee91\") " pod="openstack/swift-storage-0" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.215388 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b4b582ce-b618-4911-b554-f5cae9bcee91-cache\") pod \"swift-storage-0\" (UID: \"b4b582ce-b618-4911-b554-f5cae9bcee91\") " pod="openstack/swift-storage-0" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.215485 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b4b582ce-b618-4911-b554-f5cae9bcee91-lock\") pod \"swift-storage-0\" (UID: \"b4b582ce-b618-4911-b554-f5cae9bcee91\") " pod="openstack/swift-storage-0" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.215526 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b4b582ce-b618-4911-b554-f5cae9bcee91-etc-swift\") pod \"swift-storage-0\" (UID: \"b4b582ce-b618-4911-b554-f5cae9bcee91\") " pod="openstack/swift-storage-0" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.215668 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh65s\" (UniqueName: \"kubernetes.io/projected/b4b582ce-b618-4911-b554-f5cae9bcee91-kube-api-access-vh65s\") pod \"swift-storage-0\" (UID: \"b4b582ce-b618-4911-b554-f5cae9bcee91\") " pod="openstack/swift-storage-0" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.215974 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"b4b582ce-b618-4911-b554-f5cae9bcee91\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/swift-storage-0" Oct 03 13:11:54 crc kubenswrapper[4962]: E1003 13:11:54.216313 4962 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 03 13:11:54 crc kubenswrapper[4962]: E1003 13:11:54.216340 4962 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 03 13:11:54 crc kubenswrapper[4962]: E1003 13:11:54.216376 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b4b582ce-b618-4911-b554-f5cae9bcee91-etc-swift podName:b4b582ce-b618-4911-b554-f5cae9bcee91 nodeName:}" failed. No retries permitted until 2025-10-03 13:11:54.716361417 +0000 UTC m=+1323.120259252 (durationBeforeRetry 500ms). 
Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.216939 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b4b582ce-b618-4911-b554-f5cae9bcee91-lock\") pod \"swift-storage-0\" (UID: \"b4b582ce-b618-4911-b554-f5cae9bcee91\") " pod="openstack/swift-storage-0"
Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.217390 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b4b582ce-b618-4911-b554-f5cae9bcee91-cache\") pod \"swift-storage-0\" (UID: \"b4b582ce-b618-4911-b554-f5cae9bcee91\") " pod="openstack/swift-storage-0"
Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.237536 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh65s\" (UniqueName: \"kubernetes.io/projected/b4b582ce-b618-4911-b554-f5cae9bcee91-kube-api-access-vh65s\") pod \"swift-storage-0\" (UID: \"b4b582ce-b618-4911-b554-f5cae9bcee91\") " pod="openstack/swift-storage-0"
Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.241083 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"b4b582ce-b618-4911-b554-f5cae9bcee91\") " pod="openstack/swift-storage-0"
Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.243417 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5fef104-2524-4ce3-9404-b4bbbdf139bd" path="/var/lib/kubelet/pods/a5fef104-2524-4ce3-9404-b4bbbdf139bd/volumes"
Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.299836 4962 scope.go:117] "RemoveContainer" containerID="017758015dead298005e3bebf5264174708251db7e324fc90ff5a0fbd3867e37"
Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.333365 4962 scope.go:117] "RemoveContainer" containerID="5e2d60f774b4e55c3f7eb0040c9616b0ebd29d3f36ee2922c3cbd19ddc0c9338"
Oct 03 13:11:54 crc kubenswrapper[4962]: E1003 13:11:54.334909 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e2d60f774b4e55c3f7eb0040c9616b0ebd29d3f36ee2922c3cbd19ddc0c9338\": container with ID starting with 5e2d60f774b4e55c3f7eb0040c9616b0ebd29d3f36ee2922c3cbd19ddc0c9338 not found: ID does not exist" containerID="5e2d60f774b4e55c3f7eb0040c9616b0ebd29d3f36ee2922c3cbd19ddc0c9338"
Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.334965 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e2d60f774b4e55c3f7eb0040c9616b0ebd29d3f36ee2922c3cbd19ddc0c9338"} err="failed to get container status \"5e2d60f774b4e55c3f7eb0040c9616b0ebd29d3f36ee2922c3cbd19ddc0c9338\": rpc error: code = NotFound desc = could not find container \"5e2d60f774b4e55c3f7eb0040c9616b0ebd29d3f36ee2922c3cbd19ddc0c9338\": container with ID starting with 5e2d60f774b4e55c3f7eb0040c9616b0ebd29d3f36ee2922c3cbd19ddc0c9338 not found: ID does not exist"
Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.334999 4962 scope.go:117] "RemoveContainer" containerID="017758015dead298005e3bebf5264174708251db7e324fc90ff5a0fbd3867e37"
Oct 03 13:11:54 crc kubenswrapper[4962]: E1003 13:11:54.335386 4962 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"017758015dead298005e3bebf5264174708251db7e324fc90ff5a0fbd3867e37\": container with ID starting with 017758015dead298005e3bebf5264174708251db7e324fc90ff5a0fbd3867e37 not found: ID does not exist" containerID="017758015dead298005e3bebf5264174708251db7e324fc90ff5a0fbd3867e37" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.335413 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"017758015dead298005e3bebf5264174708251db7e324fc90ff5a0fbd3867e37"} err="failed to get container status \"017758015dead298005e3bebf5264174708251db7e324fc90ff5a0fbd3867e37\": rpc error: code = NotFound desc = could not find container \"017758015dead298005e3bebf5264174708251db7e324fc90ff5a0fbd3867e37\": container with ID starting with 017758015dead298005e3bebf5264174708251db7e324fc90ff5a0fbd3867e37 not found: ID does not exist" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.450133 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-kfh8n"] Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.454616 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-kfh8n"] Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.510853 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-hrx4m"] Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.512073 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-hrx4m" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.514075 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.514252 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.514354 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.529055 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-hrx4m"] Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.621364 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62ffa74e-e3cd-4a30-8197-12bd37d64a65-combined-ca-bundle\") pod \"swift-ring-rebalance-hrx4m\" (UID: \"62ffa74e-e3cd-4a30-8197-12bd37d64a65\") " pod="openstack/swift-ring-rebalance-hrx4m" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.621416 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsxqg\" (UniqueName: \"kubernetes.io/projected/62ffa74e-e3cd-4a30-8197-12bd37d64a65-kube-api-access-vsxqg\") pod \"swift-ring-rebalance-hrx4m\" (UID: \"62ffa74e-e3cd-4a30-8197-12bd37d64a65\") " pod="openstack/swift-ring-rebalance-hrx4m" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.621513 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/62ffa74e-e3cd-4a30-8197-12bd37d64a65-dispersionconf\") pod \"swift-ring-rebalance-hrx4m\" (UID: \"62ffa74e-e3cd-4a30-8197-12bd37d64a65\") " pod="openstack/swift-ring-rebalance-hrx4m" Oct 03 
13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.621541 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/62ffa74e-e3cd-4a30-8197-12bd37d64a65-etc-swift\") pod \"swift-ring-rebalance-hrx4m\" (UID: \"62ffa74e-e3cd-4a30-8197-12bd37d64a65\") " pod="openstack/swift-ring-rebalance-hrx4m" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.621608 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/62ffa74e-e3cd-4a30-8197-12bd37d64a65-swiftconf\") pod \"swift-ring-rebalance-hrx4m\" (UID: \"62ffa74e-e3cd-4a30-8197-12bd37d64a65\") " pod="openstack/swift-ring-rebalance-hrx4m" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.621678 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/62ffa74e-e3cd-4a30-8197-12bd37d64a65-scripts\") pod \"swift-ring-rebalance-hrx4m\" (UID: \"62ffa74e-e3cd-4a30-8197-12bd37d64a65\") " pod="openstack/swift-ring-rebalance-hrx4m" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.621703 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/62ffa74e-e3cd-4a30-8197-12bd37d64a65-ring-data-devices\") pod \"swift-ring-rebalance-hrx4m\" (UID: \"62ffa74e-e3cd-4a30-8197-12bd37d64a65\") " pod="openstack/swift-ring-rebalance-hrx4m" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.723979 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/62ffa74e-e3cd-4a30-8197-12bd37d64a65-scripts\") pod \"swift-ring-rebalance-hrx4m\" (UID: \"62ffa74e-e3cd-4a30-8197-12bd37d64a65\") " pod="openstack/swift-ring-rebalance-hrx4m" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.724285 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/62ffa74e-e3cd-4a30-8197-12bd37d64a65-ring-data-devices\") pod \"swift-ring-rebalance-hrx4m\" (UID: \"62ffa74e-e3cd-4a30-8197-12bd37d64a65\") " pod="openstack/swift-ring-rebalance-hrx4m" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.724308 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62ffa74e-e3cd-4a30-8197-12bd37d64a65-combined-ca-bundle\") pod \"swift-ring-rebalance-hrx4m\" (UID: \"62ffa74e-e3cd-4a30-8197-12bd37d64a65\") " pod="openstack/swift-ring-rebalance-hrx4m" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.724327 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsxqg\" (UniqueName: \"kubernetes.io/projected/62ffa74e-e3cd-4a30-8197-12bd37d64a65-kube-api-access-vsxqg\") pod \"swift-ring-rebalance-hrx4m\" (UID: \"62ffa74e-e3cd-4a30-8197-12bd37d64a65\") " pod="openstack/swift-ring-rebalance-hrx4m" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.724378 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b4b582ce-b618-4911-b554-f5cae9bcee91-etc-swift\") pod \"swift-storage-0\" (UID: \"b4b582ce-b618-4911-b554-f5cae9bcee91\") " pod="openstack/swift-storage-0" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.724405 4962 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/62ffa74e-e3cd-4a30-8197-12bd37d64a65-dispersionconf\") pod \"swift-ring-rebalance-hrx4m\" (UID: \"62ffa74e-e3cd-4a30-8197-12bd37d64a65\") " pod="openstack/swift-ring-rebalance-hrx4m" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.724432 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/62ffa74e-e3cd-4a30-8197-12bd37d64a65-etc-swift\") pod \"swift-ring-rebalance-hrx4m\" (UID: \"62ffa74e-e3cd-4a30-8197-12bd37d64a65\") " pod="openstack/swift-ring-rebalance-hrx4m" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.724528 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/62ffa74e-e3cd-4a30-8197-12bd37d64a65-swiftconf\") pod \"swift-ring-rebalance-hrx4m\" (UID: \"62ffa74e-e3cd-4a30-8197-12bd37d64a65\") " pod="openstack/swift-ring-rebalance-hrx4m" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.727461 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/62ffa74e-e3cd-4a30-8197-12bd37d64a65-etc-swift\") pod \"swift-ring-rebalance-hrx4m\" (UID: \"62ffa74e-e3cd-4a30-8197-12bd37d64a65\") " pod="openstack/swift-ring-rebalance-hrx4m" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.727602 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/62ffa74e-e3cd-4a30-8197-12bd37d64a65-ring-data-devices\") pod \"swift-ring-rebalance-hrx4m\" (UID: \"62ffa74e-e3cd-4a30-8197-12bd37d64a65\") " pod="openstack/swift-ring-rebalance-hrx4m" Oct 03 13:11:54 crc kubenswrapper[4962]: E1003 13:11:54.727709 4962 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 03 13:11:54 crc kubenswrapper[4962]: E1003 13:11:54.727728 4962 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 03 13:11:54 crc kubenswrapper[4962]: E1003 13:11:54.727798 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b4b582ce-b618-4911-b554-f5cae9bcee91-etc-swift podName:b4b582ce-b618-4911-b554-f5cae9bcee91 nodeName:}" failed. No retries permitted until 2025-10-03 13:11:55.727779355 +0000 UTC m=+1324.131677280 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b4b582ce-b618-4911-b554-f5cae9bcee91-etc-swift") pod "swift-storage-0" (UID: "b4b582ce-b618-4911-b554-f5cae9bcee91") : configmap "swift-ring-files" not found Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.728269 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/62ffa74e-e3cd-4a30-8197-12bd37d64a65-scripts\") pod \"swift-ring-rebalance-hrx4m\" (UID: \"62ffa74e-e3cd-4a30-8197-12bd37d64a65\") " pod="openstack/swift-ring-rebalance-hrx4m" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.729755 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/62ffa74e-e3cd-4a30-8197-12bd37d64a65-dispersionconf\") pod \"swift-ring-rebalance-hrx4m\" (UID: \"62ffa74e-e3cd-4a30-8197-12bd37d64a65\") " pod="openstack/swift-ring-rebalance-hrx4m" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.732144 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/62ffa74e-e3cd-4a30-8197-12bd37d64a65-swiftconf\") pod \"swift-ring-rebalance-hrx4m\" (UID: \"62ffa74e-e3cd-4a30-8197-12bd37d64a65\") " pod="openstack/swift-ring-rebalance-hrx4m" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.736412 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62ffa74e-e3cd-4a30-8197-12bd37d64a65-combined-ca-bundle\") pod \"swift-ring-rebalance-hrx4m\" (UID: \"62ffa74e-e3cd-4a30-8197-12bd37d64a65\") " pod="openstack/swift-ring-rebalance-hrx4m" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.744017 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsxqg\" (UniqueName: \"kubernetes.io/projected/62ffa74e-e3cd-4a30-8197-12bd37d64a65-kube-api-access-vsxqg\") pod \"swift-ring-rebalance-hrx4m\" (UID: \"62ffa74e-e3cd-4a30-8197-12bd37d64a65\") " pod="openstack/swift-ring-rebalance-hrx4m" Oct 03 13:11:54 crc kubenswrapper[4962]: I1003 13:11:54.826496 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-hrx4m" Oct 03 13:11:55 crc kubenswrapper[4962]: I1003 13:11:55.134764 4962 generic.go:334] "Generic (PLEG): container finished" podID="688c3104-6239-4ef4-923d-b169172a07e3" containerID="ebf40c2e391803709f8ef085ec16a8dcfc3c63961639c5af7521d688e1853bd0" exitCode=0 Oct 03 13:11:55 crc kubenswrapper[4962]: I1003 13:11:55.135081 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-l2mcn" event={"ID":"688c3104-6239-4ef4-923d-b169172a07e3","Type":"ContainerDied","Data":"ebf40c2e391803709f8ef085ec16a8dcfc3c63961639c5af7521d688e1853bd0"} Oct 03 13:11:55 crc kubenswrapper[4962]: I1003 13:11:55.145339 4962 generic.go:334] "Generic (PLEG): container finished" podID="cde9ffcd-ddb9-4114-bf06-11c230347438" containerID="1dd2942921fc6aea50b932d53c1897299f60435199561a2db1c10e41d887b98b" exitCode=0 Oct 03 13:11:55 crc kubenswrapper[4962]: I1003 13:11:55.145499 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-mwsmg" event={"ID":"cde9ffcd-ddb9-4114-bf06-11c230347438","Type":"ContainerDied","Data":"1dd2942921fc6aea50b932d53c1897299f60435199561a2db1c10e41d887b98b"} Oct 03 13:11:55 crc kubenswrapper[4962]: I1003 13:11:55.159202 4962 generic.go:334] "Generic (PLEG): container finished" podID="ed812fda-dbe5-48ff-bea4-3795a87d716f" containerID="678ed25fe26650549b04412d89b2a8a75f57962aa6c88529e9fc7bc5487bf4b1" exitCode=0 Oct 03 13:11:55 crc kubenswrapper[4962]: I1003 13:11:55.160179 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-8rz95" event={"ID":"ed812fda-dbe5-48ff-bea4-3795a87d716f","Type":"ContainerDied","Data":"678ed25fe26650549b04412d89b2a8a75f57962aa6c88529e9fc7bc5487bf4b1"} Oct 03 13:11:55 crc kubenswrapper[4962]: I1003 13:11:55.160319 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-8rz95" event={"ID":"ed812fda-dbe5-48ff-bea4-3795a87d716f","Type":"ContainerStarted","Data":"42352cc4ebf550ac11dddb148b8cec0f43011b77402fd829ad69da3b4d0693e2"} Oct 03 13:11:55 crc kubenswrapper[4962]: I1003 13:11:55.320061 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-hrx4m"] Oct 03 13:11:55 crc kubenswrapper[4962]: I1003 13:11:55.754080 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b4b582ce-b618-4911-b554-f5cae9bcee91-etc-swift\") pod \"swift-storage-0\" (UID: \"b4b582ce-b618-4911-b554-f5cae9bcee91\") " pod="openstack/swift-storage-0" Oct 03 13:11:55 crc kubenswrapper[4962]: E1003 13:11:55.754259 4962 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 03 13:11:55 crc kubenswrapper[4962]: E1003 13:11:55.754497 4962 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 03 13:11:55 crc kubenswrapper[4962]: E1003 13:11:55.754563 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b4b582ce-b618-4911-b554-f5cae9bcee91-etc-swift podName:b4b582ce-b618-4911-b554-f5cae9bcee91 nodeName:}" failed. No retries permitted until 2025-10-03 13:11:57.754540407 +0000 UTC m=+1326.158438242 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b4b582ce-b618-4911-b554-f5cae9bcee91-etc-swift") pod "swift-storage-0" (UID: "b4b582ce-b618-4911-b554-f5cae9bcee91") : configmap "swift-ring-files" not found Oct 03 13:11:55 crc kubenswrapper[4962]: I1003 13:11:55.953851 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-l2mcn" Oct 03 13:11:56 crc kubenswrapper[4962]: I1003 13:11:56.057919 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/688c3104-6239-4ef4-923d-b169172a07e3-ovsdbserver-sb\") pod \"688c3104-6239-4ef4-923d-b169172a07e3\" (UID: \"688c3104-6239-4ef4-923d-b169172a07e3\") " Oct 03 13:11:56 crc kubenswrapper[4962]: I1003 13:11:56.057977 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4j4kl\" (UniqueName: \"kubernetes.io/projected/688c3104-6239-4ef4-923d-b169172a07e3-kube-api-access-4j4kl\") pod \"688c3104-6239-4ef4-923d-b169172a07e3\" (UID: \"688c3104-6239-4ef4-923d-b169172a07e3\") " Oct 03 13:11:56 crc kubenswrapper[4962]: I1003 13:11:56.058073 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/688c3104-6239-4ef4-923d-b169172a07e3-config\") pod \"688c3104-6239-4ef4-923d-b169172a07e3\" (UID: \"688c3104-6239-4ef4-923d-b169172a07e3\") " Oct 03 13:11:56 crc kubenswrapper[4962]: I1003 13:11:56.058202 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/688c3104-6239-4ef4-923d-b169172a07e3-dns-svc\") pod \"688c3104-6239-4ef4-923d-b169172a07e3\" (UID: \"688c3104-6239-4ef4-923d-b169172a07e3\") " Oct 03 13:11:56 crc kubenswrapper[4962]: I1003 13:11:56.063342 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/688c3104-6239-4ef4-923d-b169172a07e3-kube-api-access-4j4kl" (OuterVolumeSpecName: "kube-api-access-4j4kl") pod "688c3104-6239-4ef4-923d-b169172a07e3" (UID: "688c3104-6239-4ef4-923d-b169172a07e3"). InnerVolumeSpecName "kube-api-access-4j4kl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:11:56 crc kubenswrapper[4962]: I1003 13:11:56.088091 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/688c3104-6239-4ef4-923d-b169172a07e3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "688c3104-6239-4ef4-923d-b169172a07e3" (UID: "688c3104-6239-4ef4-923d-b169172a07e3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:11:56 crc kubenswrapper[4962]: I1003 13:11:56.088286 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/688c3104-6239-4ef4-923d-b169172a07e3-config" (OuterVolumeSpecName: "config") pod "688c3104-6239-4ef4-923d-b169172a07e3" (UID: "688c3104-6239-4ef4-923d-b169172a07e3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:11:56 crc kubenswrapper[4962]: I1003 13:11:56.088676 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/688c3104-6239-4ef4-923d-b169172a07e3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "688c3104-6239-4ef4-923d-b169172a07e3" (UID: "688c3104-6239-4ef4-923d-b169172a07e3"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:11:56 crc kubenswrapper[4962]: I1003 13:11:56.159930 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/688c3104-6239-4ef4-923d-b169172a07e3-config\") on node \"crc\" DevicePath \"\"" Oct 03 13:11:56 crc kubenswrapper[4962]: I1003 13:11:56.162403 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/688c3104-6239-4ef4-923d-b169172a07e3-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 13:11:56 crc kubenswrapper[4962]: I1003 13:11:56.162635 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/688c3104-6239-4ef4-923d-b169172a07e3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 13:11:56 crc kubenswrapper[4962]: I1003 13:11:56.162784 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4j4kl\" (UniqueName: \"kubernetes.io/projected/688c3104-6239-4ef4-923d-b169172a07e3-kube-api-access-4j4kl\") on node \"crc\" DevicePath \"\"" Oct 03 13:11:56 crc kubenswrapper[4962]: I1003 13:11:56.181176 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-hrx4m" event={"ID":"62ffa74e-e3cd-4a30-8197-12bd37d64a65","Type":"ContainerStarted","Data":"8f299f4ad3f2215da6c8c133db06c550efa8fdf60db1ce05fb529a97f0f64314"} Oct 03 13:11:56 crc kubenswrapper[4962]: I1003 13:11:56.185624 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-8rz95" Oct 03 13:11:56 crc kubenswrapper[4962]: I1003 13:11:56.185646 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-8rz95" event={"ID":"ed812fda-dbe5-48ff-bea4-3795a87d716f","Type":"ContainerStarted","Data":"1386cbaa0e17306a368bb4a2806672e2c52aaea5f063190c3cad125a74ffa7fc"} Oct 03 13:11:56 crc kubenswrapper[4962]: I1003 13:11:56.189178 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-l2mcn" Oct 03 13:11:56 crc kubenswrapper[4962]: I1003 13:11:56.189167 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-l2mcn" event={"ID":"688c3104-6239-4ef4-923d-b169172a07e3","Type":"ContainerDied","Data":"da5d04b91cdb3e00efa1d083e3d63d40e8d4d46f0902ca33a864f482d9e6f956"} Oct 03 13:11:56 crc kubenswrapper[4962]: I1003 13:11:56.189504 4962 scope.go:117] "RemoveContainer" containerID="ebf40c2e391803709f8ef085ec16a8dcfc3c63961639c5af7521d688e1853bd0" Oct 03 13:11:56 crc kubenswrapper[4962]: I1003 13:11:56.193311 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-mwsmg" event={"ID":"cde9ffcd-ddb9-4114-bf06-11c230347438","Type":"ContainerStarted","Data":"b0ca2c2fb1a7db8699c423d903a4277ecf3c9d81a69c788a9ee1dbf7cf3d7beb"} Oct 03 13:11:56 crc kubenswrapper[4962]: I1003 13:11:56.194255 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-mwsmg" Oct 03 13:11:56 crc kubenswrapper[4962]: I1003 13:11:56.204526 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"5c876ef6-c8ab-44d1-9ba4-07f0b5e07695","Type":"ContainerStarted","Data":"d519de371641e2951bd9f81ed67c53fa2f69a9d44a2a9b5275e2a6772663e005"} Oct 03 13:11:56 crc kubenswrapper[4962]: I1003 13:11:56.206005 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-8rz95" podStartSLOduration=4.20598406 podStartE2EDuration="4.20598406s" podCreationTimestamp="2025-10-03 13:11:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:11:56.204356367 +0000 UTC m=+1324.608254202" watchObservedRunningTime="2025-10-03 13:11:56.20598406 +0000 UTC m=+1324.609881905" Oct 03 13:11:56 crc kubenswrapper[4962]: I1003 13:11:56.230812 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-mwsmg" podStartSLOduration=4.230769278 podStartE2EDuration="4.230769278s" podCreationTimestamp="2025-10-03 13:11:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:11:56.224690614 +0000 UTC m=+1324.628588449" watchObservedRunningTime="2025-10-03 13:11:56.230769278 +0000 UTC m=+1324.634667113" Oct 03 13:11:56 crc kubenswrapper[4962]: I1003 13:11:56.280277 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9733291-12c0-4d8d-9bd4-66b98b55b3ed" path="/var/lib/kubelet/pods/f9733291-12c0-4d8d-9bd4-66b98b55b3ed/volumes" Oct 03 13:11:56 crc kubenswrapper[4962]: I1003 13:11:56.287896 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-l2mcn"] Oct 03 13:11:56 crc kubenswrapper[4962]: I1003 13:11:56.287941 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-l2mcn"] Oct 03 13:11:57 crc kubenswrapper[4962]: I1003 13:11:57.215220 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"5c876ef6-c8ab-44d1-9ba4-07f0b5e07695","Type":"ContainerStarted","Data":"c10da2ac06df5b8b854f495ec36dfbffd1281d8e886e7d01348cf8b99da08700"} Oct 03 13:11:57 crc kubenswrapper[4962]: I1003 13:11:57.215721 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 03 13:11:57 crc 
Oct 03 13:11:57 crc kubenswrapper[4962]: I1003 13:11:57.791695 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b4b582ce-b618-4911-b554-f5cae9bcee91-etc-swift\") pod \"swift-storage-0\" (UID: \"b4b582ce-b618-4911-b554-f5cae9bcee91\") " pod="openstack/swift-storage-0"
Oct 03 13:11:57 crc kubenswrapper[4962]: E1003 13:11:57.791885 4962 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Oct 03 13:11:57 crc kubenswrapper[4962]: E1003 13:11:57.791911 4962 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Oct 03 13:11:57 crc kubenswrapper[4962]: E1003 13:11:57.791960 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b4b582ce-b618-4911-b554-f5cae9bcee91-etc-swift podName:b4b582ce-b618-4911-b554-f5cae9bcee91 nodeName:}" failed. No retries permitted until 2025-10-03 13:12:01.791943206 +0000 UTC m=+1330.195841041 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b4b582ce-b618-4911-b554-f5cae9bcee91-etc-swift") pod "swift-storage-0" (UID: "b4b582ce-b618-4911-b554-f5cae9bcee91") : configmap "swift-ring-files" not found
Oct 03 13:11:58 crc kubenswrapper[4962]: I1003 13:11:58.239201 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="688c3104-6239-4ef4-923d-b169172a07e3" path="/var/lib/kubelet/pods/688c3104-6239-4ef4-923d-b169172a07e3/volumes"
Oct 03 13:11:59 crc kubenswrapper[4962]: I1003 13:11:59.232157 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-hrx4m" event={"ID":"62ffa74e-e3cd-4a30-8197-12bd37d64a65","Type":"ContainerStarted","Data":"42eff25118b1d9333ccb5e1e1095736fe7ad68812468237167cf44bfffda55aa"}
Oct 03 13:11:59 crc kubenswrapper[4962]: I1003 13:11:59.252855 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-hrx4m" podStartSLOduration=2.0596912019999998 podStartE2EDuration="5.252834905s" podCreationTimestamp="2025-10-03 13:11:54 +0000 UTC" firstStartedPulling="2025-10-03 13:11:55.814947703 +0000 UTC m=+1324.218845538" lastFinishedPulling="2025-10-03 13:11:59.008091406 +0000 UTC m=+1327.411989241" observedRunningTime="2025-10-03 13:11:59.246253758 +0000 UTC m=+1327.650151603" watchObservedRunningTime="2025-10-03 13:11:59.252834905 +0000 UTC m=+1327.656732760"
Oct 03 13:11:59 crc kubenswrapper[4962]: I1003 13:11:59.991804 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Oct 03 13:11:59 crc kubenswrapper[4962]: I1003 13:11:59.992715 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Oct 03 13:12:00 crc kubenswrapper[4962]: I1003 13:12:00.046621 
4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 03 13:12:00 crc kubenswrapper[4962]: I1003 13:12:00.282153 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 03 13:12:00 crc kubenswrapper[4962]: I1003 13:12:00.767250 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 03 13:12:00 crc kubenswrapper[4962]: I1003 13:12:00.767572 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 03 13:12:00 crc kubenswrapper[4962]: I1003 13:12:00.814169 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-hxdfm"] Oct 03 13:12:00 crc kubenswrapper[4962]: E1003 13:12:00.814552 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="688c3104-6239-4ef4-923d-b169172a07e3" containerName="init" Oct 03 13:12:00 crc kubenswrapper[4962]: I1003 13:12:00.814576 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="688c3104-6239-4ef4-923d-b169172a07e3" containerName="init" Oct 03 13:12:00 crc kubenswrapper[4962]: I1003 13:12:00.814814 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="688c3104-6239-4ef4-923d-b169172a07e3" containerName="init" Oct 03 13:12:00 crc kubenswrapper[4962]: I1003 13:12:00.815450 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-hxdfm" Oct 03 13:12:00 crc kubenswrapper[4962]: I1003 13:12:00.825178 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-hxdfm"] Oct 03 13:12:00 crc kubenswrapper[4962]: I1003 13:12:00.840268 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 03 13:12:00 crc kubenswrapper[4962]: I1003 13:12:00.945796 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwhxt\" (UniqueName: \"kubernetes.io/projected/b3133c1f-476d-440b-977a-9642e1284622-kube-api-access-vwhxt\") pod \"keystone-db-create-hxdfm\" (UID: \"b3133c1f-476d-440b-977a-9642e1284622\") " pod="openstack/keystone-db-create-hxdfm" Oct 03 13:12:01 crc kubenswrapper[4962]: I1003 13:12:01.014094 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-shsr9"] Oct 03 13:12:01 crc kubenswrapper[4962]: I1003 13:12:01.015137 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-shsr9" Oct 03 13:12:01 crc kubenswrapper[4962]: I1003 13:12:01.021853 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-shsr9"] Oct 03 13:12:01 crc kubenswrapper[4962]: I1003 13:12:01.047783 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwhxt\" (UniqueName: \"kubernetes.io/projected/b3133c1f-476d-440b-977a-9642e1284622-kube-api-access-vwhxt\") pod \"keystone-db-create-hxdfm\" (UID: \"b3133c1f-476d-440b-977a-9642e1284622\") " pod="openstack/keystone-db-create-hxdfm" Oct 03 13:12:01 crc kubenswrapper[4962]: I1003 13:12:01.067797 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwhxt\" (UniqueName: \"kubernetes.io/projected/b3133c1f-476d-440b-977a-9642e1284622-kube-api-access-vwhxt\") pod \"keystone-db-create-hxdfm\" (UID: \"b3133c1f-476d-440b-977a-9642e1284622\") " pod="openstack/keystone-db-create-hxdfm" Oct 03 13:12:01 crc kubenswrapper[4962]: I1003 13:12:01.133431 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-hxdfm" Oct 03 13:12:01 crc kubenswrapper[4962]: I1003 13:12:01.151712 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgw2k\" (UniqueName: \"kubernetes.io/projected/e19e07aa-3097-4c52-990e-c2738163f946-kube-api-access-mgw2k\") pod \"placement-db-create-shsr9\" (UID: \"e19e07aa-3097-4c52-990e-c2738163f946\") " pod="openstack/placement-db-create-shsr9" Oct 03 13:12:01 crc kubenswrapper[4962]: I1003 13:12:01.253013 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgw2k\" (UniqueName: \"kubernetes.io/projected/e19e07aa-3097-4c52-990e-c2738163f946-kube-api-access-mgw2k\") pod \"placement-db-create-shsr9\" (UID: \"e19e07aa-3097-4c52-990e-c2738163f946\") " pod="openstack/placement-db-create-shsr9" Oct 03 13:12:01 crc kubenswrapper[4962]: I1003 13:12:01.277697 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgw2k\" (UniqueName: \"kubernetes.io/projected/e19e07aa-3097-4c52-990e-c2738163f946-kube-api-access-mgw2k\") pod \"placement-db-create-shsr9\" (UID: \"e19e07aa-3097-4c52-990e-c2738163f946\") " pod="openstack/placement-db-create-shsr9" Oct 03 13:12:01 crc kubenswrapper[4962]: I1003 13:12:01.323951 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 03 13:12:01 crc kubenswrapper[4962]: I1003 13:12:01.333432 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-shsr9" Oct 03 13:12:01 crc kubenswrapper[4962]: I1003 13:12:01.561663 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-hxdfm"] Oct 03 13:12:01 crc kubenswrapper[4962]: W1003 13:12:01.574894 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3133c1f_476d_440b_977a_9642e1284622.slice/crio-268722e9027003d8909a021fa85020b31aef7dbf6bfa2388684ddf0eae9186d0 WatchSource:0}: Error finding container 268722e9027003d8909a021fa85020b31aef7dbf6bfa2388684ddf0eae9186d0: Status 404 returned error can't find the container with id 268722e9027003d8909a021fa85020b31aef7dbf6bfa2388684ddf0eae9186d0 Oct 03 13:12:01 crc kubenswrapper[4962]: I1003 13:12:01.788203 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-shsr9"] Oct 03 13:12:01 crc kubenswrapper[4962]: W1003 13:12:01.828817 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode19e07aa_3097_4c52_990e_c2738163f946.slice/crio-cf5f8ec060ec6d5c4d7814929b032cab8686b6f17b55c8d093cec3c1ec9d56ea WatchSource:0}: Error finding container cf5f8ec060ec6d5c4d7814929b032cab8686b6f17b55c8d093cec3c1ec9d56ea: Status 404 returned error can't find the container with id cf5f8ec060ec6d5c4d7814929b032cab8686b6f17b55c8d093cec3c1ec9d56ea Oct 03 13:12:01 crc kubenswrapper[4962]: I1003 13:12:01.865006 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b4b582ce-b618-4911-b554-f5cae9bcee91-etc-swift\") pod \"swift-storage-0\" (UID: \"b4b582ce-b618-4911-b554-f5cae9bcee91\") " pod="openstack/swift-storage-0" Oct 03 13:12:01 crc kubenswrapper[4962]: E1003 13:12:01.865175 4962 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 03 13:12:01 crc kubenswrapper[4962]: E1003 13:12:01.865194 4962 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 03 13:12:01 crc kubenswrapper[4962]: E1003 13:12:01.865242 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b4b582ce-b618-4911-b554-f5cae9bcee91-etc-swift podName:b4b582ce-b618-4911-b554-f5cae9bcee91 nodeName:}" failed. No retries permitted until 2025-10-03 13:12:09.865228014 +0000 UTC m=+1338.269125849 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b4b582ce-b618-4911-b554-f5cae9bcee91-etc-swift") pod "swift-storage-0" (UID: "b4b582ce-b618-4911-b554-f5cae9bcee91") : configmap "swift-ring-files" not found Oct 03 13:12:02 crc kubenswrapper[4962]: I1003 13:12:02.262877 4962 generic.go:334] "Generic (PLEG): container finished" podID="b3133c1f-476d-440b-977a-9642e1284622" containerID="184bb0414d89e05cc91628a858197e7943eee905467d226d4d1a357a8b8a1746" exitCode=0 Oct 03 13:12:02 crc kubenswrapper[4962]: I1003 13:12:02.262980 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-hxdfm" event={"ID":"b3133c1f-476d-440b-977a-9642e1284622","Type":"ContainerDied","Data":"184bb0414d89e05cc91628a858197e7943eee905467d226d4d1a357a8b8a1746"} Oct 03 13:12:02 crc kubenswrapper[4962]: I1003 13:12:02.263836 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-hxdfm" event={"ID":"b3133c1f-476d-440b-977a-9642e1284622","Type":"ContainerStarted","Data":"268722e9027003d8909a021fa85020b31aef7dbf6bfa2388684ddf0eae9186d0"} Oct 03 13:12:02 crc kubenswrapper[4962]: I1003 13:12:02.265436 4962 generic.go:334] "Generic (PLEG): container finished" podID="e19e07aa-3097-4c52-990e-c2738163f946" containerID="a7cacdfd728f00a14188df7d8385721ed6372f29aee704ba66954115ef4f7644" exitCode=0 Oct 03 13:12:02 crc kubenswrapper[4962]: I1003 13:12:02.265461 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-shsr9" event={"ID":"e19e07aa-3097-4c52-990e-c2738163f946","Type":"ContainerDied","Data":"a7cacdfd728f00a14188df7d8385721ed6372f29aee704ba66954115ef4f7644"} Oct 03 13:12:02 crc kubenswrapper[4962]: I1003 13:12:02.265497 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-shsr9" event={"ID":"e19e07aa-3097-4c52-990e-c2738163f946","Type":"ContainerStarted","Data":"cf5f8ec060ec6d5c4d7814929b032cab8686b6f17b55c8d093cec3c1ec9d56ea"} Oct 03 13:12:02 crc kubenswrapper[4962]: I1003 13:12:02.929871 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-mwsmg" Oct 03 13:12:03 crc kubenswrapper[4962]: I1003 13:12:03.248865 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-8rz95" Oct 03 13:12:03 crc kubenswrapper[4962]: I1003 13:12:03.305522 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-mwsmg"] Oct 03 13:12:03 crc kubenswrapper[4962]: I1003 13:12:03.305779 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-mwsmg" podUID="cde9ffcd-ddb9-4114-bf06-11c230347438" containerName="dnsmasq-dns" containerID="cri-o://b0ca2c2fb1a7db8699c423d903a4277ecf3c9d81a69c788a9ee1dbf7cf3d7beb" gracePeriod=10 Oct 03 13:12:03 crc kubenswrapper[4962]: I1003 13:12:03.683835 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-hxdfm" Oct 03 13:12:03 crc kubenswrapper[4962]: I1003 13:12:03.795205 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwhxt\" (UniqueName: \"kubernetes.io/projected/b3133c1f-476d-440b-977a-9642e1284622-kube-api-access-vwhxt\") pod \"b3133c1f-476d-440b-977a-9642e1284622\" (UID: \"b3133c1f-476d-440b-977a-9642e1284622\") " Oct 03 13:12:03 crc kubenswrapper[4962]: I1003 13:12:03.802883 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3133c1f-476d-440b-977a-9642e1284622-kube-api-access-vwhxt" (OuterVolumeSpecName: "kube-api-access-vwhxt") pod "b3133c1f-476d-440b-977a-9642e1284622" (UID: "b3133c1f-476d-440b-977a-9642e1284622"). InnerVolumeSpecName "kube-api-access-vwhxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:12:03 crc kubenswrapper[4962]: I1003 13:12:03.860550 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-shsr9" Oct 03 13:12:03 crc kubenswrapper[4962]: I1003 13:12:03.868329 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-mwsmg" Oct 03 13:12:03 crc kubenswrapper[4962]: I1003 13:12:03.897889 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwhxt\" (UniqueName: \"kubernetes.io/projected/b3133c1f-476d-440b-977a-9642e1284622-kube-api-access-vwhxt\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:03 crc kubenswrapper[4962]: I1003 13:12:03.998917 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgw2k\" (UniqueName: \"kubernetes.io/projected/e19e07aa-3097-4c52-990e-c2738163f946-kube-api-access-mgw2k\") pod \"e19e07aa-3097-4c52-990e-c2738163f946\" (UID: \"e19e07aa-3097-4c52-990e-c2738163f946\") " Oct 03 13:12:03 crc kubenswrapper[4962]: I1003 13:12:03.999001 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cde9ffcd-ddb9-4114-bf06-11c230347438-ovsdbserver-sb\") pod \"cde9ffcd-ddb9-4114-bf06-11c230347438\" (UID: \"cde9ffcd-ddb9-4114-bf06-11c230347438\") " Oct 03 13:12:03 crc kubenswrapper[4962]: I1003 13:12:03.999096 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dphb\" (UniqueName: \"kubernetes.io/projected/cde9ffcd-ddb9-4114-bf06-11c230347438-kube-api-access-9dphb\") pod \"cde9ffcd-ddb9-4114-bf06-11c230347438\" (UID: \"cde9ffcd-ddb9-4114-bf06-11c230347438\") " Oct 03 13:12:03 crc kubenswrapper[4962]: I1003 13:12:03.999121 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cde9ffcd-ddb9-4114-bf06-11c230347438-config\") pod \"cde9ffcd-ddb9-4114-bf06-11c230347438\" (UID: \"cde9ffcd-ddb9-4114-bf06-11c230347438\") " Oct 03 13:12:03 crc kubenswrapper[4962]: I1003 13:12:03.999156 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cde9ffcd-ddb9-4114-bf06-11c230347438-ovsdbserver-nb\") pod \"cde9ffcd-ddb9-4114-bf06-11c230347438\" (UID: \"cde9ffcd-ddb9-4114-bf06-11c230347438\") " Oct 03 13:12:03 crc kubenswrapper[4962]: I1003 13:12:03.999196 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/cde9ffcd-ddb9-4114-bf06-11c230347438-dns-svc\") pod \"cde9ffcd-ddb9-4114-bf06-11c230347438\" (UID: \"cde9ffcd-ddb9-4114-bf06-11c230347438\") " Oct 03 13:12:04 crc kubenswrapper[4962]: I1003 13:12:04.005901 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e19e07aa-3097-4c52-990e-c2738163f946-kube-api-access-mgw2k" (OuterVolumeSpecName: "kube-api-access-mgw2k") pod "e19e07aa-3097-4c52-990e-c2738163f946" (UID: "e19e07aa-3097-4c52-990e-c2738163f946"). InnerVolumeSpecName "kube-api-access-mgw2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:12:04 crc kubenswrapper[4962]: I1003 13:12:04.005968 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cde9ffcd-ddb9-4114-bf06-11c230347438-kube-api-access-9dphb" (OuterVolumeSpecName: "kube-api-access-9dphb") pod "cde9ffcd-ddb9-4114-bf06-11c230347438" (UID: "cde9ffcd-ddb9-4114-bf06-11c230347438"). InnerVolumeSpecName "kube-api-access-9dphb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:12:04 crc kubenswrapper[4962]: I1003 13:12:04.036247 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cde9ffcd-ddb9-4114-bf06-11c230347438-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cde9ffcd-ddb9-4114-bf06-11c230347438" (UID: "cde9ffcd-ddb9-4114-bf06-11c230347438"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:12:04 crc kubenswrapper[4962]: I1003 13:12:04.037343 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cde9ffcd-ddb9-4114-bf06-11c230347438-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cde9ffcd-ddb9-4114-bf06-11c230347438" (UID: "cde9ffcd-ddb9-4114-bf06-11c230347438"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:12:04 crc kubenswrapper[4962]: I1003 13:12:04.037615 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cde9ffcd-ddb9-4114-bf06-11c230347438-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cde9ffcd-ddb9-4114-bf06-11c230347438" (UID: "cde9ffcd-ddb9-4114-bf06-11c230347438"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:12:04 crc kubenswrapper[4962]: I1003 13:12:04.043569 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cde9ffcd-ddb9-4114-bf06-11c230347438-config" (OuterVolumeSpecName: "config") pod "cde9ffcd-ddb9-4114-bf06-11c230347438" (UID: "cde9ffcd-ddb9-4114-bf06-11c230347438"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:12:04 crc kubenswrapper[4962]: I1003 13:12:04.101328 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgw2k\" (UniqueName: \"kubernetes.io/projected/e19e07aa-3097-4c52-990e-c2738163f946-kube-api-access-mgw2k\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:04 crc kubenswrapper[4962]: I1003 13:12:04.101367 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cde9ffcd-ddb9-4114-bf06-11c230347438-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:04 crc kubenswrapper[4962]: I1003 13:12:04.101379 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dphb\" (UniqueName: \"kubernetes.io/projected/cde9ffcd-ddb9-4114-bf06-11c230347438-kube-api-access-9dphb\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:04 crc kubenswrapper[4962]: I1003 13:12:04.101392 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cde9ffcd-ddb9-4114-bf06-11c230347438-config\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:04 crc kubenswrapper[4962]: I1003 13:12:04.101403 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cde9ffcd-ddb9-4114-bf06-11c230347438-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:04 crc kubenswrapper[4962]: I1003 13:12:04.101413 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cde9ffcd-ddb9-4114-bf06-11c230347438-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:04 crc kubenswrapper[4962]: I1003 13:12:04.280347 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-hxdfm" event={"ID":"b3133c1f-476d-440b-977a-9642e1284622","Type":"ContainerDied","Data":"268722e9027003d8909a021fa85020b31aef7dbf6bfa2388684ddf0eae9186d0"} Oct 03 13:12:04 crc kubenswrapper[4962]: I1003 13:12:04.280743 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="268722e9027003d8909a021fa85020b31aef7dbf6bfa2388684ddf0eae9186d0" Oct 03 13:12:04 crc kubenswrapper[4962]: I1003 13:12:04.280370 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-hxdfm" Oct 03 13:12:04 crc kubenswrapper[4962]: I1003 13:12:04.281987 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-shsr9" Oct 03 13:12:04 crc kubenswrapper[4962]: I1003 13:12:04.282362 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-shsr9" event={"ID":"e19e07aa-3097-4c52-990e-c2738163f946","Type":"ContainerDied","Data":"cf5f8ec060ec6d5c4d7814929b032cab8686b6f17b55c8d093cec3c1ec9d56ea"} Oct 03 13:12:04 crc kubenswrapper[4962]: I1003 13:12:04.282390 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf5f8ec060ec6d5c4d7814929b032cab8686b6f17b55c8d093cec3c1ec9d56ea" Oct 03 13:12:04 crc kubenswrapper[4962]: I1003 13:12:04.284502 4962 generic.go:334] "Generic (PLEG): container finished" podID="cde9ffcd-ddb9-4114-bf06-11c230347438" containerID="b0ca2c2fb1a7db8699c423d903a4277ecf3c9d81a69c788a9ee1dbf7cf3d7beb" exitCode=0 Oct 03 13:12:04 crc kubenswrapper[4962]: I1003 13:12:04.284537 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-mwsmg" event={"ID":"cde9ffcd-ddb9-4114-bf06-11c230347438","Type":"ContainerDied","Data":"b0ca2c2fb1a7db8699c423d903a4277ecf3c9d81a69c788a9ee1dbf7cf3d7beb"} Oct 03 13:12:04 crc kubenswrapper[4962]: I1003 13:12:04.284558 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-mwsmg" event={"ID":"cde9ffcd-ddb9-4114-bf06-11c230347438","Type":"ContainerDied","Data":"bcc8dc7b753a565c977ea5497aef21bf93e9f2d5fc23000366773f866cc6e1db"} Oct 03 13:12:04 crc kubenswrapper[4962]: I1003 13:12:04.284565 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-mwsmg" Oct 03 13:12:04 crc kubenswrapper[4962]: I1003 13:12:04.284580 4962 scope.go:117] "RemoveContainer" containerID="b0ca2c2fb1a7db8699c423d903a4277ecf3c9d81a69c788a9ee1dbf7cf3d7beb" Oct 03 13:12:04 crc kubenswrapper[4962]: I1003 13:12:04.309290 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-mwsmg"] Oct 03 13:12:04 crc kubenswrapper[4962]: I1003 13:12:04.309535 4962 scope.go:117] "RemoveContainer" containerID="1dd2942921fc6aea50b932d53c1897299f60435199561a2db1c10e41d887b98b" Oct 03 13:12:04 crc kubenswrapper[4962]: I1003 13:12:04.326171 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-mwsmg"] Oct 03 13:12:04 crc kubenswrapper[4962]: I1003 13:12:04.345439 4962 scope.go:117] "RemoveContainer" containerID="b0ca2c2fb1a7db8699c423d903a4277ecf3c9d81a69c788a9ee1dbf7cf3d7beb" Oct 03 13:12:04 crc kubenswrapper[4962]: E1003 13:12:04.348887 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0ca2c2fb1a7db8699c423d903a4277ecf3c9d81a69c788a9ee1dbf7cf3d7beb\": container with ID starting with b0ca2c2fb1a7db8699c423d903a4277ecf3c9d81a69c788a9ee1dbf7cf3d7beb not found: ID does not exist" containerID="b0ca2c2fb1a7db8699c423d903a4277ecf3c9d81a69c788a9ee1dbf7cf3d7beb" Oct 03 13:12:04 crc kubenswrapper[4962]: I1003 13:12:04.348927 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0ca2c2fb1a7db8699c423d903a4277ecf3c9d81a69c788a9ee1dbf7cf3d7beb"} err="failed to get container status \"b0ca2c2fb1a7db8699c423d903a4277ecf3c9d81a69c788a9ee1dbf7cf3d7beb\": rpc error: code = NotFound desc = could not find container \"b0ca2c2fb1a7db8699c423d903a4277ecf3c9d81a69c788a9ee1dbf7cf3d7beb\": container with ID starting with 
Oct 03 13:12:04 crc kubenswrapper[4962]: I1003 13:12:04.348950 4962 scope.go:117] "RemoveContainer" containerID="1dd2942921fc6aea50b932d53c1897299f60435199561a2db1c10e41d887b98b"
Oct 03 13:12:04 crc kubenswrapper[4962]: E1003 13:12:04.349395 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1dd2942921fc6aea50b932d53c1897299f60435199561a2db1c10e41d887b98b\": container with ID starting with 1dd2942921fc6aea50b932d53c1897299f60435199561a2db1c10e41d887b98b not found: ID does not exist" containerID="1dd2942921fc6aea50b932d53c1897299f60435199561a2db1c10e41d887b98b"
Oct 03 13:12:04 crc kubenswrapper[4962]: I1003 13:12:04.349426 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dd2942921fc6aea50b932d53c1897299f60435199561a2db1c10e41d887b98b"} err="failed to get container status \"1dd2942921fc6aea50b932d53c1897299f60435199561a2db1c10e41d887b98b\": rpc error: code = NotFound desc = could not find container \"1dd2942921fc6aea50b932d53c1897299f60435199561a2db1c10e41d887b98b\": container with ID starting with 1dd2942921fc6aea50b932d53c1897299f60435199561a2db1c10e41d887b98b not found: ID does not exist"
Oct 03 13:12:06 crc kubenswrapper[4962]: I1003 13:12:06.246740 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cde9ffcd-ddb9-4114-bf06-11c230347438" path="/var/lib/kubelet/pods/cde9ffcd-ddb9-4114-bf06-11c230347438/volumes"
Oct 03 13:12:06 crc kubenswrapper[4962]: I1003 13:12:06.254607 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-htqpm"]
Oct 03 13:12:06 crc kubenswrapper[4962]: E1003 13:12:06.255223 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cde9ffcd-ddb9-4114-bf06-11c230347438" containerName="init"
Oct 03 13:12:06 crc kubenswrapper[4962]: I1003 13:12:06.255247 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="cde9ffcd-ddb9-4114-bf06-11c230347438" containerName="init"
Oct 03 13:12:06 crc kubenswrapper[4962]: E1003 13:12:06.255269 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e19e07aa-3097-4c52-990e-c2738163f946" containerName="mariadb-database-create"
Oct 03 13:12:06 crc kubenswrapper[4962]: I1003 13:12:06.255279 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="e19e07aa-3097-4c52-990e-c2738163f946" containerName="mariadb-database-create"
Oct 03 13:12:06 crc kubenswrapper[4962]: E1003 13:12:06.255298 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3133c1f-476d-440b-977a-9642e1284622" containerName="mariadb-database-create"
Oct 03 13:12:06 crc kubenswrapper[4962]: I1003 13:12:06.255306 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3133c1f-476d-440b-977a-9642e1284622" containerName="mariadb-database-create"
Oct 03 13:12:06 crc kubenswrapper[4962]: E1003 13:12:06.255329 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cde9ffcd-ddb9-4114-bf06-11c230347438" containerName="dnsmasq-dns"
Oct 03 13:12:06 crc kubenswrapper[4962]: I1003 13:12:06.255336 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="cde9ffcd-ddb9-4114-bf06-11c230347438" containerName="dnsmasq-dns"
Oct 03 13:12:06 crc kubenswrapper[4962]: I1003 13:12:06.255552 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3133c1f-476d-440b-977a-9642e1284622" containerName="mariadb-database-create"
Oct 03 13:12:06 crc kubenswrapper[4962]: I1003 13:12:06.255576 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="e19e07aa-3097-4c52-990e-c2738163f946" containerName="mariadb-database-create"
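The paired "RemoveContainer" / "ContainerStatus from runtime service failed ... NotFound" entries above show the delete path tolerating a container that is already gone: the status lookup after deletion returns gRPC NotFound, and the kubelet logs it and moves on. A minimal Go sketch of that NotFound-tolerant pattern; removeContainer is a hypothetical stand-in for the CRI RemoveContainer RPC, and only the gRPC status handling mirrors the log:

    package main

    import (
    	"fmt"

    	"google.golang.org/grpc/codes"
    	"google.golang.org/grpc/status"
    )

    // removeContainer pretends the runtime already deleted the container.
    func removeContainer(id string) error {
    	return status.Errorf(codes.NotFound, "could not find container %q", id)
    }

    func main() {
    	id := "b0ca2c2f" // truncated for the example
    	if err := removeContainer(id); err != nil {
    		if status.Code(err) == codes.NotFound {
    			// Already gone: deletion is idempotent, so treat this as
    			// success, exactly as the DeleteContainer path just logs
    			// the error and continues.
    			fmt.Printf("container %s already removed: %v\n", id, err)
    			return
    		}
    		panic(err) // anything else is a real failure
    	}
    }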
Oct 03 13:12:06 crc kubenswrapper[4962]: I1003 13:12:06.255596 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="cde9ffcd-ddb9-4114-bf06-11c230347438" containerName="dnsmasq-dns"
Oct 03 13:12:06 crc kubenswrapper[4962]: I1003 13:12:06.257020 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-htqpm"
Oct 03 13:12:06 crc kubenswrapper[4962]: I1003 13:12:06.263985 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-htqpm"]
Oct 03 13:12:06 crc kubenswrapper[4962]: I1003 13:12:06.304970 4962 generic.go:334] "Generic (PLEG): container finished" podID="62ffa74e-e3cd-4a30-8197-12bd37d64a65" containerID="42eff25118b1d9333ccb5e1e1095736fe7ad68812468237167cf44bfffda55aa" exitCode=0
Oct 03 13:12:06 crc kubenswrapper[4962]: I1003 13:12:06.305011 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-hrx4m" event={"ID":"62ffa74e-e3cd-4a30-8197-12bd37d64a65","Type":"ContainerDied","Data":"42eff25118b1d9333ccb5e1e1095736fe7ad68812468237167cf44bfffda55aa"}
Oct 03 13:12:06 crc kubenswrapper[4962]: I1003 13:12:06.350461 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8lwg\" (UniqueName: \"kubernetes.io/projected/836b2b81-0dc8-4336-b9a8-2f52e8c5d4f5-kube-api-access-b8lwg\") pod \"glance-db-create-htqpm\" (UID: \"836b2b81-0dc8-4336-b9a8-2f52e8c5d4f5\") " pod="openstack/glance-db-create-htqpm"
Oct 03 13:12:06 crc kubenswrapper[4962]: I1003 13:12:06.451749 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8lwg\" (UniqueName: \"kubernetes.io/projected/836b2b81-0dc8-4336-b9a8-2f52e8c5d4f5-kube-api-access-b8lwg\") pod \"glance-db-create-htqpm\" (UID: \"836b2b81-0dc8-4336-b9a8-2f52e8c5d4f5\") " pod="openstack/glance-db-create-htqpm"
Oct 03 13:12:06 crc kubenswrapper[4962]: I1003 13:12:06.471118 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8lwg\" (UniqueName: \"kubernetes.io/projected/836b2b81-0dc8-4336-b9a8-2f52e8c5d4f5-kube-api-access-b8lwg\") pod \"glance-db-create-htqpm\" (UID: \"836b2b81-0dc8-4336-b9a8-2f52e8c5d4f5\") " pod="openstack/glance-db-create-htqpm"
Oct 03 13:12:06 crc kubenswrapper[4962]: I1003 13:12:06.577351 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-htqpm"
Need to start a new one" pod="openstack/glance-db-create-htqpm" Oct 03 13:12:06 crc kubenswrapper[4962]: I1003 13:12:06.999007 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-htqpm"] Oct 03 13:12:07 crc kubenswrapper[4962]: I1003 13:12:07.316428 4962 generic.go:334] "Generic (PLEG): container finished" podID="836b2b81-0dc8-4336-b9a8-2f52e8c5d4f5" containerID="77e7361ae6d4f822868e68592560c81c211040a7da222b118b3504c71f5b93c1" exitCode=0 Oct 03 13:12:07 crc kubenswrapper[4962]: I1003 13:12:07.316916 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-htqpm" event={"ID":"836b2b81-0dc8-4336-b9a8-2f52e8c5d4f5","Type":"ContainerDied","Data":"77e7361ae6d4f822868e68592560c81c211040a7da222b118b3504c71f5b93c1"} Oct 03 13:12:07 crc kubenswrapper[4962]: I1003 13:12:07.316944 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-htqpm" event={"ID":"836b2b81-0dc8-4336-b9a8-2f52e8c5d4f5","Type":"ContainerStarted","Data":"ceb1b61c7d6d41d8a47983a3b45b2dcf64ce399c6eb7245d3385eb0f2e047426"} Oct 03 13:12:07 crc kubenswrapper[4962]: I1003 13:12:07.625593 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-hrx4m" Oct 03 13:12:07 crc kubenswrapper[4962]: I1003 13:12:07.775312 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62ffa74e-e3cd-4a30-8197-12bd37d64a65-combined-ca-bundle\") pod \"62ffa74e-e3cd-4a30-8197-12bd37d64a65\" (UID: \"62ffa74e-e3cd-4a30-8197-12bd37d64a65\") " Oct 03 13:12:07 crc kubenswrapper[4962]: I1003 13:12:07.775389 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsxqg\" (UniqueName: \"kubernetes.io/projected/62ffa74e-e3cd-4a30-8197-12bd37d64a65-kube-api-access-vsxqg\") pod \"62ffa74e-e3cd-4a30-8197-12bd37d64a65\" (UID: \"62ffa74e-e3cd-4a30-8197-12bd37d64a65\") " Oct 03 13:12:07 crc kubenswrapper[4962]: I1003 13:12:07.775437 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/62ffa74e-e3cd-4a30-8197-12bd37d64a65-etc-swift\") pod \"62ffa74e-e3cd-4a30-8197-12bd37d64a65\" (UID: \"62ffa74e-e3cd-4a30-8197-12bd37d64a65\") " Oct 03 13:12:07 crc kubenswrapper[4962]: I1003 13:12:07.775483 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/62ffa74e-e3cd-4a30-8197-12bd37d64a65-scripts\") pod \"62ffa74e-e3cd-4a30-8197-12bd37d64a65\" (UID: \"62ffa74e-e3cd-4a30-8197-12bd37d64a65\") " Oct 03 13:12:07 crc kubenswrapper[4962]: I1003 13:12:07.775543 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/62ffa74e-e3cd-4a30-8197-12bd37d64a65-swiftconf\") pod \"62ffa74e-e3cd-4a30-8197-12bd37d64a65\" (UID: \"62ffa74e-e3cd-4a30-8197-12bd37d64a65\") " Oct 03 13:12:07 crc kubenswrapper[4962]: I1003 13:12:07.775562 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/62ffa74e-e3cd-4a30-8197-12bd37d64a65-dispersionconf\") pod \"62ffa74e-e3cd-4a30-8197-12bd37d64a65\" (UID: \"62ffa74e-e3cd-4a30-8197-12bd37d64a65\") " Oct 03 13:12:07 crc kubenswrapper[4962]: I1003 13:12:07.775584 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" 
(UniqueName: \"kubernetes.io/configmap/62ffa74e-e3cd-4a30-8197-12bd37d64a65-ring-data-devices\") pod \"62ffa74e-e3cd-4a30-8197-12bd37d64a65\" (UID: \"62ffa74e-e3cd-4a30-8197-12bd37d64a65\") " Oct 03 13:12:07 crc kubenswrapper[4962]: I1003 13:12:07.776337 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62ffa74e-e3cd-4a30-8197-12bd37d64a65-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "62ffa74e-e3cd-4a30-8197-12bd37d64a65" (UID: "62ffa74e-e3cd-4a30-8197-12bd37d64a65"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:12:07 crc kubenswrapper[4962]: I1003 13:12:07.777151 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62ffa74e-e3cd-4a30-8197-12bd37d64a65-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "62ffa74e-e3cd-4a30-8197-12bd37d64a65" (UID: "62ffa74e-e3cd-4a30-8197-12bd37d64a65"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:12:07 crc kubenswrapper[4962]: I1003 13:12:07.792209 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62ffa74e-e3cd-4a30-8197-12bd37d64a65-kube-api-access-vsxqg" (OuterVolumeSpecName: "kube-api-access-vsxqg") pod "62ffa74e-e3cd-4a30-8197-12bd37d64a65" (UID: "62ffa74e-e3cd-4a30-8197-12bd37d64a65"). InnerVolumeSpecName "kube-api-access-vsxqg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:12:07 crc kubenswrapper[4962]: I1003 13:12:07.795163 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62ffa74e-e3cd-4a30-8197-12bd37d64a65-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "62ffa74e-e3cd-4a30-8197-12bd37d64a65" (UID: "62ffa74e-e3cd-4a30-8197-12bd37d64a65"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:12:07 crc kubenswrapper[4962]: I1003 13:12:07.799486 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62ffa74e-e3cd-4a30-8197-12bd37d64a65-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "62ffa74e-e3cd-4a30-8197-12bd37d64a65" (UID: "62ffa74e-e3cd-4a30-8197-12bd37d64a65"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:12:07 crc kubenswrapper[4962]: I1003 13:12:07.813407 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62ffa74e-e3cd-4a30-8197-12bd37d64a65-scripts" (OuterVolumeSpecName: "scripts") pod "62ffa74e-e3cd-4a30-8197-12bd37d64a65" (UID: "62ffa74e-e3cd-4a30-8197-12bd37d64a65"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:12:07 crc kubenswrapper[4962]: I1003 13:12:07.813928 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62ffa74e-e3cd-4a30-8197-12bd37d64a65-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "62ffa74e-e3cd-4a30-8197-12bd37d64a65" (UID: "62ffa74e-e3cd-4a30-8197-12bd37d64a65"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:12:07 crc kubenswrapper[4962]: I1003 13:12:07.878163 4962 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/62ffa74e-e3cd-4a30-8197-12bd37d64a65-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:07 crc kubenswrapper[4962]: I1003 13:12:07.878202 4962 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/62ffa74e-e3cd-4a30-8197-12bd37d64a65-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:07 crc kubenswrapper[4962]: I1003 13:12:07.878215 4962 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/62ffa74e-e3cd-4a30-8197-12bd37d64a65-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:07 crc kubenswrapper[4962]: I1003 13:12:07.878226 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62ffa74e-e3cd-4a30-8197-12bd37d64a65-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:07 crc kubenswrapper[4962]: I1003 13:12:07.878239 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsxqg\" (UniqueName: \"kubernetes.io/projected/62ffa74e-e3cd-4a30-8197-12bd37d64a65-kube-api-access-vsxqg\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:07 crc kubenswrapper[4962]: I1003 13:12:07.878250 4962 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/62ffa74e-e3cd-4a30-8197-12bd37d64a65-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:07 crc kubenswrapper[4962]: I1003 13:12:07.878260 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/62ffa74e-e3cd-4a30-8197-12bd37d64a65-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:07 crc kubenswrapper[4962]: I1003 13:12:07.958700 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 03 13:12:08 crc kubenswrapper[4962]: I1003 13:12:08.325517 4962 generic.go:334] "Generic (PLEG): container finished" podID="862ad9df-af58-4304-9ad5-7faba334e2d9" containerID="db6803eb436ec4541ae5c4083b22b4f72e755e3175f11d228ed92e5f3fa9bc0c" exitCode=0 Oct 03 13:12:08 crc kubenswrapper[4962]: I1003 13:12:08.325595 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"862ad9df-af58-4304-9ad5-7faba334e2d9","Type":"ContainerDied","Data":"db6803eb436ec4541ae5c4083b22b4f72e755e3175f11d228ed92e5f3fa9bc0c"} Oct 03 13:12:08 crc kubenswrapper[4962]: I1003 13:12:08.327806 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-hrx4m" Oct 03 13:12:08 crc kubenswrapper[4962]: I1003 13:12:08.327885 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-hrx4m" event={"ID":"62ffa74e-e3cd-4a30-8197-12bd37d64a65","Type":"ContainerDied","Data":"8f299f4ad3f2215da6c8c133db06c550efa8fdf60db1ce05fb529a97f0f64314"} Oct 03 13:12:08 crc kubenswrapper[4962]: I1003 13:12:08.327942 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f299f4ad3f2215da6c8c133db06c550efa8fdf60db1ce05fb529a97f0f64314" Oct 03 13:12:08 crc kubenswrapper[4962]: I1003 13:12:08.329766 4962 generic.go:334] "Generic (PLEG): container finished" podID="221bdd26-0fec-49e5-86ec-c2aefe7a5902" containerID="d83453301b65612ecfcc0cbeb8e61c9a2152a509e6b989048522d0b4d9e6955b" exitCode=0 Oct 03 13:12:08 crc kubenswrapper[4962]: I1003 13:12:08.329859 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"221bdd26-0fec-49e5-86ec-c2aefe7a5902","Type":"ContainerDied","Data":"d83453301b65612ecfcc0cbeb8e61c9a2152a509e6b989048522d0b4d9e6955b"} Oct 03 13:12:08 crc kubenswrapper[4962]: I1003 13:12:08.801497 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-htqpm" Oct 03 13:12:08 crc kubenswrapper[4962]: I1003 13:12:08.902118 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8lwg\" (UniqueName: \"kubernetes.io/projected/836b2b81-0dc8-4336-b9a8-2f52e8c5d4f5-kube-api-access-b8lwg\") pod \"836b2b81-0dc8-4336-b9a8-2f52e8c5d4f5\" (UID: \"836b2b81-0dc8-4336-b9a8-2f52e8c5d4f5\") " Oct 03 13:12:08 crc kubenswrapper[4962]: I1003 13:12:08.929223 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/836b2b81-0dc8-4336-b9a8-2f52e8c5d4f5-kube-api-access-b8lwg" (OuterVolumeSpecName: "kube-api-access-b8lwg") pod "836b2b81-0dc8-4336-b9a8-2f52e8c5d4f5" (UID: "836b2b81-0dc8-4336-b9a8-2f52e8c5d4f5"). InnerVolumeSpecName "kube-api-access-b8lwg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:12:09 crc kubenswrapper[4962]: I1003 13:12:09.004992 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8lwg\" (UniqueName: \"kubernetes.io/projected/836b2b81-0dc8-4336-b9a8-2f52e8c5d4f5-kube-api-access-b8lwg\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:09 crc kubenswrapper[4962]: I1003 13:12:09.340184 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"221bdd26-0fec-49e5-86ec-c2aefe7a5902","Type":"ContainerStarted","Data":"ea53a4ccfd30918132162b300d37c963556517040abcef111b31c53d24dd2493"} Oct 03 13:12:09 crc kubenswrapper[4962]: I1003 13:12:09.340416 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 03 13:12:09 crc kubenswrapper[4962]: I1003 13:12:09.343751 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-htqpm" Oct 03 13:12:09 crc kubenswrapper[4962]: I1003 13:12:09.343757 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-htqpm" event={"ID":"836b2b81-0dc8-4336-b9a8-2f52e8c5d4f5","Type":"ContainerDied","Data":"ceb1b61c7d6d41d8a47983a3b45b2dcf64ce399c6eb7245d3385eb0f2e047426"} Oct 03 13:12:09 crc kubenswrapper[4962]: I1003 13:12:09.343878 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ceb1b61c7d6d41d8a47983a3b45b2dcf64ce399c6eb7245d3385eb0f2e047426" Oct 03 13:12:09 crc kubenswrapper[4962]: I1003 13:12:09.346179 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"862ad9df-af58-4304-9ad5-7faba334e2d9","Type":"ContainerStarted","Data":"ffd50edd39ffcd28008bcc1779cfab62afd89dc665b9322ffc084495ca1c56d2"} Oct 03 13:12:09 crc kubenswrapper[4962]: I1003 13:12:09.346446 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 03 13:12:09 crc kubenswrapper[4962]: I1003 13:12:09.363452 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.842218384 podStartE2EDuration="53.363435595s" podCreationTimestamp="2025-10-03 13:11:16 +0000 UTC" firstStartedPulling="2025-10-03 13:11:18.826341922 +0000 UTC m=+1287.230239757" lastFinishedPulling="2025-10-03 13:11:34.347559133 +0000 UTC m=+1302.751456968" observedRunningTime="2025-10-03 13:12:09.362288034 +0000 UTC m=+1337.766185889" watchObservedRunningTime="2025-10-03 13:12:09.363435595 +0000 UTC m=+1337.767333430" Oct 03 13:12:09 crc kubenswrapper[4962]: I1003 13:12:09.385268 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.630809343 podStartE2EDuration="53.385251313s" podCreationTimestamp="2025-10-03 13:11:16 +0000 UTC" firstStartedPulling="2025-10-03 13:11:18.662109741 +0000 UTC m=+1287.066007576" lastFinishedPulling="2025-10-03 13:11:34.416551711 +0000 UTC m=+1302.820449546" observedRunningTime="2025-10-03 13:12:09.383955468 +0000 UTC m=+1337.787853303" watchObservedRunningTime="2025-10-03 13:12:09.385251313 +0000 UTC m=+1337.789149148" Oct 03 13:12:09 crc kubenswrapper[4962]: I1003 13:12:09.919368 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b4b582ce-b618-4911-b554-f5cae9bcee91-etc-swift\") pod \"swift-storage-0\" (UID: \"b4b582ce-b618-4911-b554-f5cae9bcee91\") " pod="openstack/swift-storage-0" Oct 03 13:12:09 crc kubenswrapper[4962]: I1003 13:12:09.928743 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b4b582ce-b618-4911-b554-f5cae9bcee91-etc-swift\") pod \"swift-storage-0\" (UID: \"b4b582ce-b618-4911-b554-f5cae9bcee91\") " pod="openstack/swift-storage-0" Oct 03 13:12:10 crc kubenswrapper[4962]: I1003 13:12:10.218139 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 03 13:12:10 crc kubenswrapper[4962]: I1003 13:12:10.788921 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 03 13:12:10 crc kubenswrapper[4962]: W1003 13:12:10.793433 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4b582ce_b618_4911_b554_f5cae9bcee91.slice/crio-90a3666d7ec243c20d6f9c6167ca1f8679d60343366a66f8a480d13c588be0eb WatchSource:0}: Error finding container 90a3666d7ec243c20d6f9c6167ca1f8679d60343366a66f8a480d13c588be0eb: Status 404 returned error can't find the container with id 90a3666d7ec243c20d6f9c6167ca1f8679d60343366a66f8a480d13c588be0eb Oct 03 13:12:10 crc kubenswrapper[4962]: I1003 13:12:10.857950 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-2734-account-create-kbjj5"] Oct 03 13:12:10 crc kubenswrapper[4962]: E1003 13:12:10.858431 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62ffa74e-e3cd-4a30-8197-12bd37d64a65" containerName="swift-ring-rebalance" Oct 03 13:12:10 crc kubenswrapper[4962]: I1003 13:12:10.858452 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="62ffa74e-e3cd-4a30-8197-12bd37d64a65" containerName="swift-ring-rebalance" Oct 03 13:12:10 crc kubenswrapper[4962]: E1003 13:12:10.858490 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="836b2b81-0dc8-4336-b9a8-2f52e8c5d4f5" containerName="mariadb-database-create" Oct 03 13:12:10 crc kubenswrapper[4962]: I1003 13:12:10.858498 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="836b2b81-0dc8-4336-b9a8-2f52e8c5d4f5" containerName="mariadb-database-create" Oct 03 13:12:10 crc kubenswrapper[4962]: I1003 13:12:10.858689 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="62ffa74e-e3cd-4a30-8197-12bd37d64a65" containerName="swift-ring-rebalance" Oct 03 13:12:10 crc kubenswrapper[4962]: I1003 13:12:10.858717 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="836b2b81-0dc8-4336-b9a8-2f52e8c5d4f5" containerName="mariadb-database-create" Oct 03 13:12:10 crc kubenswrapper[4962]: I1003 13:12:10.859441 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-2734-account-create-kbjj5" Oct 03 13:12:10 crc kubenswrapper[4962]: I1003 13:12:10.862059 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 03 13:12:10 crc kubenswrapper[4962]: I1003 13:12:10.868926 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-2734-account-create-kbjj5"] Oct 03 13:12:10 crc kubenswrapper[4962]: I1003 13:12:10.936183 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zwd5\" (UniqueName: \"kubernetes.io/projected/6bb7b0d2-9599-4942-90e3-5bfb712724e0-kube-api-access-2zwd5\") pod \"keystone-2734-account-create-kbjj5\" (UID: \"6bb7b0d2-9599-4942-90e3-5bfb712724e0\") " pod="openstack/keystone-2734-account-create-kbjj5" Oct 03 13:12:11 crc kubenswrapper[4962]: I1003 13:12:11.038394 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zwd5\" (UniqueName: \"kubernetes.io/projected/6bb7b0d2-9599-4942-90e3-5bfb712724e0-kube-api-access-2zwd5\") pod \"keystone-2734-account-create-kbjj5\" (UID: \"6bb7b0d2-9599-4942-90e3-5bfb712724e0\") " pod="openstack/keystone-2734-account-create-kbjj5" Oct 03 13:12:11 crc kubenswrapper[4962]: I1003 13:12:11.048577 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-453e-account-create-krqfz"] Oct 03 13:12:11 crc kubenswrapper[4962]: I1003 13:12:11.049564 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-453e-account-create-krqfz" Oct 03 13:12:11 crc kubenswrapper[4962]: I1003 13:12:11.051390 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 03 13:12:11 crc kubenswrapper[4962]: I1003 13:12:11.060175 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-453e-account-create-krqfz"] Oct 03 13:12:11 crc kubenswrapper[4962]: I1003 13:12:11.078070 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zwd5\" (UniqueName: \"kubernetes.io/projected/6bb7b0d2-9599-4942-90e3-5bfb712724e0-kube-api-access-2zwd5\") pod \"keystone-2734-account-create-kbjj5\" (UID: \"6bb7b0d2-9599-4942-90e3-5bfb712724e0\") " pod="openstack/keystone-2734-account-create-kbjj5" Oct 03 13:12:11 crc kubenswrapper[4962]: I1003 13:12:11.139626 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6cnh\" (UniqueName: \"kubernetes.io/projected/8d28696d-ae2e-427d-b247-6ef22dde492f-kube-api-access-w6cnh\") pod \"placement-453e-account-create-krqfz\" (UID: \"8d28696d-ae2e-427d-b247-6ef22dde492f\") " pod="openstack/placement-453e-account-create-krqfz" Oct 03 13:12:11 crc kubenswrapper[4962]: I1003 13:12:11.179550 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-2734-account-create-kbjj5" Oct 03 13:12:11 crc kubenswrapper[4962]: I1003 13:12:11.241292 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6cnh\" (UniqueName: \"kubernetes.io/projected/8d28696d-ae2e-427d-b247-6ef22dde492f-kube-api-access-w6cnh\") pod \"placement-453e-account-create-krqfz\" (UID: \"8d28696d-ae2e-427d-b247-6ef22dde492f\") " pod="openstack/placement-453e-account-create-krqfz" Oct 03 13:12:11 crc kubenswrapper[4962]: I1003 13:12:11.263415 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6cnh\" (UniqueName: \"kubernetes.io/projected/8d28696d-ae2e-427d-b247-6ef22dde492f-kube-api-access-w6cnh\") pod \"placement-453e-account-create-krqfz\" (UID: \"8d28696d-ae2e-427d-b247-6ef22dde492f\") " pod="openstack/placement-453e-account-create-krqfz" Oct 03 13:12:11 crc kubenswrapper[4962]: I1003 13:12:11.364873 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-453e-account-create-krqfz" Oct 03 13:12:11 crc kubenswrapper[4962]: I1003 13:12:11.369525 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b4b582ce-b618-4911-b554-f5cae9bcee91","Type":"ContainerStarted","Data":"90a3666d7ec243c20d6f9c6167ca1f8679d60343366a66f8a480d13c588be0eb"} Oct 03 13:12:11 crc kubenswrapper[4962]: I1003 13:12:11.647117 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-2734-account-create-kbjj5"] Oct 03 13:12:11 crc kubenswrapper[4962]: W1003 13:12:11.650019 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6bb7b0d2_9599_4942_90e3_5bfb712724e0.slice/crio-13bd502929eec0c8beeb4d072b34c02e0c1df71ea61427b9d11a50243496feb0 WatchSource:0}: Error finding container 13bd502929eec0c8beeb4d072b34c02e0c1df71ea61427b9d11a50243496feb0: Status 404 returned error can't find the container with id 13bd502929eec0c8beeb4d072b34c02e0c1df71ea61427b9d11a50243496feb0 Oct 03 13:12:11 crc kubenswrapper[4962]: I1003 13:12:11.788614 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-453e-account-create-krqfz"] Oct 03 13:12:11 crc kubenswrapper[4962]: W1003 13:12:11.794939 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d28696d_ae2e_427d_b247_6ef22dde492f.slice/crio-bfab56ed59990f218835bdc1f7f0a6318bbaaccfa7a79ad43d9d8c8664edab0b WatchSource:0}: Error finding container bfab56ed59990f218835bdc1f7f0a6318bbaaccfa7a79ad43d9d8c8664edab0b: Status 404 returned error can't find the container with id bfab56ed59990f218835bdc1f7f0a6318bbaaccfa7a79ad43d9d8c8664edab0b Oct 03 13:12:12 crc kubenswrapper[4962]: I1003 13:12:12.378033 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-453e-account-create-krqfz" event={"ID":"8d28696d-ae2e-427d-b247-6ef22dde492f","Type":"ContainerStarted","Data":"bfab56ed59990f218835bdc1f7f0a6318bbaaccfa7a79ad43d9d8c8664edab0b"} Oct 03 13:12:12 crc kubenswrapper[4962]: I1003 13:12:12.379169 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2734-account-create-kbjj5" event={"ID":"6bb7b0d2-9599-4942-90e3-5bfb712724e0","Type":"ContainerStarted","Data":"13bd502929eec0c8beeb4d072b34c02e0c1df71ea61427b9d11a50243496feb0"} Oct 03 13:12:13 crc kubenswrapper[4962]: I1003 13:12:13.390316 4962 generic.go:334] "Generic 
(PLEG): container finished" podID="8d28696d-ae2e-427d-b247-6ef22dde492f" containerID="a12b255b4149f607b5d977552bdbebdcd66a9c203b6d443741a1d7ceceff6ad5" exitCode=0 Oct 03 13:12:13 crc kubenswrapper[4962]: I1003 13:12:13.390409 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-453e-account-create-krqfz" event={"ID":"8d28696d-ae2e-427d-b247-6ef22dde492f","Type":"ContainerDied","Data":"a12b255b4149f607b5d977552bdbebdcd66a9c203b6d443741a1d7ceceff6ad5"} Oct 03 13:12:13 crc kubenswrapper[4962]: I1003 13:12:13.392922 4962 generic.go:334] "Generic (PLEG): container finished" podID="6bb7b0d2-9599-4942-90e3-5bfb712724e0" containerID="a74470166961669f8dea3b77b512b7b3a308a86536fe0161bd350804b30907c1" exitCode=0 Oct 03 13:12:13 crc kubenswrapper[4962]: I1003 13:12:13.392965 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2734-account-create-kbjj5" event={"ID":"6bb7b0d2-9599-4942-90e3-5bfb712724e0","Type":"ContainerDied","Data":"a74470166961669f8dea3b77b512b7b3a308a86536fe0161bd350804b30907c1"} Oct 03 13:12:14 crc kubenswrapper[4962]: I1003 13:12:14.404071 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b4b582ce-b618-4911-b554-f5cae9bcee91","Type":"ContainerStarted","Data":"5674271fdefce19a13c9b336c52d013de2b33a9a9124fca33358f8e3a0cf5881"} Oct 03 13:12:14 crc kubenswrapper[4962]: I1003 13:12:14.404466 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b4b582ce-b618-4911-b554-f5cae9bcee91","Type":"ContainerStarted","Data":"770e06f348aaf3989bf45ae8703e1cff216acdce48c3de88da4323e4ade168ff"} Oct 03 13:12:14 crc kubenswrapper[4962]: I1003 13:12:14.404491 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b4b582ce-b618-4911-b554-f5cae9bcee91","Type":"ContainerStarted","Data":"33714c39a9c3c16769f323fb660866cd0b5a9c6bf72751670fa0465d513c70cb"} Oct 03 13:12:14 crc kubenswrapper[4962]: I1003 13:12:14.404510 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b4b582ce-b618-4911-b554-f5cae9bcee91","Type":"ContainerStarted","Data":"1953908fb8f3a3d9cd983f3f51df79d091442b3b2351b35d0c858fe9e4a4b278"} Oct 03 13:12:14 crc kubenswrapper[4962]: I1003 13:12:14.868107 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-2734-account-create-kbjj5" Oct 03 13:12:14 crc kubenswrapper[4962]: I1003 13:12:14.875557 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-453e-account-create-krqfz" Oct 03 13:12:14 crc kubenswrapper[4962]: I1003 13:12:14.904472 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zwd5\" (UniqueName: \"kubernetes.io/projected/6bb7b0d2-9599-4942-90e3-5bfb712724e0-kube-api-access-2zwd5\") pod \"6bb7b0d2-9599-4942-90e3-5bfb712724e0\" (UID: \"6bb7b0d2-9599-4942-90e3-5bfb712724e0\") " Oct 03 13:12:14 crc kubenswrapper[4962]: I1003 13:12:14.926866 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bb7b0d2-9599-4942-90e3-5bfb712724e0-kube-api-access-2zwd5" (OuterVolumeSpecName: "kube-api-access-2zwd5") pod "6bb7b0d2-9599-4942-90e3-5bfb712724e0" (UID: "6bb7b0d2-9599-4942-90e3-5bfb712724e0"). InnerVolumeSpecName "kube-api-access-2zwd5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:12:15 crc kubenswrapper[4962]: I1003 13:12:15.005922 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6cnh\" (UniqueName: \"kubernetes.io/projected/8d28696d-ae2e-427d-b247-6ef22dde492f-kube-api-access-w6cnh\") pod \"8d28696d-ae2e-427d-b247-6ef22dde492f\" (UID: \"8d28696d-ae2e-427d-b247-6ef22dde492f\") " Oct 03 13:12:15 crc kubenswrapper[4962]: I1003 13:12:15.006383 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zwd5\" (UniqueName: \"kubernetes.io/projected/6bb7b0d2-9599-4942-90e3-5bfb712724e0-kube-api-access-2zwd5\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:15 crc kubenswrapper[4962]: I1003 13:12:15.009972 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d28696d-ae2e-427d-b247-6ef22dde492f-kube-api-access-w6cnh" (OuterVolumeSpecName: "kube-api-access-w6cnh") pod "8d28696d-ae2e-427d-b247-6ef22dde492f" (UID: "8d28696d-ae2e-427d-b247-6ef22dde492f"). InnerVolumeSpecName "kube-api-access-w6cnh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:12:15 crc kubenswrapper[4962]: I1003 13:12:15.107885 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6cnh\" (UniqueName: \"kubernetes.io/projected/8d28696d-ae2e-427d-b247-6ef22dde492f-kube-api-access-w6cnh\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:15 crc kubenswrapper[4962]: I1003 13:12:15.413439 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-453e-account-create-krqfz" event={"ID":"8d28696d-ae2e-427d-b247-6ef22dde492f","Type":"ContainerDied","Data":"bfab56ed59990f218835bdc1f7f0a6318bbaaccfa7a79ad43d9d8c8664edab0b"} Oct 03 13:12:15 crc kubenswrapper[4962]: I1003 13:12:15.413475 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfab56ed59990f218835bdc1f7f0a6318bbaaccfa7a79ad43d9d8c8664edab0b" Oct 03 13:12:15 crc kubenswrapper[4962]: I1003 13:12:15.413530 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-453e-account-create-krqfz" Oct 03 13:12:15 crc kubenswrapper[4962]: I1003 13:12:15.423510 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b4b582ce-b618-4911-b554-f5cae9bcee91","Type":"ContainerStarted","Data":"0027d40b3fd7f4cac601a15c7999e155ece2e4687617a83b85e72dd63015f85e"} Oct 03 13:12:15 crc kubenswrapper[4962]: I1003 13:12:15.427031 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2734-account-create-kbjj5" event={"ID":"6bb7b0d2-9599-4942-90e3-5bfb712724e0","Type":"ContainerDied","Data":"13bd502929eec0c8beeb4d072b34c02e0c1df71ea61427b9d11a50243496feb0"} Oct 03 13:12:15 crc kubenswrapper[4962]: I1003 13:12:15.427055 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13bd502929eec0c8beeb4d072b34c02e0c1df71ea61427b9d11a50243496feb0" Oct 03 13:12:15 crc kubenswrapper[4962]: I1003 13:12:15.427104 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-2734-account-create-kbjj5" Oct 03 13:12:16 crc kubenswrapper[4962]: I1003 13:12:16.403039 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-6sqdm" podUID="6d6f62dd-0720-46b6-b0a8-497490f052a8" containerName="ovn-controller" probeResult="failure" output=< Oct 03 13:12:16 crc kubenswrapper[4962]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 03 13:12:16 crc kubenswrapper[4962]: > Oct 03 13:12:16 crc kubenswrapper[4962]: I1003 13:12:16.440624 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-wvjpm" Oct 03 13:12:16 crc kubenswrapper[4962]: I1003 13:12:16.443009 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-2f72-account-create-zwf4b"] Oct 03 13:12:16 crc kubenswrapper[4962]: E1003 13:12:16.443389 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d28696d-ae2e-427d-b247-6ef22dde492f" containerName="mariadb-account-create" Oct 03 13:12:16 crc kubenswrapper[4962]: I1003 13:12:16.443405 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d28696d-ae2e-427d-b247-6ef22dde492f" containerName="mariadb-account-create" Oct 03 13:12:16 crc kubenswrapper[4962]: E1003 13:12:16.443418 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bb7b0d2-9599-4942-90e3-5bfb712724e0" containerName="mariadb-account-create" Oct 03 13:12:16 crc kubenswrapper[4962]: I1003 13:12:16.443424 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bb7b0d2-9599-4942-90e3-5bfb712724e0" containerName="mariadb-account-create" Oct 03 13:12:16 crc kubenswrapper[4962]: I1003 13:12:16.443609 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bb7b0d2-9599-4942-90e3-5bfb712724e0" containerName="mariadb-account-create" Oct 03 13:12:16 crc kubenswrapper[4962]: I1003 13:12:16.443676 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d28696d-ae2e-427d-b247-6ef22dde492f" containerName="mariadb-account-create" Oct 03 13:12:16 crc kubenswrapper[4962]: I1003 13:12:16.444323 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b4b582ce-b618-4911-b554-f5cae9bcee91","Type":"ContainerStarted","Data":"bccc65019a49e86470400d5863a1f0f3a4c53c7dead1edd7e4226173d7443ed2"} Oct 03 13:12:16 crc kubenswrapper[4962]: I1003 13:12:16.444355 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b4b582ce-b618-4911-b554-f5cae9bcee91","Type":"ContainerStarted","Data":"7ce66520fa57254d5157448844d739ec610586be59fb60789632b9b85bd02222"} Oct 03 13:12:16 crc kubenswrapper[4962]: I1003 13:12:16.444369 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b4b582ce-b618-4911-b554-f5cae9bcee91","Type":"ContainerStarted","Data":"d07a3c21e9f7cc962ded5767c572003a88953fe191cf331895a5e9e48103288b"} Oct 03 13:12:16 crc kubenswrapper[4962]: I1003 13:12:16.444453 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-2f72-account-create-zwf4b" Oct 03 13:12:16 crc kubenswrapper[4962]: I1003 13:12:16.448920 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 03 13:12:16 crc kubenswrapper[4962]: I1003 13:12:16.452516 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-2f72-account-create-zwf4b"] Oct 03 13:12:16 crc kubenswrapper[4962]: I1003 13:12:16.487543 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-wvjpm" Oct 03 13:12:16 crc kubenswrapper[4962]: I1003 13:12:16.527512 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgxzh\" (UniqueName: \"kubernetes.io/projected/1f4639da-c3a1-47c8-baf4-155f6d7fdc5c-kube-api-access-kgxzh\") pod \"glance-2f72-account-create-zwf4b\" (UID: \"1f4639da-c3a1-47c8-baf4-155f6d7fdc5c\") " pod="openstack/glance-2f72-account-create-zwf4b" Oct 03 13:12:16 crc kubenswrapper[4962]: I1003 13:12:16.629475 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgxzh\" (UniqueName: \"kubernetes.io/projected/1f4639da-c3a1-47c8-baf4-155f6d7fdc5c-kube-api-access-kgxzh\") pod \"glance-2f72-account-create-zwf4b\" (UID: \"1f4639da-c3a1-47c8-baf4-155f6d7fdc5c\") " pod="openstack/glance-2f72-account-create-zwf4b" Oct 03 13:12:16 crc kubenswrapper[4962]: I1003 13:12:16.649933 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgxzh\" (UniqueName: \"kubernetes.io/projected/1f4639da-c3a1-47c8-baf4-155f6d7fdc5c-kube-api-access-kgxzh\") pod \"glance-2f72-account-create-zwf4b\" (UID: \"1f4639da-c3a1-47c8-baf4-155f6d7fdc5c\") " pod="openstack/glance-2f72-account-create-zwf4b" Oct 03 13:12:16 crc kubenswrapper[4962]: I1003 13:12:16.694174 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-6sqdm-config-gzk6r"] Oct 03 13:12:16 crc kubenswrapper[4962]: I1003 13:12:16.695460 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-6sqdm-config-gzk6r" Oct 03 13:12:16 crc kubenswrapper[4962]: I1003 13:12:16.698976 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 03 13:12:16 crc kubenswrapper[4962]: I1003 13:12:16.719036 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6sqdm-config-gzk6r"] Oct 03 13:12:16 crc kubenswrapper[4962]: I1003 13:12:16.734181 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6797210a-15f7-4161-8af1-d66010ad3057-additional-scripts\") pod \"ovn-controller-6sqdm-config-gzk6r\" (UID: \"6797210a-15f7-4161-8af1-d66010ad3057\") " pod="openstack/ovn-controller-6sqdm-config-gzk6r" Oct 03 13:12:16 crc kubenswrapper[4962]: I1003 13:12:16.734259 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ndp7\" (UniqueName: \"kubernetes.io/projected/6797210a-15f7-4161-8af1-d66010ad3057-kube-api-access-9ndp7\") pod \"ovn-controller-6sqdm-config-gzk6r\" (UID: \"6797210a-15f7-4161-8af1-d66010ad3057\") " pod="openstack/ovn-controller-6sqdm-config-gzk6r" Oct 03 13:12:16 crc kubenswrapper[4962]: I1003 13:12:16.734305 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6797210a-15f7-4161-8af1-d66010ad3057-scripts\") pod \"ovn-controller-6sqdm-config-gzk6r\" (UID: \"6797210a-15f7-4161-8af1-d66010ad3057\") " pod="openstack/ovn-controller-6sqdm-config-gzk6r" Oct 03 13:12:16 crc kubenswrapper[4962]: I1003 13:12:16.734352 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6797210a-15f7-4161-8af1-d66010ad3057-var-run\") pod \"ovn-controller-6sqdm-config-gzk6r\" (UID: \"6797210a-15f7-4161-8af1-d66010ad3057\") " pod="openstack/ovn-controller-6sqdm-config-gzk6r" Oct 03 13:12:16 crc kubenswrapper[4962]: I1003 13:12:16.734387 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6797210a-15f7-4161-8af1-d66010ad3057-var-log-ovn\") pod \"ovn-controller-6sqdm-config-gzk6r\" (UID: \"6797210a-15f7-4161-8af1-d66010ad3057\") " pod="openstack/ovn-controller-6sqdm-config-gzk6r" Oct 03 13:12:16 crc kubenswrapper[4962]: I1003 13:12:16.734409 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6797210a-15f7-4161-8af1-d66010ad3057-var-run-ovn\") pod \"ovn-controller-6sqdm-config-gzk6r\" (UID: \"6797210a-15f7-4161-8af1-d66010ad3057\") " pod="openstack/ovn-controller-6sqdm-config-gzk6r" Oct 03 13:12:16 crc kubenswrapper[4962]: I1003 13:12:16.802988 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-2f72-account-create-zwf4b" Oct 03 13:12:16 crc kubenswrapper[4962]: I1003 13:12:16.835918 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6797210a-15f7-4161-8af1-d66010ad3057-scripts\") pod \"ovn-controller-6sqdm-config-gzk6r\" (UID: \"6797210a-15f7-4161-8af1-d66010ad3057\") " pod="openstack/ovn-controller-6sqdm-config-gzk6r" Oct 03 13:12:16 crc kubenswrapper[4962]: I1003 13:12:16.835995 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6797210a-15f7-4161-8af1-d66010ad3057-var-run\") pod \"ovn-controller-6sqdm-config-gzk6r\" (UID: \"6797210a-15f7-4161-8af1-d66010ad3057\") " pod="openstack/ovn-controller-6sqdm-config-gzk6r" Oct 03 13:12:16 crc kubenswrapper[4962]: I1003 13:12:16.836031 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6797210a-15f7-4161-8af1-d66010ad3057-var-log-ovn\") pod \"ovn-controller-6sqdm-config-gzk6r\" (UID: \"6797210a-15f7-4161-8af1-d66010ad3057\") " pod="openstack/ovn-controller-6sqdm-config-gzk6r" Oct 03 13:12:16 crc kubenswrapper[4962]: I1003 13:12:16.836056 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6797210a-15f7-4161-8af1-d66010ad3057-var-run-ovn\") pod \"ovn-controller-6sqdm-config-gzk6r\" (UID: \"6797210a-15f7-4161-8af1-d66010ad3057\") " pod="openstack/ovn-controller-6sqdm-config-gzk6r" Oct 03 13:12:16 crc kubenswrapper[4962]: I1003 13:12:16.836092 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6797210a-15f7-4161-8af1-d66010ad3057-additional-scripts\") pod \"ovn-controller-6sqdm-config-gzk6r\" (UID: \"6797210a-15f7-4161-8af1-d66010ad3057\") " pod="openstack/ovn-controller-6sqdm-config-gzk6r" Oct 03 13:12:16 crc kubenswrapper[4962]: I1003 13:12:16.836137 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ndp7\" (UniqueName: \"kubernetes.io/projected/6797210a-15f7-4161-8af1-d66010ad3057-kube-api-access-9ndp7\") pod \"ovn-controller-6sqdm-config-gzk6r\" (UID: \"6797210a-15f7-4161-8af1-d66010ad3057\") " pod="openstack/ovn-controller-6sqdm-config-gzk6r" Oct 03 13:12:16 crc kubenswrapper[4962]: I1003 13:12:16.836514 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6797210a-15f7-4161-8af1-d66010ad3057-var-log-ovn\") pod \"ovn-controller-6sqdm-config-gzk6r\" (UID: \"6797210a-15f7-4161-8af1-d66010ad3057\") " pod="openstack/ovn-controller-6sqdm-config-gzk6r" Oct 03 13:12:16 crc kubenswrapper[4962]: I1003 13:12:16.836534 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6797210a-15f7-4161-8af1-d66010ad3057-var-run-ovn\") pod \"ovn-controller-6sqdm-config-gzk6r\" (UID: \"6797210a-15f7-4161-8af1-d66010ad3057\") " pod="openstack/ovn-controller-6sqdm-config-gzk6r" Oct 03 13:12:16 crc kubenswrapper[4962]: I1003 13:12:16.836754 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6797210a-15f7-4161-8af1-d66010ad3057-var-run\") pod \"ovn-controller-6sqdm-config-gzk6r\" (UID: \"6797210a-15f7-4161-8af1-d66010ad3057\") " 
pod="openstack/ovn-controller-6sqdm-config-gzk6r" Oct 03 13:12:16 crc kubenswrapper[4962]: I1003 13:12:16.838864 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6797210a-15f7-4161-8af1-d66010ad3057-additional-scripts\") pod \"ovn-controller-6sqdm-config-gzk6r\" (UID: \"6797210a-15f7-4161-8af1-d66010ad3057\") " pod="openstack/ovn-controller-6sqdm-config-gzk6r" Oct 03 13:12:16 crc kubenswrapper[4962]: I1003 13:12:16.840891 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6797210a-15f7-4161-8af1-d66010ad3057-scripts\") pod \"ovn-controller-6sqdm-config-gzk6r\" (UID: \"6797210a-15f7-4161-8af1-d66010ad3057\") " pod="openstack/ovn-controller-6sqdm-config-gzk6r" Oct 03 13:12:16 crc kubenswrapper[4962]: I1003 13:12:16.858538 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ndp7\" (UniqueName: \"kubernetes.io/projected/6797210a-15f7-4161-8af1-d66010ad3057-kube-api-access-9ndp7\") pod \"ovn-controller-6sqdm-config-gzk6r\" (UID: \"6797210a-15f7-4161-8af1-d66010ad3057\") " pod="openstack/ovn-controller-6sqdm-config-gzk6r" Oct 03 13:12:17 crc kubenswrapper[4962]: I1003 13:12:17.012926 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6sqdm-config-gzk6r" Oct 03 13:12:17 crc kubenswrapper[4962]: I1003 13:12:17.579618 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-2f72-account-create-zwf4b"] Oct 03 13:12:17 crc kubenswrapper[4962]: W1003 13:12:17.590943 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f4639da_c3a1_47c8_baf4_155f6d7fdc5c.slice/crio-bbf8d0fcb51912ca0760dd67c2a56fb971a0d9f83a9ecc5d9b5bbbf943ecd73b WatchSource:0}: Error finding container bbf8d0fcb51912ca0760dd67c2a56fb971a0d9f83a9ecc5d9b5bbbf943ecd73b: Status 404 returned error can't find the container with id bbf8d0fcb51912ca0760dd67c2a56fb971a0d9f83a9ecc5d9b5bbbf943ecd73b Oct 03 13:12:17 crc kubenswrapper[4962]: I1003 13:12:17.831813 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6sqdm-config-gzk6r"] Oct 03 13:12:17 crc kubenswrapper[4962]: W1003 13:12:17.835216 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6797210a_15f7_4161_8af1_d66010ad3057.slice/crio-5ad4445b77c2aa1a050430acbf66d1e0abb40f740c5e3f8d5b0f5147af828d63 WatchSource:0}: Error finding container 5ad4445b77c2aa1a050430acbf66d1e0abb40f740c5e3f8d5b0f5147af828d63: Status 404 returned error can't find the container with id 5ad4445b77c2aa1a050430acbf66d1e0abb40f740c5e3f8d5b0f5147af828d63 Oct 03 13:12:18 crc kubenswrapper[4962]: I1003 13:12:18.018577 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="862ad9df-af58-4304-9ad5-7faba334e2d9" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.97:5671: connect: connection refused" Oct 03 13:12:18 crc kubenswrapper[4962]: I1003 13:12:18.293658 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="221bdd26-0fec-49e5-86ec-c2aefe7a5902" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.98:5671: connect: connection refused" Oct 03 13:12:18 crc kubenswrapper[4962]: I1003 13:12:18.477924 4962 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6sqdm-config-gzk6r" event={"ID":"6797210a-15f7-4161-8af1-d66010ad3057","Type":"ContainerStarted","Data":"0eaaa81aa9272a755f71e6e8345983499f134718f443247c4ab7da637b161dee"} Oct 03 13:12:18 crc kubenswrapper[4962]: I1003 13:12:18.477969 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6sqdm-config-gzk6r" event={"ID":"6797210a-15f7-4161-8af1-d66010ad3057","Type":"ContainerStarted","Data":"5ad4445b77c2aa1a050430acbf66d1e0abb40f740c5e3f8d5b0f5147af828d63"} Oct 03 13:12:18 crc kubenswrapper[4962]: I1003 13:12:18.481895 4962 generic.go:334] "Generic (PLEG): container finished" podID="1f4639da-c3a1-47c8-baf4-155f6d7fdc5c" containerID="153e678f1fc5f369dcef052e9b01146d16065e31e4717a4ecff24fb735e27ee3" exitCode=0 Oct 03 13:12:18 crc kubenswrapper[4962]: I1003 13:12:18.481976 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2f72-account-create-zwf4b" event={"ID":"1f4639da-c3a1-47c8-baf4-155f6d7fdc5c","Type":"ContainerDied","Data":"153e678f1fc5f369dcef052e9b01146d16065e31e4717a4ecff24fb735e27ee3"} Oct 03 13:12:18 crc kubenswrapper[4962]: I1003 13:12:18.482008 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2f72-account-create-zwf4b" event={"ID":"1f4639da-c3a1-47c8-baf4-155f6d7fdc5c","Type":"ContainerStarted","Data":"bbf8d0fcb51912ca0760dd67c2a56fb971a0d9f83a9ecc5d9b5bbbf943ecd73b"} Oct 03 13:12:18 crc kubenswrapper[4962]: I1003 13:12:18.487985 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b4b582ce-b618-4911-b554-f5cae9bcee91","Type":"ContainerStarted","Data":"255dd1c4cb38e6b82f47f9c570da57cc07f7f5e8c11c54bb9966d8c730771ef6"} Oct 03 13:12:18 crc kubenswrapper[4962]: I1003 13:12:18.488013 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b4b582ce-b618-4911-b554-f5cae9bcee91","Type":"ContainerStarted","Data":"054512bf0c273e329a55f18a262ffbcb7dd5abaded475341723d2b4dc5e849fb"} Oct 03 13:12:18 crc kubenswrapper[4962]: I1003 13:12:18.488023 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b4b582ce-b618-4911-b554-f5cae9bcee91","Type":"ContainerStarted","Data":"7e76ff2eb3cf5160a1fdce8ab7db2a70edda0c5fb436d79cb130e11be846580e"} Oct 03 13:12:18 crc kubenswrapper[4962]: I1003 13:12:18.488031 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b4b582ce-b618-4911-b554-f5cae9bcee91","Type":"ContainerStarted","Data":"f0c27459819cd1d481d672fbdd91f735b40d36cc170361880628b5b806924c13"} Oct 03 13:12:18 crc kubenswrapper[4962]: I1003 13:12:18.488039 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b4b582ce-b618-4911-b554-f5cae9bcee91","Type":"ContainerStarted","Data":"a8806f325247419ebf9ee453e77f3493ec2be61562010341eec779899b644330"} Oct 03 13:12:18 crc kubenswrapper[4962]: I1003 13:12:18.525710 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-6sqdm-config-gzk6r" podStartSLOduration=2.525692285 podStartE2EDuration="2.525692285s" podCreationTimestamp="2025-10-03 13:12:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:12:18.507892446 +0000 UTC m=+1346.911790281" watchObservedRunningTime="2025-10-03 13:12:18.525692285 +0000 UTC m=+1346.929590120" Oct 03 13:12:19 crc 
kubenswrapper[4962]: I1003 13:12:19.500933 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b4b582ce-b618-4911-b554-f5cae9bcee91","Type":"ContainerStarted","Data":"972fd0b549604163530a4df17ba0265931587abd268311d752380aa374952bb0"} Oct 03 13:12:19 crc kubenswrapper[4962]: I1003 13:12:19.501294 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b4b582ce-b618-4911-b554-f5cae9bcee91","Type":"ContainerStarted","Data":"06a77f3c0c79be2df9a379f7d27bcc5d75db28ce32e20e774dd964566de558be"} Oct 03 13:12:19 crc kubenswrapper[4962]: I1003 13:12:19.503905 4962 generic.go:334] "Generic (PLEG): container finished" podID="6797210a-15f7-4161-8af1-d66010ad3057" containerID="0eaaa81aa9272a755f71e6e8345983499f134718f443247c4ab7da637b161dee" exitCode=0 Oct 03 13:12:19 crc kubenswrapper[4962]: I1003 13:12:19.503975 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6sqdm-config-gzk6r" event={"ID":"6797210a-15f7-4161-8af1-d66010ad3057","Type":"ContainerDied","Data":"0eaaa81aa9272a755f71e6e8345983499f134718f443247c4ab7da637b161dee"} Oct 03 13:12:19 crc kubenswrapper[4962]: I1003 13:12:19.545658 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=21.02171262 podStartE2EDuration="27.545620112s" podCreationTimestamp="2025-10-03 13:11:52 +0000 UTC" firstStartedPulling="2025-10-03 13:12:10.797255876 +0000 UTC m=+1339.201153711" lastFinishedPulling="2025-10-03 13:12:17.321163368 +0000 UTC m=+1345.725061203" observedRunningTime="2025-10-03 13:12:19.53774274 +0000 UTC m=+1347.941640605" watchObservedRunningTime="2025-10-03 13:12:19.545620112 +0000 UTC m=+1347.949517947" Oct 03 13:12:19 crc kubenswrapper[4962]: I1003 13:12:19.835968 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-t8t49"] Oct 03 13:12:19 crc kubenswrapper[4962]: I1003 13:12:19.837552 4962 util.go:30] "No sandbox for pod can be found. 
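The "Observed pod startup duration" entry for openstack/swift-storage-0 above carries enough timestamps to reconstruct both durations: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration additionally excludes the image-pull window (lastFinishedPulling minus firstStartedPulling). A sketch that reproduces the logged values; the formula is inferred from the numbers themselves, not taken from kubelet source:

```go
// Recomputes swift-storage-0's logged startup durations from the timestamps
// in the entry above.
package main

import (
	"fmt"
	"time"
)

func main() {
	parse := func(s string) time.Time {
		t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
		if err != nil {
			panic(err)
		}
		return t
	}
	created := parse("2025-10-03 13:11:52 +0000 UTC")
	firstPull := parse("2025-10-03 13:12:10.797255876 +0000 UTC")
	lastPull := parse("2025-10-03 13:12:17.321163368 +0000 UTC")
	observed := parse("2025-10-03 13:12:19.545620112 +0000 UTC")

	e2e := observed.Sub(created)         // 27.545620112s, the logged podStartE2EDuration
	slo := e2e - lastPull.Sub(firstPull) // 21.02171262s, the logged podStartSLOduration
	fmt.Println(e2e, slo)
}
```

The pull window is 6.523907492s, and 27.545620112s - 6.523907492s = 21.02171262s, matching the entry exactly.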
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-t8t49" Oct 03 13:12:19 crc kubenswrapper[4962]: I1003 13:12:19.838967 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 03 13:12:19 crc kubenswrapper[4962]: I1003 13:12:19.849515 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-t8t49"] Oct 03 13:12:19 crc kubenswrapper[4962]: I1003 13:12:19.890200 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/493335aa-ea8d-4197-8c8b-af186d99f4aa-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-t8t49\" (UID: \"493335aa-ea8d-4197-8c8b-af186d99f4aa\") " pod="openstack/dnsmasq-dns-77585f5f8c-t8t49" Oct 03 13:12:19 crc kubenswrapper[4962]: I1003 13:12:19.890242 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/493335aa-ea8d-4197-8c8b-af186d99f4aa-config\") pod \"dnsmasq-dns-77585f5f8c-t8t49\" (UID: \"493335aa-ea8d-4197-8c8b-af186d99f4aa\") " pod="openstack/dnsmasq-dns-77585f5f8c-t8t49" Oct 03 13:12:19 crc kubenswrapper[4962]: I1003 13:12:19.890261 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/493335aa-ea8d-4197-8c8b-af186d99f4aa-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-t8t49\" (UID: \"493335aa-ea8d-4197-8c8b-af186d99f4aa\") " pod="openstack/dnsmasq-dns-77585f5f8c-t8t49" Oct 03 13:12:19 crc kubenswrapper[4962]: I1003 13:12:19.890298 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/493335aa-ea8d-4197-8c8b-af186d99f4aa-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-t8t49\" (UID: \"493335aa-ea8d-4197-8c8b-af186d99f4aa\") " pod="openstack/dnsmasq-dns-77585f5f8c-t8t49" Oct 03 13:12:19 crc kubenswrapper[4962]: I1003 13:12:19.890393 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2p9w\" (UniqueName: \"kubernetes.io/projected/493335aa-ea8d-4197-8c8b-af186d99f4aa-kube-api-access-d2p9w\") pod \"dnsmasq-dns-77585f5f8c-t8t49\" (UID: \"493335aa-ea8d-4197-8c8b-af186d99f4aa\") " pod="openstack/dnsmasq-dns-77585f5f8c-t8t49" Oct 03 13:12:19 crc kubenswrapper[4962]: I1003 13:12:19.890435 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/493335aa-ea8d-4197-8c8b-af186d99f4aa-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-t8t49\" (UID: \"493335aa-ea8d-4197-8c8b-af186d99f4aa\") " pod="openstack/dnsmasq-dns-77585f5f8c-t8t49" Oct 03 13:12:19 crc kubenswrapper[4962]: I1003 13:12:19.892831 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-2f72-account-create-zwf4b" Oct 03 13:12:19 crc kubenswrapper[4962]: I1003 13:12:19.991945 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgxzh\" (UniqueName: \"kubernetes.io/projected/1f4639da-c3a1-47c8-baf4-155f6d7fdc5c-kube-api-access-kgxzh\") pod \"1f4639da-c3a1-47c8-baf4-155f6d7fdc5c\" (UID: \"1f4639da-c3a1-47c8-baf4-155f6d7fdc5c\") " Oct 03 13:12:19 crc kubenswrapper[4962]: I1003 13:12:19.992395 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/493335aa-ea8d-4197-8c8b-af186d99f4aa-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-t8t49\" (UID: \"493335aa-ea8d-4197-8c8b-af186d99f4aa\") " pod="openstack/dnsmasq-dns-77585f5f8c-t8t49" Oct 03 13:12:19 crc kubenswrapper[4962]: I1003 13:12:19.993255 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/493335aa-ea8d-4197-8c8b-af186d99f4aa-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-t8t49\" (UID: \"493335aa-ea8d-4197-8c8b-af186d99f4aa\") " pod="openstack/dnsmasq-dns-77585f5f8c-t8t49" Oct 03 13:12:19 crc kubenswrapper[4962]: I1003 13:12:19.992426 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/493335aa-ea8d-4197-8c8b-af186d99f4aa-config\") pod \"dnsmasq-dns-77585f5f8c-t8t49\" (UID: \"493335aa-ea8d-4197-8c8b-af186d99f4aa\") " pod="openstack/dnsmasq-dns-77585f5f8c-t8t49" Oct 03 13:12:19 crc kubenswrapper[4962]: I1003 13:12:19.993341 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/493335aa-ea8d-4197-8c8b-af186d99f4aa-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-t8t49\" (UID: \"493335aa-ea8d-4197-8c8b-af186d99f4aa\") " pod="openstack/dnsmasq-dns-77585f5f8c-t8t49" Oct 03 13:12:19 crc kubenswrapper[4962]: I1003 13:12:19.993370 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/493335aa-ea8d-4197-8c8b-af186d99f4aa-config\") pod \"dnsmasq-dns-77585f5f8c-t8t49\" (UID: \"493335aa-ea8d-4197-8c8b-af186d99f4aa\") " pod="openstack/dnsmasq-dns-77585f5f8c-t8t49" Oct 03 13:12:19 crc kubenswrapper[4962]: I1003 13:12:19.993457 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/493335aa-ea8d-4197-8c8b-af186d99f4aa-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-t8t49\" (UID: \"493335aa-ea8d-4197-8c8b-af186d99f4aa\") " pod="openstack/dnsmasq-dns-77585f5f8c-t8t49" Oct 03 13:12:19 crc kubenswrapper[4962]: I1003 13:12:19.994255 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/493335aa-ea8d-4197-8c8b-af186d99f4aa-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-t8t49\" (UID: \"493335aa-ea8d-4197-8c8b-af186d99f4aa\") " pod="openstack/dnsmasq-dns-77585f5f8c-t8t49" Oct 03 13:12:19 crc kubenswrapper[4962]: I1003 13:12:19.994194 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/493335aa-ea8d-4197-8c8b-af186d99f4aa-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-t8t49\" (UID: \"493335aa-ea8d-4197-8c8b-af186d99f4aa\") " pod="openstack/dnsmasq-dns-77585f5f8c-t8t49" Oct 03 13:12:19 crc kubenswrapper[4962]: I1003 13:12:19.994315 4962 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2p9w\" (UniqueName: \"kubernetes.io/projected/493335aa-ea8d-4197-8c8b-af186d99f4aa-kube-api-access-d2p9w\") pod \"dnsmasq-dns-77585f5f8c-t8t49\" (UID: \"493335aa-ea8d-4197-8c8b-af186d99f4aa\") " pod="openstack/dnsmasq-dns-77585f5f8c-t8t49" Oct 03 13:12:19 crc kubenswrapper[4962]: I1003 13:12:19.994408 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/493335aa-ea8d-4197-8c8b-af186d99f4aa-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-t8t49\" (UID: \"493335aa-ea8d-4197-8c8b-af186d99f4aa\") " pod="openstack/dnsmasq-dns-77585f5f8c-t8t49" Oct 03 13:12:19 crc kubenswrapper[4962]: I1003 13:12:19.995092 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/493335aa-ea8d-4197-8c8b-af186d99f4aa-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-t8t49\" (UID: \"493335aa-ea8d-4197-8c8b-af186d99f4aa\") " pod="openstack/dnsmasq-dns-77585f5f8c-t8t49" Oct 03 13:12:19 crc kubenswrapper[4962]: I1003 13:12:19.999226 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f4639da-c3a1-47c8-baf4-155f6d7fdc5c-kube-api-access-kgxzh" (OuterVolumeSpecName: "kube-api-access-kgxzh") pod "1f4639da-c3a1-47c8-baf4-155f6d7fdc5c" (UID: "1f4639da-c3a1-47c8-baf4-155f6d7fdc5c"). InnerVolumeSpecName "kube-api-access-kgxzh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:12:20 crc kubenswrapper[4962]: I1003 13:12:20.014155 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2p9w\" (UniqueName: \"kubernetes.io/projected/493335aa-ea8d-4197-8c8b-af186d99f4aa-kube-api-access-d2p9w\") pod \"dnsmasq-dns-77585f5f8c-t8t49\" (UID: \"493335aa-ea8d-4197-8c8b-af186d99f4aa\") " pod="openstack/dnsmasq-dns-77585f5f8c-t8t49" Oct 03 13:12:20 crc kubenswrapper[4962]: I1003 13:12:20.095505 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgxzh\" (UniqueName: \"kubernetes.io/projected/1f4639da-c3a1-47c8-baf4-155f6d7fdc5c-kube-api-access-kgxzh\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:20 crc kubenswrapper[4962]: I1003 13:12:20.204425 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-t8t49" Oct 03 13:12:20 crc kubenswrapper[4962]: I1003 13:12:20.513693 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2f72-account-create-zwf4b" event={"ID":"1f4639da-c3a1-47c8-baf4-155f6d7fdc5c","Type":"ContainerDied","Data":"bbf8d0fcb51912ca0760dd67c2a56fb971a0d9f83a9ecc5d9b5bbbf943ecd73b"} Oct 03 13:12:20 crc kubenswrapper[4962]: I1003 13:12:20.513989 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bbf8d0fcb51912ca0760dd67c2a56fb971a0d9f83a9ecc5d9b5bbbf943ecd73b" Oct 03 13:12:20 crc kubenswrapper[4962]: I1003 13:12:20.515604 4962 util.go:48] "No ready sandbox for pod can be found. 
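The volume entries interleaved through this window follow one fixed ordering per direction, visible in the source locations the kubelet logs: mounting (the dnsmasq-dns-77585f5f8c-t8t49 volumes) runs VerifyControllerAttachedVolume, then MountVolume started, then MountVolume.SetUp succeeded; unmounting (the finished glance-2f72-account-create-zwf4b pod) runs UnmountVolume started, then UnmountVolume.TearDown succeeded, then Volume detached. A compact summary of those states as they appear in the log (illustrative only, not kubelet's actual types):

```go
// Per-volume states implied by the entries above, in the order the kubelet
// logs them; the source locations in the comments match the log.
package main

import "fmt"

type volumeState int

const (
	attachedVerified  volumeState = iota // "VerifyControllerAttachedVolume started" (reconciler_common.go:245)
	mountStarted                         // "MountVolume started"                    (reconciler_common.go:218)
	setUpSucceeded                       // "MountVolume.SetUp succeeded"            (operation_generator.go:637)
	unmountStarted                       // "UnmountVolume started"                  (reconciler_common.go:159)
	tearDownSucceeded                    // "UnmountVolume.TearDown succeeded"       (operation_generator.go:803)
	detached                             // "Volume detached"                        (reconciler_common.go:293)
)

func main() {
	for s := attachedVerified; s <= detached; s++ {
		fmt.Println("state", int(s))
	}
}
```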
Need to start a new one" pod="openstack/glance-2f72-account-create-zwf4b" Oct 03 13:12:20 crc kubenswrapper[4962]: I1003 13:12:20.630329 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-t8t49"] Oct 03 13:12:20 crc kubenswrapper[4962]: W1003 13:12:20.653237 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod493335aa_ea8d_4197_8c8b_af186d99f4aa.slice/crio-cff96cacf42345f16b0ae68dc79c84fef7445cba2d2d43c3cc83a5161837f962 WatchSource:0}: Error finding container cff96cacf42345f16b0ae68dc79c84fef7445cba2d2d43c3cc83a5161837f962: Status 404 returned error can't find the container with id cff96cacf42345f16b0ae68dc79c84fef7445cba2d2d43c3cc83a5161837f962 Oct 03 13:12:20 crc kubenswrapper[4962]: I1003 13:12:20.808588 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6sqdm-config-gzk6r" Oct 03 13:12:20 crc kubenswrapper[4962]: I1003 13:12:20.910547 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ndp7\" (UniqueName: \"kubernetes.io/projected/6797210a-15f7-4161-8af1-d66010ad3057-kube-api-access-9ndp7\") pod \"6797210a-15f7-4161-8af1-d66010ad3057\" (UID: \"6797210a-15f7-4161-8af1-d66010ad3057\") " Oct 03 13:12:20 crc kubenswrapper[4962]: I1003 13:12:20.910612 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6797210a-15f7-4161-8af1-d66010ad3057-var-run\") pod \"6797210a-15f7-4161-8af1-d66010ad3057\" (UID: \"6797210a-15f7-4161-8af1-d66010ad3057\") " Oct 03 13:12:20 crc kubenswrapper[4962]: I1003 13:12:20.910697 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6797210a-15f7-4161-8af1-d66010ad3057-scripts\") pod \"6797210a-15f7-4161-8af1-d66010ad3057\" (UID: \"6797210a-15f7-4161-8af1-d66010ad3057\") " Oct 03 13:12:20 crc kubenswrapper[4962]: I1003 13:12:20.910727 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6797210a-15f7-4161-8af1-d66010ad3057-var-run-ovn\") pod \"6797210a-15f7-4161-8af1-d66010ad3057\" (UID: \"6797210a-15f7-4161-8af1-d66010ad3057\") " Oct 03 13:12:20 crc kubenswrapper[4962]: I1003 13:12:20.910805 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6797210a-15f7-4161-8af1-d66010ad3057-var-log-ovn\") pod \"6797210a-15f7-4161-8af1-d66010ad3057\" (UID: \"6797210a-15f7-4161-8af1-d66010ad3057\") " Oct 03 13:12:20 crc kubenswrapper[4962]: I1003 13:12:20.910840 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6797210a-15f7-4161-8af1-d66010ad3057-additional-scripts\") pod \"6797210a-15f7-4161-8af1-d66010ad3057\" (UID: \"6797210a-15f7-4161-8af1-d66010ad3057\") " Oct 03 13:12:20 crc kubenswrapper[4962]: I1003 13:12:20.911446 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6797210a-15f7-4161-8af1-d66010ad3057-var-run" (OuterVolumeSpecName: "var-run") pod "6797210a-15f7-4161-8af1-d66010ad3057" (UID: "6797210a-15f7-4161-8af1-d66010ad3057"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 13:12:20 crc kubenswrapper[4962]: I1003 13:12:20.911513 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6797210a-15f7-4161-8af1-d66010ad3057-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "6797210a-15f7-4161-8af1-d66010ad3057" (UID: "6797210a-15f7-4161-8af1-d66010ad3057"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 13:12:20 crc kubenswrapper[4962]: I1003 13:12:20.911513 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6797210a-15f7-4161-8af1-d66010ad3057-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "6797210a-15f7-4161-8af1-d66010ad3057" (UID: "6797210a-15f7-4161-8af1-d66010ad3057"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 13:12:20 crc kubenswrapper[4962]: I1003 13:12:20.911950 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6797210a-15f7-4161-8af1-d66010ad3057-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "6797210a-15f7-4161-8af1-d66010ad3057" (UID: "6797210a-15f7-4161-8af1-d66010ad3057"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:12:20 crc kubenswrapper[4962]: I1003 13:12:20.912395 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6797210a-15f7-4161-8af1-d66010ad3057-scripts" (OuterVolumeSpecName: "scripts") pod "6797210a-15f7-4161-8af1-d66010ad3057" (UID: "6797210a-15f7-4161-8af1-d66010ad3057"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:12:20 crc kubenswrapper[4962]: I1003 13:12:20.917616 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6797210a-15f7-4161-8af1-d66010ad3057-kube-api-access-9ndp7" (OuterVolumeSpecName: "kube-api-access-9ndp7") pod "6797210a-15f7-4161-8af1-d66010ad3057" (UID: "6797210a-15f7-4161-8af1-d66010ad3057"). InnerVolumeSpecName "kube-api-access-9ndp7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:12:20 crc kubenswrapper[4962]: I1003 13:12:20.927176 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-6sqdm-config-gzk6r"] Oct 03 13:12:20 crc kubenswrapper[4962]: I1003 13:12:20.939715 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-6sqdm-config-gzk6r"] Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.012593 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ndp7\" (UniqueName: \"kubernetes.io/projected/6797210a-15f7-4161-8af1-d66010ad3057-kube-api-access-9ndp7\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.012624 4962 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6797210a-15f7-4161-8af1-d66010ad3057-var-run\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.012650 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6797210a-15f7-4161-8af1-d66010ad3057-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.012660 4962 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6797210a-15f7-4161-8af1-d66010ad3057-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.012672 4962 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6797210a-15f7-4161-8af1-d66010ad3057-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.012681 4962 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6797210a-15f7-4161-8af1-d66010ad3057-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.030107 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-6sqdm-config-8xcfx"] Oct 03 13:12:21 crc kubenswrapper[4962]: E1003 13:12:21.030500 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f4639da-c3a1-47c8-baf4-155f6d7fdc5c" containerName="mariadb-account-create" Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.030519 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f4639da-c3a1-47c8-baf4-155f6d7fdc5c" containerName="mariadb-account-create" Oct 03 13:12:21 crc kubenswrapper[4962]: E1003 13:12:21.030557 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6797210a-15f7-4161-8af1-d66010ad3057" containerName="ovn-config" Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.030565 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="6797210a-15f7-4161-8af1-d66010ad3057" containerName="ovn-config" Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.030777 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="6797210a-15f7-4161-8af1-d66010ad3057" containerName="ovn-config" Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.030815 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f4639da-c3a1-47c8-baf4-155f6d7fdc5c" containerName="mariadb-account-create" Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.031466 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-6sqdm-config-8xcfx" Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.044778 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6sqdm-config-8xcfx"] Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.113562 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpr6m\" (UniqueName: \"kubernetes.io/projected/1cc091e2-012f-47bf-b8a7-6852c834f338-kube-api-access-lpr6m\") pod \"ovn-controller-6sqdm-config-8xcfx\" (UID: \"1cc091e2-012f-47bf-b8a7-6852c834f338\") " pod="openstack/ovn-controller-6sqdm-config-8xcfx" Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.113813 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1cc091e2-012f-47bf-b8a7-6852c834f338-scripts\") pod \"ovn-controller-6sqdm-config-8xcfx\" (UID: \"1cc091e2-012f-47bf-b8a7-6852c834f338\") " pod="openstack/ovn-controller-6sqdm-config-8xcfx" Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.113852 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1cc091e2-012f-47bf-b8a7-6852c834f338-var-run\") pod \"ovn-controller-6sqdm-config-8xcfx\" (UID: \"1cc091e2-012f-47bf-b8a7-6852c834f338\") " pod="openstack/ovn-controller-6sqdm-config-8xcfx" Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.113883 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1cc091e2-012f-47bf-b8a7-6852c834f338-var-run-ovn\") pod \"ovn-controller-6sqdm-config-8xcfx\" (UID: \"1cc091e2-012f-47bf-b8a7-6852c834f338\") " pod="openstack/ovn-controller-6sqdm-config-8xcfx" Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.113933 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1cc091e2-012f-47bf-b8a7-6852c834f338-additional-scripts\") pod \"ovn-controller-6sqdm-config-8xcfx\" (UID: \"1cc091e2-012f-47bf-b8a7-6852c834f338\") " pod="openstack/ovn-controller-6sqdm-config-8xcfx" Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.113964 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1cc091e2-012f-47bf-b8a7-6852c834f338-var-log-ovn\") pod \"ovn-controller-6sqdm-config-8xcfx\" (UID: \"1cc091e2-012f-47bf-b8a7-6852c834f338\") " pod="openstack/ovn-controller-6sqdm-config-8xcfx" Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.215317 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1cc091e2-012f-47bf-b8a7-6852c834f338-var-run\") pod \"ovn-controller-6sqdm-config-8xcfx\" (UID: \"1cc091e2-012f-47bf-b8a7-6852c834f338\") " pod="openstack/ovn-controller-6sqdm-config-8xcfx" Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.215363 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1cc091e2-012f-47bf-b8a7-6852c834f338-var-run-ovn\") pod \"ovn-controller-6sqdm-config-8xcfx\" (UID: \"1cc091e2-012f-47bf-b8a7-6852c834f338\") " pod="openstack/ovn-controller-6sqdm-config-8xcfx" Oct 03 13:12:21 crc kubenswrapper[4962]: 
I1003 13:12:21.215391 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1cc091e2-012f-47bf-b8a7-6852c834f338-additional-scripts\") pod \"ovn-controller-6sqdm-config-8xcfx\" (UID: \"1cc091e2-012f-47bf-b8a7-6852c834f338\") " pod="openstack/ovn-controller-6sqdm-config-8xcfx" Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.215411 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1cc091e2-012f-47bf-b8a7-6852c834f338-var-log-ovn\") pod \"ovn-controller-6sqdm-config-8xcfx\" (UID: \"1cc091e2-012f-47bf-b8a7-6852c834f338\") " pod="openstack/ovn-controller-6sqdm-config-8xcfx" Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.215457 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpr6m\" (UniqueName: \"kubernetes.io/projected/1cc091e2-012f-47bf-b8a7-6852c834f338-kube-api-access-lpr6m\") pod \"ovn-controller-6sqdm-config-8xcfx\" (UID: \"1cc091e2-012f-47bf-b8a7-6852c834f338\") " pod="openstack/ovn-controller-6sqdm-config-8xcfx" Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.215567 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1cc091e2-012f-47bf-b8a7-6852c834f338-scripts\") pod \"ovn-controller-6sqdm-config-8xcfx\" (UID: \"1cc091e2-012f-47bf-b8a7-6852c834f338\") " pod="openstack/ovn-controller-6sqdm-config-8xcfx" Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.215670 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1cc091e2-012f-47bf-b8a7-6852c834f338-var-run-ovn\") pod \"ovn-controller-6sqdm-config-8xcfx\" (UID: \"1cc091e2-012f-47bf-b8a7-6852c834f338\") " pod="openstack/ovn-controller-6sqdm-config-8xcfx" Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.215754 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1cc091e2-012f-47bf-b8a7-6852c834f338-var-log-ovn\") pod \"ovn-controller-6sqdm-config-8xcfx\" (UID: \"1cc091e2-012f-47bf-b8a7-6852c834f338\") " pod="openstack/ovn-controller-6sqdm-config-8xcfx" Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.215754 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1cc091e2-012f-47bf-b8a7-6852c834f338-var-run\") pod \"ovn-controller-6sqdm-config-8xcfx\" (UID: \"1cc091e2-012f-47bf-b8a7-6852c834f338\") " pod="openstack/ovn-controller-6sqdm-config-8xcfx" Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.216209 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1cc091e2-012f-47bf-b8a7-6852c834f338-additional-scripts\") pod \"ovn-controller-6sqdm-config-8xcfx\" (UID: \"1cc091e2-012f-47bf-b8a7-6852c834f338\") " pod="openstack/ovn-controller-6sqdm-config-8xcfx" Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.217223 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1cc091e2-012f-47bf-b8a7-6852c834f338-scripts\") pod \"ovn-controller-6sqdm-config-8xcfx\" (UID: \"1cc091e2-012f-47bf-b8a7-6852c834f338\") " pod="openstack/ovn-controller-6sqdm-config-8xcfx" Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.232421 4962 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpr6m\" (UniqueName: \"kubernetes.io/projected/1cc091e2-012f-47bf-b8a7-6852c834f338-kube-api-access-lpr6m\") pod \"ovn-controller-6sqdm-config-8xcfx\" (UID: \"1cc091e2-012f-47bf-b8a7-6852c834f338\") " pod="openstack/ovn-controller-6sqdm-config-8xcfx" Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.356576 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-6sqdm" Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.376930 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6sqdm-config-8xcfx" Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.528463 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ad4445b77c2aa1a050430acbf66d1e0abb40f740c5e3f8d5b0f5147af828d63" Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.529041 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6sqdm-config-gzk6r" Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.541363 4962 generic.go:334] "Generic (PLEG): container finished" podID="493335aa-ea8d-4197-8c8b-af186d99f4aa" containerID="a39cb5e7e2e55905be6936881fda8317e48fd44199a916066642512d2832a221" exitCode=0 Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.541593 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-t8t49" event={"ID":"493335aa-ea8d-4197-8c8b-af186d99f4aa","Type":"ContainerDied","Data":"a39cb5e7e2e55905be6936881fda8317e48fd44199a916066642512d2832a221"} Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.541624 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-t8t49" event={"ID":"493335aa-ea8d-4197-8c8b-af186d99f4aa","Type":"ContainerStarted","Data":"cff96cacf42345f16b0ae68dc79c84fef7445cba2d2d43c3cc83a5161837f962"} Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.681780 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-wrqgr"] Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.683297 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-wrqgr" Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.685768 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-s2g9d" Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.686053 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.689446 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-wrqgr"] Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.726069 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmlsh\" (UniqueName: \"kubernetes.io/projected/f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56-kube-api-access-wmlsh\") pod \"glance-db-sync-wrqgr\" (UID: \"f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56\") " pod="openstack/glance-db-sync-wrqgr" Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.726141 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56-db-sync-config-data\") pod \"glance-db-sync-wrqgr\" (UID: \"f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56\") " pod="openstack/glance-db-sync-wrqgr" Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.726236 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56-config-data\") pod \"glance-db-sync-wrqgr\" (UID: \"f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56\") " pod="openstack/glance-db-sync-wrqgr" Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.726268 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56-combined-ca-bundle\") pod \"glance-db-sync-wrqgr\" (UID: \"f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56\") " pod="openstack/glance-db-sync-wrqgr" Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.827527 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmlsh\" (UniqueName: \"kubernetes.io/projected/f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56-kube-api-access-wmlsh\") pod \"glance-db-sync-wrqgr\" (UID: \"f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56\") " pod="openstack/glance-db-sync-wrqgr" Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.827583 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56-db-sync-config-data\") pod \"glance-db-sync-wrqgr\" (UID: \"f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56\") " pod="openstack/glance-db-sync-wrqgr" Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.827662 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56-config-data\") pod \"glance-db-sync-wrqgr\" (UID: \"f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56\") " pod="openstack/glance-db-sync-wrqgr" Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.827686 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56-combined-ca-bundle\") pod 
\"glance-db-sync-wrqgr\" (UID: \"f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56\") " pod="openstack/glance-db-sync-wrqgr" Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.832970 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56-config-data\") pod \"glance-db-sync-wrqgr\" (UID: \"f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56\") " pod="openstack/glance-db-sync-wrqgr" Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.833074 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56-combined-ca-bundle\") pod \"glance-db-sync-wrqgr\" (UID: \"f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56\") " pod="openstack/glance-db-sync-wrqgr" Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.833657 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56-db-sync-config-data\") pod \"glance-db-sync-wrqgr\" (UID: \"f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56\") " pod="openstack/glance-db-sync-wrqgr" Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.846415 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmlsh\" (UniqueName: \"kubernetes.io/projected/f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56-kube-api-access-wmlsh\") pod \"glance-db-sync-wrqgr\" (UID: \"f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56\") " pod="openstack/glance-db-sync-wrqgr" Oct 03 13:12:21 crc kubenswrapper[4962]: I1003 13:12:21.860207 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6sqdm-config-8xcfx"] Oct 03 13:12:21 crc kubenswrapper[4962]: W1003 13:12:21.863440 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1cc091e2_012f_47bf_b8a7_6852c834f338.slice/crio-884e78cb3a747bd35820bb5ad4bde6017af6c70afe9bfb83b3ad48a5e5961ec8 WatchSource:0}: Error finding container 884e78cb3a747bd35820bb5ad4bde6017af6c70afe9bfb83b3ad48a5e5961ec8: Status 404 returned error can't find the container with id 884e78cb3a747bd35820bb5ad4bde6017af6c70afe9bfb83b3ad48a5e5961ec8 Oct 03 13:12:22 crc kubenswrapper[4962]: I1003 13:12:22.013002 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-wrqgr" Oct 03 13:12:22 crc kubenswrapper[4962]: I1003 13:12:22.249376 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6797210a-15f7-4161-8af1-d66010ad3057" path="/var/lib/kubelet/pods/6797210a-15f7-4161-8af1-d66010ad3057/volumes" Oct 03 13:12:22 crc kubenswrapper[4962]: I1003 13:12:22.550590 4962 generic.go:334] "Generic (PLEG): container finished" podID="1cc091e2-012f-47bf-b8a7-6852c834f338" containerID="8642f3d9a3347e1307aee57c47fa850da1480a1bb21bb726d495551c0297fb08" exitCode=0 Oct 03 13:12:22 crc kubenswrapper[4962]: I1003 13:12:22.550722 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6sqdm-config-8xcfx" event={"ID":"1cc091e2-012f-47bf-b8a7-6852c834f338","Type":"ContainerDied","Data":"8642f3d9a3347e1307aee57c47fa850da1480a1bb21bb726d495551c0297fb08"} Oct 03 13:12:22 crc kubenswrapper[4962]: I1003 13:12:22.550952 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6sqdm-config-8xcfx" event={"ID":"1cc091e2-012f-47bf-b8a7-6852c834f338","Type":"ContainerStarted","Data":"884e78cb3a747bd35820bb5ad4bde6017af6c70afe9bfb83b3ad48a5e5961ec8"} Oct 03 13:12:22 crc kubenswrapper[4962]: I1003 13:12:22.552984 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-t8t49" event={"ID":"493335aa-ea8d-4197-8c8b-af186d99f4aa","Type":"ContainerStarted","Data":"4dca5aa90c57dbb61b1e0f0cfad418d1058fe0f35307c84e272b53f55c58252e"} Oct 03 13:12:22 crc kubenswrapper[4962]: I1003 13:12:22.553736 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77585f5f8c-t8t49" Oct 03 13:12:22 crc kubenswrapper[4962]: I1003 13:12:22.573575 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-wrqgr"] Oct 03 13:12:22 crc kubenswrapper[4962]: W1003 13:12:22.581506 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4e7b0e6_13e7_4614_b7ee_23c87a2f4d56.slice/crio-deb461a4f17a7f355c82858d1d7b043343d1b58137e6482bab4b080070c1fc31 WatchSource:0}: Error finding container deb461a4f17a7f355c82858d1d7b043343d1b58137e6482bab4b080070c1fc31: Status 404 returned error can't find the container with id deb461a4f17a7f355c82858d1d7b043343d1b58137e6482bab4b080070c1fc31 Oct 03 13:12:22 crc kubenswrapper[4962]: I1003 13:12:22.584123 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 13:12:22 crc kubenswrapper[4962]: I1003 13:12:22.609502 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77585f5f8c-t8t49" podStartSLOduration=3.609480177 podStartE2EDuration="3.609480177s" podCreationTimestamp="2025-10-03 13:12:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:12:22.594405311 +0000 UTC m=+1350.998303146" watchObservedRunningTime="2025-10-03 13:12:22.609480177 +0000 UTC m=+1351.013378012" Oct 03 13:12:23 crc kubenswrapper[4962]: I1003 13:12:23.562693 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wrqgr" event={"ID":"f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56","Type":"ContainerStarted","Data":"deb461a4f17a7f355c82858d1d7b043343d1b58137e6482bab4b080070c1fc31"} Oct 03 13:12:23 crc kubenswrapper[4962]: I1003 13:12:23.869050 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-6sqdm-config-8xcfx" Oct 03 13:12:23 crc kubenswrapper[4962]: I1003 13:12:23.967080 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1cc091e2-012f-47bf-b8a7-6852c834f338-var-run\") pod \"1cc091e2-012f-47bf-b8a7-6852c834f338\" (UID: \"1cc091e2-012f-47bf-b8a7-6852c834f338\") " Oct 03 13:12:23 crc kubenswrapper[4962]: I1003 13:12:23.967158 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1cc091e2-012f-47bf-b8a7-6852c834f338-additional-scripts\") pod \"1cc091e2-012f-47bf-b8a7-6852c834f338\" (UID: \"1cc091e2-012f-47bf-b8a7-6852c834f338\") " Oct 03 13:12:23 crc kubenswrapper[4962]: I1003 13:12:23.967229 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1cc091e2-012f-47bf-b8a7-6852c834f338-var-run-ovn\") pod \"1cc091e2-012f-47bf-b8a7-6852c834f338\" (UID: \"1cc091e2-012f-47bf-b8a7-6852c834f338\") " Oct 03 13:12:23 crc kubenswrapper[4962]: I1003 13:12:23.967236 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1cc091e2-012f-47bf-b8a7-6852c834f338-var-run" (OuterVolumeSpecName: "var-run") pod "1cc091e2-012f-47bf-b8a7-6852c834f338" (UID: "1cc091e2-012f-47bf-b8a7-6852c834f338"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 13:12:23 crc kubenswrapper[4962]: I1003 13:12:23.967255 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1cc091e2-012f-47bf-b8a7-6852c834f338-var-log-ovn\") pod \"1cc091e2-012f-47bf-b8a7-6852c834f338\" (UID: \"1cc091e2-012f-47bf-b8a7-6852c834f338\") " Oct 03 13:12:23 crc kubenswrapper[4962]: I1003 13:12:23.967296 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1cc091e2-012f-47bf-b8a7-6852c834f338-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "1cc091e2-012f-47bf-b8a7-6852c834f338" (UID: "1cc091e2-012f-47bf-b8a7-6852c834f338"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 13:12:23 crc kubenswrapper[4962]: I1003 13:12:23.967301 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1cc091e2-012f-47bf-b8a7-6852c834f338-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "1cc091e2-012f-47bf-b8a7-6852c834f338" (UID: "1cc091e2-012f-47bf-b8a7-6852c834f338"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 13:12:23 crc kubenswrapper[4962]: I1003 13:12:23.967379 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpr6m\" (UniqueName: \"kubernetes.io/projected/1cc091e2-012f-47bf-b8a7-6852c834f338-kube-api-access-lpr6m\") pod \"1cc091e2-012f-47bf-b8a7-6852c834f338\" (UID: \"1cc091e2-012f-47bf-b8a7-6852c834f338\") " Oct 03 13:12:23 crc kubenswrapper[4962]: I1003 13:12:23.967479 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1cc091e2-012f-47bf-b8a7-6852c834f338-scripts\") pod \"1cc091e2-012f-47bf-b8a7-6852c834f338\" (UID: \"1cc091e2-012f-47bf-b8a7-6852c834f338\") " Oct 03 13:12:23 crc kubenswrapper[4962]: I1003 13:12:23.967707 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cc091e2-012f-47bf-b8a7-6852c834f338-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "1cc091e2-012f-47bf-b8a7-6852c834f338" (UID: "1cc091e2-012f-47bf-b8a7-6852c834f338"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:12:23 crc kubenswrapper[4962]: I1003 13:12:23.968294 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cc091e2-012f-47bf-b8a7-6852c834f338-scripts" (OuterVolumeSpecName: "scripts") pod "1cc091e2-012f-47bf-b8a7-6852c834f338" (UID: "1cc091e2-012f-47bf-b8a7-6852c834f338"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:12:23 crc kubenswrapper[4962]: I1003 13:12:23.968300 4962 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1cc091e2-012f-47bf-b8a7-6852c834f338-var-run\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:23 crc kubenswrapper[4962]: I1003 13:12:23.968344 4962 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1cc091e2-012f-47bf-b8a7-6852c834f338-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:23 crc kubenswrapper[4962]: I1003 13:12:23.968356 4962 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1cc091e2-012f-47bf-b8a7-6852c834f338-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:23 crc kubenswrapper[4962]: I1003 13:12:23.968364 4962 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1cc091e2-012f-47bf-b8a7-6852c834f338-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:23 crc kubenswrapper[4962]: I1003 13:12:23.973230 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cc091e2-012f-47bf-b8a7-6852c834f338-kube-api-access-lpr6m" (OuterVolumeSpecName: "kube-api-access-lpr6m") pod "1cc091e2-012f-47bf-b8a7-6852c834f338" (UID: "1cc091e2-012f-47bf-b8a7-6852c834f338"). InnerVolumeSpecName "kube-api-access-lpr6m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:12:24 crc kubenswrapper[4962]: I1003 13:12:24.070017 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpr6m\" (UniqueName: \"kubernetes.io/projected/1cc091e2-012f-47bf-b8a7-6852c834f338-kube-api-access-lpr6m\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:24 crc kubenswrapper[4962]: I1003 13:12:24.070060 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1cc091e2-012f-47bf-b8a7-6852c834f338-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:24 crc kubenswrapper[4962]: I1003 13:12:24.572769 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6sqdm-config-8xcfx" event={"ID":"1cc091e2-012f-47bf-b8a7-6852c834f338","Type":"ContainerDied","Data":"884e78cb3a747bd35820bb5ad4bde6017af6c70afe9bfb83b3ad48a5e5961ec8"} Oct 03 13:12:24 crc kubenswrapper[4962]: I1003 13:12:24.572814 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6sqdm-config-8xcfx" Oct 03 13:12:24 crc kubenswrapper[4962]: I1003 13:12:24.572826 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="884e78cb3a747bd35820bb5ad4bde6017af6c70afe9bfb83b3ad48a5e5961ec8" Oct 03 13:12:24 crc kubenswrapper[4962]: I1003 13:12:24.953883 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-6sqdm-config-8xcfx"] Oct 03 13:12:24 crc kubenswrapper[4962]: I1003 13:12:24.962226 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-6sqdm-config-8xcfx"] Oct 03 13:12:26 crc kubenswrapper[4962]: I1003 13:12:26.236831 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cc091e2-012f-47bf-b8a7-6852c834f338" path="/var/lib/kubelet/pods/1cc091e2-012f-47bf-b8a7-6852c834f338/volumes" Oct 03 13:12:28 crc kubenswrapper[4962]: I1003 13:12:28.018834 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 03 13:12:28 crc kubenswrapper[4962]: I1003 13:12:28.294824 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 03 13:12:28 crc kubenswrapper[4962]: I1003 13:12:28.420503 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-b2fwt"] Oct 03 13:12:28 crc kubenswrapper[4962]: E1003 13:12:28.420937 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cc091e2-012f-47bf-b8a7-6852c834f338" containerName="ovn-config" Oct 03 13:12:28 crc kubenswrapper[4962]: I1003 13:12:28.420956 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cc091e2-012f-47bf-b8a7-6852c834f338" containerName="ovn-config" Oct 03 13:12:28 crc kubenswrapper[4962]: I1003 13:12:28.421177 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cc091e2-012f-47bf-b8a7-6852c834f338" containerName="ovn-config" Oct 03 13:12:28 crc kubenswrapper[4962]: I1003 13:12:28.421809 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-b2fwt" Oct 03 13:12:28 crc kubenswrapper[4962]: I1003 13:12:28.438473 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-b2fwt"] Oct 03 13:12:28 crc kubenswrapper[4962]: I1003 13:12:28.500471 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-q9c2h"] Oct 03 13:12:28 crc kubenswrapper[4962]: I1003 13:12:28.501496 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-q9c2h" Oct 03 13:12:28 crc kubenswrapper[4962]: I1003 13:12:28.515149 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-q9c2h"] Oct 03 13:12:28 crc kubenswrapper[4962]: I1003 13:12:28.584332 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llr8r\" (UniqueName: \"kubernetes.io/projected/68b7fa90-3c0c-4c9d-9709-fdd39699b685-kube-api-access-llr8r\") pod \"barbican-db-create-b2fwt\" (UID: \"68b7fa90-3c0c-4c9d-9709-fdd39699b685\") " pod="openstack/barbican-db-create-b2fwt" Oct 03 13:12:28 crc kubenswrapper[4962]: I1003 13:12:28.584376 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkc9n\" (UniqueName: \"kubernetes.io/projected/d8c59dc3-1749-4e12-aa01-09b6ed9934d8-kube-api-access-fkc9n\") pod \"cinder-db-create-q9c2h\" (UID: \"d8c59dc3-1749-4e12-aa01-09b6ed9934d8\") " pod="openstack/cinder-db-create-q9c2h" Oct 03 13:12:28 crc kubenswrapper[4962]: I1003 13:12:28.666846 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-xgdv7"] Oct 03 13:12:28 crc kubenswrapper[4962]: I1003 13:12:28.667843 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-xgdv7" Oct 03 13:12:28 crc kubenswrapper[4962]: I1003 13:12:28.671971 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 03 13:12:28 crc kubenswrapper[4962]: I1003 13:12:28.672653 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 03 13:12:28 crc kubenswrapper[4962]: I1003 13:12:28.673207 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rbx44" Oct 03 13:12:28 crc kubenswrapper[4962]: I1003 13:12:28.673389 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 03 13:12:28 crc kubenswrapper[4962]: I1003 13:12:28.685310 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llr8r\" (UniqueName: \"kubernetes.io/projected/68b7fa90-3c0c-4c9d-9709-fdd39699b685-kube-api-access-llr8r\") pod \"barbican-db-create-b2fwt\" (UID: \"68b7fa90-3c0c-4c9d-9709-fdd39699b685\") " pod="openstack/barbican-db-create-b2fwt" Oct 03 13:12:28 crc kubenswrapper[4962]: I1003 13:12:28.685349 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkc9n\" (UniqueName: \"kubernetes.io/projected/d8c59dc3-1749-4e12-aa01-09b6ed9934d8-kube-api-access-fkc9n\") pod \"cinder-db-create-q9c2h\" (UID: \"d8c59dc3-1749-4e12-aa01-09b6ed9934d8\") " pod="openstack/cinder-db-create-q9c2h" Oct 03 13:12:28 crc kubenswrapper[4962]: I1003 13:12:28.687580 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-xgdv7"] Oct 03 13:12:28 crc kubenswrapper[4962]: I1003 13:12:28.708543 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llr8r\" (UniqueName: \"kubernetes.io/projected/68b7fa90-3c0c-4c9d-9709-fdd39699b685-kube-api-access-llr8r\") pod \"barbican-db-create-b2fwt\" (UID: \"68b7fa90-3c0c-4c9d-9709-fdd39699b685\") " pod="openstack/barbican-db-create-b2fwt" Oct 03 13:12:28 crc kubenswrapper[4962]: I1003 13:12:28.710847 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkc9n\" (UniqueName: \"kubernetes.io/projected/d8c59dc3-1749-4e12-aa01-09b6ed9934d8-kube-api-access-fkc9n\") pod \"cinder-db-create-q9c2h\" (UID: \"d8c59dc3-1749-4e12-aa01-09b6ed9934d8\") " pod="openstack/cinder-db-create-q9c2h" Oct 03 13:12:28 crc kubenswrapper[4962]: I1003 13:12:28.744422 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-b2fwt" Oct 03 13:12:28 crc kubenswrapper[4962]: I1003 13:12:28.787579 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a79e40bc-548c-44b8-8457-c0b8195c6436-config-data\") pod \"keystone-db-sync-xgdv7\" (UID: \"a79e40bc-548c-44b8-8457-c0b8195c6436\") " pod="openstack/keystone-db-sync-xgdv7" Oct 03 13:12:28 crc kubenswrapper[4962]: I1003 13:12:28.787734 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlgfp\" (UniqueName: \"kubernetes.io/projected/a79e40bc-548c-44b8-8457-c0b8195c6436-kube-api-access-wlgfp\") pod \"keystone-db-sync-xgdv7\" (UID: \"a79e40bc-548c-44b8-8457-c0b8195c6436\") " pod="openstack/keystone-db-sync-xgdv7" Oct 03 13:12:28 crc kubenswrapper[4962]: I1003 13:12:28.787759 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a79e40bc-548c-44b8-8457-c0b8195c6436-combined-ca-bundle\") pod \"keystone-db-sync-xgdv7\" (UID: \"a79e40bc-548c-44b8-8457-c0b8195c6436\") " pod="openstack/keystone-db-sync-xgdv7" Oct 03 13:12:28 crc kubenswrapper[4962]: I1003 13:12:28.793526 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-mkw7j"] Oct 03 13:12:28 crc kubenswrapper[4962]: I1003 13:12:28.794718 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-mkw7j" Oct 03 13:12:28 crc kubenswrapper[4962]: I1003 13:12:28.823599 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-mkw7j"] Oct 03 13:12:28 crc kubenswrapper[4962]: I1003 13:12:28.830813 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-q9c2h" Oct 03 13:12:28 crc kubenswrapper[4962]: I1003 13:12:28.888787 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlgfp\" (UniqueName: \"kubernetes.io/projected/a79e40bc-548c-44b8-8457-c0b8195c6436-kube-api-access-wlgfp\") pod \"keystone-db-sync-xgdv7\" (UID: \"a79e40bc-548c-44b8-8457-c0b8195c6436\") " pod="openstack/keystone-db-sync-xgdv7" Oct 03 13:12:28 crc kubenswrapper[4962]: I1003 13:12:28.888832 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a79e40bc-548c-44b8-8457-c0b8195c6436-combined-ca-bundle\") pod \"keystone-db-sync-xgdv7\" (UID: \"a79e40bc-548c-44b8-8457-c0b8195c6436\") " pod="openstack/keystone-db-sync-xgdv7" Oct 03 13:12:28 crc kubenswrapper[4962]: I1003 13:12:28.888877 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8swl\" (UniqueName: \"kubernetes.io/projected/4e7ff309-1e7f-4954-a08b-a5cf72982735-kube-api-access-j8swl\") pod \"neutron-db-create-mkw7j\" (UID: \"4e7ff309-1e7f-4954-a08b-a5cf72982735\") " pod="openstack/neutron-db-create-mkw7j" Oct 03 13:12:28 crc kubenswrapper[4962]: I1003 13:12:28.888947 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a79e40bc-548c-44b8-8457-c0b8195c6436-config-data\") pod \"keystone-db-sync-xgdv7\" (UID: \"a79e40bc-548c-44b8-8457-c0b8195c6436\") " pod="openstack/keystone-db-sync-xgdv7" Oct 03 13:12:28 crc kubenswrapper[4962]: I1003 13:12:28.892965 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a79e40bc-548c-44b8-8457-c0b8195c6436-combined-ca-bundle\") pod \"keystone-db-sync-xgdv7\" (UID: \"a79e40bc-548c-44b8-8457-c0b8195c6436\") " pod="openstack/keystone-db-sync-xgdv7" Oct 03 13:12:28 crc kubenswrapper[4962]: I1003 13:12:28.892976 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a79e40bc-548c-44b8-8457-c0b8195c6436-config-data\") pod \"keystone-db-sync-xgdv7\" (UID: \"a79e40bc-548c-44b8-8457-c0b8195c6436\") " pod="openstack/keystone-db-sync-xgdv7" Oct 03 13:12:28 crc kubenswrapper[4962]: I1003 13:12:28.910477 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlgfp\" (UniqueName: \"kubernetes.io/projected/a79e40bc-548c-44b8-8457-c0b8195c6436-kube-api-access-wlgfp\") pod \"keystone-db-sync-xgdv7\" (UID: \"a79e40bc-548c-44b8-8457-c0b8195c6436\") " pod="openstack/keystone-db-sync-xgdv7" Oct 03 13:12:28 crc kubenswrapper[4962]: I1003 13:12:28.990832 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8swl\" (UniqueName: \"kubernetes.io/projected/4e7ff309-1e7f-4954-a08b-a5cf72982735-kube-api-access-j8swl\") pod \"neutron-db-create-mkw7j\" (UID: \"4e7ff309-1e7f-4954-a08b-a5cf72982735\") " pod="openstack/neutron-db-create-mkw7j" Oct 03 13:12:29 crc kubenswrapper[4962]: I1003 13:12:29.003756 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-xgdv7" Oct 03 13:12:29 crc kubenswrapper[4962]: I1003 13:12:29.036399 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8swl\" (UniqueName: \"kubernetes.io/projected/4e7ff309-1e7f-4954-a08b-a5cf72982735-kube-api-access-j8swl\") pod \"neutron-db-create-mkw7j\" (UID: \"4e7ff309-1e7f-4954-a08b-a5cf72982735\") " pod="openstack/neutron-db-create-mkw7j" Oct 03 13:12:29 crc kubenswrapper[4962]: I1003 13:12:29.178798 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-mkw7j" Oct 03 13:12:30 crc kubenswrapper[4962]: I1003 13:12:30.206149 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77585f5f8c-t8t49" Oct 03 13:12:30 crc kubenswrapper[4962]: I1003 13:12:30.274292 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-8rz95"] Oct 03 13:12:30 crc kubenswrapper[4962]: I1003 13:12:30.274583 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-8rz95" podUID="ed812fda-dbe5-48ff-bea4-3795a87d716f" containerName="dnsmasq-dns" containerID="cri-o://1386cbaa0e17306a368bb4a2806672e2c52aaea5f063190c3cad125a74ffa7fc" gracePeriod=10 Oct 03 13:12:30 crc kubenswrapper[4962]: I1003 13:12:30.625683 4962 generic.go:334] "Generic (PLEG): container finished" podID="ed812fda-dbe5-48ff-bea4-3795a87d716f" containerID="1386cbaa0e17306a368bb4a2806672e2c52aaea5f063190c3cad125a74ffa7fc" exitCode=0 Oct 03 13:12:30 crc kubenswrapper[4962]: I1003 13:12:30.625726 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-8rz95" event={"ID":"ed812fda-dbe5-48ff-bea4-3795a87d716f","Type":"ContainerDied","Data":"1386cbaa0e17306a368bb4a2806672e2c52aaea5f063190c3cad125a74ffa7fc"} Oct 03 13:12:33 crc kubenswrapper[4962]: I1003 13:12:33.247445 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-8rz95" podUID="ed812fda-dbe5-48ff-bea4-3795a87d716f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.111:5353: connect: connection refused" Oct 03 13:12:35 crc kubenswrapper[4962]: I1003 13:12:35.492970 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-8rz95" Oct 03 13:12:35 crc kubenswrapper[4962]: I1003 13:12:35.601473 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed812fda-dbe5-48ff-bea4-3795a87d716f-ovsdbserver-sb\") pod \"ed812fda-dbe5-48ff-bea4-3795a87d716f\" (UID: \"ed812fda-dbe5-48ff-bea4-3795a87d716f\") " Oct 03 13:12:35 crc kubenswrapper[4962]: I1003 13:12:35.601549 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbq6m\" (UniqueName: \"kubernetes.io/projected/ed812fda-dbe5-48ff-bea4-3795a87d716f-kube-api-access-kbq6m\") pod \"ed812fda-dbe5-48ff-bea4-3795a87d716f\" (UID: \"ed812fda-dbe5-48ff-bea4-3795a87d716f\") " Oct 03 13:12:35 crc kubenswrapper[4962]: I1003 13:12:35.601604 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed812fda-dbe5-48ff-bea4-3795a87d716f-ovsdbserver-nb\") pod \"ed812fda-dbe5-48ff-bea4-3795a87d716f\" (UID: \"ed812fda-dbe5-48ff-bea4-3795a87d716f\") " Oct 03 13:12:35 crc kubenswrapper[4962]: I1003 13:12:35.601705 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed812fda-dbe5-48ff-bea4-3795a87d716f-config\") pod \"ed812fda-dbe5-48ff-bea4-3795a87d716f\" (UID: \"ed812fda-dbe5-48ff-bea4-3795a87d716f\") " Oct 03 13:12:35 crc kubenswrapper[4962]: I1003 13:12:35.601770 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed812fda-dbe5-48ff-bea4-3795a87d716f-dns-svc\") pod \"ed812fda-dbe5-48ff-bea4-3795a87d716f\" (UID: \"ed812fda-dbe5-48ff-bea4-3795a87d716f\") " Oct 03 13:12:35 crc kubenswrapper[4962]: I1003 13:12:35.608127 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed812fda-dbe5-48ff-bea4-3795a87d716f-kube-api-access-kbq6m" (OuterVolumeSpecName: "kube-api-access-kbq6m") pod "ed812fda-dbe5-48ff-bea4-3795a87d716f" (UID: "ed812fda-dbe5-48ff-bea4-3795a87d716f"). InnerVolumeSpecName "kube-api-access-kbq6m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:12:35 crc kubenswrapper[4962]: I1003 13:12:35.646737 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed812fda-dbe5-48ff-bea4-3795a87d716f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ed812fda-dbe5-48ff-bea4-3795a87d716f" (UID: "ed812fda-dbe5-48ff-bea4-3795a87d716f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:12:35 crc kubenswrapper[4962]: I1003 13:12:35.646794 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed812fda-dbe5-48ff-bea4-3795a87d716f-config" (OuterVolumeSpecName: "config") pod "ed812fda-dbe5-48ff-bea4-3795a87d716f" (UID: "ed812fda-dbe5-48ff-bea4-3795a87d716f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:12:35 crc kubenswrapper[4962]: I1003 13:12:35.646835 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed812fda-dbe5-48ff-bea4-3795a87d716f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ed812fda-dbe5-48ff-bea4-3795a87d716f" (UID: "ed812fda-dbe5-48ff-bea4-3795a87d716f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:12:36 crc kubenswrapper[4962]: I1003 13:12:35.657434 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed812fda-dbe5-48ff-bea4-3795a87d716f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ed812fda-dbe5-48ff-bea4-3795a87d716f" (UID: "ed812fda-dbe5-48ff-bea4-3795a87d716f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:12:36 crc kubenswrapper[4962]: I1003 13:12:35.673089 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-8rz95" event={"ID":"ed812fda-dbe5-48ff-bea4-3795a87d716f","Type":"ContainerDied","Data":"42352cc4ebf550ac11dddb148b8cec0f43011b77402fd829ad69da3b4d0693e2"} Oct 03 13:12:36 crc kubenswrapper[4962]: I1003 13:12:35.673144 4962 scope.go:117] "RemoveContainer" containerID="1386cbaa0e17306a368bb4a2806672e2c52aaea5f063190c3cad125a74ffa7fc" Oct 03 13:12:36 crc kubenswrapper[4962]: I1003 13:12:35.673280 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-8rz95" Oct 03 13:12:36 crc kubenswrapper[4962]: I1003 13:12:35.704127 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbq6m\" (UniqueName: \"kubernetes.io/projected/ed812fda-dbe5-48ff-bea4-3795a87d716f-kube-api-access-kbq6m\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:36 crc kubenswrapper[4962]: I1003 13:12:35.704151 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed812fda-dbe5-48ff-bea4-3795a87d716f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:36 crc kubenswrapper[4962]: I1003 13:12:35.704162 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed812fda-dbe5-48ff-bea4-3795a87d716f-config\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:36 crc kubenswrapper[4962]: I1003 13:12:35.704171 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed812fda-dbe5-48ff-bea4-3795a87d716f-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:36 crc kubenswrapper[4962]: I1003 13:12:35.704179 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed812fda-dbe5-48ff-bea4-3795a87d716f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:36 crc kubenswrapper[4962]: I1003 13:12:35.711243 4962 scope.go:117] "RemoveContainer" containerID="678ed25fe26650549b04412d89b2a8a75f57962aa6c88529e9fc7bc5487bf4b1" Oct 03 13:12:36 crc kubenswrapper[4962]: I1003 13:12:35.720461 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-8rz95"] Oct 03 13:12:36 crc kubenswrapper[4962]: I1003 13:12:35.727243 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-8rz95"] Oct 03 13:12:36 crc kubenswrapper[4962]: I1003 13:12:35.789374 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-b2fwt"] Oct 03 13:12:36 crc kubenswrapper[4962]: W1003 13:12:35.791254 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68b7fa90_3c0c_4c9d_9709_fdd39699b685.slice/crio-e62e0eae908c24728673ff402254f0232197fbe917637863aeb4aeee506aee7c WatchSource:0}: Error finding container 
e62e0eae908c24728673ff402254f0232197fbe917637863aeb4aeee506aee7c: Status 404 returned error can't find the container with id e62e0eae908c24728673ff402254f0232197fbe917637863aeb4aeee506aee7c Oct 03 13:12:36 crc kubenswrapper[4962]: I1003 13:12:35.835756 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-xgdv7"] Oct 03 13:12:36 crc kubenswrapper[4962]: W1003 13:12:35.838810 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda79e40bc_548c_44b8_8457_c0b8195c6436.slice/crio-758317faceea078aa6b00f81960ad471ac9655d0afb89fcfc425972e9c32e679 WatchSource:0}: Error finding container 758317faceea078aa6b00f81960ad471ac9655d0afb89fcfc425972e9c32e679: Status 404 returned error can't find the container with id 758317faceea078aa6b00f81960ad471ac9655d0afb89fcfc425972e9c32e679 Oct 03 13:12:36 crc kubenswrapper[4962]: I1003 13:12:36.253767 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed812fda-dbe5-48ff-bea4-3795a87d716f" path="/var/lib/kubelet/pods/ed812fda-dbe5-48ff-bea4-3795a87d716f/volumes" Oct 03 13:12:36 crc kubenswrapper[4962]: I1003 13:12:36.352167 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-q9c2h"] Oct 03 13:12:36 crc kubenswrapper[4962]: W1003 13:12:36.352495 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e7ff309_1e7f_4954_a08b_a5cf72982735.slice/crio-f2b48705c33142e701d635b29142e86f6f007906d4fc10daac19e34cb1fff4d5 WatchSource:0}: Error finding container f2b48705c33142e701d635b29142e86f6f007906d4fc10daac19e34cb1fff4d5: Status 404 returned error can't find the container with id f2b48705c33142e701d635b29142e86f6f007906d4fc10daac19e34cb1fff4d5 Oct 03 13:12:36 crc kubenswrapper[4962]: I1003 13:12:36.360565 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-mkw7j"] Oct 03 13:12:36 crc kubenswrapper[4962]: W1003 13:12:36.362954 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8c59dc3_1749_4e12_aa01_09b6ed9934d8.slice/crio-187cd06667205ace0ec82eb153672d369e83d45c49a3bedaf3bde91514842eed WatchSource:0}: Error finding container 187cd06667205ace0ec82eb153672d369e83d45c49a3bedaf3bde91514842eed: Status 404 returned error can't find the container with id 187cd06667205ace0ec82eb153672d369e83d45c49a3bedaf3bde91514842eed Oct 03 13:12:36 crc kubenswrapper[4962]: I1003 13:12:36.689686 4962 generic.go:334] "Generic (PLEG): container finished" podID="d8c59dc3-1749-4e12-aa01-09b6ed9934d8" containerID="91f467d8010c6b11f362e5c8799a00178a2a8b03f7e9bf6ffe2a6502f8bb3a5c" exitCode=0 Oct 03 13:12:36 crc kubenswrapper[4962]: I1003 13:12:36.689765 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-q9c2h" event={"ID":"d8c59dc3-1749-4e12-aa01-09b6ed9934d8","Type":"ContainerDied","Data":"91f467d8010c6b11f362e5c8799a00178a2a8b03f7e9bf6ffe2a6502f8bb3a5c"} Oct 03 13:12:36 crc kubenswrapper[4962]: I1003 13:12:36.689844 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-q9c2h" event={"ID":"d8c59dc3-1749-4e12-aa01-09b6ed9934d8","Type":"ContainerStarted","Data":"187cd06667205ace0ec82eb153672d369e83d45c49a3bedaf3bde91514842eed"} Oct 03 13:12:36 crc kubenswrapper[4962]: I1003 13:12:36.691752 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-db-sync-xgdv7" event={"ID":"a79e40bc-548c-44b8-8457-c0b8195c6436","Type":"ContainerStarted","Data":"758317faceea078aa6b00f81960ad471ac9655d0afb89fcfc425972e9c32e679"} Oct 03 13:12:36 crc kubenswrapper[4962]: I1003 13:12:36.693368 4962 generic.go:334] "Generic (PLEG): container finished" podID="68b7fa90-3c0c-4c9d-9709-fdd39699b685" containerID="858a9acb02475f1fa475e73f4f54f3c31ea0b2dd734ab2184ad2c2ef50d094b1" exitCode=0 Oct 03 13:12:36 crc kubenswrapper[4962]: I1003 13:12:36.693434 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-b2fwt" event={"ID":"68b7fa90-3c0c-4c9d-9709-fdd39699b685","Type":"ContainerDied","Data":"858a9acb02475f1fa475e73f4f54f3c31ea0b2dd734ab2184ad2c2ef50d094b1"} Oct 03 13:12:36 crc kubenswrapper[4962]: I1003 13:12:36.693456 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-b2fwt" event={"ID":"68b7fa90-3c0c-4c9d-9709-fdd39699b685","Type":"ContainerStarted","Data":"e62e0eae908c24728673ff402254f0232197fbe917637863aeb4aeee506aee7c"} Oct 03 13:12:36 crc kubenswrapper[4962]: I1003 13:12:36.695074 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wrqgr" event={"ID":"f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56","Type":"ContainerStarted","Data":"dfa93acbbb91865d0cb4ff792e49c1fab6703cc34cf33248fb44323393847688"} Oct 03 13:12:36 crc kubenswrapper[4962]: I1003 13:12:36.696415 4962 generic.go:334] "Generic (PLEG): container finished" podID="4e7ff309-1e7f-4954-a08b-a5cf72982735" containerID="9128b3882b18799c52b8901cee877983abb64473aa57b7378c6cd26e45351340" exitCode=0 Oct 03 13:12:36 crc kubenswrapper[4962]: I1003 13:12:36.696446 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-mkw7j" event={"ID":"4e7ff309-1e7f-4954-a08b-a5cf72982735","Type":"ContainerDied","Data":"9128b3882b18799c52b8901cee877983abb64473aa57b7378c6cd26e45351340"} Oct 03 13:12:36 crc kubenswrapper[4962]: I1003 13:12:36.696465 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-mkw7j" event={"ID":"4e7ff309-1e7f-4954-a08b-a5cf72982735","Type":"ContainerStarted","Data":"f2b48705c33142e701d635b29142e86f6f007906d4fc10daac19e34cb1fff4d5"} Oct 03 13:12:36 crc kubenswrapper[4962]: I1003 13:12:36.714312 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-wrqgr" podStartSLOduration=2.933978106 podStartE2EDuration="15.714245936s" podCreationTimestamp="2025-10-03 13:12:21 +0000 UTC" firstStartedPulling="2025-10-03 13:12:22.583863027 +0000 UTC m=+1350.987760862" lastFinishedPulling="2025-10-03 13:12:35.364130857 +0000 UTC m=+1363.768028692" observedRunningTime="2025-10-03 13:12:36.713171387 +0000 UTC m=+1365.117069222" watchObservedRunningTime="2025-10-03 13:12:36.714245936 +0000 UTC m=+1365.118143791" Oct 03 13:12:40 crc kubenswrapper[4962]: I1003 13:12:40.161941 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-mkw7j" Oct 03 13:12:40 crc kubenswrapper[4962]: I1003 13:12:40.167809 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-q9c2h" Oct 03 13:12:40 crc kubenswrapper[4962]: I1003 13:12:40.173351 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-b2fwt" Oct 03 13:12:40 crc kubenswrapper[4962]: I1003 13:12:40.275446 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llr8r\" (UniqueName: \"kubernetes.io/projected/68b7fa90-3c0c-4c9d-9709-fdd39699b685-kube-api-access-llr8r\") pod \"68b7fa90-3c0c-4c9d-9709-fdd39699b685\" (UID: \"68b7fa90-3c0c-4c9d-9709-fdd39699b685\") " Oct 03 13:12:40 crc kubenswrapper[4962]: I1003 13:12:40.275562 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8swl\" (UniqueName: \"kubernetes.io/projected/4e7ff309-1e7f-4954-a08b-a5cf72982735-kube-api-access-j8swl\") pod \"4e7ff309-1e7f-4954-a08b-a5cf72982735\" (UID: \"4e7ff309-1e7f-4954-a08b-a5cf72982735\") " Oct 03 13:12:40 crc kubenswrapper[4962]: I1003 13:12:40.275632 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkc9n\" (UniqueName: \"kubernetes.io/projected/d8c59dc3-1749-4e12-aa01-09b6ed9934d8-kube-api-access-fkc9n\") pod \"d8c59dc3-1749-4e12-aa01-09b6ed9934d8\" (UID: \"d8c59dc3-1749-4e12-aa01-09b6ed9934d8\") " Oct 03 13:12:40 crc kubenswrapper[4962]: I1003 13:12:40.286163 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8c59dc3-1749-4e12-aa01-09b6ed9934d8-kube-api-access-fkc9n" (OuterVolumeSpecName: "kube-api-access-fkc9n") pod "d8c59dc3-1749-4e12-aa01-09b6ed9934d8" (UID: "d8c59dc3-1749-4e12-aa01-09b6ed9934d8"). InnerVolumeSpecName "kube-api-access-fkc9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:12:40 crc kubenswrapper[4962]: I1003 13:12:40.286941 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68b7fa90-3c0c-4c9d-9709-fdd39699b685-kube-api-access-llr8r" (OuterVolumeSpecName: "kube-api-access-llr8r") pod "68b7fa90-3c0c-4c9d-9709-fdd39699b685" (UID: "68b7fa90-3c0c-4c9d-9709-fdd39699b685"). InnerVolumeSpecName "kube-api-access-llr8r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:12:40 crc kubenswrapper[4962]: I1003 13:12:40.288969 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e7ff309-1e7f-4954-a08b-a5cf72982735-kube-api-access-j8swl" (OuterVolumeSpecName: "kube-api-access-j8swl") pod "4e7ff309-1e7f-4954-a08b-a5cf72982735" (UID: "4e7ff309-1e7f-4954-a08b-a5cf72982735"). InnerVolumeSpecName "kube-api-access-j8swl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:12:40 crc kubenswrapper[4962]: I1003 13:12:40.377472 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8swl\" (UniqueName: \"kubernetes.io/projected/4e7ff309-1e7f-4954-a08b-a5cf72982735-kube-api-access-j8swl\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:40 crc kubenswrapper[4962]: I1003 13:12:40.377514 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkc9n\" (UniqueName: \"kubernetes.io/projected/d8c59dc3-1749-4e12-aa01-09b6ed9934d8-kube-api-access-fkc9n\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:40 crc kubenswrapper[4962]: I1003 13:12:40.377523 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llr8r\" (UniqueName: \"kubernetes.io/projected/68b7fa90-3c0c-4c9d-9709-fdd39699b685-kube-api-access-llr8r\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:40 crc kubenswrapper[4962]: I1003 13:12:40.729807 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-mkw7j" event={"ID":"4e7ff309-1e7f-4954-a08b-a5cf72982735","Type":"ContainerDied","Data":"f2b48705c33142e701d635b29142e86f6f007906d4fc10daac19e34cb1fff4d5"} Oct 03 13:12:40 crc kubenswrapper[4962]: I1003 13:12:40.729850 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2b48705c33142e701d635b29142e86f6f007906d4fc10daac19e34cb1fff4d5" Oct 03 13:12:40 crc kubenswrapper[4962]: I1003 13:12:40.729891 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-mkw7j" Oct 03 13:12:40 crc kubenswrapper[4962]: I1003 13:12:40.731770 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-q9c2h" event={"ID":"d8c59dc3-1749-4e12-aa01-09b6ed9934d8","Type":"ContainerDied","Data":"187cd06667205ace0ec82eb153672d369e83d45c49a3bedaf3bde91514842eed"} Oct 03 13:12:40 crc kubenswrapper[4962]: I1003 13:12:40.731799 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="187cd06667205ace0ec82eb153672d369e83d45c49a3bedaf3bde91514842eed" Oct 03 13:12:40 crc kubenswrapper[4962]: I1003 13:12:40.731808 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-q9c2h" Oct 03 13:12:40 crc kubenswrapper[4962]: I1003 13:12:40.733364 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-b2fwt" event={"ID":"68b7fa90-3c0c-4c9d-9709-fdd39699b685","Type":"ContainerDied","Data":"e62e0eae908c24728673ff402254f0232197fbe917637863aeb4aeee506aee7c"} Oct 03 13:12:40 crc kubenswrapper[4962]: I1003 13:12:40.733396 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e62e0eae908c24728673ff402254f0232197fbe917637863aeb4aeee506aee7c" Oct 03 13:12:40 crc kubenswrapper[4962]: I1003 13:12:40.733443 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-b2fwt" Oct 03 13:12:41 crc kubenswrapper[4962]: I1003 13:12:41.742588 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xgdv7" event={"ID":"a79e40bc-548c-44b8-8457-c0b8195c6436","Type":"ContainerStarted","Data":"5fc9ed21237586b0dba4b83e5d63c7c19a71fd2d4afd7ab0b4d8c03e1f54bb41"} Oct 03 13:12:41 crc kubenswrapper[4962]: I1003 13:12:41.768227 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-xgdv7" podStartSLOduration=8.371978248 podStartE2EDuration="13.768208515s" podCreationTimestamp="2025-10-03 13:12:28 +0000 UTC" firstStartedPulling="2025-10-03 13:12:35.840830451 +0000 UTC m=+1364.244728286" lastFinishedPulling="2025-10-03 13:12:41.237060718 +0000 UTC m=+1369.640958553" observedRunningTime="2025-10-03 13:12:41.762708277 +0000 UTC m=+1370.166606132" watchObservedRunningTime="2025-10-03 13:12:41.768208515 +0000 UTC m=+1370.172106350" Oct 03 13:12:42 crc kubenswrapper[4962]: I1003 13:12:42.752317 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56" containerID="dfa93acbbb91865d0cb4ff792e49c1fab6703cc34cf33248fb44323393847688" exitCode=0 Oct 03 13:12:42 crc kubenswrapper[4962]: I1003 13:12:42.752406 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wrqgr" event={"ID":"f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56","Type":"ContainerDied","Data":"dfa93acbbb91865d0cb4ff792e49c1fab6703cc34cf33248fb44323393847688"} Oct 03 13:12:44 crc kubenswrapper[4962]: I1003 13:12:44.137895 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-wrqgr" Oct 03 13:12:44 crc kubenswrapper[4962]: I1003 13:12:44.241518 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56-db-sync-config-data\") pod \"f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56\" (UID: \"f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56\") " Oct 03 13:12:44 crc kubenswrapper[4962]: I1003 13:12:44.241558 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56-combined-ca-bundle\") pod \"f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56\" (UID: \"f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56\") " Oct 03 13:12:44 crc kubenswrapper[4962]: I1003 13:12:44.241660 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmlsh\" (UniqueName: \"kubernetes.io/projected/f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56-kube-api-access-wmlsh\") pod \"f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56\" (UID: \"f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56\") " Oct 03 13:12:44 crc kubenswrapper[4962]: I1003 13:12:44.241697 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56-config-data\") pod \"f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56\" (UID: \"f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56\") " Oct 03 13:12:44 crc kubenswrapper[4962]: I1003 13:12:44.247008 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56-kube-api-access-wmlsh" (OuterVolumeSpecName: "kube-api-access-wmlsh") pod "f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56" (UID: "f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56"). 
InnerVolumeSpecName "kube-api-access-wmlsh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:12:44 crc kubenswrapper[4962]: I1003 13:12:44.250328 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56" (UID: "f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:12:44 crc kubenswrapper[4962]: I1003 13:12:44.272462 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56" (UID: "f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:12:44 crc kubenswrapper[4962]: I1003 13:12:44.290802 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56-config-data" (OuterVolumeSpecName: "config-data") pod "f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56" (UID: "f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:12:44 crc kubenswrapper[4962]: I1003 13:12:44.343705 4962 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:44 crc kubenswrapper[4962]: I1003 13:12:44.343737 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:44 crc kubenswrapper[4962]: I1003 13:12:44.343746 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmlsh\" (UniqueName: \"kubernetes.io/projected/f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56-kube-api-access-wmlsh\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:44 crc kubenswrapper[4962]: I1003 13:12:44.343757 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:44 crc kubenswrapper[4962]: I1003 13:12:44.769588 4962 generic.go:334] "Generic (PLEG): container finished" podID="a79e40bc-548c-44b8-8457-c0b8195c6436" containerID="5fc9ed21237586b0dba4b83e5d63c7c19a71fd2d4afd7ab0b4d8c03e1f54bb41" exitCode=0 Oct 03 13:12:44 crc kubenswrapper[4962]: I1003 13:12:44.769684 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xgdv7" event={"ID":"a79e40bc-548c-44b8-8457-c0b8195c6436","Type":"ContainerDied","Data":"5fc9ed21237586b0dba4b83e5d63c7c19a71fd2d4afd7ab0b4d8c03e1f54bb41"} Oct 03 13:12:44 crc kubenswrapper[4962]: I1003 13:12:44.771576 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wrqgr" event={"ID":"f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56","Type":"ContainerDied","Data":"deb461a4f17a7f355c82858d1d7b043343d1b58137e6482bab4b080070c1fc31"} Oct 03 13:12:44 crc kubenswrapper[4962]: I1003 13:12:44.771602 4962 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="deb461a4f17a7f355c82858d1d7b043343d1b58137e6482bab4b080070c1fc31" Oct 03 13:12:44 crc kubenswrapper[4962]: I1003 13:12:44.771692 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-wrqgr" Oct 03 13:12:45 crc kubenswrapper[4962]: I1003 13:12:45.135731 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-f6hkh"] Oct 03 13:12:45 crc kubenswrapper[4962]: E1003 13:12:45.136137 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8c59dc3-1749-4e12-aa01-09b6ed9934d8" containerName="mariadb-database-create" Oct 03 13:12:45 crc kubenswrapper[4962]: I1003 13:12:45.136159 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8c59dc3-1749-4e12-aa01-09b6ed9934d8" containerName="mariadb-database-create" Oct 03 13:12:45 crc kubenswrapper[4962]: E1003 13:12:45.136176 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56" containerName="glance-db-sync" Oct 03 13:12:45 crc kubenswrapper[4962]: I1003 13:12:45.136184 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56" containerName="glance-db-sync" Oct 03 13:12:45 crc kubenswrapper[4962]: E1003 13:12:45.136206 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed812fda-dbe5-48ff-bea4-3795a87d716f" containerName="init" Oct 03 13:12:45 crc kubenswrapper[4962]: I1003 13:12:45.136214 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed812fda-dbe5-48ff-bea4-3795a87d716f" containerName="init" Oct 03 13:12:45 crc kubenswrapper[4962]: E1003 13:12:45.136228 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68b7fa90-3c0c-4c9d-9709-fdd39699b685" containerName="mariadb-database-create" Oct 03 13:12:45 crc kubenswrapper[4962]: I1003 13:12:45.136236 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="68b7fa90-3c0c-4c9d-9709-fdd39699b685" containerName="mariadb-database-create" Oct 03 13:12:45 crc kubenswrapper[4962]: E1003 13:12:45.136249 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed812fda-dbe5-48ff-bea4-3795a87d716f" containerName="dnsmasq-dns" Oct 03 13:12:45 crc kubenswrapper[4962]: I1003 13:12:45.136256 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed812fda-dbe5-48ff-bea4-3795a87d716f" containerName="dnsmasq-dns" Oct 03 13:12:45 crc kubenswrapper[4962]: E1003 13:12:45.136267 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e7ff309-1e7f-4954-a08b-a5cf72982735" containerName="mariadb-database-create" Oct 03 13:12:45 crc kubenswrapper[4962]: I1003 13:12:45.136274 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e7ff309-1e7f-4954-a08b-a5cf72982735" containerName="mariadb-database-create" Oct 03 13:12:45 crc kubenswrapper[4962]: I1003 13:12:45.136487 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e7ff309-1e7f-4954-a08b-a5cf72982735" containerName="mariadb-database-create" Oct 03 13:12:45 crc kubenswrapper[4962]: I1003 13:12:45.136501 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8c59dc3-1749-4e12-aa01-09b6ed9934d8" containerName="mariadb-database-create" Oct 03 13:12:45 crc kubenswrapper[4962]: I1003 13:12:45.136513 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56" containerName="glance-db-sync" Oct 03 13:12:45 crc kubenswrapper[4962]: I1003 13:12:45.136523 4962 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="ed812fda-dbe5-48ff-bea4-3795a87d716f" containerName="dnsmasq-dns" Oct 03 13:12:45 crc kubenswrapper[4962]: I1003 13:12:45.136539 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="68b7fa90-3c0c-4c9d-9709-fdd39699b685" containerName="mariadb-database-create" Oct 03 13:12:45 crc kubenswrapper[4962]: I1003 13:12:45.140011 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-f6hkh" Oct 03 13:12:45 crc kubenswrapper[4962]: I1003 13:12:45.155420 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f6b4e989-13b2-4a64-8620-e80920347ba5-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff5475cc9-f6hkh\" (UID: \"f6b4e989-13b2-4a64-8620-e80920347ba5\") " pod="openstack/dnsmasq-dns-7ff5475cc9-f6hkh" Oct 03 13:12:45 crc kubenswrapper[4962]: I1003 13:12:45.155469 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f6b4e989-13b2-4a64-8620-e80920347ba5-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-f6hkh\" (UID: \"f6b4e989-13b2-4a64-8620-e80920347ba5\") " pod="openstack/dnsmasq-dns-7ff5475cc9-f6hkh" Oct 03 13:12:45 crc kubenswrapper[4962]: I1003 13:12:45.155489 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f6b4e989-13b2-4a64-8620-e80920347ba5-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-f6hkh\" (UID: \"f6b4e989-13b2-4a64-8620-e80920347ba5\") " pod="openstack/dnsmasq-dns-7ff5475cc9-f6hkh" Oct 03 13:12:45 crc kubenswrapper[4962]: I1003 13:12:45.155538 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzs6d\" (UniqueName: \"kubernetes.io/projected/f6b4e989-13b2-4a64-8620-e80920347ba5-kube-api-access-zzs6d\") pod \"dnsmasq-dns-7ff5475cc9-f6hkh\" (UID: \"f6b4e989-13b2-4a64-8620-e80920347ba5\") " pod="openstack/dnsmasq-dns-7ff5475cc9-f6hkh" Oct 03 13:12:45 crc kubenswrapper[4962]: I1003 13:12:45.155620 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6b4e989-13b2-4a64-8620-e80920347ba5-config\") pod \"dnsmasq-dns-7ff5475cc9-f6hkh\" (UID: \"f6b4e989-13b2-4a64-8620-e80920347ba5\") " pod="openstack/dnsmasq-dns-7ff5475cc9-f6hkh" Oct 03 13:12:45 crc kubenswrapper[4962]: I1003 13:12:45.155823 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f6b4e989-13b2-4a64-8620-e80920347ba5-dns-svc\") pod \"dnsmasq-dns-7ff5475cc9-f6hkh\" (UID: \"f6b4e989-13b2-4a64-8620-e80920347ba5\") " pod="openstack/dnsmasq-dns-7ff5475cc9-f6hkh" Oct 03 13:12:45 crc kubenswrapper[4962]: I1003 13:12:45.159785 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-f6hkh"] Oct 03 13:12:45 crc kubenswrapper[4962]: I1003 13:12:45.257809 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzs6d\" (UniqueName: \"kubernetes.io/projected/f6b4e989-13b2-4a64-8620-e80920347ba5-kube-api-access-zzs6d\") pod \"dnsmasq-dns-7ff5475cc9-f6hkh\" (UID: \"f6b4e989-13b2-4a64-8620-e80920347ba5\") " pod="openstack/dnsmasq-dns-7ff5475cc9-f6hkh" Oct 03 13:12:45 crc kubenswrapper[4962]: I1003 
13:12:45.257919 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6b4e989-13b2-4a64-8620-e80920347ba5-config\") pod \"dnsmasq-dns-7ff5475cc9-f6hkh\" (UID: \"f6b4e989-13b2-4a64-8620-e80920347ba5\") " pod="openstack/dnsmasq-dns-7ff5475cc9-f6hkh" Oct 03 13:12:45 crc kubenswrapper[4962]: I1003 13:12:45.257987 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f6b4e989-13b2-4a64-8620-e80920347ba5-dns-svc\") pod \"dnsmasq-dns-7ff5475cc9-f6hkh\" (UID: \"f6b4e989-13b2-4a64-8620-e80920347ba5\") " pod="openstack/dnsmasq-dns-7ff5475cc9-f6hkh" Oct 03 13:12:45 crc kubenswrapper[4962]: I1003 13:12:45.258075 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f6b4e989-13b2-4a64-8620-e80920347ba5-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff5475cc9-f6hkh\" (UID: \"f6b4e989-13b2-4a64-8620-e80920347ba5\") " pod="openstack/dnsmasq-dns-7ff5475cc9-f6hkh" Oct 03 13:12:45 crc kubenswrapper[4962]: I1003 13:12:45.258118 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f6b4e989-13b2-4a64-8620-e80920347ba5-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-f6hkh\" (UID: \"f6b4e989-13b2-4a64-8620-e80920347ba5\") " pod="openstack/dnsmasq-dns-7ff5475cc9-f6hkh" Oct 03 13:12:45 crc kubenswrapper[4962]: I1003 13:12:45.258139 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f6b4e989-13b2-4a64-8620-e80920347ba5-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-f6hkh\" (UID: \"f6b4e989-13b2-4a64-8620-e80920347ba5\") " pod="openstack/dnsmasq-dns-7ff5475cc9-f6hkh" Oct 03 13:12:45 crc kubenswrapper[4962]: I1003 13:12:45.261089 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6b4e989-13b2-4a64-8620-e80920347ba5-config\") pod \"dnsmasq-dns-7ff5475cc9-f6hkh\" (UID: \"f6b4e989-13b2-4a64-8620-e80920347ba5\") " pod="openstack/dnsmasq-dns-7ff5475cc9-f6hkh" Oct 03 13:12:45 crc kubenswrapper[4962]: I1003 13:12:45.261691 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f6b4e989-13b2-4a64-8620-e80920347ba5-dns-svc\") pod \"dnsmasq-dns-7ff5475cc9-f6hkh\" (UID: \"f6b4e989-13b2-4a64-8620-e80920347ba5\") " pod="openstack/dnsmasq-dns-7ff5475cc9-f6hkh" Oct 03 13:12:45 crc kubenswrapper[4962]: I1003 13:12:45.262275 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f6b4e989-13b2-4a64-8620-e80920347ba5-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-f6hkh\" (UID: \"f6b4e989-13b2-4a64-8620-e80920347ba5\") " pod="openstack/dnsmasq-dns-7ff5475cc9-f6hkh" Oct 03 13:12:45 crc kubenswrapper[4962]: I1003 13:12:45.265231 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f6b4e989-13b2-4a64-8620-e80920347ba5-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff5475cc9-f6hkh\" (UID: \"f6b4e989-13b2-4a64-8620-e80920347ba5\") " pod="openstack/dnsmasq-dns-7ff5475cc9-f6hkh" Oct 03 13:12:45 crc kubenswrapper[4962]: I1003 13:12:45.265294 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/f6b4e989-13b2-4a64-8620-e80920347ba5-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-f6hkh\" (UID: \"f6b4e989-13b2-4a64-8620-e80920347ba5\") " pod="openstack/dnsmasq-dns-7ff5475cc9-f6hkh" Oct 03 13:12:45 crc kubenswrapper[4962]: I1003 13:12:45.282457 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzs6d\" (UniqueName: \"kubernetes.io/projected/f6b4e989-13b2-4a64-8620-e80920347ba5-kube-api-access-zzs6d\") pod \"dnsmasq-dns-7ff5475cc9-f6hkh\" (UID: \"f6b4e989-13b2-4a64-8620-e80920347ba5\") " pod="openstack/dnsmasq-dns-7ff5475cc9-f6hkh" Oct 03 13:12:45 crc kubenswrapper[4962]: I1003 13:12:45.460199 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-f6hkh" Oct 03 13:12:45 crc kubenswrapper[4962]: I1003 13:12:45.942716 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-f6hkh"] Oct 03 13:12:46 crc kubenswrapper[4962]: I1003 13:12:46.095656 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-xgdv7" Oct 03 13:12:46 crc kubenswrapper[4962]: I1003 13:12:46.195863 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlgfp\" (UniqueName: \"kubernetes.io/projected/a79e40bc-548c-44b8-8457-c0b8195c6436-kube-api-access-wlgfp\") pod \"a79e40bc-548c-44b8-8457-c0b8195c6436\" (UID: \"a79e40bc-548c-44b8-8457-c0b8195c6436\") " Oct 03 13:12:46 crc kubenswrapper[4962]: I1003 13:12:46.196172 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a79e40bc-548c-44b8-8457-c0b8195c6436-config-data\") pod \"a79e40bc-548c-44b8-8457-c0b8195c6436\" (UID: \"a79e40bc-548c-44b8-8457-c0b8195c6436\") " Oct 03 13:12:46 crc kubenswrapper[4962]: I1003 13:12:46.196383 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a79e40bc-548c-44b8-8457-c0b8195c6436-combined-ca-bundle\") pod \"a79e40bc-548c-44b8-8457-c0b8195c6436\" (UID: \"a79e40bc-548c-44b8-8457-c0b8195c6436\") " Oct 03 13:12:46 crc kubenswrapper[4962]: I1003 13:12:46.200916 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a79e40bc-548c-44b8-8457-c0b8195c6436-kube-api-access-wlgfp" (OuterVolumeSpecName: "kube-api-access-wlgfp") pod "a79e40bc-548c-44b8-8457-c0b8195c6436" (UID: "a79e40bc-548c-44b8-8457-c0b8195c6436"). InnerVolumeSpecName "kube-api-access-wlgfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:12:46 crc kubenswrapper[4962]: I1003 13:12:46.223837 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a79e40bc-548c-44b8-8457-c0b8195c6436-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a79e40bc-548c-44b8-8457-c0b8195c6436" (UID: "a79e40bc-548c-44b8-8457-c0b8195c6436"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:12:46 crc kubenswrapper[4962]: I1003 13:12:46.254244 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a79e40bc-548c-44b8-8457-c0b8195c6436-config-data" (OuterVolumeSpecName: "config-data") pod "a79e40bc-548c-44b8-8457-c0b8195c6436" (UID: "a79e40bc-548c-44b8-8457-c0b8195c6436"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:12:46 crc kubenswrapper[4962]: I1003 13:12:46.298812 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a79e40bc-548c-44b8-8457-c0b8195c6436-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:46 crc kubenswrapper[4962]: I1003 13:12:46.298853 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlgfp\" (UniqueName: \"kubernetes.io/projected/a79e40bc-548c-44b8-8457-c0b8195c6436-kube-api-access-wlgfp\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:46 crc kubenswrapper[4962]: I1003 13:12:46.298866 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a79e40bc-548c-44b8-8457-c0b8195c6436-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:46 crc kubenswrapper[4962]: I1003 13:12:46.790898 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xgdv7" event={"ID":"a79e40bc-548c-44b8-8457-c0b8195c6436","Type":"ContainerDied","Data":"758317faceea078aa6b00f81960ad471ac9655d0afb89fcfc425972e9c32e679"} Oct 03 13:12:46 crc kubenswrapper[4962]: I1003 13:12:46.790944 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="758317faceea078aa6b00f81960ad471ac9655d0afb89fcfc425972e9c32e679" Oct 03 13:12:46 crc kubenswrapper[4962]: I1003 13:12:46.790966 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-xgdv7" Oct 03 13:12:46 crc kubenswrapper[4962]: I1003 13:12:46.792677 4962 generic.go:334] "Generic (PLEG): container finished" podID="f6b4e989-13b2-4a64-8620-e80920347ba5" containerID="4d8292ae4dc470a2be3c4cea899d487a8d7562241e7954392291fd9bd6f758d5" exitCode=0 Oct 03 13:12:46 crc kubenswrapper[4962]: I1003 13:12:46.792719 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-f6hkh" event={"ID":"f6b4e989-13b2-4a64-8620-e80920347ba5","Type":"ContainerDied","Data":"4d8292ae4dc470a2be3c4cea899d487a8d7562241e7954392291fd9bd6f758d5"} Oct 03 13:12:46 crc kubenswrapper[4962]: I1003 13:12:46.792748 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-f6hkh" event={"ID":"f6b4e989-13b2-4a64-8620-e80920347ba5","Type":"ContainerStarted","Data":"083af7f397a9e758174acd53aa71a3b73c3d4845ca03c494fb35b3faab3283e7"} Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.039548 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-f6hkh"] Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.050096 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-6n6zg"] Oct 03 13:12:47 crc kubenswrapper[4962]: E1003 13:12:47.050623 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a79e40bc-548c-44b8-8457-c0b8195c6436" containerName="keystone-db-sync" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.050659 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a79e40bc-548c-44b8-8457-c0b8195c6436" containerName="keystone-db-sync" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.050829 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a79e40bc-548c-44b8-8457-c0b8195c6436" containerName="keystone-db-sync" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.051331 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-6n6zg" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.053732 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.053972 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rbx44" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.054075 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.064081 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-6n6zg"] Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.067733 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.088450 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-6ngsj"] Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.095930 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-6ngsj" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.115507 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-6ngsj"] Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.212497 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82810633-84f3-4644-910c-57d359ec2ac3-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5cc7c5ff-6ngsj\" (UID: \"82810633-84f3-4644-910c-57d359ec2ac3\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-6ngsj" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.212547 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls8lf\" (UniqueName: \"kubernetes.io/projected/82810633-84f3-4644-910c-57d359ec2ac3-kube-api-access-ls8lf\") pod \"dnsmasq-dns-5c5cc7c5ff-6ngsj\" (UID: \"82810633-84f3-4644-910c-57d359ec2ac3\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-6ngsj" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.212574 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/78924dc1-86de-47ce-bfb7-d856e1d9e25a-credential-keys\") pod \"keystone-bootstrap-6n6zg\" (UID: \"78924dc1-86de-47ce-bfb7-d856e1d9e25a\") " pod="openstack/keystone-bootstrap-6n6zg" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.212596 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/78924dc1-86de-47ce-bfb7-d856e1d9e25a-fernet-keys\") pod \"keystone-bootstrap-6n6zg\" (UID: \"78924dc1-86de-47ce-bfb7-d856e1d9e25a\") " pod="openstack/keystone-bootstrap-6n6zg" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.212617 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78924dc1-86de-47ce-bfb7-d856e1d9e25a-config-data\") pod \"keystone-bootstrap-6n6zg\" (UID: \"78924dc1-86de-47ce-bfb7-d856e1d9e25a\") " pod="openstack/keystone-bootstrap-6n6zg" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.212684 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78924dc1-86de-47ce-bfb7-d856e1d9e25a-scripts\") pod \"keystone-bootstrap-6n6zg\" (UID: \"78924dc1-86de-47ce-bfb7-d856e1d9e25a\") " pod="openstack/keystone-bootstrap-6n6zg" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.212715 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/82810633-84f3-4644-910c-57d359ec2ac3-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-6ngsj\" (UID: \"82810633-84f3-4644-910c-57d359ec2ac3\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-6ngsj" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.212733 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82810633-84f3-4644-910c-57d359ec2ac3-dns-svc\") pod \"dnsmasq-dns-5c5cc7c5ff-6ngsj\" (UID: \"82810633-84f3-4644-910c-57d359ec2ac3\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-6ngsj" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.212759 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82810633-84f3-4644-910c-57d359ec2ac3-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5cc7c5ff-6ngsj\" (UID: \"82810633-84f3-4644-910c-57d359ec2ac3\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-6ngsj" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.212784 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82810633-84f3-4644-910c-57d359ec2ac3-config\") pod \"dnsmasq-dns-5c5cc7c5ff-6ngsj\" (UID: \"82810633-84f3-4644-910c-57d359ec2ac3\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-6ngsj" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.212819 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78924dc1-86de-47ce-bfb7-d856e1d9e25a-combined-ca-bundle\") pod \"keystone-bootstrap-6n6zg\" (UID: \"78924dc1-86de-47ce-bfb7-d856e1d9e25a\") " pod="openstack/keystone-bootstrap-6n6zg" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.212841 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcm84\" (UniqueName: \"kubernetes.io/projected/78924dc1-86de-47ce-bfb7-d856e1d9e25a-kube-api-access-lcm84\") pod \"keystone-bootstrap-6n6zg\" (UID: \"78924dc1-86de-47ce-bfb7-d856e1d9e25a\") " pod="openstack/keystone-bootstrap-6n6zg" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.220325 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.225720 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.231052 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.231245 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.239426 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.315677 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78924dc1-86de-47ce-bfb7-d856e1d9e25a-scripts\") pod \"keystone-bootstrap-6n6zg\" (UID: \"78924dc1-86de-47ce-bfb7-d856e1d9e25a\") " pod="openstack/keystone-bootstrap-6n6zg" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.315744 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/82810633-84f3-4644-910c-57d359ec2ac3-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-6ngsj\" (UID: \"82810633-84f3-4644-910c-57d359ec2ac3\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-6ngsj" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.315778 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82810633-84f3-4644-910c-57d359ec2ac3-dns-svc\") pod \"dnsmasq-dns-5c5cc7c5ff-6ngsj\" (UID: \"82810633-84f3-4644-910c-57d359ec2ac3\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-6ngsj" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.315820 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82810633-84f3-4644-910c-57d359ec2ac3-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5cc7c5ff-6ngsj\" (UID: \"82810633-84f3-4644-910c-57d359ec2ac3\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-6ngsj" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.315851 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82810633-84f3-4644-910c-57d359ec2ac3-config\") pod \"dnsmasq-dns-5c5cc7c5ff-6ngsj\" (UID: \"82810633-84f3-4644-910c-57d359ec2ac3\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-6ngsj" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.315898 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78924dc1-86de-47ce-bfb7-d856e1d9e25a-combined-ca-bundle\") pod \"keystone-bootstrap-6n6zg\" (UID: \"78924dc1-86de-47ce-bfb7-d856e1d9e25a\") " pod="openstack/keystone-bootstrap-6n6zg" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.315930 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcm84\" (UniqueName: \"kubernetes.io/projected/78924dc1-86de-47ce-bfb7-d856e1d9e25a-kube-api-access-lcm84\") pod \"keystone-bootstrap-6n6zg\" (UID: \"78924dc1-86de-47ce-bfb7-d856e1d9e25a\") " pod="openstack/keystone-bootstrap-6n6zg" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.315953 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82810633-84f3-4644-910c-57d359ec2ac3-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5cc7c5ff-6ngsj\" (UID: \"82810633-84f3-4644-910c-57d359ec2ac3\") 
" pod="openstack/dnsmasq-dns-5c5cc7c5ff-6ngsj" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.315980 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls8lf\" (UniqueName: \"kubernetes.io/projected/82810633-84f3-4644-910c-57d359ec2ac3-kube-api-access-ls8lf\") pod \"dnsmasq-dns-5c5cc7c5ff-6ngsj\" (UID: \"82810633-84f3-4644-910c-57d359ec2ac3\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-6ngsj" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.316009 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/78924dc1-86de-47ce-bfb7-d856e1d9e25a-credential-keys\") pod \"keystone-bootstrap-6n6zg\" (UID: \"78924dc1-86de-47ce-bfb7-d856e1d9e25a\") " pod="openstack/keystone-bootstrap-6n6zg" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.316037 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/78924dc1-86de-47ce-bfb7-d856e1d9e25a-fernet-keys\") pod \"keystone-bootstrap-6n6zg\" (UID: \"78924dc1-86de-47ce-bfb7-d856e1d9e25a\") " pod="openstack/keystone-bootstrap-6n6zg" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.316059 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78924dc1-86de-47ce-bfb7-d856e1d9e25a-config-data\") pod \"keystone-bootstrap-6n6zg\" (UID: \"78924dc1-86de-47ce-bfb7-d856e1d9e25a\") " pod="openstack/keystone-bootstrap-6n6zg" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.319776 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82810633-84f3-4644-910c-57d359ec2ac3-config\") pod \"dnsmasq-dns-5c5cc7c5ff-6ngsj\" (UID: \"82810633-84f3-4644-910c-57d359ec2ac3\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-6ngsj" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.321716 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82810633-84f3-4644-910c-57d359ec2ac3-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5cc7c5ff-6ngsj\" (UID: \"82810633-84f3-4644-910c-57d359ec2ac3\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-6ngsj" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.321979 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/82810633-84f3-4644-910c-57d359ec2ac3-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-6ngsj\" (UID: \"82810633-84f3-4644-910c-57d359ec2ac3\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-6ngsj" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.322454 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82810633-84f3-4644-910c-57d359ec2ac3-dns-svc\") pod \"dnsmasq-dns-5c5cc7c5ff-6ngsj\" (UID: \"82810633-84f3-4644-910c-57d359ec2ac3\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-6ngsj" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.322903 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82810633-84f3-4644-910c-57d359ec2ac3-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5cc7c5ff-6ngsj\" (UID: \"82810633-84f3-4644-910c-57d359ec2ac3\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-6ngsj" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.327325 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/78924dc1-86de-47ce-bfb7-d856e1d9e25a-credential-keys\") pod \"keystone-bootstrap-6n6zg\" (UID: \"78924dc1-86de-47ce-bfb7-d856e1d9e25a\") " pod="openstack/keystone-bootstrap-6n6zg" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.329554 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78924dc1-86de-47ce-bfb7-d856e1d9e25a-config-data\") pod \"keystone-bootstrap-6n6zg\" (UID: \"78924dc1-86de-47ce-bfb7-d856e1d9e25a\") " pod="openstack/keystone-bootstrap-6n6zg" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.336340 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78924dc1-86de-47ce-bfb7-d856e1d9e25a-combined-ca-bundle\") pod \"keystone-bootstrap-6n6zg\" (UID: \"78924dc1-86de-47ce-bfb7-d856e1d9e25a\") " pod="openstack/keystone-bootstrap-6n6zg" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.337563 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/78924dc1-86de-47ce-bfb7-d856e1d9e25a-fernet-keys\") pod \"keystone-bootstrap-6n6zg\" (UID: \"78924dc1-86de-47ce-bfb7-d856e1d9e25a\") " pod="openstack/keystone-bootstrap-6n6zg" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.340022 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-6ngsj"] Oct 03 13:12:47 crc kubenswrapper[4962]: E1003 13:12:47.340941 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-ls8lf], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-5c5cc7c5ff-6ngsj" podUID="82810633-84f3-4644-910c-57d359ec2ac3" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.356877 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78924dc1-86de-47ce-bfb7-d856e1d9e25a-scripts\") pod \"keystone-bootstrap-6n6zg\" (UID: \"78924dc1-86de-47ce-bfb7-d856e1d9e25a\") " pod="openstack/keystone-bootstrap-6n6zg" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.378990 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcm84\" (UniqueName: \"kubernetes.io/projected/78924dc1-86de-47ce-bfb7-d856e1d9e25a-kube-api-access-lcm84\") pod \"keystone-bootstrap-6n6zg\" (UID: \"78924dc1-86de-47ce-bfb7-d856e1d9e25a\") " pod="openstack/keystone-bootstrap-6n6zg" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.379440 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-2865v"] Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.380468 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-2865v" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.381567 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-6n6zg" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.383178 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.389934 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-cmqqg" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.389997 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.400470 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls8lf\" (UniqueName: \"kubernetes.io/projected/82810633-84f3-4644-910c-57d359ec2ac3-kube-api-access-ls8lf\") pod \"dnsmasq-dns-5c5cc7c5ff-6ngsj\" (UID: \"82810633-84f3-4644-910c-57d359ec2ac3\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-6ngsj" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.412775 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-2865v"] Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.418499 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c09e1e8d-9d15-4eef-a553-2af7c59998e3-scripts\") pod \"ceilometer-0\" (UID: \"c09e1e8d-9d15-4eef-a553-2af7c59998e3\") " pod="openstack/ceilometer-0" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.418559 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c09e1e8d-9d15-4eef-a553-2af7c59998e3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c09e1e8d-9d15-4eef-a553-2af7c59998e3\") " pod="openstack/ceilometer-0" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.418577 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c09e1e8d-9d15-4eef-a553-2af7c59998e3-log-httpd\") pod \"ceilometer-0\" (UID: \"c09e1e8d-9d15-4eef-a553-2af7c59998e3\") " pod="openstack/ceilometer-0" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.418605 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c09e1e8d-9d15-4eef-a553-2af7c59998e3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c09e1e8d-9d15-4eef-a553-2af7c59998e3\") " pod="openstack/ceilometer-0" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.418671 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c09e1e8d-9d15-4eef-a553-2af7c59998e3-run-httpd\") pod \"ceilometer-0\" (UID: \"c09e1e8d-9d15-4eef-a553-2af7c59998e3\") " pod="openstack/ceilometer-0" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.418695 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngmnt\" (UniqueName: \"kubernetes.io/projected/c09e1e8d-9d15-4eef-a553-2af7c59998e3-kube-api-access-ngmnt\") pod \"ceilometer-0\" (UID: \"c09e1e8d-9d15-4eef-a553-2af7c59998e3\") " pod="openstack/ceilometer-0" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.418747 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/c09e1e8d-9d15-4eef-a553-2af7c59998e3-config-data\") pod \"ceilometer-0\" (UID: \"c09e1e8d-9d15-4eef-a553-2af7c59998e3\") " pod="openstack/ceilometer-0" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.419770 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-rk29c"] Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.421158 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-rk29c" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.426270 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-rk29c"] Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.525603 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ed39cf02-e47a-4dc8-be15-377a11c21af5-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-rk29c\" (UID: \"ed39cf02-e47a-4dc8-be15-377a11c21af5\") " pod="openstack/dnsmasq-dns-8b5c85b87-rk29c" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.525871 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29c86a63-d2f6-4f22-9d04-d6128fa7c31a-scripts\") pod \"placement-db-sync-2865v\" (UID: \"29c86a63-d2f6-4f22-9d04-d6128fa7c31a\") " pod="openstack/placement-db-sync-2865v" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.525896 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29c86a63-d2f6-4f22-9d04-d6128fa7c31a-config-data\") pod \"placement-db-sync-2865v\" (UID: \"29c86a63-d2f6-4f22-9d04-d6128fa7c31a\") " pod="openstack/placement-db-sync-2865v" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.525927 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed39cf02-e47a-4dc8-be15-377a11c21af5-config\") pod \"dnsmasq-dns-8b5c85b87-rk29c\" (UID: \"ed39cf02-e47a-4dc8-be15-377a11c21af5\") " pod="openstack/dnsmasq-dns-8b5c85b87-rk29c" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.525946 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cg98\" (UniqueName: \"kubernetes.io/projected/ed39cf02-e47a-4dc8-be15-377a11c21af5-kube-api-access-7cg98\") pod \"dnsmasq-dns-8b5c85b87-rk29c\" (UID: \"ed39cf02-e47a-4dc8-be15-377a11c21af5\") " pod="openstack/dnsmasq-dns-8b5c85b87-rk29c" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.525972 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c09e1e8d-9d15-4eef-a553-2af7c59998e3-config-data\") pod \"ceilometer-0\" (UID: \"c09e1e8d-9d15-4eef-a553-2af7c59998e3\") " pod="openstack/ceilometer-0" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.526102 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c09e1e8d-9d15-4eef-a553-2af7c59998e3-scripts\") pod \"ceilometer-0\" (UID: \"c09e1e8d-9d15-4eef-a553-2af7c59998e3\") " pod="openstack/ceilometer-0" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.526149 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/c09e1e8d-9d15-4eef-a553-2af7c59998e3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c09e1e8d-9d15-4eef-a553-2af7c59998e3\") " pod="openstack/ceilometer-0" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.526164 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c09e1e8d-9d15-4eef-a553-2af7c59998e3-log-httpd\") pod \"ceilometer-0\" (UID: \"c09e1e8d-9d15-4eef-a553-2af7c59998e3\") " pod="openstack/ceilometer-0" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.526267 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c09e1e8d-9d15-4eef-a553-2af7c59998e3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c09e1e8d-9d15-4eef-a553-2af7c59998e3\") " pod="openstack/ceilometer-0" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.526292 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed39cf02-e47a-4dc8-be15-377a11c21af5-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-rk29c\" (UID: \"ed39cf02-e47a-4dc8-be15-377a11c21af5\") " pod="openstack/dnsmasq-dns-8b5c85b87-rk29c" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.526309 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed39cf02-e47a-4dc8-be15-377a11c21af5-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-rk29c\" (UID: \"ed39cf02-e47a-4dc8-be15-377a11c21af5\") " pod="openstack/dnsmasq-dns-8b5c85b87-rk29c" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.526347 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c09e1e8d-9d15-4eef-a553-2af7c59998e3-run-httpd\") pod \"ceilometer-0\" (UID: \"c09e1e8d-9d15-4eef-a553-2af7c59998e3\") " pod="openstack/ceilometer-0" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.526371 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29c86a63-d2f6-4f22-9d04-d6128fa7c31a-combined-ca-bundle\") pod \"placement-db-sync-2865v\" (UID: \"29c86a63-d2f6-4f22-9d04-d6128fa7c31a\") " pod="openstack/placement-db-sync-2865v" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.526393 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngmnt\" (UniqueName: \"kubernetes.io/projected/c09e1e8d-9d15-4eef-a553-2af7c59998e3-kube-api-access-ngmnt\") pod \"ceilometer-0\" (UID: \"c09e1e8d-9d15-4eef-a553-2af7c59998e3\") " pod="openstack/ceilometer-0" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.526408 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29c86a63-d2f6-4f22-9d04-d6128fa7c31a-logs\") pod \"placement-db-sync-2865v\" (UID: \"29c86a63-d2f6-4f22-9d04-d6128fa7c31a\") " pod="openstack/placement-db-sync-2865v" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.526437 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5ggj\" (UniqueName: \"kubernetes.io/projected/29c86a63-d2f6-4f22-9d04-d6128fa7c31a-kube-api-access-z5ggj\") pod \"placement-db-sync-2865v\" (UID: 
\"29c86a63-d2f6-4f22-9d04-d6128fa7c31a\") " pod="openstack/placement-db-sync-2865v" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.526452 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed39cf02-e47a-4dc8-be15-377a11c21af5-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-rk29c\" (UID: \"ed39cf02-e47a-4dc8-be15-377a11c21af5\") " pod="openstack/dnsmasq-dns-8b5c85b87-rk29c" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.530506 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c09e1e8d-9d15-4eef-a553-2af7c59998e3-log-httpd\") pod \"ceilometer-0\" (UID: \"c09e1e8d-9d15-4eef-a553-2af7c59998e3\") " pod="openstack/ceilometer-0" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.530697 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c09e1e8d-9d15-4eef-a553-2af7c59998e3-config-data\") pod \"ceilometer-0\" (UID: \"c09e1e8d-9d15-4eef-a553-2af7c59998e3\") " pod="openstack/ceilometer-0" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.530727 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c09e1e8d-9d15-4eef-a553-2af7c59998e3-run-httpd\") pod \"ceilometer-0\" (UID: \"c09e1e8d-9d15-4eef-a553-2af7c59998e3\") " pod="openstack/ceilometer-0" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.534277 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c09e1e8d-9d15-4eef-a553-2af7c59998e3-scripts\") pod \"ceilometer-0\" (UID: \"c09e1e8d-9d15-4eef-a553-2af7c59998e3\") " pod="openstack/ceilometer-0" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.535498 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c09e1e8d-9d15-4eef-a553-2af7c59998e3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c09e1e8d-9d15-4eef-a553-2af7c59998e3\") " pod="openstack/ceilometer-0" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.538538 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c09e1e8d-9d15-4eef-a553-2af7c59998e3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c09e1e8d-9d15-4eef-a553-2af7c59998e3\") " pod="openstack/ceilometer-0" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.551060 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngmnt\" (UniqueName: \"kubernetes.io/projected/c09e1e8d-9d15-4eef-a553-2af7c59998e3-kube-api-access-ngmnt\") pod \"ceilometer-0\" (UID: \"c09e1e8d-9d15-4eef-a553-2af7c59998e3\") " pod="openstack/ceilometer-0" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.560849 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.629542 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29c86a63-d2f6-4f22-9d04-d6128fa7c31a-config-data\") pod \"placement-db-sync-2865v\" (UID: \"29c86a63-d2f6-4f22-9d04-d6128fa7c31a\") " pod="openstack/placement-db-sync-2865v" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.629600 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed39cf02-e47a-4dc8-be15-377a11c21af5-config\") pod \"dnsmasq-dns-8b5c85b87-rk29c\" (UID: \"ed39cf02-e47a-4dc8-be15-377a11c21af5\") " pod="openstack/dnsmasq-dns-8b5c85b87-rk29c" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.629624 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cg98\" (UniqueName: \"kubernetes.io/projected/ed39cf02-e47a-4dc8-be15-377a11c21af5-kube-api-access-7cg98\") pod \"dnsmasq-dns-8b5c85b87-rk29c\" (UID: \"ed39cf02-e47a-4dc8-be15-377a11c21af5\") " pod="openstack/dnsmasq-dns-8b5c85b87-rk29c" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.629727 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed39cf02-e47a-4dc8-be15-377a11c21af5-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-rk29c\" (UID: \"ed39cf02-e47a-4dc8-be15-377a11c21af5\") " pod="openstack/dnsmasq-dns-8b5c85b87-rk29c" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.629745 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed39cf02-e47a-4dc8-be15-377a11c21af5-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-rk29c\" (UID: \"ed39cf02-e47a-4dc8-be15-377a11c21af5\") " pod="openstack/dnsmasq-dns-8b5c85b87-rk29c" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.629782 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29c86a63-d2f6-4f22-9d04-d6128fa7c31a-combined-ca-bundle\") pod \"placement-db-sync-2865v\" (UID: \"29c86a63-d2f6-4f22-9d04-d6128fa7c31a\") " pod="openstack/placement-db-sync-2865v" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.629800 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29c86a63-d2f6-4f22-9d04-d6128fa7c31a-logs\") pod \"placement-db-sync-2865v\" (UID: \"29c86a63-d2f6-4f22-9d04-d6128fa7c31a\") " pod="openstack/placement-db-sync-2865v" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.629822 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5ggj\" (UniqueName: \"kubernetes.io/projected/29c86a63-d2f6-4f22-9d04-d6128fa7c31a-kube-api-access-z5ggj\") pod \"placement-db-sync-2865v\" (UID: \"29c86a63-d2f6-4f22-9d04-d6128fa7c31a\") " pod="openstack/placement-db-sync-2865v" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.629837 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed39cf02-e47a-4dc8-be15-377a11c21af5-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-rk29c\" (UID: \"ed39cf02-e47a-4dc8-be15-377a11c21af5\") " pod="openstack/dnsmasq-dns-8b5c85b87-rk29c" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.629860 4962 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ed39cf02-e47a-4dc8-be15-377a11c21af5-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-rk29c\" (UID: \"ed39cf02-e47a-4dc8-be15-377a11c21af5\") " pod="openstack/dnsmasq-dns-8b5c85b87-rk29c" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.629882 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29c86a63-d2f6-4f22-9d04-d6128fa7c31a-scripts\") pod \"placement-db-sync-2865v\" (UID: \"29c86a63-d2f6-4f22-9d04-d6128fa7c31a\") " pod="openstack/placement-db-sync-2865v" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.631306 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed39cf02-e47a-4dc8-be15-377a11c21af5-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-rk29c\" (UID: \"ed39cf02-e47a-4dc8-be15-377a11c21af5\") " pod="openstack/dnsmasq-dns-8b5c85b87-rk29c" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.633525 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29c86a63-d2f6-4f22-9d04-d6128fa7c31a-scripts\") pod \"placement-db-sync-2865v\" (UID: \"29c86a63-d2f6-4f22-9d04-d6128fa7c31a\") " pod="openstack/placement-db-sync-2865v" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.634903 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29c86a63-d2f6-4f22-9d04-d6128fa7c31a-config-data\") pod \"placement-db-sync-2865v\" (UID: \"29c86a63-d2f6-4f22-9d04-d6128fa7c31a\") " pod="openstack/placement-db-sync-2865v" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.635351 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29c86a63-d2f6-4f22-9d04-d6128fa7c31a-logs\") pod \"placement-db-sync-2865v\" (UID: \"29c86a63-d2f6-4f22-9d04-d6128fa7c31a\") " pod="openstack/placement-db-sync-2865v" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.635740 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed39cf02-e47a-4dc8-be15-377a11c21af5-config\") pod \"dnsmasq-dns-8b5c85b87-rk29c\" (UID: \"ed39cf02-e47a-4dc8-be15-377a11c21af5\") " pod="openstack/dnsmasq-dns-8b5c85b87-rk29c" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.635921 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed39cf02-e47a-4dc8-be15-377a11c21af5-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-rk29c\" (UID: \"ed39cf02-e47a-4dc8-be15-377a11c21af5\") " pod="openstack/dnsmasq-dns-8b5c85b87-rk29c" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.636133 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29c86a63-d2f6-4f22-9d04-d6128fa7c31a-combined-ca-bundle\") pod \"placement-db-sync-2865v\" (UID: \"29c86a63-d2f6-4f22-9d04-d6128fa7c31a\") " pod="openstack/placement-db-sync-2865v" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.636261 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed39cf02-e47a-4dc8-be15-377a11c21af5-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-rk29c\" (UID: 
\"ed39cf02-e47a-4dc8-be15-377a11c21af5\") " pod="openstack/dnsmasq-dns-8b5c85b87-rk29c" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.636618 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ed39cf02-e47a-4dc8-be15-377a11c21af5-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-rk29c\" (UID: \"ed39cf02-e47a-4dc8-be15-377a11c21af5\") " pod="openstack/dnsmasq-dns-8b5c85b87-rk29c" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.651206 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5ggj\" (UniqueName: \"kubernetes.io/projected/29c86a63-d2f6-4f22-9d04-d6128fa7c31a-kube-api-access-z5ggj\") pod \"placement-db-sync-2865v\" (UID: \"29c86a63-d2f6-4f22-9d04-d6128fa7c31a\") " pod="openstack/placement-db-sync-2865v" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.651546 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cg98\" (UniqueName: \"kubernetes.io/projected/ed39cf02-e47a-4dc8-be15-377a11c21af5-kube-api-access-7cg98\") pod \"dnsmasq-dns-8b5c85b87-rk29c\" (UID: \"ed39cf02-e47a-4dc8-be15-377a11c21af5\") " pod="openstack/dnsmasq-dns-8b5c85b87-rk29c" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.804383 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-6ngsj" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.805664 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-f6hkh" event={"ID":"f6b4e989-13b2-4a64-8620-e80920347ba5","Type":"ContainerStarted","Data":"37a4a385f77bb70adb3eae3947095dcd5b59755629abe0487661c708ddbdb4dc"} Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.805714 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7ff5475cc9-f6hkh" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.804739 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7ff5475cc9-f6hkh" podUID="f6b4e989-13b2-4a64-8620-e80920347ba5" containerName="dnsmasq-dns" containerID="cri-o://37a4a385f77bb70adb3eae3947095dcd5b59755629abe0487661c708ddbdb4dc" gracePeriod=10 Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.814874 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-2865v" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.819683 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-6ngsj" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.825549 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-rk29c" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.832714 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ls8lf\" (UniqueName: \"kubernetes.io/projected/82810633-84f3-4644-910c-57d359ec2ac3-kube-api-access-ls8lf\") pod \"82810633-84f3-4644-910c-57d359ec2ac3\" (UID: \"82810633-84f3-4644-910c-57d359ec2ac3\") " Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.832822 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82810633-84f3-4644-910c-57d359ec2ac3-ovsdbserver-nb\") pod \"82810633-84f3-4644-910c-57d359ec2ac3\" (UID: \"82810633-84f3-4644-910c-57d359ec2ac3\") " Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.832857 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82810633-84f3-4644-910c-57d359ec2ac3-config\") pod \"82810633-84f3-4644-910c-57d359ec2ac3\" (UID: \"82810633-84f3-4644-910c-57d359ec2ac3\") " Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.832892 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82810633-84f3-4644-910c-57d359ec2ac3-dns-svc\") pod \"82810633-84f3-4644-910c-57d359ec2ac3\" (UID: \"82810633-84f3-4644-910c-57d359ec2ac3\") " Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.832986 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82810633-84f3-4644-910c-57d359ec2ac3-ovsdbserver-sb\") pod \"82810633-84f3-4644-910c-57d359ec2ac3\" (UID: \"82810633-84f3-4644-910c-57d359ec2ac3\") " Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.833031 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/82810633-84f3-4644-910c-57d359ec2ac3-dns-swift-storage-0\") pod \"82810633-84f3-4644-910c-57d359ec2ac3\" (UID: \"82810633-84f3-4644-910c-57d359ec2ac3\") " Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.833472 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82810633-84f3-4644-910c-57d359ec2ac3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "82810633-84f3-4644-910c-57d359ec2ac3" (UID: "82810633-84f3-4644-910c-57d359ec2ac3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.833584 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82810633-84f3-4644-910c-57d359ec2ac3-config" (OuterVolumeSpecName: "config") pod "82810633-84f3-4644-910c-57d359ec2ac3" (UID: "82810633-84f3-4644-910c-57d359ec2ac3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.834060 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82810633-84f3-4644-910c-57d359ec2ac3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "82810633-84f3-4644-910c-57d359ec2ac3" (UID: "82810633-84f3-4644-910c-57d359ec2ac3"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.834103 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82810633-84f3-4644-910c-57d359ec2ac3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "82810633-84f3-4644-910c-57d359ec2ac3" (UID: "82810633-84f3-4644-910c-57d359ec2ac3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.834259 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82810633-84f3-4644-910c-57d359ec2ac3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "82810633-84f3-4644-910c-57d359ec2ac3" (UID: "82810633-84f3-4644-910c-57d359ec2ac3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.840825 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82810633-84f3-4644-910c-57d359ec2ac3-kube-api-access-ls8lf" (OuterVolumeSpecName: "kube-api-access-ls8lf") pod "82810633-84f3-4644-910c-57d359ec2ac3" (UID: "82810633-84f3-4644-910c-57d359ec2ac3"). InnerVolumeSpecName "kube-api-access-ls8lf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.934959 4962 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/82810633-84f3-4644-910c-57d359ec2ac3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.934979 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ls8lf\" (UniqueName: \"kubernetes.io/projected/82810633-84f3-4644-910c-57d359ec2ac3-kube-api-access-ls8lf\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.934989 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82810633-84f3-4644-910c-57d359ec2ac3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.934999 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82810633-84f3-4644-910c-57d359ec2ac3-config\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.935007 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82810633-84f3-4644-910c-57d359ec2ac3-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.935016 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82810633-84f3-4644-910c-57d359ec2ac3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.971156 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7ff5475cc9-f6hkh" podStartSLOduration=2.971135515 podStartE2EDuration="2.971135515s" podCreationTimestamp="2025-10-03 13:12:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:12:47.830014914 +0000 UTC m=+1376.233912749" watchObservedRunningTime="2025-10-03 13:12:47.971135515 +0000 
UTC m=+1376.375033350" Oct 03 13:12:47 crc kubenswrapper[4962]: I1003 13:12:47.973293 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-6n6zg"] Oct 03 13:12:48 crc kubenswrapper[4962]: W1003 13:12:48.004402 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78924dc1_86de_47ce_bfb7_d856e1d9e25a.slice/crio-da4d3ab506465a0f291bfb89c103c6964fddfccbafb4c805f563ba7ab6bc672c WatchSource:0}: Error finding container da4d3ab506465a0f291bfb89c103c6964fddfccbafb4c805f563ba7ab6bc672c: Status 404 returned error can't find the container with id da4d3ab506465a0f291bfb89c103c6964fddfccbafb4c805f563ba7ab6bc672c Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.049758 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.159848 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.161136 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.166000 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-s2g9d" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.166224 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.173423 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.173504 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.212973 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.214763 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.218762 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.218932 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.325013 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-rk29c"] Oct 03 13:12:48 crc kubenswrapper[4962]: W1003 13:12:48.338885 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded39cf02_e47a_4dc8_be15_377a11c21af5.slice/crio-dcd85521f835fc743f3b0e52f3e2d4c47be69b7061996bd963ddf98274719513 WatchSource:0}: Error finding container dcd85521f835fc743f3b0e52f3e2d4c47be69b7061996bd963ddf98274719513: Status 404 returned error can't find the container with id dcd85521f835fc743f3b0e52f3e2d4c47be69b7061996bd963ddf98274719513 Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.343318 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.343373 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea\") " pod="openstack/glance-default-external-api-0" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.343398 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea-config-data\") pod \"glance-default-external-api-0\" (UID: \"0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea\") " pod="openstack/glance-default-external-api-0" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.343418 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.343444 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.343467 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blw2v\" (UniqueName: \"kubernetes.io/projected/0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea-kube-api-access-blw2v\") pod \"glance-default-external-api-0\" (UID: \"0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea\") " pod="openstack/glance-default-external-api-0" Oct 03 13:12:48 crc 
kubenswrapper[4962]: I1003 13:12:48.343489 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee-logs\") pod \"glance-default-internal-api-0\" (UID: \"5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.343514 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8knl6\" (UniqueName: \"kubernetes.io/projected/5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee-kube-api-access-8knl6\") pod \"glance-default-internal-api-0\" (UID: \"5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.343536 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.343572 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea\") " pod="openstack/glance-default-external-api-0" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.343630 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea-logs\") pod \"glance-default-external-api-0\" (UID: \"0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea\") " pod="openstack/glance-default-external-api-0" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.343685 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea-scripts\") pod \"glance-default-external-api-0\" (UID: \"0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea\") " pod="openstack/glance-default-external-api-0" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.343718 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea\") " pod="openstack/glance-default-external-api-0" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.343740 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.395128 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-4662-account-create-85cwj"] Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.407320 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-4662-account-create-85cwj" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.410158 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.421138 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-2865v"] Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.430448 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-4662-account-create-85cwj"] Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.447994 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea\") " pod="openstack/glance-default-external-api-0" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.449853 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea-config-data\") pod \"glance-default-external-api-0\" (UID: \"0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea\") " pod="openstack/glance-default-external-api-0" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.449954 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.450053 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.450140 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blw2v\" (UniqueName: \"kubernetes.io/projected/0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea-kube-api-access-blw2v\") pod \"glance-default-external-api-0\" (UID: \"0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea\") " pod="openstack/glance-default-external-api-0" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.450389 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee-logs\") pod \"glance-default-internal-api-0\" (UID: \"5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.450528 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8knl6\" (UniqueName: \"kubernetes.io/projected/5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee-kube-api-access-8knl6\") pod \"glance-default-internal-api-0\" (UID: \"5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.450622 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.450827 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea\") " pod="openstack/glance-default-external-api-0" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.454427 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.452375 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.448933 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-external-api-0" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.455427 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee-logs\") pod \"glance-default-internal-api-0\" (UID: \"5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.455473 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.455752 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea-logs\") pod \"glance-default-external-api-0\" (UID: \"0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea\") " pod="openstack/glance-default-external-api-0" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.456477 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea-scripts\") pod \"glance-default-external-api-0\" (UID: \"0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea\") " pod="openstack/glance-default-external-api-0" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.456612 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea\") " pod="openstack/glance-default-external-api-0" Oct 03 13:12:48 crc 
kubenswrapper[4962]: I1003 13:12:48.461788 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.461973 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.455995 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea-logs\") pod \"glance-default-external-api-0\" (UID: \"0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea\") " pod="openstack/glance-default-external-api-0" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.455794 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea\") " pod="openstack/glance-default-external-api-0" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.468669 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea-scripts\") pod \"glance-default-external-api-0\" (UID: \"0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea\") " pod="openstack/glance-default-external-api-0" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.471098 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea\") " pod="openstack/glance-default-external-api-0" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.471399 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.473893 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8knl6\" (UniqueName: \"kubernetes.io/projected/5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee-kube-api-access-8knl6\") pod \"glance-default-internal-api-0\" (UID: \"5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.477976 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blw2v\" (UniqueName: \"kubernetes.io/projected/0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea-kube-api-access-blw2v\") pod \"glance-default-external-api-0\" (UID: \"0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea\") " pod="openstack/glance-default-external-api-0" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.480132 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea-config-data\") pod \"glance-default-external-api-0\" (UID: \"0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea\") " pod="openstack/glance-default-external-api-0" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.495433 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.498315 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-ace4-account-create-qg6md"] Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.500322 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-ace4-account-create-qg6md" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.502962 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.517259 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-ace4-account-create-qg6md"] Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.527859 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.534383 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.542940 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea\") " pod="openstack/glance-default-external-api-0" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.572354 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czklt\" (UniqueName: \"kubernetes.io/projected/72bb7aab-b5f7-46ba-bf39-471de4e5090f-kube-api-access-czklt\") pod \"cinder-4662-account-create-85cwj\" (UID: \"72bb7aab-b5f7-46ba-bf39-471de4e5090f\") " pod="openstack/cinder-4662-account-create-85cwj" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.583298 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-f6hkh" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.674946 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czklt\" (UniqueName: \"kubernetes.io/projected/72bb7aab-b5f7-46ba-bf39-471de4e5090f-kube-api-access-czklt\") pod \"cinder-4662-account-create-85cwj\" (UID: \"72bb7aab-b5f7-46ba-bf39-471de4e5090f\") " pod="openstack/cinder-4662-account-create-85cwj" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.675298 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldcr7\" (UniqueName: \"kubernetes.io/projected/6d5982a3-b38c-43bf-9edf-4e0216fb3374-kube-api-access-ldcr7\") pod \"barbican-ace4-account-create-qg6md\" (UID: \"6d5982a3-b38c-43bf-9edf-4e0216fb3374\") " pod="openstack/barbican-ace4-account-create-qg6md" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.698803 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czklt\" (UniqueName: \"kubernetes.io/projected/72bb7aab-b5f7-46ba-bf39-471de4e5090f-kube-api-access-czklt\") pod \"cinder-4662-account-create-85cwj\" (UID: \"72bb7aab-b5f7-46ba-bf39-471de4e5090f\") " pod="openstack/cinder-4662-account-create-85cwj" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.777231 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzs6d\" (UniqueName: \"kubernetes.io/projected/f6b4e989-13b2-4a64-8620-e80920347ba5-kube-api-access-zzs6d\") pod \"f6b4e989-13b2-4a64-8620-e80920347ba5\" (UID: \"f6b4e989-13b2-4a64-8620-e80920347ba5\") " Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.777290 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6b4e989-13b2-4a64-8620-e80920347ba5-config\") pod \"f6b4e989-13b2-4a64-8620-e80920347ba5\" (UID: \"f6b4e989-13b2-4a64-8620-e80920347ba5\") " Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.777388 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f6b4e989-13b2-4a64-8620-e80920347ba5-dns-svc\") pod \"f6b4e989-13b2-4a64-8620-e80920347ba5\" (UID: \"f6b4e989-13b2-4a64-8620-e80920347ba5\") " Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.777455 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f6b4e989-13b2-4a64-8620-e80920347ba5-dns-swift-storage-0\") pod \"f6b4e989-13b2-4a64-8620-e80920347ba5\" (UID: \"f6b4e989-13b2-4a64-8620-e80920347ba5\") " Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.777518 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f6b4e989-13b2-4a64-8620-e80920347ba5-ovsdbserver-sb\") pod \"f6b4e989-13b2-4a64-8620-e80920347ba5\" (UID: \"f6b4e989-13b2-4a64-8620-e80920347ba5\") " Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.777557 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f6b4e989-13b2-4a64-8620-e80920347ba5-ovsdbserver-nb\") pod \"f6b4e989-13b2-4a64-8620-e80920347ba5\" (UID: \"f6b4e989-13b2-4a64-8620-e80920347ba5\") " Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.777885 4962 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ldcr7\" (UniqueName: \"kubernetes.io/projected/6d5982a3-b38c-43bf-9edf-4e0216fb3374-kube-api-access-ldcr7\") pod \"barbican-ace4-account-create-qg6md\" (UID: \"6d5982a3-b38c-43bf-9edf-4e0216fb3374\") " pod="openstack/barbican-ace4-account-create-qg6md" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.800141 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6b4e989-13b2-4a64-8620-e80920347ba5-kube-api-access-zzs6d" (OuterVolumeSpecName: "kube-api-access-zzs6d") pod "f6b4e989-13b2-4a64-8620-e80920347ba5" (UID: "f6b4e989-13b2-4a64-8620-e80920347ba5"). InnerVolumeSpecName "kube-api-access-zzs6d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.804031 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.811919 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldcr7\" (UniqueName: \"kubernetes.io/projected/6d5982a3-b38c-43bf-9edf-4e0216fb3374-kube-api-access-ldcr7\") pod \"barbican-ace4-account-create-qg6md\" (UID: \"6d5982a3-b38c-43bf-9edf-4e0216fb3374\") " pod="openstack/barbican-ace4-account-create-qg6md" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.814014 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-8f54-account-create-c5m24"] Oct 03 13:12:48 crc kubenswrapper[4962]: E1003 13:12:48.814817 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6b4e989-13b2-4a64-8620-e80920347ba5" containerName="init" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.814912 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6b4e989-13b2-4a64-8620-e80920347ba5" containerName="init" Oct 03 13:12:48 crc kubenswrapper[4962]: E1003 13:12:48.815044 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6b4e989-13b2-4a64-8620-e80920347ba5" containerName="dnsmasq-dns" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.815142 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6b4e989-13b2-4a64-8620-e80920347ba5" containerName="dnsmasq-dns" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.816649 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6b4e989-13b2-4a64-8620-e80920347ba5" containerName="dnsmasq-dns" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.818452 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-8f54-account-create-c5m24" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.822018 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.833321 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8f54-account-create-c5m24"] Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.838732 4962 generic.go:334] "Generic (PLEG): container finished" podID="ed39cf02-e47a-4dc8-be15-377a11c21af5" containerID="f0bc21065c1bfd2b354ddca2563e794bb809399b532515e039e718a2597cb79a" exitCode=0 Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.838825 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-rk29c" event={"ID":"ed39cf02-e47a-4dc8-be15-377a11c21af5","Type":"ContainerDied","Data":"f0bc21065c1bfd2b354ddca2563e794bb809399b532515e039e718a2597cb79a"} Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.838885 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-rk29c" event={"ID":"ed39cf02-e47a-4dc8-be15-377a11c21af5","Type":"ContainerStarted","Data":"dcd85521f835fc743f3b0e52f3e2d4c47be69b7061996bd963ddf98274719513"} Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.847728 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2865v" event={"ID":"29c86a63-d2f6-4f22-9d04-d6128fa7c31a","Type":"ContainerStarted","Data":"bf170b4d9b1b95848bfce23a5ebadb26e1f2453031bea79ef5a4e04ab63199da"} Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.858771 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c09e1e8d-9d15-4eef-a553-2af7c59998e3","Type":"ContainerStarted","Data":"90d5adf3763d27a649c2af61fad8cebac503244abf7c1c477777b6d017c259f7"} Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.870876 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6b4e989-13b2-4a64-8620-e80920347ba5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f6b4e989-13b2-4a64-8620-e80920347ba5" (UID: "f6b4e989-13b2-4a64-8620-e80920347ba5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.871590 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6b4e989-13b2-4a64-8620-e80920347ba5-config" (OuterVolumeSpecName: "config") pod "f6b4e989-13b2-4a64-8620-e80920347ba5" (UID: "f6b4e989-13b2-4a64-8620-e80920347ba5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.875233 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6n6zg" event={"ID":"78924dc1-86de-47ce-bfb7-d856e1d9e25a","Type":"ContainerStarted","Data":"19508d7ab80c2b7b7b1fa204b25ec94f087e00b2aef79337e48cc4e04794ce38"} Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.876471 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6n6zg" event={"ID":"78924dc1-86de-47ce-bfb7-d856e1d9e25a","Type":"ContainerStarted","Data":"da4d3ab506465a0f291bfb89c103c6964fddfccbafb4c805f563ba7ab6bc672c"} Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.877823 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6b4e989-13b2-4a64-8620-e80920347ba5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f6b4e989-13b2-4a64-8620-e80920347ba5" (UID: "f6b4e989-13b2-4a64-8620-e80920347ba5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.879511 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f6b4e989-13b2-4a64-8620-e80920347ba5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.879541 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f6b4e989-13b2-4a64-8620-e80920347ba5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.879554 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzs6d\" (UniqueName: \"kubernetes.io/projected/f6b4e989-13b2-4a64-8620-e80920347ba5-kube-api-access-zzs6d\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.879568 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6b4e989-13b2-4a64-8620-e80920347ba5-config\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.880546 4962 generic.go:334] "Generic (PLEG): container finished" podID="f6b4e989-13b2-4a64-8620-e80920347ba5" containerID="37a4a385f77bb70adb3eae3947095dcd5b59755629abe0487661c708ddbdb4dc" exitCode=0 Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.881095 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-6ngsj" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.881926 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-f6hkh" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.882140 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-f6hkh" event={"ID":"f6b4e989-13b2-4a64-8620-e80920347ba5","Type":"ContainerDied","Data":"37a4a385f77bb70adb3eae3947095dcd5b59755629abe0487661c708ddbdb4dc"} Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.882182 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-f6hkh" event={"ID":"f6b4e989-13b2-4a64-8620-e80920347ba5","Type":"ContainerDied","Data":"083af7f397a9e758174acd53aa71a3b73c3d4845ca03c494fb35b3faab3283e7"} Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.882205 4962 scope.go:117] "RemoveContainer" containerID="37a4a385f77bb70adb3eae3947095dcd5b59755629abe0487661c708ddbdb4dc" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.883403 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6b4e989-13b2-4a64-8620-e80920347ba5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f6b4e989-13b2-4a64-8620-e80920347ba5" (UID: "f6b4e989-13b2-4a64-8620-e80920347ba5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.891736 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6b4e989-13b2-4a64-8620-e80920347ba5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f6b4e989-13b2-4a64-8620-e80920347ba5" (UID: "f6b4e989-13b2-4a64-8620-e80920347ba5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.899726 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-6n6zg" podStartSLOduration=1.899705346 podStartE2EDuration="1.899705346s" podCreationTimestamp="2025-10-03 13:12:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:12:48.891843555 +0000 UTC m=+1377.295741400" watchObservedRunningTime="2025-10-03 13:12:48.899705346 +0000 UTC m=+1377.303603181" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.907104 4962 scope.go:117] "RemoveContainer" containerID="4d8292ae4dc470a2be3c4cea899d487a8d7562241e7954392291fd9bd6f758d5" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.930312 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-ace4-account-create-qg6md" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.930809 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-4662-account-create-85cwj" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.950352 4962 scope.go:117] "RemoveContainer" containerID="37a4a385f77bb70adb3eae3947095dcd5b59755629abe0487661c708ddbdb4dc" Oct 03 13:12:48 crc kubenswrapper[4962]: E1003 13:12:48.951034 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37a4a385f77bb70adb3eae3947095dcd5b59755629abe0487661c708ddbdb4dc\": container with ID starting with 37a4a385f77bb70adb3eae3947095dcd5b59755629abe0487661c708ddbdb4dc not found: ID does not exist" containerID="37a4a385f77bb70adb3eae3947095dcd5b59755629abe0487661c708ddbdb4dc" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.951074 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37a4a385f77bb70adb3eae3947095dcd5b59755629abe0487661c708ddbdb4dc"} err="failed to get container status \"37a4a385f77bb70adb3eae3947095dcd5b59755629abe0487661c708ddbdb4dc\": rpc error: code = NotFound desc = could not find container \"37a4a385f77bb70adb3eae3947095dcd5b59755629abe0487661c708ddbdb4dc\": container with ID starting with 37a4a385f77bb70adb3eae3947095dcd5b59755629abe0487661c708ddbdb4dc not found: ID does not exist" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.951098 4962 scope.go:117] "RemoveContainer" containerID="4d8292ae4dc470a2be3c4cea899d487a8d7562241e7954392291fd9bd6f758d5" Oct 03 13:12:48 crc kubenswrapper[4962]: E1003 13:12:48.951403 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d8292ae4dc470a2be3c4cea899d487a8d7562241e7954392291fd9bd6f758d5\": container with ID starting with 4d8292ae4dc470a2be3c4cea899d487a8d7562241e7954392291fd9bd6f758d5 not found: ID does not exist" containerID="4d8292ae4dc470a2be3c4cea899d487a8d7562241e7954392291fd9bd6f758d5" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.951427 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d8292ae4dc470a2be3c4cea899d487a8d7562241e7954392291fd9bd6f758d5"} err="failed to get container status \"4d8292ae4dc470a2be3c4cea899d487a8d7562241e7954392291fd9bd6f758d5\": rpc error: code = NotFound desc = could not find container \"4d8292ae4dc470a2be3c4cea899d487a8d7562241e7954392291fd9bd6f758d5\": container with ID starting with 4d8292ae4dc470a2be3c4cea899d487a8d7562241e7954392291fd9bd6f758d5 not found: ID does not exist" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.957688 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-6ngsj"] Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.965356 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-6ngsj"] Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.980716 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnl58\" (UniqueName: \"kubernetes.io/projected/e0987fb4-1ae7-4caa-b2d8-4c3fba23e6d6-kube-api-access-bnl58\") pod \"neutron-8f54-account-create-c5m24\" (UID: \"e0987fb4-1ae7-4caa-b2d8-4c3fba23e6d6\") " pod="openstack/neutron-8f54-account-create-c5m24" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.981036 4962 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f6b4e989-13b2-4a64-8620-e80920347ba5-dns-swift-storage-0\") 
on node \"crc\" DevicePath \"\"" Oct 03 13:12:48 crc kubenswrapper[4962]: I1003 13:12:48.981052 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f6b4e989-13b2-4a64-8620-e80920347ba5-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:49 crc kubenswrapper[4962]: I1003 13:12:49.085387 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnl58\" (UniqueName: \"kubernetes.io/projected/e0987fb4-1ae7-4caa-b2d8-4c3fba23e6d6-kube-api-access-bnl58\") pod \"neutron-8f54-account-create-c5m24\" (UID: \"e0987fb4-1ae7-4caa-b2d8-4c3fba23e6d6\") " pod="openstack/neutron-8f54-account-create-c5m24" Oct 03 13:12:49 crc kubenswrapper[4962]: I1003 13:12:49.127262 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnl58\" (UniqueName: \"kubernetes.io/projected/e0987fb4-1ae7-4caa-b2d8-4c3fba23e6d6-kube-api-access-bnl58\") pod \"neutron-8f54-account-create-c5m24\" (UID: \"e0987fb4-1ae7-4caa-b2d8-4c3fba23e6d6\") " pod="openstack/neutron-8f54-account-create-c5m24" Oct 03 13:12:49 crc kubenswrapper[4962]: I1003 13:12:49.128366 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 13:12:49 crc kubenswrapper[4962]: W1003 13:12:49.143048 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ad6e2a5_3f48_4cbc_ab34_7054f8c253ee.slice/crio-858b3c09d9992b5ecb466e8d9f01ac69f0037e5f797fe2b85f390d3a32dc3495 WatchSource:0}: Error finding container 858b3c09d9992b5ecb466e8d9f01ac69f0037e5f797fe2b85f390d3a32dc3495: Status 404 returned error can't find the container with id 858b3c09d9992b5ecb466e8d9f01ac69f0037e5f797fe2b85f390d3a32dc3495 Oct 03 13:12:49 crc kubenswrapper[4962]: I1003 13:12:49.148460 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-8f54-account-create-c5m24" Oct 03 13:12:49 crc kubenswrapper[4962]: I1003 13:12:49.230456 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-f6hkh"] Oct 03 13:12:49 crc kubenswrapper[4962]: I1003 13:12:49.238089 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-f6hkh"] Oct 03 13:12:49 crc kubenswrapper[4962]: I1003 13:12:49.481507 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-ace4-account-create-qg6md"] Oct 03 13:12:49 crc kubenswrapper[4962]: I1003 13:12:49.506982 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 13:12:49 crc kubenswrapper[4962]: W1003 13:12:49.517963 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d5982a3_b38c_43bf_9edf_4e0216fb3374.slice/crio-0de35b55430f1da8ed92049ed12077d2467f1dc3406e8a6b16bed849c33959bb WatchSource:0}: Error finding container 0de35b55430f1da8ed92049ed12077d2467f1dc3406e8a6b16bed849c33959bb: Status 404 returned error can't find the container with id 0de35b55430f1da8ed92049ed12077d2467f1dc3406e8a6b16bed849c33959bb Oct 03 13:12:49 crc kubenswrapper[4962]: I1003 13:12:49.575069 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-4662-account-create-85cwj"] Oct 03 13:12:49 crc kubenswrapper[4962]: I1003 13:12:49.679858 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8f54-account-create-c5m24"] Oct 03 13:12:49 crc kubenswrapper[4962]: I1003 13:12:49.899149 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4662-account-create-85cwj" event={"ID":"72bb7aab-b5f7-46ba-bf39-471de4e5090f","Type":"ContainerStarted","Data":"924e75b0c572933e17c836913b70f13e95108299636002de9aa1b6ed0025fff6"} Oct 03 13:12:49 crc kubenswrapper[4962]: I1003 13:12:49.903735 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee","Type":"ContainerStarted","Data":"858b3c09d9992b5ecb466e8d9f01ac69f0037e5f797fe2b85f390d3a32dc3495"} Oct 03 13:12:49 crc kubenswrapper[4962]: I1003 13:12:49.932145 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-rk29c" event={"ID":"ed39cf02-e47a-4dc8-be15-377a11c21af5","Type":"ContainerStarted","Data":"76eff9f4b08df1c6d695ba631a1e0fd943839d60bff13aa04fad80eb794b97c3"} Oct 03 13:12:49 crc kubenswrapper[4962]: I1003 13:12:49.932209 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8b5c85b87-rk29c" Oct 03 13:12:49 crc kubenswrapper[4962]: I1003 13:12:49.951444 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8f54-account-create-c5m24" event={"ID":"e0987fb4-1ae7-4caa-b2d8-4c3fba23e6d6","Type":"ContainerStarted","Data":"ea1ce86bcf4c67c230cbb6e12c9d8f0db632261f4fc7b4457f150cc0bcb370d1"} Oct 03 13:12:49 crc kubenswrapper[4962]: I1003 13:12:49.966239 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-ace4-account-create-qg6md" event={"ID":"6d5982a3-b38c-43bf-9edf-4e0216fb3374","Type":"ContainerStarted","Data":"0de35b55430f1da8ed92049ed12077d2467f1dc3406e8a6b16bed849c33959bb"} Oct 03 13:12:49 crc kubenswrapper[4962]: I1003 13:12:49.968389 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea","Type":"ContainerStarted","Data":"9a8bb459a4c8dc70c7215047bd10043e04f830f51e21110f1aab080f0aed9e11"} Oct 03 13:12:49 crc kubenswrapper[4962]: I1003 13:12:49.971925 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8b5c85b87-rk29c" podStartSLOduration=2.971904266 podStartE2EDuration="2.971904266s" podCreationTimestamp="2025-10-03 13:12:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:12:49.958504966 +0000 UTC m=+1378.362402801" watchObservedRunningTime="2025-10-03 13:12:49.971904266 +0000 UTC m=+1378.375802101" Oct 03 13:12:49 crc kubenswrapper[4962]: I1003 13:12:49.983799 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-ace4-account-create-qg6md" podStartSLOduration=1.983778445 podStartE2EDuration="1.983778445s" podCreationTimestamp="2025-10-03 13:12:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:12:49.983098727 +0000 UTC m=+1378.386996562" watchObservedRunningTime="2025-10-03 13:12:49.983778445 +0000 UTC m=+1378.387676290" Oct 03 13:12:50 crc kubenswrapper[4962]: I1003 13:12:50.240283 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82810633-84f3-4644-910c-57d359ec2ac3" path="/var/lib/kubelet/pods/82810633-84f3-4644-910c-57d359ec2ac3/volumes" Oct 03 13:12:50 crc kubenswrapper[4962]: I1003 13:12:50.240790 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6b4e989-13b2-4a64-8620-e80920347ba5" path="/var/lib/kubelet/pods/f6b4e989-13b2-4a64-8620-e80920347ba5/volumes" Oct 03 13:12:50 crc kubenswrapper[4962]: I1003 13:12:50.978320 4962 generic.go:334] "Generic (PLEG): container finished" podID="e0987fb4-1ae7-4caa-b2d8-4c3fba23e6d6" containerID="747670b96bfa0a7f1ae526363ef27c6298cb058f6c1b63ac1f4c8ed4ccec39e3" exitCode=0 Oct 03 13:12:50 crc kubenswrapper[4962]: I1003 13:12:50.978573 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8f54-account-create-c5m24" event={"ID":"e0987fb4-1ae7-4caa-b2d8-4c3fba23e6d6","Type":"ContainerDied","Data":"747670b96bfa0a7f1ae526363ef27c6298cb058f6c1b63ac1f4c8ed4ccec39e3"} Oct 03 13:12:50 crc kubenswrapper[4962]: I1003 13:12:50.981149 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea","Type":"ContainerStarted","Data":"4420135f0a4f8fcafd8ca642a3ef3fed14dd906441be8c3b82009ed4daddf179"} Oct 03 13:12:50 crc kubenswrapper[4962]: I1003 13:12:50.981181 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea","Type":"ContainerStarted","Data":"4e3e83e00cb6e2cf785955ae1f1db43c9b118ef26fe4f44ee2f206cd92f1ab3d"} Oct 03 13:12:50 crc kubenswrapper[4962]: I1003 13:12:50.984151 4962 generic.go:334] "Generic (PLEG): container finished" podID="6d5982a3-b38c-43bf-9edf-4e0216fb3374" containerID="4c564ec0aa0ac250e494fc64a11e1161b3ca57f40622bafc69fe02e54a49d547" exitCode=0 Oct 03 13:12:50 crc kubenswrapper[4962]: I1003 13:12:50.984187 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-ace4-account-create-qg6md" 
event={"ID":"6d5982a3-b38c-43bf-9edf-4e0216fb3374","Type":"ContainerDied","Data":"4c564ec0aa0ac250e494fc64a11e1161b3ca57f40622bafc69fe02e54a49d547"} Oct 03 13:12:50 crc kubenswrapper[4962]: I1003 13:12:50.986440 4962 generic.go:334] "Generic (PLEG): container finished" podID="72bb7aab-b5f7-46ba-bf39-471de4e5090f" containerID="80923de5c19f7d65b29bc15584394759409c24eacccb07a53c15f115d22dfe78" exitCode=0 Oct 03 13:12:50 crc kubenswrapper[4962]: I1003 13:12:50.986474 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4662-account-create-85cwj" event={"ID":"72bb7aab-b5f7-46ba-bf39-471de4e5090f","Type":"ContainerDied","Data":"80923de5c19f7d65b29bc15584394759409c24eacccb07a53c15f115d22dfe78"} Oct 03 13:12:50 crc kubenswrapper[4962]: I1003 13:12:50.994778 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee","Type":"ContainerStarted","Data":"0442c8ea4fb01cd0a7571ebd0c4741902c3adbd64625faaa9942f822b037d8a0"} Oct 03 13:12:50 crc kubenswrapper[4962]: I1003 13:12:50.994862 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee","Type":"ContainerStarted","Data":"f0f8ab17008a39c843b4ec7fec174eb3ff5bc9c8390d934211bf67ad0ecf929c"} Oct 03 13:12:51 crc kubenswrapper[4962]: I1003 13:12:51.037415 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.037396455 podStartE2EDuration="4.037396455s" podCreationTimestamp="2025-10-03 13:12:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:12:51.028215369 +0000 UTC m=+1379.432113204" watchObservedRunningTime="2025-10-03 13:12:51.037396455 +0000 UTC m=+1379.441294290" Oct 03 13:12:51 crc kubenswrapper[4962]: I1003 13:12:51.096721 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.096698998 podStartE2EDuration="4.096698998s" podCreationTimestamp="2025-10-03 13:12:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:12:51.089042823 +0000 UTC m=+1379.492940658" watchObservedRunningTime="2025-10-03 13:12:51.096698998 +0000 UTC m=+1379.500596833" Oct 03 13:12:51 crc kubenswrapper[4962]: I1003 13:12:51.321829 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 13:12:51 crc kubenswrapper[4962]: I1003 13:12:51.335137 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 13:12:51 crc kubenswrapper[4962]: I1003 13:12:51.389134 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 13:12:52 crc kubenswrapper[4962]: I1003 13:12:52.953414 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8f54-account-create-c5m24" Oct 03 13:12:52 crc kubenswrapper[4962]: I1003 13:12:52.960716 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-ace4-account-create-qg6md" Oct 03 13:12:53 crc kubenswrapper[4962]: I1003 13:12:53.017413 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8f54-account-create-c5m24" event={"ID":"e0987fb4-1ae7-4caa-b2d8-4c3fba23e6d6","Type":"ContainerDied","Data":"ea1ce86bcf4c67c230cbb6e12c9d8f0db632261f4fc7b4457f150cc0bcb370d1"} Oct 03 13:12:53 crc kubenswrapper[4962]: I1003 13:12:53.017479 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea1ce86bcf4c67c230cbb6e12c9d8f0db632261f4fc7b4457f150cc0bcb370d1" Oct 03 13:12:53 crc kubenswrapper[4962]: I1003 13:12:53.017503 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8f54-account-create-c5m24" Oct 03 13:12:53 crc kubenswrapper[4962]: I1003 13:12:53.022628 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-ace4-account-create-qg6md" event={"ID":"6d5982a3-b38c-43bf-9edf-4e0216fb3374","Type":"ContainerDied","Data":"0de35b55430f1da8ed92049ed12077d2467f1dc3406e8a6b16bed849c33959bb"} Oct 03 13:12:53 crc kubenswrapper[4962]: I1003 13:12:53.023076 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0de35b55430f1da8ed92049ed12077d2467f1dc3406e8a6b16bed849c33959bb" Oct 03 13:12:53 crc kubenswrapper[4962]: I1003 13:12:53.023102 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-ace4-account-create-qg6md" Oct 03 13:12:53 crc kubenswrapper[4962]: I1003 13:12:53.024884 4962 generic.go:334] "Generic (PLEG): container finished" podID="78924dc1-86de-47ce-bfb7-d856e1d9e25a" containerID="19508d7ab80c2b7b7b1fa204b25ec94f087e00b2aef79337e48cc4e04794ce38" exitCode=0 Oct 03 13:12:53 crc kubenswrapper[4962]: I1003 13:12:53.024956 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6n6zg" event={"ID":"78924dc1-86de-47ce-bfb7-d856e1d9e25a","Type":"ContainerDied","Data":"19508d7ab80c2b7b7b1fa204b25ec94f087e00b2aef79337e48cc4e04794ce38"} Oct 03 13:12:53 crc kubenswrapper[4962]: I1003 13:12:53.025086 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea" containerName="glance-log" containerID="cri-o://4e3e83e00cb6e2cf785955ae1f1db43c9b118ef26fe4f44ee2f206cd92f1ab3d" gracePeriod=30 Oct 03 13:12:53 crc kubenswrapper[4962]: I1003 13:12:53.025192 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea" containerName="glance-httpd" containerID="cri-o://4420135f0a4f8fcafd8ca642a3ef3fed14dd906441be8c3b82009ed4daddf179" gracePeriod=30 Oct 03 13:12:53 crc kubenswrapper[4962]: I1003 13:12:53.025384 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee" containerName="glance-log" containerID="cri-o://f0f8ab17008a39c843b4ec7fec174eb3ff5bc9c8390d934211bf67ad0ecf929c" gracePeriod=30 Oct 03 13:12:53 crc kubenswrapper[4962]: I1003 13:12:53.025483 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee" containerName="glance-httpd" containerID="cri-o://0442c8ea4fb01cd0a7571ebd0c4741902c3adbd64625faaa9942f822b037d8a0" gracePeriod=30 Oct 03 
13:12:53 crc kubenswrapper[4962]: I1003 13:12:53.094971 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnl58\" (UniqueName: \"kubernetes.io/projected/e0987fb4-1ae7-4caa-b2d8-4c3fba23e6d6-kube-api-access-bnl58\") pod \"e0987fb4-1ae7-4caa-b2d8-4c3fba23e6d6\" (UID: \"e0987fb4-1ae7-4caa-b2d8-4c3fba23e6d6\") " Oct 03 13:12:53 crc kubenswrapper[4962]: I1003 13:12:53.095065 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldcr7\" (UniqueName: \"kubernetes.io/projected/6d5982a3-b38c-43bf-9edf-4e0216fb3374-kube-api-access-ldcr7\") pod \"6d5982a3-b38c-43bf-9edf-4e0216fb3374\" (UID: \"6d5982a3-b38c-43bf-9edf-4e0216fb3374\") " Oct 03 13:12:53 crc kubenswrapper[4962]: I1003 13:12:53.100961 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0987fb4-1ae7-4caa-b2d8-4c3fba23e6d6-kube-api-access-bnl58" (OuterVolumeSpecName: "kube-api-access-bnl58") pod "e0987fb4-1ae7-4caa-b2d8-4c3fba23e6d6" (UID: "e0987fb4-1ae7-4caa-b2d8-4c3fba23e6d6"). InnerVolumeSpecName "kube-api-access-bnl58". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:12:53 crc kubenswrapper[4962]: I1003 13:12:53.101021 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d5982a3-b38c-43bf-9edf-4e0216fb3374-kube-api-access-ldcr7" (OuterVolumeSpecName: "kube-api-access-ldcr7") pod "6d5982a3-b38c-43bf-9edf-4e0216fb3374" (UID: "6d5982a3-b38c-43bf-9edf-4e0216fb3374"). InnerVolumeSpecName "kube-api-access-ldcr7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:12:53 crc kubenswrapper[4962]: I1003 13:12:53.200572 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldcr7\" (UniqueName: \"kubernetes.io/projected/6d5982a3-b38c-43bf-9edf-4e0216fb3374-kube-api-access-ldcr7\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:53 crc kubenswrapper[4962]: I1003 13:12:53.200622 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnl58\" (UniqueName: \"kubernetes.io/projected/e0987fb4-1ae7-4caa-b2d8-4c3fba23e6d6-kube-api-access-bnl58\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:53 crc kubenswrapper[4962]: I1003 13:12:53.927228 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-2wz97"] Oct 03 13:12:53 crc kubenswrapper[4962]: E1003 13:12:53.927944 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0987fb4-1ae7-4caa-b2d8-4c3fba23e6d6" containerName="mariadb-account-create" Oct 03 13:12:53 crc kubenswrapper[4962]: I1003 13:12:53.927961 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0987fb4-1ae7-4caa-b2d8-4c3fba23e6d6" containerName="mariadb-account-create" Oct 03 13:12:53 crc kubenswrapper[4962]: E1003 13:12:53.927991 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d5982a3-b38c-43bf-9edf-4e0216fb3374" containerName="mariadb-account-create" Oct 03 13:12:53 crc kubenswrapper[4962]: I1003 13:12:53.927998 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d5982a3-b38c-43bf-9edf-4e0216fb3374" containerName="mariadb-account-create" Oct 03 13:12:53 crc kubenswrapper[4962]: I1003 13:12:53.929113 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0987fb4-1ae7-4caa-b2d8-4c3fba23e6d6" containerName="mariadb-account-create" Oct 03 13:12:53 crc kubenswrapper[4962]: I1003 13:12:53.929139 4962 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6d5982a3-b38c-43bf-9edf-4e0216fb3374" containerName="mariadb-account-create" Oct 03 13:12:53 crc kubenswrapper[4962]: I1003 13:12:53.933340 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-2wz97" Oct 03 13:12:53 crc kubenswrapper[4962]: I1003 13:12:53.937369 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 03 13:12:53 crc kubenswrapper[4962]: I1003 13:12:53.937930 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-pk28p" Oct 03 13:12:53 crc kubenswrapper[4962]: I1003 13:12:53.958022 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-2wz97"] Oct 03 13:12:54 crc kubenswrapper[4962]: I1003 13:12:54.014501 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22874ecf-641f-46a1-bbb5-4d27b38bf001-combined-ca-bundle\") pod \"barbican-db-sync-2wz97\" (UID: \"22874ecf-641f-46a1-bbb5-4d27b38bf001\") " pod="openstack/barbican-db-sync-2wz97" Oct 03 13:12:54 crc kubenswrapper[4962]: I1003 13:12:54.014567 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28j99\" (UniqueName: \"kubernetes.io/projected/22874ecf-641f-46a1-bbb5-4d27b38bf001-kube-api-access-28j99\") pod \"barbican-db-sync-2wz97\" (UID: \"22874ecf-641f-46a1-bbb5-4d27b38bf001\") " pod="openstack/barbican-db-sync-2wz97" Oct 03 13:12:54 crc kubenswrapper[4962]: I1003 13:12:54.014587 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/22874ecf-641f-46a1-bbb5-4d27b38bf001-db-sync-config-data\") pod \"barbican-db-sync-2wz97\" (UID: \"22874ecf-641f-46a1-bbb5-4d27b38bf001\") " pod="openstack/barbican-db-sync-2wz97" Oct 03 13:12:54 crc kubenswrapper[4962]: I1003 13:12:54.037445 4962 generic.go:334] "Generic (PLEG): container finished" podID="0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea" containerID="4420135f0a4f8fcafd8ca642a3ef3fed14dd906441be8c3b82009ed4daddf179" exitCode=0 Oct 03 13:12:54 crc kubenswrapper[4962]: I1003 13:12:54.037487 4962 generic.go:334] "Generic (PLEG): container finished" podID="0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea" containerID="4e3e83e00cb6e2cf785955ae1f1db43c9b118ef26fe4f44ee2f206cd92f1ab3d" exitCode=143 Oct 03 13:12:54 crc kubenswrapper[4962]: I1003 13:12:54.037544 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea","Type":"ContainerDied","Data":"4420135f0a4f8fcafd8ca642a3ef3fed14dd906441be8c3b82009ed4daddf179"} Oct 03 13:12:54 crc kubenswrapper[4962]: I1003 13:12:54.037577 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea","Type":"ContainerDied","Data":"4e3e83e00cb6e2cf785955ae1f1db43c9b118ef26fe4f44ee2f206cd92f1ab3d"} Oct 03 13:12:54 crc kubenswrapper[4962]: I1003 13:12:54.040108 4962 generic.go:334] "Generic (PLEG): container finished" podID="5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee" containerID="0442c8ea4fb01cd0a7571ebd0c4741902c3adbd64625faaa9942f822b037d8a0" exitCode=0 Oct 03 13:12:54 crc kubenswrapper[4962]: I1003 13:12:54.040130 4962 generic.go:334] "Generic (PLEG): container finished" podID="5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee" 
containerID="f0f8ab17008a39c843b4ec7fec174eb3ff5bc9c8390d934211bf67ad0ecf929c" exitCode=143 Oct 03 13:12:54 crc kubenswrapper[4962]: I1003 13:12:54.040177 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee","Type":"ContainerDied","Data":"0442c8ea4fb01cd0a7571ebd0c4741902c3adbd64625faaa9942f822b037d8a0"} Oct 03 13:12:54 crc kubenswrapper[4962]: I1003 13:12:54.040207 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee","Type":"ContainerDied","Data":"f0f8ab17008a39c843b4ec7fec174eb3ff5bc9c8390d934211bf67ad0ecf929c"} Oct 03 13:12:54 crc kubenswrapper[4962]: I1003 13:12:54.116445 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22874ecf-641f-46a1-bbb5-4d27b38bf001-combined-ca-bundle\") pod \"barbican-db-sync-2wz97\" (UID: \"22874ecf-641f-46a1-bbb5-4d27b38bf001\") " pod="openstack/barbican-db-sync-2wz97" Oct 03 13:12:54 crc kubenswrapper[4962]: I1003 13:12:54.116528 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28j99\" (UniqueName: \"kubernetes.io/projected/22874ecf-641f-46a1-bbb5-4d27b38bf001-kube-api-access-28j99\") pod \"barbican-db-sync-2wz97\" (UID: \"22874ecf-641f-46a1-bbb5-4d27b38bf001\") " pod="openstack/barbican-db-sync-2wz97" Oct 03 13:12:54 crc kubenswrapper[4962]: I1003 13:12:54.116558 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/22874ecf-641f-46a1-bbb5-4d27b38bf001-db-sync-config-data\") pod \"barbican-db-sync-2wz97\" (UID: \"22874ecf-641f-46a1-bbb5-4d27b38bf001\") " pod="openstack/barbican-db-sync-2wz97" Oct 03 13:12:54 crc kubenswrapper[4962]: I1003 13:12:54.121326 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22874ecf-641f-46a1-bbb5-4d27b38bf001-combined-ca-bundle\") pod \"barbican-db-sync-2wz97\" (UID: \"22874ecf-641f-46a1-bbb5-4d27b38bf001\") " pod="openstack/barbican-db-sync-2wz97" Oct 03 13:12:54 crc kubenswrapper[4962]: I1003 13:12:54.122181 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/22874ecf-641f-46a1-bbb5-4d27b38bf001-db-sync-config-data\") pod \"barbican-db-sync-2wz97\" (UID: \"22874ecf-641f-46a1-bbb5-4d27b38bf001\") " pod="openstack/barbican-db-sync-2wz97" Oct 03 13:12:54 crc kubenswrapper[4962]: I1003 13:12:54.134651 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28j99\" (UniqueName: \"kubernetes.io/projected/22874ecf-641f-46a1-bbb5-4d27b38bf001-kube-api-access-28j99\") pod \"barbican-db-sync-2wz97\" (UID: \"22874ecf-641f-46a1-bbb5-4d27b38bf001\") " pod="openstack/barbican-db-sync-2wz97" Oct 03 13:12:54 crc kubenswrapper[4962]: I1003 13:12:54.252989 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-2wz97" Oct 03 13:12:55 crc kubenswrapper[4962]: I1003 13:12:55.049516 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6n6zg" event={"ID":"78924dc1-86de-47ce-bfb7-d856e1d9e25a","Type":"ContainerDied","Data":"da4d3ab506465a0f291bfb89c103c6964fddfccbafb4c805f563ba7ab6bc672c"} Oct 03 13:12:55 crc kubenswrapper[4962]: I1003 13:12:55.049876 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da4d3ab506465a0f291bfb89c103c6964fddfccbafb4c805f563ba7ab6bc672c" Oct 03 13:12:55 crc kubenswrapper[4962]: I1003 13:12:55.080773 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-6n6zg" Oct 03 13:12:55 crc kubenswrapper[4962]: I1003 13:12:55.234177 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/78924dc1-86de-47ce-bfb7-d856e1d9e25a-fernet-keys\") pod \"78924dc1-86de-47ce-bfb7-d856e1d9e25a\" (UID: \"78924dc1-86de-47ce-bfb7-d856e1d9e25a\") " Oct 03 13:12:55 crc kubenswrapper[4962]: I1003 13:12:55.234325 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcm84\" (UniqueName: \"kubernetes.io/projected/78924dc1-86de-47ce-bfb7-d856e1d9e25a-kube-api-access-lcm84\") pod \"78924dc1-86de-47ce-bfb7-d856e1d9e25a\" (UID: \"78924dc1-86de-47ce-bfb7-d856e1d9e25a\") " Oct 03 13:12:55 crc kubenswrapper[4962]: I1003 13:12:55.234376 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/78924dc1-86de-47ce-bfb7-d856e1d9e25a-credential-keys\") pod \"78924dc1-86de-47ce-bfb7-d856e1d9e25a\" (UID: \"78924dc1-86de-47ce-bfb7-d856e1d9e25a\") " Oct 03 13:12:55 crc kubenswrapper[4962]: I1003 13:12:55.234516 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78924dc1-86de-47ce-bfb7-d856e1d9e25a-combined-ca-bundle\") pod \"78924dc1-86de-47ce-bfb7-d856e1d9e25a\" (UID: \"78924dc1-86de-47ce-bfb7-d856e1d9e25a\") " Oct 03 13:12:55 crc kubenswrapper[4962]: I1003 13:12:55.235194 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78924dc1-86de-47ce-bfb7-d856e1d9e25a-config-data\") pod \"78924dc1-86de-47ce-bfb7-d856e1d9e25a\" (UID: \"78924dc1-86de-47ce-bfb7-d856e1d9e25a\") " Oct 03 13:12:55 crc kubenswrapper[4962]: I1003 13:12:55.235228 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78924dc1-86de-47ce-bfb7-d856e1d9e25a-scripts\") pod \"78924dc1-86de-47ce-bfb7-d856e1d9e25a\" (UID: \"78924dc1-86de-47ce-bfb7-d856e1d9e25a\") " Oct 03 13:12:55 crc kubenswrapper[4962]: I1003 13:12:55.242390 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78924dc1-86de-47ce-bfb7-d856e1d9e25a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "78924dc1-86de-47ce-bfb7-d856e1d9e25a" (UID: "78924dc1-86de-47ce-bfb7-d856e1d9e25a"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:12:55 crc kubenswrapper[4962]: I1003 13:12:55.243463 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78924dc1-86de-47ce-bfb7-d856e1d9e25a-kube-api-access-lcm84" (OuterVolumeSpecName: "kube-api-access-lcm84") pod "78924dc1-86de-47ce-bfb7-d856e1d9e25a" (UID: "78924dc1-86de-47ce-bfb7-d856e1d9e25a"). InnerVolumeSpecName "kube-api-access-lcm84". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:12:55 crc kubenswrapper[4962]: I1003 13:12:55.253390 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78924dc1-86de-47ce-bfb7-d856e1d9e25a-scripts" (OuterVolumeSpecName: "scripts") pod "78924dc1-86de-47ce-bfb7-d856e1d9e25a" (UID: "78924dc1-86de-47ce-bfb7-d856e1d9e25a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:12:55 crc kubenswrapper[4962]: I1003 13:12:55.253796 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78924dc1-86de-47ce-bfb7-d856e1d9e25a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "78924dc1-86de-47ce-bfb7-d856e1d9e25a" (UID: "78924dc1-86de-47ce-bfb7-d856e1d9e25a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:12:55 crc kubenswrapper[4962]: I1003 13:12:55.268474 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78924dc1-86de-47ce-bfb7-d856e1d9e25a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "78924dc1-86de-47ce-bfb7-d856e1d9e25a" (UID: "78924dc1-86de-47ce-bfb7-d856e1d9e25a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:12:55 crc kubenswrapper[4962]: I1003 13:12:55.280022 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78924dc1-86de-47ce-bfb7-d856e1d9e25a-config-data" (OuterVolumeSpecName: "config-data") pod "78924dc1-86de-47ce-bfb7-d856e1d9e25a" (UID: "78924dc1-86de-47ce-bfb7-d856e1d9e25a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:12:55 crc kubenswrapper[4962]: I1003 13:12:55.337702 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78924dc1-86de-47ce-bfb7-d856e1d9e25a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:55 crc kubenswrapper[4962]: I1003 13:12:55.337741 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78924dc1-86de-47ce-bfb7-d856e1d9e25a-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:55 crc kubenswrapper[4962]: I1003 13:12:55.337780 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78924dc1-86de-47ce-bfb7-d856e1d9e25a-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:55 crc kubenswrapper[4962]: I1003 13:12:55.337792 4962 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/78924dc1-86de-47ce-bfb7-d856e1d9e25a-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:55 crc kubenswrapper[4962]: I1003 13:12:55.337803 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcm84\" (UniqueName: \"kubernetes.io/projected/78924dc1-86de-47ce-bfb7-d856e1d9e25a-kube-api-access-lcm84\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:55 crc kubenswrapper[4962]: I1003 13:12:55.337815 4962 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/78924dc1-86de-47ce-bfb7-d856e1d9e25a-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:55 crc kubenswrapper[4962]: I1003 13:12:55.626819 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-4662-account-create-85cwj" Oct 03 13:12:55 crc kubenswrapper[4962]: I1003 13:12:55.744477 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czklt\" (UniqueName: \"kubernetes.io/projected/72bb7aab-b5f7-46ba-bf39-471de4e5090f-kube-api-access-czklt\") pod \"72bb7aab-b5f7-46ba-bf39-471de4e5090f\" (UID: \"72bb7aab-b5f7-46ba-bf39-471de4e5090f\") " Oct 03 13:12:55 crc kubenswrapper[4962]: I1003 13:12:55.752000 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72bb7aab-b5f7-46ba-bf39-471de4e5090f-kube-api-access-czklt" (OuterVolumeSpecName: "kube-api-access-czklt") pod "72bb7aab-b5f7-46ba-bf39-471de4e5090f" (UID: "72bb7aab-b5f7-46ba-bf39-471de4e5090f"). InnerVolumeSpecName "kube-api-access-czklt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:12:55 crc kubenswrapper[4962]: I1003 13:12:55.847251 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czklt\" (UniqueName: \"kubernetes.io/projected/72bb7aab-b5f7-46ba-bf39-471de4e5090f-kube-api-access-czklt\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:55 crc kubenswrapper[4962]: I1003 13:12:55.903844 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 13:12:55 crc kubenswrapper[4962]: I1003 13:12:55.970316 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.050749 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blw2v\" (UniqueName: \"kubernetes.io/projected/0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea-kube-api-access-blw2v\") pod \"0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea\" (UID: \"0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea\") " Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.050813 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea\" (UID: \"0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea\") " Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.050858 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea-config-data\") pod \"0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea\" (UID: \"0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea\") " Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.050992 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee-logs\") pod \"5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee\" (UID: \"5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee\") " Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.051025 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee-config-data\") pod \"5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee\" (UID: \"5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee\") " Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.051065 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea-logs\") pod \"0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea\" (UID: \"0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea\") " Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.051095 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8knl6\" (UniqueName: \"kubernetes.io/projected/5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee-kube-api-access-8knl6\") pod \"5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee\" (UID: \"5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee\") " Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.051134 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee-combined-ca-bundle\") pod \"5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee\" (UID: \"5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee\") " Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.051184 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea-combined-ca-bundle\") pod \"0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea\" (UID: \"0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea\") " Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.051207 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea-httpd-run\") pod \"0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea\" (UID: \"0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea\") " Oct 03 
13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.051238 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee-httpd-run\") pod \"5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee\" (UID: \"5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee\") " Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.051269 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee-scripts\") pod \"5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee\" (UID: \"5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee\") " Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.051301 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee\" (UID: \"5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee\") " Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.051329 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea-scripts\") pod \"0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea\" (UID: \"0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea\") " Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.053059 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea" (UID: "0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.053659 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea-logs" (OuterVolumeSpecName: "logs") pod "0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea" (UID: "0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.053772 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee" (UID: "5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.057220 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee-kube-api-access-8knl6" (OuterVolumeSpecName: "kube-api-access-8knl6") pod "5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee" (UID: "5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee"). InnerVolumeSpecName "kube-api-access-8knl6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.057679 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea-kube-api-access-blw2v" (OuterVolumeSpecName: "kube-api-access-blw2v") pod "0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea" (UID: "0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea"). InnerVolumeSpecName "kube-api-access-blw2v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.057794 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee-logs" (OuterVolumeSpecName: "logs") pod "5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee" (UID: "5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.058813 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea" (UID: "0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.060190 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee" (UID: "5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.060547 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee-scripts" (OuterVolumeSpecName: "scripts") pod "5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee" (UID: "5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.060792 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea-scripts" (OuterVolumeSpecName: "scripts") pod "0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea" (UID: "0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.062911 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee","Type":"ContainerDied","Data":"858b3c09d9992b5ecb466e8d9f01ac69f0037e5f797fe2b85f390d3a32dc3495"} Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.062960 4962 scope.go:117] "RemoveContainer" containerID="0442c8ea4fb01cd0a7571ebd0c4741902c3adbd64625faaa9942f822b037d8a0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.063071 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.066684 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c09e1e8d-9d15-4eef-a553-2af7c59998e3","Type":"ContainerStarted","Data":"922457fe5519a12a1f756b9f0601dc6bb170b210e8c7fcb0a9bfa22bb5817803"} Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.068261 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4662-account-create-85cwj" event={"ID":"72bb7aab-b5f7-46ba-bf39-471de4e5090f","Type":"ContainerDied","Data":"924e75b0c572933e17c836913b70f13e95108299636002de9aa1b6ed0025fff6"} Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.068288 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="924e75b0c572933e17c836913b70f13e95108299636002de9aa1b6ed0025fff6" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.068326 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-4662-account-create-85cwj" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.074886 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2865v" event={"ID":"29c86a63-d2f6-4f22-9d04-d6128fa7c31a","Type":"ContainerStarted","Data":"dd4e722d7156a1a3702c8352cc07fe685ce94d9af20fd5e713aa19eeae622754"} Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.078661 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea" (UID: "0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.079764 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-6n6zg" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.079868 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.079975 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea","Type":"ContainerDied","Data":"9a8bb459a4c8dc70c7215047bd10043e04f830f51e21110f1aab080f0aed9e11"} Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.093396 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-2865v" podStartSLOduration=1.931362427 podStartE2EDuration="9.093379188s" podCreationTimestamp="2025-10-03 13:12:47 +0000 UTC" firstStartedPulling="2025-10-03 13:12:48.449830913 +0000 UTC m=+1376.853728748" lastFinishedPulling="2025-10-03 13:12:55.611847674 +0000 UTC m=+1384.015745509" observedRunningTime="2025-10-03 13:12:56.089809872 +0000 UTC m=+1384.493707717" watchObservedRunningTime="2025-10-03 13:12:56.093379188 +0000 UTC m=+1384.497277023" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.098495 4962 scope.go:117] "RemoveContainer" containerID="f0f8ab17008a39c843b4ec7fec174eb3ff5bc9c8390d934211bf67ad0ecf929c" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.099987 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee" (UID: "5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.119172 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea-config-data" (OuterVolumeSpecName: "config-data") pod "0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea" (UID: "0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.129093 4962 scope.go:117] "RemoveContainer" containerID="4420135f0a4f8fcafd8ca642a3ef3fed14dd906441be8c3b82009ed4daddf179" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.136724 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee-config-data" (OuterVolumeSpecName: "config-data") pod "5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee" (UID: "5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.154761 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.154801 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blw2v\" (UniqueName: \"kubernetes.io/projected/0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea-kube-api-access-blw2v\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.154821 4962 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.154833 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.154842 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee-logs\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.154850 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.154858 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea-logs\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.154867 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8knl6\" (UniqueName: \"kubernetes.io/projected/5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee-kube-api-access-8knl6\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.154875 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.154883 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.154891 4962 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.154901 4962 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.154910 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.154924 4962 reconciler_common.go:286] 
"operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.154993 4962 scope.go:117] "RemoveContainer" containerID="4e3e83e00cb6e2cf785955ae1f1db43c9b118ef26fe4f44ee2f206cd92f1ab3d" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.155644 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-2wz97"] Oct 03 13:12:56 crc kubenswrapper[4962]: W1003 13:12:56.158111 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22874ecf_641f_46a1_bbb5_4d27b38bf001.slice/crio-e79484f6c9ffa7b36c22c8e5b8e9340a243a48c41f6bd2eb2325c3acdd6cfc92 WatchSource:0}: Error finding container e79484f6c9ffa7b36c22c8e5b8e9340a243a48c41f6bd2eb2325c3acdd6cfc92: Status 404 returned error can't find the container with id e79484f6c9ffa7b36c22c8e5b8e9340a243a48c41f6bd2eb2325c3acdd6cfc92 Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.172480 4962 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.181847 4962 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.245727 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-6n6zg"] Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.259376 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-6n6zg"] Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.265505 4962 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.265850 4962 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.343661 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-4fvqx"] Oct 03 13:12:56 crc kubenswrapper[4962]: E1003 13:12:56.343997 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72bb7aab-b5f7-46ba-bf39-471de4e5090f" containerName="mariadb-account-create" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.344011 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="72bb7aab-b5f7-46ba-bf39-471de4e5090f" containerName="mariadb-account-create" Oct 03 13:12:56 crc kubenswrapper[4962]: E1003 13:12:56.344041 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee" containerName="glance-httpd" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.344047 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee" containerName="glance-httpd" Oct 03 13:12:56 crc kubenswrapper[4962]: E1003 13:12:56.344060 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee" containerName="glance-log" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.344066 4962 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee" containerName="glance-log" Oct 03 13:12:56 crc kubenswrapper[4962]: E1003 13:12:56.344075 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea" containerName="glance-httpd" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.344081 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea" containerName="glance-httpd" Oct 03 13:12:56 crc kubenswrapper[4962]: E1003 13:12:56.344100 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78924dc1-86de-47ce-bfb7-d856e1d9e25a" containerName="keystone-bootstrap" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.344106 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="78924dc1-86de-47ce-bfb7-d856e1d9e25a" containerName="keystone-bootstrap" Oct 03 13:12:56 crc kubenswrapper[4962]: E1003 13:12:56.344122 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea" containerName="glance-log" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.344128 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea" containerName="glance-log" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.344328 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="72bb7aab-b5f7-46ba-bf39-471de4e5090f" containerName="mariadb-account-create" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.344340 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="78924dc1-86de-47ce-bfb7-d856e1d9e25a" containerName="keystone-bootstrap" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.344350 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee" containerName="glance-httpd" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.344361 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea" containerName="glance-log" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.344375 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea" containerName="glance-httpd" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.344385 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee" containerName="glance-log" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.344959 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-4fvqx" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.349550 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.349910 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.350621 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rbx44" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.351244 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.363103 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-4fvqx"] Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.399507 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.410668 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.427452 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.434704 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.442614 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.446359 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.449673 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.449916 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-s2g9d" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.450027 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.454919 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.458813 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.467655 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693-credential-keys\") pod \"keystone-bootstrap-4fvqx\" (UID: \"8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693\") " pod="openstack/keystone-bootstrap-4fvqx" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.467741 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693-combined-ca-bundle\") pod \"keystone-bootstrap-4fvqx\" (UID: \"8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693\") " pod="openstack/keystone-bootstrap-4fvqx" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.467762 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v846c\" (UniqueName: \"kubernetes.io/projected/8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693-kube-api-access-v846c\") pod \"keystone-bootstrap-4fvqx\" (UID: \"8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693\") " pod="openstack/keystone-bootstrap-4fvqx" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.467787 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693-fernet-keys\") pod \"keystone-bootstrap-4fvqx\" (UID: \"8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693\") " pod="openstack/keystone-bootstrap-4fvqx" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.467816 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693-config-data\") pod \"keystone-bootstrap-4fvqx\" (UID: \"8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693\") " pod="openstack/keystone-bootstrap-4fvqx" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.467840 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693-scripts\") pod \"keystone-bootstrap-4fvqx\" (UID: \"8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693\") " pod="openstack/keystone-bootstrap-4fvqx" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.469132 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.470604 4962 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.478015 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.478263 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.503400 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.571947 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c36ae572-9009-4126-88fa-a27e232e4332-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c36ae572-9009-4126-88fa-a27e232e4332\") " pod="openstack/glance-default-external-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.572002 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693-config-data\") pod \"keystone-bootstrap-4fvqx\" (UID: \"8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693\") " pod="openstack/keystone-bootstrap-4fvqx" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.572032 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c36ae572-9009-4126-88fa-a27e232e4332-config-data\") pod \"glance-default-external-api-0\" (UID: \"c36ae572-9009-4126-88fa-a27e232e4332\") " pod="openstack/glance-default-external-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.572063 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693-scripts\") pod \"keystone-bootstrap-4fvqx\" (UID: \"8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693\") " pod="openstack/keystone-bootstrap-4fvqx" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.572084 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l4pr\" (UniqueName: \"kubernetes.io/projected/c36ae572-9009-4126-88fa-a27e232e4332-kube-api-access-9l4pr\") pod \"glance-default-external-api-0\" (UID: \"c36ae572-9009-4126-88fa-a27e232e4332\") " pod="openstack/glance-default-external-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.572111 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ef04f91-585b-4d7a-8847-c83ec719fa7f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8ef04f91-585b-4d7a-8847-c83ec719fa7f\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.572134 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ef04f91-585b-4d7a-8847-c83ec719fa7f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8ef04f91-585b-4d7a-8847-c83ec719fa7f\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.572150 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"c36ae572-9009-4126-88fa-a27e232e4332\") " pod="openstack/glance-default-external-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.572179 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ef04f91-585b-4d7a-8847-c83ec719fa7f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8ef04f91-585b-4d7a-8847-c83ec719fa7f\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.572194 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ef04f91-585b-4d7a-8847-c83ec719fa7f-logs\") pod \"glance-default-internal-api-0\" (UID: \"8ef04f91-585b-4d7a-8847-c83ec719fa7f\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.572212 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ef04f91-585b-4d7a-8847-c83ec719fa7f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8ef04f91-585b-4d7a-8847-c83ec719fa7f\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.572232 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c36ae572-9009-4126-88fa-a27e232e4332-scripts\") pod \"glance-default-external-api-0\" (UID: \"c36ae572-9009-4126-88fa-a27e232e4332\") " pod="openstack/glance-default-external-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.572247 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"8ef04f91-585b-4d7a-8847-c83ec719fa7f\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.572265 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693-credential-keys\") pod \"keystone-bootstrap-4fvqx\" (UID: \"8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693\") " pod="openstack/keystone-bootstrap-4fvqx" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.572282 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c36ae572-9009-4126-88fa-a27e232e4332-logs\") pod \"glance-default-external-api-0\" (UID: \"c36ae572-9009-4126-88fa-a27e232e4332\") " pod="openstack/glance-default-external-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.572311 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ef04f91-585b-4d7a-8847-c83ec719fa7f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8ef04f91-585b-4d7a-8847-c83ec719fa7f\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.572333 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c36ae572-9009-4126-88fa-a27e232e4332-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c36ae572-9009-4126-88fa-a27e232e4332\") " pod="openstack/glance-default-external-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.572351 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c36ae572-9009-4126-88fa-a27e232e4332-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c36ae572-9009-4126-88fa-a27e232e4332\") " pod="openstack/glance-default-external-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.572380 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tl94\" (UniqueName: \"kubernetes.io/projected/8ef04f91-585b-4d7a-8847-c83ec719fa7f-kube-api-access-5tl94\") pod \"glance-default-internal-api-0\" (UID: \"8ef04f91-585b-4d7a-8847-c83ec719fa7f\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.572406 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693-combined-ca-bundle\") pod \"keystone-bootstrap-4fvqx\" (UID: \"8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693\") " pod="openstack/keystone-bootstrap-4fvqx" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.572421 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v846c\" (UniqueName: \"kubernetes.io/projected/8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693-kube-api-access-v846c\") pod \"keystone-bootstrap-4fvqx\" (UID: \"8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693\") " pod="openstack/keystone-bootstrap-4fvqx" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.572446 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693-fernet-keys\") pod \"keystone-bootstrap-4fvqx\" (UID: \"8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693\") " pod="openstack/keystone-bootstrap-4fvqx" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.580302 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693-scripts\") pod \"keystone-bootstrap-4fvqx\" (UID: \"8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693\") " pod="openstack/keystone-bootstrap-4fvqx" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.580715 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693-credential-keys\") pod \"keystone-bootstrap-4fvqx\" (UID: \"8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693\") " pod="openstack/keystone-bootstrap-4fvqx" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.580893 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693-fernet-keys\") pod \"keystone-bootstrap-4fvqx\" (UID: \"8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693\") " pod="openstack/keystone-bootstrap-4fvqx" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.580931 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693-config-data\") pod \"keystone-bootstrap-4fvqx\" (UID: \"8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693\") " pod="openstack/keystone-bootstrap-4fvqx" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.584302 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693-combined-ca-bundle\") pod \"keystone-bootstrap-4fvqx\" (UID: \"8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693\") " pod="openstack/keystone-bootstrap-4fvqx" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.596618 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v846c\" (UniqueName: \"kubernetes.io/projected/8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693-kube-api-access-v846c\") pod \"keystone-bootstrap-4fvqx\" (UID: \"8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693\") " pod="openstack/keystone-bootstrap-4fvqx" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.673255 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-4fvqx" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.673798 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ef04f91-585b-4d7a-8847-c83ec719fa7f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8ef04f91-585b-4d7a-8847-c83ec719fa7f\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.673864 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ef04f91-585b-4d7a-8847-c83ec719fa7f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8ef04f91-585b-4d7a-8847-c83ec719fa7f\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.673889 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"c36ae572-9009-4126-88fa-a27e232e4332\") " pod="openstack/glance-default-external-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.673932 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ef04f91-585b-4d7a-8847-c83ec719fa7f-logs\") pod \"glance-default-internal-api-0\" (UID: \"8ef04f91-585b-4d7a-8847-c83ec719fa7f\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.673954 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ef04f91-585b-4d7a-8847-c83ec719fa7f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8ef04f91-585b-4d7a-8847-c83ec719fa7f\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.673980 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ef04f91-585b-4d7a-8847-c83ec719fa7f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8ef04f91-585b-4d7a-8847-c83ec719fa7f\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.674005 4962 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c36ae572-9009-4126-88fa-a27e232e4332-scripts\") pod \"glance-default-external-api-0\" (UID: \"c36ae572-9009-4126-88fa-a27e232e4332\") " pod="openstack/glance-default-external-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.674026 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"8ef04f91-585b-4d7a-8847-c83ec719fa7f\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.674051 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c36ae572-9009-4126-88fa-a27e232e4332-logs\") pod \"glance-default-external-api-0\" (UID: \"c36ae572-9009-4126-88fa-a27e232e4332\") " pod="openstack/glance-default-external-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.674102 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ef04f91-585b-4d7a-8847-c83ec719fa7f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8ef04f91-585b-4d7a-8847-c83ec719fa7f\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.674129 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c36ae572-9009-4126-88fa-a27e232e4332-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c36ae572-9009-4126-88fa-a27e232e4332\") " pod="openstack/glance-default-external-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.674156 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c36ae572-9009-4126-88fa-a27e232e4332-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c36ae572-9009-4126-88fa-a27e232e4332\") " pod="openstack/glance-default-external-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.674196 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tl94\" (UniqueName: \"kubernetes.io/projected/8ef04f91-585b-4d7a-8847-c83ec719fa7f-kube-api-access-5tl94\") pod \"glance-default-internal-api-0\" (UID: \"8ef04f91-585b-4d7a-8847-c83ec719fa7f\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.674235 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"c36ae572-9009-4126-88fa-a27e232e4332\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-external-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.674253 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c36ae572-9009-4126-88fa-a27e232e4332-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c36ae572-9009-4126-88fa-a27e232e4332\") " pod="openstack/glance-default-external-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.674287 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c36ae572-9009-4126-88fa-a27e232e4332-config-data\") pod \"glance-default-external-api-0\" (UID: \"c36ae572-9009-4126-88fa-a27e232e4332\") " pod="openstack/glance-default-external-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.674312 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9l4pr\" (UniqueName: \"kubernetes.io/projected/c36ae572-9009-4126-88fa-a27e232e4332-kube-api-access-9l4pr\") pod \"glance-default-external-api-0\" (UID: \"c36ae572-9009-4126-88fa-a27e232e4332\") " pod="openstack/glance-default-external-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.674520 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ef04f91-585b-4d7a-8847-c83ec719fa7f-logs\") pod \"glance-default-internal-api-0\" (UID: \"8ef04f91-585b-4d7a-8847-c83ec719fa7f\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.675339 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c36ae572-9009-4126-88fa-a27e232e4332-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c36ae572-9009-4126-88fa-a27e232e4332\") " pod="openstack/glance-default-external-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.678321 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c36ae572-9009-4126-88fa-a27e232e4332-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c36ae572-9009-4126-88fa-a27e232e4332\") " pod="openstack/glance-default-external-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.678485 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c36ae572-9009-4126-88fa-a27e232e4332-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c36ae572-9009-4126-88fa-a27e232e4332\") " pod="openstack/glance-default-external-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.678523 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c36ae572-9009-4126-88fa-a27e232e4332-scripts\") pod \"glance-default-external-api-0\" (UID: \"c36ae572-9009-4126-88fa-a27e232e4332\") " pod="openstack/glance-default-external-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.678519 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ef04f91-585b-4d7a-8847-c83ec719fa7f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8ef04f91-585b-4d7a-8847-c83ec719fa7f\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.678693 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ef04f91-585b-4d7a-8847-c83ec719fa7f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8ef04f91-585b-4d7a-8847-c83ec719fa7f\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.678847 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"8ef04f91-585b-4d7a-8847-c83ec719fa7f\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.678896 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c36ae572-9009-4126-88fa-a27e232e4332-logs\") pod \"glance-default-external-api-0\" (UID: \"c36ae572-9009-4126-88fa-a27e232e4332\") " pod="openstack/glance-default-external-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.679155 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ef04f91-585b-4d7a-8847-c83ec719fa7f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8ef04f91-585b-4d7a-8847-c83ec719fa7f\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.683128 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c36ae572-9009-4126-88fa-a27e232e4332-config-data\") pod \"glance-default-external-api-0\" (UID: \"c36ae572-9009-4126-88fa-a27e232e4332\") " pod="openstack/glance-default-external-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.689939 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ef04f91-585b-4d7a-8847-c83ec719fa7f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8ef04f91-585b-4d7a-8847-c83ec719fa7f\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.693277 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ef04f91-585b-4d7a-8847-c83ec719fa7f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8ef04f91-585b-4d7a-8847-c83ec719fa7f\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.693427 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l4pr\" (UniqueName: \"kubernetes.io/projected/c36ae572-9009-4126-88fa-a27e232e4332-kube-api-access-9l4pr\") pod \"glance-default-external-api-0\" (UID: \"c36ae572-9009-4126-88fa-a27e232e4332\") " pod="openstack/glance-default-external-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.694617 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tl94\" (UniqueName: \"kubernetes.io/projected/8ef04f91-585b-4d7a-8847-c83ec719fa7f-kube-api-access-5tl94\") pod \"glance-default-internal-api-0\" (UID: \"8ef04f91-585b-4d7a-8847-c83ec719fa7f\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.708204 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"c36ae572-9009-4126-88fa-a27e232e4332\") " pod="openstack/glance-default-external-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.757838 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"8ef04f91-585b-4d7a-8847-c83ec719fa7f\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 
13:12:56.783749 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 13:12:56 crc kubenswrapper[4962]: I1003 13:12:56.797583 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 13:12:57 crc kubenswrapper[4962]: I1003 13:12:57.095225 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-2wz97" event={"ID":"22874ecf-641f-46a1-bbb5-4d27b38bf001","Type":"ContainerStarted","Data":"e79484f6c9ffa7b36c22c8e5b8e9340a243a48c41f6bd2eb2325c3acdd6cfc92"} Oct 03 13:12:57 crc kubenswrapper[4962]: I1003 13:12:57.151282 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-4fvqx"] Oct 03 13:12:57 crc kubenswrapper[4962]: I1003 13:12:57.586083 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 13:12:57 crc kubenswrapper[4962]: I1003 13:12:57.643726 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 13:12:57 crc kubenswrapper[4962]: W1003 13:12:57.764715 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ef04f91_585b_4d7a_8847_c83ec719fa7f.slice/crio-4e862bef3957222a2a9336bde6e4080b19d7312e422bd0524279be5ec6432c2b WatchSource:0}: Error finding container 4e862bef3957222a2a9336bde6e4080b19d7312e422bd0524279be5ec6432c2b: Status 404 returned error can't find the container with id 4e862bef3957222a2a9336bde6e4080b19d7312e422bd0524279be5ec6432c2b Oct 03 13:12:57 crc kubenswrapper[4962]: W1003 13:12:57.774857 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e1d50ce_6e3a_4c9f_ae0a_37e698dcc693.slice/crio-1a5fa6bab2c349a7c7cda213ca1256a9045ab179d48b3a17b59cbfa6cfaeeeab WatchSource:0}: Error finding container 1a5fa6bab2c349a7c7cda213ca1256a9045ab179d48b3a17b59cbfa6cfaeeeab: Status 404 returned error can't find the container with id 1a5fa6bab2c349a7c7cda213ca1256a9045ab179d48b3a17b59cbfa6cfaeeeab Oct 03 13:12:57 crc kubenswrapper[4962]: I1003 13:12:57.827758 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8b5c85b87-rk29c" Oct 03 13:12:57 crc kubenswrapper[4962]: I1003 13:12:57.889492 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-t8t49"] Oct 03 13:12:57 crc kubenswrapper[4962]: I1003 13:12:57.891615 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77585f5f8c-t8t49" podUID="493335aa-ea8d-4197-8c8b-af186d99f4aa" containerName="dnsmasq-dns" containerID="cri-o://4dca5aa90c57dbb61b1e0f0cfad418d1058fe0f35307c84e272b53f55c58252e" gracePeriod=10 Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.128255 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8ef04f91-585b-4d7a-8847-c83ec719fa7f","Type":"ContainerStarted","Data":"4e862bef3957222a2a9336bde6e4080b19d7312e422bd0524279be5ec6432c2b"} Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.129276 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c36ae572-9009-4126-88fa-a27e232e4332","Type":"ContainerStarted","Data":"4337aa2673fa3d3964e58b850142a7a4a4fadcdf2ed67191370b0473cc88bbbc"} Oct 03 
13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.130899 4962 generic.go:334] "Generic (PLEG): container finished" podID="493335aa-ea8d-4197-8c8b-af186d99f4aa" containerID="4dca5aa90c57dbb61b1e0f0cfad418d1058fe0f35307c84e272b53f55c58252e" exitCode=0 Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.130943 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-t8t49" event={"ID":"493335aa-ea8d-4197-8c8b-af186d99f4aa","Type":"ContainerDied","Data":"4dca5aa90c57dbb61b1e0f0cfad418d1058fe0f35307c84e272b53f55c58252e"} Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.136468 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4fvqx" event={"ID":"8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693","Type":"ContainerStarted","Data":"46e6da82e57b34e2bc7a9d03f73558235f6511a392ce8f5cec53b37074e329af"} Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.136508 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4fvqx" event={"ID":"8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693","Type":"ContainerStarted","Data":"1a5fa6bab2c349a7c7cda213ca1256a9045ab179d48b3a17b59cbfa6cfaeeeab"} Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.141102 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c09e1e8d-9d15-4eef-a553-2af7c59998e3","Type":"ContainerStarted","Data":"4fc87af0adc8de378b2e85921a90879f9ab1b63fd92d450de2af297793e6c95b"} Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.173799 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-4fvqx" podStartSLOduration=2.173740947 podStartE2EDuration="2.173740947s" podCreationTimestamp="2025-10-03 13:12:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:12:58.156184626 +0000 UTC m=+1386.560082461" watchObservedRunningTime="2025-10-03 13:12:58.173740947 +0000 UTC m=+1386.577638782" Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.282478 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea" path="/var/lib/kubelet/pods/0b10e6b3-04d6-48a3-a745-f7d53e5aa3ea/volumes" Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.283364 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee" path="/var/lib/kubelet/pods/5ad6e2a5-3f48-4cbc-ab34-7054f8c253ee/volumes" Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.284175 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78924dc1-86de-47ce-bfb7-d856e1d9e25a" path="/var/lib/kubelet/pods/78924dc1-86de-47ce-bfb7-d856e1d9e25a/volumes" Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.677792 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-t8t49" Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.720354 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-kz6b9"] Oct 03 13:12:58 crc kubenswrapper[4962]: E1003 13:12:58.720714 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="493335aa-ea8d-4197-8c8b-af186d99f4aa" containerName="dnsmasq-dns" Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.720726 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="493335aa-ea8d-4197-8c8b-af186d99f4aa" containerName="dnsmasq-dns" Oct 03 13:12:58 crc kubenswrapper[4962]: E1003 13:12:58.720741 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="493335aa-ea8d-4197-8c8b-af186d99f4aa" containerName="init" Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.720747 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="493335aa-ea8d-4197-8c8b-af186d99f4aa" containerName="init" Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.720929 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="493335aa-ea8d-4197-8c8b-af186d99f4aa" containerName="dnsmasq-dns" Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.721468 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-kz6b9" Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.723477 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.723800 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-7m46v" Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.723943 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.749057 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-kz6b9"] Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.825308 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/493335aa-ea8d-4197-8c8b-af186d99f4aa-config\") pod \"493335aa-ea8d-4197-8c8b-af186d99f4aa\" (UID: \"493335aa-ea8d-4197-8c8b-af186d99f4aa\") " Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.825364 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/493335aa-ea8d-4197-8c8b-af186d99f4aa-ovsdbserver-sb\") pod \"493335aa-ea8d-4197-8c8b-af186d99f4aa\" (UID: \"493335aa-ea8d-4197-8c8b-af186d99f4aa\") " Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.825449 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2p9w\" (UniqueName: \"kubernetes.io/projected/493335aa-ea8d-4197-8c8b-af186d99f4aa-kube-api-access-d2p9w\") pod \"493335aa-ea8d-4197-8c8b-af186d99f4aa\" (UID: \"493335aa-ea8d-4197-8c8b-af186d99f4aa\") " Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.825512 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/493335aa-ea8d-4197-8c8b-af186d99f4aa-dns-swift-storage-0\") pod \"493335aa-ea8d-4197-8c8b-af186d99f4aa\" (UID: \"493335aa-ea8d-4197-8c8b-af186d99f4aa\") " Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.825534 4962 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/493335aa-ea8d-4197-8c8b-af186d99f4aa-dns-svc\") pod \"493335aa-ea8d-4197-8c8b-af186d99f4aa\" (UID: \"493335aa-ea8d-4197-8c8b-af186d99f4aa\") " Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.825600 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/493335aa-ea8d-4197-8c8b-af186d99f4aa-ovsdbserver-nb\") pod \"493335aa-ea8d-4197-8c8b-af186d99f4aa\" (UID: \"493335aa-ea8d-4197-8c8b-af186d99f4aa\") " Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.825836 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/866c8e6b-3fdd-442c-98d4-cf44b6ef098c-config-data\") pod \"cinder-db-sync-kz6b9\" (UID: \"866c8e6b-3fdd-442c-98d4-cf44b6ef098c\") " pod="openstack/cinder-db-sync-kz6b9" Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.825911 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/866c8e6b-3fdd-442c-98d4-cf44b6ef098c-etc-machine-id\") pod \"cinder-db-sync-kz6b9\" (UID: \"866c8e6b-3fdd-442c-98d4-cf44b6ef098c\") " pod="openstack/cinder-db-sync-kz6b9" Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.825938 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlc5k\" (UniqueName: \"kubernetes.io/projected/866c8e6b-3fdd-442c-98d4-cf44b6ef098c-kube-api-access-vlc5k\") pod \"cinder-db-sync-kz6b9\" (UID: \"866c8e6b-3fdd-442c-98d4-cf44b6ef098c\") " pod="openstack/cinder-db-sync-kz6b9" Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.825965 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/866c8e6b-3fdd-442c-98d4-cf44b6ef098c-scripts\") pod \"cinder-db-sync-kz6b9\" (UID: \"866c8e6b-3fdd-442c-98d4-cf44b6ef098c\") " pod="openstack/cinder-db-sync-kz6b9" Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.825979 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/866c8e6b-3fdd-442c-98d4-cf44b6ef098c-combined-ca-bundle\") pod \"cinder-db-sync-kz6b9\" (UID: \"866c8e6b-3fdd-442c-98d4-cf44b6ef098c\") " pod="openstack/cinder-db-sync-kz6b9" Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.826013 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/866c8e6b-3fdd-442c-98d4-cf44b6ef098c-db-sync-config-data\") pod \"cinder-db-sync-kz6b9\" (UID: \"866c8e6b-3fdd-442c-98d4-cf44b6ef098c\") " pod="openstack/cinder-db-sync-kz6b9" Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.841037 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/493335aa-ea8d-4197-8c8b-af186d99f4aa-kube-api-access-d2p9w" (OuterVolumeSpecName: "kube-api-access-d2p9w") pod "493335aa-ea8d-4197-8c8b-af186d99f4aa" (UID: "493335aa-ea8d-4197-8c8b-af186d99f4aa"). InnerVolumeSpecName "kube-api-access-d2p9w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.897125 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/493335aa-ea8d-4197-8c8b-af186d99f4aa-config" (OuterVolumeSpecName: "config") pod "493335aa-ea8d-4197-8c8b-af186d99f4aa" (UID: "493335aa-ea8d-4197-8c8b-af186d99f4aa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.907970 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/493335aa-ea8d-4197-8c8b-af186d99f4aa-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "493335aa-ea8d-4197-8c8b-af186d99f4aa" (UID: "493335aa-ea8d-4197-8c8b-af186d99f4aa"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.908421 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/493335aa-ea8d-4197-8c8b-af186d99f4aa-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "493335aa-ea8d-4197-8c8b-af186d99f4aa" (UID: "493335aa-ea8d-4197-8c8b-af186d99f4aa"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.927517 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/866c8e6b-3fdd-442c-98d4-cf44b6ef098c-db-sync-config-data\") pod \"cinder-db-sync-kz6b9\" (UID: \"866c8e6b-3fdd-442c-98d4-cf44b6ef098c\") " pod="openstack/cinder-db-sync-kz6b9" Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.927586 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/866c8e6b-3fdd-442c-98d4-cf44b6ef098c-config-data\") pod \"cinder-db-sync-kz6b9\" (UID: \"866c8e6b-3fdd-442c-98d4-cf44b6ef098c\") " pod="openstack/cinder-db-sync-kz6b9" Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.927667 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/866c8e6b-3fdd-442c-98d4-cf44b6ef098c-etc-machine-id\") pod \"cinder-db-sync-kz6b9\" (UID: \"866c8e6b-3fdd-442c-98d4-cf44b6ef098c\") " pod="openstack/cinder-db-sync-kz6b9" Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.927690 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlc5k\" (UniqueName: \"kubernetes.io/projected/866c8e6b-3fdd-442c-98d4-cf44b6ef098c-kube-api-access-vlc5k\") pod \"cinder-db-sync-kz6b9\" (UID: \"866c8e6b-3fdd-442c-98d4-cf44b6ef098c\") " pod="openstack/cinder-db-sync-kz6b9" Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.927717 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/866c8e6b-3fdd-442c-98d4-cf44b6ef098c-scripts\") pod \"cinder-db-sync-kz6b9\" (UID: \"866c8e6b-3fdd-442c-98d4-cf44b6ef098c\") " pod="openstack/cinder-db-sync-kz6b9" Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.927733 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/866c8e6b-3fdd-442c-98d4-cf44b6ef098c-combined-ca-bundle\") pod \"cinder-db-sync-kz6b9\" (UID: 
\"866c8e6b-3fdd-442c-98d4-cf44b6ef098c\") " pod="openstack/cinder-db-sync-kz6b9" Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.927787 4962 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/493335aa-ea8d-4197-8c8b-af186d99f4aa-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.927799 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/493335aa-ea8d-4197-8c8b-af186d99f4aa-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.927810 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/493335aa-ea8d-4197-8c8b-af186d99f4aa-config\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.927819 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2p9w\" (UniqueName: \"kubernetes.io/projected/493335aa-ea8d-4197-8c8b-af186d99f4aa-kube-api-access-d2p9w\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.928250 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/866c8e6b-3fdd-442c-98d4-cf44b6ef098c-etc-machine-id\") pod \"cinder-db-sync-kz6b9\" (UID: \"866c8e6b-3fdd-442c-98d4-cf44b6ef098c\") " pod="openstack/cinder-db-sync-kz6b9" Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.931280 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/866c8e6b-3fdd-442c-98d4-cf44b6ef098c-db-sync-config-data\") pod \"cinder-db-sync-kz6b9\" (UID: \"866c8e6b-3fdd-442c-98d4-cf44b6ef098c\") " pod="openstack/cinder-db-sync-kz6b9" Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.931278 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/493335aa-ea8d-4197-8c8b-af186d99f4aa-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "493335aa-ea8d-4197-8c8b-af186d99f4aa" (UID: "493335aa-ea8d-4197-8c8b-af186d99f4aa"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.932514 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/866c8e6b-3fdd-442c-98d4-cf44b6ef098c-scripts\") pod \"cinder-db-sync-kz6b9\" (UID: \"866c8e6b-3fdd-442c-98d4-cf44b6ef098c\") " pod="openstack/cinder-db-sync-kz6b9" Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.939619 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/866c8e6b-3fdd-442c-98d4-cf44b6ef098c-config-data\") pod \"cinder-db-sync-kz6b9\" (UID: \"866c8e6b-3fdd-442c-98d4-cf44b6ef098c\") " pod="openstack/cinder-db-sync-kz6b9" Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.940393 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/866c8e6b-3fdd-442c-98d4-cf44b6ef098c-combined-ca-bundle\") pod \"cinder-db-sync-kz6b9\" (UID: \"866c8e6b-3fdd-442c-98d4-cf44b6ef098c\") " pod="openstack/cinder-db-sync-kz6b9" Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.945527 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlc5k\" (UniqueName: \"kubernetes.io/projected/866c8e6b-3fdd-442c-98d4-cf44b6ef098c-kube-api-access-vlc5k\") pod \"cinder-db-sync-kz6b9\" (UID: \"866c8e6b-3fdd-442c-98d4-cf44b6ef098c\") " pod="openstack/cinder-db-sync-kz6b9" Oct 03 13:12:58 crc kubenswrapper[4962]: I1003 13:12:58.949845 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/493335aa-ea8d-4197-8c8b-af186d99f4aa-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "493335aa-ea8d-4197-8c8b-af186d99f4aa" (UID: "493335aa-ea8d-4197-8c8b-af186d99f4aa"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:12:59 crc kubenswrapper[4962]: I1003 13:12:59.025361 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-gkbr2"] Oct 03 13:12:59 crc kubenswrapper[4962]: I1003 13:12:59.026793 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-gkbr2" Oct 03 13:12:59 crc kubenswrapper[4962]: I1003 13:12:59.028475 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 03 13:12:59 crc kubenswrapper[4962]: I1003 13:12:59.028691 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-q6pcf" Oct 03 13:12:59 crc kubenswrapper[4962]: I1003 13:12:59.029269 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/493335aa-ea8d-4197-8c8b-af186d99f4aa-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:59 crc kubenswrapper[4962]: I1003 13:12:59.029300 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/493335aa-ea8d-4197-8c8b-af186d99f4aa-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 13:12:59 crc kubenswrapper[4962]: I1003 13:12:59.029590 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 03 13:12:59 crc kubenswrapper[4962]: I1003 13:12:59.035014 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-gkbr2"] Oct 03 13:12:59 crc kubenswrapper[4962]: I1003 13:12:59.048906 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-kz6b9" Oct 03 13:12:59 crc kubenswrapper[4962]: I1003 13:12:59.133193 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjzq6\" (UniqueName: \"kubernetes.io/projected/280fc068-9a62-474f-a81f-fc5a28a7e722-kube-api-access-kjzq6\") pod \"neutron-db-sync-gkbr2\" (UID: \"280fc068-9a62-474f-a81f-fc5a28a7e722\") " pod="openstack/neutron-db-sync-gkbr2" Oct 03 13:12:59 crc kubenswrapper[4962]: I1003 13:12:59.133291 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/280fc068-9a62-474f-a81f-fc5a28a7e722-config\") pod \"neutron-db-sync-gkbr2\" (UID: \"280fc068-9a62-474f-a81f-fc5a28a7e722\") " pod="openstack/neutron-db-sync-gkbr2" Oct 03 13:12:59 crc kubenswrapper[4962]: I1003 13:12:59.133343 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/280fc068-9a62-474f-a81f-fc5a28a7e722-combined-ca-bundle\") pod \"neutron-db-sync-gkbr2\" (UID: \"280fc068-9a62-474f-a81f-fc5a28a7e722\") " pod="openstack/neutron-db-sync-gkbr2" Oct 03 13:12:59 crc kubenswrapper[4962]: I1003 13:12:59.164443 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-t8t49" event={"ID":"493335aa-ea8d-4197-8c8b-af186d99f4aa","Type":"ContainerDied","Data":"cff96cacf42345f16b0ae68dc79c84fef7445cba2d2d43c3cc83a5161837f962"} Oct 03 13:12:59 crc kubenswrapper[4962]: I1003 13:12:59.164455 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-t8t49" Oct 03 13:12:59 crc kubenswrapper[4962]: I1003 13:12:59.164542 4962 scope.go:117] "RemoveContainer" containerID="4dca5aa90c57dbb61b1e0f0cfad418d1058fe0f35307c84e272b53f55c58252e" Oct 03 13:12:59 crc kubenswrapper[4962]: I1003 13:12:59.170932 4962 generic.go:334] "Generic (PLEG): container finished" podID="29c86a63-d2f6-4f22-9d04-d6128fa7c31a" containerID="dd4e722d7156a1a3702c8352cc07fe685ce94d9af20fd5e713aa19eeae622754" exitCode=0 Oct 03 13:12:59 crc kubenswrapper[4962]: I1003 13:12:59.170981 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2865v" event={"ID":"29c86a63-d2f6-4f22-9d04-d6128fa7c31a","Type":"ContainerDied","Data":"dd4e722d7156a1a3702c8352cc07fe685ce94d9af20fd5e713aa19eeae622754"} Oct 03 13:12:59 crc kubenswrapper[4962]: I1003 13:12:59.179201 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8ef04f91-585b-4d7a-8847-c83ec719fa7f","Type":"ContainerStarted","Data":"f738f09dad1b33525cc99c16d86a7af3e15f5062109ba5f976f205a990dd451d"} Oct 03 13:12:59 crc kubenswrapper[4962]: I1003 13:12:59.194261 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c36ae572-9009-4126-88fa-a27e232e4332","Type":"ContainerStarted","Data":"139f36e43c8b266edf0419b31506529f51836bdb805e787a62f450fc35dbfdd9"} Oct 03 13:12:59 crc kubenswrapper[4962]: I1003 13:12:59.225873 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-t8t49"] Oct 03 13:12:59 crc kubenswrapper[4962]: I1003 13:12:59.233266 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-t8t49"] Oct 03 13:12:59 crc kubenswrapper[4962]: I1003 13:12:59.234540 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjzq6\" (UniqueName: \"kubernetes.io/projected/280fc068-9a62-474f-a81f-fc5a28a7e722-kube-api-access-kjzq6\") pod \"neutron-db-sync-gkbr2\" (UID: \"280fc068-9a62-474f-a81f-fc5a28a7e722\") " pod="openstack/neutron-db-sync-gkbr2" Oct 03 13:12:59 crc kubenswrapper[4962]: I1003 13:12:59.234597 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/280fc068-9a62-474f-a81f-fc5a28a7e722-config\") pod \"neutron-db-sync-gkbr2\" (UID: \"280fc068-9a62-474f-a81f-fc5a28a7e722\") " pod="openstack/neutron-db-sync-gkbr2" Oct 03 13:12:59 crc kubenswrapper[4962]: I1003 13:12:59.234692 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/280fc068-9a62-474f-a81f-fc5a28a7e722-combined-ca-bundle\") pod \"neutron-db-sync-gkbr2\" (UID: \"280fc068-9a62-474f-a81f-fc5a28a7e722\") " pod="openstack/neutron-db-sync-gkbr2" Oct 03 13:12:59 crc kubenswrapper[4962]: I1003 13:12:59.239744 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/280fc068-9a62-474f-a81f-fc5a28a7e722-combined-ca-bundle\") pod \"neutron-db-sync-gkbr2\" (UID: \"280fc068-9a62-474f-a81f-fc5a28a7e722\") " pod="openstack/neutron-db-sync-gkbr2" Oct 03 13:12:59 crc kubenswrapper[4962]: I1003 13:12:59.241036 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/280fc068-9a62-474f-a81f-fc5a28a7e722-config\") pod \"neutron-db-sync-gkbr2\" (UID: 
\"280fc068-9a62-474f-a81f-fc5a28a7e722\") " pod="openstack/neutron-db-sync-gkbr2" Oct 03 13:12:59 crc kubenswrapper[4962]: I1003 13:12:59.253096 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjzq6\" (UniqueName: \"kubernetes.io/projected/280fc068-9a62-474f-a81f-fc5a28a7e722-kube-api-access-kjzq6\") pod \"neutron-db-sync-gkbr2\" (UID: \"280fc068-9a62-474f-a81f-fc5a28a7e722\") " pod="openstack/neutron-db-sync-gkbr2" Oct 03 13:12:59 crc kubenswrapper[4962]: I1003 13:12:59.376529 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-gkbr2" Oct 03 13:13:00 crc kubenswrapper[4962]: I1003 13:13:00.237739 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="493335aa-ea8d-4197-8c8b-af186d99f4aa" path="/var/lib/kubelet/pods/493335aa-ea8d-4197-8c8b-af186d99f4aa/volumes" Oct 03 13:13:01 crc kubenswrapper[4962]: I1003 13:13:01.215399 4962 generic.go:334] "Generic (PLEG): container finished" podID="8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693" containerID="46e6da82e57b34e2bc7a9d03f73558235f6511a392ce8f5cec53b37074e329af" exitCode=0 Oct 03 13:13:01 crc kubenswrapper[4962]: I1003 13:13:01.215591 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4fvqx" event={"ID":"8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693","Type":"ContainerDied","Data":"46e6da82e57b34e2bc7a9d03f73558235f6511a392ce8f5cec53b37074e329af"} Oct 03 13:13:02 crc kubenswrapper[4962]: I1003 13:13:02.240678 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8ef04f91-585b-4d7a-8847-c83ec719fa7f","Type":"ContainerStarted","Data":"a00fe3f1a213a69f33c828fec39d72830ee8291769de616332e0a9feff2f1c66"} Oct 03 13:13:03 crc kubenswrapper[4962]: I1003 13:13:03.867134 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-2865v" Oct 03 13:13:03 crc kubenswrapper[4962]: I1003 13:13:03.878987 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-4fvqx" Oct 03 13:13:03 crc kubenswrapper[4962]: I1003 13:13:03.890505 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.8904859 podStartE2EDuration="7.8904859s" podCreationTimestamp="2025-10-03 13:12:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:13:02.382831134 +0000 UTC m=+1390.786728979" watchObservedRunningTime="2025-10-03 13:13:03.8904859 +0000 UTC m=+1392.294383735" Oct 03 13:13:03 crc kubenswrapper[4962]: I1003 13:13:03.935109 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5ggj\" (UniqueName: \"kubernetes.io/projected/29c86a63-d2f6-4f22-9d04-d6128fa7c31a-kube-api-access-z5ggj\") pod \"29c86a63-d2f6-4f22-9d04-d6128fa7c31a\" (UID: \"29c86a63-d2f6-4f22-9d04-d6128fa7c31a\") " Oct 03 13:13:03 crc kubenswrapper[4962]: I1003 13:13:03.935160 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693-config-data\") pod \"8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693\" (UID: \"8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693\") " Oct 03 13:13:03 crc kubenswrapper[4962]: I1003 13:13:03.935213 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693-credential-keys\") pod \"8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693\" (UID: \"8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693\") " Oct 03 13:13:03 crc kubenswrapper[4962]: I1003 13:13:03.935241 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29c86a63-d2f6-4f22-9d04-d6128fa7c31a-logs\") pod \"29c86a63-d2f6-4f22-9d04-d6128fa7c31a\" (UID: \"29c86a63-d2f6-4f22-9d04-d6128fa7c31a\") " Oct 03 13:13:03 crc kubenswrapper[4962]: I1003 13:13:03.935288 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29c86a63-d2f6-4f22-9d04-d6128fa7c31a-config-data\") pod \"29c86a63-d2f6-4f22-9d04-d6128fa7c31a\" (UID: \"29c86a63-d2f6-4f22-9d04-d6128fa7c31a\") " Oct 03 13:13:03 crc kubenswrapper[4962]: I1003 13:13:03.935313 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29c86a63-d2f6-4f22-9d04-d6128fa7c31a-scripts\") pod \"29c86a63-d2f6-4f22-9d04-d6128fa7c31a\" (UID: \"29c86a63-d2f6-4f22-9d04-d6128fa7c31a\") " Oct 03 13:13:03 crc kubenswrapper[4962]: I1003 13:13:03.935335 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v846c\" (UniqueName: \"kubernetes.io/projected/8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693-kube-api-access-v846c\") pod \"8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693\" (UID: \"8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693\") " Oct 03 13:13:03 crc kubenswrapper[4962]: I1003 13:13:03.935355 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693-scripts\") pod \"8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693\" (UID: \"8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693\") " Oct 03 13:13:03 crc kubenswrapper[4962]: I1003 13:13:03.935385 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693-combined-ca-bundle\") pod \"8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693\" (UID: \"8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693\") " Oct 03 13:13:03 crc kubenswrapper[4962]: I1003 13:13:03.935421 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29c86a63-d2f6-4f22-9d04-d6128fa7c31a-combined-ca-bundle\") pod \"29c86a63-d2f6-4f22-9d04-d6128fa7c31a\" (UID: \"29c86a63-d2f6-4f22-9d04-d6128fa7c31a\") " Oct 03 13:13:03 crc kubenswrapper[4962]: I1003 13:13:03.935460 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693-fernet-keys\") pod \"8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693\" (UID: \"8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693\") " Oct 03 13:13:03 crc kubenswrapper[4962]: I1003 13:13:03.935623 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29c86a63-d2f6-4f22-9d04-d6128fa7c31a-logs" (OuterVolumeSpecName: "logs") pod "29c86a63-d2f6-4f22-9d04-d6128fa7c31a" (UID: "29c86a63-d2f6-4f22-9d04-d6128fa7c31a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:13:03 crc kubenswrapper[4962]: I1003 13:13:03.936009 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29c86a63-d2f6-4f22-9d04-d6128fa7c31a-logs\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:03 crc kubenswrapper[4962]: I1003 13:13:03.942244 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693" (UID: "8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:13:03 crc kubenswrapper[4962]: I1003 13:13:03.947007 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693-kube-api-access-v846c" (OuterVolumeSpecName: "kube-api-access-v846c") pod "8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693" (UID: "8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693"). InnerVolumeSpecName "kube-api-access-v846c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:13:03 crc kubenswrapper[4962]: I1003 13:13:03.947060 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29c86a63-d2f6-4f22-9d04-d6128fa7c31a-scripts" (OuterVolumeSpecName: "scripts") pod "29c86a63-d2f6-4f22-9d04-d6128fa7c31a" (UID: "29c86a63-d2f6-4f22-9d04-d6128fa7c31a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:13:03 crc kubenswrapper[4962]: I1003 13:13:03.947139 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693" (UID: "8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:13:03 crc kubenswrapper[4962]: I1003 13:13:03.948543 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29c86a63-d2f6-4f22-9d04-d6128fa7c31a-kube-api-access-z5ggj" (OuterVolumeSpecName: "kube-api-access-z5ggj") pod "29c86a63-d2f6-4f22-9d04-d6128fa7c31a" (UID: "29c86a63-d2f6-4f22-9d04-d6128fa7c31a"). InnerVolumeSpecName "kube-api-access-z5ggj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:13:03 crc kubenswrapper[4962]: I1003 13:13:03.955085 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693-scripts" (OuterVolumeSpecName: "scripts") pod "8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693" (UID: "8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:13:03 crc kubenswrapper[4962]: I1003 13:13:03.965106 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693-config-data" (OuterVolumeSpecName: "config-data") pod "8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693" (UID: "8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:13:03 crc kubenswrapper[4962]: I1003 13:13:03.971315 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29c86a63-d2f6-4f22-9d04-d6128fa7c31a-config-data" (OuterVolumeSpecName: "config-data") pod "29c86a63-d2f6-4f22-9d04-d6128fa7c31a" (UID: "29c86a63-d2f6-4f22-9d04-d6128fa7c31a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:13:03 crc kubenswrapper[4962]: I1003 13:13:03.972110 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693" (UID: "8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:13:03 crc kubenswrapper[4962]: I1003 13:13:03.982120 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29c86a63-d2f6-4f22-9d04-d6128fa7c31a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29c86a63-d2f6-4f22-9d04-d6128fa7c31a" (UID: "29c86a63-d2f6-4f22-9d04-d6128fa7c31a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:13:04 crc kubenswrapper[4962]: I1003 13:13:04.038160 4962 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:04 crc kubenswrapper[4962]: I1003 13:13:04.038196 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5ggj\" (UniqueName: \"kubernetes.io/projected/29c86a63-d2f6-4f22-9d04-d6128fa7c31a-kube-api-access-z5ggj\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:04 crc kubenswrapper[4962]: I1003 13:13:04.038207 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:04 crc kubenswrapper[4962]: I1003 13:13:04.038216 4962 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:04 crc kubenswrapper[4962]: I1003 13:13:04.038225 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29c86a63-d2f6-4f22-9d04-d6128fa7c31a-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:04 crc kubenswrapper[4962]: I1003 13:13:04.038233 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29c86a63-d2f6-4f22-9d04-d6128fa7c31a-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:04 crc kubenswrapper[4962]: I1003 13:13:04.038242 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v846c\" (UniqueName: \"kubernetes.io/projected/8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693-kube-api-access-v846c\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:04 crc kubenswrapper[4962]: I1003 13:13:04.038249 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:04 crc kubenswrapper[4962]: I1003 13:13:04.038258 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:04 crc kubenswrapper[4962]: I1003 13:13:04.038265 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29c86a63-d2f6-4f22-9d04-d6128fa7c31a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:04 crc kubenswrapper[4962]: I1003 13:13:04.256759 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-4fvqx" Oct 03 13:13:04 crc kubenswrapper[4962]: I1003 13:13:04.256762 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4fvqx" event={"ID":"8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693","Type":"ContainerDied","Data":"1a5fa6bab2c349a7c7cda213ca1256a9045ab179d48b3a17b59cbfa6cfaeeeab"} Oct 03 13:13:04 crc kubenswrapper[4962]: I1003 13:13:04.256818 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a5fa6bab2c349a7c7cda213ca1256a9045ab179d48b3a17b59cbfa6cfaeeeab" Oct 03 13:13:04 crc kubenswrapper[4962]: I1003 13:13:04.259154 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-2865v" Oct 03 13:13:04 crc kubenswrapper[4962]: I1003 13:13:04.259385 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2865v" event={"ID":"29c86a63-d2f6-4f22-9d04-d6128fa7c31a","Type":"ContainerDied","Data":"bf170b4d9b1b95848bfce23a5ebadb26e1f2453031bea79ef5a4e04ab63199da"} Oct 03 13:13:04 crc kubenswrapper[4962]: I1003 13:13:04.259421 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf170b4d9b1b95848bfce23a5ebadb26e1f2453031bea79ef5a4e04ab63199da" Oct 03 13:13:04 crc kubenswrapper[4962]: I1003 13:13:04.262001 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c36ae572-9009-4126-88fa-a27e232e4332","Type":"ContainerStarted","Data":"d0cd9caf8e6e9471128d0623d8a1ced72cffa3d1a6deef4c224f22b7f1b042d3"} Oct 03 13:13:04 crc kubenswrapper[4962]: I1003 13:13:04.294094 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=8.2940748 podStartE2EDuration="8.2940748s" podCreationTimestamp="2025-10-03 13:12:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:13:04.28847501 +0000 UTC m=+1392.692372855" watchObservedRunningTime="2025-10-03 13:13:04.2940748 +0000 UTC m=+1392.697972635" Oct 03 13:13:04 crc kubenswrapper[4962]: I1003 13:13:04.959515 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6456949cf6-r4n9q"] Oct 03 13:13:04 crc kubenswrapper[4962]: E1003 13:13:04.960166 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693" containerName="keystone-bootstrap" Oct 03 13:13:04 crc kubenswrapper[4962]: I1003 13:13:04.960182 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693" containerName="keystone-bootstrap" Oct 03 13:13:04 crc kubenswrapper[4962]: E1003 13:13:04.960205 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29c86a63-d2f6-4f22-9d04-d6128fa7c31a" containerName="placement-db-sync" Oct 03 13:13:04 crc kubenswrapper[4962]: I1003 13:13:04.960213 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="29c86a63-d2f6-4f22-9d04-d6128fa7c31a" containerName="placement-db-sync" Oct 03 13:13:04 crc kubenswrapper[4962]: I1003 13:13:04.960407 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693" containerName="keystone-bootstrap" Oct 03 13:13:04 crc kubenswrapper[4962]: I1003 13:13:04.960432 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="29c86a63-d2f6-4f22-9d04-d6128fa7c31a" containerName="placement-db-sync" Oct 03 13:13:04 crc kubenswrapper[4962]: I1003 13:13:04.964493 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6456949cf6-r4n9q" Oct 03 13:13:04 crc kubenswrapper[4962]: I1003 13:13:04.969621 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-cmqqg" Oct 03 13:13:04 crc kubenswrapper[4962]: I1003 13:13:04.969791 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 03 13:13:04 crc kubenswrapper[4962]: I1003 13:13:04.971276 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 03 13:13:04 crc kubenswrapper[4962]: I1003 13:13:04.971420 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 03 13:13:04 crc kubenswrapper[4962]: I1003 13:13:04.971518 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 03 13:13:04 crc kubenswrapper[4962]: I1003 13:13:04.976865 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6456949cf6-r4n9q"] Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.041111 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-58d67d97c8-pnjp8"] Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.042202 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-58d67d97c8-pnjp8" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.052123 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.052147 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.052423 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.052468 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.052625 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.052750 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rbx44" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.054205 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-58d67d97c8-pnjp8"] Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.163972 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1289d443-56d2-4f63-8802-66bcd0569b3b-scripts\") pod \"placement-6456949cf6-r4n9q\" (UID: \"1289d443-56d2-4f63-8802-66bcd0569b3b\") " pod="openstack/placement-6456949cf6-r4n9q" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.164068 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a6cba65d-0ae5-4a81-88c1-da4e07d7a803-fernet-keys\") pod \"keystone-58d67d97c8-pnjp8\" (UID: \"a6cba65d-0ae5-4a81-88c1-da4e07d7a803\") " pod="openstack/keystone-58d67d97c8-pnjp8" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.164099 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/a6cba65d-0ae5-4a81-88c1-da4e07d7a803-internal-tls-certs\") pod \"keystone-58d67d97c8-pnjp8\" (UID: \"a6cba65d-0ae5-4a81-88c1-da4e07d7a803\") " pod="openstack/keystone-58d67d97c8-pnjp8" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.164124 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvkr2\" (UniqueName: \"kubernetes.io/projected/a6cba65d-0ae5-4a81-88c1-da4e07d7a803-kube-api-access-nvkr2\") pod \"keystone-58d67d97c8-pnjp8\" (UID: \"a6cba65d-0ae5-4a81-88c1-da4e07d7a803\") " pod="openstack/keystone-58d67d97c8-pnjp8" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.164161 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6cba65d-0ae5-4a81-88c1-da4e07d7a803-public-tls-certs\") pod \"keystone-58d67d97c8-pnjp8\" (UID: \"a6cba65d-0ae5-4a81-88c1-da4e07d7a803\") " pod="openstack/keystone-58d67d97c8-pnjp8" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.164188 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvhjw\" (UniqueName: \"kubernetes.io/projected/1289d443-56d2-4f63-8802-66bcd0569b3b-kube-api-access-pvhjw\") pod \"placement-6456949cf6-r4n9q\" (UID: \"1289d443-56d2-4f63-8802-66bcd0569b3b\") " pod="openstack/placement-6456949cf6-r4n9q" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.164232 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1289d443-56d2-4f63-8802-66bcd0569b3b-public-tls-certs\") pod \"placement-6456949cf6-r4n9q\" (UID: \"1289d443-56d2-4f63-8802-66bcd0569b3b\") " pod="openstack/placement-6456949cf6-r4n9q" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.164256 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6cba65d-0ae5-4a81-88c1-da4e07d7a803-config-data\") pod \"keystone-58d67d97c8-pnjp8\" (UID: \"a6cba65d-0ae5-4a81-88c1-da4e07d7a803\") " pod="openstack/keystone-58d67d97c8-pnjp8" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.164296 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1289d443-56d2-4f63-8802-66bcd0569b3b-logs\") pod \"placement-6456949cf6-r4n9q\" (UID: \"1289d443-56d2-4f63-8802-66bcd0569b3b\") " pod="openstack/placement-6456949cf6-r4n9q" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.164333 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6cba65d-0ae5-4a81-88c1-da4e07d7a803-combined-ca-bundle\") pod \"keystone-58d67d97c8-pnjp8\" (UID: \"a6cba65d-0ae5-4a81-88c1-da4e07d7a803\") " pod="openstack/keystone-58d67d97c8-pnjp8" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.164359 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1289d443-56d2-4f63-8802-66bcd0569b3b-combined-ca-bundle\") pod \"placement-6456949cf6-r4n9q\" (UID: \"1289d443-56d2-4f63-8802-66bcd0569b3b\") " pod="openstack/placement-6456949cf6-r4n9q" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.164381 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1289d443-56d2-4f63-8802-66bcd0569b3b-internal-tls-certs\") pod \"placement-6456949cf6-r4n9q\" (UID: \"1289d443-56d2-4f63-8802-66bcd0569b3b\") " pod="openstack/placement-6456949cf6-r4n9q" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.164401 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1289d443-56d2-4f63-8802-66bcd0569b3b-config-data\") pod \"placement-6456949cf6-r4n9q\" (UID: \"1289d443-56d2-4f63-8802-66bcd0569b3b\") " pod="openstack/placement-6456949cf6-r4n9q" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.164420 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a6cba65d-0ae5-4a81-88c1-da4e07d7a803-credential-keys\") pod \"keystone-58d67d97c8-pnjp8\" (UID: \"a6cba65d-0ae5-4a81-88c1-da4e07d7a803\") " pod="openstack/keystone-58d67d97c8-pnjp8" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.164439 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6cba65d-0ae5-4a81-88c1-da4e07d7a803-scripts\") pod \"keystone-58d67d97c8-pnjp8\" (UID: \"a6cba65d-0ae5-4a81-88c1-da4e07d7a803\") " pod="openstack/keystone-58d67d97c8-pnjp8" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.266227 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1289d443-56d2-4f63-8802-66bcd0569b3b-logs\") pod \"placement-6456949cf6-r4n9q\" (UID: \"1289d443-56d2-4f63-8802-66bcd0569b3b\") " pod="openstack/placement-6456949cf6-r4n9q" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.266286 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6cba65d-0ae5-4a81-88c1-da4e07d7a803-combined-ca-bundle\") pod \"keystone-58d67d97c8-pnjp8\" (UID: \"a6cba65d-0ae5-4a81-88c1-da4e07d7a803\") " pod="openstack/keystone-58d67d97c8-pnjp8" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.266315 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1289d443-56d2-4f63-8802-66bcd0569b3b-combined-ca-bundle\") pod \"placement-6456949cf6-r4n9q\" (UID: \"1289d443-56d2-4f63-8802-66bcd0569b3b\") " pod="openstack/placement-6456949cf6-r4n9q" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.266343 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1289d443-56d2-4f63-8802-66bcd0569b3b-internal-tls-certs\") pod \"placement-6456949cf6-r4n9q\" (UID: \"1289d443-56d2-4f63-8802-66bcd0569b3b\") " pod="openstack/placement-6456949cf6-r4n9q" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.266367 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1289d443-56d2-4f63-8802-66bcd0569b3b-config-data\") pod \"placement-6456949cf6-r4n9q\" (UID: \"1289d443-56d2-4f63-8802-66bcd0569b3b\") " pod="openstack/placement-6456949cf6-r4n9q" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.266393 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/a6cba65d-0ae5-4a81-88c1-da4e07d7a803-credential-keys\") pod \"keystone-58d67d97c8-pnjp8\" (UID: \"a6cba65d-0ae5-4a81-88c1-da4e07d7a803\") " pod="openstack/keystone-58d67d97c8-pnjp8" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.266419 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6cba65d-0ae5-4a81-88c1-da4e07d7a803-scripts\") pod \"keystone-58d67d97c8-pnjp8\" (UID: \"a6cba65d-0ae5-4a81-88c1-da4e07d7a803\") " pod="openstack/keystone-58d67d97c8-pnjp8" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.266482 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1289d443-56d2-4f63-8802-66bcd0569b3b-scripts\") pod \"placement-6456949cf6-r4n9q\" (UID: \"1289d443-56d2-4f63-8802-66bcd0569b3b\") " pod="openstack/placement-6456949cf6-r4n9q" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.266528 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a6cba65d-0ae5-4a81-88c1-da4e07d7a803-fernet-keys\") pod \"keystone-58d67d97c8-pnjp8\" (UID: \"a6cba65d-0ae5-4a81-88c1-da4e07d7a803\") " pod="openstack/keystone-58d67d97c8-pnjp8" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.266550 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6cba65d-0ae5-4a81-88c1-da4e07d7a803-internal-tls-certs\") pod \"keystone-58d67d97c8-pnjp8\" (UID: \"a6cba65d-0ae5-4a81-88c1-da4e07d7a803\") " pod="openstack/keystone-58d67d97c8-pnjp8" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.266567 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvkr2\" (UniqueName: \"kubernetes.io/projected/a6cba65d-0ae5-4a81-88c1-da4e07d7a803-kube-api-access-nvkr2\") pod \"keystone-58d67d97c8-pnjp8\" (UID: \"a6cba65d-0ae5-4a81-88c1-da4e07d7a803\") " pod="openstack/keystone-58d67d97c8-pnjp8" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.266595 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6cba65d-0ae5-4a81-88c1-da4e07d7a803-public-tls-certs\") pod \"keystone-58d67d97c8-pnjp8\" (UID: \"a6cba65d-0ae5-4a81-88c1-da4e07d7a803\") " pod="openstack/keystone-58d67d97c8-pnjp8" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.266612 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvhjw\" (UniqueName: \"kubernetes.io/projected/1289d443-56d2-4f63-8802-66bcd0569b3b-kube-api-access-pvhjw\") pod \"placement-6456949cf6-r4n9q\" (UID: \"1289d443-56d2-4f63-8802-66bcd0569b3b\") " pod="openstack/placement-6456949cf6-r4n9q" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.266661 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1289d443-56d2-4f63-8802-66bcd0569b3b-public-tls-certs\") pod \"placement-6456949cf6-r4n9q\" (UID: \"1289d443-56d2-4f63-8802-66bcd0569b3b\") " pod="openstack/placement-6456949cf6-r4n9q" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.266681 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6cba65d-0ae5-4a81-88c1-da4e07d7a803-config-data\") pod 
\"keystone-58d67d97c8-pnjp8\" (UID: \"a6cba65d-0ae5-4a81-88c1-da4e07d7a803\") " pod="openstack/keystone-58d67d97c8-pnjp8" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.266700 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1289d443-56d2-4f63-8802-66bcd0569b3b-logs\") pod \"placement-6456949cf6-r4n9q\" (UID: \"1289d443-56d2-4f63-8802-66bcd0569b3b\") " pod="openstack/placement-6456949cf6-r4n9q" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.272141 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1289d443-56d2-4f63-8802-66bcd0569b3b-public-tls-certs\") pod \"placement-6456949cf6-r4n9q\" (UID: \"1289d443-56d2-4f63-8802-66bcd0569b3b\") " pod="openstack/placement-6456949cf6-r4n9q" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.273104 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1289d443-56d2-4f63-8802-66bcd0569b3b-scripts\") pod \"placement-6456949cf6-r4n9q\" (UID: \"1289d443-56d2-4f63-8802-66bcd0569b3b\") " pod="openstack/placement-6456949cf6-r4n9q" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.274958 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6cba65d-0ae5-4a81-88c1-da4e07d7a803-combined-ca-bundle\") pod \"keystone-58d67d97c8-pnjp8\" (UID: \"a6cba65d-0ae5-4a81-88c1-da4e07d7a803\") " pod="openstack/keystone-58d67d97c8-pnjp8" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.275052 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1289d443-56d2-4f63-8802-66bcd0569b3b-internal-tls-certs\") pod \"placement-6456949cf6-r4n9q\" (UID: \"1289d443-56d2-4f63-8802-66bcd0569b3b\") " pod="openstack/placement-6456949cf6-r4n9q" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.275453 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6cba65d-0ae5-4a81-88c1-da4e07d7a803-internal-tls-certs\") pod \"keystone-58d67d97c8-pnjp8\" (UID: \"a6cba65d-0ae5-4a81-88c1-da4e07d7a803\") " pod="openstack/keystone-58d67d97c8-pnjp8" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.276048 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6cba65d-0ae5-4a81-88c1-da4e07d7a803-scripts\") pod \"keystone-58d67d97c8-pnjp8\" (UID: \"a6cba65d-0ae5-4a81-88c1-da4e07d7a803\") " pod="openstack/keystone-58d67d97c8-pnjp8" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.276064 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1289d443-56d2-4f63-8802-66bcd0569b3b-combined-ca-bundle\") pod \"placement-6456949cf6-r4n9q\" (UID: \"1289d443-56d2-4f63-8802-66bcd0569b3b\") " pod="openstack/placement-6456949cf6-r4n9q" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.276271 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6cba65d-0ae5-4a81-88c1-da4e07d7a803-public-tls-certs\") pod \"keystone-58d67d97c8-pnjp8\" (UID: \"a6cba65d-0ae5-4a81-88c1-da4e07d7a803\") " pod="openstack/keystone-58d67d97c8-pnjp8" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.276550 4962 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1289d443-56d2-4f63-8802-66bcd0569b3b-config-data\") pod \"placement-6456949cf6-r4n9q\" (UID: \"1289d443-56d2-4f63-8802-66bcd0569b3b\") " pod="openstack/placement-6456949cf6-r4n9q" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.276713 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a6cba65d-0ae5-4a81-88c1-da4e07d7a803-credential-keys\") pod \"keystone-58d67d97c8-pnjp8\" (UID: \"a6cba65d-0ae5-4a81-88c1-da4e07d7a803\") " pod="openstack/keystone-58d67d97c8-pnjp8" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.277804 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a6cba65d-0ae5-4a81-88c1-da4e07d7a803-fernet-keys\") pod \"keystone-58d67d97c8-pnjp8\" (UID: \"a6cba65d-0ae5-4a81-88c1-da4e07d7a803\") " pod="openstack/keystone-58d67d97c8-pnjp8" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.286755 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6cba65d-0ae5-4a81-88c1-da4e07d7a803-config-data\") pod \"keystone-58d67d97c8-pnjp8\" (UID: \"a6cba65d-0ae5-4a81-88c1-da4e07d7a803\") " pod="openstack/keystone-58d67d97c8-pnjp8" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.292967 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvkr2\" (UniqueName: \"kubernetes.io/projected/a6cba65d-0ae5-4a81-88c1-da4e07d7a803-kube-api-access-nvkr2\") pod \"keystone-58d67d97c8-pnjp8\" (UID: \"a6cba65d-0ae5-4a81-88c1-da4e07d7a803\") " pod="openstack/keystone-58d67d97c8-pnjp8" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.297343 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvhjw\" (UniqueName: \"kubernetes.io/projected/1289d443-56d2-4f63-8802-66bcd0569b3b-kube-api-access-pvhjw\") pod \"placement-6456949cf6-r4n9q\" (UID: \"1289d443-56d2-4f63-8802-66bcd0569b3b\") " pod="openstack/placement-6456949cf6-r4n9q" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.367845 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-58d67d97c8-pnjp8" Oct 03 13:13:05 crc kubenswrapper[4962]: I1003 13:13:05.584279 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6456949cf6-r4n9q" Oct 03 13:13:06 crc kubenswrapper[4962]: I1003 13:13:06.149312 4962 scope.go:117] "RemoveContainer" containerID="a39cb5e7e2e55905be6936881fda8317e48fd44199a916066642512d2832a221" Oct 03 13:13:06 crc kubenswrapper[4962]: I1003 13:13:06.779451 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-gkbr2"] Oct 03 13:13:06 crc kubenswrapper[4962]: I1003 13:13:06.785251 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 03 13:13:06 crc kubenswrapper[4962]: I1003 13:13:06.785298 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 03 13:13:06 crc kubenswrapper[4962]: I1003 13:13:06.787354 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-kz6b9"] Oct 03 13:13:06 crc kubenswrapper[4962]: I1003 13:13:06.797845 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 03 13:13:06 crc kubenswrapper[4962]: I1003 13:13:06.797891 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 03 13:13:06 crc kubenswrapper[4962]: I1003 13:13:06.829029 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 03 13:13:06 crc kubenswrapper[4962]: I1003 13:13:06.829611 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 03 13:13:06 crc kubenswrapper[4962]: I1003 13:13:06.834095 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 03 13:13:06 crc kubenswrapper[4962]: I1003 13:13:06.868658 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 03 13:13:06 crc kubenswrapper[4962]: I1003 13:13:06.909865 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-58d67d97c8-pnjp8"] Oct 03 13:13:06 crc kubenswrapper[4962]: I1003 13:13:06.920493 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6456949cf6-r4n9q"] Oct 03 13:13:06 crc kubenswrapper[4962]: W1003 13:13:06.930846 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6cba65d_0ae5_4a81_88c1_da4e07d7a803.slice/crio-840fbb208720d4586b8c6a97e5617af496deb903b6b103ec0a9614b97aeeaeb9 WatchSource:0}: Error finding container 840fbb208720d4586b8c6a97e5617af496deb903b6b103ec0a9614b97aeeaeb9: Status 404 returned error can't find the container with id 840fbb208720d4586b8c6a97e5617af496deb903b6b103ec0a9614b97aeeaeb9 Oct 03 13:13:06 crc kubenswrapper[4962]: W1003 13:13:06.933882 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1289d443_56d2_4f63_8802_66bcd0569b3b.slice/crio-d91e78e7691d71bfdbb7e92f17ddbf0c1bacccfe253711a3a3db928ce9630620 WatchSource:0}: Error finding container d91e78e7691d71bfdbb7e92f17ddbf0c1bacccfe253711a3a3db928ce9630620: Status 404 returned error can't find the container with id d91e78e7691d71bfdbb7e92f17ddbf0c1bacccfe253711a3a3db928ce9630620 Oct 03 13:13:07 crc kubenswrapper[4962]: I1003 13:13:07.323612 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-58d67d97c8-pnjp8" event={"ID":"a6cba65d-0ae5-4a81-88c1-da4e07d7a803","Type":"ContainerStarted","Data":"3108a39b0723e787d2db8b185df6254591ba7ffd8691d08c951e273ac8405e51"} Oct 03 13:13:07 crc kubenswrapper[4962]: I1003 13:13:07.323983 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-58d67d97c8-pnjp8" Oct 03 13:13:07 crc kubenswrapper[4962]: I1003 13:13:07.323997 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-58d67d97c8-pnjp8" event={"ID":"a6cba65d-0ae5-4a81-88c1-da4e07d7a803","Type":"ContainerStarted","Data":"840fbb208720d4586b8c6a97e5617af496deb903b6b103ec0a9614b97aeeaeb9"} Oct 03 13:13:07 crc kubenswrapper[4962]: I1003 13:13:07.325166 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-2wz97" event={"ID":"22874ecf-641f-46a1-bbb5-4d27b38bf001","Type":"ContainerStarted","Data":"d17ebef49d6db3405f46c76620af3c4ad52867905601777304861ab2ba50d3a2"} Oct 03 13:13:07 crc kubenswrapper[4962]: I1003 13:13:07.327608 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c09e1e8d-9d15-4eef-a553-2af7c59998e3","Type":"ContainerStarted","Data":"2a9a5082cb69eea6f1efba2560d7c088a1572ce55dbf4bcf4ca90d2ed0560744"} Oct 03 13:13:07 crc kubenswrapper[4962]: I1003 13:13:07.328945 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-gkbr2" event={"ID":"280fc068-9a62-474f-a81f-fc5a28a7e722","Type":"ContainerStarted","Data":"0ef7a555af8ba3763ace041261009ed654126cd2f78762f2ab1c9bc4bf7072db"} Oct 03 13:13:07 crc kubenswrapper[4962]: I1003 13:13:07.328972 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-gkbr2" event={"ID":"280fc068-9a62-474f-a81f-fc5a28a7e722","Type":"ContainerStarted","Data":"1af960500f504b3d426cdf54816480888b94751f88b2beb5f99995f77cd833ce"} Oct 03 13:13:07 crc kubenswrapper[4962]: I1003 13:13:07.331097 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6456949cf6-r4n9q" event={"ID":"1289d443-56d2-4f63-8802-66bcd0569b3b","Type":"ContainerStarted","Data":"b92f6a632cc9c0dff9f450965eee31724d28f079d91c0b8852b080e2ed919e29"} Oct 03 13:13:07 crc kubenswrapper[4962]: I1003 13:13:07.331122 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6456949cf6-r4n9q" event={"ID":"1289d443-56d2-4f63-8802-66bcd0569b3b","Type":"ContainerStarted","Data":"e9b4cb84ef4c21a8595bf182936df461ef5cc7e4bb630f5cdc4490a12d404462"} Oct 03 13:13:07 crc kubenswrapper[4962]: I1003 13:13:07.331132 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6456949cf6-r4n9q" event={"ID":"1289d443-56d2-4f63-8802-66bcd0569b3b","Type":"ContainerStarted","Data":"d91e78e7691d71bfdbb7e92f17ddbf0c1bacccfe253711a3a3db928ce9630620"} Oct 03 13:13:07 crc kubenswrapper[4962]: I1003 13:13:07.331548 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6456949cf6-r4n9q" Oct 03 13:13:07 crc kubenswrapper[4962]: I1003 13:13:07.331576 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6456949cf6-r4n9q" Oct 03 13:13:07 crc kubenswrapper[4962]: I1003 13:13:07.332531 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-kz6b9" event={"ID":"866c8e6b-3fdd-442c-98d4-cf44b6ef098c","Type":"ContainerStarted","Data":"3950de5905a4dc7a8e1191cd18d0ca1f2417da16ba610af0e92e93c717cf82f4"} Oct 03 13:13:07 crc kubenswrapper[4962]: I1003 
Oct 03 13:13:07 crc kubenswrapper[4962]: I1003 13:13:07.333097 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Oct 03 13:13:07 crc kubenswrapper[4962]: I1003 13:13:07.333108 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Oct 03 13:13:07 crc kubenswrapper[4962]: I1003 13:13:07.333117 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Oct 03 13:13:07 crc kubenswrapper[4962]: I1003 13:13:07.346452 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-58d67d97c8-pnjp8" podStartSLOduration=2.346433466 podStartE2EDuration="2.346433466s" podCreationTimestamp="2025-10-03 13:13:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:13:07.343246511 +0000 UTC m=+1395.747144346" watchObservedRunningTime="2025-10-03 13:13:07.346433466 +0000 UTC m=+1395.750331301"
Oct 03 13:13:07 crc kubenswrapper[4962]: I1003 13:13:07.363346 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-2wz97" podStartSLOduration=4.312255048 podStartE2EDuration="14.36332881s" podCreationTimestamp="2025-10-03 13:12:53 +0000 UTC" firstStartedPulling="2025-10-03 13:12:56.162080004 +0000 UTC m=+1384.565977839" lastFinishedPulling="2025-10-03 13:13:06.213153766 +0000 UTC m=+1394.617051601" observedRunningTime="2025-10-03 13:13:07.359335043 +0000 UTC m=+1395.763232878" watchObservedRunningTime="2025-10-03 13:13:07.36332881 +0000 UTC m=+1395.767226645"
Oct 03 13:13:07 crc kubenswrapper[4962]: I1003 13:13:07.410816 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6456949cf6-r4n9q" podStartSLOduration=3.4107916449999998 podStartE2EDuration="3.410791645s" podCreationTimestamp="2025-10-03 13:13:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:13:07.392088743 +0000 UTC m=+1395.795986578" watchObservedRunningTime="2025-10-03 13:13:07.410791645 +0000 UTC m=+1395.814689480"
Oct 03 13:13:07 crc kubenswrapper[4962]: I1003 13:13:07.412968 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-gkbr2" podStartSLOduration=8.412958223 podStartE2EDuration="8.412958223s" podCreationTimestamp="2025-10-03 13:12:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:13:07.407970619 +0000 UTC m=+1395.811868454" watchObservedRunningTime="2025-10-03 13:13:07.412958223 +0000 UTC m=+1395.816856058"
Oct 03 13:13:09 crc kubenswrapper[4962]: I1003 13:13:09.444345 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Oct 03 13:13:09 crc kubenswrapper[4962]: I1003 13:13:09.446446 4962 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Oct 03 13:13:09 crc kubenswrapper[4962]: I1003 13:13:09.466666 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
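Each pod_startup_latency_tracker entry above reports two figures: podStartSLOduration, which excludes image-pull time, and the wall-clock podStartE2EDuration. For barbican-db-sync-2wz97 the roughly 10 s pull between firstStartedPulling and lastFinishedPulling accounts for the gap between 4.31 s and 14.36 s. A small sketch that extracts both fields, assuming only the key=value layout shown above:

```go
// Pull podStartSLOduration and podStartE2EDuration out of a
// pod_startup_latency_tracker line (assumed log format, not an
// official parser).
package main

import (
	"fmt"
	"regexp"
)

var durRe = regexp.MustCompile(`pod="([^"]+)" podStartSLOduration=([0-9.]+) podStartE2EDuration="([^"]+)"`)

func main() {
	line := `Observed pod startup duration" pod="openstack/barbican-db-sync-2wz97" podStartSLOduration=4.312255048 podStartE2EDuration="14.36332881s"`
	if m := durRe.FindStringSubmatch(line); m != nil {
		fmt.Printf("pod=%s slo=%ss e2e=%s\n", m[1], m[2], m[3])
	}
}
```

Oct 03 13:13:09 crc kubenswrapper[4962]: I1003 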
13:13:09.519438 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 03 13:13:11 crc kubenswrapper[4962]: I1003 13:13:11.055265 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xkrxj"] Oct 03 13:13:11 crc kubenswrapper[4962]: I1003 13:13:11.057863 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xkrxj" Oct 03 13:13:11 crc kubenswrapper[4962]: I1003 13:13:11.067695 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xkrxj"] Oct 03 13:13:11 crc kubenswrapper[4962]: I1003 13:13:11.107781 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx7rl\" (UniqueName: \"kubernetes.io/projected/f21bb1eb-c956-4d3f-b7e3-199749af43df-kube-api-access-bx7rl\") pod \"redhat-marketplace-xkrxj\" (UID: \"f21bb1eb-c956-4d3f-b7e3-199749af43df\") " pod="openshift-marketplace/redhat-marketplace-xkrxj" Oct 03 13:13:11 crc kubenswrapper[4962]: I1003 13:13:11.107896 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f21bb1eb-c956-4d3f-b7e3-199749af43df-utilities\") pod \"redhat-marketplace-xkrxj\" (UID: \"f21bb1eb-c956-4d3f-b7e3-199749af43df\") " pod="openshift-marketplace/redhat-marketplace-xkrxj" Oct 03 13:13:11 crc kubenswrapper[4962]: I1003 13:13:11.107933 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f21bb1eb-c956-4d3f-b7e3-199749af43df-catalog-content\") pod \"redhat-marketplace-xkrxj\" (UID: \"f21bb1eb-c956-4d3f-b7e3-199749af43df\") " pod="openshift-marketplace/redhat-marketplace-xkrxj" Oct 03 13:13:11 crc kubenswrapper[4962]: I1003 13:13:11.210058 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f21bb1eb-c956-4d3f-b7e3-199749af43df-utilities\") pod \"redhat-marketplace-xkrxj\" (UID: \"f21bb1eb-c956-4d3f-b7e3-199749af43df\") " pod="openshift-marketplace/redhat-marketplace-xkrxj" Oct 03 13:13:11 crc kubenswrapper[4962]: I1003 13:13:11.210153 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f21bb1eb-c956-4d3f-b7e3-199749af43df-catalog-content\") pod \"redhat-marketplace-xkrxj\" (UID: \"f21bb1eb-c956-4d3f-b7e3-199749af43df\") " pod="openshift-marketplace/redhat-marketplace-xkrxj" Oct 03 13:13:11 crc kubenswrapper[4962]: I1003 13:13:11.210236 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx7rl\" (UniqueName: \"kubernetes.io/projected/f21bb1eb-c956-4d3f-b7e3-199749af43df-kube-api-access-bx7rl\") pod \"redhat-marketplace-xkrxj\" (UID: \"f21bb1eb-c956-4d3f-b7e3-199749af43df\") " pod="openshift-marketplace/redhat-marketplace-xkrxj" Oct 03 13:13:11 crc kubenswrapper[4962]: I1003 13:13:11.210912 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f21bb1eb-c956-4d3f-b7e3-199749af43df-utilities\") pod \"redhat-marketplace-xkrxj\" (UID: \"f21bb1eb-c956-4d3f-b7e3-199749af43df\") " pod="openshift-marketplace/redhat-marketplace-xkrxj" Oct 03 13:13:11 crc kubenswrapper[4962]: I1003 13:13:11.212115 4962 
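operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f21bb1eb-c956-4d3f-b7e3-199749af43df-catalog-content\") pod \"redhat-marketplace-xkrxj\" (UID: \"f21bb1eb-c956-4d3f-b7e3-199749af43df\") " pod="openshift-marketplace/redhat-marketplace-xkrxj"

The reconciler sequence above runs VerifyControllerAttachedVolume, then MountVolume, then reports MountVolume.SetUp success for each of the pod's three volumes. Rebuilt from the plugin names in the log ("kubernetes.io/empty-dir", "kubernetes.io/projected") as k8s.io/api types, they look roughly like the sketch below; the service-account-token shape of the projected volume is an assumption:

```go
// Sketch of the volumes mounted for redhat-marketplace-xkrxj, inferred
// from the plugin names in the log entries above.
package main

import corev1 "k8s.io/api/core/v1"

var xkrxjVolumes = []corev1.Volume{
	{Name: "utilities", VolumeSource: corev1.VolumeSource{EmptyDir: &corev1.EmptyDirVolumeSource{}}},
	{Name: "catalog-content", VolumeSource: corev1.VolumeSource{EmptyDir: &corev1.EmptyDirVolumeSource{}}},
	{Name: "kube-api-access-bx7rl", VolumeSource: corev1.VolumeSource{
		// kube-api-access-* volumes are projected service-account tokens;
		// the exact projection sources are assumed here.
		Projected: &corev1.ProjectedVolumeSource{
			Sources: []corev1.VolumeProjection{
				{ServiceAccountToken: &corev1.ServiceAccountTokenProjection{Path: "token"}},
			},
		},
	}},
}

func main() {} // compile-only sketch
```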
Oct 03 13:13:11 crc kubenswrapper[4962]: I1003 13:13:11.246683 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx7rl\" (UniqueName: \"kubernetes.io/projected/f21bb1eb-c956-4d3f-b7e3-199749af43df-kube-api-access-bx7rl\") pod \"redhat-marketplace-xkrxj\" (UID: \"f21bb1eb-c956-4d3f-b7e3-199749af43df\") " pod="openshift-marketplace/redhat-marketplace-xkrxj"
Oct 03 13:13:11 crc kubenswrapper[4962]: I1003 13:13:11.396043 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xkrxj"
Oct 03 13:13:11 crc kubenswrapper[4962]: I1003 13:13:11.411903 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Oct 03 13:13:12 crc kubenswrapper[4962]: I1003 13:13:12.014949 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xkrxj"]
Oct 03 13:13:12 crc kubenswrapper[4962]: I1003 13:13:12.377761 4962 generic.go:334] "Generic (PLEG): container finished" podID="f21bb1eb-c956-4d3f-b7e3-199749af43df" containerID="78a1528336a19d142eec74acabd6b6c369b5cc5054582317455a473a9ae7c67f" exitCode=0
Oct 03 13:13:12 crc kubenswrapper[4962]: I1003 13:13:12.377853 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xkrxj" event={"ID":"f21bb1eb-c956-4d3f-b7e3-199749af43df","Type":"ContainerDied","Data":"78a1528336a19d142eec74acabd6b6c369b5cc5054582317455a473a9ae7c67f"}
Oct 03 13:13:12 crc kubenswrapper[4962]: I1003 13:13:12.377880 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xkrxj" event={"ID":"f21bb1eb-c956-4d3f-b7e3-199749af43df","Type":"ContainerStarted","Data":"279da3bc728492249f018561caed8f3cbeae5de438315df12f0b13cadb99712b"}
Oct 03 13:13:13 crc kubenswrapper[4962]: I1003 13:13:13.386548 4962 generic.go:334] "Generic (PLEG): container finished" podID="22874ecf-641f-46a1-bbb5-4d27b38bf001" containerID="d17ebef49d6db3405f46c76620af3c4ad52867905601777304861ab2ba50d3a2" exitCode=0
Oct 03 13:13:13 crc kubenswrapper[4962]: I1003 13:13:13.386837 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-2wz97" event={"ID":"22874ecf-641f-46a1-bbb5-4d27b38bf001","Type":"ContainerDied","Data":"d17ebef49d6db3405f46c76620af3c4ad52867905601777304861ab2ba50d3a2"}
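The event={...} payloads in the PLEG entries above are plain JSON. A small decoding sketch, assuming only the three fields visible in the log:

```go
// Decode one "SyncLoop (PLEG): event for pod" payload; the struct
// mirrors the fields shown in the log, nothing more.
package main

import (
	"encoding/json"
	"fmt"
)

type plegEvent struct {
	ID   string // pod UID
	Type string // e.g. ContainerStarted, ContainerDied
	Data string // container or sandbox ID
}

func main() {
	raw := `{"ID":"22874ecf-641f-46a1-bbb5-4d27b38bf001","Type":"ContainerDied","Data":"d17ebef49d6db3405f46c76620af3c4ad52867905601777304861ab2ba50d3a2"}`
	var ev plegEvent
	if err := json.Unmarshal([]byte(raw), &ev); err != nil {
		panic(err)
	}
	fmt.Printf("pod %s: %s (%s)\n", ev.ID, ev.Type, ev.Data[:12])
}
```

Oct 03 13:13:15 crc kubenswrapper[4962]: I1003 13:13:15.341105 4962 util.go:48] "No ready sandbox for pod can be found. 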
Need to start a new one" pod="openstack/barbican-db-sync-2wz97" Oct 03 13:13:15 crc kubenswrapper[4962]: I1003 13:13:15.384526 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28j99\" (UniqueName: \"kubernetes.io/projected/22874ecf-641f-46a1-bbb5-4d27b38bf001-kube-api-access-28j99\") pod \"22874ecf-641f-46a1-bbb5-4d27b38bf001\" (UID: \"22874ecf-641f-46a1-bbb5-4d27b38bf001\") " Oct 03 13:13:15 crc kubenswrapper[4962]: I1003 13:13:15.384771 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22874ecf-641f-46a1-bbb5-4d27b38bf001-combined-ca-bundle\") pod \"22874ecf-641f-46a1-bbb5-4d27b38bf001\" (UID: \"22874ecf-641f-46a1-bbb5-4d27b38bf001\") " Oct 03 13:13:15 crc kubenswrapper[4962]: I1003 13:13:15.384820 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/22874ecf-641f-46a1-bbb5-4d27b38bf001-db-sync-config-data\") pod \"22874ecf-641f-46a1-bbb5-4d27b38bf001\" (UID: \"22874ecf-641f-46a1-bbb5-4d27b38bf001\") " Oct 03 13:13:15 crc kubenswrapper[4962]: I1003 13:13:15.391251 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22874ecf-641f-46a1-bbb5-4d27b38bf001-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "22874ecf-641f-46a1-bbb5-4d27b38bf001" (UID: "22874ecf-641f-46a1-bbb5-4d27b38bf001"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:13:15 crc kubenswrapper[4962]: I1003 13:13:15.391800 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22874ecf-641f-46a1-bbb5-4d27b38bf001-kube-api-access-28j99" (OuterVolumeSpecName: "kube-api-access-28j99") pod "22874ecf-641f-46a1-bbb5-4d27b38bf001" (UID: "22874ecf-641f-46a1-bbb5-4d27b38bf001"). InnerVolumeSpecName "kube-api-access-28j99". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:13:15 crc kubenswrapper[4962]: I1003 13:13:15.406009 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-2wz97" Oct 03 13:13:15 crc kubenswrapper[4962]: I1003 13:13:15.405955 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-2wz97" event={"ID":"22874ecf-641f-46a1-bbb5-4d27b38bf001","Type":"ContainerDied","Data":"e79484f6c9ffa7b36c22c8e5b8e9340a243a48c41f6bd2eb2325c3acdd6cfc92"} Oct 03 13:13:15 crc kubenswrapper[4962]: I1003 13:13:15.406147 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e79484f6c9ffa7b36c22c8e5b8e9340a243a48c41f6bd2eb2325c3acdd6cfc92" Oct 03 13:13:15 crc kubenswrapper[4962]: I1003 13:13:15.422100 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22874ecf-641f-46a1-bbb5-4d27b38bf001-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "22874ecf-641f-46a1-bbb5-4d27b38bf001" (UID: "22874ecf-641f-46a1-bbb5-4d27b38bf001"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:13:15 crc kubenswrapper[4962]: I1003 13:13:15.487286 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22874ecf-641f-46a1-bbb5-4d27b38bf001-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:15 crc kubenswrapper[4962]: I1003 13:13:15.487319 4962 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/22874ecf-641f-46a1-bbb5-4d27b38bf001-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:15 crc kubenswrapper[4962]: I1003 13:13:15.487330 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28j99\" (UniqueName: \"kubernetes.io/projected/22874ecf-641f-46a1-bbb5-4d27b38bf001-kube-api-access-28j99\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:15 crc kubenswrapper[4962]: I1003 13:13:15.653782 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-77d888f4df-52rjc"] Oct 03 13:13:15 crc kubenswrapper[4962]: E1003 13:13:15.654111 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22874ecf-641f-46a1-bbb5-4d27b38bf001" containerName="barbican-db-sync" Oct 03 13:13:15 crc kubenswrapper[4962]: I1003 13:13:15.654127 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="22874ecf-641f-46a1-bbb5-4d27b38bf001" containerName="barbican-db-sync" Oct 03 13:13:15 crc kubenswrapper[4962]: I1003 13:13:15.654290 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="22874ecf-641f-46a1-bbb5-4d27b38bf001" containerName="barbican-db-sync" Oct 03 13:13:15 crc kubenswrapper[4962]: I1003 13:13:15.655365 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-77d888f4df-52rjc" Oct 03 13:13:15 crc kubenswrapper[4962]: I1003 13:13:15.659464 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 03 13:13:15 crc kubenswrapper[4962]: I1003 13:13:15.673250 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-77d888f4df-52rjc"] Oct 03 13:13:15 crc kubenswrapper[4962]: I1003 13:13:15.712145 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ecb3944-c441-4879-8220-aa32d7436c1f-logs\") pod \"barbican-worker-77d888f4df-52rjc\" (UID: \"2ecb3944-c441-4879-8220-aa32d7436c1f\") " pod="openstack/barbican-worker-77d888f4df-52rjc" Oct 03 13:13:15 crc kubenswrapper[4962]: I1003 13:13:15.712247 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ecb3944-c441-4879-8220-aa32d7436c1f-combined-ca-bundle\") pod \"barbican-worker-77d888f4df-52rjc\" (UID: \"2ecb3944-c441-4879-8220-aa32d7436c1f\") " pod="openstack/barbican-worker-77d888f4df-52rjc" Oct 03 13:13:15 crc kubenswrapper[4962]: I1003 13:13:15.712311 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgw4z\" (UniqueName: \"kubernetes.io/projected/2ecb3944-c441-4879-8220-aa32d7436c1f-kube-api-access-bgw4z\") pod \"barbican-worker-77d888f4df-52rjc\" (UID: \"2ecb3944-c441-4879-8220-aa32d7436c1f\") " pod="openstack/barbican-worker-77d888f4df-52rjc" Oct 03 13:13:15 crc kubenswrapper[4962]: I1003 13:13:15.712348 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2ecb3944-c441-4879-8220-aa32d7436c1f-config-data-custom\") pod \"barbican-worker-77d888f4df-52rjc\" (UID: \"2ecb3944-c441-4879-8220-aa32d7436c1f\") " pod="openstack/barbican-worker-77d888f4df-52rjc" Oct 03 13:13:15 crc kubenswrapper[4962]: I1003 13:13:15.712409 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ecb3944-c441-4879-8220-aa32d7436c1f-config-data\") pod \"barbican-worker-77d888f4df-52rjc\" (UID: \"2ecb3944-c441-4879-8220-aa32d7436c1f\") " pod="openstack/barbican-worker-77d888f4df-52rjc" Oct 03 13:13:15 crc kubenswrapper[4962]: I1003 13:13:15.792885 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59d5ff467f-tptdl"] Oct 03 13:13:15 crc kubenswrapper[4962]: I1003 13:13:15.794725 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59d5ff467f-tptdl" Oct 03 13:13:15 crc kubenswrapper[4962]: I1003 13:13:15.828707 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59d5ff467f-tptdl"] Oct 03 13:13:15 crc kubenswrapper[4962]: I1003 13:13:15.830252 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ecb3944-c441-4879-8220-aa32d7436c1f-logs\") pod \"barbican-worker-77d888f4df-52rjc\" (UID: \"2ecb3944-c441-4879-8220-aa32d7436c1f\") " pod="openstack/barbican-worker-77d888f4df-52rjc" Oct 03 13:13:15 crc kubenswrapper[4962]: I1003 13:13:15.830314 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ecb3944-c441-4879-8220-aa32d7436c1f-combined-ca-bundle\") pod \"barbican-worker-77d888f4df-52rjc\" (UID: \"2ecb3944-c441-4879-8220-aa32d7436c1f\") " pod="openstack/barbican-worker-77d888f4df-52rjc" Oct 03 13:13:15 crc kubenswrapper[4962]: I1003 13:13:15.830361 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgw4z\" (UniqueName: \"kubernetes.io/projected/2ecb3944-c441-4879-8220-aa32d7436c1f-kube-api-access-bgw4z\") pod \"barbican-worker-77d888f4df-52rjc\" (UID: \"2ecb3944-c441-4879-8220-aa32d7436c1f\") " pod="openstack/barbican-worker-77d888f4df-52rjc" Oct 03 13:13:15 crc kubenswrapper[4962]: I1003 13:13:15.830391 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2ecb3944-c441-4879-8220-aa32d7436c1f-config-data-custom\") pod \"barbican-worker-77d888f4df-52rjc\" (UID: \"2ecb3944-c441-4879-8220-aa32d7436c1f\") " pod="openstack/barbican-worker-77d888f4df-52rjc" Oct 03 13:13:15 crc kubenswrapper[4962]: I1003 13:13:15.830420 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ecb3944-c441-4879-8220-aa32d7436c1f-config-data\") pod \"barbican-worker-77d888f4df-52rjc\" (UID: \"2ecb3944-c441-4879-8220-aa32d7436c1f\") " pod="openstack/barbican-worker-77d888f4df-52rjc" Oct 03 13:13:15 crc kubenswrapper[4962]: I1003 13:13:15.847407 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ecb3944-c441-4879-8220-aa32d7436c1f-logs\") pod \"barbican-worker-77d888f4df-52rjc\" (UID: \"2ecb3944-c441-4879-8220-aa32d7436c1f\") " 
pod="openstack/barbican-worker-77d888f4df-52rjc" Oct 03 13:13:15 crc kubenswrapper[4962]: I1003 13:13:15.879205 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgw4z\" (UniqueName: \"kubernetes.io/projected/2ecb3944-c441-4879-8220-aa32d7436c1f-kube-api-access-bgw4z\") pod \"barbican-worker-77d888f4df-52rjc\" (UID: \"2ecb3944-c441-4879-8220-aa32d7436c1f\") " pod="openstack/barbican-worker-77d888f4df-52rjc" Oct 03 13:13:15 crc kubenswrapper[4962]: I1003 13:13:15.889287 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ecb3944-c441-4879-8220-aa32d7436c1f-config-data\") pod \"barbican-worker-77d888f4df-52rjc\" (UID: \"2ecb3944-c441-4879-8220-aa32d7436c1f\") " pod="openstack/barbican-worker-77d888f4df-52rjc" Oct 03 13:13:15 crc kubenswrapper[4962]: I1003 13:13:15.893367 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-bd6989694-qnv2s"] Oct 03 13:13:15 crc kubenswrapper[4962]: I1003 13:13:15.895520 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-bd6989694-qnv2s" Oct 03 13:13:15 crc kubenswrapper[4962]: I1003 13:13:15.901979 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ecb3944-c441-4879-8220-aa32d7436c1f-combined-ca-bundle\") pod \"barbican-worker-77d888f4df-52rjc\" (UID: \"2ecb3944-c441-4879-8220-aa32d7436c1f\") " pod="openstack/barbican-worker-77d888f4df-52rjc" Oct 03 13:13:15 crc kubenswrapper[4962]: I1003 13:13:15.918548 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 03 13:13:15 crc kubenswrapper[4962]: I1003 13:13:15.918978 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2ecb3944-c441-4879-8220-aa32d7436c1f-config-data-custom\") pod \"barbican-worker-77d888f4df-52rjc\" (UID: \"2ecb3944-c441-4879-8220-aa32d7436c1f\") " pod="openstack/barbican-worker-77d888f4df-52rjc" Oct 03 13:13:15 crc kubenswrapper[4962]: I1003 13:13:15.934317 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/867cc47a-7f55-4690-85b7-62cc6c041b65-dns-swift-storage-0\") pod \"dnsmasq-dns-59d5ff467f-tptdl\" (UID: \"867cc47a-7f55-4690-85b7-62cc6c041b65\") " pod="openstack/dnsmasq-dns-59d5ff467f-tptdl" Oct 03 13:13:15 crc kubenswrapper[4962]: I1003 13:13:15.934369 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/867cc47a-7f55-4690-85b7-62cc6c041b65-ovsdbserver-nb\") pod \"dnsmasq-dns-59d5ff467f-tptdl\" (UID: \"867cc47a-7f55-4690-85b7-62cc6c041b65\") " pod="openstack/dnsmasq-dns-59d5ff467f-tptdl" Oct 03 13:13:15 crc kubenswrapper[4962]: I1003 13:13:15.934388 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/867cc47a-7f55-4690-85b7-62cc6c041b65-ovsdbserver-sb\") pod \"dnsmasq-dns-59d5ff467f-tptdl\" (UID: \"867cc47a-7f55-4690-85b7-62cc6c041b65\") " pod="openstack/dnsmasq-dns-59d5ff467f-tptdl" Oct 03 13:13:15 crc kubenswrapper[4962]: I1003 13:13:15.934469 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/867cc47a-7f55-4690-85b7-62cc6c041b65-dns-svc\") pod \"dnsmasq-dns-59d5ff467f-tptdl\" (UID: \"867cc47a-7f55-4690-85b7-62cc6c041b65\") " pod="openstack/dnsmasq-dns-59d5ff467f-tptdl" Oct 03 13:13:15 crc kubenswrapper[4962]: I1003 13:13:15.934505 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/867cc47a-7f55-4690-85b7-62cc6c041b65-config\") pod \"dnsmasq-dns-59d5ff467f-tptdl\" (UID: \"867cc47a-7f55-4690-85b7-62cc6c041b65\") " pod="openstack/dnsmasq-dns-59d5ff467f-tptdl" Oct 03 13:13:15 crc kubenswrapper[4962]: I1003 13:13:15.934521 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gckb\" (UniqueName: \"kubernetes.io/projected/867cc47a-7f55-4690-85b7-62cc6c041b65-kube-api-access-5gckb\") pod \"dnsmasq-dns-59d5ff467f-tptdl\" (UID: \"867cc47a-7f55-4690-85b7-62cc6c041b65\") " pod="openstack/dnsmasq-dns-59d5ff467f-tptdl" Oct 03 13:13:15 crc kubenswrapper[4962]: I1003 13:13:15.949243 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-bd6989694-qnv2s"] Oct 03 13:13:16 crc kubenswrapper[4962]: I1003 13:13:16.035719 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c111271-43ed-48b3-b6ed-a6d02efb9113-config-data\") pod \"barbican-keystone-listener-bd6989694-qnv2s\" (UID: \"3c111271-43ed-48b3-b6ed-a6d02efb9113\") " pod="openstack/barbican-keystone-listener-bd6989694-qnv2s" Oct 03 13:13:16 crc kubenswrapper[4962]: I1003 13:13:16.035758 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c111271-43ed-48b3-b6ed-a6d02efb9113-combined-ca-bundle\") pod \"barbican-keystone-listener-bd6989694-qnv2s\" (UID: \"3c111271-43ed-48b3-b6ed-a6d02efb9113\") " pod="openstack/barbican-keystone-listener-bd6989694-qnv2s" Oct 03 13:13:16 crc kubenswrapper[4962]: I1003 13:13:16.035795 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn8zv\" (UniqueName: \"kubernetes.io/projected/3c111271-43ed-48b3-b6ed-a6d02efb9113-kube-api-access-vn8zv\") pod \"barbican-keystone-listener-bd6989694-qnv2s\" (UID: \"3c111271-43ed-48b3-b6ed-a6d02efb9113\") " pod="openstack/barbican-keystone-listener-bd6989694-qnv2s" Oct 03 13:13:16 crc kubenswrapper[4962]: I1003 13:13:16.035846 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/867cc47a-7f55-4690-85b7-62cc6c041b65-dns-swift-storage-0\") pod \"dnsmasq-dns-59d5ff467f-tptdl\" (UID: \"867cc47a-7f55-4690-85b7-62cc6c041b65\") " pod="openstack/dnsmasq-dns-59d5ff467f-tptdl" Oct 03 13:13:16 crc kubenswrapper[4962]: I1003 13:13:16.035871 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/867cc47a-7f55-4690-85b7-62cc6c041b65-ovsdbserver-nb\") pod \"dnsmasq-dns-59d5ff467f-tptdl\" (UID: \"867cc47a-7f55-4690-85b7-62cc6c041b65\") " pod="openstack/dnsmasq-dns-59d5ff467f-tptdl" Oct 03 13:13:16 crc kubenswrapper[4962]: I1003 13:13:16.035884 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/867cc47a-7f55-4690-85b7-62cc6c041b65-ovsdbserver-sb\") pod \"dnsmasq-dns-59d5ff467f-tptdl\" (UID: \"867cc47a-7f55-4690-85b7-62cc6c041b65\") " pod="openstack/dnsmasq-dns-59d5ff467f-tptdl" Oct 03 13:13:16 crc kubenswrapper[4962]: I1003 13:13:16.035922 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/867cc47a-7f55-4690-85b7-62cc6c041b65-dns-svc\") pod \"dnsmasq-dns-59d5ff467f-tptdl\" (UID: \"867cc47a-7f55-4690-85b7-62cc6c041b65\") " pod="openstack/dnsmasq-dns-59d5ff467f-tptdl" Oct 03 13:13:16 crc kubenswrapper[4962]: I1003 13:13:16.035948 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/867cc47a-7f55-4690-85b7-62cc6c041b65-config\") pod \"dnsmasq-dns-59d5ff467f-tptdl\" (UID: \"867cc47a-7f55-4690-85b7-62cc6c041b65\") " pod="openstack/dnsmasq-dns-59d5ff467f-tptdl" Oct 03 13:13:16 crc kubenswrapper[4962]: I1003 13:13:16.035963 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gckb\" (UniqueName: \"kubernetes.io/projected/867cc47a-7f55-4690-85b7-62cc6c041b65-kube-api-access-5gckb\") pod \"dnsmasq-dns-59d5ff467f-tptdl\" (UID: \"867cc47a-7f55-4690-85b7-62cc6c041b65\") " pod="openstack/dnsmasq-dns-59d5ff467f-tptdl" Oct 03 13:13:16 crc kubenswrapper[4962]: I1003 13:13:16.035994 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c111271-43ed-48b3-b6ed-a6d02efb9113-logs\") pod \"barbican-keystone-listener-bd6989694-qnv2s\" (UID: \"3c111271-43ed-48b3-b6ed-a6d02efb9113\") " pod="openstack/barbican-keystone-listener-bd6989694-qnv2s" Oct 03 13:13:16 crc kubenswrapper[4962]: I1003 13:13:16.036017 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c111271-43ed-48b3-b6ed-a6d02efb9113-config-data-custom\") pod \"barbican-keystone-listener-bd6989694-qnv2s\" (UID: \"3c111271-43ed-48b3-b6ed-a6d02efb9113\") " pod="openstack/barbican-keystone-listener-bd6989694-qnv2s" Oct 03 13:13:16 crc kubenswrapper[4962]: I1003 13:13:16.037056 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/867cc47a-7f55-4690-85b7-62cc6c041b65-dns-swift-storage-0\") pod \"dnsmasq-dns-59d5ff467f-tptdl\" (UID: \"867cc47a-7f55-4690-85b7-62cc6c041b65\") " pod="openstack/dnsmasq-dns-59d5ff467f-tptdl" Oct 03 13:13:16 crc kubenswrapper[4962]: I1003 13:13:16.037595 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/867cc47a-7f55-4690-85b7-62cc6c041b65-config\") pod \"dnsmasq-dns-59d5ff467f-tptdl\" (UID: \"867cc47a-7f55-4690-85b7-62cc6c041b65\") " pod="openstack/dnsmasq-dns-59d5ff467f-tptdl" Oct 03 13:13:16 crc kubenswrapper[4962]: I1003 13:13:16.037813 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/867cc47a-7f55-4690-85b7-62cc6c041b65-dns-svc\") pod \"dnsmasq-dns-59d5ff467f-tptdl\" (UID: \"867cc47a-7f55-4690-85b7-62cc6c041b65\") " pod="openstack/dnsmasq-dns-59d5ff467f-tptdl" Oct 03 13:13:16 crc kubenswrapper[4962]: I1003 13:13:16.038369 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/867cc47a-7f55-4690-85b7-62cc6c041b65-ovsdbserver-nb\") pod \"dnsmasq-dns-59d5ff467f-tptdl\" (UID: \"867cc47a-7f55-4690-85b7-62cc6c041b65\") " pod="openstack/dnsmasq-dns-59d5ff467f-tptdl" Oct 03 13:13:16 crc kubenswrapper[4962]: I1003 13:13:16.038465 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/867cc47a-7f55-4690-85b7-62cc6c041b65-ovsdbserver-sb\") pod \"dnsmasq-dns-59d5ff467f-tptdl\" (UID: \"867cc47a-7f55-4690-85b7-62cc6c041b65\") " pod="openstack/dnsmasq-dns-59d5ff467f-tptdl" Oct 03 13:13:16 crc kubenswrapper[4962]: I1003 13:13:16.042157 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6dd6cbcb6d-pxvpn"] Oct 03 13:13:16 crc kubenswrapper[4962]: I1003 13:13:16.044098 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6dd6cbcb6d-pxvpn" Oct 03 13:13:16 crc kubenswrapper[4962]: I1003 13:13:16.051995 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 03 13:13:16 crc kubenswrapper[4962]: I1003 13:13:16.055874 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-77d888f4df-52rjc" Oct 03 13:13:16 crc kubenswrapper[4962]: I1003 13:13:16.071790 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gckb\" (UniqueName: \"kubernetes.io/projected/867cc47a-7f55-4690-85b7-62cc6c041b65-kube-api-access-5gckb\") pod \"dnsmasq-dns-59d5ff467f-tptdl\" (UID: \"867cc47a-7f55-4690-85b7-62cc6c041b65\") " pod="openstack/dnsmasq-dns-59d5ff467f-tptdl" Oct 03 13:13:16 crc kubenswrapper[4962]: I1003 13:13:16.072277 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6dd6cbcb6d-pxvpn"] Oct 03 13:13:16 crc kubenswrapper[4962]: I1003 13:13:16.138016 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c111271-43ed-48b3-b6ed-a6d02efb9113-logs\") pod \"barbican-keystone-listener-bd6989694-qnv2s\" (UID: \"3c111271-43ed-48b3-b6ed-a6d02efb9113\") " pod="openstack/barbican-keystone-listener-bd6989694-qnv2s" Oct 03 13:13:16 crc kubenswrapper[4962]: I1003 13:13:16.138535 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c111271-43ed-48b3-b6ed-a6d02efb9113-config-data-custom\") pod \"barbican-keystone-listener-bd6989694-qnv2s\" (UID: \"3c111271-43ed-48b3-b6ed-a6d02efb9113\") " pod="openstack/barbican-keystone-listener-bd6989694-qnv2s" Oct 03 13:13:16 crc kubenswrapper[4962]: I1003 13:13:16.138594 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c111271-43ed-48b3-b6ed-a6d02efb9113-config-data\") pod \"barbican-keystone-listener-bd6989694-qnv2s\" (UID: \"3c111271-43ed-48b3-b6ed-a6d02efb9113\") " pod="openstack/barbican-keystone-listener-bd6989694-qnv2s" Oct 03 13:13:16 crc kubenswrapper[4962]: I1003 13:13:16.138617 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c111271-43ed-48b3-b6ed-a6d02efb9113-combined-ca-bundle\") pod \"barbican-keystone-listener-bd6989694-qnv2s\" (UID: \"3c111271-43ed-48b3-b6ed-a6d02efb9113\") " pod="openstack/barbican-keystone-listener-bd6989694-qnv2s" Oct 03 13:13:16 crc kubenswrapper[4962]: I1003 
13:13:16.138731 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn8zv\" (UniqueName: \"kubernetes.io/projected/3c111271-43ed-48b3-b6ed-a6d02efb9113-kube-api-access-vn8zv\") pod \"barbican-keystone-listener-bd6989694-qnv2s\" (UID: \"3c111271-43ed-48b3-b6ed-a6d02efb9113\") " pod="openstack/barbican-keystone-listener-bd6989694-qnv2s" Oct 03 13:13:16 crc kubenswrapper[4962]: I1003 13:13:16.138759 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a730dfa1-b919-4d19-bf40-06e5c3348d6f-config-data\") pod \"barbican-api-6dd6cbcb6d-pxvpn\" (UID: \"a730dfa1-b919-4d19-bf40-06e5c3348d6f\") " pod="openstack/barbican-api-6dd6cbcb6d-pxvpn" Oct 03 13:13:16 crc kubenswrapper[4962]: I1003 13:13:16.138835 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a730dfa1-b919-4d19-bf40-06e5c3348d6f-config-data-custom\") pod \"barbican-api-6dd6cbcb6d-pxvpn\" (UID: \"a730dfa1-b919-4d19-bf40-06e5c3348d6f\") " pod="openstack/barbican-api-6dd6cbcb6d-pxvpn" Oct 03 13:13:16 crc kubenswrapper[4962]: I1003 13:13:16.138885 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfbrl\" (UniqueName: \"kubernetes.io/projected/a730dfa1-b919-4d19-bf40-06e5c3348d6f-kube-api-access-qfbrl\") pod \"barbican-api-6dd6cbcb6d-pxvpn\" (UID: \"a730dfa1-b919-4d19-bf40-06e5c3348d6f\") " pod="openstack/barbican-api-6dd6cbcb6d-pxvpn" Oct 03 13:13:16 crc kubenswrapper[4962]: I1003 13:13:16.138920 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a730dfa1-b919-4d19-bf40-06e5c3348d6f-combined-ca-bundle\") pod \"barbican-api-6dd6cbcb6d-pxvpn\" (UID: \"a730dfa1-b919-4d19-bf40-06e5c3348d6f\") " pod="openstack/barbican-api-6dd6cbcb6d-pxvpn" Oct 03 13:13:16 crc kubenswrapper[4962]: I1003 13:13:16.138942 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a730dfa1-b919-4d19-bf40-06e5c3348d6f-logs\") pod \"barbican-api-6dd6cbcb6d-pxvpn\" (UID: \"a730dfa1-b919-4d19-bf40-06e5c3348d6f\") " pod="openstack/barbican-api-6dd6cbcb6d-pxvpn" Oct 03 13:13:16 crc kubenswrapper[4962]: I1003 13:13:16.138471 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c111271-43ed-48b3-b6ed-a6d02efb9113-logs\") pod \"barbican-keystone-listener-bd6989694-qnv2s\" (UID: \"3c111271-43ed-48b3-b6ed-a6d02efb9113\") " pod="openstack/barbican-keystone-listener-bd6989694-qnv2s" Oct 03 13:13:16 crc kubenswrapper[4962]: I1003 13:13:16.141937 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c111271-43ed-48b3-b6ed-a6d02efb9113-config-data-custom\") pod \"barbican-keystone-listener-bd6989694-qnv2s\" (UID: \"3c111271-43ed-48b3-b6ed-a6d02efb9113\") " pod="openstack/barbican-keystone-listener-bd6989694-qnv2s" Oct 03 13:13:16 crc kubenswrapper[4962]: I1003 13:13:16.142410 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c111271-43ed-48b3-b6ed-a6d02efb9113-combined-ca-bundle\") pod \"barbican-keystone-listener-bd6989694-qnv2s\" (UID: 
\"3c111271-43ed-48b3-b6ed-a6d02efb9113\") " pod="openstack/barbican-keystone-listener-bd6989694-qnv2s" Oct 03 13:13:16 crc kubenswrapper[4962]: I1003 13:13:16.142723 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c111271-43ed-48b3-b6ed-a6d02efb9113-config-data\") pod \"barbican-keystone-listener-bd6989694-qnv2s\" (UID: \"3c111271-43ed-48b3-b6ed-a6d02efb9113\") " pod="openstack/barbican-keystone-listener-bd6989694-qnv2s" Oct 03 13:13:16 crc kubenswrapper[4962]: I1003 13:13:16.160480 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn8zv\" (UniqueName: \"kubernetes.io/projected/3c111271-43ed-48b3-b6ed-a6d02efb9113-kube-api-access-vn8zv\") pod \"barbican-keystone-listener-bd6989694-qnv2s\" (UID: \"3c111271-43ed-48b3-b6ed-a6d02efb9113\") " pod="openstack/barbican-keystone-listener-bd6989694-qnv2s" Oct 03 13:13:16 crc kubenswrapper[4962]: I1003 13:13:16.182215 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59d5ff467f-tptdl" Oct 03 13:13:16 crc kubenswrapper[4962]: I1003 13:13:16.240676 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a730dfa1-b919-4d19-bf40-06e5c3348d6f-combined-ca-bundle\") pod \"barbican-api-6dd6cbcb6d-pxvpn\" (UID: \"a730dfa1-b919-4d19-bf40-06e5c3348d6f\") " pod="openstack/barbican-api-6dd6cbcb6d-pxvpn" Oct 03 13:13:16 crc kubenswrapper[4962]: I1003 13:13:16.240729 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a730dfa1-b919-4d19-bf40-06e5c3348d6f-logs\") pod \"barbican-api-6dd6cbcb6d-pxvpn\" (UID: \"a730dfa1-b919-4d19-bf40-06e5c3348d6f\") " pod="openstack/barbican-api-6dd6cbcb6d-pxvpn" Oct 03 13:13:16 crc kubenswrapper[4962]: I1003 13:13:16.240835 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a730dfa1-b919-4d19-bf40-06e5c3348d6f-config-data\") pod \"barbican-api-6dd6cbcb6d-pxvpn\" (UID: \"a730dfa1-b919-4d19-bf40-06e5c3348d6f\") " pod="openstack/barbican-api-6dd6cbcb6d-pxvpn" Oct 03 13:13:16 crc kubenswrapper[4962]: I1003 13:13:16.240916 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a730dfa1-b919-4d19-bf40-06e5c3348d6f-config-data-custom\") pod \"barbican-api-6dd6cbcb6d-pxvpn\" (UID: \"a730dfa1-b919-4d19-bf40-06e5c3348d6f\") " pod="openstack/barbican-api-6dd6cbcb6d-pxvpn" Oct 03 13:13:16 crc kubenswrapper[4962]: I1003 13:13:16.240967 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfbrl\" (UniqueName: \"kubernetes.io/projected/a730dfa1-b919-4d19-bf40-06e5c3348d6f-kube-api-access-qfbrl\") pod \"barbican-api-6dd6cbcb6d-pxvpn\" (UID: \"a730dfa1-b919-4d19-bf40-06e5c3348d6f\") " pod="openstack/barbican-api-6dd6cbcb6d-pxvpn" Oct 03 13:13:16 crc kubenswrapper[4962]: I1003 13:13:16.241770 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a730dfa1-b919-4d19-bf40-06e5c3348d6f-logs\") pod \"barbican-api-6dd6cbcb6d-pxvpn\" (UID: \"a730dfa1-b919-4d19-bf40-06e5c3348d6f\") " pod="openstack/barbican-api-6dd6cbcb6d-pxvpn" Oct 03 13:13:16 crc kubenswrapper[4962]: I1003 13:13:16.244079 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a730dfa1-b919-4d19-bf40-06e5c3348d6f-combined-ca-bundle\") pod \"barbican-api-6dd6cbcb6d-pxvpn\" (UID: \"a730dfa1-b919-4d19-bf40-06e5c3348d6f\") " pod="openstack/barbican-api-6dd6cbcb6d-pxvpn" Oct 03 13:13:16 crc kubenswrapper[4962]: I1003 13:13:16.245766 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a730dfa1-b919-4d19-bf40-06e5c3348d6f-config-data\") pod \"barbican-api-6dd6cbcb6d-pxvpn\" (UID: \"a730dfa1-b919-4d19-bf40-06e5c3348d6f\") " pod="openstack/barbican-api-6dd6cbcb6d-pxvpn" Oct 03 13:13:16 crc kubenswrapper[4962]: I1003 13:13:16.249309 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a730dfa1-b919-4d19-bf40-06e5c3348d6f-config-data-custom\") pod \"barbican-api-6dd6cbcb6d-pxvpn\" (UID: \"a730dfa1-b919-4d19-bf40-06e5c3348d6f\") " pod="openstack/barbican-api-6dd6cbcb6d-pxvpn" Oct 03 13:13:16 crc kubenswrapper[4962]: I1003 13:13:16.261208 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfbrl\" (UniqueName: \"kubernetes.io/projected/a730dfa1-b919-4d19-bf40-06e5c3348d6f-kube-api-access-qfbrl\") pod \"barbican-api-6dd6cbcb6d-pxvpn\" (UID: \"a730dfa1-b919-4d19-bf40-06e5c3348d6f\") " pod="openstack/barbican-api-6dd6cbcb6d-pxvpn" Oct 03 13:13:16 crc kubenswrapper[4962]: I1003 13:13:16.281515 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-bd6989694-qnv2s" Oct 03 13:13:16 crc kubenswrapper[4962]: I1003 13:13:16.381249 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6dd6cbcb6d-pxvpn" Oct 03 13:13:20 crc kubenswrapper[4962]: I1003 13:13:20.032403 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6b7fd754f4-rx9k9"] Oct 03 13:13:20 crc kubenswrapper[4962]: I1003 13:13:20.038422 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6b7fd754f4-rx9k9" Oct 03 13:13:20 crc kubenswrapper[4962]: I1003 13:13:20.040628 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 03 13:13:20 crc kubenswrapper[4962]: I1003 13:13:20.044319 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 03 13:13:20 crc kubenswrapper[4962]: I1003 13:13:20.098774 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6b7fd754f4-rx9k9"] Oct 03 13:13:20 crc kubenswrapper[4962]: I1003 13:13:20.115811 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c56a1d1d-7e30-4bb8-a5a7-068afc055cb8-internal-tls-certs\") pod \"barbican-api-6b7fd754f4-rx9k9\" (UID: \"c56a1d1d-7e30-4bb8-a5a7-068afc055cb8\") " pod="openstack/barbican-api-6b7fd754f4-rx9k9" Oct 03 13:13:20 crc kubenswrapper[4962]: I1003 13:13:20.115923 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c56a1d1d-7e30-4bb8-a5a7-068afc055cb8-logs\") pod \"barbican-api-6b7fd754f4-rx9k9\" (UID: \"c56a1d1d-7e30-4bb8-a5a7-068afc055cb8\") " pod="openstack/barbican-api-6b7fd754f4-rx9k9" Oct 03 13:13:20 crc kubenswrapper[4962]: I1003 13:13:20.115959 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c56a1d1d-7e30-4bb8-a5a7-068afc055cb8-config-data\") pod \"barbican-api-6b7fd754f4-rx9k9\" (UID: \"c56a1d1d-7e30-4bb8-a5a7-068afc055cb8\") " pod="openstack/barbican-api-6b7fd754f4-rx9k9" Oct 03 13:13:20 crc kubenswrapper[4962]: I1003 13:13:20.116019 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c56a1d1d-7e30-4bb8-a5a7-068afc055cb8-combined-ca-bundle\") pod \"barbican-api-6b7fd754f4-rx9k9\" (UID: \"c56a1d1d-7e30-4bb8-a5a7-068afc055cb8\") " pod="openstack/barbican-api-6b7fd754f4-rx9k9" Oct 03 13:13:20 crc kubenswrapper[4962]: I1003 13:13:20.116053 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c56a1d1d-7e30-4bb8-a5a7-068afc055cb8-public-tls-certs\") pod \"barbican-api-6b7fd754f4-rx9k9\" (UID: \"c56a1d1d-7e30-4bb8-a5a7-068afc055cb8\") " pod="openstack/barbican-api-6b7fd754f4-rx9k9" Oct 03 13:13:20 crc kubenswrapper[4962]: I1003 13:13:20.116091 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvmrv\" (UniqueName: \"kubernetes.io/projected/c56a1d1d-7e30-4bb8-a5a7-068afc055cb8-kube-api-access-hvmrv\") pod \"barbican-api-6b7fd754f4-rx9k9\" (UID: \"c56a1d1d-7e30-4bb8-a5a7-068afc055cb8\") " pod="openstack/barbican-api-6b7fd754f4-rx9k9" Oct 03 13:13:20 crc kubenswrapper[4962]: I1003 13:13:20.116118 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c56a1d1d-7e30-4bb8-a5a7-068afc055cb8-config-data-custom\") pod \"barbican-api-6b7fd754f4-rx9k9\" (UID: \"c56a1d1d-7e30-4bb8-a5a7-068afc055cb8\") " pod="openstack/barbican-api-6b7fd754f4-rx9k9" Oct 03 13:13:20 crc kubenswrapper[4962]: I1003 13:13:20.217969 4962 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c56a1d1d-7e30-4bb8-a5a7-068afc055cb8-logs\") pod \"barbican-api-6b7fd754f4-rx9k9\" (UID: \"c56a1d1d-7e30-4bb8-a5a7-068afc055cb8\") " pod="openstack/barbican-api-6b7fd754f4-rx9k9" Oct 03 13:13:20 crc kubenswrapper[4962]: I1003 13:13:20.218075 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c56a1d1d-7e30-4bb8-a5a7-068afc055cb8-config-data\") pod \"barbican-api-6b7fd754f4-rx9k9\" (UID: \"c56a1d1d-7e30-4bb8-a5a7-068afc055cb8\") " pod="openstack/barbican-api-6b7fd754f4-rx9k9" Oct 03 13:13:20 crc kubenswrapper[4962]: I1003 13:13:20.218179 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c56a1d1d-7e30-4bb8-a5a7-068afc055cb8-combined-ca-bundle\") pod \"barbican-api-6b7fd754f4-rx9k9\" (UID: \"c56a1d1d-7e30-4bb8-a5a7-068afc055cb8\") " pod="openstack/barbican-api-6b7fd754f4-rx9k9" Oct 03 13:13:20 crc kubenswrapper[4962]: I1003 13:13:20.218205 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c56a1d1d-7e30-4bb8-a5a7-068afc055cb8-public-tls-certs\") pod \"barbican-api-6b7fd754f4-rx9k9\" (UID: \"c56a1d1d-7e30-4bb8-a5a7-068afc055cb8\") " pod="openstack/barbican-api-6b7fd754f4-rx9k9" Oct 03 13:13:20 crc kubenswrapper[4962]: I1003 13:13:20.218237 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvmrv\" (UniqueName: \"kubernetes.io/projected/c56a1d1d-7e30-4bb8-a5a7-068afc055cb8-kube-api-access-hvmrv\") pod \"barbican-api-6b7fd754f4-rx9k9\" (UID: \"c56a1d1d-7e30-4bb8-a5a7-068afc055cb8\") " pod="openstack/barbican-api-6b7fd754f4-rx9k9" Oct 03 13:13:20 crc kubenswrapper[4962]: I1003 13:13:20.218270 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c56a1d1d-7e30-4bb8-a5a7-068afc055cb8-config-data-custom\") pod \"barbican-api-6b7fd754f4-rx9k9\" (UID: \"c56a1d1d-7e30-4bb8-a5a7-068afc055cb8\") " pod="openstack/barbican-api-6b7fd754f4-rx9k9" Oct 03 13:13:20 crc kubenswrapper[4962]: I1003 13:13:20.218324 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c56a1d1d-7e30-4bb8-a5a7-068afc055cb8-internal-tls-certs\") pod \"barbican-api-6b7fd754f4-rx9k9\" (UID: \"c56a1d1d-7e30-4bb8-a5a7-068afc055cb8\") " pod="openstack/barbican-api-6b7fd754f4-rx9k9" Oct 03 13:13:20 crc kubenswrapper[4962]: I1003 13:13:20.218387 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c56a1d1d-7e30-4bb8-a5a7-068afc055cb8-logs\") pod \"barbican-api-6b7fd754f4-rx9k9\" (UID: \"c56a1d1d-7e30-4bb8-a5a7-068afc055cb8\") " pod="openstack/barbican-api-6b7fd754f4-rx9k9" Oct 03 13:13:20 crc kubenswrapper[4962]: I1003 13:13:20.223853 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c56a1d1d-7e30-4bb8-a5a7-068afc055cb8-combined-ca-bundle\") pod \"barbican-api-6b7fd754f4-rx9k9\" (UID: \"c56a1d1d-7e30-4bb8-a5a7-068afc055cb8\") " pod="openstack/barbican-api-6b7fd754f4-rx9k9" Oct 03 13:13:20 crc kubenswrapper[4962]: I1003 13:13:20.224342 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/c56a1d1d-7e30-4bb8-a5a7-068afc055cb8-internal-tls-certs\") pod \"barbican-api-6b7fd754f4-rx9k9\" (UID: \"c56a1d1d-7e30-4bb8-a5a7-068afc055cb8\") " pod="openstack/barbican-api-6b7fd754f4-rx9k9" Oct 03 13:13:20 crc kubenswrapper[4962]: I1003 13:13:20.224351 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c56a1d1d-7e30-4bb8-a5a7-068afc055cb8-config-data-custom\") pod \"barbican-api-6b7fd754f4-rx9k9\" (UID: \"c56a1d1d-7e30-4bb8-a5a7-068afc055cb8\") " pod="openstack/barbican-api-6b7fd754f4-rx9k9" Oct 03 13:13:20 crc kubenswrapper[4962]: I1003 13:13:20.224628 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c56a1d1d-7e30-4bb8-a5a7-068afc055cb8-public-tls-certs\") pod \"barbican-api-6b7fd754f4-rx9k9\" (UID: \"c56a1d1d-7e30-4bb8-a5a7-068afc055cb8\") " pod="openstack/barbican-api-6b7fd754f4-rx9k9" Oct 03 13:13:20 crc kubenswrapper[4962]: I1003 13:13:20.225307 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c56a1d1d-7e30-4bb8-a5a7-068afc055cb8-config-data\") pod \"barbican-api-6b7fd754f4-rx9k9\" (UID: \"c56a1d1d-7e30-4bb8-a5a7-068afc055cb8\") " pod="openstack/barbican-api-6b7fd754f4-rx9k9" Oct 03 13:13:20 crc kubenswrapper[4962]: I1003 13:13:20.234315 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvmrv\" (UniqueName: \"kubernetes.io/projected/c56a1d1d-7e30-4bb8-a5a7-068afc055cb8-kube-api-access-hvmrv\") pod \"barbican-api-6b7fd754f4-rx9k9\" (UID: \"c56a1d1d-7e30-4bb8-a5a7-068afc055cb8\") " pod="openstack/barbican-api-6b7fd754f4-rx9k9" Oct 03 13:13:20 crc kubenswrapper[4962]: I1003 13:13:20.375252 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6b7fd754f4-rx9k9" Oct 03 13:13:24 crc kubenswrapper[4962]: I1003 13:13:24.487402 4962 generic.go:334] "Generic (PLEG): container finished" podID="280fc068-9a62-474f-a81f-fc5a28a7e722" containerID="0ef7a555af8ba3763ace041261009ed654126cd2f78762f2ab1c9bc4bf7072db" exitCode=0 Oct 03 13:13:24 crc kubenswrapper[4962]: I1003 13:13:24.487619 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-gkbr2" event={"ID":"280fc068-9a62-474f-a81f-fc5a28a7e722","Type":"ContainerDied","Data":"0ef7a555af8ba3763ace041261009ed654126cd2f78762f2ab1c9bc4bf7072db"} Oct 03 13:13:24 crc kubenswrapper[4962]: E1003 13:13:24.697986 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Oct 03 13:13:24 crc kubenswrapper[4962]: E1003 13:13:24.698154 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vlc5k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-kz6b9_openstack(866c8e6b-3fdd-442c-98d4-cf44b6ef098c): ErrImagePull: rpc error: code = Canceled desc = copying 
config: context canceled" logger="UnhandledError" Oct 03 13:13:24 crc kubenswrapper[4962]: E1003 13:13:24.699291 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-kz6b9" podUID="866c8e6b-3fdd-442c-98d4-cf44b6ef098c" Oct 03 13:13:25 crc kubenswrapper[4962]: E1003 13:13:25.022411 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/ubi9/httpd-24:latest" Oct 03 13:13:25 crc kubenswrapper[4962]: E1003 13:13:25.024767 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:proxy-httpd,Image:registry.redhat.io/ubi9/httpd-24:latest,Command:[/usr/sbin/httpd],Args:[-DFOREGROUND],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:proxy-httpd,HostPort:0,ContainerPort:3000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf/httpd.conf,SubPath:httpd.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf.d/ssl.conf,SubPath:ssl.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run-httpd,ReadOnly:false,MountPath:/run/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log-httpd,ReadOnly:false,MountPath:/var/log/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ngmnt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(c09e1e8d-9d15-4eef-a553-2af7c59998e3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" 
logger="UnhandledError" Oct 03 13:13:25 crc kubenswrapper[4962]: E1003 13:13:25.027740 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"proxy-httpd\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="c09e1e8d-9d15-4eef-a553-2af7c59998e3" Oct 03 13:13:25 crc kubenswrapper[4962]: I1003 13:13:25.524963 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59d5ff467f-tptdl"] Oct 03 13:13:25 crc kubenswrapper[4962]: I1003 13:13:25.530321 4962 generic.go:334] "Generic (PLEG): container finished" podID="f21bb1eb-c956-4d3f-b7e3-199749af43df" containerID="36c1e5d4cc9238450eba3e5fd7117e9e2030bf97281706b1d15b2472c64fffc7" exitCode=0 Oct 03 13:13:25 crc kubenswrapper[4962]: I1003 13:13:25.530528 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xkrxj" event={"ID":"f21bb1eb-c956-4d3f-b7e3-199749af43df","Type":"ContainerDied","Data":"36c1e5d4cc9238450eba3e5fd7117e9e2030bf97281706b1d15b2472c64fffc7"} Oct 03 13:13:25 crc kubenswrapper[4962]: I1003 13:13:25.531278 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c09e1e8d-9d15-4eef-a553-2af7c59998e3" containerName="sg-core" containerID="cri-o://2a9a5082cb69eea6f1efba2560d7c088a1572ce55dbf4bcf4ca90d2ed0560744" gracePeriod=30 Oct 03 13:13:25 crc kubenswrapper[4962]: I1003 13:13:25.531274 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c09e1e8d-9d15-4eef-a553-2af7c59998e3" containerName="ceilometer-central-agent" containerID="cri-o://922457fe5519a12a1f756b9f0601dc6bb170b210e8c7fcb0a9bfa22bb5817803" gracePeriod=30 Oct 03 13:13:25 crc kubenswrapper[4962]: I1003 13:13:25.531346 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c09e1e8d-9d15-4eef-a553-2af7c59998e3" containerName="ceilometer-notification-agent" containerID="cri-o://4fc87af0adc8de378b2e85921a90879f9ab1b63fd92d450de2af297793e6c95b" gracePeriod=30 Oct 03 13:13:25 crc kubenswrapper[4962]: I1003 13:13:25.532197 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-77d888f4df-52rjc"] Oct 03 13:13:25 crc kubenswrapper[4962]: E1003 13:13:25.532589 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-kz6b9" podUID="866c8e6b-3fdd-442c-98d4-cf44b6ef098c" Oct 03 13:13:25 crc kubenswrapper[4962]: I1003 13:13:25.621556 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-bd6989694-qnv2s"] Oct 03 13:13:25 crc kubenswrapper[4962]: I1003 13:13:25.696341 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6b7fd754f4-rx9k9"] Oct 03 13:13:25 crc kubenswrapper[4962]: I1003 13:13:25.711430 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6dd6cbcb6d-pxvpn"] Oct 03 13:13:25 crc kubenswrapper[4962]: I1003 13:13:25.838623 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-gkbr2" Oct 03 13:13:25 crc kubenswrapper[4962]: I1003 13:13:25.926227 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/280fc068-9a62-474f-a81f-fc5a28a7e722-config\") pod \"280fc068-9a62-474f-a81f-fc5a28a7e722\" (UID: \"280fc068-9a62-474f-a81f-fc5a28a7e722\") " Oct 03 13:13:25 crc kubenswrapper[4962]: I1003 13:13:25.926686 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjzq6\" (UniqueName: \"kubernetes.io/projected/280fc068-9a62-474f-a81f-fc5a28a7e722-kube-api-access-kjzq6\") pod \"280fc068-9a62-474f-a81f-fc5a28a7e722\" (UID: \"280fc068-9a62-474f-a81f-fc5a28a7e722\") " Oct 03 13:13:25 crc kubenswrapper[4962]: I1003 13:13:25.926855 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/280fc068-9a62-474f-a81f-fc5a28a7e722-combined-ca-bundle\") pod \"280fc068-9a62-474f-a81f-fc5a28a7e722\" (UID: \"280fc068-9a62-474f-a81f-fc5a28a7e722\") " Oct 03 13:13:25 crc kubenswrapper[4962]: I1003 13:13:25.932089 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/280fc068-9a62-474f-a81f-fc5a28a7e722-kube-api-access-kjzq6" (OuterVolumeSpecName: "kube-api-access-kjzq6") pod "280fc068-9a62-474f-a81f-fc5a28a7e722" (UID: "280fc068-9a62-474f-a81f-fc5a28a7e722"). InnerVolumeSpecName "kube-api-access-kjzq6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:13:25 crc kubenswrapper[4962]: I1003 13:13:25.950866 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/280fc068-9a62-474f-a81f-fc5a28a7e722-config" (OuterVolumeSpecName: "config") pod "280fc068-9a62-474f-a81f-fc5a28a7e722" (UID: "280fc068-9a62-474f-a81f-fc5a28a7e722"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:13:25 crc kubenswrapper[4962]: I1003 13:13:25.952012 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/280fc068-9a62-474f-a81f-fc5a28a7e722-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "280fc068-9a62-474f-a81f-fc5a28a7e722" (UID: "280fc068-9a62-474f-a81f-fc5a28a7e722"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.028573 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/280fc068-9a62-474f-a81f-fc5a28a7e722-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.028604 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/280fc068-9a62-474f-a81f-fc5a28a7e722-config\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.028614 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjzq6\" (UniqueName: \"kubernetes.io/projected/280fc068-9a62-474f-a81f-fc5a28a7e722-kube-api-access-kjzq6\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.544534 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6dd6cbcb6d-pxvpn" event={"ID":"a730dfa1-b919-4d19-bf40-06e5c3348d6f","Type":"ContainerStarted","Data":"eabc1fa4c9169f5d2da3a9a307dd00c6b2412305ec1e525e91626a1ffeba387a"} Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.544829 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6dd6cbcb6d-pxvpn" event={"ID":"a730dfa1-b919-4d19-bf40-06e5c3348d6f","Type":"ContainerStarted","Data":"4915690674fc094990f631fd848dd773d1a175b026044425c0ec97fc8c791310"} Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.544843 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6dd6cbcb6d-pxvpn" event={"ID":"a730dfa1-b919-4d19-bf40-06e5c3348d6f","Type":"ContainerStarted","Data":"d7988da2603c8d892881edf776679c2cc085608d63b8fa9f20a502f329002d31"} Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.545183 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6dd6cbcb6d-pxvpn" Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.545202 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6dd6cbcb6d-pxvpn" Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.551379 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b7fd754f4-rx9k9" event={"ID":"c56a1d1d-7e30-4bb8-a5a7-068afc055cb8","Type":"ContainerStarted","Data":"efdd11f8bd8386aa2fc051d59f9344ed094988bb97638532765b2b52ec56a7ba"} Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.551413 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b7fd754f4-rx9k9" event={"ID":"c56a1d1d-7e30-4bb8-a5a7-068afc055cb8","Type":"ContainerStarted","Data":"2fe1fb6596fd4bab23e8f5b8fffa9f204b8288c3c24b70ea1583257b39048287"} Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.551425 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b7fd754f4-rx9k9" event={"ID":"c56a1d1d-7e30-4bb8-a5a7-068afc055cb8","Type":"ContainerStarted","Data":"a9291b6ad0144fe4b4bf210a17913502b8ab701d3ea438dc75b634a55814a0af"} Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.551457 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6b7fd754f4-rx9k9" Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.551845 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6b7fd754f4-rx9k9" Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.553156 4962 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-77d888f4df-52rjc" event={"ID":"2ecb3944-c441-4879-8220-aa32d7436c1f","Type":"ContainerStarted","Data":"2735c0850d8792835e645f3117d147b10140f11b42d72148725ebea36cd84d25"} Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.556564 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xkrxj" event={"ID":"f21bb1eb-c956-4d3f-b7e3-199749af43df","Type":"ContainerStarted","Data":"bb04299e38f0a1f2e1cc9bde34ef2449edd45207ce23d8c93e574a50d6c31462"} Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.560238 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6dd6cbcb6d-pxvpn" podStartSLOduration=11.560226188 podStartE2EDuration="11.560226188s" podCreationTimestamp="2025-10-03 13:13:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:13:26.559041366 +0000 UTC m=+1414.962939201" watchObservedRunningTime="2025-10-03 13:13:26.560226188 +0000 UTC m=+1414.964124023" Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.562828 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-bd6989694-qnv2s" event={"ID":"3c111271-43ed-48b3-b6ed-a6d02efb9113","Type":"ContainerStarted","Data":"4689223207e192d071dd8574b0bb78a7f6fedc890b8f3664d99710eb87142619"} Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.567395 4962 generic.go:334] "Generic (PLEG): container finished" podID="c09e1e8d-9d15-4eef-a553-2af7c59998e3" containerID="2a9a5082cb69eea6f1efba2560d7c088a1572ce55dbf4bcf4ca90d2ed0560744" exitCode=2 Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.567433 4962 generic.go:334] "Generic (PLEG): container finished" podID="c09e1e8d-9d15-4eef-a553-2af7c59998e3" containerID="922457fe5519a12a1f756b9f0601dc6bb170b210e8c7fcb0a9bfa22bb5817803" exitCode=0 Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.567478 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c09e1e8d-9d15-4eef-a553-2af7c59998e3","Type":"ContainerDied","Data":"2a9a5082cb69eea6f1efba2560d7c088a1572ce55dbf4bcf4ca90d2ed0560744"} Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.567506 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c09e1e8d-9d15-4eef-a553-2af7c59998e3","Type":"ContainerDied","Data":"922457fe5519a12a1f756b9f0601dc6bb170b210e8c7fcb0a9bfa22bb5817803"} Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.569317 4962 generic.go:334] "Generic (PLEG): container finished" podID="867cc47a-7f55-4690-85b7-62cc6c041b65" containerID="c291cf9c8e87c37616072d4e4abff61e656fa44b028e61ae1b1130d865744040" exitCode=0 Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.569407 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5ff467f-tptdl" event={"ID":"867cc47a-7f55-4690-85b7-62cc6c041b65","Type":"ContainerDied","Data":"c291cf9c8e87c37616072d4e4abff61e656fa44b028e61ae1b1130d865744040"} Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.569441 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5ff467f-tptdl" event={"ID":"867cc47a-7f55-4690-85b7-62cc6c041b65","Type":"ContainerStarted","Data":"b7396ed5da8bc65c53d14c3c516502267200117c6861e29f1d1ed366c6a39bbb"} Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.575234 4962 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/neutron-db-sync-gkbr2" event={"ID":"280fc068-9a62-474f-a81f-fc5a28a7e722","Type":"ContainerDied","Data":"1af960500f504b3d426cdf54816480888b94751f88b2beb5f99995f77cd833ce"} Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.575275 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1af960500f504b3d426cdf54816480888b94751f88b2beb5f99995f77cd833ce" Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.575307 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-gkbr2" Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.586069 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xkrxj" podStartSLOduration=1.939870647 podStartE2EDuration="15.586053142s" podCreationTimestamp="2025-10-03 13:13:11 +0000 UTC" firstStartedPulling="2025-10-03 13:13:12.38191987 +0000 UTC m=+1400.785817705" lastFinishedPulling="2025-10-03 13:13:26.028102365 +0000 UTC m=+1414.432000200" observedRunningTime="2025-10-03 13:13:26.583891344 +0000 UTC m=+1414.987789189" watchObservedRunningTime="2025-10-03 13:13:26.586053142 +0000 UTC m=+1414.989950977" Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.608820 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6b7fd754f4-rx9k9" podStartSLOduration=6.608800693 podStartE2EDuration="6.608800693s" podCreationTimestamp="2025-10-03 13:13:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:13:26.607159309 +0000 UTC m=+1415.011057144" watchObservedRunningTime="2025-10-03 13:13:26.608800693 +0000 UTC m=+1415.012698528" Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.741158 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59d5ff467f-tptdl"] Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.795928 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-7bckc"] Oct 03 13:13:26 crc kubenswrapper[4962]: E1003 13:13:26.796410 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="280fc068-9a62-474f-a81f-fc5a28a7e722" containerName="neutron-db-sync" Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.796427 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="280fc068-9a62-474f-a81f-fc5a28a7e722" containerName="neutron-db-sync" Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.797793 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="280fc068-9a62-474f-a81f-fc5a28a7e722" containerName="neutron-db-sync" Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.799241 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-7bckc" Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.810725 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-7bckc"] Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.843982 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-7bckc\" (UID: \"8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3\") " pod="openstack/dnsmasq-dns-75c8ddd69c-7bckc" Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.844034 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-7bckc\" (UID: \"8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3\") " pod="openstack/dnsmasq-dns-75c8ddd69c-7bckc" Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.844063 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-7bckc\" (UID: \"8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3\") " pod="openstack/dnsmasq-dns-75c8ddd69c-7bckc" Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.844124 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-7bckc\" (UID: \"8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3\") " pod="openstack/dnsmasq-dns-75c8ddd69c-7bckc" Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.844171 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3-config\") pod \"dnsmasq-dns-75c8ddd69c-7bckc\" (UID: \"8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3\") " pod="openstack/dnsmasq-dns-75c8ddd69c-7bckc" Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.844192 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkm2b\" (UniqueName: \"kubernetes.io/projected/8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3-kube-api-access-tkm2b\") pod \"dnsmasq-dns-75c8ddd69c-7bckc\" (UID: \"8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3\") " pod="openstack/dnsmasq-dns-75c8ddd69c-7bckc" Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.934984 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-776f46cfdd-zh4jf"] Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.937113 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-776f46cfdd-zh4jf" Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.945082 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.945164 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.945164 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-q6pcf" Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.945309 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.946134 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3-config\") pod \"dnsmasq-dns-75c8ddd69c-7bckc\" (UID: \"8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3\") " pod="openstack/dnsmasq-dns-75c8ddd69c-7bckc" Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.946164 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkm2b\" (UniqueName: \"kubernetes.io/projected/8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3-kube-api-access-tkm2b\") pod \"dnsmasq-dns-75c8ddd69c-7bckc\" (UID: \"8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3\") " pod="openstack/dnsmasq-dns-75c8ddd69c-7bckc" Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.946224 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-7bckc\" (UID: \"8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3\") " pod="openstack/dnsmasq-dns-75c8ddd69c-7bckc" Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.946247 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-7bckc\" (UID: \"8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3\") " pod="openstack/dnsmasq-dns-75c8ddd69c-7bckc" Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.946271 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-7bckc\" (UID: \"8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3\") " pod="openstack/dnsmasq-dns-75c8ddd69c-7bckc" Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.946327 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-7bckc\" (UID: \"8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3\") " pod="openstack/dnsmasq-dns-75c8ddd69c-7bckc" Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.947132 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-7bckc\" (UID: \"8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3\") " pod="openstack/dnsmasq-dns-75c8ddd69c-7bckc" Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.947472 4962 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-7bckc\" (UID: \"8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3\") " pod="openstack/dnsmasq-dns-75c8ddd69c-7bckc" Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.947895 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-7bckc\" (UID: \"8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3\") " pod="openstack/dnsmasq-dns-75c8ddd69c-7bckc" Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.948273 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-7bckc\" (UID: \"8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3\") " pod="openstack/dnsmasq-dns-75c8ddd69c-7bckc" Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.948333 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3-config\") pod \"dnsmasq-dns-75c8ddd69c-7bckc\" (UID: \"8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3\") " pod="openstack/dnsmasq-dns-75c8ddd69c-7bckc" Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.973473 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-776f46cfdd-zh4jf"] Oct 03 13:13:26 crc kubenswrapper[4962]: I1003 13:13:26.977567 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkm2b\" (UniqueName: \"kubernetes.io/projected/8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3-kube-api-access-tkm2b\") pod \"dnsmasq-dns-75c8ddd69c-7bckc\" (UID: \"8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3\") " pod="openstack/dnsmasq-dns-75c8ddd69c-7bckc" Oct 03 13:13:27 crc kubenswrapper[4962]: I1003 13:13:27.049924 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7f8103ea-25b6-4a47-bb63-ac20d8fef59f-httpd-config\") pod \"neutron-776f46cfdd-zh4jf\" (UID: \"7f8103ea-25b6-4a47-bb63-ac20d8fef59f\") " pod="openstack/neutron-776f46cfdd-zh4jf" Oct 03 13:13:27 crc kubenswrapper[4962]: I1003 13:13:27.050268 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrd48\" (UniqueName: \"kubernetes.io/projected/7f8103ea-25b6-4a47-bb63-ac20d8fef59f-kube-api-access-mrd48\") pod \"neutron-776f46cfdd-zh4jf\" (UID: \"7f8103ea-25b6-4a47-bb63-ac20d8fef59f\") " pod="openstack/neutron-776f46cfdd-zh4jf" Oct 03 13:13:27 crc kubenswrapper[4962]: I1003 13:13:27.050363 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f8103ea-25b6-4a47-bb63-ac20d8fef59f-combined-ca-bundle\") pod \"neutron-776f46cfdd-zh4jf\" (UID: \"7f8103ea-25b6-4a47-bb63-ac20d8fef59f\") " pod="openstack/neutron-776f46cfdd-zh4jf" Oct 03 13:13:27 crc kubenswrapper[4962]: I1003 13:13:27.050430 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f8103ea-25b6-4a47-bb63-ac20d8fef59f-ovndb-tls-certs\") pod \"neutron-776f46cfdd-zh4jf\" (UID: \"7f8103ea-25b6-4a47-bb63-ac20d8fef59f\") " pod="openstack/neutron-776f46cfdd-zh4jf" 
Oct 03 13:13:27 crc kubenswrapper[4962]: I1003 13:13:27.050497 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7f8103ea-25b6-4a47-bb63-ac20d8fef59f-config\") pod \"neutron-776f46cfdd-zh4jf\" (UID: \"7f8103ea-25b6-4a47-bb63-ac20d8fef59f\") " pod="openstack/neutron-776f46cfdd-zh4jf"
Oct 03 13:13:27 crc kubenswrapper[4962]: I1003 13:13:27.131136 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-7bckc"
Oct 03 13:13:27 crc kubenswrapper[4962]: I1003 13:13:27.152377 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7f8103ea-25b6-4a47-bb63-ac20d8fef59f-httpd-config\") pod \"neutron-776f46cfdd-zh4jf\" (UID: \"7f8103ea-25b6-4a47-bb63-ac20d8fef59f\") " pod="openstack/neutron-776f46cfdd-zh4jf"
Oct 03 13:13:27 crc kubenswrapper[4962]: I1003 13:13:27.152418 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrd48\" (UniqueName: \"kubernetes.io/projected/7f8103ea-25b6-4a47-bb63-ac20d8fef59f-kube-api-access-mrd48\") pod \"neutron-776f46cfdd-zh4jf\" (UID: \"7f8103ea-25b6-4a47-bb63-ac20d8fef59f\") " pod="openstack/neutron-776f46cfdd-zh4jf"
Oct 03 13:13:27 crc kubenswrapper[4962]: I1003 13:13:27.152473 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f8103ea-25b6-4a47-bb63-ac20d8fef59f-combined-ca-bundle\") pod \"neutron-776f46cfdd-zh4jf\" (UID: \"7f8103ea-25b6-4a47-bb63-ac20d8fef59f\") " pod="openstack/neutron-776f46cfdd-zh4jf"
Oct 03 13:13:27 crc kubenswrapper[4962]: I1003 13:13:27.152511 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f8103ea-25b6-4a47-bb63-ac20d8fef59f-ovndb-tls-certs\") pod \"neutron-776f46cfdd-zh4jf\" (UID: \"7f8103ea-25b6-4a47-bb63-ac20d8fef59f\") " pod="openstack/neutron-776f46cfdd-zh4jf"
Oct 03 13:13:27 crc kubenswrapper[4962]: I1003 13:13:27.152548 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7f8103ea-25b6-4a47-bb63-ac20d8fef59f-config\") pod \"neutron-776f46cfdd-zh4jf\" (UID: \"7f8103ea-25b6-4a47-bb63-ac20d8fef59f\") " pod="openstack/neutron-776f46cfdd-zh4jf"
Oct 03 13:13:27 crc kubenswrapper[4962]: I1003 13:13:27.156385 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7f8103ea-25b6-4a47-bb63-ac20d8fef59f-config\") pod \"neutron-776f46cfdd-zh4jf\" (UID: \"7f8103ea-25b6-4a47-bb63-ac20d8fef59f\") " pod="openstack/neutron-776f46cfdd-zh4jf"
Oct 03 13:13:27 crc kubenswrapper[4962]: I1003 13:13:27.158712 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7f8103ea-25b6-4a47-bb63-ac20d8fef59f-httpd-config\") pod \"neutron-776f46cfdd-zh4jf\" (UID: \"7f8103ea-25b6-4a47-bb63-ac20d8fef59f\") " pod="openstack/neutron-776f46cfdd-zh4jf"
Oct 03 13:13:27 crc kubenswrapper[4962]: I1003 13:13:27.158954 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f8103ea-25b6-4a47-bb63-ac20d8fef59f-combined-ca-bundle\") pod \"neutron-776f46cfdd-zh4jf\" (UID: \"7f8103ea-25b6-4a47-bb63-ac20d8fef59f\") " pod="openstack/neutron-776f46cfdd-zh4jf"
Oct 03 13:13:27 crc kubenswrapper[4962]: I1003 13:13:27.159058 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f8103ea-25b6-4a47-bb63-ac20d8fef59f-ovndb-tls-certs\") pod \"neutron-776f46cfdd-zh4jf\" (UID: \"7f8103ea-25b6-4a47-bb63-ac20d8fef59f\") " pod="openstack/neutron-776f46cfdd-zh4jf"
Oct 03 13:13:27 crc kubenswrapper[4962]: I1003 13:13:27.183339 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrd48\" (UniqueName: \"kubernetes.io/projected/7f8103ea-25b6-4a47-bb63-ac20d8fef59f-kube-api-access-mrd48\") pod \"neutron-776f46cfdd-zh4jf\" (UID: \"7f8103ea-25b6-4a47-bb63-ac20d8fef59f\") " pod="openstack/neutron-776f46cfdd-zh4jf"
Oct 03 13:13:27 crc kubenswrapper[4962]: I1003 13:13:27.313238 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-776f46cfdd-zh4jf"
Oct 03 13:13:28 crc kubenswrapper[4962]: I1003 13:13:28.223174 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-776f46cfdd-zh4jf"]
Oct 03 13:13:28 crc kubenswrapper[4962]: W1003 13:13:28.225466 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f8103ea_25b6_4a47_bb63_ac20d8fef59f.slice/crio-0da0ecda038060c509ea4dfa6696447c81bc7b662e1186efa916f5bfa4170ac7 WatchSource:0}: Error finding container 0da0ecda038060c509ea4dfa6696447c81bc7b662e1186efa916f5bfa4170ac7: Status 404 returned error can't find the container with id 0da0ecda038060c509ea4dfa6696447c81bc7b662e1186efa916f5bfa4170ac7
Oct 03 13:13:28 crc kubenswrapper[4962]: I1003 13:13:28.275737 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-7bckc"]
Oct 03 13:13:28 crc kubenswrapper[4962]: I1003 13:13:28.621212 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-77d888f4df-52rjc" event={"ID":"2ecb3944-c441-4879-8220-aa32d7436c1f","Type":"ContainerStarted","Data":"eddb0fb647a83d84a8dbf9083c9e196483cf75a4bb08e36e17e23de03fd5c34b"}
Oct 03 13:13:28 crc kubenswrapper[4962]: I1003 13:13:28.621299 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-77d888f4df-52rjc" event={"ID":"2ecb3944-c441-4879-8220-aa32d7436c1f","Type":"ContainerStarted","Data":"b5849a5ee0085259fce2aeb3f161a304d34082d7cf3935fa4904ae128a9bdf59"}
Oct 03 13:13:28 crc kubenswrapper[4962]: I1003 13:13:28.622883 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-7bckc" event={"ID":"8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3","Type":"ContainerStarted","Data":"bb63da13d1a54840396a2eec391baf02ae9def0051aa74c37979642fc4fce534"}
Oct 03 13:13:28 crc kubenswrapper[4962]: I1003 13:13:28.622923 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-7bckc" event={"ID":"8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3","Type":"ContainerStarted","Data":"3a3ed165053e416ce7af646f5dd905837b0044e5735b08ca08878c6b4e5521e3"}
Oct 03 13:13:28 crc kubenswrapper[4962]: I1003 13:13:28.625578 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-bd6989694-qnv2s" event={"ID":"3c111271-43ed-48b3-b6ed-a6d02efb9113","Type":"ContainerStarted","Data":"31ec554c86926fa60a6d1b72601dfc4ca004a83f21bd45f79846d05688388ebf"}
Oct 03 13:13:28 crc kubenswrapper[4962]: I1003 13:13:28.625617 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-bd6989694-qnv2s" event={"ID":"3c111271-43ed-48b3-b6ed-a6d02efb9113","Type":"ContainerStarted","Data":"86b2ecabe6ed78973d278885e95f62a78704e8d2b70f094254e52932b6e6c618"}
pod" pod="openstack/barbican-keystone-listener-bd6989694-qnv2s" event={"ID":"3c111271-43ed-48b3-b6ed-a6d02efb9113","Type":"ContainerStarted","Data":"86b2ecabe6ed78973d278885e95f62a78704e8d2b70f094254e52932b6e6c618"} Oct 03 13:13:28 crc kubenswrapper[4962]: I1003 13:13:28.628184 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-776f46cfdd-zh4jf" event={"ID":"7f8103ea-25b6-4a47-bb63-ac20d8fef59f","Type":"ContainerStarted","Data":"f40f36c7eaa6f57aaa9cbedd298463d367ad74213b43b55760fc04d0f2ef5d60"} Oct 03 13:13:28 crc kubenswrapper[4962]: I1003 13:13:28.628237 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-776f46cfdd-zh4jf" event={"ID":"7f8103ea-25b6-4a47-bb63-ac20d8fef59f","Type":"ContainerStarted","Data":"0da0ecda038060c509ea4dfa6696447c81bc7b662e1186efa916f5bfa4170ac7"} Oct 03 13:13:28 crc kubenswrapper[4962]: I1003 13:13:28.631598 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59d5ff467f-tptdl" podUID="867cc47a-7f55-4690-85b7-62cc6c041b65" containerName="dnsmasq-dns" containerID="cri-o://3b4f40c0b0fc6bf74fc8f274313da33e76ae4bda31b1827b3712e590554cc810" gracePeriod=10 Oct 03 13:13:28 crc kubenswrapper[4962]: I1003 13:13:28.631895 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5ff467f-tptdl" event={"ID":"867cc47a-7f55-4690-85b7-62cc6c041b65","Type":"ContainerStarted","Data":"3b4f40c0b0fc6bf74fc8f274313da33e76ae4bda31b1827b3712e590554cc810"} Oct 03 13:13:28 crc kubenswrapper[4962]: I1003 13:13:28.631936 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59d5ff467f-tptdl" Oct 03 13:13:28 crc kubenswrapper[4962]: I1003 13:13:28.653835 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-77d888f4df-52rjc" podStartSLOduration=11.633306741 podStartE2EDuration="13.653817411s" podCreationTimestamp="2025-10-03 13:13:15 +0000 UTC" firstStartedPulling="2025-10-03 13:13:25.538099044 +0000 UTC m=+1413.941996879" lastFinishedPulling="2025-10-03 13:13:27.558609714 +0000 UTC m=+1415.962507549" observedRunningTime="2025-10-03 13:13:28.641993084 +0000 UTC m=+1417.045890919" watchObservedRunningTime="2025-10-03 13:13:28.653817411 +0000 UTC m=+1417.057715256" Oct 03 13:13:28 crc kubenswrapper[4962]: I1003 13:13:28.668308 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59d5ff467f-tptdl" podStartSLOduration=13.66829111 podStartE2EDuration="13.66829111s" podCreationTimestamp="2025-10-03 13:13:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:13:28.667009646 +0000 UTC m=+1417.070907481" watchObservedRunningTime="2025-10-03 13:13:28.66829111 +0000 UTC m=+1417.072188945" Oct 03 13:13:28 crc kubenswrapper[4962]: I1003 13:13:28.721093 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-bd6989694-qnv2s" podStartSLOduration=11.761932586 podStartE2EDuration="13.721079778s" podCreationTimestamp="2025-10-03 13:13:15 +0000 UTC" firstStartedPulling="2025-10-03 13:13:25.624472714 +0000 UTC m=+1414.028370549" lastFinishedPulling="2025-10-03 13:13:27.583619906 +0000 UTC m=+1415.987517741" observedRunningTime="2025-10-03 13:13:28.719946408 +0000 UTC m=+1417.123844243" watchObservedRunningTime="2025-10-03 13:13:28.721079778 +0000 UTC m=+1417.124977613" Oct 03 13:13:29 crc 
kubenswrapper[4962]: I1003 13:13:29.468227 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59d5ff467f-tptdl" Oct 03 13:13:29 crc kubenswrapper[4962]: I1003 13:13:29.616086 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/867cc47a-7f55-4690-85b7-62cc6c041b65-dns-swift-storage-0\") pod \"867cc47a-7f55-4690-85b7-62cc6c041b65\" (UID: \"867cc47a-7f55-4690-85b7-62cc6c041b65\") " Oct 03 13:13:29 crc kubenswrapper[4962]: I1003 13:13:29.616257 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gckb\" (UniqueName: \"kubernetes.io/projected/867cc47a-7f55-4690-85b7-62cc6c041b65-kube-api-access-5gckb\") pod \"867cc47a-7f55-4690-85b7-62cc6c041b65\" (UID: \"867cc47a-7f55-4690-85b7-62cc6c041b65\") " Oct 03 13:13:29 crc kubenswrapper[4962]: I1003 13:13:29.616319 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/867cc47a-7f55-4690-85b7-62cc6c041b65-ovsdbserver-nb\") pod \"867cc47a-7f55-4690-85b7-62cc6c041b65\" (UID: \"867cc47a-7f55-4690-85b7-62cc6c041b65\") " Oct 03 13:13:29 crc kubenswrapper[4962]: I1003 13:13:29.616340 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/867cc47a-7f55-4690-85b7-62cc6c041b65-config\") pod \"867cc47a-7f55-4690-85b7-62cc6c041b65\" (UID: \"867cc47a-7f55-4690-85b7-62cc6c041b65\") " Oct 03 13:13:29 crc kubenswrapper[4962]: I1003 13:13:29.616376 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/867cc47a-7f55-4690-85b7-62cc6c041b65-ovsdbserver-sb\") pod \"867cc47a-7f55-4690-85b7-62cc6c041b65\" (UID: \"867cc47a-7f55-4690-85b7-62cc6c041b65\") " Oct 03 13:13:29 crc kubenswrapper[4962]: I1003 13:13:29.616413 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/867cc47a-7f55-4690-85b7-62cc6c041b65-dns-svc\") pod \"867cc47a-7f55-4690-85b7-62cc6c041b65\" (UID: \"867cc47a-7f55-4690-85b7-62cc6c041b65\") " Oct 03 13:13:29 crc kubenswrapper[4962]: I1003 13:13:29.627944 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/867cc47a-7f55-4690-85b7-62cc6c041b65-kube-api-access-5gckb" (OuterVolumeSpecName: "kube-api-access-5gckb") pod "867cc47a-7f55-4690-85b7-62cc6c041b65" (UID: "867cc47a-7f55-4690-85b7-62cc6c041b65"). InnerVolumeSpecName "kube-api-access-5gckb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:13:29 crc kubenswrapper[4962]: I1003 13:13:29.640937 4962 generic.go:334] "Generic (PLEG): container finished" podID="8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3" containerID="bb63da13d1a54840396a2eec391baf02ae9def0051aa74c37979642fc4fce534" exitCode=0 Oct 03 13:13:29 crc kubenswrapper[4962]: I1003 13:13:29.641035 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-7bckc" event={"ID":"8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3","Type":"ContainerDied","Data":"bb63da13d1a54840396a2eec391baf02ae9def0051aa74c37979642fc4fce534"} Oct 03 13:13:29 crc kubenswrapper[4962]: I1003 13:13:29.645340 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-776f46cfdd-zh4jf" event={"ID":"7f8103ea-25b6-4a47-bb63-ac20d8fef59f","Type":"ContainerStarted","Data":"8c0d869ccc5f4abe51fca3df3ad68d6f1cb3c10be54e7c36d072544561e07a65"} Oct 03 13:13:29 crc kubenswrapper[4962]: I1003 13:13:29.646092 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-776f46cfdd-zh4jf" Oct 03 13:13:29 crc kubenswrapper[4962]: I1003 13:13:29.647647 4962 generic.go:334] "Generic (PLEG): container finished" podID="867cc47a-7f55-4690-85b7-62cc6c041b65" containerID="3b4f40c0b0fc6bf74fc8f274313da33e76ae4bda31b1827b3712e590554cc810" exitCode=0 Oct 03 13:13:29 crc kubenswrapper[4962]: I1003 13:13:29.648104 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59d5ff467f-tptdl" Oct 03 13:13:29 crc kubenswrapper[4962]: I1003 13:13:29.648245 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5ff467f-tptdl" event={"ID":"867cc47a-7f55-4690-85b7-62cc6c041b65","Type":"ContainerDied","Data":"3b4f40c0b0fc6bf74fc8f274313da33e76ae4bda31b1827b3712e590554cc810"} Oct 03 13:13:29 crc kubenswrapper[4962]: I1003 13:13:29.648267 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5ff467f-tptdl" event={"ID":"867cc47a-7f55-4690-85b7-62cc6c041b65","Type":"ContainerDied","Data":"b7396ed5da8bc65c53d14c3c516502267200117c6861e29f1d1ed366c6a39bbb"} Oct 03 13:13:29 crc kubenswrapper[4962]: I1003 13:13:29.648282 4962 scope.go:117] "RemoveContainer" containerID="3b4f40c0b0fc6bf74fc8f274313da33e76ae4bda31b1827b3712e590554cc810" Oct 03 13:13:29 crc kubenswrapper[4962]: I1003 13:13:29.688736 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-776f46cfdd-zh4jf" podStartSLOduration=3.688717219 podStartE2EDuration="3.688717219s" podCreationTimestamp="2025-10-03 13:13:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:13:29.679507742 +0000 UTC m=+1418.083405567" watchObservedRunningTime="2025-10-03 13:13:29.688717219 +0000 UTC m=+1418.092615054" Oct 03 13:13:29 crc kubenswrapper[4962]: I1003 13:13:29.692430 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/867cc47a-7f55-4690-85b7-62cc6c041b65-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "867cc47a-7f55-4690-85b7-62cc6c041b65" (UID: "867cc47a-7f55-4690-85b7-62cc6c041b65"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:13:29 crc kubenswrapper[4962]: I1003 13:13:29.692460 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/867cc47a-7f55-4690-85b7-62cc6c041b65-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "867cc47a-7f55-4690-85b7-62cc6c041b65" (UID: "867cc47a-7f55-4690-85b7-62cc6c041b65"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:13:29 crc kubenswrapper[4962]: I1003 13:13:29.703170 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/867cc47a-7f55-4690-85b7-62cc6c041b65-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "867cc47a-7f55-4690-85b7-62cc6c041b65" (UID: "867cc47a-7f55-4690-85b7-62cc6c041b65"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:13:29 crc kubenswrapper[4962]: I1003 13:13:29.705789 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/867cc47a-7f55-4690-85b7-62cc6c041b65-config" (OuterVolumeSpecName: "config") pod "867cc47a-7f55-4690-85b7-62cc6c041b65" (UID: "867cc47a-7f55-4690-85b7-62cc6c041b65"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:13:29 crc kubenswrapper[4962]: I1003 13:13:29.722065 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/867cc47a-7f55-4690-85b7-62cc6c041b65-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:29 crc kubenswrapper[4962]: I1003 13:13:29.722094 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/867cc47a-7f55-4690-85b7-62cc6c041b65-config\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:29 crc kubenswrapper[4962]: I1003 13:13:29.722104 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/867cc47a-7f55-4690-85b7-62cc6c041b65-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:29 crc kubenswrapper[4962]: I1003 13:13:29.722113 4962 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/867cc47a-7f55-4690-85b7-62cc6c041b65-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:29 crc kubenswrapper[4962]: I1003 13:13:29.722124 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gckb\" (UniqueName: \"kubernetes.io/projected/867cc47a-7f55-4690-85b7-62cc6c041b65-kube-api-access-5gckb\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:29 crc kubenswrapper[4962]: I1003 13:13:29.746158 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/867cc47a-7f55-4690-85b7-62cc6c041b65-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "867cc47a-7f55-4690-85b7-62cc6c041b65" (UID: "867cc47a-7f55-4690-85b7-62cc6c041b65"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:13:29 crc kubenswrapper[4962]: I1003 13:13:29.760448 4962 scope.go:117] "RemoveContainer" containerID="c291cf9c8e87c37616072d4e4abff61e656fa44b028e61ae1b1130d865744040" Oct 03 13:13:29 crc kubenswrapper[4962]: I1003 13:13:29.803320 4962 scope.go:117] "RemoveContainer" containerID="3b4f40c0b0fc6bf74fc8f274313da33e76ae4bda31b1827b3712e590554cc810" Oct 03 13:13:29 crc kubenswrapper[4962]: E1003 13:13:29.804072 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b4f40c0b0fc6bf74fc8f274313da33e76ae4bda31b1827b3712e590554cc810\": container with ID starting with 3b4f40c0b0fc6bf74fc8f274313da33e76ae4bda31b1827b3712e590554cc810 not found: ID does not exist" containerID="3b4f40c0b0fc6bf74fc8f274313da33e76ae4bda31b1827b3712e590554cc810" Oct 03 13:13:29 crc kubenswrapper[4962]: I1003 13:13:29.804125 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b4f40c0b0fc6bf74fc8f274313da33e76ae4bda31b1827b3712e590554cc810"} err="failed to get container status \"3b4f40c0b0fc6bf74fc8f274313da33e76ae4bda31b1827b3712e590554cc810\": rpc error: code = NotFound desc = could not find container \"3b4f40c0b0fc6bf74fc8f274313da33e76ae4bda31b1827b3712e590554cc810\": container with ID starting with 3b4f40c0b0fc6bf74fc8f274313da33e76ae4bda31b1827b3712e590554cc810 not found: ID does not exist" Oct 03 13:13:29 crc kubenswrapper[4962]: I1003 13:13:29.804163 4962 scope.go:117] "RemoveContainer" containerID="c291cf9c8e87c37616072d4e4abff61e656fa44b028e61ae1b1130d865744040" Oct 03 13:13:29 crc kubenswrapper[4962]: E1003 13:13:29.804456 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c291cf9c8e87c37616072d4e4abff61e656fa44b028e61ae1b1130d865744040\": container with ID starting with c291cf9c8e87c37616072d4e4abff61e656fa44b028e61ae1b1130d865744040 not found: ID does not exist" containerID="c291cf9c8e87c37616072d4e4abff61e656fa44b028e61ae1b1130d865744040" Oct 03 13:13:29 crc kubenswrapper[4962]: I1003 13:13:29.804485 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c291cf9c8e87c37616072d4e4abff61e656fa44b028e61ae1b1130d865744040"} err="failed to get container status \"c291cf9c8e87c37616072d4e4abff61e656fa44b028e61ae1b1130d865744040\": rpc error: code = NotFound desc = could not find container \"c291cf9c8e87c37616072d4e4abff61e656fa44b028e61ae1b1130d865744040\": container with ID starting with c291cf9c8e87c37616072d4e4abff61e656fa44b028e61ae1b1130d865744040 not found: ID does not exist" Oct 03 13:13:29 crc kubenswrapper[4962]: I1003 13:13:29.824050 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/867cc47a-7f55-4690-85b7-62cc6c041b65-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:29 crc kubenswrapper[4962]: I1003 13:13:29.979308 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59d5ff467f-tptdl"] Oct 03 13:13:29 crc kubenswrapper[4962]: I1003 13:13:29.987147 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59d5ff467f-tptdl"] Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.079742 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5f745c6cff-9rkw7"] Oct 03 13:13:30 crc kubenswrapper[4962]: E1003 13:13:30.080219 4962 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="867cc47a-7f55-4690-85b7-62cc6c041b65" containerName="init" Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.080244 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="867cc47a-7f55-4690-85b7-62cc6c041b65" containerName="init" Oct 03 13:13:30 crc kubenswrapper[4962]: E1003 13:13:30.080268 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="867cc47a-7f55-4690-85b7-62cc6c041b65" containerName="dnsmasq-dns" Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.080276 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="867cc47a-7f55-4690-85b7-62cc6c041b65" containerName="dnsmasq-dns" Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.080624 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="867cc47a-7f55-4690-85b7-62cc6c041b65" containerName="dnsmasq-dns" Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.081901 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5f745c6cff-9rkw7" Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.090805 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5f745c6cff-9rkw7"] Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.097813 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.098676 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.233455 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40dc7e17-4436-4452-a266-65d57a67779d-combined-ca-bundle\") pod \"neutron-5f745c6cff-9rkw7\" (UID: \"40dc7e17-4436-4452-a266-65d57a67779d\") " pod="openstack/neutron-5f745c6cff-9rkw7" Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.233504 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/40dc7e17-4436-4452-a266-65d57a67779d-internal-tls-certs\") pod \"neutron-5f745c6cff-9rkw7\" (UID: \"40dc7e17-4436-4452-a266-65d57a67779d\") " pod="openstack/neutron-5f745c6cff-9rkw7" Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.233535 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/40dc7e17-4436-4452-a266-65d57a67779d-config\") pod \"neutron-5f745c6cff-9rkw7\" (UID: \"40dc7e17-4436-4452-a266-65d57a67779d\") " pod="openstack/neutron-5f745c6cff-9rkw7" Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.233552 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dm8xw\" (UniqueName: \"kubernetes.io/projected/40dc7e17-4436-4452-a266-65d57a67779d-kube-api-access-dm8xw\") pod \"neutron-5f745c6cff-9rkw7\" (UID: \"40dc7e17-4436-4452-a266-65d57a67779d\") " pod="openstack/neutron-5f745c6cff-9rkw7" Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.233569 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/40dc7e17-4436-4452-a266-65d57a67779d-httpd-config\") pod \"neutron-5f745c6cff-9rkw7\" (UID: \"40dc7e17-4436-4452-a266-65d57a67779d\") " pod="openstack/neutron-5f745c6cff-9rkw7" Oct 
Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.233654 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/40dc7e17-4436-4452-a266-65d57a67779d-public-tls-certs\") pod \"neutron-5f745c6cff-9rkw7\" (UID: \"40dc7e17-4436-4452-a266-65d57a67779d\") " pod="openstack/neutron-5f745c6cff-9rkw7"
Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.233774 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/40dc7e17-4436-4452-a266-65d57a67779d-ovndb-tls-certs\") pod \"neutron-5f745c6cff-9rkw7\" (UID: \"40dc7e17-4436-4452-a266-65d57a67779d\") " pod="openstack/neutron-5f745c6cff-9rkw7"
Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.238946 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="867cc47a-7f55-4690-85b7-62cc6c041b65" path="/var/lib/kubelet/pods/867cc47a-7f55-4690-85b7-62cc6c041b65/volumes"
Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.335674 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/40dc7e17-4436-4452-a266-65d57a67779d-internal-tls-certs\") pod \"neutron-5f745c6cff-9rkw7\" (UID: \"40dc7e17-4436-4452-a266-65d57a67779d\") " pod="openstack/neutron-5f745c6cff-9rkw7"
Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.335760 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/40dc7e17-4436-4452-a266-65d57a67779d-config\") pod \"neutron-5f745c6cff-9rkw7\" (UID: \"40dc7e17-4436-4452-a266-65d57a67779d\") " pod="openstack/neutron-5f745c6cff-9rkw7"
Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.335811 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dm8xw\" (UniqueName: \"kubernetes.io/projected/40dc7e17-4436-4452-a266-65d57a67779d-kube-api-access-dm8xw\") pod \"neutron-5f745c6cff-9rkw7\" (UID: \"40dc7e17-4436-4452-a266-65d57a67779d\") " pod="openstack/neutron-5f745c6cff-9rkw7"
Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.335838 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/40dc7e17-4436-4452-a266-65d57a67779d-httpd-config\") pod \"neutron-5f745c6cff-9rkw7\" (UID: \"40dc7e17-4436-4452-a266-65d57a67779d\") " pod="openstack/neutron-5f745c6cff-9rkw7"
Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.335926 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/40dc7e17-4436-4452-a266-65d57a67779d-public-tls-certs\") pod \"neutron-5f745c6cff-9rkw7\" (UID: \"40dc7e17-4436-4452-a266-65d57a67779d\") " pod="openstack/neutron-5f745c6cff-9rkw7"
Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.335949 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/40dc7e17-4436-4452-a266-65d57a67779d-ovndb-tls-certs\") pod \"neutron-5f745c6cff-9rkw7\" (UID: \"40dc7e17-4436-4452-a266-65d57a67779d\") " pod="openstack/neutron-5f745c6cff-9rkw7"
Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.336014 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40dc7e17-4436-4452-a266-65d57a67779d-combined-ca-bundle\") pod \"neutron-5f745c6cff-9rkw7\" (UID: \"40dc7e17-4436-4452-a266-65d57a67779d\") " pod="openstack/neutron-5f745c6cff-9rkw7"
Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.340990 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/40dc7e17-4436-4452-a266-65d57a67779d-ovndb-tls-certs\") pod \"neutron-5f745c6cff-9rkw7\" (UID: \"40dc7e17-4436-4452-a266-65d57a67779d\") " pod="openstack/neutron-5f745c6cff-9rkw7"
Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.341130 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/40dc7e17-4436-4452-a266-65d57a67779d-httpd-config\") pod \"neutron-5f745c6cff-9rkw7\" (UID: \"40dc7e17-4436-4452-a266-65d57a67779d\") " pod="openstack/neutron-5f745c6cff-9rkw7"
Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.341310 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/40dc7e17-4436-4452-a266-65d57a67779d-config\") pod \"neutron-5f745c6cff-9rkw7\" (UID: \"40dc7e17-4436-4452-a266-65d57a67779d\") " pod="openstack/neutron-5f745c6cff-9rkw7"
Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.344410 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/40dc7e17-4436-4452-a266-65d57a67779d-internal-tls-certs\") pod \"neutron-5f745c6cff-9rkw7\" (UID: \"40dc7e17-4436-4452-a266-65d57a67779d\") " pod="openstack/neutron-5f745c6cff-9rkw7"
Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.344805 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40dc7e17-4436-4452-a266-65d57a67779d-combined-ca-bundle\") pod \"neutron-5f745c6cff-9rkw7\" (UID: \"40dc7e17-4436-4452-a266-65d57a67779d\") " pod="openstack/neutron-5f745c6cff-9rkw7"
Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.347318 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/40dc7e17-4436-4452-a266-65d57a67779d-public-tls-certs\") pod \"neutron-5f745c6cff-9rkw7\" (UID: \"40dc7e17-4436-4452-a266-65d57a67779d\") " pod="openstack/neutron-5f745c6cff-9rkw7"
Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.352077 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dm8xw\" (UniqueName: \"kubernetes.io/projected/40dc7e17-4436-4452-a266-65d57a67779d-kube-api-access-dm8xw\") pod \"neutron-5f745c6cff-9rkw7\" (UID: \"40dc7e17-4436-4452-a266-65d57a67779d\") " pod="openstack/neutron-5f745c6cff-9rkw7"
Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.428368 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5f745c6cff-9rkw7"
Need to start a new one" pod="openstack/neutron-5f745c6cff-9rkw7" Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.672388 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-7bckc" event={"ID":"8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3","Type":"ContainerStarted","Data":"f749546d6477023e3f3a7cd7784bffbbe59ed5dc3858698a822779923a8a22eb"} Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.673481 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75c8ddd69c-7bckc" Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.677199 4962 generic.go:334] "Generic (PLEG): container finished" podID="c09e1e8d-9d15-4eef-a553-2af7c59998e3" containerID="4fc87af0adc8de378b2e85921a90879f9ab1b63fd92d450de2af297793e6c95b" exitCode=0 Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.677260 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c09e1e8d-9d15-4eef-a553-2af7c59998e3","Type":"ContainerDied","Data":"4fc87af0adc8de378b2e85921a90879f9ab1b63fd92d450de2af297793e6c95b"} Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.703843 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75c8ddd69c-7bckc" podStartSLOduration=4.703822564 podStartE2EDuration="4.703822564s" podCreationTimestamp="2025-10-03 13:13:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:13:30.693912198 +0000 UTC m=+1419.097810043" watchObservedRunningTime="2025-10-03 13:13:30.703822564 +0000 UTC m=+1419.107720389" Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.834919 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.954498 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c09e1e8d-9d15-4eef-a553-2af7c59998e3-combined-ca-bundle\") pod \"c09e1e8d-9d15-4eef-a553-2af7c59998e3\" (UID: \"c09e1e8d-9d15-4eef-a553-2af7c59998e3\") " Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.954629 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c09e1e8d-9d15-4eef-a553-2af7c59998e3-log-httpd\") pod \"c09e1e8d-9d15-4eef-a553-2af7c59998e3\" (UID: \"c09e1e8d-9d15-4eef-a553-2af7c59998e3\") " Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.954732 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c09e1e8d-9d15-4eef-a553-2af7c59998e3-scripts\") pod \"c09e1e8d-9d15-4eef-a553-2af7c59998e3\" (UID: \"c09e1e8d-9d15-4eef-a553-2af7c59998e3\") " Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.954767 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngmnt\" (UniqueName: \"kubernetes.io/projected/c09e1e8d-9d15-4eef-a553-2af7c59998e3-kube-api-access-ngmnt\") pod \"c09e1e8d-9d15-4eef-a553-2af7c59998e3\" (UID: \"c09e1e8d-9d15-4eef-a553-2af7c59998e3\") " Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.954941 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c09e1e8d-9d15-4eef-a553-2af7c59998e3-config-data\") pod \"c09e1e8d-9d15-4eef-a553-2af7c59998e3\" (UID: \"c09e1e8d-9d15-4eef-a553-2af7c59998e3\") " Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.954970 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c09e1e8d-9d15-4eef-a553-2af7c59998e3-run-httpd\") pod \"c09e1e8d-9d15-4eef-a553-2af7c59998e3\" (UID: \"c09e1e8d-9d15-4eef-a553-2af7c59998e3\") " Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.955002 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c09e1e8d-9d15-4eef-a553-2af7c59998e3-sg-core-conf-yaml\") pod \"c09e1e8d-9d15-4eef-a553-2af7c59998e3\" (UID: \"c09e1e8d-9d15-4eef-a553-2af7c59998e3\") " Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.955147 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c09e1e8d-9d15-4eef-a553-2af7c59998e3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c09e1e8d-9d15-4eef-a553-2af7c59998e3" (UID: "c09e1e8d-9d15-4eef-a553-2af7c59998e3"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.955342 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c09e1e8d-9d15-4eef-a553-2af7c59998e3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c09e1e8d-9d15-4eef-a553-2af7c59998e3" (UID: "c09e1e8d-9d15-4eef-a553-2af7c59998e3"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.955624 4962 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c09e1e8d-9d15-4eef-a553-2af7c59998e3-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.955664 4962 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c09e1e8d-9d15-4eef-a553-2af7c59998e3-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.959131 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c09e1e8d-9d15-4eef-a553-2af7c59998e3-kube-api-access-ngmnt" (OuterVolumeSpecName: "kube-api-access-ngmnt") pod "c09e1e8d-9d15-4eef-a553-2af7c59998e3" (UID: "c09e1e8d-9d15-4eef-a553-2af7c59998e3"). InnerVolumeSpecName "kube-api-access-ngmnt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.959716 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c09e1e8d-9d15-4eef-a553-2af7c59998e3-scripts" (OuterVolumeSpecName: "scripts") pod "c09e1e8d-9d15-4eef-a553-2af7c59998e3" (UID: "c09e1e8d-9d15-4eef-a553-2af7c59998e3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:13:30 crc kubenswrapper[4962]: I1003 13:13:30.982750 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c09e1e8d-9d15-4eef-a553-2af7c59998e3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c09e1e8d-9d15-4eef-a553-2af7c59998e3" (UID: "c09e1e8d-9d15-4eef-a553-2af7c59998e3"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.010031 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c09e1e8d-9d15-4eef-a553-2af7c59998e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c09e1e8d-9d15-4eef-a553-2af7c59998e3" (UID: "c09e1e8d-9d15-4eef-a553-2af7c59998e3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.015142 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c09e1e8d-9d15-4eef-a553-2af7c59998e3-config-data" (OuterVolumeSpecName: "config-data") pod "c09e1e8d-9d15-4eef-a553-2af7c59998e3" (UID: "c09e1e8d-9d15-4eef-a553-2af7c59998e3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.057208 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c09e1e8d-9d15-4eef-a553-2af7c59998e3-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.057253 4962 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c09e1e8d-9d15-4eef-a553-2af7c59998e3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.057268 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c09e1e8d-9d15-4eef-a553-2af7c59998e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.057281 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c09e1e8d-9d15-4eef-a553-2af7c59998e3-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.057293 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngmnt\" (UniqueName: \"kubernetes.io/projected/c09e1e8d-9d15-4eef-a553-2af7c59998e3-kube-api-access-ngmnt\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.092168 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5f745c6cff-9rkw7"] Oct 03 13:13:31 crc kubenswrapper[4962]: W1003 13:13:31.092579 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40dc7e17_4436_4452_a266_65d57a67779d.slice/crio-0b998c5e68a1ddb1aa3aa627a98d80ddd737d19e65fa0642ea22a117e443ffec WatchSource:0}: Error finding container 0b998c5e68a1ddb1aa3aa627a98d80ddd737d19e65fa0642ea22a117e443ffec: Status 404 returned error can't find the container with id 0b998c5e68a1ddb1aa3aa627a98d80ddd737d19e65fa0642ea22a117e443ffec Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.396769 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xkrxj" Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.397023 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xkrxj" Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.504277 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xkrxj" Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.695726 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c09e1e8d-9d15-4eef-a553-2af7c59998e3","Type":"ContainerDied","Data":"90d5adf3763d27a649c2af61fad8cebac503244abf7c1c477777b6d017c259f7"} Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.695783 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.696956 4962 scope.go:117] "RemoveContainer" containerID="2a9a5082cb69eea6f1efba2560d7c088a1572ce55dbf4bcf4ca90d2ed0560744" Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.709692 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5f745c6cff-9rkw7" event={"ID":"40dc7e17-4436-4452-a266-65d57a67779d","Type":"ContainerStarted","Data":"5d3d1dc44ccbb08890a3ce1b240bf10ed759ba813d70174df4fde7b16fbc8eff"} Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.710006 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5f745c6cff-9rkw7" event={"ID":"40dc7e17-4436-4452-a266-65d57a67779d","Type":"ContainerStarted","Data":"262e24f9113e8184b611ad4bd820a4085b8b793192569d8c58e7e70a54d8433c"} Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.710099 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5f745c6cff-9rkw7" event={"ID":"40dc7e17-4436-4452-a266-65d57a67779d","Type":"ContainerStarted","Data":"0b998c5e68a1ddb1aa3aa627a98d80ddd737d19e65fa0642ea22a117e443ffec"} Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.710978 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5f745c6cff-9rkw7" Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.733179 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5f745c6cff-9rkw7" podStartSLOduration=1.733160872 podStartE2EDuration="1.733160872s" podCreationTimestamp="2025-10-03 13:13:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:13:31.725056265 +0000 UTC m=+1420.128954100" watchObservedRunningTime="2025-10-03 13:13:31.733160872 +0000 UTC m=+1420.137058707" Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.740453 4962 scope.go:117] "RemoveContainer" containerID="4fc87af0adc8de378b2e85921a90879f9ab1b63fd92d450de2af297793e6c95b" Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.774877 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.796870 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.799511 4962 scope.go:117] "RemoveContainer" containerID="922457fe5519a12a1f756b9f0601dc6bb170b210e8c7fcb0a9bfa22bb5817803" Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.805889 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 13:13:31 crc kubenswrapper[4962]: E1003 13:13:31.806440 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c09e1e8d-9d15-4eef-a553-2af7c59998e3" containerName="ceilometer-central-agent" Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.806515 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="c09e1e8d-9d15-4eef-a553-2af7c59998e3" containerName="ceilometer-central-agent" Oct 03 13:13:31 crc kubenswrapper[4962]: E1003 13:13:31.806602 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c09e1e8d-9d15-4eef-a553-2af7c59998e3" containerName="ceilometer-notification-agent" Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.806703 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="c09e1e8d-9d15-4eef-a553-2af7c59998e3" containerName="ceilometer-notification-agent" Oct 03 
13:13:31 crc kubenswrapper[4962]: E1003 13:13:31.806782 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c09e1e8d-9d15-4eef-a553-2af7c59998e3" containerName="sg-core" Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.806838 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="c09e1e8d-9d15-4eef-a553-2af7c59998e3" containerName="sg-core" Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.807064 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="c09e1e8d-9d15-4eef-a553-2af7c59998e3" containerName="ceilometer-notification-agent" Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.807138 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="c09e1e8d-9d15-4eef-a553-2af7c59998e3" containerName="sg-core" Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.807202 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="c09e1e8d-9d15-4eef-a553-2af7c59998e3" containerName="ceilometer-central-agent" Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.808864 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.812145 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.816137 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.816351 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.816357 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xkrxj" Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.874814 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppjqs\" (UniqueName: \"kubernetes.io/projected/b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe-kube-api-access-ppjqs\") pod \"ceilometer-0\" (UID: \"b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe\") " pod="openstack/ceilometer-0" Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.874868 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe\") " pod="openstack/ceilometer-0" Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.874950 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe-config-data\") pod \"ceilometer-0\" (UID: \"b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe\") " pod="openstack/ceilometer-0" Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.874994 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe\") " pod="openstack/ceilometer-0" Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.875685 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe-scripts\") pod \"ceilometer-0\" (UID: \"b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe\") " pod="openstack/ceilometer-0" Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.875731 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe-log-httpd\") pod \"ceilometer-0\" (UID: \"b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe\") " pod="openstack/ceilometer-0" Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.875780 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe-run-httpd\") pod \"ceilometer-0\" (UID: \"b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe\") " pod="openstack/ceilometer-0" Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.892727 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xkrxj"] Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.977373 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe-log-httpd\") pod \"ceilometer-0\" (UID: \"b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe\") " pod="openstack/ceilometer-0" Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.977425 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe-run-httpd\") pod \"ceilometer-0\" (UID: \"b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe\") " pod="openstack/ceilometer-0" Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.977444 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe\") " pod="openstack/ceilometer-0" Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.977460 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppjqs\" (UniqueName: \"kubernetes.io/projected/b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe-kube-api-access-ppjqs\") pod \"ceilometer-0\" (UID: \"b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe\") " pod="openstack/ceilometer-0" Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.977531 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe-config-data\") pod \"ceilometer-0\" (UID: \"b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe\") " pod="openstack/ceilometer-0" Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.977559 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe\") " pod="openstack/ceilometer-0" Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.977635 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe-scripts\") pod \"ceilometer-0\" (UID: \"b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe\") " pod="openstack/ceilometer-0" Oct 
03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.977876 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe-log-httpd\") pod \"ceilometer-0\" (UID: \"b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe\") " pod="openstack/ceilometer-0" Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.977885 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe-run-httpd\") pod \"ceilometer-0\" (UID: \"b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe\") " pod="openstack/ceilometer-0" Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.981396 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe\") " pod="openstack/ceilometer-0" Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.982382 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe-config-data\") pod \"ceilometer-0\" (UID: \"b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe\") " pod="openstack/ceilometer-0" Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.984421 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe\") " pod="openstack/ceilometer-0" Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.997281 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe-scripts\") pod \"ceilometer-0\" (UID: \"b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe\") " pod="openstack/ceilometer-0" Oct 03 13:13:31 crc kubenswrapper[4962]: I1003 13:13:31.997293 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppjqs\" (UniqueName: \"kubernetes.io/projected/b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe-kube-api-access-ppjqs\") pod \"ceilometer-0\" (UID: \"b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe\") " pod="openstack/ceilometer-0" Oct 03 13:13:32 crc kubenswrapper[4962]: I1003 13:13:32.009153 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6b7fd754f4-rx9k9" Oct 03 13:13:32 crc kubenswrapper[4962]: I1003 13:13:32.143028 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 13:13:32 crc kubenswrapper[4962]: I1003 13:13:32.256402 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c09e1e8d-9d15-4eef-a553-2af7c59998e3" path="/var/lib/kubelet/pods/c09e1e8d-9d15-4eef-a553-2af7c59998e3/volumes" Oct 03 13:13:32 crc kubenswrapper[4962]: I1003 13:13:32.697171 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 13:13:32 crc kubenswrapper[4962]: I1003 13:13:32.721619 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe","Type":"ContainerStarted","Data":"7f6290c1d51a607b6c0b982714cd1e71d91f192bd0c580ecd604fbb755b5fb0e"} Oct 03 13:13:33 crc kubenswrapper[4962]: I1003 13:13:33.110765 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6dd6cbcb6d-pxvpn" Oct 03 13:13:33 crc kubenswrapper[4962]: I1003 13:13:33.732947 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xkrxj" podUID="f21bb1eb-c956-4d3f-b7e3-199749af43df" containerName="registry-server" containerID="cri-o://bb04299e38f0a1f2e1cc9bde34ef2449edd45207ce23d8c93e574a50d6c31462" gracePeriod=2 Oct 03 13:13:33 crc kubenswrapper[4962]: I1003 13:13:33.803296 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6dd6cbcb6d-pxvpn" Oct 03 13:13:33 crc kubenswrapper[4962]: I1003 13:13:33.946132 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6b7fd754f4-rx9k9" Oct 03 13:13:33 crc kubenswrapper[4962]: I1003 13:13:33.995324 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6dd6cbcb6d-pxvpn"] Oct 03 13:13:34 crc kubenswrapper[4962]: I1003 13:13:34.748313 4962 generic.go:334] "Generic (PLEG): container finished" podID="f21bb1eb-c956-4d3f-b7e3-199749af43df" containerID="bb04299e38f0a1f2e1cc9bde34ef2449edd45207ce23d8c93e574a50d6c31462" exitCode=0 Oct 03 13:13:34 crc kubenswrapper[4962]: I1003 13:13:34.748494 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xkrxj" event={"ID":"f21bb1eb-c956-4d3f-b7e3-199749af43df","Type":"ContainerDied","Data":"bb04299e38f0a1f2e1cc9bde34ef2449edd45207ce23d8c93e574a50d6c31462"} Oct 03 13:13:34 crc kubenswrapper[4962]: I1003 13:13:34.748764 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6dd6cbcb6d-pxvpn" podUID="a730dfa1-b919-4d19-bf40-06e5c3348d6f" containerName="barbican-api-log" containerID="cri-o://4915690674fc094990f631fd848dd773d1a175b026044425c0ec97fc8c791310" gracePeriod=30 Oct 03 13:13:34 crc kubenswrapper[4962]: I1003 13:13:34.748849 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6dd6cbcb6d-pxvpn" podUID="a730dfa1-b919-4d19-bf40-06e5c3348d6f" containerName="barbican-api" containerID="cri-o://eabc1fa4c9169f5d2da3a9a307dd00c6b2412305ec1e525e91626a1ffeba387a" gracePeriod=30 Oct 03 13:13:34 crc kubenswrapper[4962]: I1003 13:13:34.753287 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-6dd6cbcb6d-pxvpn" podUID="a730dfa1-b919-4d19-bf40-06e5c3348d6f" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.151:9311/healthcheck\": EOF" Oct 03 13:13:34 crc kubenswrapper[4962]: I1003 13:13:34.753332 4962 prober.go:107] "Probe 
failed" probeType="Liveness" pod="openstack/barbican-api-6dd6cbcb6d-pxvpn" podUID="a730dfa1-b919-4d19-bf40-06e5c3348d6f" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.151:9311/healthcheck\": EOF" Oct 03 13:13:34 crc kubenswrapper[4962]: I1003 13:13:34.753454 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6dd6cbcb6d-pxvpn" podUID="a730dfa1-b919-4d19-bf40-06e5c3348d6f" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.151:9311/healthcheck\": EOF" Oct 03 13:13:35 crc kubenswrapper[4962]: I1003 13:13:35.782807 4962 generic.go:334] "Generic (PLEG): container finished" podID="a730dfa1-b919-4d19-bf40-06e5c3348d6f" containerID="4915690674fc094990f631fd848dd773d1a175b026044425c0ec97fc8c791310" exitCode=143 Oct 03 13:13:35 crc kubenswrapper[4962]: I1003 13:13:35.783153 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6dd6cbcb6d-pxvpn" event={"ID":"a730dfa1-b919-4d19-bf40-06e5c3348d6f","Type":"ContainerDied","Data":"4915690674fc094990f631fd848dd773d1a175b026044425c0ec97fc8c791310"} Oct 03 13:13:36 crc kubenswrapper[4962]: I1003 13:13:36.708875 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6456949cf6-r4n9q" Oct 03 13:13:36 crc kubenswrapper[4962]: I1003 13:13:36.998677 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-58d67d97c8-pnjp8" Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.043814 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xkrxj" Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.091226 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f21bb1eb-c956-4d3f-b7e3-199749af43df-utilities\") pod \"f21bb1eb-c956-4d3f-b7e3-199749af43df\" (UID: \"f21bb1eb-c956-4d3f-b7e3-199749af43df\") " Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.091736 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f21bb1eb-c956-4d3f-b7e3-199749af43df-catalog-content\") pod \"f21bb1eb-c956-4d3f-b7e3-199749af43df\" (UID: \"f21bb1eb-c956-4d3f-b7e3-199749af43df\") " Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.091890 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bx7rl\" (UniqueName: \"kubernetes.io/projected/f21bb1eb-c956-4d3f-b7e3-199749af43df-kube-api-access-bx7rl\") pod \"f21bb1eb-c956-4d3f-b7e3-199749af43df\" (UID: \"f21bb1eb-c956-4d3f-b7e3-199749af43df\") " Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.091960 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f21bb1eb-c956-4d3f-b7e3-199749af43df-utilities" (OuterVolumeSpecName: "utilities") pod "f21bb1eb-c956-4d3f-b7e3-199749af43df" (UID: "f21bb1eb-c956-4d3f-b7e3-199749af43df"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.092584 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f21bb1eb-c956-4d3f-b7e3-199749af43df-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.099837 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f21bb1eb-c956-4d3f-b7e3-199749af43df-kube-api-access-bx7rl" (OuterVolumeSpecName: "kube-api-access-bx7rl") pod "f21bb1eb-c956-4d3f-b7e3-199749af43df" (UID: "f21bb1eb-c956-4d3f-b7e3-199749af43df"). InnerVolumeSpecName "kube-api-access-bx7rl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.109724 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f21bb1eb-c956-4d3f-b7e3-199749af43df-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f21bb1eb-c956-4d3f-b7e3-199749af43df" (UID: "f21bb1eb-c956-4d3f-b7e3-199749af43df"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.133832 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75c8ddd69c-7bckc" Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.193757 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bx7rl\" (UniqueName: \"kubernetes.io/projected/f21bb1eb-c956-4d3f-b7e3-199749af43df-kube-api-access-bx7rl\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.193789 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f21bb1eb-c956-4d3f-b7e3-199749af43df-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.216501 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-rk29c"] Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.216953 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8b5c85b87-rk29c" podUID="ed39cf02-e47a-4dc8-be15-377a11c21af5" containerName="dnsmasq-dns" containerID="cri-o://76eff9f4b08df1c6d695ba631a1e0fd943839d60bff13aa04fad80eb794b97c3" gracePeriod=10 Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.274083 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 03 13:13:37 crc kubenswrapper[4962]: E1003 13:13:37.274557 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f21bb1eb-c956-4d3f-b7e3-199749af43df" containerName="registry-server" Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.274577 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f21bb1eb-c956-4d3f-b7e3-199749af43df" containerName="registry-server" Oct 03 13:13:37 crc kubenswrapper[4962]: E1003 13:13:37.274608 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f21bb1eb-c956-4d3f-b7e3-199749af43df" containerName="extract-utilities" Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.274618 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f21bb1eb-c956-4d3f-b7e3-199749af43df" containerName="extract-utilities" Oct 03 13:13:37 crc kubenswrapper[4962]: E1003 13:13:37.274634 4962 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f21bb1eb-c956-4d3f-b7e3-199749af43df" containerName="extract-content" Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.274686 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f21bb1eb-c956-4d3f-b7e3-199749af43df" containerName="extract-content" Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.274895 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f21bb1eb-c956-4d3f-b7e3-199749af43df" containerName="registry-server" Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.275528 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.278061 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.278542 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.278741 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-8f9tx" Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.283117 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.396924 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75210a15-c36f-4be9-9709-ceb4eb2c4646-combined-ca-bundle\") pod \"openstackclient\" (UID: \"75210a15-c36f-4be9-9709-ceb4eb2c4646\") " pod="openstack/openstackclient" Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.396988 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/75210a15-c36f-4be9-9709-ceb4eb2c4646-openstack-config\") pod \"openstackclient\" (UID: \"75210a15-c36f-4be9-9709-ceb4eb2c4646\") " pod="openstack/openstackclient" Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.397039 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9dzd\" (UniqueName: \"kubernetes.io/projected/75210a15-c36f-4be9-9709-ceb4eb2c4646-kube-api-access-r9dzd\") pod \"openstackclient\" (UID: \"75210a15-c36f-4be9-9709-ceb4eb2c4646\") " pod="openstack/openstackclient" Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.397065 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/75210a15-c36f-4be9-9709-ceb4eb2c4646-openstack-config-secret\") pod \"openstackclient\" (UID: \"75210a15-c36f-4be9-9709-ceb4eb2c4646\") " pod="openstack/openstackclient" Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.498284 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75210a15-c36f-4be9-9709-ceb4eb2c4646-combined-ca-bundle\") pod \"openstackclient\" (UID: \"75210a15-c36f-4be9-9709-ceb4eb2c4646\") " pod="openstack/openstackclient" Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.498341 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/75210a15-c36f-4be9-9709-ceb4eb2c4646-openstack-config\") pod \"openstackclient\" (UID: 
\"75210a15-c36f-4be9-9709-ceb4eb2c4646\") " pod="openstack/openstackclient" Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.498380 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9dzd\" (UniqueName: \"kubernetes.io/projected/75210a15-c36f-4be9-9709-ceb4eb2c4646-kube-api-access-r9dzd\") pod \"openstackclient\" (UID: \"75210a15-c36f-4be9-9709-ceb4eb2c4646\") " pod="openstack/openstackclient" Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.498398 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/75210a15-c36f-4be9-9709-ceb4eb2c4646-openstack-config-secret\") pod \"openstackclient\" (UID: \"75210a15-c36f-4be9-9709-ceb4eb2c4646\") " pod="openstack/openstackclient" Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.501188 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/75210a15-c36f-4be9-9709-ceb4eb2c4646-openstack-config\") pod \"openstackclient\" (UID: \"75210a15-c36f-4be9-9709-ceb4eb2c4646\") " pod="openstack/openstackclient" Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.503239 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/75210a15-c36f-4be9-9709-ceb4eb2c4646-openstack-config-secret\") pod \"openstackclient\" (UID: \"75210a15-c36f-4be9-9709-ceb4eb2c4646\") " pod="openstack/openstackclient" Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.508258 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75210a15-c36f-4be9-9709-ceb4eb2c4646-combined-ca-bundle\") pod \"openstackclient\" (UID: \"75210a15-c36f-4be9-9709-ceb4eb2c4646\") " pod="openstack/openstackclient" Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.517619 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9dzd\" (UniqueName: \"kubernetes.io/projected/75210a15-c36f-4be9-9709-ceb4eb2c4646-kube-api-access-r9dzd\") pod \"openstackclient\" (UID: \"75210a15-c36f-4be9-9709-ceb4eb2c4646\") " pod="openstack/openstackclient" Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.627443 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.648928 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-rk29c" Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.707088 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cg98\" (UniqueName: \"kubernetes.io/projected/ed39cf02-e47a-4dc8-be15-377a11c21af5-kube-api-access-7cg98\") pod \"ed39cf02-e47a-4dc8-be15-377a11c21af5\" (UID: \"ed39cf02-e47a-4dc8-be15-377a11c21af5\") " Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.707286 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed39cf02-e47a-4dc8-be15-377a11c21af5-ovsdbserver-sb\") pod \"ed39cf02-e47a-4dc8-be15-377a11c21af5\" (UID: \"ed39cf02-e47a-4dc8-be15-377a11c21af5\") " Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.707329 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed39cf02-e47a-4dc8-be15-377a11c21af5-ovsdbserver-nb\") pod \"ed39cf02-e47a-4dc8-be15-377a11c21af5\" (UID: \"ed39cf02-e47a-4dc8-be15-377a11c21af5\") " Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.707367 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed39cf02-e47a-4dc8-be15-377a11c21af5-dns-svc\") pod \"ed39cf02-e47a-4dc8-be15-377a11c21af5\" (UID: \"ed39cf02-e47a-4dc8-be15-377a11c21af5\") " Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.707451 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed39cf02-e47a-4dc8-be15-377a11c21af5-config\") pod \"ed39cf02-e47a-4dc8-be15-377a11c21af5\" (UID: \"ed39cf02-e47a-4dc8-be15-377a11c21af5\") " Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.707532 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ed39cf02-e47a-4dc8-be15-377a11c21af5-dns-swift-storage-0\") pod \"ed39cf02-e47a-4dc8-be15-377a11c21af5\" (UID: \"ed39cf02-e47a-4dc8-be15-377a11c21af5\") " Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.716865 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed39cf02-e47a-4dc8-be15-377a11c21af5-kube-api-access-7cg98" (OuterVolumeSpecName: "kube-api-access-7cg98") pod "ed39cf02-e47a-4dc8-be15-377a11c21af5" (UID: "ed39cf02-e47a-4dc8-be15-377a11c21af5"). InnerVolumeSpecName "kube-api-access-7cg98". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.751683 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed39cf02-e47a-4dc8-be15-377a11c21af5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ed39cf02-e47a-4dc8-be15-377a11c21af5" (UID: "ed39cf02-e47a-4dc8-be15-377a11c21af5"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.772805 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6456949cf6-r4n9q" Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.782508 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed39cf02-e47a-4dc8-be15-377a11c21af5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ed39cf02-e47a-4dc8-be15-377a11c21af5" (UID: "ed39cf02-e47a-4dc8-be15-377a11c21af5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.799370 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed39cf02-e47a-4dc8-be15-377a11c21af5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ed39cf02-e47a-4dc8-be15-377a11c21af5" (UID: "ed39cf02-e47a-4dc8-be15-377a11c21af5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.800034 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed39cf02-e47a-4dc8-be15-377a11c21af5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ed39cf02-e47a-4dc8-be15-377a11c21af5" (UID: "ed39cf02-e47a-4dc8-be15-377a11c21af5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.809549 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed39cf02-e47a-4dc8-be15-377a11c21af5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.809593 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed39cf02-e47a-4dc8-be15-377a11c21af5-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.809603 4962 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ed39cf02-e47a-4dc8-be15-377a11c21af5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.809613 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cg98\" (UniqueName: \"kubernetes.io/projected/ed39cf02-e47a-4dc8-be15-377a11c21af5-kube-api-access-7cg98\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.809624 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed39cf02-e47a-4dc8-be15-377a11c21af5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.901928 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed39cf02-e47a-4dc8-be15-377a11c21af5-config" (OuterVolumeSpecName: "config") pod "ed39cf02-e47a-4dc8-be15-377a11c21af5" (UID: "ed39cf02-e47a-4dc8-be15-377a11c21af5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.914296 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed39cf02-e47a-4dc8-be15-377a11c21af5-config\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.935951 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xkrxj" event={"ID":"f21bb1eb-c956-4d3f-b7e3-199749af43df","Type":"ContainerDied","Data":"279da3bc728492249f018561caed8f3cbeae5de438315df12f0b13cadb99712b"} Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.936007 4962 scope.go:117] "RemoveContainer" containerID="bb04299e38f0a1f2e1cc9bde34ef2449edd45207ce23d8c93e574a50d6c31462" Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.936139 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xkrxj" Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.964013 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe","Type":"ContainerStarted","Data":"675cb5bcbe617216f33c8ef6b007fb7933f4700a66e18156cf26a7b0e4a70ab1"} Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.964052 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe","Type":"ContainerStarted","Data":"3d8e38eef106a8fcdec0389cc2f4baa1a812949d416fb48c33991e1d4e85999d"} Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.967705 4962 generic.go:334] "Generic (PLEG): container finished" podID="ed39cf02-e47a-4dc8-be15-377a11c21af5" containerID="76eff9f4b08df1c6d695ba631a1e0fd943839d60bff13aa04fad80eb794b97c3" exitCode=0 Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.967836 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-rk29c" event={"ID":"ed39cf02-e47a-4dc8-be15-377a11c21af5","Type":"ContainerDied","Data":"76eff9f4b08df1c6d695ba631a1e0fd943839d60bff13aa04fad80eb794b97c3"} Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.967904 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-rk29c" event={"ID":"ed39cf02-e47a-4dc8-be15-377a11c21af5","Type":"ContainerDied","Data":"dcd85521f835fc743f3b0e52f3e2d4c47be69b7061996bd963ddf98274719513"} Oct 03 13:13:37 crc kubenswrapper[4962]: I1003 13:13:37.968889 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-rk29c" Oct 03 13:13:38 crc kubenswrapper[4962]: I1003 13:13:38.054444 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xkrxj"] Oct 03 13:13:38 crc kubenswrapper[4962]: I1003 13:13:38.061283 4962 scope.go:117] "RemoveContainer" containerID="36c1e5d4cc9238450eba3e5fd7117e9e2030bf97281706b1d15b2472c64fffc7" Oct 03 13:13:38 crc kubenswrapper[4962]: I1003 13:13:38.061734 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xkrxj"] Oct 03 13:13:38 crc kubenswrapper[4962]: I1003 13:13:38.079877 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-rk29c"] Oct 03 13:13:38 crc kubenswrapper[4962]: I1003 13:13:38.085586 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-rk29c"] Oct 03 13:13:38 crc kubenswrapper[4962]: I1003 13:13:38.108527 4962 scope.go:117] "RemoveContainer" containerID="78a1528336a19d142eec74acabd6b6c369b5cc5054582317455a473a9ae7c67f" Oct 03 13:13:38 crc kubenswrapper[4962]: I1003 13:13:38.139071 4962 scope.go:117] "RemoveContainer" containerID="76eff9f4b08df1c6d695ba631a1e0fd943839d60bff13aa04fad80eb794b97c3" Oct 03 13:13:38 crc kubenswrapper[4962]: I1003 13:13:38.176846 4962 scope.go:117] "RemoveContainer" containerID="f0bc21065c1bfd2b354ddca2563e794bb809399b532515e039e718a2597cb79a" Oct 03 13:13:38 crc kubenswrapper[4962]: I1003 13:13:38.205609 4962 scope.go:117] "RemoveContainer" containerID="76eff9f4b08df1c6d695ba631a1e0fd943839d60bff13aa04fad80eb794b97c3" Oct 03 13:13:38 crc kubenswrapper[4962]: E1003 13:13:38.206965 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76eff9f4b08df1c6d695ba631a1e0fd943839d60bff13aa04fad80eb794b97c3\": container with ID starting with 76eff9f4b08df1c6d695ba631a1e0fd943839d60bff13aa04fad80eb794b97c3 not found: ID does not exist" containerID="76eff9f4b08df1c6d695ba631a1e0fd943839d60bff13aa04fad80eb794b97c3" Oct 03 13:13:38 crc kubenswrapper[4962]: I1003 13:13:38.207018 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76eff9f4b08df1c6d695ba631a1e0fd943839d60bff13aa04fad80eb794b97c3"} err="failed to get container status \"76eff9f4b08df1c6d695ba631a1e0fd943839d60bff13aa04fad80eb794b97c3\": rpc error: code = NotFound desc = could not find container \"76eff9f4b08df1c6d695ba631a1e0fd943839d60bff13aa04fad80eb794b97c3\": container with ID starting with 76eff9f4b08df1c6d695ba631a1e0fd943839d60bff13aa04fad80eb794b97c3 not found: ID does not exist" Oct 03 13:13:38 crc kubenswrapper[4962]: I1003 13:13:38.207046 4962 scope.go:117] "RemoveContainer" containerID="f0bc21065c1bfd2b354ddca2563e794bb809399b532515e039e718a2597cb79a" Oct 03 13:13:38 crc kubenswrapper[4962]: E1003 13:13:38.207512 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0bc21065c1bfd2b354ddca2563e794bb809399b532515e039e718a2597cb79a\": container with ID starting with f0bc21065c1bfd2b354ddca2563e794bb809399b532515e039e718a2597cb79a not found: ID does not exist" containerID="f0bc21065c1bfd2b354ddca2563e794bb809399b532515e039e718a2597cb79a" Oct 03 13:13:38 crc kubenswrapper[4962]: I1003 13:13:38.207551 4962 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f0bc21065c1bfd2b354ddca2563e794bb809399b532515e039e718a2597cb79a"} err="failed to get container status \"f0bc21065c1bfd2b354ddca2563e794bb809399b532515e039e718a2597cb79a\": rpc error: code = NotFound desc = could not find container \"f0bc21065c1bfd2b354ddca2563e794bb809399b532515e039e718a2597cb79a\": container with ID starting with f0bc21065c1bfd2b354ddca2563e794bb809399b532515e039e718a2597cb79a not found: ID does not exist" Oct 03 13:13:38 crc kubenswrapper[4962]: I1003 13:13:38.241387 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed39cf02-e47a-4dc8-be15-377a11c21af5" path="/var/lib/kubelet/pods/ed39cf02-e47a-4dc8-be15-377a11c21af5/volumes" Oct 03 13:13:38 crc kubenswrapper[4962]: I1003 13:13:38.242202 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f21bb1eb-c956-4d3f-b7e3-199749af43df" path="/var/lib/kubelet/pods/f21bb1eb-c956-4d3f-b7e3-199749af43df/volumes" Oct 03 13:13:38 crc kubenswrapper[4962]: I1003 13:13:38.279089 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 03 13:13:38 crc kubenswrapper[4962]: W1003 13:13:38.296952 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75210a15_c36f_4be9_9709_ceb4eb2c4646.slice/crio-d99a898801303149b91fe9b62a907dc37c5724eb93046c92f9efc04c0b54a7b0 WatchSource:0}: Error finding container d99a898801303149b91fe9b62a907dc37c5724eb93046c92f9efc04c0b54a7b0: Status 404 returned error can't find the container with id d99a898801303149b91fe9b62a907dc37c5724eb93046c92f9efc04c0b54a7b0 Oct 03 13:13:38 crc kubenswrapper[4962]: I1003 13:13:38.982923 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe","Type":"ContainerStarted","Data":"8e2690e91a3f7bc521a41ae68cf943b94cdba77402b2a1f81df19b4b2e4fc75d"} Oct 03 13:13:38 crc kubenswrapper[4962]: I1003 13:13:38.984281 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"75210a15-c36f-4be9-9709-ceb4eb2c4646","Type":"ContainerStarted","Data":"d99a898801303149b91fe9b62a907dc37c5724eb93046c92f9efc04c0b54a7b0"} Oct 03 13:13:39 crc kubenswrapper[4962]: I1003 13:13:39.192692 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6dd6cbcb6d-pxvpn" podUID="a730dfa1-b919-4d19-bf40-06e5c3348d6f" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.151:9311/healthcheck\": read tcp 10.217.0.2:41842->10.217.0.151:9311: read: connection reset by peer" Oct 03 13:13:39 crc kubenswrapper[4962]: I1003 13:13:39.192761 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6dd6cbcb6d-pxvpn" podUID="a730dfa1-b919-4d19-bf40-06e5c3348d6f" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.151:9311/healthcheck\": read tcp 10.217.0.2:41854->10.217.0.151:9311: read: connection reset by peer" Oct 03 13:13:39 crc kubenswrapper[4962]: I1003 13:13:39.844080 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6dd6cbcb6d-pxvpn" Oct 03 13:13:39 crc kubenswrapper[4962]: I1003 13:13:39.958255 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfbrl\" (UniqueName: \"kubernetes.io/projected/a730dfa1-b919-4d19-bf40-06e5c3348d6f-kube-api-access-qfbrl\") pod \"a730dfa1-b919-4d19-bf40-06e5c3348d6f\" (UID: \"a730dfa1-b919-4d19-bf40-06e5c3348d6f\") " Oct 03 13:13:39 crc kubenswrapper[4962]: I1003 13:13:39.958394 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a730dfa1-b919-4d19-bf40-06e5c3348d6f-combined-ca-bundle\") pod \"a730dfa1-b919-4d19-bf40-06e5c3348d6f\" (UID: \"a730dfa1-b919-4d19-bf40-06e5c3348d6f\") " Oct 03 13:13:39 crc kubenswrapper[4962]: I1003 13:13:39.958467 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a730dfa1-b919-4d19-bf40-06e5c3348d6f-logs\") pod \"a730dfa1-b919-4d19-bf40-06e5c3348d6f\" (UID: \"a730dfa1-b919-4d19-bf40-06e5c3348d6f\") " Oct 03 13:13:39 crc kubenswrapper[4962]: I1003 13:13:39.958491 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a730dfa1-b919-4d19-bf40-06e5c3348d6f-config-data-custom\") pod \"a730dfa1-b919-4d19-bf40-06e5c3348d6f\" (UID: \"a730dfa1-b919-4d19-bf40-06e5c3348d6f\") " Oct 03 13:13:39 crc kubenswrapper[4962]: I1003 13:13:39.958556 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a730dfa1-b919-4d19-bf40-06e5c3348d6f-config-data\") pod \"a730dfa1-b919-4d19-bf40-06e5c3348d6f\" (UID: \"a730dfa1-b919-4d19-bf40-06e5c3348d6f\") " Oct 03 13:13:39 crc kubenswrapper[4962]: I1003 13:13:39.961522 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a730dfa1-b919-4d19-bf40-06e5c3348d6f-logs" (OuterVolumeSpecName: "logs") pod "a730dfa1-b919-4d19-bf40-06e5c3348d6f" (UID: "a730dfa1-b919-4d19-bf40-06e5c3348d6f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:13:39 crc kubenswrapper[4962]: I1003 13:13:39.964876 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a730dfa1-b919-4d19-bf40-06e5c3348d6f-kube-api-access-qfbrl" (OuterVolumeSpecName: "kube-api-access-qfbrl") pod "a730dfa1-b919-4d19-bf40-06e5c3348d6f" (UID: "a730dfa1-b919-4d19-bf40-06e5c3348d6f"). InnerVolumeSpecName "kube-api-access-qfbrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:13:39 crc kubenswrapper[4962]: I1003 13:13:39.978773 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a730dfa1-b919-4d19-bf40-06e5c3348d6f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a730dfa1-b919-4d19-bf40-06e5c3348d6f" (UID: "a730dfa1-b919-4d19-bf40-06e5c3348d6f"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:13:40 crc kubenswrapper[4962]: I1003 13:13:40.008865 4962 generic.go:334] "Generic (PLEG): container finished" podID="a730dfa1-b919-4d19-bf40-06e5c3348d6f" containerID="eabc1fa4c9169f5d2da3a9a307dd00c6b2412305ec1e525e91626a1ffeba387a" exitCode=0 Oct 03 13:13:40 crc kubenswrapper[4962]: I1003 13:13:40.008950 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6dd6cbcb6d-pxvpn" event={"ID":"a730dfa1-b919-4d19-bf40-06e5c3348d6f","Type":"ContainerDied","Data":"eabc1fa4c9169f5d2da3a9a307dd00c6b2412305ec1e525e91626a1ffeba387a"} Oct 03 13:13:40 crc kubenswrapper[4962]: I1003 13:13:40.008983 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6dd6cbcb6d-pxvpn" event={"ID":"a730dfa1-b919-4d19-bf40-06e5c3348d6f","Type":"ContainerDied","Data":"d7988da2603c8d892881edf776679c2cc085608d63b8fa9f20a502f329002d31"} Oct 03 13:13:40 crc kubenswrapper[4962]: I1003 13:13:40.009004 4962 scope.go:117] "RemoveContainer" containerID="eabc1fa4c9169f5d2da3a9a307dd00c6b2412305ec1e525e91626a1ffeba387a" Oct 03 13:13:40 crc kubenswrapper[4962]: I1003 13:13:40.009166 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6dd6cbcb6d-pxvpn" Oct 03 13:13:40 crc kubenswrapper[4962]: I1003 13:13:40.046410 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe","Type":"ContainerStarted","Data":"3d886faa85a7dc9131c92ab0f655a7565cf4b5da93977cedbba1f4a02da19850"} Oct 03 13:13:40 crc kubenswrapper[4962]: I1003 13:13:40.046744 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a730dfa1-b919-4d19-bf40-06e5c3348d6f-config-data" (OuterVolumeSpecName: "config-data") pod "a730dfa1-b919-4d19-bf40-06e5c3348d6f" (UID: "a730dfa1-b919-4d19-bf40-06e5c3348d6f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:13:40 crc kubenswrapper[4962]: I1003 13:13:40.046831 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 13:13:40 crc kubenswrapper[4962]: I1003 13:13:40.060150 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfbrl\" (UniqueName: \"kubernetes.io/projected/a730dfa1-b919-4d19-bf40-06e5c3348d6f-kube-api-access-qfbrl\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:40 crc kubenswrapper[4962]: I1003 13:13:40.060176 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a730dfa1-b919-4d19-bf40-06e5c3348d6f-logs\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:40 crc kubenswrapper[4962]: I1003 13:13:40.060188 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a730dfa1-b919-4d19-bf40-06e5c3348d6f-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:40 crc kubenswrapper[4962]: I1003 13:13:40.060195 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a730dfa1-b919-4d19-bf40-06e5c3348d6f-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:40 crc kubenswrapper[4962]: I1003 13:13:40.067680 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a730dfa1-b919-4d19-bf40-06e5c3348d6f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a730dfa1-b919-4d19-bf40-06e5c3348d6f" (UID: "a730dfa1-b919-4d19-bf40-06e5c3348d6f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:13:40 crc kubenswrapper[4962]: I1003 13:13:40.086257 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.073404671 podStartE2EDuration="9.086240948s" podCreationTimestamp="2025-10-03 13:13:31 +0000 UTC" firstStartedPulling="2025-10-03 13:13:32.68728269 +0000 UTC m=+1421.091180525" lastFinishedPulling="2025-10-03 13:13:39.700118967 +0000 UTC m=+1428.104016802" observedRunningTime="2025-10-03 13:13:40.08147789 +0000 UTC m=+1428.485375725" watchObservedRunningTime="2025-10-03 13:13:40.086240948 +0000 UTC m=+1428.490138783" Oct 03 13:13:40 crc kubenswrapper[4962]: I1003 13:13:40.100821 4962 scope.go:117] "RemoveContainer" containerID="4915690674fc094990f631fd848dd773d1a175b026044425c0ec97fc8c791310" Oct 03 13:13:40 crc kubenswrapper[4962]: I1003 13:13:40.126822 4962 scope.go:117] "RemoveContainer" containerID="eabc1fa4c9169f5d2da3a9a307dd00c6b2412305ec1e525e91626a1ffeba387a" Oct 03 13:13:40 crc kubenswrapper[4962]: E1003 13:13:40.127960 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eabc1fa4c9169f5d2da3a9a307dd00c6b2412305ec1e525e91626a1ffeba387a\": container with ID starting with eabc1fa4c9169f5d2da3a9a307dd00c6b2412305ec1e525e91626a1ffeba387a not found: ID does not exist" containerID="eabc1fa4c9169f5d2da3a9a307dd00c6b2412305ec1e525e91626a1ffeba387a" Oct 03 13:13:40 crc kubenswrapper[4962]: I1003 13:13:40.128002 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eabc1fa4c9169f5d2da3a9a307dd00c6b2412305ec1e525e91626a1ffeba387a"} err="failed to get container status \"eabc1fa4c9169f5d2da3a9a307dd00c6b2412305ec1e525e91626a1ffeba387a\": rpc error: code = NotFound desc = could not find container 
\"eabc1fa4c9169f5d2da3a9a307dd00c6b2412305ec1e525e91626a1ffeba387a\": container with ID starting with eabc1fa4c9169f5d2da3a9a307dd00c6b2412305ec1e525e91626a1ffeba387a not found: ID does not exist" Oct 03 13:13:40 crc kubenswrapper[4962]: I1003 13:13:40.128027 4962 scope.go:117] "RemoveContainer" containerID="4915690674fc094990f631fd848dd773d1a175b026044425c0ec97fc8c791310" Oct 03 13:13:40 crc kubenswrapper[4962]: E1003 13:13:40.128304 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4915690674fc094990f631fd848dd773d1a175b026044425c0ec97fc8c791310\": container with ID starting with 4915690674fc094990f631fd848dd773d1a175b026044425c0ec97fc8c791310 not found: ID does not exist" containerID="4915690674fc094990f631fd848dd773d1a175b026044425c0ec97fc8c791310" Oct 03 13:13:40 crc kubenswrapper[4962]: I1003 13:13:40.128325 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4915690674fc094990f631fd848dd773d1a175b026044425c0ec97fc8c791310"} err="failed to get container status \"4915690674fc094990f631fd848dd773d1a175b026044425c0ec97fc8c791310\": rpc error: code = NotFound desc = could not find container \"4915690674fc094990f631fd848dd773d1a175b026044425c0ec97fc8c791310\": container with ID starting with 4915690674fc094990f631fd848dd773d1a175b026044425c0ec97fc8c791310 not found: ID does not exist" Oct 03 13:13:40 crc kubenswrapper[4962]: I1003 13:13:40.162002 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a730dfa1-b919-4d19-bf40-06e5c3348d6f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:40 crc kubenswrapper[4962]: I1003 13:13:40.336054 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6dd6cbcb6d-pxvpn"] Oct 03 13:13:40 crc kubenswrapper[4962]: I1003 13:13:40.350441 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6dd6cbcb6d-pxvpn"] Oct 03 13:13:42 crc kubenswrapper[4962]: I1003 13:13:42.073133 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-kz6b9" event={"ID":"866c8e6b-3fdd-442c-98d4-cf44b6ef098c","Type":"ContainerStarted","Data":"bfa078c92cf50627cb21a7528cdd93759094f41cfe07bc834b04b7d668a8b374"} Oct 03 13:13:42 crc kubenswrapper[4962]: I1003 13:13:42.090904 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-kz6b9" podStartSLOduration=10.157242704 podStartE2EDuration="44.090886243s" podCreationTimestamp="2025-10-03 13:12:58 +0000 UTC" firstStartedPulling="2025-10-03 13:13:06.800759429 +0000 UTC m=+1395.204657264" lastFinishedPulling="2025-10-03 13:13:40.734402968 +0000 UTC m=+1429.138300803" observedRunningTime="2025-10-03 13:13:42.090226145 +0000 UTC m=+1430.494123990" watchObservedRunningTime="2025-10-03 13:13:42.090886243 +0000 UTC m=+1430.494784078" Oct 03 13:13:42 crc kubenswrapper[4962]: I1003 13:13:42.239437 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a730dfa1-b919-4d19-bf40-06e5c3348d6f" path="/var/lib/kubelet/pods/a730dfa1-b919-4d19-bf40-06e5c3348d6f/volumes" Oct 03 13:13:42 crc kubenswrapper[4962]: I1003 13:13:42.589358 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-68b6c975-4cb8j"] Oct 03 13:13:42 crc kubenswrapper[4962]: E1003 13:13:42.589761 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a730dfa1-b919-4d19-bf40-06e5c3348d6f" 
containerName="barbican-api-log" Oct 03 13:13:42 crc kubenswrapper[4962]: I1003 13:13:42.589783 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a730dfa1-b919-4d19-bf40-06e5c3348d6f" containerName="barbican-api-log" Oct 03 13:13:42 crc kubenswrapper[4962]: E1003 13:13:42.589798 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a730dfa1-b919-4d19-bf40-06e5c3348d6f" containerName="barbican-api" Oct 03 13:13:42 crc kubenswrapper[4962]: I1003 13:13:42.589805 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a730dfa1-b919-4d19-bf40-06e5c3348d6f" containerName="barbican-api" Oct 03 13:13:42 crc kubenswrapper[4962]: E1003 13:13:42.589835 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed39cf02-e47a-4dc8-be15-377a11c21af5" containerName="init" Oct 03 13:13:42 crc kubenswrapper[4962]: I1003 13:13:42.589841 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed39cf02-e47a-4dc8-be15-377a11c21af5" containerName="init" Oct 03 13:13:42 crc kubenswrapper[4962]: E1003 13:13:42.589854 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed39cf02-e47a-4dc8-be15-377a11c21af5" containerName="dnsmasq-dns" Oct 03 13:13:42 crc kubenswrapper[4962]: I1003 13:13:42.589861 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed39cf02-e47a-4dc8-be15-377a11c21af5" containerName="dnsmasq-dns" Oct 03 13:13:42 crc kubenswrapper[4962]: I1003 13:13:42.590031 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a730dfa1-b919-4d19-bf40-06e5c3348d6f" containerName="barbican-api-log" Oct 03 13:13:42 crc kubenswrapper[4962]: I1003 13:13:42.590057 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a730dfa1-b919-4d19-bf40-06e5c3348d6f" containerName="barbican-api" Oct 03 13:13:42 crc kubenswrapper[4962]: I1003 13:13:42.590071 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed39cf02-e47a-4dc8-be15-377a11c21af5" containerName="dnsmasq-dns" Oct 03 13:13:42 crc kubenswrapper[4962]: I1003 13:13:42.591177 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-68b6c975-4cb8j" Oct 03 13:13:42 crc kubenswrapper[4962]: I1003 13:13:42.594169 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 03 13:13:42 crc kubenswrapper[4962]: I1003 13:13:42.594392 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 03 13:13:42 crc kubenswrapper[4962]: I1003 13:13:42.594580 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 03 13:13:42 crc kubenswrapper[4962]: I1003 13:13:42.639756 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-68b6c975-4cb8j"] Oct 03 13:13:42 crc kubenswrapper[4962]: I1003 13:13:42.712761 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05f2e935-e9b5-49ab-8a2a-30b15840bae9-config-data\") pod \"swift-proxy-68b6c975-4cb8j\" (UID: \"05f2e935-e9b5-49ab-8a2a-30b15840bae9\") " pod="openstack/swift-proxy-68b6c975-4cb8j" Oct 03 13:13:42 crc kubenswrapper[4962]: I1003 13:13:42.712831 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzmlv\" (UniqueName: \"kubernetes.io/projected/05f2e935-e9b5-49ab-8a2a-30b15840bae9-kube-api-access-wzmlv\") pod \"swift-proxy-68b6c975-4cb8j\" (UID: \"05f2e935-e9b5-49ab-8a2a-30b15840bae9\") " pod="openstack/swift-proxy-68b6c975-4cb8j" Oct 03 13:13:42 crc kubenswrapper[4962]: I1003 13:13:42.712884 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/05f2e935-e9b5-49ab-8a2a-30b15840bae9-etc-swift\") pod \"swift-proxy-68b6c975-4cb8j\" (UID: \"05f2e935-e9b5-49ab-8a2a-30b15840bae9\") " pod="openstack/swift-proxy-68b6c975-4cb8j" Oct 03 13:13:42 crc kubenswrapper[4962]: I1003 13:13:42.712921 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05f2e935-e9b5-49ab-8a2a-30b15840bae9-combined-ca-bundle\") pod \"swift-proxy-68b6c975-4cb8j\" (UID: \"05f2e935-e9b5-49ab-8a2a-30b15840bae9\") " pod="openstack/swift-proxy-68b6c975-4cb8j" Oct 03 13:13:42 crc kubenswrapper[4962]: I1003 13:13:42.713038 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05f2e935-e9b5-49ab-8a2a-30b15840bae9-run-httpd\") pod \"swift-proxy-68b6c975-4cb8j\" (UID: \"05f2e935-e9b5-49ab-8a2a-30b15840bae9\") " pod="openstack/swift-proxy-68b6c975-4cb8j" Oct 03 13:13:42 crc kubenswrapper[4962]: I1003 13:13:42.713092 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/05f2e935-e9b5-49ab-8a2a-30b15840bae9-public-tls-certs\") pod \"swift-proxy-68b6c975-4cb8j\" (UID: \"05f2e935-e9b5-49ab-8a2a-30b15840bae9\") " pod="openstack/swift-proxy-68b6c975-4cb8j" Oct 03 13:13:42 crc kubenswrapper[4962]: I1003 13:13:42.713119 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05f2e935-e9b5-49ab-8a2a-30b15840bae9-log-httpd\") pod \"swift-proxy-68b6c975-4cb8j\" (UID: \"05f2e935-e9b5-49ab-8a2a-30b15840bae9\") " pod="openstack/swift-proxy-68b6c975-4cb8j" Oct 03 
13:13:42 crc kubenswrapper[4962]: I1003 13:13:42.713157 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/05f2e935-e9b5-49ab-8a2a-30b15840bae9-internal-tls-certs\") pod \"swift-proxy-68b6c975-4cb8j\" (UID: \"05f2e935-e9b5-49ab-8a2a-30b15840bae9\") " pod="openstack/swift-proxy-68b6c975-4cb8j" Oct 03 13:13:42 crc kubenswrapper[4962]: I1003 13:13:42.815024 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/05f2e935-e9b5-49ab-8a2a-30b15840bae9-internal-tls-certs\") pod \"swift-proxy-68b6c975-4cb8j\" (UID: \"05f2e935-e9b5-49ab-8a2a-30b15840bae9\") " pod="openstack/swift-proxy-68b6c975-4cb8j" Oct 03 13:13:42 crc kubenswrapper[4962]: I1003 13:13:42.815089 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05f2e935-e9b5-49ab-8a2a-30b15840bae9-config-data\") pod \"swift-proxy-68b6c975-4cb8j\" (UID: \"05f2e935-e9b5-49ab-8a2a-30b15840bae9\") " pod="openstack/swift-proxy-68b6c975-4cb8j" Oct 03 13:13:42 crc kubenswrapper[4962]: I1003 13:13:42.815134 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzmlv\" (UniqueName: \"kubernetes.io/projected/05f2e935-e9b5-49ab-8a2a-30b15840bae9-kube-api-access-wzmlv\") pod \"swift-proxy-68b6c975-4cb8j\" (UID: \"05f2e935-e9b5-49ab-8a2a-30b15840bae9\") " pod="openstack/swift-proxy-68b6c975-4cb8j" Oct 03 13:13:42 crc kubenswrapper[4962]: I1003 13:13:42.815169 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/05f2e935-e9b5-49ab-8a2a-30b15840bae9-etc-swift\") pod \"swift-proxy-68b6c975-4cb8j\" (UID: \"05f2e935-e9b5-49ab-8a2a-30b15840bae9\") " pod="openstack/swift-proxy-68b6c975-4cb8j" Oct 03 13:13:42 crc kubenswrapper[4962]: I1003 13:13:42.815198 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05f2e935-e9b5-49ab-8a2a-30b15840bae9-combined-ca-bundle\") pod \"swift-proxy-68b6c975-4cb8j\" (UID: \"05f2e935-e9b5-49ab-8a2a-30b15840bae9\") " pod="openstack/swift-proxy-68b6c975-4cb8j" Oct 03 13:13:42 crc kubenswrapper[4962]: I1003 13:13:42.815348 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05f2e935-e9b5-49ab-8a2a-30b15840bae9-run-httpd\") pod \"swift-proxy-68b6c975-4cb8j\" (UID: \"05f2e935-e9b5-49ab-8a2a-30b15840bae9\") " pod="openstack/swift-proxy-68b6c975-4cb8j" Oct 03 13:13:42 crc kubenswrapper[4962]: I1003 13:13:42.816216 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05f2e935-e9b5-49ab-8a2a-30b15840bae9-run-httpd\") pod \"swift-proxy-68b6c975-4cb8j\" (UID: \"05f2e935-e9b5-49ab-8a2a-30b15840bae9\") " pod="openstack/swift-proxy-68b6c975-4cb8j" Oct 03 13:13:42 crc kubenswrapper[4962]: I1003 13:13:42.816583 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/05f2e935-e9b5-49ab-8a2a-30b15840bae9-public-tls-certs\") pod \"swift-proxy-68b6c975-4cb8j\" (UID: \"05f2e935-e9b5-49ab-8a2a-30b15840bae9\") " pod="openstack/swift-proxy-68b6c975-4cb8j" Oct 03 13:13:42 crc kubenswrapper[4962]: I1003 13:13:42.816615 4962 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05f2e935-e9b5-49ab-8a2a-30b15840bae9-log-httpd\") pod \"swift-proxy-68b6c975-4cb8j\" (UID: \"05f2e935-e9b5-49ab-8a2a-30b15840bae9\") " pod="openstack/swift-proxy-68b6c975-4cb8j" Oct 03 13:13:42 crc kubenswrapper[4962]: I1003 13:13:42.816956 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05f2e935-e9b5-49ab-8a2a-30b15840bae9-log-httpd\") pod \"swift-proxy-68b6c975-4cb8j\" (UID: \"05f2e935-e9b5-49ab-8a2a-30b15840bae9\") " pod="openstack/swift-proxy-68b6c975-4cb8j" Oct 03 13:13:42 crc kubenswrapper[4962]: I1003 13:13:42.822728 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/05f2e935-e9b5-49ab-8a2a-30b15840bae9-internal-tls-certs\") pod \"swift-proxy-68b6c975-4cb8j\" (UID: \"05f2e935-e9b5-49ab-8a2a-30b15840bae9\") " pod="openstack/swift-proxy-68b6c975-4cb8j" Oct 03 13:13:42 crc kubenswrapper[4962]: I1003 13:13:42.825521 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/05f2e935-e9b5-49ab-8a2a-30b15840bae9-public-tls-certs\") pod \"swift-proxy-68b6c975-4cb8j\" (UID: \"05f2e935-e9b5-49ab-8a2a-30b15840bae9\") " pod="openstack/swift-proxy-68b6c975-4cb8j" Oct 03 13:13:42 crc kubenswrapper[4962]: I1003 13:13:42.825633 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/05f2e935-e9b5-49ab-8a2a-30b15840bae9-etc-swift\") pod \"swift-proxy-68b6c975-4cb8j\" (UID: \"05f2e935-e9b5-49ab-8a2a-30b15840bae9\") " pod="openstack/swift-proxy-68b6c975-4cb8j" Oct 03 13:13:42 crc kubenswrapper[4962]: I1003 13:13:42.828810 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05f2e935-e9b5-49ab-8a2a-30b15840bae9-combined-ca-bundle\") pod \"swift-proxy-68b6c975-4cb8j\" (UID: \"05f2e935-e9b5-49ab-8a2a-30b15840bae9\") " pod="openstack/swift-proxy-68b6c975-4cb8j" Oct 03 13:13:42 crc kubenswrapper[4962]: I1003 13:13:42.832321 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzmlv\" (UniqueName: \"kubernetes.io/projected/05f2e935-e9b5-49ab-8a2a-30b15840bae9-kube-api-access-wzmlv\") pod \"swift-proxy-68b6c975-4cb8j\" (UID: \"05f2e935-e9b5-49ab-8a2a-30b15840bae9\") " pod="openstack/swift-proxy-68b6c975-4cb8j" Oct 03 13:13:42 crc kubenswrapper[4962]: I1003 13:13:42.840515 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05f2e935-e9b5-49ab-8a2a-30b15840bae9-config-data\") pod \"swift-proxy-68b6c975-4cb8j\" (UID: \"05f2e935-e9b5-49ab-8a2a-30b15840bae9\") " pod="openstack/swift-proxy-68b6c975-4cb8j" Oct 03 13:13:42 crc kubenswrapper[4962]: I1003 13:13:42.918690 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-68b6c975-4cb8j" Oct 03 13:13:43 crc kubenswrapper[4962]: I1003 13:13:43.477706 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-68b6c975-4cb8j"] Oct 03 13:13:45 crc kubenswrapper[4962]: I1003 13:13:45.089581 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 13:13:45 crc kubenswrapper[4962]: I1003 13:13:45.090335 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe" containerName="proxy-httpd" containerID="cri-o://3d886faa85a7dc9131c92ab0f655a7565cf4b5da93977cedbba1f4a02da19850" gracePeriod=30 Oct 03 13:13:45 crc kubenswrapper[4962]: I1003 13:13:45.090483 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe" containerName="sg-core" containerID="cri-o://8e2690e91a3f7bc521a41ae68cf943b94cdba77402b2a1f81df19b4b2e4fc75d" gracePeriod=30 Oct 03 13:13:45 crc kubenswrapper[4962]: I1003 13:13:45.090534 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe" containerName="ceilometer-notification-agent" containerID="cri-o://675cb5bcbe617216f33c8ef6b007fb7933f4700a66e18156cf26a7b0e4a70ab1" gracePeriod=30 Oct 03 13:13:45 crc kubenswrapper[4962]: I1003 13:13:45.090253 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe" containerName="ceilometer-central-agent" containerID="cri-o://3d8e38eef106a8fcdec0389cc2f4baa1a812949d416fb48c33991e1d4e85999d" gracePeriod=30 Oct 03 13:13:46 crc kubenswrapper[4962]: I1003 13:13:46.061360 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-7xjlw"] Oct 03 13:13:46 crc kubenswrapper[4962]: I1003 13:13:46.062904 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-7xjlw" Oct 03 13:13:46 crc kubenswrapper[4962]: I1003 13:13:46.075671 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-7xjlw"] Oct 03 13:13:46 crc kubenswrapper[4962]: I1003 13:13:46.122376 4962 generic.go:334] "Generic (PLEG): container finished" podID="866c8e6b-3fdd-442c-98d4-cf44b6ef098c" containerID="bfa078c92cf50627cb21a7528cdd93759094f41cfe07bc834b04b7d668a8b374" exitCode=0 Oct 03 13:13:46 crc kubenswrapper[4962]: I1003 13:13:46.122436 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-kz6b9" event={"ID":"866c8e6b-3fdd-442c-98d4-cf44b6ef098c","Type":"ContainerDied","Data":"bfa078c92cf50627cb21a7528cdd93759094f41cfe07bc834b04b7d668a8b374"} Oct 03 13:13:46 crc kubenswrapper[4962]: I1003 13:13:46.128901 4962 generic.go:334] "Generic (PLEG): container finished" podID="b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe" containerID="3d886faa85a7dc9131c92ab0f655a7565cf4b5da93977cedbba1f4a02da19850" exitCode=0 Oct 03 13:13:46 crc kubenswrapper[4962]: I1003 13:13:46.128935 4962 generic.go:334] "Generic (PLEG): container finished" podID="b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe" containerID="8e2690e91a3f7bc521a41ae68cf943b94cdba77402b2a1f81df19b4b2e4fc75d" exitCode=2 Oct 03 13:13:46 crc kubenswrapper[4962]: I1003 13:13:46.128946 4962 generic.go:334] "Generic (PLEG): container finished" podID="b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe" containerID="675cb5bcbe617216f33c8ef6b007fb7933f4700a66e18156cf26a7b0e4a70ab1" exitCode=0 Oct 03 13:13:46 crc kubenswrapper[4962]: I1003 13:13:46.128955 4962 generic.go:334] "Generic (PLEG): container finished" podID="b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe" containerID="3d8e38eef106a8fcdec0389cc2f4baa1a812949d416fb48c33991e1d4e85999d" exitCode=0 Oct 03 13:13:46 crc kubenswrapper[4962]: I1003 13:13:46.128978 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe","Type":"ContainerDied","Data":"3d886faa85a7dc9131c92ab0f655a7565cf4b5da93977cedbba1f4a02da19850"} Oct 03 13:13:46 crc kubenswrapper[4962]: I1003 13:13:46.129004 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe","Type":"ContainerDied","Data":"8e2690e91a3f7bc521a41ae68cf943b94cdba77402b2a1f81df19b4b2e4fc75d"} Oct 03 13:13:46 crc kubenswrapper[4962]: I1003 13:13:46.129022 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe","Type":"ContainerDied","Data":"675cb5bcbe617216f33c8ef6b007fb7933f4700a66e18156cf26a7b0e4a70ab1"} Oct 03 13:13:46 crc kubenswrapper[4962]: I1003 13:13:46.129034 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe","Type":"ContainerDied","Data":"3d8e38eef106a8fcdec0389cc2f4baa1a812949d416fb48c33991e1d4e85999d"} Oct 03 13:13:46 crc kubenswrapper[4962]: I1003 13:13:46.167539 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-9lwfl"] Oct 03 13:13:46 crc kubenswrapper[4962]: I1003 13:13:46.168613 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-9lwfl" Oct 03 13:13:46 crc kubenswrapper[4962]: I1003 13:13:46.178940 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-9lwfl"] Oct 03 13:13:46 crc kubenswrapper[4962]: I1003 13:13:46.192285 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnqgs\" (UniqueName: \"kubernetes.io/projected/164dbb03-b802-46d3-8dbd-1b6dc90cda51-kube-api-access-cnqgs\") pod \"nova-api-db-create-7xjlw\" (UID: \"164dbb03-b802-46d3-8dbd-1b6dc90cda51\") " pod="openstack/nova-api-db-create-7xjlw" Oct 03 13:13:46 crc kubenswrapper[4962]: I1003 13:13:46.293587 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnqgs\" (UniqueName: \"kubernetes.io/projected/164dbb03-b802-46d3-8dbd-1b6dc90cda51-kube-api-access-cnqgs\") pod \"nova-api-db-create-7xjlw\" (UID: \"164dbb03-b802-46d3-8dbd-1b6dc90cda51\") " pod="openstack/nova-api-db-create-7xjlw" Oct 03 13:13:46 crc kubenswrapper[4962]: I1003 13:13:46.293720 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxzhb\" (UniqueName: \"kubernetes.io/projected/0b9e6c89-714e-4efd-9adc-c15cd5b3eb6b-kube-api-access-vxzhb\") pod \"nova-cell0-db-create-9lwfl\" (UID: \"0b9e6c89-714e-4efd-9adc-c15cd5b3eb6b\") " pod="openstack/nova-cell0-db-create-9lwfl" Oct 03 13:13:46 crc kubenswrapper[4962]: I1003 13:13:46.311167 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnqgs\" (UniqueName: \"kubernetes.io/projected/164dbb03-b802-46d3-8dbd-1b6dc90cda51-kube-api-access-cnqgs\") pod \"nova-api-db-create-7xjlw\" (UID: \"164dbb03-b802-46d3-8dbd-1b6dc90cda51\") " pod="openstack/nova-api-db-create-7xjlw" Oct 03 13:13:46 crc kubenswrapper[4962]: I1003 13:13:46.381260 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-7xjlw" Oct 03 13:13:46 crc kubenswrapper[4962]: I1003 13:13:46.398256 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxzhb\" (UniqueName: \"kubernetes.io/projected/0b9e6c89-714e-4efd-9adc-c15cd5b3eb6b-kube-api-access-vxzhb\") pod \"nova-cell0-db-create-9lwfl\" (UID: \"0b9e6c89-714e-4efd-9adc-c15cd5b3eb6b\") " pod="openstack/nova-cell0-db-create-9lwfl" Oct 03 13:13:46 crc kubenswrapper[4962]: I1003 13:13:46.403123 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-djwxp"] Oct 03 13:13:46 crc kubenswrapper[4962]: I1003 13:13:46.407744 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-djwxp" Oct 03 13:13:46 crc kubenswrapper[4962]: I1003 13:13:46.426851 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-djwxp"] Oct 03 13:13:46 crc kubenswrapper[4962]: I1003 13:13:46.428944 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxzhb\" (UniqueName: \"kubernetes.io/projected/0b9e6c89-714e-4efd-9adc-c15cd5b3eb6b-kube-api-access-vxzhb\") pod \"nova-cell0-db-create-9lwfl\" (UID: \"0b9e6c89-714e-4efd-9adc-c15cd5b3eb6b\") " pod="openstack/nova-cell0-db-create-9lwfl" Oct 03 13:13:46 crc kubenswrapper[4962]: I1003 13:13:46.487690 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-9lwfl" Oct 03 13:13:46 crc kubenswrapper[4962]: I1003 13:13:46.501678 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj4lf\" (UniqueName: \"kubernetes.io/projected/61cfe20f-97f2-444b-9a56-00a3c22d7ba7-kube-api-access-nj4lf\") pod \"nova-cell1-db-create-djwxp\" (UID: \"61cfe20f-97f2-444b-9a56-00a3c22d7ba7\") " pod="openstack/nova-cell1-db-create-djwxp" Oct 03 13:13:46 crc kubenswrapper[4962]: I1003 13:13:46.602730 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nj4lf\" (UniqueName: \"kubernetes.io/projected/61cfe20f-97f2-444b-9a56-00a3c22d7ba7-kube-api-access-nj4lf\") pod \"nova-cell1-db-create-djwxp\" (UID: \"61cfe20f-97f2-444b-9a56-00a3c22d7ba7\") " pod="openstack/nova-cell1-db-create-djwxp" Oct 03 13:13:46 crc kubenswrapper[4962]: I1003 13:13:46.626841 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj4lf\" (UniqueName: \"kubernetes.io/projected/61cfe20f-97f2-444b-9a56-00a3c22d7ba7-kube-api-access-nj4lf\") pod \"nova-cell1-db-create-djwxp\" (UID: \"61cfe20f-97f2-444b-9a56-00a3c22d7ba7\") " pod="openstack/nova-cell1-db-create-djwxp" Oct 03 13:13:46 crc kubenswrapper[4962]: I1003 13:13:46.776429 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-djwxp" Oct 03 13:13:48 crc kubenswrapper[4962]: W1003 13:13:48.601714 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05f2e935_e9b5_49ab_8a2a_30b15840bae9.slice/crio-cac01411963bfa7be3e10db04acc0c3eea2c2ef096b36b436336abeb75d82df1 WatchSource:0}: Error finding container cac01411963bfa7be3e10db04acc0c3eea2c2ef096b36b436336abeb75d82df1: Status 404 returned error can't find the container with id cac01411963bfa7be3e10db04acc0c3eea2c2ef096b36b436336abeb75d82df1 Oct 03 13:13:48 crc kubenswrapper[4962]: I1003 13:13:48.798818 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-kz6b9" Oct 03 13:13:48 crc kubenswrapper[4962]: I1003 13:13:48.947161 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/866c8e6b-3fdd-442c-98d4-cf44b6ef098c-db-sync-config-data\") pod \"866c8e6b-3fdd-442c-98d4-cf44b6ef098c\" (UID: \"866c8e6b-3fdd-442c-98d4-cf44b6ef098c\") " Oct 03 13:13:48 crc kubenswrapper[4962]: I1003 13:13:48.947626 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/866c8e6b-3fdd-442c-98d4-cf44b6ef098c-combined-ca-bundle\") pod \"866c8e6b-3fdd-442c-98d4-cf44b6ef098c\" (UID: \"866c8e6b-3fdd-442c-98d4-cf44b6ef098c\") " Oct 03 13:13:48 crc kubenswrapper[4962]: I1003 13:13:48.947671 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlc5k\" (UniqueName: \"kubernetes.io/projected/866c8e6b-3fdd-442c-98d4-cf44b6ef098c-kube-api-access-vlc5k\") pod \"866c8e6b-3fdd-442c-98d4-cf44b6ef098c\" (UID: \"866c8e6b-3fdd-442c-98d4-cf44b6ef098c\") " Oct 03 13:13:48 crc kubenswrapper[4962]: I1003 13:13:48.947725 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/866c8e6b-3fdd-442c-98d4-cf44b6ef098c-etc-machine-id\") pod \"866c8e6b-3fdd-442c-98d4-cf44b6ef098c\" (UID: \"866c8e6b-3fdd-442c-98d4-cf44b6ef098c\") " Oct 03 13:13:48 crc kubenswrapper[4962]: I1003 13:13:48.947773 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/866c8e6b-3fdd-442c-98d4-cf44b6ef098c-scripts\") pod \"866c8e6b-3fdd-442c-98d4-cf44b6ef098c\" (UID: \"866c8e6b-3fdd-442c-98d4-cf44b6ef098c\") " Oct 03 13:13:48 crc kubenswrapper[4962]: I1003 13:13:48.947814 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/866c8e6b-3fdd-442c-98d4-cf44b6ef098c-config-data\") pod \"866c8e6b-3fdd-442c-98d4-cf44b6ef098c\" (UID: \"866c8e6b-3fdd-442c-98d4-cf44b6ef098c\") " Oct 03 13:13:48 crc kubenswrapper[4962]: I1003 13:13:48.950020 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/866c8e6b-3fdd-442c-98d4-cf44b6ef098c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "866c8e6b-3fdd-442c-98d4-cf44b6ef098c" (UID: "866c8e6b-3fdd-442c-98d4-cf44b6ef098c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 13:13:48 crc kubenswrapper[4962]: I1003 13:13:48.964184 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/866c8e6b-3fdd-442c-98d4-cf44b6ef098c-scripts" (OuterVolumeSpecName: "scripts") pod "866c8e6b-3fdd-442c-98d4-cf44b6ef098c" (UID: "866c8e6b-3fdd-442c-98d4-cf44b6ef098c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:13:48 crc kubenswrapper[4962]: I1003 13:13:48.964926 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 13:13:48 crc kubenswrapper[4962]: I1003 13:13:48.972860 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/866c8e6b-3fdd-442c-98d4-cf44b6ef098c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "866c8e6b-3fdd-442c-98d4-cf44b6ef098c" (UID: "866c8e6b-3fdd-442c-98d4-cf44b6ef098c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:13:48 crc kubenswrapper[4962]: I1003 13:13:48.973393 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/866c8e6b-3fdd-442c-98d4-cf44b6ef098c-kube-api-access-vlc5k" (OuterVolumeSpecName: "kube-api-access-vlc5k") pod "866c8e6b-3fdd-442c-98d4-cf44b6ef098c" (UID: "866c8e6b-3fdd-442c-98d4-cf44b6ef098c"). InnerVolumeSpecName "kube-api-access-vlc5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.011352 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/866c8e6b-3fdd-442c-98d4-cf44b6ef098c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "866c8e6b-3fdd-442c-98d4-cf44b6ef098c" (UID: "866c8e6b-3fdd-442c-98d4-cf44b6ef098c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.040445 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/866c8e6b-3fdd-442c-98d4-cf44b6ef098c-config-data" (OuterVolumeSpecName: "config-data") pod "866c8e6b-3fdd-442c-98d4-cf44b6ef098c" (UID: "866c8e6b-3fdd-442c-98d4-cf44b6ef098c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.050703 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe-log-httpd\") pod \"b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe\" (UID: \"b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe\") " Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.050747 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe-combined-ca-bundle\") pod \"b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe\" (UID: \"b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe\") " Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.050811 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppjqs\" (UniqueName: \"kubernetes.io/projected/b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe-kube-api-access-ppjqs\") pod \"b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe\" (UID: \"b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe\") " Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.050829 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe-scripts\") pod \"b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe\" (UID: \"b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe\") " Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.050909 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe-run-httpd\") pod \"b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe\" (UID: \"b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe\") " Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.050937 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe-config-data\") pod \"b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe\" (UID: \"b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe\") " Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.050957 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe-sg-core-conf-yaml\") pod \"b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe\" (UID: \"b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe\") " Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.051283 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/866c8e6b-3fdd-442c-98d4-cf44b6ef098c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.051296 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlc5k\" (UniqueName: \"kubernetes.io/projected/866c8e6b-3fdd-442c-98d4-cf44b6ef098c-kube-api-access-vlc5k\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.051381 4962 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/866c8e6b-3fdd-442c-98d4-cf44b6ef098c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.051411 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/866c8e6b-3fdd-442c-98d4-cf44b6ef098c-scripts\") on 
node \"crc\" DevicePath \"\"" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.051419 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/866c8e6b-3fdd-442c-98d4-cf44b6ef098c-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.051427 4962 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/866c8e6b-3fdd-442c-98d4-cf44b6ef098c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.052807 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe" (UID: "b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.053085 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe" (UID: "b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.055743 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe-kube-api-access-ppjqs" (OuterVolumeSpecName: "kube-api-access-ppjqs") pod "b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe" (UID: "b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe"). InnerVolumeSpecName "kube-api-access-ppjqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.057933 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe-scripts" (OuterVolumeSpecName: "scripts") pod "b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe" (UID: "b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.088811 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe" (UID: "b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.156872 4962 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.157134 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppjqs\" (UniqueName: \"kubernetes.io/projected/b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe-kube-api-access-ppjqs\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.157144 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.157154 4962 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.157165 4962 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.163892 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe","Type":"ContainerDied","Data":"7f6290c1d51a607b6c0b982714cd1e71d91f192bd0c580ecd604fbb755b5fb0e"} Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.163940 4962 scope.go:117] "RemoveContainer" containerID="3d886faa85a7dc9131c92ab0f655a7565cf4b5da93977cedbba1f4a02da19850" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.163984 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe-config-data" (OuterVolumeSpecName: "config-data") pod "b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe" (UID: "b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.163976 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.166281 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"75210a15-c36f-4be9-9709-ceb4eb2c4646","Type":"ContainerStarted","Data":"03c2d3455408832765cd2d68dafdea03cfe1c3257309db0900840a113095f0b9"} Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.169225 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-kz6b9" event={"ID":"866c8e6b-3fdd-442c-98d4-cf44b6ef098c","Type":"ContainerDied","Data":"3950de5905a4dc7a8e1191cd18d0ca1f2417da16ba610af0e92e93c717cf82f4"} Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.169253 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3950de5905a4dc7a8e1191cd18d0ca1f2417da16ba610af0e92e93c717cf82f4" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.169304 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-kz6b9" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.171078 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe" (UID: "b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.176279 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-68b6c975-4cb8j" event={"ID":"05f2e935-e9b5-49ab-8a2a-30b15840bae9","Type":"ContainerStarted","Data":"036194e68dbb515945afa3ad089ad8f4474610c770e29c2e3ac03647eae66d7d"} Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.176323 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-68b6c975-4cb8j" event={"ID":"05f2e935-e9b5-49ab-8a2a-30b15840bae9","Type":"ContainerStarted","Data":"fde2842fbb361eeef96c1f37f5ca7accd441dd0d4530fb4fc8d6c4be4392db6f"} Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.176360 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-68b6c975-4cb8j" event={"ID":"05f2e935-e9b5-49ab-8a2a-30b15840bae9","Type":"ContainerStarted","Data":"cac01411963bfa7be3e10db04acc0c3eea2c2ef096b36b436336abeb75d82df1"} Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.177490 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-68b6c975-4cb8j" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.177679 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-68b6c975-4cb8j" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.196870 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.780951188 podStartE2EDuration="12.196847449s" podCreationTimestamp="2025-10-03 13:13:37 +0000 UTC" firstStartedPulling="2025-10-03 13:13:38.300808611 +0000 UTC m=+1426.704706446" lastFinishedPulling="2025-10-03 13:13:48.716704862 +0000 UTC m=+1437.120602707" observedRunningTime="2025-10-03 13:13:49.182609096 +0000 UTC m=+1437.586506931" watchObservedRunningTime="2025-10-03 13:13:49.196847449 +0000 UTC m=+1437.600745284" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.197822 4962 scope.go:117] "RemoveContainer" containerID="8e2690e91a3f7bc521a41ae68cf943b94cdba77402b2a1f81df19b4b2e4fc75d" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.218092 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-68b6c975-4cb8j" podStartSLOduration=7.218069269 podStartE2EDuration="7.218069269s" podCreationTimestamp="2025-10-03 13:13:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:13:49.20209264 +0000 UTC m=+1437.605990475" watchObservedRunningTime="2025-10-03 13:13:49.218069269 +0000 UTC m=+1437.621967104" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.225563 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-9lwfl"] Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.232080 4962 scope.go:117] "RemoveContainer" containerID="675cb5bcbe617216f33c8ef6b007fb7933f4700a66e18156cf26a7b0e4a70ab1" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 
13:13:49.256599 4962 scope.go:117] "RemoveContainer" containerID="3d8e38eef106a8fcdec0389cc2f4baa1a812949d416fb48c33991e1d4e85999d" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.258849 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.258866 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.323704 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-7xjlw"] Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.425673 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-djwxp"] Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.666654 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.691913 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.714158 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 13:13:49 crc kubenswrapper[4962]: E1003 13:13:49.714555 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe" containerName="ceilometer-notification-agent" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.714575 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe" containerName="ceilometer-notification-agent" Oct 03 13:13:49 crc kubenswrapper[4962]: E1003 13:13:49.714585 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe" containerName="sg-core" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.714591 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe" containerName="sg-core" Oct 03 13:13:49 crc kubenswrapper[4962]: E1003 13:13:49.714611 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="866c8e6b-3fdd-442c-98d4-cf44b6ef098c" containerName="cinder-db-sync" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.714617 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="866c8e6b-3fdd-442c-98d4-cf44b6ef098c" containerName="cinder-db-sync" Oct 03 13:13:49 crc kubenswrapper[4962]: E1003 13:13:49.714635 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe" containerName="proxy-httpd" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.714661 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe" containerName="proxy-httpd" Oct 03 13:13:49 crc kubenswrapper[4962]: E1003 13:13:49.714672 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe" containerName="ceilometer-central-agent" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.714717 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe" containerName="ceilometer-central-agent" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.714914 4962 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="866c8e6b-3fdd-442c-98d4-cf44b6ef098c" containerName="cinder-db-sync" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.714950 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe" containerName="sg-core" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.714958 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe" containerName="proxy-httpd" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.714970 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe" containerName="ceilometer-notification-agent" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.714987 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe" containerName="ceilometer-central-agent" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.716618 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.720477 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.720716 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.730888 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.869079 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0dea3c18-6654-4c16-b382-579ac81adcda-log-httpd\") pod \"ceilometer-0\" (UID: \"0dea3c18-6654-4c16-b382-579ac81adcda\") " pod="openstack/ceilometer-0" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.869121 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0dea3c18-6654-4c16-b382-579ac81adcda-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0dea3c18-6654-4c16-b382-579ac81adcda\") " pod="openstack/ceilometer-0" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.869152 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0dea3c18-6654-4c16-b382-579ac81adcda-scripts\") pod \"ceilometer-0\" (UID: \"0dea3c18-6654-4c16-b382-579ac81adcda\") " pod="openstack/ceilometer-0" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.869169 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dea3c18-6654-4c16-b382-579ac81adcda-config-data\") pod \"ceilometer-0\" (UID: \"0dea3c18-6654-4c16-b382-579ac81adcda\") " pod="openstack/ceilometer-0" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.869188 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tknk\" (UniqueName: \"kubernetes.io/projected/0dea3c18-6654-4c16-b382-579ac81adcda-kube-api-access-2tknk\") pod \"ceilometer-0\" (UID: \"0dea3c18-6654-4c16-b382-579ac81adcda\") " pod="openstack/ceilometer-0" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.869208 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0dea3c18-6654-4c16-b382-579ac81adcda-run-httpd\") pod \"ceilometer-0\" (UID: \"0dea3c18-6654-4c16-b382-579ac81adcda\") " pod="openstack/ceilometer-0" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.869239 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dea3c18-6654-4c16-b382-579ac81adcda-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0dea3c18-6654-4c16-b382-579ac81adcda\") " pod="openstack/ceilometer-0" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.971199 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0dea3c18-6654-4c16-b382-579ac81adcda-log-httpd\") pod \"ceilometer-0\" (UID: \"0dea3c18-6654-4c16-b382-579ac81adcda\") " pod="openstack/ceilometer-0" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.971251 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0dea3c18-6654-4c16-b382-579ac81adcda-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0dea3c18-6654-4c16-b382-579ac81adcda\") " pod="openstack/ceilometer-0" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.971283 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0dea3c18-6654-4c16-b382-579ac81adcda-scripts\") pod \"ceilometer-0\" (UID: \"0dea3c18-6654-4c16-b382-579ac81adcda\") " pod="openstack/ceilometer-0" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.971302 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dea3c18-6654-4c16-b382-579ac81adcda-config-data\") pod \"ceilometer-0\" (UID: \"0dea3c18-6654-4c16-b382-579ac81adcda\") " pod="openstack/ceilometer-0" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.971328 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tknk\" (UniqueName: \"kubernetes.io/projected/0dea3c18-6654-4c16-b382-579ac81adcda-kube-api-access-2tknk\") pod \"ceilometer-0\" (UID: \"0dea3c18-6654-4c16-b382-579ac81adcda\") " pod="openstack/ceilometer-0" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.971358 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0dea3c18-6654-4c16-b382-579ac81adcda-run-httpd\") pod \"ceilometer-0\" (UID: \"0dea3c18-6654-4c16-b382-579ac81adcda\") " pod="openstack/ceilometer-0" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.971405 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dea3c18-6654-4c16-b382-579ac81adcda-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0dea3c18-6654-4c16-b382-579ac81adcda\") " pod="openstack/ceilometer-0" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.973027 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0dea3c18-6654-4c16-b382-579ac81adcda-run-httpd\") pod \"ceilometer-0\" (UID: \"0dea3c18-6654-4c16-b382-579ac81adcda\") " pod="openstack/ceilometer-0" Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.973518 4962 
Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.985064 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dea3c18-6654-4c16-b382-579ac81adcda-config-data\") pod \"ceilometer-0\" (UID: \"0dea3c18-6654-4c16-b382-579ac81adcda\") " pod="openstack/ceilometer-0"
Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.985189 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0dea3c18-6654-4c16-b382-579ac81adcda-scripts\") pod \"ceilometer-0\" (UID: \"0dea3c18-6654-4c16-b382-579ac81adcda\") " pod="openstack/ceilometer-0"
Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.985689 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dea3c18-6654-4c16-b382-579ac81adcda-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0dea3c18-6654-4c16-b382-579ac81adcda\") " pod="openstack/ceilometer-0"
Oct 03 13:13:49 crc kubenswrapper[4962]: I1003 13:13:49.987488 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0dea3c18-6654-4c16-b382-579ac81adcda-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0dea3c18-6654-4c16-b382-579ac81adcda\") " pod="openstack/ceilometer-0"
Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.006959 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tknk\" (UniqueName: \"kubernetes.io/projected/0dea3c18-6654-4c16-b382-579ac81adcda-kube-api-access-2tknk\") pod \"ceilometer-0\" (UID: \"0dea3c18-6654-4c16-b382-579ac81adcda\") " pod="openstack/ceilometer-0"
Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.034062 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.079278 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.089860 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.095608 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.095821 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.095936 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.096044 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-7m46v" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.107030 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.138305 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-tgc99"] Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.139940 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-tgc99" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.162106 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-tgc99"] Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.176317 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c291303-c69e-44f7-9afa-15d3964eff4d-config-data\") pod \"cinder-scheduler-0\" (UID: \"1c291303-c69e-44f7-9afa-15d3964eff4d\") " pod="openstack/cinder-scheduler-0" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.176743 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c291303-c69e-44f7-9afa-15d3964eff4d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1c291303-c69e-44f7-9afa-15d3964eff4d\") " pod="openstack/cinder-scheduler-0" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.176981 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c291303-c69e-44f7-9afa-15d3964eff4d-scripts\") pod \"cinder-scheduler-0\" (UID: \"1c291303-c69e-44f7-9afa-15d3964eff4d\") " pod="openstack/cinder-scheduler-0" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.177096 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c291303-c69e-44f7-9afa-15d3964eff4d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1c291303-c69e-44f7-9afa-15d3964eff4d\") " pod="openstack/cinder-scheduler-0" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.177239 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c291303-c69e-44f7-9afa-15d3964eff4d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1c291303-c69e-44f7-9afa-15d3964eff4d\") " pod="openstack/cinder-scheduler-0" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.177364 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zndn\" (UniqueName: 
\"kubernetes.io/projected/1c291303-c69e-44f7-9afa-15d3964eff4d-kube-api-access-9zndn\") pod \"cinder-scheduler-0\" (UID: \"1c291303-c69e-44f7-9afa-15d3964eff4d\") " pod="openstack/cinder-scheduler-0" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.230478 4962 generic.go:334] "Generic (PLEG): container finished" podID="164dbb03-b802-46d3-8dbd-1b6dc90cda51" containerID="cd77f692ad16b6df80e87bb64d27bf54a83e55dc2e95cca7ac8bfb5c3f2c63a2" exitCode=0 Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.233413 4962 generic.go:334] "Generic (PLEG): container finished" podID="0b9e6c89-714e-4efd-9adc-c15cd5b3eb6b" containerID="0f13089470a687f3441879430ff97fb0147186cfadd8f1e2e6b1b3fdfab1347e" exitCode=0 Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.236311 4962 generic.go:334] "Generic (PLEG): container finished" podID="61cfe20f-97f2-444b-9a56-00a3c22d7ba7" containerID="cb23b1576afecd7bb943264b193f4118ca9e14e0fda87415eabb3843eaf944f9" exitCode=0 Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.254227 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe" path="/var/lib/kubelet/pods/b8ad1916-3bcc-4eee-83ad-5fdc7d6e2bbe/volumes" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.254922 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-7xjlw" event={"ID":"164dbb03-b802-46d3-8dbd-1b6dc90cda51","Type":"ContainerDied","Data":"cd77f692ad16b6df80e87bb64d27bf54a83e55dc2e95cca7ac8bfb5c3f2c63a2"} Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.254947 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-7xjlw" event={"ID":"164dbb03-b802-46d3-8dbd-1b6dc90cda51","Type":"ContainerStarted","Data":"cf325ece55fc28b2eda6770270e5380b70e82f877ba86383d6027b12adf2e0bf"} Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.254959 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9lwfl" event={"ID":"0b9e6c89-714e-4efd-9adc-c15cd5b3eb6b","Type":"ContainerDied","Data":"0f13089470a687f3441879430ff97fb0147186cfadd8f1e2e6b1b3fdfab1347e"} Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.254970 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9lwfl" event={"ID":"0b9e6c89-714e-4efd-9adc-c15cd5b3eb6b","Type":"ContainerStarted","Data":"a8acd362948332928f058c259f40284e8f08ca1173fd5dc3c34602750d0dd860"} Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.254979 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-djwxp" event={"ID":"61cfe20f-97f2-444b-9a56-00a3c22d7ba7","Type":"ContainerDied","Data":"cb23b1576afecd7bb943264b193f4118ca9e14e0fda87415eabb3843eaf944f9"} Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.254990 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-djwxp" event={"ID":"61cfe20f-97f2-444b-9a56-00a3c22d7ba7","Type":"ContainerStarted","Data":"8a4250d23794ca01ae46f5ed410f3e9f5f463a3209ebb633c08833b7adc799cd"} Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.279741 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c291303-c69e-44f7-9afa-15d3964eff4d-scripts\") pod \"cinder-scheduler-0\" (UID: \"1c291303-c69e-44f7-9afa-15d3964eff4d\") " pod="openstack/cinder-scheduler-0" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.279939 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7p2n\" (UniqueName: \"kubernetes.io/projected/143264d1-7bcc-47d8-aa73-047017954ff4-kube-api-access-f7p2n\") pod \"dnsmasq-dns-5784cf869f-tgc99\" (UID: \"143264d1-7bcc-47d8-aa73-047017954ff4\") " pod="openstack/dnsmasq-dns-5784cf869f-tgc99" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.280361 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c291303-c69e-44f7-9afa-15d3964eff4d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1c291303-c69e-44f7-9afa-15d3964eff4d\") " pod="openstack/cinder-scheduler-0" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.280451 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/143264d1-7bcc-47d8-aa73-047017954ff4-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-tgc99\" (UID: \"143264d1-7bcc-47d8-aa73-047017954ff4\") " pod="openstack/dnsmasq-dns-5784cf869f-tgc99" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.280532 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/143264d1-7bcc-47d8-aa73-047017954ff4-config\") pod \"dnsmasq-dns-5784cf869f-tgc99\" (UID: \"143264d1-7bcc-47d8-aa73-047017954ff4\") " pod="openstack/dnsmasq-dns-5784cf869f-tgc99" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.280606 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/143264d1-7bcc-47d8-aa73-047017954ff4-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-tgc99\" (UID: \"143264d1-7bcc-47d8-aa73-047017954ff4\") " pod="openstack/dnsmasq-dns-5784cf869f-tgc99" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.280714 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c291303-c69e-44f7-9afa-15d3964eff4d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1c291303-c69e-44f7-9afa-15d3964eff4d\") " pod="openstack/cinder-scheduler-0" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.280853 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zndn\" (UniqueName: \"kubernetes.io/projected/1c291303-c69e-44f7-9afa-15d3964eff4d-kube-api-access-9zndn\") pod \"cinder-scheduler-0\" (UID: \"1c291303-c69e-44f7-9afa-15d3964eff4d\") " pod="openstack/cinder-scheduler-0" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.280929 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c291303-c69e-44f7-9afa-15d3964eff4d-config-data\") pod \"cinder-scheduler-0\" (UID: \"1c291303-c69e-44f7-9afa-15d3964eff4d\") " pod="openstack/cinder-scheduler-0" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.281027 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c291303-c69e-44f7-9afa-15d3964eff4d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1c291303-c69e-44f7-9afa-15d3964eff4d\") " pod="openstack/cinder-scheduler-0" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.281369 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.281477 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/143264d1-7bcc-47d8-aa73-047017954ff4-dns-svc\") pod \"dnsmasq-dns-5784cf869f-tgc99\" (UID: \"143264d1-7bcc-47d8-aa73-047017954ff4\") " pod="openstack/dnsmasq-dns-5784cf869f-tgc99"
Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.288459 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c291303-c69e-44f7-9afa-15d3964eff4d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1c291303-c69e-44f7-9afa-15d3964eff4d\") " pod="openstack/cinder-scheduler-0"
Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.291075 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c291303-c69e-44f7-9afa-15d3964eff4d-config-data\") pod \"cinder-scheduler-0\" (UID: \"1c291303-c69e-44f7-9afa-15d3964eff4d\") " pod="openstack/cinder-scheduler-0"
Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.291572 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c291303-c69e-44f7-9afa-15d3964eff4d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1c291303-c69e-44f7-9afa-15d3964eff4d\") " pod="openstack/cinder-scheduler-0"
Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.298202 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c291303-c69e-44f7-9afa-15d3964eff4d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1c291303-c69e-44f7-9afa-15d3964eff4d\") " pod="openstack/cinder-scheduler-0"
Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.315316 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c291303-c69e-44f7-9afa-15d3964eff4d-scripts\") pod \"cinder-scheduler-0\" (UID: \"1c291303-c69e-44f7-9afa-15d3964eff4d\") " pod="openstack/cinder-scheduler-0"
Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.323562 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zndn\" (UniqueName: \"kubernetes.io/projected/1c291303-c69e-44f7-9afa-15d3964eff4d-kube-api-access-9zndn\") pod \"cinder-scheduler-0\" (UID: \"1c291303-c69e-44f7-9afa-15d3964eff4d\") " pod="openstack/cinder-scheduler-0"
Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.351371 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.352872 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Need to start a new one" pod="openstack/cinder-api-0" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.357161 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.363744 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.382952 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7p2n\" (UniqueName: \"kubernetes.io/projected/143264d1-7bcc-47d8-aa73-047017954ff4-kube-api-access-f7p2n\") pod \"dnsmasq-dns-5784cf869f-tgc99\" (UID: \"143264d1-7bcc-47d8-aa73-047017954ff4\") " pod="openstack/dnsmasq-dns-5784cf869f-tgc99" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.383049 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/143264d1-7bcc-47d8-aa73-047017954ff4-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-tgc99\" (UID: \"143264d1-7bcc-47d8-aa73-047017954ff4\") " pod="openstack/dnsmasq-dns-5784cf869f-tgc99" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.383084 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/143264d1-7bcc-47d8-aa73-047017954ff4-config\") pod \"dnsmasq-dns-5784cf869f-tgc99\" (UID: \"143264d1-7bcc-47d8-aa73-047017954ff4\") " pod="openstack/dnsmasq-dns-5784cf869f-tgc99" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.383112 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/143264d1-7bcc-47d8-aa73-047017954ff4-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-tgc99\" (UID: \"143264d1-7bcc-47d8-aa73-047017954ff4\") " pod="openstack/dnsmasq-dns-5784cf869f-tgc99" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.383310 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/143264d1-7bcc-47d8-aa73-047017954ff4-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-tgc99\" (UID: \"143264d1-7bcc-47d8-aa73-047017954ff4\") " pod="openstack/dnsmasq-dns-5784cf869f-tgc99" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.383344 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/143264d1-7bcc-47d8-aa73-047017954ff4-dns-svc\") pod \"dnsmasq-dns-5784cf869f-tgc99\" (UID: \"143264d1-7bcc-47d8-aa73-047017954ff4\") " pod="openstack/dnsmasq-dns-5784cf869f-tgc99" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.384651 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/143264d1-7bcc-47d8-aa73-047017954ff4-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-tgc99\" (UID: \"143264d1-7bcc-47d8-aa73-047017954ff4\") " pod="openstack/dnsmasq-dns-5784cf869f-tgc99" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.387538 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/143264d1-7bcc-47d8-aa73-047017954ff4-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-tgc99\" (UID: \"143264d1-7bcc-47d8-aa73-047017954ff4\") " pod="openstack/dnsmasq-dns-5784cf869f-tgc99" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.387967 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/143264d1-7bcc-47d8-aa73-047017954ff4-config\") pod \"dnsmasq-dns-5784cf869f-tgc99\" (UID: \"143264d1-7bcc-47d8-aa73-047017954ff4\") " pod="openstack/dnsmasq-dns-5784cf869f-tgc99" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.388777 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/143264d1-7bcc-47d8-aa73-047017954ff4-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-tgc99\" (UID: \"143264d1-7bcc-47d8-aa73-047017954ff4\") " pod="openstack/dnsmasq-dns-5784cf869f-tgc99" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.389017 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/143264d1-7bcc-47d8-aa73-047017954ff4-dns-svc\") pod \"dnsmasq-dns-5784cf869f-tgc99\" (UID: \"143264d1-7bcc-47d8-aa73-047017954ff4\") " pod="openstack/dnsmasq-dns-5784cf869f-tgc99" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.404745 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7p2n\" (UniqueName: \"kubernetes.io/projected/143264d1-7bcc-47d8-aa73-047017954ff4-kube-api-access-f7p2n\") pod \"dnsmasq-dns-5784cf869f-tgc99\" (UID: \"143264d1-7bcc-47d8-aa73-047017954ff4\") " pod="openstack/dnsmasq-dns-5784cf869f-tgc99" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.485160 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2626e82-5179-406f-b682-f3ae965e1bae-scripts\") pod \"cinder-api-0\" (UID: \"e2626e82-5179-406f-b682-f3ae965e1bae\") " pod="openstack/cinder-api-0" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.485226 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2626e82-5179-406f-b682-f3ae965e1bae-logs\") pod \"cinder-api-0\" (UID: \"e2626e82-5179-406f-b682-f3ae965e1bae\") " pod="openstack/cinder-api-0" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.485322 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e2626e82-5179-406f-b682-f3ae965e1bae-config-data-custom\") pod \"cinder-api-0\" (UID: \"e2626e82-5179-406f-b682-f3ae965e1bae\") " pod="openstack/cinder-api-0" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.485539 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2626e82-5179-406f-b682-f3ae965e1bae-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e2626e82-5179-406f-b682-f3ae965e1bae\") " pod="openstack/cinder-api-0" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.485691 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzvmt\" (UniqueName: \"kubernetes.io/projected/e2626e82-5179-406f-b682-f3ae965e1bae-kube-api-access-tzvmt\") pod \"cinder-api-0\" (UID: \"e2626e82-5179-406f-b682-f3ae965e1bae\") " pod="openstack/cinder-api-0" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.485769 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/e2626e82-5179-406f-b682-f3ae965e1bae-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e2626e82-5179-406f-b682-f3ae965e1bae\") " pod="openstack/cinder-api-0" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.485940 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2626e82-5179-406f-b682-f3ae965e1bae-config-data\") pod \"cinder-api-0\" (UID: \"e2626e82-5179-406f-b682-f3ae965e1bae\") " pod="openstack/cinder-api-0" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.586988 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e2626e82-5179-406f-b682-f3ae965e1bae-config-data-custom\") pod \"cinder-api-0\" (UID: \"e2626e82-5179-406f-b682-f3ae965e1bae\") " pod="openstack/cinder-api-0" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.587060 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2626e82-5179-406f-b682-f3ae965e1bae-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e2626e82-5179-406f-b682-f3ae965e1bae\") " pod="openstack/cinder-api-0" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.587122 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzvmt\" (UniqueName: \"kubernetes.io/projected/e2626e82-5179-406f-b682-f3ae965e1bae-kube-api-access-tzvmt\") pod \"cinder-api-0\" (UID: \"e2626e82-5179-406f-b682-f3ae965e1bae\") " pod="openstack/cinder-api-0" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.587161 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e2626e82-5179-406f-b682-f3ae965e1bae-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e2626e82-5179-406f-b682-f3ae965e1bae\") " pod="openstack/cinder-api-0" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.587221 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2626e82-5179-406f-b682-f3ae965e1bae-config-data\") pod \"cinder-api-0\" (UID: \"e2626e82-5179-406f-b682-f3ae965e1bae\") " pod="openstack/cinder-api-0" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.587252 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2626e82-5179-406f-b682-f3ae965e1bae-scripts\") pod \"cinder-api-0\" (UID: \"e2626e82-5179-406f-b682-f3ae965e1bae\") " pod="openstack/cinder-api-0" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.587273 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2626e82-5179-406f-b682-f3ae965e1bae-logs\") pod \"cinder-api-0\" (UID: \"e2626e82-5179-406f-b682-f3ae965e1bae\") " pod="openstack/cinder-api-0" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.587630 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2626e82-5179-406f-b682-f3ae965e1bae-logs\") pod \"cinder-api-0\" (UID: \"e2626e82-5179-406f-b682-f3ae965e1bae\") " pod="openstack/cinder-api-0" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.588244 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.588860 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e2626e82-5179-406f-b682-f3ae965e1bae-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e2626e82-5179-406f-b682-f3ae965e1bae\") " pod="openstack/cinder-api-0" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.589690 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-tgc99" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.595505 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2626e82-5179-406f-b682-f3ae965e1bae-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e2626e82-5179-406f-b682-f3ae965e1bae\") " pod="openstack/cinder-api-0" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.596702 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2626e82-5179-406f-b682-f3ae965e1bae-config-data\") pod \"cinder-api-0\" (UID: \"e2626e82-5179-406f-b682-f3ae965e1bae\") " pod="openstack/cinder-api-0" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.602327 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e2626e82-5179-406f-b682-f3ae965e1bae-config-data-custom\") pod \"cinder-api-0\" (UID: \"e2626e82-5179-406f-b682-f3ae965e1bae\") " pod="openstack/cinder-api-0" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.618418 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2626e82-5179-406f-b682-f3ae965e1bae-scripts\") pod \"cinder-api-0\" (UID: \"e2626e82-5179-406f-b682-f3ae965e1bae\") " pod="openstack/cinder-api-0" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.677250 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzvmt\" (UniqueName: \"kubernetes.io/projected/e2626e82-5179-406f-b682-f3ae965e1bae-kube-api-access-tzvmt\") pod \"cinder-api-0\" (UID: \"e2626e82-5179-406f-b682-f3ae965e1bae\") " pod="openstack/cinder-api-0" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.703920 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 03 13:13:50 crc kubenswrapper[4962]: I1003 13:13:50.754978 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 13:13:51 crc kubenswrapper[4962]: I1003 13:13:51.251963 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0dea3c18-6654-4c16-b382-579ac81adcda","Type":"ContainerStarted","Data":"cf9329f89ee3a3ebe4fb670d8ea694eee50688c5af85afc136588c635be72d65"} Oct 03 13:13:51 crc kubenswrapper[4962]: I1003 13:13:51.290737 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 13:13:51 crc kubenswrapper[4962]: W1003 13:13:51.302938 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c291303_c69e_44f7_9afa_15d3964eff4d.slice/crio-761b5cbb574dce3d2778e7a928d3cfe4290c60311663b20dafa944e152065246 WatchSource:0}: Error finding container 761b5cbb574dce3d2778e7a928d3cfe4290c60311663b20dafa944e152065246: Status 404 returned error can't find the container with id 761b5cbb574dce3d2778e7a928d3cfe4290c60311663b20dafa944e152065246 Oct 03 13:13:51 crc kubenswrapper[4962]: I1003 13:13:51.363311 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 03 13:13:51 crc kubenswrapper[4962]: I1003 13:13:51.436426 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-tgc99"] Oct 03 13:13:51 crc kubenswrapper[4962]: I1003 13:13:51.845787 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 13:13:51 crc kubenswrapper[4962]: I1003 13:13:51.922663 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-7xjlw" Oct 03 13:13:51 crc kubenswrapper[4962]: I1003 13:13:51.997555 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-djwxp" Oct 03 13:13:52 crc kubenswrapper[4962]: I1003 13:13:52.002116 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-9lwfl" Oct 03 13:13:52 crc kubenswrapper[4962]: I1003 13:13:52.032531 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnqgs\" (UniqueName: \"kubernetes.io/projected/164dbb03-b802-46d3-8dbd-1b6dc90cda51-kube-api-access-cnqgs\") pod \"164dbb03-b802-46d3-8dbd-1b6dc90cda51\" (UID: \"164dbb03-b802-46d3-8dbd-1b6dc90cda51\") " Oct 03 13:13:52 crc kubenswrapper[4962]: I1003 13:13:52.068909 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/164dbb03-b802-46d3-8dbd-1b6dc90cda51-kube-api-access-cnqgs" (OuterVolumeSpecName: "kube-api-access-cnqgs") pod "164dbb03-b802-46d3-8dbd-1b6dc90cda51" (UID: "164dbb03-b802-46d3-8dbd-1b6dc90cda51"). InnerVolumeSpecName "kube-api-access-cnqgs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:13:52 crc kubenswrapper[4962]: I1003 13:13:52.134260 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nj4lf\" (UniqueName: \"kubernetes.io/projected/61cfe20f-97f2-444b-9a56-00a3c22d7ba7-kube-api-access-nj4lf\") pod \"61cfe20f-97f2-444b-9a56-00a3c22d7ba7\" (UID: \"61cfe20f-97f2-444b-9a56-00a3c22d7ba7\") " Oct 03 13:13:52 crc kubenswrapper[4962]: I1003 13:13:52.134506 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxzhb\" (UniqueName: \"kubernetes.io/projected/0b9e6c89-714e-4efd-9adc-c15cd5b3eb6b-kube-api-access-vxzhb\") pod \"0b9e6c89-714e-4efd-9adc-c15cd5b3eb6b\" (UID: \"0b9e6c89-714e-4efd-9adc-c15cd5b3eb6b\") " Oct 03 13:13:52 crc kubenswrapper[4962]: I1003 13:13:52.134891 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnqgs\" (UniqueName: \"kubernetes.io/projected/164dbb03-b802-46d3-8dbd-1b6dc90cda51-kube-api-access-cnqgs\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:52 crc kubenswrapper[4962]: I1003 13:13:52.139513 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b9e6c89-714e-4efd-9adc-c15cd5b3eb6b-kube-api-access-vxzhb" (OuterVolumeSpecName: "kube-api-access-vxzhb") pod "0b9e6c89-714e-4efd-9adc-c15cd5b3eb6b" (UID: "0b9e6c89-714e-4efd-9adc-c15cd5b3eb6b"). InnerVolumeSpecName "kube-api-access-vxzhb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:13:52 crc kubenswrapper[4962]: I1003 13:13:52.140358 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61cfe20f-97f2-444b-9a56-00a3c22d7ba7-kube-api-access-nj4lf" (OuterVolumeSpecName: "kube-api-access-nj4lf") pod "61cfe20f-97f2-444b-9a56-00a3c22d7ba7" (UID: "61cfe20f-97f2-444b-9a56-00a3c22d7ba7"). InnerVolumeSpecName "kube-api-access-nj4lf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:13:52 crc kubenswrapper[4962]: I1003 13:13:52.236020 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxzhb\" (UniqueName: \"kubernetes.io/projected/0b9e6c89-714e-4efd-9adc-c15cd5b3eb6b-kube-api-access-vxzhb\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:52 crc kubenswrapper[4962]: I1003 13:13:52.236261 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nj4lf\" (UniqueName: \"kubernetes.io/projected/61cfe20f-97f2-444b-9a56-00a3c22d7ba7-kube-api-access-nj4lf\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:52 crc kubenswrapper[4962]: I1003 13:13:52.291758 4962 generic.go:334] "Generic (PLEG): container finished" podID="143264d1-7bcc-47d8-aa73-047017954ff4" containerID="183854e4e4df735b62ad5a86e902e0394b30db0b0f4ce13143287ec7a88d4fca" exitCode=0 Oct 03 13:13:52 crc kubenswrapper[4962]: I1003 13:13:52.291871 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-tgc99" event={"ID":"143264d1-7bcc-47d8-aa73-047017954ff4","Type":"ContainerDied","Data":"183854e4e4df735b62ad5a86e902e0394b30db0b0f4ce13143287ec7a88d4fca"} Oct 03 13:13:52 crc kubenswrapper[4962]: I1003 13:13:52.291951 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-tgc99" event={"ID":"143264d1-7bcc-47d8-aa73-047017954ff4","Type":"ContainerStarted","Data":"da873234757532eac321f4323009fc887086ba08c6dca1ac0f713be079b2c000"} Oct 03 13:13:52 crc kubenswrapper[4962]: I1003 13:13:52.322280 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9lwfl" event={"ID":"0b9e6c89-714e-4efd-9adc-c15cd5b3eb6b","Type":"ContainerDied","Data":"a8acd362948332928f058c259f40284e8f08ca1173fd5dc3c34602750d0dd860"} Oct 03 13:13:52 crc kubenswrapper[4962]: I1003 13:13:52.322338 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8acd362948332928f058c259f40284e8f08ca1173fd5dc3c34602750d0dd860" Oct 03 13:13:52 crc kubenswrapper[4962]: I1003 13:13:52.322438 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-9lwfl" Oct 03 13:13:52 crc kubenswrapper[4962]: I1003 13:13:52.325941 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1c291303-c69e-44f7-9afa-15d3964eff4d","Type":"ContainerStarted","Data":"761b5cbb574dce3d2778e7a928d3cfe4290c60311663b20dafa944e152065246"} Oct 03 13:13:52 crc kubenswrapper[4962]: I1003 13:13:52.333732 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-djwxp" event={"ID":"61cfe20f-97f2-444b-9a56-00a3c22d7ba7","Type":"ContainerDied","Data":"8a4250d23794ca01ae46f5ed410f3e9f5f463a3209ebb633c08833b7adc799cd"} Oct 03 13:13:52 crc kubenswrapper[4962]: I1003 13:13:52.333773 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a4250d23794ca01ae46f5ed410f3e9f5f463a3209ebb633c08833b7adc799cd" Oct 03 13:13:52 crc kubenswrapper[4962]: I1003 13:13:52.333846 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-djwxp" Oct 03 13:13:52 crc kubenswrapper[4962]: I1003 13:13:52.336550 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e2626e82-5179-406f-b682-f3ae965e1bae","Type":"ContainerStarted","Data":"fc29ea97fbc7fbe9c2fb29be6caaae8bb05abd4a0477e903fbae7ab46d8507ce"} Oct 03 13:13:52 crc kubenswrapper[4962]: I1003 13:13:52.356129 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-7xjlw" Oct 03 13:13:52 crc kubenswrapper[4962]: I1003 13:13:52.356161 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-7xjlw" event={"ID":"164dbb03-b802-46d3-8dbd-1b6dc90cda51","Type":"ContainerDied","Data":"cf325ece55fc28b2eda6770270e5380b70e82f877ba86383d6027b12adf2e0bf"} Oct 03 13:13:52 crc kubenswrapper[4962]: I1003 13:13:52.356207 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf325ece55fc28b2eda6770270e5380b70e82f877ba86383d6027b12adf2e0bf" Oct 03 13:13:52 crc kubenswrapper[4962]: I1003 13:13:52.364754 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0dea3c18-6654-4c16-b382-579ac81adcda","Type":"ContainerStarted","Data":"92e0595c3016c7c09ceab7eebbf189e014011e8f0225ede105f2bcf892456af8"} Oct 03 13:13:52 crc kubenswrapper[4962]: I1003 13:13:52.728589 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 03 13:13:53 crc kubenswrapper[4962]: I1003 13:13:53.378552 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e2626e82-5179-406f-b682-f3ae965e1bae","Type":"ContainerStarted","Data":"4bc327532ccafa3392733bb50a5055f8a11ac39b9c37f7646b23a4c708e78d9c"} Oct 03 13:13:53 crc kubenswrapper[4962]: I1003 13:13:53.385338 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0dea3c18-6654-4c16-b382-579ac81adcda","Type":"ContainerStarted","Data":"5cfd9b23fc073c68aa4c0d8b3b242a26954603c4ebd5bf1b670741320cbbf2cd"} Oct 03 13:13:53 crc kubenswrapper[4962]: I1003 13:13:53.394352 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-tgc99" event={"ID":"143264d1-7bcc-47d8-aa73-047017954ff4","Type":"ContainerStarted","Data":"11088df8f83d28c23c00a477e48a59fc59a6392d7d1dfee3bdd6b3d0b252f17d"} Oct 03 13:13:53 crc kubenswrapper[4962]: I1003 13:13:53.394484 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5784cf869f-tgc99" Oct 03 13:13:53 crc kubenswrapper[4962]: I1003 13:13:53.397595 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1c291303-c69e-44f7-9afa-15d3964eff4d","Type":"ContainerStarted","Data":"db4cffdea214f8319fe9cef9b03adeb1c68d4302a4bf66714632575f4186097f"} Oct 03 13:13:53 crc kubenswrapper[4962]: I1003 13:13:53.417538 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5784cf869f-tgc99" podStartSLOduration=3.417523456 podStartE2EDuration="3.417523456s" podCreationTimestamp="2025-10-03 13:13:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:13:53.414259949 +0000 UTC m=+1441.818157794" watchObservedRunningTime="2025-10-03 13:13:53.417523456 +0000 UTC m=+1441.821421291" Oct 03 13:13:54 crc kubenswrapper[4962]: I1003 13:13:54.410357 
Oct 03 13:13:54 crc kubenswrapper[4962]: I1003 13:13:54.412387 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1c291303-c69e-44f7-9afa-15d3964eff4d","Type":"ContainerStarted","Data":"51e10bccf5b133b42d415f9798ef5913e91b1aa2310fe64e85b079770a169a54"}
Oct 03 13:13:54 crc kubenswrapper[4962]: I1003 13:13:54.415192 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e2626e82-5179-406f-b682-f3ae965e1bae","Type":"ContainerStarted","Data":"f71ed4fa8461231119d8a0bf171f9fde881adff37e3844b1413c372432e9b191"}
Oct 03 13:13:54 crc kubenswrapper[4962]: I1003 13:13:54.415304 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="e2626e82-5179-406f-b682-f3ae965e1bae" containerName="cinder-api-log" containerID="cri-o://4bc327532ccafa3392733bb50a5055f8a11ac39b9c37f7646b23a4c708e78d9c" gracePeriod=30
Oct 03 13:13:54 crc kubenswrapper[4962]: I1003 13:13:54.415334 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Oct 03 13:13:54 crc kubenswrapper[4962]: I1003 13:13:54.415363 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="e2626e82-5179-406f-b682-f3ae965e1bae" containerName="cinder-api" containerID="cri-o://f71ed4fa8461231119d8a0bf171f9fde881adff37e3844b1413c372432e9b191" gracePeriod=30
Oct 03 13:13:54 crc kubenswrapper[4962]: I1003 13:13:54.437870 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.557524037 podStartE2EDuration="4.437850092s" podCreationTimestamp="2025-10-03 13:13:50 +0000 UTC" firstStartedPulling="2025-10-03 13:13:51.315479186 +0000 UTC m=+1439.719377021" lastFinishedPulling="2025-10-03 13:13:52.195805241 +0000 UTC m=+1440.599703076" observedRunningTime="2025-10-03 13:13:54.429821517 +0000 UTC m=+1442.833719352" watchObservedRunningTime="2025-10-03 13:13:54.437850092 +0000 UTC m=+1442.841747937"
Oct 03 13:13:54 crc kubenswrapper[4962]: I1003 13:13:54.450631 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.450614215 podStartE2EDuration="4.450614215s" podCreationTimestamp="2025-10-03 13:13:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:13:54.446881505 +0000 UTC m=+1442.850779340" watchObservedRunningTime="2025-10-03 13:13:54.450614215 +0000 UTC m=+1442.854512050"
Oct 03 13:13:54 crc kubenswrapper[4962]: I1003 13:13:54.662704 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 13:13:54 crc kubenswrapper[4962]: I1003 13:13:54.663036 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.106244 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.205555 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2626e82-5179-406f-b682-f3ae965e1bae-scripts\") pod \"e2626e82-5179-406f-b682-f3ae965e1bae\" (UID: \"e2626e82-5179-406f-b682-f3ae965e1bae\") "
Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.205603 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2626e82-5179-406f-b682-f3ae965e1bae-logs\") pod \"e2626e82-5179-406f-b682-f3ae965e1bae\" (UID: \"e2626e82-5179-406f-b682-f3ae965e1bae\") "
Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.205740 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e2626e82-5179-406f-b682-f3ae965e1bae-config-data-custom\") pod \"e2626e82-5179-406f-b682-f3ae965e1bae\" (UID: \"e2626e82-5179-406f-b682-f3ae965e1bae\") "
Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.205762 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2626e82-5179-406f-b682-f3ae965e1bae-config-data\") pod \"e2626e82-5179-406f-b682-f3ae965e1bae\" (UID: \"e2626e82-5179-406f-b682-f3ae965e1bae\") "
Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.205776 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2626e82-5179-406f-b682-f3ae965e1bae-combined-ca-bundle\") pod \"e2626e82-5179-406f-b682-f3ae965e1bae\" (UID: \"e2626e82-5179-406f-b682-f3ae965e1bae\") "
Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.205834 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e2626e82-5179-406f-b682-f3ae965e1bae-etc-machine-id\") pod \"e2626e82-5179-406f-b682-f3ae965e1bae\" (UID: \"e2626e82-5179-406f-b682-f3ae965e1bae\") "
Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.205883 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzvmt\" (UniqueName: \"kubernetes.io/projected/e2626e82-5179-406f-b682-f3ae965e1bae-kube-api-access-tzvmt\") pod \"e2626e82-5179-406f-b682-f3ae965e1bae\" (UID: \"e2626e82-5179-406f-b682-f3ae965e1bae\") "
Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.206310 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2626e82-5179-406f-b682-f3ae965e1bae-logs" (OuterVolumeSpecName: "logs") pod "e2626e82-5179-406f-b682-f3ae965e1bae" (UID: "e2626e82-5179-406f-b682-f3ae965e1bae"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.206381 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2626e82-5179-406f-b682-f3ae965e1bae-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e2626e82-5179-406f-b682-f3ae965e1bae" (UID: "e2626e82-5179-406f-b682-f3ae965e1bae"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.210142 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2626e82-5179-406f-b682-f3ae965e1bae-kube-api-access-tzvmt" (OuterVolumeSpecName: "kube-api-access-tzvmt") pod "e2626e82-5179-406f-b682-f3ae965e1bae" (UID: "e2626e82-5179-406f-b682-f3ae965e1bae"). InnerVolumeSpecName "kube-api-access-tzvmt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.210156 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2626e82-5179-406f-b682-f3ae965e1bae-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e2626e82-5179-406f-b682-f3ae965e1bae" (UID: "e2626e82-5179-406f-b682-f3ae965e1bae"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.231992 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2626e82-5179-406f-b682-f3ae965e1bae-scripts" (OuterVolumeSpecName: "scripts") pod "e2626e82-5179-406f-b682-f3ae965e1bae" (UID: "e2626e82-5179-406f-b682-f3ae965e1bae"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.241531 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2626e82-5179-406f-b682-f3ae965e1bae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e2626e82-5179-406f-b682-f3ae965e1bae" (UID: "e2626e82-5179-406f-b682-f3ae965e1bae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.257502 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2626e82-5179-406f-b682-f3ae965e1bae-config-data" (OuterVolumeSpecName: "config-data") pod "e2626e82-5179-406f-b682-f3ae965e1bae" (UID: "e2626e82-5179-406f-b682-f3ae965e1bae"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.308792 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e2626e82-5179-406f-b682-f3ae965e1bae-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.308821 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2626e82-5179-406f-b682-f3ae965e1bae-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.308833 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2626e82-5179-406f-b682-f3ae965e1bae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.308844 4962 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e2626e82-5179-406f-b682-f3ae965e1bae-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.308854 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzvmt\" (UniqueName: \"kubernetes.io/projected/e2626e82-5179-406f-b682-f3ae965e1bae-kube-api-access-tzvmt\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.308863 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2626e82-5179-406f-b682-f3ae965e1bae-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.308873 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2626e82-5179-406f-b682-f3ae965e1bae-logs\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.425724 4962 generic.go:334] "Generic (PLEG): container finished" podID="e2626e82-5179-406f-b682-f3ae965e1bae" containerID="f71ed4fa8461231119d8a0bf171f9fde881adff37e3844b1413c372432e9b191" exitCode=0 Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.425753 4962 generic.go:334] "Generic (PLEG): container finished" podID="e2626e82-5179-406f-b682-f3ae965e1bae" containerID="4bc327532ccafa3392733bb50a5055f8a11ac39b9c37f7646b23a4c708e78d9c" exitCode=143 Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.425770 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.425773 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e2626e82-5179-406f-b682-f3ae965e1bae","Type":"ContainerDied","Data":"f71ed4fa8461231119d8a0bf171f9fde881adff37e3844b1413c372432e9b191"} Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.425831 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e2626e82-5179-406f-b682-f3ae965e1bae","Type":"ContainerDied","Data":"4bc327532ccafa3392733bb50a5055f8a11ac39b9c37f7646b23a4c708e78d9c"} Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.425851 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e2626e82-5179-406f-b682-f3ae965e1bae","Type":"ContainerDied","Data":"fc29ea97fbc7fbe9c2fb29be6caaae8bb05abd4a0477e903fbae7ab46d8507ce"} Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.425880 4962 scope.go:117] "RemoveContainer" containerID="f71ed4fa8461231119d8a0bf171f9fde881adff37e3844b1413c372432e9b191" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.428589 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0dea3c18-6654-4c16-b382-579ac81adcda","Type":"ContainerStarted","Data":"aa11f7bd2821a9bf939114394718af9133b67177857cabf683e8b5433d3c0e77"} Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.428799 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0dea3c18-6654-4c16-b382-579ac81adcda" containerName="ceilometer-central-agent" containerID="cri-o://92e0595c3016c7c09ceab7eebbf189e014011e8f0225ede105f2bcf892456af8" gracePeriod=30 Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.428847 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0dea3c18-6654-4c16-b382-579ac81adcda" containerName="sg-core" containerID="cri-o://49f365b73efa9e81fce808106811dbada9e196b48b503c49d595857e43499892" gracePeriod=30 Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.428911 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0dea3c18-6654-4c16-b382-579ac81adcda" containerName="ceilometer-notification-agent" containerID="cri-o://5cfd9b23fc073c68aa4c0d8b3b242a26954603c4ebd5bf1b670741320cbbf2cd" gracePeriod=30 Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.428933 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0dea3c18-6654-4c16-b382-579ac81adcda" containerName="proxy-httpd" containerID="cri-o://aa11f7bd2821a9bf939114394718af9133b67177857cabf683e8b5433d3c0e77" gracePeriod=30 Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.428819 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.478038 4962 scope.go:117] "RemoveContainer" containerID="4bc327532ccafa3392733bb50a5055f8a11ac39b9c37f7646b23a4c708e78d9c" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.487928 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.697291572 podStartE2EDuration="6.487907667s" podCreationTimestamp="2025-10-03 13:13:49 +0000 UTC" firstStartedPulling="2025-10-03 13:13:50.809552197 +0000 UTC m=+1439.213450022" 
lastFinishedPulling="2025-10-03 13:13:54.600168282 +0000 UTC m=+1443.004066117" observedRunningTime="2025-10-03 13:13:55.475501314 +0000 UTC m=+1443.879399149" watchObservedRunningTime="2025-10-03 13:13:55.487907667 +0000 UTC m=+1443.891805522" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.498734 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.505078 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.515818 4962 scope.go:117] "RemoveContainer" containerID="f71ed4fa8461231119d8a0bf171f9fde881adff37e3844b1413c372432e9b191" Oct 03 13:13:55 crc kubenswrapper[4962]: E1003 13:13:55.516337 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f71ed4fa8461231119d8a0bf171f9fde881adff37e3844b1413c372432e9b191\": container with ID starting with f71ed4fa8461231119d8a0bf171f9fde881adff37e3844b1413c372432e9b191 not found: ID does not exist" containerID="f71ed4fa8461231119d8a0bf171f9fde881adff37e3844b1413c372432e9b191" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.516381 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f71ed4fa8461231119d8a0bf171f9fde881adff37e3844b1413c372432e9b191"} err="failed to get container status \"f71ed4fa8461231119d8a0bf171f9fde881adff37e3844b1413c372432e9b191\": rpc error: code = NotFound desc = could not find container \"f71ed4fa8461231119d8a0bf171f9fde881adff37e3844b1413c372432e9b191\": container with ID starting with f71ed4fa8461231119d8a0bf171f9fde881adff37e3844b1413c372432e9b191 not found: ID does not exist" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.516407 4962 scope.go:117] "RemoveContainer" containerID="4bc327532ccafa3392733bb50a5055f8a11ac39b9c37f7646b23a4c708e78d9c" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.520161 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 03 13:13:55 crc kubenswrapper[4962]: E1003 13:13:55.520577 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2626e82-5179-406f-b682-f3ae965e1bae" containerName="cinder-api-log" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.520969 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2626e82-5179-406f-b682-f3ae965e1bae" containerName="cinder-api-log" Oct 03 13:13:55 crc kubenswrapper[4962]: E1003 13:13:55.521007 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="164dbb03-b802-46d3-8dbd-1b6dc90cda51" containerName="mariadb-database-create" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.521019 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="164dbb03-b802-46d3-8dbd-1b6dc90cda51" containerName="mariadb-database-create" Oct 03 13:13:55 crc kubenswrapper[4962]: E1003 13:13:55.521030 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2626e82-5179-406f-b682-f3ae965e1bae" containerName="cinder-api" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.521038 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2626e82-5179-406f-b682-f3ae965e1bae" containerName="cinder-api" Oct 03 13:13:55 crc kubenswrapper[4962]: E1003 13:13:55.521070 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61cfe20f-97f2-444b-9a56-00a3c22d7ba7" containerName="mariadb-database-create" Oct 03 13:13:55 crc kubenswrapper[4962]: 
I1003 13:13:55.521078 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="61cfe20f-97f2-444b-9a56-00a3c22d7ba7" containerName="mariadb-database-create" Oct 03 13:13:55 crc kubenswrapper[4962]: E1003 13:13:55.521093 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b9e6c89-714e-4efd-9adc-c15cd5b3eb6b" containerName="mariadb-database-create" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.521102 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b9e6c89-714e-4efd-9adc-c15cd5b3eb6b" containerName="mariadb-database-create" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.521335 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2626e82-5179-406f-b682-f3ae965e1bae" containerName="cinder-api" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.521347 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="164dbb03-b802-46d3-8dbd-1b6dc90cda51" containerName="mariadb-database-create" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.521367 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2626e82-5179-406f-b682-f3ae965e1bae" containerName="cinder-api-log" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.521392 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="61cfe20f-97f2-444b-9a56-00a3c22d7ba7" containerName="mariadb-database-create" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.521405 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b9e6c89-714e-4efd-9adc-c15cd5b3eb6b" containerName="mariadb-database-create" Oct 03 13:13:55 crc kubenswrapper[4962]: E1003 13:13:55.521352 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bc327532ccafa3392733bb50a5055f8a11ac39b9c37f7646b23a4c708e78d9c\": container with ID starting with 4bc327532ccafa3392733bb50a5055f8a11ac39b9c37f7646b23a4c708e78d9c not found: ID does not exist" containerID="4bc327532ccafa3392733bb50a5055f8a11ac39b9c37f7646b23a4c708e78d9c" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.521504 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bc327532ccafa3392733bb50a5055f8a11ac39b9c37f7646b23a4c708e78d9c"} err="failed to get container status \"4bc327532ccafa3392733bb50a5055f8a11ac39b9c37f7646b23a4c708e78d9c\": rpc error: code = NotFound desc = could not find container \"4bc327532ccafa3392733bb50a5055f8a11ac39b9c37f7646b23a4c708e78d9c\": container with ID starting with 4bc327532ccafa3392733bb50a5055f8a11ac39b9c37f7646b23a4c708e78d9c not found: ID does not exist" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.521541 4962 scope.go:117] "RemoveContainer" containerID="f71ed4fa8461231119d8a0bf171f9fde881adff37e3844b1413c372432e9b191" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.521887 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f71ed4fa8461231119d8a0bf171f9fde881adff37e3844b1413c372432e9b191"} err="failed to get container status \"f71ed4fa8461231119d8a0bf171f9fde881adff37e3844b1413c372432e9b191\": rpc error: code = NotFound desc = could not find container \"f71ed4fa8461231119d8a0bf171f9fde881adff37e3844b1413c372432e9b191\": container with ID starting with f71ed4fa8461231119d8a0bf171f9fde881adff37e3844b1413c372432e9b191 not found: ID does not exist" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.521913 4962 scope.go:117] "RemoveContainer" 
containerID="4bc327532ccafa3392733bb50a5055f8a11ac39b9c37f7646b23a4c708e78d9c" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.522655 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.523455 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bc327532ccafa3392733bb50a5055f8a11ac39b9c37f7646b23a4c708e78d9c"} err="failed to get container status \"4bc327532ccafa3392733bb50a5055f8a11ac39b9c37f7646b23a4c708e78d9c\": rpc error: code = NotFound desc = could not find container \"4bc327532ccafa3392733bb50a5055f8a11ac39b9c37f7646b23a4c708e78d9c\": container with ID starting with 4bc327532ccafa3392733bb50a5055f8a11ac39b9c37f7646b23a4c708e78d9c not found: ID does not exist" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.526729 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.526848 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.526941 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.534468 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.589606 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.615336 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ae29e17-1d99-4401-a317-9c8b7be58a3c-config-data\") pod \"cinder-api-0\" (UID: \"6ae29e17-1d99-4401-a317-9c8b7be58a3c\") " pod="openstack/cinder-api-0" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.615376 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ae29e17-1d99-4401-a317-9c8b7be58a3c-config-data-custom\") pod \"cinder-api-0\" (UID: \"6ae29e17-1d99-4401-a317-9c8b7be58a3c\") " pod="openstack/cinder-api-0" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.615437 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sstsr\" (UniqueName: \"kubernetes.io/projected/6ae29e17-1d99-4401-a317-9c8b7be58a3c-kube-api-access-sstsr\") pod \"cinder-api-0\" (UID: \"6ae29e17-1d99-4401-a317-9c8b7be58a3c\") " pod="openstack/cinder-api-0" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.615458 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ae29e17-1d99-4401-a317-9c8b7be58a3c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6ae29e17-1d99-4401-a317-9c8b7be58a3c\") " pod="openstack/cinder-api-0" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.615491 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ae29e17-1d99-4401-a317-9c8b7be58a3c-scripts\") pod \"cinder-api-0\" (UID: \"6ae29e17-1d99-4401-a317-9c8b7be58a3c\") " pod="openstack/cinder-api-0" Oct 03 
13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.615511 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ae29e17-1d99-4401-a317-9c8b7be58a3c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"6ae29e17-1d99-4401-a317-9c8b7be58a3c\") " pod="openstack/cinder-api-0" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.615533 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6ae29e17-1d99-4401-a317-9c8b7be58a3c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6ae29e17-1d99-4401-a317-9c8b7be58a3c\") " pod="openstack/cinder-api-0" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.615555 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ae29e17-1d99-4401-a317-9c8b7be58a3c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"6ae29e17-1d99-4401-a317-9c8b7be58a3c\") " pod="openstack/cinder-api-0" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.615592 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ae29e17-1d99-4401-a317-9c8b7be58a3c-logs\") pod \"cinder-api-0\" (UID: \"6ae29e17-1d99-4401-a317-9c8b7be58a3c\") " pod="openstack/cinder-api-0" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.716782 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sstsr\" (UniqueName: \"kubernetes.io/projected/6ae29e17-1d99-4401-a317-9c8b7be58a3c-kube-api-access-sstsr\") pod \"cinder-api-0\" (UID: \"6ae29e17-1d99-4401-a317-9c8b7be58a3c\") " pod="openstack/cinder-api-0" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.716834 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ae29e17-1d99-4401-a317-9c8b7be58a3c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6ae29e17-1d99-4401-a317-9c8b7be58a3c\") " pod="openstack/cinder-api-0" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.716903 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ae29e17-1d99-4401-a317-9c8b7be58a3c-scripts\") pod \"cinder-api-0\" (UID: \"6ae29e17-1d99-4401-a317-9c8b7be58a3c\") " pod="openstack/cinder-api-0" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.716929 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ae29e17-1d99-4401-a317-9c8b7be58a3c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"6ae29e17-1d99-4401-a317-9c8b7be58a3c\") " pod="openstack/cinder-api-0" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.716956 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6ae29e17-1d99-4401-a317-9c8b7be58a3c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6ae29e17-1d99-4401-a317-9c8b7be58a3c\") " pod="openstack/cinder-api-0" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.716979 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ae29e17-1d99-4401-a317-9c8b7be58a3c-public-tls-certs\") pod \"cinder-api-0\" 
(UID: \"6ae29e17-1d99-4401-a317-9c8b7be58a3c\") " pod="openstack/cinder-api-0" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.717011 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ae29e17-1d99-4401-a317-9c8b7be58a3c-logs\") pod \"cinder-api-0\" (UID: \"6ae29e17-1d99-4401-a317-9c8b7be58a3c\") " pod="openstack/cinder-api-0" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.717053 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ae29e17-1d99-4401-a317-9c8b7be58a3c-config-data\") pod \"cinder-api-0\" (UID: \"6ae29e17-1d99-4401-a317-9c8b7be58a3c\") " pod="openstack/cinder-api-0" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.717069 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ae29e17-1d99-4401-a317-9c8b7be58a3c-config-data-custom\") pod \"cinder-api-0\" (UID: \"6ae29e17-1d99-4401-a317-9c8b7be58a3c\") " pod="openstack/cinder-api-0" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.719494 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ae29e17-1d99-4401-a317-9c8b7be58a3c-logs\") pod \"cinder-api-0\" (UID: \"6ae29e17-1d99-4401-a317-9c8b7be58a3c\") " pod="openstack/cinder-api-0" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.719869 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6ae29e17-1d99-4401-a317-9c8b7be58a3c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6ae29e17-1d99-4401-a317-9c8b7be58a3c\") " pod="openstack/cinder-api-0" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.724384 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ae29e17-1d99-4401-a317-9c8b7be58a3c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"6ae29e17-1d99-4401-a317-9c8b7be58a3c\") " pod="openstack/cinder-api-0" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.724458 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ae29e17-1d99-4401-a317-9c8b7be58a3c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"6ae29e17-1d99-4401-a317-9c8b7be58a3c\") " pod="openstack/cinder-api-0" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.725313 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ae29e17-1d99-4401-a317-9c8b7be58a3c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6ae29e17-1d99-4401-a317-9c8b7be58a3c\") " pod="openstack/cinder-api-0" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.727908 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ae29e17-1d99-4401-a317-9c8b7be58a3c-config-data\") pod \"cinder-api-0\" (UID: \"6ae29e17-1d99-4401-a317-9c8b7be58a3c\") " pod="openstack/cinder-api-0" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.729227 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ae29e17-1d99-4401-a317-9c8b7be58a3c-config-data-custom\") pod \"cinder-api-0\" (UID: \"6ae29e17-1d99-4401-a317-9c8b7be58a3c\") " 
pod="openstack/cinder-api-0" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.731077 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ae29e17-1d99-4401-a317-9c8b7be58a3c-scripts\") pod \"cinder-api-0\" (UID: \"6ae29e17-1d99-4401-a317-9c8b7be58a3c\") " pod="openstack/cinder-api-0" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.733244 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sstsr\" (UniqueName: \"kubernetes.io/projected/6ae29e17-1d99-4401-a317-9c8b7be58a3c-kube-api-access-sstsr\") pod \"cinder-api-0\" (UID: \"6ae29e17-1d99-4401-a317-9c8b7be58a3c\") " pod="openstack/cinder-api-0" Oct 03 13:13:55 crc kubenswrapper[4962]: I1003 13:13:55.858339 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 03 13:13:56 crc kubenswrapper[4962]: I1003 13:13:56.237613 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2626e82-5179-406f-b682-f3ae965e1bae" path="/var/lib/kubelet/pods/e2626e82-5179-406f-b682-f3ae965e1bae/volumes" Oct 03 13:13:56 crc kubenswrapper[4962]: I1003 13:13:56.319543 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 03 13:13:56 crc kubenswrapper[4962]: I1003 13:13:56.440065 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6ae29e17-1d99-4401-a317-9c8b7be58a3c","Type":"ContainerStarted","Data":"5127db17298303c9fb33fa13540988e6f53ff4b60ecb93d3188ebd7aa931eadc"} Oct 03 13:13:56 crc kubenswrapper[4962]: I1003 13:13:56.444133 4962 generic.go:334] "Generic (PLEG): container finished" podID="0dea3c18-6654-4c16-b382-579ac81adcda" containerID="aa11f7bd2821a9bf939114394718af9133b67177857cabf683e8b5433d3c0e77" exitCode=0 Oct 03 13:13:56 crc kubenswrapper[4962]: I1003 13:13:56.444165 4962 generic.go:334] "Generic (PLEG): container finished" podID="0dea3c18-6654-4c16-b382-579ac81adcda" containerID="49f365b73efa9e81fce808106811dbada9e196b48b503c49d595857e43499892" exitCode=2 Oct 03 13:13:56 crc kubenswrapper[4962]: I1003 13:13:56.444174 4962 generic.go:334] "Generic (PLEG): container finished" podID="0dea3c18-6654-4c16-b382-579ac81adcda" containerID="5cfd9b23fc073c68aa4c0d8b3b242a26954603c4ebd5bf1b670741320cbbf2cd" exitCode=0 Oct 03 13:13:56 crc kubenswrapper[4962]: I1003 13:13:56.444207 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0dea3c18-6654-4c16-b382-579ac81adcda","Type":"ContainerDied","Data":"aa11f7bd2821a9bf939114394718af9133b67177857cabf683e8b5433d3c0e77"} Oct 03 13:13:56 crc kubenswrapper[4962]: I1003 13:13:56.444256 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0dea3c18-6654-4c16-b382-579ac81adcda","Type":"ContainerDied","Data":"49f365b73efa9e81fce808106811dbada9e196b48b503c49d595857e43499892"} Oct 03 13:13:56 crc kubenswrapper[4962]: I1003 13:13:56.444268 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0dea3c18-6654-4c16-b382-579ac81adcda","Type":"ContainerDied","Data":"5cfd9b23fc073c68aa4c0d8b3b242a26954603c4ebd5bf1b670741320cbbf2cd"} Oct 03 13:13:57 crc kubenswrapper[4962]: I1003 13:13:57.326175 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-776f46cfdd-zh4jf" Oct 03 13:13:57 crc kubenswrapper[4962]: I1003 13:13:57.481983 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-api-0" event={"ID":"6ae29e17-1d99-4401-a317-9c8b7be58a3c","Type":"ContainerStarted","Data":"1ffd9ed0756445b7f118da7e647ddacf67d26cbedb3b89a2b3074c6bedfe80b2"} Oct 03 13:13:57 crc kubenswrapper[4962]: I1003 13:13:57.499827 4962 generic.go:334] "Generic (PLEG): container finished" podID="0dea3c18-6654-4c16-b382-579ac81adcda" containerID="92e0595c3016c7c09ceab7eebbf189e014011e8f0225ede105f2bcf892456af8" exitCode=0 Oct 03 13:13:57 crc kubenswrapper[4962]: I1003 13:13:57.499864 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0dea3c18-6654-4c16-b382-579ac81adcda","Type":"ContainerDied","Data":"92e0595c3016c7c09ceab7eebbf189e014011e8f0225ede105f2bcf892456af8"} Oct 03 13:13:57 crc kubenswrapper[4962]: I1003 13:13:57.595274 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 13:13:57 crc kubenswrapper[4962]: I1003 13:13:57.653885 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0dea3c18-6654-4c16-b382-579ac81adcda-scripts\") pod \"0dea3c18-6654-4c16-b382-579ac81adcda\" (UID: \"0dea3c18-6654-4c16-b382-579ac81adcda\") " Oct 03 13:13:57 crc kubenswrapper[4962]: I1003 13:13:57.653945 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0dea3c18-6654-4c16-b382-579ac81adcda-run-httpd\") pod \"0dea3c18-6654-4c16-b382-579ac81adcda\" (UID: \"0dea3c18-6654-4c16-b382-579ac81adcda\") " Oct 03 13:13:57 crc kubenswrapper[4962]: I1003 13:13:57.653988 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dea3c18-6654-4c16-b382-579ac81adcda-config-data\") pod \"0dea3c18-6654-4c16-b382-579ac81adcda\" (UID: \"0dea3c18-6654-4c16-b382-579ac81adcda\") " Oct 03 13:13:57 crc kubenswrapper[4962]: I1003 13:13:57.654079 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tknk\" (UniqueName: \"kubernetes.io/projected/0dea3c18-6654-4c16-b382-579ac81adcda-kube-api-access-2tknk\") pod \"0dea3c18-6654-4c16-b382-579ac81adcda\" (UID: \"0dea3c18-6654-4c16-b382-579ac81adcda\") " Oct 03 13:13:57 crc kubenswrapper[4962]: I1003 13:13:57.654160 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dea3c18-6654-4c16-b382-579ac81adcda-combined-ca-bundle\") pod \"0dea3c18-6654-4c16-b382-579ac81adcda\" (UID: \"0dea3c18-6654-4c16-b382-579ac81adcda\") " Oct 03 13:13:57 crc kubenswrapper[4962]: I1003 13:13:57.654209 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0dea3c18-6654-4c16-b382-579ac81adcda-log-httpd\") pod \"0dea3c18-6654-4c16-b382-579ac81adcda\" (UID: \"0dea3c18-6654-4c16-b382-579ac81adcda\") " Oct 03 13:13:57 crc kubenswrapper[4962]: I1003 13:13:57.654297 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0dea3c18-6654-4c16-b382-579ac81adcda-sg-core-conf-yaml\") pod \"0dea3c18-6654-4c16-b382-579ac81adcda\" (UID: \"0dea3c18-6654-4c16-b382-579ac81adcda\") " Oct 03 13:13:57 crc kubenswrapper[4962]: I1003 13:13:57.658958 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/0dea3c18-6654-4c16-b382-579ac81adcda-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0dea3c18-6654-4c16-b382-579ac81adcda" (UID: "0dea3c18-6654-4c16-b382-579ac81adcda"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:13:57 crc kubenswrapper[4962]: I1003 13:13:57.664128 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dea3c18-6654-4c16-b382-579ac81adcda-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0dea3c18-6654-4c16-b382-579ac81adcda" (UID: "0dea3c18-6654-4c16-b382-579ac81adcda"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:13:57 crc kubenswrapper[4962]: I1003 13:13:57.666132 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dea3c18-6654-4c16-b382-579ac81adcda-scripts" (OuterVolumeSpecName: "scripts") pod "0dea3c18-6654-4c16-b382-579ac81adcda" (UID: "0dea3c18-6654-4c16-b382-579ac81adcda"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:13:57 crc kubenswrapper[4962]: I1003 13:13:57.669175 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dea3c18-6654-4c16-b382-579ac81adcda-kube-api-access-2tknk" (OuterVolumeSpecName: "kube-api-access-2tknk") pod "0dea3c18-6654-4c16-b382-579ac81adcda" (UID: "0dea3c18-6654-4c16-b382-579ac81adcda"). InnerVolumeSpecName "kube-api-access-2tknk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:13:57 crc kubenswrapper[4962]: I1003 13:13:57.693174 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dea3c18-6654-4c16-b382-579ac81adcda-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0dea3c18-6654-4c16-b382-579ac81adcda" (UID: "0dea3c18-6654-4c16-b382-579ac81adcda"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:13:57 crc kubenswrapper[4962]: I1003 13:13:57.756414 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0dea3c18-6654-4c16-b382-579ac81adcda-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:57 crc kubenswrapper[4962]: I1003 13:13:57.756453 4962 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0dea3c18-6654-4c16-b382-579ac81adcda-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:57 crc kubenswrapper[4962]: I1003 13:13:57.756462 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tknk\" (UniqueName: \"kubernetes.io/projected/0dea3c18-6654-4c16-b382-579ac81adcda-kube-api-access-2tknk\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:57 crc kubenswrapper[4962]: I1003 13:13:57.756475 4962 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0dea3c18-6654-4c16-b382-579ac81adcda-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:57 crc kubenswrapper[4962]: I1003 13:13:57.756484 4962 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0dea3c18-6654-4c16-b382-579ac81adcda-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:57 crc kubenswrapper[4962]: I1003 13:13:57.756722 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dea3c18-6654-4c16-b382-579ac81adcda-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0dea3c18-6654-4c16-b382-579ac81adcda" (UID: "0dea3c18-6654-4c16-b382-579ac81adcda"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:13:57 crc kubenswrapper[4962]: I1003 13:13:57.794471 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dea3c18-6654-4c16-b382-579ac81adcda-config-data" (OuterVolumeSpecName: "config-data") pod "0dea3c18-6654-4c16-b382-579ac81adcda" (UID: "0dea3c18-6654-4c16-b382-579ac81adcda"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:13:57 crc kubenswrapper[4962]: I1003 13:13:57.857774 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dea3c18-6654-4c16-b382-579ac81adcda-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:57 crc kubenswrapper[4962]: I1003 13:13:57.858075 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dea3c18-6654-4c16-b382-579ac81adcda-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:13:57 crc kubenswrapper[4962]: I1003 13:13:57.927226 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-68b6c975-4cb8j" Oct 03 13:13:57 crc kubenswrapper[4962]: I1003 13:13:57.928423 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-68b6c975-4cb8j" Oct 03 13:13:58 crc kubenswrapper[4962]: I1003 13:13:58.509892 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0dea3c18-6654-4c16-b382-579ac81adcda","Type":"ContainerDied","Data":"cf9329f89ee3a3ebe4fb670d8ea694eee50688c5af85afc136588c635be72d65"} Oct 03 13:13:58 crc kubenswrapper[4962]: I1003 13:13:58.509946 4962 scope.go:117] "RemoveContainer" containerID="aa11f7bd2821a9bf939114394718af9133b67177857cabf683e8b5433d3c0e77" Oct 03 13:13:58 crc kubenswrapper[4962]: I1003 13:13:58.510085 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 13:13:58 crc kubenswrapper[4962]: I1003 13:13:58.517676 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6ae29e17-1d99-4401-a317-9c8b7be58a3c","Type":"ContainerStarted","Data":"5338fa6e89e4bd15f9e4db940f00a7a5b8fdf7c12ab81b1de2fe6747f81ea20d"} Oct 03 13:13:58 crc kubenswrapper[4962]: I1003 13:13:58.517807 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 03 13:13:58 crc kubenswrapper[4962]: I1003 13:13:58.549067 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 13:13:58 crc kubenswrapper[4962]: I1003 13:13:58.553618 4962 scope.go:117] "RemoveContainer" containerID="49f365b73efa9e81fce808106811dbada9e196b48b503c49d595857e43499892" Oct 03 13:13:58 crc kubenswrapper[4962]: I1003 13:13:58.568719 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 13:13:58 crc kubenswrapper[4962]: I1003 13:13:58.578493 4962 scope.go:117] "RemoveContainer" containerID="5cfd9b23fc073c68aa4c0d8b3b242a26954603c4ebd5bf1b670741320cbbf2cd" Oct 03 13:13:58 crc kubenswrapper[4962]: I1003 13:13:58.596212 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 13:13:58 crc kubenswrapper[4962]: E1003 13:13:58.596715 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dea3c18-6654-4c16-b382-579ac81adcda" containerName="proxy-httpd" Oct 03 13:13:58 crc kubenswrapper[4962]: I1003 13:13:58.596735 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dea3c18-6654-4c16-b382-579ac81adcda" containerName="proxy-httpd" Oct 03 13:13:58 crc kubenswrapper[4962]: E1003 13:13:58.596785 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dea3c18-6654-4c16-b382-579ac81adcda" containerName="ceilometer-notification-agent" Oct 03 13:13:58 crc kubenswrapper[4962]: I1003 13:13:58.596795 4962 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="0dea3c18-6654-4c16-b382-579ac81adcda" containerName="ceilometer-notification-agent" Oct 03 13:13:58 crc kubenswrapper[4962]: E1003 13:13:58.596821 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dea3c18-6654-4c16-b382-579ac81adcda" containerName="sg-core" Oct 03 13:13:58 crc kubenswrapper[4962]: I1003 13:13:58.596827 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dea3c18-6654-4c16-b382-579ac81adcda" containerName="sg-core" Oct 03 13:13:58 crc kubenswrapper[4962]: E1003 13:13:58.596858 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dea3c18-6654-4c16-b382-579ac81adcda" containerName="ceilometer-central-agent" Oct 03 13:13:58 crc kubenswrapper[4962]: I1003 13:13:58.596865 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dea3c18-6654-4c16-b382-579ac81adcda" containerName="ceilometer-central-agent" Oct 03 13:13:58 crc kubenswrapper[4962]: I1003 13:13:58.597083 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dea3c18-6654-4c16-b382-579ac81adcda" containerName="ceilometer-central-agent" Oct 03 13:13:58 crc kubenswrapper[4962]: I1003 13:13:58.597104 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dea3c18-6654-4c16-b382-579ac81adcda" containerName="proxy-httpd" Oct 03 13:13:58 crc kubenswrapper[4962]: I1003 13:13:58.597118 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dea3c18-6654-4c16-b382-579ac81adcda" containerName="sg-core" Oct 03 13:13:58 crc kubenswrapper[4962]: I1003 13:13:58.597140 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dea3c18-6654-4c16-b382-579ac81adcda" containerName="ceilometer-notification-agent" Oct 03 13:13:58 crc kubenswrapper[4962]: I1003 13:13:58.598836 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.598816146 podStartE2EDuration="3.598816146s" podCreationTimestamp="2025-10-03 13:13:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:13:58.560523667 +0000 UTC m=+1446.964421502" watchObservedRunningTime="2025-10-03 13:13:58.598816146 +0000 UTC m=+1447.002713991" Oct 03 13:13:58 crc kubenswrapper[4962]: I1003 13:13:58.598976 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 13:13:58 crc kubenswrapper[4962]: I1003 13:13:58.600997 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 13:13:58 crc kubenswrapper[4962]: I1003 13:13:58.606043 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 13:13:58 crc kubenswrapper[4962]: I1003 13:13:58.611462 4962 scope.go:117] "RemoveContainer" containerID="92e0595c3016c7c09ceab7eebbf189e014011e8f0225ede105f2bcf892456af8" Oct 03 13:13:58 crc kubenswrapper[4962]: I1003 13:13:58.614855 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 13:13:58 crc kubenswrapper[4962]: I1003 13:13:58.672699 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d947c9f-2cbc-428e-9464-ff2b560fe91c-config-data\") pod \"ceilometer-0\" (UID: \"7d947c9f-2cbc-428e-9464-ff2b560fe91c\") " pod="openstack/ceilometer-0" Oct 03 13:13:58 crc kubenswrapper[4962]: I1003 13:13:58.672765 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xxh9\" (UniqueName: \"kubernetes.io/projected/7d947c9f-2cbc-428e-9464-ff2b560fe91c-kube-api-access-8xxh9\") pod \"ceilometer-0\" (UID: \"7d947c9f-2cbc-428e-9464-ff2b560fe91c\") " pod="openstack/ceilometer-0" Oct 03 13:13:58 crc kubenswrapper[4962]: I1003 13:13:58.672790 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7d947c9f-2cbc-428e-9464-ff2b560fe91c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7d947c9f-2cbc-428e-9464-ff2b560fe91c\") " pod="openstack/ceilometer-0" Oct 03 13:13:58 crc kubenswrapper[4962]: I1003 13:13:58.672823 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d947c9f-2cbc-428e-9464-ff2b560fe91c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7d947c9f-2cbc-428e-9464-ff2b560fe91c\") " pod="openstack/ceilometer-0" Oct 03 13:13:58 crc kubenswrapper[4962]: I1003 13:13:58.672966 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d947c9f-2cbc-428e-9464-ff2b560fe91c-run-httpd\") pod \"ceilometer-0\" (UID: \"7d947c9f-2cbc-428e-9464-ff2b560fe91c\") " pod="openstack/ceilometer-0" Oct 03 13:13:58 crc kubenswrapper[4962]: I1003 13:13:58.673059 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d947c9f-2cbc-428e-9464-ff2b560fe91c-scripts\") pod \"ceilometer-0\" (UID: \"7d947c9f-2cbc-428e-9464-ff2b560fe91c\") " pod="openstack/ceilometer-0" Oct 03 13:13:58 crc kubenswrapper[4962]: I1003 13:13:58.673093 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d947c9f-2cbc-428e-9464-ff2b560fe91c-log-httpd\") pod \"ceilometer-0\" (UID: \"7d947c9f-2cbc-428e-9464-ff2b560fe91c\") " pod="openstack/ceilometer-0" Oct 03 13:13:58 crc kubenswrapper[4962]: I1003 13:13:58.775515 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7d947c9f-2cbc-428e-9464-ff2b560fe91c-config-data\") pod \"ceilometer-0\" (UID: \"7d947c9f-2cbc-428e-9464-ff2b560fe91c\") " pod="openstack/ceilometer-0" Oct 03 13:13:58 crc kubenswrapper[4962]: I1003 13:13:58.775574 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xxh9\" (UniqueName: \"kubernetes.io/projected/7d947c9f-2cbc-428e-9464-ff2b560fe91c-kube-api-access-8xxh9\") pod \"ceilometer-0\" (UID: \"7d947c9f-2cbc-428e-9464-ff2b560fe91c\") " pod="openstack/ceilometer-0" Oct 03 13:13:58 crc kubenswrapper[4962]: I1003 13:13:58.775596 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7d947c9f-2cbc-428e-9464-ff2b560fe91c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7d947c9f-2cbc-428e-9464-ff2b560fe91c\") " pod="openstack/ceilometer-0" Oct 03 13:13:58 crc kubenswrapper[4962]: I1003 13:13:58.775623 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d947c9f-2cbc-428e-9464-ff2b560fe91c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7d947c9f-2cbc-428e-9464-ff2b560fe91c\") " pod="openstack/ceilometer-0" Oct 03 13:13:58 crc kubenswrapper[4962]: I1003 13:13:58.775681 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d947c9f-2cbc-428e-9464-ff2b560fe91c-run-httpd\") pod \"ceilometer-0\" (UID: \"7d947c9f-2cbc-428e-9464-ff2b560fe91c\") " pod="openstack/ceilometer-0" Oct 03 13:13:58 crc kubenswrapper[4962]: I1003 13:13:58.775700 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d947c9f-2cbc-428e-9464-ff2b560fe91c-scripts\") pod \"ceilometer-0\" (UID: \"7d947c9f-2cbc-428e-9464-ff2b560fe91c\") " pod="openstack/ceilometer-0" Oct 03 13:13:58 crc kubenswrapper[4962]: I1003 13:13:58.775716 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d947c9f-2cbc-428e-9464-ff2b560fe91c-log-httpd\") pod \"ceilometer-0\" (UID: \"7d947c9f-2cbc-428e-9464-ff2b560fe91c\") " pod="openstack/ceilometer-0" Oct 03 13:13:58 crc kubenswrapper[4962]: I1003 13:13:58.776192 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d947c9f-2cbc-428e-9464-ff2b560fe91c-log-httpd\") pod \"ceilometer-0\" (UID: \"7d947c9f-2cbc-428e-9464-ff2b560fe91c\") " pod="openstack/ceilometer-0" Oct 03 13:13:58 crc kubenswrapper[4962]: I1003 13:13:58.777146 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d947c9f-2cbc-428e-9464-ff2b560fe91c-run-httpd\") pod \"ceilometer-0\" (UID: \"7d947c9f-2cbc-428e-9464-ff2b560fe91c\") " pod="openstack/ceilometer-0" Oct 03 13:13:58 crc kubenswrapper[4962]: I1003 13:13:58.781328 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d947c9f-2cbc-428e-9464-ff2b560fe91c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7d947c9f-2cbc-428e-9464-ff2b560fe91c\") " pod="openstack/ceilometer-0" Oct 03 13:13:58 crc kubenswrapper[4962]: I1003 13:13:58.786435 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7d947c9f-2cbc-428e-9464-ff2b560fe91c-config-data\") pod \"ceilometer-0\" (UID: \"7d947c9f-2cbc-428e-9464-ff2b560fe91c\") " pod="openstack/ceilometer-0" Oct 03 13:13:58 crc kubenswrapper[4962]: I1003 13:13:58.794259 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d947c9f-2cbc-428e-9464-ff2b560fe91c-scripts\") pod \"ceilometer-0\" (UID: \"7d947c9f-2cbc-428e-9464-ff2b560fe91c\") " pod="openstack/ceilometer-0" Oct 03 13:13:58 crc kubenswrapper[4962]: I1003 13:13:58.805070 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7d947c9f-2cbc-428e-9464-ff2b560fe91c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7d947c9f-2cbc-428e-9464-ff2b560fe91c\") " pod="openstack/ceilometer-0" Oct 03 13:13:58 crc kubenswrapper[4962]: I1003 13:13:58.813277 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xxh9\" (UniqueName: \"kubernetes.io/projected/7d947c9f-2cbc-428e-9464-ff2b560fe91c-kube-api-access-8xxh9\") pod \"ceilometer-0\" (UID: \"7d947c9f-2cbc-428e-9464-ff2b560fe91c\") " pod="openstack/ceilometer-0" Oct 03 13:13:58 crc kubenswrapper[4962]: I1003 13:13:58.914701 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 13:13:59 crc kubenswrapper[4962]: I1003 13:13:59.333542 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 13:13:59 crc kubenswrapper[4962]: W1003 13:13:59.341853 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d947c9f_2cbc_428e_9464_ff2b560fe91c.slice/crio-7d89596962906130eb75f2a94a53f91944508ebcfc318883bd8a1b020fae74f9 WatchSource:0}: Error finding container 7d89596962906130eb75f2a94a53f91944508ebcfc318883bd8a1b020fae74f9: Status 404 returned error can't find the container with id 7d89596962906130eb75f2a94a53f91944508ebcfc318883bd8a1b020fae74f9 Oct 03 13:13:59 crc kubenswrapper[4962]: I1003 13:13:59.529364 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d947c9f-2cbc-428e-9464-ff2b560fe91c","Type":"ContainerStarted","Data":"7d89596962906130eb75f2a94a53f91944508ebcfc318883bd8a1b020fae74f9"} Oct 03 13:14:00 crc kubenswrapper[4962]: I1003 13:14:00.237294 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dea3c18-6654-4c16-b382-579ac81adcda" path="/var/lib/kubelet/pods/0dea3c18-6654-4c16-b382-579ac81adcda/volumes" Oct 03 13:14:00 crc kubenswrapper[4962]: I1003 13:14:00.445349 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5f745c6cff-9rkw7" Oct 03 13:14:00 crc kubenswrapper[4962]: I1003 13:14:00.517966 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-776f46cfdd-zh4jf"] Oct 03 13:14:00 crc kubenswrapper[4962]: I1003 13:14:00.520343 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-776f46cfdd-zh4jf" podUID="7f8103ea-25b6-4a47-bb63-ac20d8fef59f" containerName="neutron-api" containerID="cri-o://f40f36c7eaa6f57aaa9cbedd298463d367ad74213b43b55760fc04d0f2ef5d60" gracePeriod=30 Oct 03 13:14:00 crc kubenswrapper[4962]: I1003 13:14:00.520418 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-776f46cfdd-zh4jf" podUID="7f8103ea-25b6-4a47-bb63-ac20d8fef59f" 
containerName="neutron-httpd" containerID="cri-o://8c0d869ccc5f4abe51fca3df3ad68d6f1cb3c10be54e7c36d072544561e07a65" gracePeriod=30 Oct 03 13:14:00 crc kubenswrapper[4962]: I1003 13:14:00.591757 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5784cf869f-tgc99" Oct 03 13:14:00 crc kubenswrapper[4962]: I1003 13:14:00.668075 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-7bckc"] Oct 03 13:14:00 crc kubenswrapper[4962]: I1003 13:14:00.668476 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75c8ddd69c-7bckc" podUID="8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3" containerName="dnsmasq-dns" containerID="cri-o://f749546d6477023e3f3a7cd7784bffbbe59ed5dc3858698a822779923a8a22eb" gracePeriod=10 Oct 03 13:14:00 crc kubenswrapper[4962]: I1003 13:14:00.876481 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 03 13:14:00 crc kubenswrapper[4962]: I1003 13:14:00.923534 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 13:14:01 crc kubenswrapper[4962]: I1003 13:14:01.562442 4962 generic.go:334] "Generic (PLEG): container finished" podID="8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3" containerID="f749546d6477023e3f3a7cd7784bffbbe59ed5dc3858698a822779923a8a22eb" exitCode=0 Oct 03 13:14:01 crc kubenswrapper[4962]: I1003 13:14:01.562501 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-7bckc" event={"ID":"8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3","Type":"ContainerDied","Data":"f749546d6477023e3f3a7cd7784bffbbe59ed5dc3858698a822779923a8a22eb"} Oct 03 13:14:01 crc kubenswrapper[4962]: I1003 13:14:01.571045 4962 generic.go:334] "Generic (PLEG): container finished" podID="7f8103ea-25b6-4a47-bb63-ac20d8fef59f" containerID="8c0d869ccc5f4abe51fca3df3ad68d6f1cb3c10be54e7c36d072544561e07a65" exitCode=0 Oct 03 13:14:01 crc kubenswrapper[4962]: I1003 13:14:01.571344 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-776f46cfdd-zh4jf" event={"ID":"7f8103ea-25b6-4a47-bb63-ac20d8fef59f","Type":"ContainerDied","Data":"8c0d869ccc5f4abe51fca3df3ad68d6f1cb3c10be54e7c36d072544561e07a65"} Oct 03 13:14:01 crc kubenswrapper[4962]: I1003 13:14:01.571453 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="1c291303-c69e-44f7-9afa-15d3964eff4d" containerName="cinder-scheduler" containerID="cri-o://db4cffdea214f8319fe9cef9b03adeb1c68d4302a4bf66714632575f4186097f" gracePeriod=30 Oct 03 13:14:01 crc kubenswrapper[4962]: I1003 13:14:01.571514 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="1c291303-c69e-44f7-9afa-15d3964eff4d" containerName="probe" containerID="cri-o://51e10bccf5b133b42d415f9798ef5913e91b1aa2310fe64e85b079770a169a54" gracePeriod=30 Oct 03 13:14:01 crc kubenswrapper[4962]: E1003 13:14:01.582386 4962 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b9e6c89_714e_4efd_9adc_c15cd5b3eb6b.slice\": RecentStats: unable to find data in memory cache]" Oct 03 13:14:01 crc kubenswrapper[4962]: I1003 13:14:01.754507 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-7bckc" Oct 03 13:14:01 crc kubenswrapper[4962]: I1003 13:14:01.854503 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkm2b\" (UniqueName: \"kubernetes.io/projected/8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3-kube-api-access-tkm2b\") pod \"8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3\" (UID: \"8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3\") " Oct 03 13:14:01 crc kubenswrapper[4962]: I1003 13:14:01.854602 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3-dns-svc\") pod \"8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3\" (UID: \"8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3\") " Oct 03 13:14:01 crc kubenswrapper[4962]: I1003 13:14:01.854722 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3-ovsdbserver-nb\") pod \"8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3\" (UID: \"8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3\") " Oct 03 13:14:01 crc kubenswrapper[4962]: I1003 13:14:01.854758 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3-dns-swift-storage-0\") pod \"8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3\" (UID: \"8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3\") " Oct 03 13:14:01 crc kubenswrapper[4962]: I1003 13:14:01.854813 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3-config\") pod \"8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3\" (UID: \"8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3\") " Oct 03 13:14:01 crc kubenswrapper[4962]: I1003 13:14:01.854977 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3-ovsdbserver-sb\") pod \"8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3\" (UID: \"8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3\") " Oct 03 13:14:01 crc kubenswrapper[4962]: I1003 13:14:01.879155 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3-kube-api-access-tkm2b" (OuterVolumeSpecName: "kube-api-access-tkm2b") pod "8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3" (UID: "8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3"). InnerVolumeSpecName "kube-api-access-tkm2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:14:01 crc kubenswrapper[4962]: I1003 13:14:01.900296 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3-config" (OuterVolumeSpecName: "config") pod "8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3" (UID: "8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:14:01 crc kubenswrapper[4962]: I1003 13:14:01.906443 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3" (UID: "8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:14:01 crc kubenswrapper[4962]: I1003 13:14:01.911012 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3" (UID: "8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:14:01 crc kubenswrapper[4962]: I1003 13:14:01.913337 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3" (UID: "8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:14:01 crc kubenswrapper[4962]: I1003 13:14:01.925890 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3" (UID: "8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:14:01 crc kubenswrapper[4962]: I1003 13:14:01.957106 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:01 crc kubenswrapper[4962]: I1003 13:14:01.957137 4962 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:01 crc kubenswrapper[4962]: I1003 13:14:01.957147 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3-config\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:01 crc kubenswrapper[4962]: I1003 13:14:01.957156 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:01 crc kubenswrapper[4962]: I1003 13:14:01.957164 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkm2b\" (UniqueName: \"kubernetes.io/projected/8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3-kube-api-access-tkm2b\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:01 crc kubenswrapper[4962]: I1003 13:14:01.957173 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:02 crc kubenswrapper[4962]: I1003 13:14:02.583093 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d947c9f-2cbc-428e-9464-ff2b560fe91c","Type":"ContainerStarted","Data":"5a284e290c5063b0cd126fa801f84c0313614338f12abf7857a941fc01e24c85"} Oct 03 13:14:02 crc kubenswrapper[4962]: I1003 13:14:02.583421 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
Oct 03 13:14:02 crc kubenswrapper[4962]: I1003 13:14:02.585040 4962 generic.go:334] "Generic (PLEG): container finished" podID="1c291303-c69e-44f7-9afa-15d3964eff4d" containerID="51e10bccf5b133b42d415f9798ef5913e91b1aa2310fe64e85b079770a169a54" exitCode=0
Oct 03 13:14:02 crc kubenswrapper[4962]: I1003 13:14:02.585103 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1c291303-c69e-44f7-9afa-15d3964eff4d","Type":"ContainerDied","Data":"51e10bccf5b133b42d415f9798ef5913e91b1aa2310fe64e85b079770a169a54"}
Oct 03 13:14:02 crc kubenswrapper[4962]: I1003 13:14:02.586921 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-7bckc" event={"ID":"8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3","Type":"ContainerDied","Data":"3a3ed165053e416ce7af646f5dd905837b0044e5735b08ca08878c6b4e5521e3"}
Oct 03 13:14:02 crc kubenswrapper[4962]: I1003 13:14:02.586958 4962 scope.go:117] "RemoveContainer" containerID="f749546d6477023e3f3a7cd7784bffbbe59ed5dc3858698a822779923a8a22eb"
Oct 03 13:14:02 crc kubenswrapper[4962]: I1003 13:14:02.586971 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-7bckc"
Oct 03 13:14:02 crc kubenswrapper[4962]: I1003 13:14:02.610371 4962 scope.go:117] "RemoveContainer" containerID="bb63da13d1a54840396a2eec391baf02ae9def0051aa74c37979642fc4fce534"
Oct 03 13:14:02 crc kubenswrapper[4962]: I1003 13:14:02.611625 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-7bckc"]
Oct 03 13:14:02 crc kubenswrapper[4962]: I1003 13:14:02.619603 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-7bckc"]
Oct 03 13:14:03 crc kubenswrapper[4962]: I1003 13:14:03.495868 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 03 13:14:03 crc kubenswrapper[4962]: I1003 13:14:03.496456 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c36ae572-9009-4126-88fa-a27e232e4332" containerName="glance-log" containerID="cri-o://139f36e43c8b266edf0419b31506529f51836bdb805e787a62f450fc35dbfdd9" gracePeriod=30
Oct 03 13:14:03 crc kubenswrapper[4962]: I1003 13:14:03.496825 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c36ae572-9009-4126-88fa-a27e232e4332" containerName="glance-httpd" containerID="cri-o://d0cd9caf8e6e9471128d0623d8a1ced72cffa3d1a6deef4c224f22b7f1b042d3" gracePeriod=30
Oct 03 13:14:03 crc kubenswrapper[4962]: I1003 13:14:03.598089 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d947c9f-2cbc-428e-9464-ff2b560fe91c","Type":"ContainerStarted","Data":"4bc12a8ce5d674870d7a9cecfd018f28f1578b48e4cd38d0fd8d23c91191ef01"}
Oct 03 13:14:04 crc kubenswrapper[4962]: I1003 13:14:04.238692 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3" path="/var/lib/kubelet/pods/8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3/volumes"
Oct 03 13:14:04 crc kubenswrapper[4962]: I1003 13:14:04.250948 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-776f46cfdd-zh4jf"
Oct 03 13:14:04 crc kubenswrapper[4962]: I1003 13:14:04.296959 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f8103ea-25b6-4a47-bb63-ac20d8fef59f-combined-ca-bundle\") pod \"7f8103ea-25b6-4a47-bb63-ac20d8fef59f\" (UID: \"7f8103ea-25b6-4a47-bb63-ac20d8fef59f\") "
Oct 03 13:14:04 crc kubenswrapper[4962]: I1003 13:14:04.297013 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f8103ea-25b6-4a47-bb63-ac20d8fef59f-ovndb-tls-certs\") pod \"7f8103ea-25b6-4a47-bb63-ac20d8fef59f\" (UID: \"7f8103ea-25b6-4a47-bb63-ac20d8fef59f\") "
Oct 03 13:14:04 crc kubenswrapper[4962]: I1003 13:14:04.297158 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7f8103ea-25b6-4a47-bb63-ac20d8fef59f-httpd-config\") pod \"7f8103ea-25b6-4a47-bb63-ac20d8fef59f\" (UID: \"7f8103ea-25b6-4a47-bb63-ac20d8fef59f\") "
Oct 03 13:14:04 crc kubenswrapper[4962]: I1003 13:14:04.297253 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrd48\" (UniqueName: \"kubernetes.io/projected/7f8103ea-25b6-4a47-bb63-ac20d8fef59f-kube-api-access-mrd48\") pod \"7f8103ea-25b6-4a47-bb63-ac20d8fef59f\" (UID: \"7f8103ea-25b6-4a47-bb63-ac20d8fef59f\") "
Oct 03 13:14:04 crc kubenswrapper[4962]: I1003 13:14:04.297277 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7f8103ea-25b6-4a47-bb63-ac20d8fef59f-config\") pod \"7f8103ea-25b6-4a47-bb63-ac20d8fef59f\" (UID: \"7f8103ea-25b6-4a47-bb63-ac20d8fef59f\") "
Oct 03 13:14:04 crc kubenswrapper[4962]: I1003 13:14:04.303700 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f8103ea-25b6-4a47-bb63-ac20d8fef59f-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "7f8103ea-25b6-4a47-bb63-ac20d8fef59f" (UID: "7f8103ea-25b6-4a47-bb63-ac20d8fef59f"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 13:14:04 crc kubenswrapper[4962]: I1003 13:14:04.304904 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f8103ea-25b6-4a47-bb63-ac20d8fef59f-kube-api-access-mrd48" (OuterVolumeSpecName: "kube-api-access-mrd48") pod "7f8103ea-25b6-4a47-bb63-ac20d8fef59f" (UID: "7f8103ea-25b6-4a47-bb63-ac20d8fef59f"). InnerVolumeSpecName "kube-api-access-mrd48". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 13:14:04 crc kubenswrapper[4962]: I1003 13:14:04.354661 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f8103ea-25b6-4a47-bb63-ac20d8fef59f-config" (OuterVolumeSpecName: "config") pod "7f8103ea-25b6-4a47-bb63-ac20d8fef59f" (UID: "7f8103ea-25b6-4a47-bb63-ac20d8fef59f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 13:14:04 crc kubenswrapper[4962]: I1003 13:14:04.358551 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f8103ea-25b6-4a47-bb63-ac20d8fef59f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f8103ea-25b6-4a47-bb63-ac20d8fef59f" (UID: "7f8103ea-25b6-4a47-bb63-ac20d8fef59f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
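The event={...} payloads on the "SyncLoop (PLEG)" lines are JSON: pod UID, lifecycle type, and a container or sandbox ID. A hedged Go sketch for decoding them; the struct is mine, inferred from the payloads above, not kubelet's own type.

package main

import (
	"encoding/json"
	"fmt"
)

// plegEvent mirrors the payload printed after "event=" in the log;
// the field set is inferred from the lines above, not from kubelet source.
type plegEvent struct {
	ID   string // pod UID
	Type string // ContainerStarted, ContainerDied, ...
	Data string // container or sandbox ID
}

func main() {
	raw := `{"ID":"1c291303-c69e-44f7-9afa-15d3964eff4d","Type":"ContainerDied","Data":"51e10bccf5b133b42d415f9798ef5913e91b1aa2310fe64e85b079770a169a54"}`
	var ev plegEvent
	if err := json.Unmarshal([]byte(raw), &ev); err != nil {
		panic(err)
	}
	fmt.Printf("pod %s: %s (%s...)\n", ev.ID, ev.Type, ev.Data[:12])
}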
Oct 03 13:14:04 crc kubenswrapper[4962]: I1003 13:14:04.389916 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f8103ea-25b6-4a47-bb63-ac20d8fef59f-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "7f8103ea-25b6-4a47-bb63-ac20d8fef59f" (UID: "7f8103ea-25b6-4a47-bb63-ac20d8fef59f"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 13:14:04 crc kubenswrapper[4962]: I1003 13:14:04.399507 4962 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7f8103ea-25b6-4a47-bb63-ac20d8fef59f-httpd-config\") on node \"crc\" DevicePath \"\""
Oct 03 13:14:04 crc kubenswrapper[4962]: I1003 13:14:04.399534 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrd48\" (UniqueName: \"kubernetes.io/projected/7f8103ea-25b6-4a47-bb63-ac20d8fef59f-kube-api-access-mrd48\") on node \"crc\" DevicePath \"\""
Oct 03 13:14:04 crc kubenswrapper[4962]: I1003 13:14:04.399546 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/7f8103ea-25b6-4a47-bb63-ac20d8fef59f-config\") on node \"crc\" DevicePath \"\""
Oct 03 13:14:04 crc kubenswrapper[4962]: I1003 13:14:04.399555 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f8103ea-25b6-4a47-bb63-ac20d8fef59f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 13:14:04 crc kubenswrapper[4962]: I1003 13:14:04.399563 4962 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f8103ea-25b6-4a47-bb63-ac20d8fef59f-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 03 13:14:04 crc kubenswrapper[4962]: I1003 13:14:04.608561 4962 generic.go:334] "Generic (PLEG): container finished" podID="1c291303-c69e-44f7-9afa-15d3964eff4d" containerID="db4cffdea214f8319fe9cef9b03adeb1c68d4302a4bf66714632575f4186097f" exitCode=0
Oct 03 13:14:04 crc kubenswrapper[4962]: I1003 13:14:04.608652 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1c291303-c69e-44f7-9afa-15d3964eff4d","Type":"ContainerDied","Data":"db4cffdea214f8319fe9cef9b03adeb1c68d4302a4bf66714632575f4186097f"}
Oct 03 13:14:04 crc kubenswrapper[4962]: I1003 13:14:04.622720 4962 generic.go:334] "Generic (PLEG): container finished" podID="7f8103ea-25b6-4a47-bb63-ac20d8fef59f" containerID="f40f36c7eaa6f57aaa9cbedd298463d367ad74213b43b55760fc04d0f2ef5d60" exitCode=0
Oct 03 13:14:04 crc kubenswrapper[4962]: I1003 13:14:04.622760 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-776f46cfdd-zh4jf"
Oct 03 13:14:04 crc kubenswrapper[4962]: I1003 13:14:04.622798 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-776f46cfdd-zh4jf" event={"ID":"7f8103ea-25b6-4a47-bb63-ac20d8fef59f","Type":"ContainerDied","Data":"f40f36c7eaa6f57aaa9cbedd298463d367ad74213b43b55760fc04d0f2ef5d60"}
Oct 03 13:14:04 crc kubenswrapper[4962]: I1003 13:14:04.622902 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-776f46cfdd-zh4jf" event={"ID":"7f8103ea-25b6-4a47-bb63-ac20d8fef59f","Type":"ContainerDied","Data":"0da0ecda038060c509ea4dfa6696447c81bc7b662e1186efa916f5bfa4170ac7"}
Oct 03 13:14:04 crc kubenswrapper[4962]: I1003 13:14:04.622925 4962 scope.go:117] "RemoveContainer" containerID="8c0d869ccc5f4abe51fca3df3ad68d6f1cb3c10be54e7c36d072544561e07a65"
Oct 03 13:14:04 crc kubenswrapper[4962]: I1003 13:14:04.625600 4962 generic.go:334] "Generic (PLEG): container finished" podID="c36ae572-9009-4126-88fa-a27e232e4332" containerID="139f36e43c8b266edf0419b31506529f51836bdb805e787a62f450fc35dbfdd9" exitCode=143
Oct 03 13:14:04 crc kubenswrapper[4962]: I1003 13:14:04.625718 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c36ae572-9009-4126-88fa-a27e232e4332","Type":"ContainerDied","Data":"139f36e43c8b266edf0419b31506529f51836bdb805e787a62f450fc35dbfdd9"}
Oct 03 13:14:04 crc kubenswrapper[4962]: I1003 13:14:04.630625 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d947c9f-2cbc-428e-9464-ff2b560fe91c","Type":"ContainerStarted","Data":"401ac99d9fde3fe9f42593a4e4c45a252fcc44d73b6672127bacb6e49ba6089b"}
Oct 03 13:14:04 crc kubenswrapper[4962]: I1003 13:14:04.631988 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Oct 03 13:14:04 crc kubenswrapper[4962]: I1003 13:14:04.651445 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.8831846730000001 podStartE2EDuration="6.651426038s" podCreationTimestamp="2025-10-03 13:13:58 +0000 UTC" firstStartedPulling="2025-10-03 13:13:59.34435109 +0000 UTC m=+1447.748248925" lastFinishedPulling="2025-10-03 13:14:04.112592455 +0000 UTC m=+1452.516490290" observedRunningTime="2025-10-03 13:14:04.649364953 +0000 UTC m=+1453.053262788" watchObservedRunningTime="2025-10-03 13:14:04.651426038 +0000 UTC m=+1453.055323873"
Oct 03 13:14:04 crc kubenswrapper[4962]: I1003 13:14:04.666474 4962 scope.go:117] "RemoveContainer" containerID="f40f36c7eaa6f57aaa9cbedd298463d367ad74213b43b55760fc04d0f2ef5d60"
Oct 03 13:14:04 crc kubenswrapper[4962]: I1003 13:14:04.677785 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-776f46cfdd-zh4jf"]
Oct 03 13:14:04 crc kubenswrapper[4962]: I1003 13:14:04.684500 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-776f46cfdd-zh4jf"]
Oct 03 13:14:04 crc kubenswrapper[4962]: I1003 13:14:04.685841 4962 scope.go:117] "RemoveContainer" containerID="8c0d869ccc5f4abe51fca3df3ad68d6f1cb3c10be54e7c36d072544561e07a65"
Oct 03 13:14:04 crc kubenswrapper[4962]: E1003 13:14:04.686332 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c0d869ccc5f4abe51fca3df3ad68d6f1cb3c10be54e7c36d072544561e07a65\": container with ID starting with 8c0d869ccc5f4abe51fca3df3ad68d6f1cb3c10be54e7c36d072544561e07a65 not found: ID does not exist" containerID="8c0d869ccc5f4abe51fca3df3ad68d6f1cb3c10be54e7c36d072544561e07a65"
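The pod_startup_latency_tracker entry for ceilometer-0 is internally consistent: the e2e duration equals watchObservedRunningTime minus podCreationTimestamp, and the SLO duration appears to equal that minus the image-pull window (lastFinishedPulling minus firstStartedPulling). That exclusion rule is my reading of this one entry's numbers (a later entry with zero pull timestamps has SLO equal to e2e, which fits), not a documented formula. A small Go check of the arithmetic, using the values from the line above:

package main

import (
	"fmt"
	"time"
)

func mustParse(s string) time.Time {
	// Layout matches the timestamps printed in the log entry above.
	t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2025-10-03 13:13:58 +0000 UTC")
	running := mustParse("2025-10-03 13:14:04.651426038 +0000 UTC")
	pullStart := mustParse("2025-10-03 13:13:59.34435109 +0000 UTC")
	pullEnd := mustParse("2025-10-03 13:14:04.112592455 +0000 UTC")

	e2e := running.Sub(created)         // 6.651426038s, matches podStartE2EDuration
	slo := e2e - pullEnd.Sub(pullStart) // 1.883184673s, matches podStartSLOduration
	fmt.Println(e2e, slo)
}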
Oct 03 13:14:04 crc kubenswrapper[4962]: I1003 13:14:04.686379 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c0d869ccc5f4abe51fca3df3ad68d6f1cb3c10be54e7c36d072544561e07a65"} err="failed to get container status \"8c0d869ccc5f4abe51fca3df3ad68d6f1cb3c10be54e7c36d072544561e07a65\": rpc error: code = NotFound desc = could not find container \"8c0d869ccc5f4abe51fca3df3ad68d6f1cb3c10be54e7c36d072544561e07a65\": container with ID starting with 8c0d869ccc5f4abe51fca3df3ad68d6f1cb3c10be54e7c36d072544561e07a65 not found: ID does not exist"
Oct 03 13:14:04 crc kubenswrapper[4962]: I1003 13:14:04.686409 4962 scope.go:117] "RemoveContainer" containerID="f40f36c7eaa6f57aaa9cbedd298463d367ad74213b43b55760fc04d0f2ef5d60"
Oct 03 13:14:04 crc kubenswrapper[4962]: E1003 13:14:04.686926 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f40f36c7eaa6f57aaa9cbedd298463d367ad74213b43b55760fc04d0f2ef5d60\": container with ID starting with f40f36c7eaa6f57aaa9cbedd298463d367ad74213b43b55760fc04d0f2ef5d60 not found: ID does not exist" containerID="f40f36c7eaa6f57aaa9cbedd298463d367ad74213b43b55760fc04d0f2ef5d60"
Oct 03 13:14:04 crc kubenswrapper[4962]: I1003 13:14:04.686957 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f40f36c7eaa6f57aaa9cbedd298463d367ad74213b43b55760fc04d0f2ef5d60"} err="failed to get container status \"f40f36c7eaa6f57aaa9cbedd298463d367ad74213b43b55760fc04d0f2ef5d60\": rpc error: code = NotFound desc = could not find container \"f40f36c7eaa6f57aaa9cbedd298463d367ad74213b43b55760fc04d0f2ef5d60\": container with ID starting with f40f36c7eaa6f57aaa9cbedd298463d367ad74213b43b55760fc04d0f2ef5d60 not found: ID does not exist"
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.306698 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
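The paired "ContainerStatus from runtime service failed" / "DeleteContainer returned error" lines are a benign race: the container was already gone when kubelet re-queried the runtime, and the CRI call surfaced gRPC code NotFound. A minimal sketch of recognizing that error class with google.golang.org/grpc/status; the errFromRuntime value is a stand-in, not a real CRI call.

package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

func main() {
	// Stand-in for the error a CRI ContainerStatus call returns
	// once the container has already been removed.
	errFromRuntime := status.Error(codes.NotFound, `could not find container "8c0d869c..."`)

	if status.Code(errFromRuntime) == codes.NotFound {
		// Treated as "already deleted": log it and move on.
		fmt.Println("container already gone; nothing to remove")
	}
}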
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.418837 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zndn\" (UniqueName: \"kubernetes.io/projected/1c291303-c69e-44f7-9afa-15d3964eff4d-kube-api-access-9zndn\") pod \"1c291303-c69e-44f7-9afa-15d3964eff4d\" (UID: \"1c291303-c69e-44f7-9afa-15d3964eff4d\") "
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.418922 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c291303-c69e-44f7-9afa-15d3964eff4d-config-data-custom\") pod \"1c291303-c69e-44f7-9afa-15d3964eff4d\" (UID: \"1c291303-c69e-44f7-9afa-15d3964eff4d\") "
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.418956 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c291303-c69e-44f7-9afa-15d3964eff4d-etc-machine-id\") pod \"1c291303-c69e-44f7-9afa-15d3964eff4d\" (UID: \"1c291303-c69e-44f7-9afa-15d3964eff4d\") "
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.419005 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c291303-c69e-44f7-9afa-15d3964eff4d-combined-ca-bundle\") pod \"1c291303-c69e-44f7-9afa-15d3964eff4d\" (UID: \"1c291303-c69e-44f7-9afa-15d3964eff4d\") "
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.419031 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c291303-c69e-44f7-9afa-15d3964eff4d-scripts\") pod \"1c291303-c69e-44f7-9afa-15d3964eff4d\" (UID: \"1c291303-c69e-44f7-9afa-15d3964eff4d\") "
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.419071 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c291303-c69e-44f7-9afa-15d3964eff4d-config-data\") pod \"1c291303-c69e-44f7-9afa-15d3964eff4d\" (UID: \"1c291303-c69e-44f7-9afa-15d3964eff4d\") "
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.419349 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c291303-c69e-44f7-9afa-15d3964eff4d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1c291303-c69e-44f7-9afa-15d3964eff4d" (UID: "1c291303-c69e-44f7-9afa-15d3964eff4d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.419851 4962 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c291303-c69e-44f7-9afa-15d3964eff4d-etc-machine-id\") on node \"crc\" DevicePath \"\""
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.423879 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c291303-c69e-44f7-9afa-15d3964eff4d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1c291303-c69e-44f7-9afa-15d3964eff4d" (UID: "1c291303-c69e-44f7-9afa-15d3964eff4d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.424136 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c291303-c69e-44f7-9afa-15d3964eff4d-scripts" (OuterVolumeSpecName: "scripts") pod "1c291303-c69e-44f7-9afa-15d3964eff4d" (UID: "1c291303-c69e-44f7-9afa-15d3964eff4d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.434826 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c291303-c69e-44f7-9afa-15d3964eff4d-kube-api-access-9zndn" (OuterVolumeSpecName: "kube-api-access-9zndn") pod "1c291303-c69e-44f7-9afa-15d3964eff4d" (UID: "1c291303-c69e-44f7-9afa-15d3964eff4d"). InnerVolumeSpecName "kube-api-access-9zndn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.486130 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c291303-c69e-44f7-9afa-15d3964eff4d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c291303-c69e-44f7-9afa-15d3964eff4d" (UID: "1c291303-c69e-44f7-9afa-15d3964eff4d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.521031 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c291303-c69e-44f7-9afa-15d3964eff4d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.521063 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c291303-c69e-44f7-9afa-15d3964eff4d-scripts\") on node \"crc\" DevicePath \"\""
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.521072 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zndn\" (UniqueName: \"kubernetes.io/projected/1c291303-c69e-44f7-9afa-15d3964eff4d-kube-api-access-9zndn\") on node \"crc\" DevicePath \"\""
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.521082 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c291303-c69e-44f7-9afa-15d3964eff4d-config-data-custom\") on node \"crc\" DevicePath \"\""
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.536854 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c291303-c69e-44f7-9afa-15d3964eff4d-config-data" (OuterVolumeSpecName: "config-data") pod "1c291303-c69e-44f7-9afa-15d3964eff4d" (UID: "1c291303-c69e-44f7-9afa-15d3964eff4d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.552246 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.552786 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8ef04f91-585b-4d7a-8847-c83ec719fa7f" containerName="glance-httpd" containerID="cri-o://a00fe3f1a213a69f33c828fec39d72830ee8291769de616332e0a9feff2f1c66" gracePeriod=30
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.552521 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8ef04f91-585b-4d7a-8847-c83ec719fa7f" containerName="glance-log" containerID="cri-o://f738f09dad1b33525cc99c16d86a7af3e15f5062109ba5f976f205a990dd451d" gracePeriod=30
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.622803 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c291303-c69e-44f7-9afa-15d3964eff4d-config-data\") on node \"crc\" DevicePath \"\""
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.641154 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1c291303-c69e-44f7-9afa-15d3964eff4d","Type":"ContainerDied","Data":"761b5cbb574dce3d2778e7a928d3cfe4290c60311663b20dafa944e152065246"}
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.641191 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.641223 4962 scope.go:117] "RemoveContainer" containerID="51e10bccf5b133b42d415f9798ef5913e91b1aa2310fe64e85b079770a169a54"
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.672968 4962 scope.go:117] "RemoveContainer" containerID="db4cffdea214f8319fe9cef9b03adeb1c68d4302a4bf66714632575f4186097f"
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.674347 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.682831 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.703732 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 03 13:14:05 crc kubenswrapper[4962]: E1003 13:14:05.704205 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c291303-c69e-44f7-9afa-15d3964eff4d" containerName="probe"
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.704232 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c291303-c69e-44f7-9afa-15d3964eff4d" containerName="probe"
Oct 03 13:14:05 crc kubenswrapper[4962]: E1003 13:14:05.704245 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c291303-c69e-44f7-9afa-15d3964eff4d" containerName="cinder-scheduler"
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.704255 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c291303-c69e-44f7-9afa-15d3964eff4d" containerName="cinder-scheduler"
Oct 03 13:14:05 crc kubenswrapper[4962]: E1003 13:14:05.704281 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3" containerName="dnsmasq-dns"
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.704291 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3" containerName="dnsmasq-dns"
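gracePeriod=30 on the "Killing container with a grace period" lines is the window the runtime gives a container between SIGTERM and a forced SIGKILL. A generic, Unix-only illustration of that pattern for a local process; this is not kubelet or CRI code, just the same termination shape.

package main

import (
	"fmt"
	"os"
	"os/exec"
	"syscall"
	"time"
)

// terminateWithGrace sends SIGTERM, waits up to grace for the process
// to exit, then falls back to SIGKILL -- the same shape as the
// gracePeriod=30 kills logged above.
func terminateWithGrace(pid int, grace time.Duration) error {
	proc, err := os.FindProcess(pid)
	if err != nil {
		return err
	}
	if err := proc.Signal(syscall.SIGTERM); err != nil {
		return err
	}
	deadline := time.Now().Add(grace)
	for time.Now().Before(deadline) {
		// Signal 0 only probes liveness; an error (ESRCH) means it is gone.
		if err := proc.Signal(syscall.Signal(0)); err != nil {
			return nil
		}
		time.Sleep(100 * time.Millisecond)
	}
	return proc.Signal(syscall.SIGKILL)
}

func main() {
	cmd := exec.Command("sleep", "300")
	if err := cmd.Start(); err != nil {
		panic(err)
	}
	go cmd.Wait() // reap the child so the liveness probe sees the exit
	if err := terminateWithGrace(cmd.Process.Pid, 2*time.Second); err != nil {
		panic(err)
	}
	fmt.Println("terminated within grace period")
}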
Oct 03 13:14:05 crc kubenswrapper[4962]: E1003 13:14:05.704308 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f8103ea-25b6-4a47-bb63-ac20d8fef59f" containerName="neutron-api"
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.704317 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f8103ea-25b6-4a47-bb63-ac20d8fef59f" containerName="neutron-api"
Oct 03 13:14:05 crc kubenswrapper[4962]: E1003 13:14:05.704329 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3" containerName="init"
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.704337 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3" containerName="init"
Oct 03 13:14:05 crc kubenswrapper[4962]: E1003 13:14:05.704358 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f8103ea-25b6-4a47-bb63-ac20d8fef59f" containerName="neutron-httpd"
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.704366 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f8103ea-25b6-4a47-bb63-ac20d8fef59f" containerName="neutron-httpd"
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.704600 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f4fd76b-f6c7-4281-8958-e5bfe73c7ba3" containerName="dnsmasq-dns"
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.704634 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f8103ea-25b6-4a47-bb63-ac20d8fef59f" containerName="neutron-httpd"
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.704649 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c291303-c69e-44f7-9afa-15d3964eff4d" containerName="probe"
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.704686 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c291303-c69e-44f7-9afa-15d3964eff4d" containerName="cinder-scheduler"
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.704700 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f8103ea-25b6-4a47-bb63-ac20d8fef59f" containerName="neutron-api"
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.707070 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.709232 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.724319 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.825603 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/009b2959-1113-4574-a2ec-90bbe2d8f8ef-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"009b2959-1113-4574-a2ec-90bbe2d8f8ef\") " pod="openstack/cinder-scheduler-0"
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.825677 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/009b2959-1113-4574-a2ec-90bbe2d8f8ef-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"009b2959-1113-4574-a2ec-90bbe2d8f8ef\") " pod="openstack/cinder-scheduler-0"
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.825701 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld8cq\" (UniqueName: \"kubernetes.io/projected/009b2959-1113-4574-a2ec-90bbe2d8f8ef-kube-api-access-ld8cq\") pod \"cinder-scheduler-0\" (UID: \"009b2959-1113-4574-a2ec-90bbe2d8f8ef\") " pod="openstack/cinder-scheduler-0"
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.825731 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/009b2959-1113-4574-a2ec-90bbe2d8f8ef-config-data\") pod \"cinder-scheduler-0\" (UID: \"009b2959-1113-4574-a2ec-90bbe2d8f8ef\") " pod="openstack/cinder-scheduler-0"
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.825748 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/009b2959-1113-4574-a2ec-90bbe2d8f8ef-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"009b2959-1113-4574-a2ec-90bbe2d8f8ef\") " pod="openstack/cinder-scheduler-0"
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.825886 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/009b2959-1113-4574-a2ec-90bbe2d8f8ef-scripts\") pod \"cinder-scheduler-0\" (UID: \"009b2959-1113-4574-a2ec-90bbe2d8f8ef\") " pod="openstack/cinder-scheduler-0"
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.832401 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.926882 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/009b2959-1113-4574-a2ec-90bbe2d8f8ef-config-data\") pod \"cinder-scheduler-0\" (UID: \"009b2959-1113-4574-a2ec-90bbe2d8f8ef\") " pod="openstack/cinder-scheduler-0"
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.926925 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/009b2959-1113-4574-a2ec-90bbe2d8f8ef-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"009b2959-1113-4574-a2ec-90bbe2d8f8ef\") " pod="openstack/cinder-scheduler-0"
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.927033 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/009b2959-1113-4574-a2ec-90bbe2d8f8ef-scripts\") pod \"cinder-scheduler-0\" (UID: \"009b2959-1113-4574-a2ec-90bbe2d8f8ef\") " pod="openstack/cinder-scheduler-0"
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.927076 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/009b2959-1113-4574-a2ec-90bbe2d8f8ef-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"009b2959-1113-4574-a2ec-90bbe2d8f8ef\") " pod="openstack/cinder-scheduler-0"
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.927101 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/009b2959-1113-4574-a2ec-90bbe2d8f8ef-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"009b2959-1113-4574-a2ec-90bbe2d8f8ef\") " pod="openstack/cinder-scheduler-0"
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.927121 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ld8cq\" (UniqueName: \"kubernetes.io/projected/009b2959-1113-4574-a2ec-90bbe2d8f8ef-kube-api-access-ld8cq\") pod \"cinder-scheduler-0\" (UID: \"009b2959-1113-4574-a2ec-90bbe2d8f8ef\") " pod="openstack/cinder-scheduler-0"
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.927213 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/009b2959-1113-4574-a2ec-90bbe2d8f8ef-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"009b2959-1113-4574-a2ec-90bbe2d8f8ef\") " pod="openstack/cinder-scheduler-0"
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.931177 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/009b2959-1113-4574-a2ec-90bbe2d8f8ef-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"009b2959-1113-4574-a2ec-90bbe2d8f8ef\") " pod="openstack/cinder-scheduler-0"
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.931932 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/009b2959-1113-4574-a2ec-90bbe2d8f8ef-config-data\") pod \"cinder-scheduler-0\" (UID: \"009b2959-1113-4574-a2ec-90bbe2d8f8ef\") " pod="openstack/cinder-scheduler-0"
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.933095 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/009b2959-1113-4574-a2ec-90bbe2d8f8ef-scripts\") pod \"cinder-scheduler-0\" (UID: \"009b2959-1113-4574-a2ec-90bbe2d8f8ef\") " pod="openstack/cinder-scheduler-0"
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.940203 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/009b2959-1113-4574-a2ec-90bbe2d8f8ef-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"009b2959-1113-4574-a2ec-90bbe2d8f8ef\") " pod="openstack/cinder-scheduler-0"
Oct 03 13:14:05 crc kubenswrapper[4962]: I1003 13:14:05.950873 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld8cq\" (UniqueName: \"kubernetes.io/projected/009b2959-1113-4574-a2ec-90bbe2d8f8ef-kube-api-access-ld8cq\") pod \"cinder-scheduler-0\" (UID: \"009b2959-1113-4574-a2ec-90bbe2d8f8ef\") " pod="openstack/cinder-scheduler-0"
Oct 03 13:14:06 crc kubenswrapper[4962]: I1003 13:14:06.024084 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Oct 03 13:14:06 crc kubenswrapper[4962]: I1003 13:14:06.111002 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-6963-account-create-gqhqw"]
Oct 03 13:14:06 crc kubenswrapper[4962]: I1003 13:14:06.112364 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-6963-account-create-gqhqw"
Oct 03 13:14:06 crc kubenswrapper[4962]: I1003 13:14:06.115356 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Oct 03 13:14:06 crc kubenswrapper[4962]: I1003 13:14:06.126287 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-6963-account-create-gqhqw"]
Oct 03 13:14:06 crc kubenswrapper[4962]: I1003 13:14:06.231198 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsgz9\" (UniqueName: \"kubernetes.io/projected/f7872ef4-f86f-430d-b77e-3249dbda3a80-kube-api-access-hsgz9\") pod \"nova-api-6963-account-create-gqhqw\" (UID: \"f7872ef4-f86f-430d-b77e-3249dbda3a80\") " pod="openstack/nova-api-6963-account-create-gqhqw"
Oct 03 13:14:06 crc kubenswrapper[4962]: I1003 13:14:06.239951 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c291303-c69e-44f7-9afa-15d3964eff4d" path="/var/lib/kubelet/pods/1c291303-c69e-44f7-9afa-15d3964eff4d/volumes"
Oct 03 13:14:06 crc kubenswrapper[4962]: I1003 13:14:06.240738 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f8103ea-25b6-4a47-bb63-ac20d8fef59f" path="/var/lib/kubelet/pods/7f8103ea-25b6-4a47-bb63-ac20d8fef59f/volumes"
Oct 03 13:14:06 crc kubenswrapper[4962]: I1003 13:14:06.309301 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-b903-account-create-gkqkx"]
Oct 03 13:14:06 crc kubenswrapper[4962]: I1003 13:14:06.310529 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-b903-account-create-gkqkx"
Oct 03 13:14:06 crc kubenswrapper[4962]: I1003 13:14:06.312685 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Oct 03 13:14:06 crc kubenswrapper[4962]: I1003 13:14:06.323235 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-b903-account-create-gkqkx"]
Oct 03 13:14:06 crc kubenswrapper[4962]: I1003 13:14:06.332218 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsgz9\" (UniqueName: \"kubernetes.io/projected/f7872ef4-f86f-430d-b77e-3249dbda3a80-kube-api-access-hsgz9\") pod \"nova-api-6963-account-create-gqhqw\" (UID: \"f7872ef4-f86f-430d-b77e-3249dbda3a80\") " pod="openstack/nova-api-6963-account-create-gqhqw"
Oct 03 13:14:06 crc kubenswrapper[4962]: I1003 13:14:06.352467 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsgz9\" (UniqueName: \"kubernetes.io/projected/f7872ef4-f86f-430d-b77e-3249dbda3a80-kube-api-access-hsgz9\") pod \"nova-api-6963-account-create-gqhqw\" (UID: \"f7872ef4-f86f-430d-b77e-3249dbda3a80\") " pod="openstack/nova-api-6963-account-create-gqhqw"
Oct 03 13:14:06 crc kubenswrapper[4962]: I1003 13:14:06.434756 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gm7g\" (UniqueName: \"kubernetes.io/projected/4129f89f-a08d-4374-9661-15e30b59b01f-kube-api-access-4gm7g\") pod \"nova-cell0-b903-account-create-gkqkx\" (UID: \"4129f89f-a08d-4374-9661-15e30b59b01f\") " pod="openstack/nova-cell0-b903-account-create-gkqkx"
Oct 03 13:14:06 crc kubenswrapper[4962]: I1003 13:14:06.465391 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-6963-account-create-gqhqw"
Oct 03 13:14:06 crc kubenswrapper[4962]: I1003 13:14:06.503006 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 03 13:14:06 crc kubenswrapper[4962]: I1003 13:14:06.520809 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-d10f-account-create-nxl99"]
Oct 03 13:14:06 crc kubenswrapper[4962]: I1003 13:14:06.521992 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-d10f-account-create-nxl99"
Oct 03 13:14:06 crc kubenswrapper[4962]: I1003 13:14:06.527879 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Oct 03 13:14:06 crc kubenswrapper[4962]: I1003 13:14:06.529207 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-d10f-account-create-nxl99"]
Oct 03 13:14:06 crc kubenswrapper[4962]: I1003 13:14:06.542606 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gm7g\" (UniqueName: \"kubernetes.io/projected/4129f89f-a08d-4374-9661-15e30b59b01f-kube-api-access-4gm7g\") pod \"nova-cell0-b903-account-create-gkqkx\" (UID: \"4129f89f-a08d-4374-9661-15e30b59b01f\") " pod="openstack/nova-cell0-b903-account-create-gkqkx"
Oct 03 13:14:06 crc kubenswrapper[4962]: I1003 13:14:06.560109 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gm7g\" (UniqueName: \"kubernetes.io/projected/4129f89f-a08d-4374-9661-15e30b59b01f-kube-api-access-4gm7g\") pod \"nova-cell0-b903-account-create-gkqkx\" (UID: \"4129f89f-a08d-4374-9661-15e30b59b01f\") " pod="openstack/nova-cell0-b903-account-create-gkqkx"
Oct 03 13:14:06 crc kubenswrapper[4962]: I1003 13:14:06.641135 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-b903-account-create-gkqkx"
Oct 03 13:14:06 crc kubenswrapper[4962]: I1003 13:14:06.643599 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks64w\" (UniqueName: \"kubernetes.io/projected/0fbdbc1d-a3bc-4bd3-80aa-a590977a4329-kube-api-access-ks64w\") pod \"nova-cell1-d10f-account-create-nxl99\" (UID: \"0fbdbc1d-a3bc-4bd3-80aa-a590977a4329\") " pod="openstack/nova-cell1-d10f-account-create-nxl99"
Oct 03 13:14:06 crc kubenswrapper[4962]: I1003 13:14:06.663064 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"009b2959-1113-4574-a2ec-90bbe2d8f8ef","Type":"ContainerStarted","Data":"639573e4e77df17686b8a4d8a374499b9c547d73dad7ab52276d6246560eb3cc"}
Oct 03 13:14:06 crc kubenswrapper[4962]: I1003 13:14:06.665272 4962 generic.go:334] "Generic (PLEG): container finished" podID="8ef04f91-585b-4d7a-8847-c83ec719fa7f" containerID="f738f09dad1b33525cc99c16d86a7af3e15f5062109ba5f976f205a990dd451d" exitCode=143
Oct 03 13:14:06 crc kubenswrapper[4962]: I1003 13:14:06.665316 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8ef04f91-585b-4d7a-8847-c83ec719fa7f","Type":"ContainerDied","Data":"f738f09dad1b33525cc99c16d86a7af3e15f5062109ba5f976f205a990dd451d"}
Oct 03 13:14:06 crc kubenswrapper[4962]: I1003 13:14:06.745772 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks64w\" (UniqueName: \"kubernetes.io/projected/0fbdbc1d-a3bc-4bd3-80aa-a590977a4329-kube-api-access-ks64w\") pod \"nova-cell1-d10f-account-create-nxl99\" (UID: \"0fbdbc1d-a3bc-4bd3-80aa-a590977a4329\") " pod="openstack/nova-cell1-d10f-account-create-nxl99"
Oct 03 13:14:06 crc kubenswrapper[4962]: I1003 13:14:06.766930 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks64w\" (UniqueName: \"kubernetes.io/projected/0fbdbc1d-a3bc-4bd3-80aa-a590977a4329-kube-api-access-ks64w\") pod \"nova-cell1-d10f-account-create-nxl99\" (UID: \"0fbdbc1d-a3bc-4bd3-80aa-a590977a4329\") " pod="openstack/nova-cell1-d10f-account-create-nxl99"
Oct 03 13:14:06 crc kubenswrapper[4962]: I1003 13:14:06.850220 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-d10f-account-create-nxl99"
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.015864 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-6963-account-create-gqhqw"]
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.222251 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.258220 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c36ae572-9009-4126-88fa-a27e232e4332-combined-ca-bundle\") pod \"c36ae572-9009-4126-88fa-a27e232e4332\" (UID: \"c36ae572-9009-4126-88fa-a27e232e4332\") "
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.258295 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"c36ae572-9009-4126-88fa-a27e232e4332\" (UID: \"c36ae572-9009-4126-88fa-a27e232e4332\") "
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.258395 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c36ae572-9009-4126-88fa-a27e232e4332-public-tls-certs\") pod \"c36ae572-9009-4126-88fa-a27e232e4332\" (UID: \"c36ae572-9009-4126-88fa-a27e232e4332\") "
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.258427 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9l4pr\" (UniqueName: \"kubernetes.io/projected/c36ae572-9009-4126-88fa-a27e232e4332-kube-api-access-9l4pr\") pod \"c36ae572-9009-4126-88fa-a27e232e4332\" (UID: \"c36ae572-9009-4126-88fa-a27e232e4332\") "
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.258455 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c36ae572-9009-4126-88fa-a27e232e4332-httpd-run\") pod \"c36ae572-9009-4126-88fa-a27e232e4332\" (UID: \"c36ae572-9009-4126-88fa-a27e232e4332\") "
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.258523 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c36ae572-9009-4126-88fa-a27e232e4332-scripts\") pod \"c36ae572-9009-4126-88fa-a27e232e4332\" (UID: \"c36ae572-9009-4126-88fa-a27e232e4332\") "
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.258539 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c36ae572-9009-4126-88fa-a27e232e4332-logs\") pod \"c36ae572-9009-4126-88fa-a27e232e4332\" (UID: \"c36ae572-9009-4126-88fa-a27e232e4332\") "
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.258619 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c36ae572-9009-4126-88fa-a27e232e4332-config-data\") pod \"c36ae572-9009-4126-88fa-a27e232e4332\" (UID: \"c36ae572-9009-4126-88fa-a27e232e4332\") "
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.259251 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c36ae572-9009-4126-88fa-a27e232e4332-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c36ae572-9009-4126-88fa-a27e232e4332" (UID: "c36ae572-9009-4126-88fa-a27e232e4332"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.266900 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c36ae572-9009-4126-88fa-a27e232e4332-scripts" (OuterVolumeSpecName: "scripts") pod "c36ae572-9009-4126-88fa-a27e232e4332" (UID: "c36ae572-9009-4126-88fa-a27e232e4332"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.267094 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c36ae572-9009-4126-88fa-a27e232e4332-logs" (OuterVolumeSpecName: "logs") pod "c36ae572-9009-4126-88fa-a27e232e4332" (UID: "c36ae572-9009-4126-88fa-a27e232e4332"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.269760 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "c36ae572-9009-4126-88fa-a27e232e4332" (UID: "c36ae572-9009-4126-88fa-a27e232e4332"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.284062 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c36ae572-9009-4126-88fa-a27e232e4332-kube-api-access-9l4pr" (OuterVolumeSpecName: "kube-api-access-9l4pr") pod "c36ae572-9009-4126-88fa-a27e232e4332" (UID: "c36ae572-9009-4126-88fa-a27e232e4332"). InnerVolumeSpecName "kube-api-access-9l4pr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.334878 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c36ae572-9009-4126-88fa-a27e232e4332-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c36ae572-9009-4126-88fa-a27e232e4332" (UID: "c36ae572-9009-4126-88fa-a27e232e4332"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.360537 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c36ae572-9009-4126-88fa-a27e232e4332-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.360578 4962 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" "
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.360591 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9l4pr\" (UniqueName: \"kubernetes.io/projected/c36ae572-9009-4126-88fa-a27e232e4332-kube-api-access-9l4pr\") on node \"crc\" DevicePath \"\""
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.360607 4962 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c36ae572-9009-4126-88fa-a27e232e4332-httpd-run\") on node \"crc\" DevicePath \"\""
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.360616 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c36ae572-9009-4126-88fa-a27e232e4332-scripts\") on node \"crc\" DevicePath \"\""
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.360626 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c36ae572-9009-4126-88fa-a27e232e4332-logs\") on node \"crc\" DevicePath \"\""
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.386752 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c36ae572-9009-4126-88fa-a27e232e4332-config-data" (OuterVolumeSpecName: "config-data") pod "c36ae572-9009-4126-88fa-a27e232e4332" (UID: "c36ae572-9009-4126-88fa-a27e232e4332"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.431129 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c36ae572-9009-4126-88fa-a27e232e4332-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c36ae572-9009-4126-88fa-a27e232e4332" (UID: "c36ae572-9009-4126-88fa-a27e232e4332"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
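The unmount flow above runs in a fixed order per volume: "operationExecutor.UnmountVolume started", then "UnmountVolume.TearDown succeeded", then "Volume detached"; mounts mirror it with VerifyControllerAttachedVolume, "MountVolume started", and "MountVolume.SetUp succeeded". A throwaway Go scanner for tallying those phases when triaging a capture like this one; the match strings are copied from the log, the tool itself is mine.

package main

import (
	"bufio"
	"fmt"
	"os"
	"strings"
)

// Tally kubelet volume-lifecycle phases from a saved journal capture.
// Usage: go run . < kubelet.log
func main() {
	phases := []string{
		"operationExecutor.UnmountVolume started",
		"UnmountVolume.TearDown succeeded",
		"Volume detached for volume",
		"operationExecutor.MountVolume started",
		"MountVolume.SetUp succeeded",
	}
	counts := make(map[string]int)
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines can be long
	for sc.Scan() {
		for _, p := range phases {
			if strings.Contains(sc.Text(), p) {
				counts[p]++
			}
		}
	}
	for _, p := range phases {
		fmt.Printf("%6d  %s\n", counts[p], p)
	}
}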
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.459695 4962 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc"
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.461856 4962 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\""
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.461892 4962 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c36ae572-9009-4126-88fa-a27e232e4332-public-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.461905 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c36ae572-9009-4126-88fa-a27e232e4332-config-data\") on node \"crc\" DevicePath \"\""
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.473136 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-b903-account-create-gkqkx"]
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.480423 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-d10f-account-create-nxl99"]
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.688304 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b903-account-create-gkqkx" event={"ID":"4129f89f-a08d-4374-9661-15e30b59b01f","Type":"ContainerStarted","Data":"26ea37844a3fdbef6bd439665981583a3e5b323f9b6bca65d529332f1bdd10fe"}
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.688687 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b903-account-create-gkqkx" event={"ID":"4129f89f-a08d-4374-9661-15e30b59b01f","Type":"ContainerStarted","Data":"d4af2501763fb3a3638fbb8d69cb17c71d9ea41a230b9897ff59f78cf0bf1fe6"}
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.695066 4962 generic.go:334] "Generic (PLEG): container finished" podID="c36ae572-9009-4126-88fa-a27e232e4332" containerID="d0cd9caf8e6e9471128d0623d8a1ced72cffa3d1a6deef4c224f22b7f1b042d3" exitCode=0
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.695130 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c36ae572-9009-4126-88fa-a27e232e4332","Type":"ContainerDied","Data":"d0cd9caf8e6e9471128d0623d8a1ced72cffa3d1a6deef4c224f22b7f1b042d3"}
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.695159 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c36ae572-9009-4126-88fa-a27e232e4332","Type":"ContainerDied","Data":"4337aa2673fa3d3964e58b850142a7a4a4fadcdf2ed67191370b0473cc88bbbc"}
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.695177 4962 scope.go:117] "RemoveContainer" containerID="d0cd9caf8e6e9471128d0623d8a1ced72cffa3d1a6deef4c224f22b7f1b042d3"
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.695334 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.705069 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-b903-account-create-gkqkx" podStartSLOduration=1.7050521490000001 podStartE2EDuration="1.705052149s" podCreationTimestamp="2025-10-03 13:14:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:14:07.704526235 +0000 UTC m=+1456.108424070" watchObservedRunningTime="2025-10-03 13:14:07.705052149 +0000 UTC m=+1456.108949984"
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.714859 4962 generic.go:334] "Generic (PLEG): container finished" podID="f7872ef4-f86f-430d-b77e-3249dbda3a80" containerID="580d989f2c0e944c9c1d2f5e5caf3bb4c1a2c9b700143ae9cbb8802fec699c5c" exitCode=0
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.714960 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6963-account-create-gqhqw" event={"ID":"f7872ef4-f86f-430d-b77e-3249dbda3a80","Type":"ContainerDied","Data":"580d989f2c0e944c9c1d2f5e5caf3bb4c1a2c9b700143ae9cbb8802fec699c5c"}
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.714987 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6963-account-create-gqhqw" event={"ID":"f7872ef4-f86f-430d-b77e-3249dbda3a80","Type":"ContainerStarted","Data":"048d4de6d15afd9202a3d74f3a3ad259fb513d8f3771bf36f26150c0cc454d5c"}
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.716953 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-d10f-account-create-nxl99" event={"ID":"0fbdbc1d-a3bc-4bd3-80aa-a590977a4329","Type":"ContainerStarted","Data":"bba8dbd73d396cb05609d66b70ee7132d3c6fc727cd79d137e8dcac91a81f9d5"}
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.716975 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-d10f-account-create-nxl99" event={"ID":"0fbdbc1d-a3bc-4bd3-80aa-a590977a4329","Type":"ContainerStarted","Data":"6b12d0108bdd25250dbadd0639505a8740f970d7699dce12ffdc1723db0a862e"}
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.723308 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"009b2959-1113-4574-a2ec-90bbe2d8f8ef","Type":"ContainerStarted","Data":"8cc49fd9ef4981aeae1d009b88725c8dbebd9a3f0713241e71542b8306508bad"}
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.723488 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7d947c9f-2cbc-428e-9464-ff2b560fe91c" containerName="proxy-httpd" containerID="cri-o://401ac99d9fde3fe9f42593a4e4c45a252fcc44d73b6672127bacb6e49ba6089b" gracePeriod=30
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.723531 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7d947c9f-2cbc-428e-9464-ff2b560fe91c" containerName="sg-core" containerID="cri-o://4bc12a8ce5d674870d7a9cecfd018f28f1578b48e4cd38d0fd8d23c91191ef01" gracePeriod=30
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.723595 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7d947c9f-2cbc-428e-9464-ff2b560fe91c" containerName="ceilometer-notification-agent" containerID="cri-o://5a284e290c5063b0cd126fa801f84c0313614338f12abf7857a941fc01e24c85" gracePeriod=30
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.723488 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7d947c9f-2cbc-428e-9464-ff2b560fe91c" containerName="ceilometer-central-agent" containerID="cri-o://8553cc57ed01a9336dd8d6356f82c1369e896bc320f9f46fe4371766926005d8" gracePeriod=30
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.731059 4962 scope.go:117] "RemoveContainer" containerID="139f36e43c8b266edf0419b31506529f51836bdb805e787a62f450fc35dbfdd9"
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.772773 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-d10f-account-create-nxl99" podStartSLOduration=1.772752077 podStartE2EDuration="1.772752077s" podCreationTimestamp="2025-10-03 13:14:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:14:07.7509107 +0000 UTC m=+1456.154808535" watchObservedRunningTime="2025-10-03 13:14:07.772752077 +0000 UTC m=+1456.176649912"
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.805833 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.817684 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.830149 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 03 13:14:07 crc kubenswrapper[4962]: E1003 13:14:07.830598 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c36ae572-9009-4126-88fa-a27e232e4332" containerName="glance-httpd"
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.830612 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="c36ae572-9009-4126-88fa-a27e232e4332" containerName="glance-httpd"
Oct 03 13:14:07 crc kubenswrapper[4962]: E1003 13:14:07.830625 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c36ae572-9009-4126-88fa-a27e232e4332" containerName="glance-log"
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.830668 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="c36ae572-9009-4126-88fa-a27e232e4332" containerName="glance-log"
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.830921 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="c36ae572-9009-4126-88fa-a27e232e4332" containerName="glance-httpd"
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.830949 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="c36ae572-9009-4126-88fa-a27e232e4332" containerName="glance-log"
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.832085 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.833980 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.834430 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.837126 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.976507 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b0da1427-1e89-42d6-beb2-55f292945177-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b0da1427-1e89-42d6-beb2-55f292945177\") " pod="openstack/glance-default-external-api-0"
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.976571 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0da1427-1e89-42d6-beb2-55f292945177-logs\") pod \"glance-default-external-api-0\" (UID: \"b0da1427-1e89-42d6-beb2-55f292945177\") " pod="openstack/glance-default-external-api-0"
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.976614 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wbh7\" (UniqueName: \"kubernetes.io/projected/b0da1427-1e89-42d6-beb2-55f292945177-kube-api-access-4wbh7\") pod \"glance-default-external-api-0\" (UID: \"b0da1427-1e89-42d6-beb2-55f292945177\") " pod="openstack/glance-default-external-api-0"
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.976659 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0da1427-1e89-42d6-beb2-55f292945177-scripts\") pod \"glance-default-external-api-0\" (UID: \"b0da1427-1e89-42d6-beb2-55f292945177\") " pod="openstack/glance-default-external-api-0"
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.976675 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0da1427-1e89-42d6-beb2-55f292945177-config-data\") pod \"glance-default-external-api-0\" (UID: \"b0da1427-1e89-42d6-beb2-55f292945177\") " pod="openstack/glance-default-external-api-0"
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.976896 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0da1427-1e89-42d6-beb2-55f292945177-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b0da1427-1e89-42d6-beb2-55f292945177\") " pod="openstack/glance-default-external-api-0"
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.976917 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"b0da1427-1e89-42d6-beb2-55f292945177\") " pod="openstack/glance-default-external-api-0"
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.976981 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0da1427-1e89-42d6-beb2-55f292945177-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b0da1427-1e89-42d6-beb2-55f292945177\") " pod="openstack/glance-default-external-api-0"
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.981716 4962 scope.go:117] "RemoveContainer" containerID="d0cd9caf8e6e9471128d0623d8a1ced72cffa3d1a6deef4c224f22b7f1b042d3"
Oct 03 13:14:07 crc kubenswrapper[4962]: E1003 13:14:07.982871 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0cd9caf8e6e9471128d0623d8a1ced72cffa3d1a6deef4c224f22b7f1b042d3\": container with ID starting with d0cd9caf8e6e9471128d0623d8a1ced72cffa3d1a6deef4c224f22b7f1b042d3 not found: ID does not exist" containerID="d0cd9caf8e6e9471128d0623d8a1ced72cffa3d1a6deef4c224f22b7f1b042d3"
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.982907 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0cd9caf8e6e9471128d0623d8a1ced72cffa3d1a6deef4c224f22b7f1b042d3"} err="failed to get container status \"d0cd9caf8e6e9471128d0623d8a1ced72cffa3d1a6deef4c224f22b7f1b042d3\": rpc error: code = NotFound desc = could not find container \"d0cd9caf8e6e9471128d0623d8a1ced72cffa3d1a6deef4c224f22b7f1b042d3\": container with ID starting with d0cd9caf8e6e9471128d0623d8a1ced72cffa3d1a6deef4c224f22b7f1b042d3 not found: ID does not exist"
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.982932 4962 scope.go:117] "RemoveContainer" containerID="139f36e43c8b266edf0419b31506529f51836bdb805e787a62f450fc35dbfdd9"
Oct 03 13:14:07 crc kubenswrapper[4962]: E1003 13:14:07.987167 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"139f36e43c8b266edf0419b31506529f51836bdb805e787a62f450fc35dbfdd9\": container with ID starting with 139f36e43c8b266edf0419b31506529f51836bdb805e787a62f450fc35dbfdd9 not found: ID does not exist" containerID="139f36e43c8b266edf0419b31506529f51836bdb805e787a62f450fc35dbfdd9"
Oct 03 13:14:07 crc kubenswrapper[4962]: I1003 13:14:07.987196 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"139f36e43c8b266edf0419b31506529f51836bdb805e787a62f450fc35dbfdd9"} err="failed to get container status \"139f36e43c8b266edf0419b31506529f51836bdb805e787a62f450fc35dbfdd9\": rpc error: code = NotFound desc = could not find container \"139f36e43c8b266edf0419b31506529f51836bdb805e787a62f450fc35dbfdd9\": container with ID starting with 139f36e43c8b266edf0419b31506529f51836bdb805e787a62f450fc35dbfdd9 not found: ID does not exist"
Oct 03 13:14:08 crc kubenswrapper[4962]: I1003 13:14:08.078356 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0da1427-1e89-42d6-beb2-55f292945177-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b0da1427-1e89-42d6-beb2-55f292945177\") " pod="openstack/glance-default-external-api-0"
Oct 03 13:14:08 crc kubenswrapper[4962]: I1003 13:14:08.078412 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"b0da1427-1e89-42d6-beb2-55f292945177\") " pod="openstack/glance-default-external-api-0"
Oct 03 13:14:08 crc kubenswrapper[4962]: I1003
13:14:08.078459 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0da1427-1e89-42d6-beb2-55f292945177-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b0da1427-1e89-42d6-beb2-55f292945177\") " pod="openstack/glance-default-external-api-0" Oct 03 13:14:08 crc kubenswrapper[4962]: I1003 13:14:08.078510 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b0da1427-1e89-42d6-beb2-55f292945177-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b0da1427-1e89-42d6-beb2-55f292945177\") " pod="openstack/glance-default-external-api-0" Oct 03 13:14:08 crc kubenswrapper[4962]: I1003 13:14:08.078575 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0da1427-1e89-42d6-beb2-55f292945177-logs\") pod \"glance-default-external-api-0\" (UID: \"b0da1427-1e89-42d6-beb2-55f292945177\") " pod="openstack/glance-default-external-api-0" Oct 03 13:14:08 crc kubenswrapper[4962]: I1003 13:14:08.078608 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wbh7\" (UniqueName: \"kubernetes.io/projected/b0da1427-1e89-42d6-beb2-55f292945177-kube-api-access-4wbh7\") pod \"glance-default-external-api-0\" (UID: \"b0da1427-1e89-42d6-beb2-55f292945177\") " pod="openstack/glance-default-external-api-0" Oct 03 13:14:08 crc kubenswrapper[4962]: I1003 13:14:08.078637 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0da1427-1e89-42d6-beb2-55f292945177-scripts\") pod \"glance-default-external-api-0\" (UID: \"b0da1427-1e89-42d6-beb2-55f292945177\") " pod="openstack/glance-default-external-api-0" Oct 03 13:14:08 crc kubenswrapper[4962]: I1003 13:14:08.078671 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0da1427-1e89-42d6-beb2-55f292945177-config-data\") pod \"glance-default-external-api-0\" (UID: \"b0da1427-1e89-42d6-beb2-55f292945177\") " pod="openstack/glance-default-external-api-0" Oct 03 13:14:08 crc kubenswrapper[4962]: I1003 13:14:08.079725 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b0da1427-1e89-42d6-beb2-55f292945177-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b0da1427-1e89-42d6-beb2-55f292945177\") " pod="openstack/glance-default-external-api-0" Oct 03 13:14:08 crc kubenswrapper[4962]: I1003 13:14:08.080515 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0da1427-1e89-42d6-beb2-55f292945177-logs\") pod \"glance-default-external-api-0\" (UID: \"b0da1427-1e89-42d6-beb2-55f292945177\") " pod="openstack/glance-default-external-api-0" Oct 03 13:14:08 crc kubenswrapper[4962]: I1003 13:14:08.080867 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"b0da1427-1e89-42d6-beb2-55f292945177\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-external-api-0" Oct 03 13:14:08 crc kubenswrapper[4962]: I1003 13:14:08.091971 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/b0da1427-1e89-42d6-beb2-55f292945177-scripts\") pod \"glance-default-external-api-0\" (UID: \"b0da1427-1e89-42d6-beb2-55f292945177\") " pod="openstack/glance-default-external-api-0" Oct 03 13:14:08 crc kubenswrapper[4962]: I1003 13:14:08.096407 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0da1427-1e89-42d6-beb2-55f292945177-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b0da1427-1e89-42d6-beb2-55f292945177\") " pod="openstack/glance-default-external-api-0" Oct 03 13:14:08 crc kubenswrapper[4962]: I1003 13:14:08.097297 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0da1427-1e89-42d6-beb2-55f292945177-config-data\") pod \"glance-default-external-api-0\" (UID: \"b0da1427-1e89-42d6-beb2-55f292945177\") " pod="openstack/glance-default-external-api-0" Oct 03 13:14:08 crc kubenswrapper[4962]: I1003 13:14:08.109522 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wbh7\" (UniqueName: \"kubernetes.io/projected/b0da1427-1e89-42d6-beb2-55f292945177-kube-api-access-4wbh7\") pod \"glance-default-external-api-0\" (UID: \"b0da1427-1e89-42d6-beb2-55f292945177\") " pod="openstack/glance-default-external-api-0" Oct 03 13:14:08 crc kubenswrapper[4962]: I1003 13:14:08.110794 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0da1427-1e89-42d6-beb2-55f292945177-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b0da1427-1e89-42d6-beb2-55f292945177\") " pod="openstack/glance-default-external-api-0" Oct 03 13:14:08 crc kubenswrapper[4962]: I1003 13:14:08.172425 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"b0da1427-1e89-42d6-beb2-55f292945177\") " pod="openstack/glance-default-external-api-0" Oct 03 13:14:08 crc kubenswrapper[4962]: I1003 13:14:08.241122 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c36ae572-9009-4126-88fa-a27e232e4332" path="/var/lib/kubelet/pods/c36ae572-9009-4126-88fa-a27e232e4332/volumes" Oct 03 13:14:08 crc kubenswrapper[4962]: I1003 13:14:08.257076 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 13:14:08 crc kubenswrapper[4962]: I1003 13:14:08.504091 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 03 13:14:08 crc kubenswrapper[4962]: I1003 13:14:08.752695 4962 generic.go:334] "Generic (PLEG): container finished" podID="7d947c9f-2cbc-428e-9464-ff2b560fe91c" containerID="401ac99d9fde3fe9f42593a4e4c45a252fcc44d73b6672127bacb6e49ba6089b" exitCode=0 Oct 03 13:14:08 crc kubenswrapper[4962]: I1003 13:14:08.752738 4962 generic.go:334] "Generic (PLEG): container finished" podID="7d947c9f-2cbc-428e-9464-ff2b560fe91c" containerID="4bc12a8ce5d674870d7a9cecfd018f28f1578b48e4cd38d0fd8d23c91191ef01" exitCode=2 Oct 03 13:14:08 crc kubenswrapper[4962]: I1003 13:14:08.752748 4962 generic.go:334] "Generic (PLEG): container finished" podID="7d947c9f-2cbc-428e-9464-ff2b560fe91c" containerID="5a284e290c5063b0cd126fa801f84c0313614338f12abf7857a941fc01e24c85" exitCode=0 Oct 03 13:14:08 crc kubenswrapper[4962]: I1003 13:14:08.752832 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d947c9f-2cbc-428e-9464-ff2b560fe91c","Type":"ContainerDied","Data":"401ac99d9fde3fe9f42593a4e4c45a252fcc44d73b6672127bacb6e49ba6089b"} Oct 03 13:14:08 crc kubenswrapper[4962]: I1003 13:14:08.752865 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d947c9f-2cbc-428e-9464-ff2b560fe91c","Type":"ContainerDied","Data":"4bc12a8ce5d674870d7a9cecfd018f28f1578b48e4cd38d0fd8d23c91191ef01"} Oct 03 13:14:08 crc kubenswrapper[4962]: I1003 13:14:08.752877 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d947c9f-2cbc-428e-9464-ff2b560fe91c","Type":"ContainerDied","Data":"5a284e290c5063b0cd126fa801f84c0313614338f12abf7857a941fc01e24c85"} Oct 03 13:14:08 crc kubenswrapper[4962]: I1003 13:14:08.760562 4962 generic.go:334] "Generic (PLEG): container finished" podID="0fbdbc1d-a3bc-4bd3-80aa-a590977a4329" containerID="bba8dbd73d396cb05609d66b70ee7132d3c6fc727cd79d137e8dcac91a81f9d5" exitCode=0 Oct 03 13:14:08 crc kubenswrapper[4962]: I1003 13:14:08.760617 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-d10f-account-create-nxl99" event={"ID":"0fbdbc1d-a3bc-4bd3-80aa-a590977a4329","Type":"ContainerDied","Data":"bba8dbd73d396cb05609d66b70ee7132d3c6fc727cd79d137e8dcac91a81f9d5"} Oct 03 13:14:08 crc kubenswrapper[4962]: I1003 13:14:08.788202 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"009b2959-1113-4574-a2ec-90bbe2d8f8ef","Type":"ContainerStarted","Data":"ef900e494d8a0abde750256fdbb2b7f39a5a8f037757e8bc1a381d77522fc261"} Oct 03 13:14:08 crc kubenswrapper[4962]: I1003 13:14:08.798124 4962 generic.go:334] "Generic (PLEG): container finished" podID="4129f89f-a08d-4374-9661-15e30b59b01f" containerID="26ea37844a3fdbef6bd439665981583a3e5b323f9b6bca65d529332f1bdd10fe" exitCode=0 Oct 03 13:14:08 crc kubenswrapper[4962]: I1003 13:14:08.798208 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b903-account-create-gkqkx" event={"ID":"4129f89f-a08d-4374-9661-15e30b59b01f","Type":"ContainerDied","Data":"26ea37844a3fdbef6bd439665981583a3e5b323f9b6bca65d529332f1bdd10fe"} Oct 03 13:14:08 crc kubenswrapper[4962]: I1003 13:14:08.822817 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" 
podStartSLOduration=3.8227975819999997 podStartE2EDuration="3.822797582s" podCreationTimestamp="2025-10-03 13:14:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:14:08.812770002 +0000 UTC m=+1457.216667857" watchObservedRunningTime="2025-10-03 13:14:08.822797582 +0000 UTC m=+1457.226695417" Oct 03 13:14:08 crc kubenswrapper[4962]: I1003 13:14:08.848564 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 13:14:09 crc kubenswrapper[4962]: I1003 13:14:09.247761 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-6963-account-create-gqhqw" Oct 03 13:14:09 crc kubenswrapper[4962]: I1003 13:14:09.335614 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 13:14:09 crc kubenswrapper[4962]: I1003 13:14:09.410350 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsgz9\" (UniqueName: \"kubernetes.io/projected/f7872ef4-f86f-430d-b77e-3249dbda3a80-kube-api-access-hsgz9\") pod \"f7872ef4-f86f-430d-b77e-3249dbda3a80\" (UID: \"f7872ef4-f86f-430d-b77e-3249dbda3a80\") " Oct 03 13:14:09 crc kubenswrapper[4962]: I1003 13:14:09.422601 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7872ef4-f86f-430d-b77e-3249dbda3a80-kube-api-access-hsgz9" (OuterVolumeSpecName: "kube-api-access-hsgz9") pod "f7872ef4-f86f-430d-b77e-3249dbda3a80" (UID: "f7872ef4-f86f-430d-b77e-3249dbda3a80"). InnerVolumeSpecName "kube-api-access-hsgz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:14:09 crc kubenswrapper[4962]: I1003 13:14:09.512634 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ef04f91-585b-4d7a-8847-c83ec719fa7f-scripts\") pod \"8ef04f91-585b-4d7a-8847-c83ec719fa7f\" (UID: \"8ef04f91-585b-4d7a-8847-c83ec719fa7f\") " Oct 03 13:14:09 crc kubenswrapper[4962]: I1003 13:14:09.512681 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ef04f91-585b-4d7a-8847-c83ec719fa7f-httpd-run\") pod \"8ef04f91-585b-4d7a-8847-c83ec719fa7f\" (UID: \"8ef04f91-585b-4d7a-8847-c83ec719fa7f\") " Oct 03 13:14:09 crc kubenswrapper[4962]: I1003 13:14:09.512714 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ef04f91-585b-4d7a-8847-c83ec719fa7f-combined-ca-bundle\") pod \"8ef04f91-585b-4d7a-8847-c83ec719fa7f\" (UID: \"8ef04f91-585b-4d7a-8847-c83ec719fa7f\") " Oct 03 13:14:09 crc kubenswrapper[4962]: I1003 13:14:09.512753 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ef04f91-585b-4d7a-8847-c83ec719fa7f-config-data\") pod \"8ef04f91-585b-4d7a-8847-c83ec719fa7f\" (UID: \"8ef04f91-585b-4d7a-8847-c83ec719fa7f\") " Oct 03 13:14:09 crc kubenswrapper[4962]: I1003 13:14:09.512842 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ef04f91-585b-4d7a-8847-c83ec719fa7f-internal-tls-certs\") pod \"8ef04f91-585b-4d7a-8847-c83ec719fa7f\" (UID: \"8ef04f91-585b-4d7a-8847-c83ec719fa7f\") " Oct 03 13:14:09 crc 
kubenswrapper[4962]: I1003 13:14:09.512913 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tl94\" (UniqueName: \"kubernetes.io/projected/8ef04f91-585b-4d7a-8847-c83ec719fa7f-kube-api-access-5tl94\") pod \"8ef04f91-585b-4d7a-8847-c83ec719fa7f\" (UID: \"8ef04f91-585b-4d7a-8847-c83ec719fa7f\") " Oct 03 13:14:09 crc kubenswrapper[4962]: I1003 13:14:09.512981 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ef04f91-585b-4d7a-8847-c83ec719fa7f-logs\") pod \"8ef04f91-585b-4d7a-8847-c83ec719fa7f\" (UID: \"8ef04f91-585b-4d7a-8847-c83ec719fa7f\") " Oct 03 13:14:09 crc kubenswrapper[4962]: I1003 13:14:09.513025 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"8ef04f91-585b-4d7a-8847-c83ec719fa7f\" (UID: \"8ef04f91-585b-4d7a-8847-c83ec719fa7f\") " Oct 03 13:14:09 crc kubenswrapper[4962]: I1003 13:14:09.513191 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ef04f91-585b-4d7a-8847-c83ec719fa7f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8ef04f91-585b-4d7a-8847-c83ec719fa7f" (UID: "8ef04f91-585b-4d7a-8847-c83ec719fa7f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:14:09 crc kubenswrapper[4962]: I1003 13:14:09.513712 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsgz9\" (UniqueName: \"kubernetes.io/projected/f7872ef4-f86f-430d-b77e-3249dbda3a80-kube-api-access-hsgz9\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:09 crc kubenswrapper[4962]: I1003 13:14:09.513741 4962 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ef04f91-585b-4d7a-8847-c83ec719fa7f-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:09 crc kubenswrapper[4962]: I1003 13:14:09.516638 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "8ef04f91-585b-4d7a-8847-c83ec719fa7f" (UID: "8ef04f91-585b-4d7a-8847-c83ec719fa7f"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 13:14:09 crc kubenswrapper[4962]: I1003 13:14:09.518987 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ef04f91-585b-4d7a-8847-c83ec719fa7f-scripts" (OuterVolumeSpecName: "scripts") pod "8ef04f91-585b-4d7a-8847-c83ec719fa7f" (UID: "8ef04f91-585b-4d7a-8847-c83ec719fa7f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:14:09 crc kubenswrapper[4962]: I1003 13:14:09.519326 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ef04f91-585b-4d7a-8847-c83ec719fa7f-logs" (OuterVolumeSpecName: "logs") pod "8ef04f91-585b-4d7a-8847-c83ec719fa7f" (UID: "8ef04f91-585b-4d7a-8847-c83ec719fa7f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:14:09 crc kubenswrapper[4962]: I1003 13:14:09.529188 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ef04f91-585b-4d7a-8847-c83ec719fa7f-kube-api-access-5tl94" (OuterVolumeSpecName: "kube-api-access-5tl94") pod "8ef04f91-585b-4d7a-8847-c83ec719fa7f" (UID: "8ef04f91-585b-4d7a-8847-c83ec719fa7f"). InnerVolumeSpecName "kube-api-access-5tl94". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:14:09 crc kubenswrapper[4962]: I1003 13:14:09.559100 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ef04f91-585b-4d7a-8847-c83ec719fa7f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ef04f91-585b-4d7a-8847-c83ec719fa7f" (UID: "8ef04f91-585b-4d7a-8847-c83ec719fa7f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:14:09 crc kubenswrapper[4962]: I1003 13:14:09.598604 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ef04f91-585b-4d7a-8847-c83ec719fa7f-config-data" (OuterVolumeSpecName: "config-data") pod "8ef04f91-585b-4d7a-8847-c83ec719fa7f" (UID: "8ef04f91-585b-4d7a-8847-c83ec719fa7f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:14:09 crc kubenswrapper[4962]: I1003 13:14:09.600752 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ef04f91-585b-4d7a-8847-c83ec719fa7f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8ef04f91-585b-4d7a-8847-c83ec719fa7f" (UID: "8ef04f91-585b-4d7a-8847-c83ec719fa7f"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:14:09 crc kubenswrapper[4962]: I1003 13:14:09.615576 4962 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ef04f91-585b-4d7a-8847-c83ec719fa7f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:09 crc kubenswrapper[4962]: I1003 13:14:09.615608 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tl94\" (UniqueName: \"kubernetes.io/projected/8ef04f91-585b-4d7a-8847-c83ec719fa7f-kube-api-access-5tl94\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:09 crc kubenswrapper[4962]: I1003 13:14:09.615619 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ef04f91-585b-4d7a-8847-c83ec719fa7f-logs\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:09 crc kubenswrapper[4962]: I1003 13:14:09.615665 4962 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Oct 03 13:14:09 crc kubenswrapper[4962]: I1003 13:14:09.615675 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ef04f91-585b-4d7a-8847-c83ec719fa7f-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:09 crc kubenswrapper[4962]: I1003 13:14:09.615682 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ef04f91-585b-4d7a-8847-c83ec719fa7f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:09 crc kubenswrapper[4962]: I1003 13:14:09.615691 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ef04f91-585b-4d7a-8847-c83ec719fa7f-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:09 crc kubenswrapper[4962]: I1003 13:14:09.682825 4962 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Oct 03 13:14:09 crc kubenswrapper[4962]: I1003 13:14:09.717513 4962 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:09 crc kubenswrapper[4962]: I1003 13:14:09.844303 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6963-account-create-gqhqw" event={"ID":"f7872ef4-f86f-430d-b77e-3249dbda3a80","Type":"ContainerDied","Data":"048d4de6d15afd9202a3d74f3a3ad259fb513d8f3771bf36f26150c0cc454d5c"} Oct 03 13:14:09 crc kubenswrapper[4962]: I1003 13:14:09.844614 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="048d4de6d15afd9202a3d74f3a3ad259fb513d8f3771bf36f26150c0cc454d5c" Oct 03 13:14:09 crc kubenswrapper[4962]: I1003 13:14:09.844731 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-6963-account-create-gqhqw" Oct 03 13:14:09 crc kubenswrapper[4962]: I1003 13:14:09.853317 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b0da1427-1e89-42d6-beb2-55f292945177","Type":"ContainerStarted","Data":"b27da2f01290ecf61072efad87218e000a6819ad0aac516d4d56189f22787d6c"} Oct 03 13:14:09 crc kubenswrapper[4962]: I1003 13:14:09.853577 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b0da1427-1e89-42d6-beb2-55f292945177","Type":"ContainerStarted","Data":"f2a67da5b67156e26f78a02bc4d28c3835e8ad9566e2ba809f777e9f3545c5c8"} Oct 03 13:14:09 crc kubenswrapper[4962]: I1003 13:14:09.864901 4962 generic.go:334] "Generic (PLEG): container finished" podID="8ef04f91-585b-4d7a-8847-c83ec719fa7f" containerID="a00fe3f1a213a69f33c828fec39d72830ee8291769de616332e0a9feff2f1c66" exitCode=0 Oct 03 13:14:09 crc kubenswrapper[4962]: I1003 13:14:09.865846 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 13:14:09 crc kubenswrapper[4962]: I1003 13:14:09.865922 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8ef04f91-585b-4d7a-8847-c83ec719fa7f","Type":"ContainerDied","Data":"a00fe3f1a213a69f33c828fec39d72830ee8291769de616332e0a9feff2f1c66"} Oct 03 13:14:09 crc kubenswrapper[4962]: I1003 13:14:09.865980 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8ef04f91-585b-4d7a-8847-c83ec719fa7f","Type":"ContainerDied","Data":"4e862bef3957222a2a9336bde6e4080b19d7312e422bd0524279be5ec6432c2b"} Oct 03 13:14:09 crc kubenswrapper[4962]: I1003 13:14:09.866001 4962 scope.go:117] "RemoveContainer" containerID="a00fe3f1a213a69f33c828fec39d72830ee8291769de616332e0a9feff2f1c66" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.013831 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.051668 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.096287 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 13:14:10 crc kubenswrapper[4962]: E1003 13:14:10.096705 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ef04f91-585b-4d7a-8847-c83ec719fa7f" containerName="glance-httpd" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.096717 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ef04f91-585b-4d7a-8847-c83ec719fa7f" containerName="glance-httpd" Oct 03 13:14:10 crc kubenswrapper[4962]: E1003 13:14:10.096729 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7872ef4-f86f-430d-b77e-3249dbda3a80" containerName="mariadb-account-create" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.096735 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7872ef4-f86f-430d-b77e-3249dbda3a80" containerName="mariadb-account-create" Oct 03 13:14:10 crc kubenswrapper[4962]: E1003 13:14:10.096770 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ef04f91-585b-4d7a-8847-c83ec719fa7f" containerName="glance-log" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.096777 4962 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="8ef04f91-585b-4d7a-8847-c83ec719fa7f" containerName="glance-log" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.096944 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7872ef4-f86f-430d-b77e-3249dbda3a80" containerName="mariadb-account-create" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.096960 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ef04f91-585b-4d7a-8847-c83ec719fa7f" containerName="glance-log" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.096978 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ef04f91-585b-4d7a-8847-c83ec719fa7f" containerName="glance-httpd" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.098025 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.104344 4962 scope.go:117] "RemoveContainer" containerID="f738f09dad1b33525cc99c16d86a7af3e15f5062109ba5f976f205a990dd451d" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.107949 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.109631 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.110885 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.137730 4962 scope.go:117] "RemoveContainer" containerID="a00fe3f1a213a69f33c828fec39d72830ee8291769de616332e0a9feff2f1c66" Oct 03 13:14:10 crc kubenswrapper[4962]: E1003 13:14:10.138247 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a00fe3f1a213a69f33c828fec39d72830ee8291769de616332e0a9feff2f1c66\": container with ID starting with a00fe3f1a213a69f33c828fec39d72830ee8291769de616332e0a9feff2f1c66 not found: ID does not exist" containerID="a00fe3f1a213a69f33c828fec39d72830ee8291769de616332e0a9feff2f1c66" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.138280 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a00fe3f1a213a69f33c828fec39d72830ee8291769de616332e0a9feff2f1c66"} err="failed to get container status \"a00fe3f1a213a69f33c828fec39d72830ee8291769de616332e0a9feff2f1c66\": rpc error: code = NotFound desc = could not find container \"a00fe3f1a213a69f33c828fec39d72830ee8291769de616332e0a9feff2f1c66\": container with ID starting with a00fe3f1a213a69f33c828fec39d72830ee8291769de616332e0a9feff2f1c66 not found: ID does not exist" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.138307 4962 scope.go:117] "RemoveContainer" containerID="f738f09dad1b33525cc99c16d86a7af3e15f5062109ba5f976f205a990dd451d" Oct 03 13:14:10 crc kubenswrapper[4962]: E1003 13:14:10.140370 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f738f09dad1b33525cc99c16d86a7af3e15f5062109ba5f976f205a990dd451d\": container with ID starting with f738f09dad1b33525cc99c16d86a7af3e15f5062109ba5f976f205a990dd451d not found: ID does not exist" containerID="f738f09dad1b33525cc99c16d86a7af3e15f5062109ba5f976f205a990dd451d" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.140420 4962 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f738f09dad1b33525cc99c16d86a7af3e15f5062109ba5f976f205a990dd451d"} err="failed to get container status \"f738f09dad1b33525cc99c16d86a7af3e15f5062109ba5f976f205a990dd451d\": rpc error: code = NotFound desc = could not find container \"f738f09dad1b33525cc99c16d86a7af3e15f5062109ba5f976f205a990dd451d\": container with ID starting with f738f09dad1b33525cc99c16d86a7af3e15f5062109ba5f976f205a990dd451d not found: ID does not exist" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.238658 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cea3d32c-24c3-4a80-a1fb-ad65be7bbba6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cea3d32c-24c3-4a80-a1fb-ad65be7bbba6\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.238992 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cea3d32c-24c3-4a80-a1fb-ad65be7bbba6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cea3d32c-24c3-4a80-a1fb-ad65be7bbba6\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.239050 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpplp\" (UniqueName: \"kubernetes.io/projected/cea3d32c-24c3-4a80-a1fb-ad65be7bbba6-kube-api-access-lpplp\") pod \"glance-default-internal-api-0\" (UID: \"cea3d32c-24c3-4a80-a1fb-ad65be7bbba6\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.239077 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"cea3d32c-24c3-4a80-a1fb-ad65be7bbba6\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.239112 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cea3d32c-24c3-4a80-a1fb-ad65be7bbba6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cea3d32c-24c3-4a80-a1fb-ad65be7bbba6\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.239148 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cea3d32c-24c3-4a80-a1fb-ad65be7bbba6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cea3d32c-24c3-4a80-a1fb-ad65be7bbba6\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.239168 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cea3d32c-24c3-4a80-a1fb-ad65be7bbba6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cea3d32c-24c3-4a80-a1fb-ad65be7bbba6\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.239183 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/cea3d32c-24c3-4a80-a1fb-ad65be7bbba6-logs\") pod \"glance-default-internal-api-0\" (UID: \"cea3d32c-24c3-4a80-a1fb-ad65be7bbba6\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.261603 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ef04f91-585b-4d7a-8847-c83ec719fa7f" path="/var/lib/kubelet/pods/8ef04f91-585b-4d7a-8847-c83ec719fa7f/volumes" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.340180 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"cea3d32c-24c3-4a80-a1fb-ad65be7bbba6\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.340415 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cea3d32c-24c3-4a80-a1fb-ad65be7bbba6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cea3d32c-24c3-4a80-a1fb-ad65be7bbba6\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.340468 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cea3d32c-24c3-4a80-a1fb-ad65be7bbba6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cea3d32c-24c3-4a80-a1fb-ad65be7bbba6\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.340488 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cea3d32c-24c3-4a80-a1fb-ad65be7bbba6-logs\") pod \"glance-default-internal-api-0\" (UID: \"cea3d32c-24c3-4a80-a1fb-ad65be7bbba6\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.340504 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cea3d32c-24c3-4a80-a1fb-ad65be7bbba6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cea3d32c-24c3-4a80-a1fb-ad65be7bbba6\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.340569 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cea3d32c-24c3-4a80-a1fb-ad65be7bbba6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cea3d32c-24c3-4a80-a1fb-ad65be7bbba6\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.340598 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cea3d32c-24c3-4a80-a1fb-ad65be7bbba6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cea3d32c-24c3-4a80-a1fb-ad65be7bbba6\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.340682 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpplp\" (UniqueName: \"kubernetes.io/projected/cea3d32c-24c3-4a80-a1fb-ad65be7bbba6-kube-api-access-lpplp\") pod \"glance-default-internal-api-0\" (UID: \"cea3d32c-24c3-4a80-a1fb-ad65be7bbba6\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:14:10 crc 
kubenswrapper[4962]: I1003 13:14:10.341832 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"cea3d32c-24c3-4a80-a1fb-ad65be7bbba6\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.343480 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cea3d32c-24c3-4a80-a1fb-ad65be7bbba6-logs\") pod \"glance-default-internal-api-0\" (UID: \"cea3d32c-24c3-4a80-a1fb-ad65be7bbba6\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.343973 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cea3d32c-24c3-4a80-a1fb-ad65be7bbba6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cea3d32c-24c3-4a80-a1fb-ad65be7bbba6\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.355395 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cea3d32c-24c3-4a80-a1fb-ad65be7bbba6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cea3d32c-24c3-4a80-a1fb-ad65be7bbba6\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.357205 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cea3d32c-24c3-4a80-a1fb-ad65be7bbba6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cea3d32c-24c3-4a80-a1fb-ad65be7bbba6\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.358467 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cea3d32c-24c3-4a80-a1fb-ad65be7bbba6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cea3d32c-24c3-4a80-a1fb-ad65be7bbba6\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.362990 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cea3d32c-24c3-4a80-a1fb-ad65be7bbba6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cea3d32c-24c3-4a80-a1fb-ad65be7bbba6\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.363163 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpplp\" (UniqueName: \"kubernetes.io/projected/cea3d32c-24c3-4a80-a1fb-ad65be7bbba6-kube-api-access-lpplp\") pod \"glance-default-internal-api-0\" (UID: \"cea3d32c-24c3-4a80-a1fb-ad65be7bbba6\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.385746 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"cea3d32c-24c3-4a80-a1fb-ad65be7bbba6\") " pod="openstack/glance-default-internal-api-0" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.427355 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.430367 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-b903-account-create-gkqkx" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.554563 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-d10f-account-create-nxl99" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.578152 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gm7g\" (UniqueName: \"kubernetes.io/projected/4129f89f-a08d-4374-9661-15e30b59b01f-kube-api-access-4gm7g\") pod \"4129f89f-a08d-4374-9661-15e30b59b01f\" (UID: \"4129f89f-a08d-4374-9661-15e30b59b01f\") " Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.586124 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4129f89f-a08d-4374-9661-15e30b59b01f-kube-api-access-4gm7g" (OuterVolumeSpecName: "kube-api-access-4gm7g") pod "4129f89f-a08d-4374-9661-15e30b59b01f" (UID: "4129f89f-a08d-4374-9661-15e30b59b01f"). InnerVolumeSpecName "kube-api-access-4gm7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.679371 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ks64w\" (UniqueName: \"kubernetes.io/projected/0fbdbc1d-a3bc-4bd3-80aa-a590977a4329-kube-api-access-ks64w\") pod \"0fbdbc1d-a3bc-4bd3-80aa-a590977a4329\" (UID: \"0fbdbc1d-a3bc-4bd3-80aa-a590977a4329\") " Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.680132 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gm7g\" (UniqueName: \"kubernetes.io/projected/4129f89f-a08d-4374-9661-15e30b59b01f-kube-api-access-4gm7g\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.692803 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fbdbc1d-a3bc-4bd3-80aa-a590977a4329-kube-api-access-ks64w" (OuterVolumeSpecName: "kube-api-access-ks64w") pod "0fbdbc1d-a3bc-4bd3-80aa-a590977a4329" (UID: "0fbdbc1d-a3bc-4bd3-80aa-a590977a4329"). InnerVolumeSpecName "kube-api-access-ks64w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.787016 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ks64w\" (UniqueName: \"kubernetes.io/projected/0fbdbc1d-a3bc-4bd3-80aa-a590977a4329-kube-api-access-ks64w\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.876258 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b0da1427-1e89-42d6-beb2-55f292945177","Type":"ContainerStarted","Data":"c5177f0f305f7d8efd50064ca1ae9320ecca80819662d7f114a735ed39509584"} Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.882327 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-d10f-account-create-nxl99" event={"ID":"0fbdbc1d-a3bc-4bd3-80aa-a590977a4329","Type":"ContainerDied","Data":"6b12d0108bdd25250dbadd0639505a8740f970d7699dce12ffdc1723db0a862e"} Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.882548 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b12d0108bdd25250dbadd0639505a8740f970d7699dce12ffdc1723db0a862e" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.882342 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-d10f-account-create-nxl99" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.886153 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b903-account-create-gkqkx" event={"ID":"4129f89f-a08d-4374-9661-15e30b59b01f","Type":"ContainerDied","Data":"d4af2501763fb3a3638fbb8d69cb17c71d9ea41a230b9897ff59f78cf0bf1fe6"} Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.886194 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4af2501763fb3a3638fbb8d69cb17c71d9ea41a230b9897ff59f78cf0bf1fe6" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.886259 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-b903-account-create-gkqkx" Oct 03 13:14:10 crc kubenswrapper[4962]: I1003 13:14:10.902122 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.902102951 podStartE2EDuration="3.902102951s" podCreationTimestamp="2025-10-03 13:14:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:14:10.89944906 +0000 UTC m=+1459.303346895" watchObservedRunningTime="2025-10-03 13:14:10.902102951 +0000 UTC m=+1459.306000786" Oct 03 13:14:11 crc kubenswrapper[4962]: I1003 13:14:11.025218 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 03 13:14:11 crc kubenswrapper[4962]: W1003 13:14:11.041155 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcea3d32c_24c3_4a80_a1fb_ad65be7bbba6.slice/crio-0c2c25b66b133fc697df6049c4530b5fe754ab1804e372662ca5bb9256497054 WatchSource:0}: Error finding container 0c2c25b66b133fc697df6049c4530b5fe754ab1804e372662ca5bb9256497054: Status 404 returned error can't find the container with id 0c2c25b66b133fc697df6049c4530b5fe754ab1804e372662ca5bb9256497054 Oct 03 13:14:11 crc kubenswrapper[4962]: I1003 13:14:11.043203 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 13:14:11 crc kubenswrapper[4962]: I1003 13:14:11.815090 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bz42z"] Oct 03 13:14:11 crc kubenswrapper[4962]: E1003 13:14:11.815833 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fbdbc1d-a3bc-4bd3-80aa-a590977a4329" containerName="mariadb-account-create" Oct 03 13:14:11 crc kubenswrapper[4962]: I1003 13:14:11.815846 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fbdbc1d-a3bc-4bd3-80aa-a590977a4329" containerName="mariadb-account-create" Oct 03 13:14:11 crc kubenswrapper[4962]: E1003 13:14:11.815871 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4129f89f-a08d-4374-9661-15e30b59b01f" containerName="mariadb-account-create" Oct 03 13:14:11 crc kubenswrapper[4962]: I1003 13:14:11.815877 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="4129f89f-a08d-4374-9661-15e30b59b01f" containerName="mariadb-account-create" Oct 03 13:14:11 crc kubenswrapper[4962]: I1003 13:14:11.816038 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="4129f89f-a08d-4374-9661-15e30b59b01f" containerName="mariadb-account-create" Oct 03 13:14:11 crc kubenswrapper[4962]: I1003 13:14:11.816061 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fbdbc1d-a3bc-4bd3-80aa-a590977a4329" containerName="mariadb-account-create" Oct 03 13:14:11 crc kubenswrapper[4962]: I1003 13:14:11.817260 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bz42z" Oct 03 13:14:11 crc kubenswrapper[4962]: I1003 13:14:11.916704 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cea3d32c-24c3-4a80-a1fb-ad65be7bbba6","Type":"ContainerStarted","Data":"ac73c42dc924c54bc8c349e88f72de0cd595955349b0de465efb0b5629a1c596"} Oct 03 13:14:11 crc kubenswrapper[4962]: I1003 13:14:11.918577 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cea3d32c-24c3-4a80-a1fb-ad65be7bbba6","Type":"ContainerStarted","Data":"0c2c25b66b133fc697df6049c4530b5fe754ab1804e372662ca5bb9256497054"} Oct 03 13:14:11 crc kubenswrapper[4962]: I1003 13:14:11.927712 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b60070ef-b284-466e-b596-9ea66a2b8896-catalog-content\") pod \"community-operators-bz42z\" (UID: \"b60070ef-b284-466e-b596-9ea66a2b8896\") " pod="openshift-marketplace/community-operators-bz42z" Oct 03 13:14:11 crc kubenswrapper[4962]: I1003 13:14:11.927771 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfvqr\" (UniqueName: \"kubernetes.io/projected/b60070ef-b284-466e-b596-9ea66a2b8896-kube-api-access-mfvqr\") pod \"community-operators-bz42z\" (UID: \"b60070ef-b284-466e-b596-9ea66a2b8896\") " pod="openshift-marketplace/community-operators-bz42z" Oct 03 13:14:11 crc kubenswrapper[4962]: I1003 13:14:11.927826 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b60070ef-b284-466e-b596-9ea66a2b8896-utilities\") pod \"community-operators-bz42z\" (UID: \"b60070ef-b284-466e-b596-9ea66a2b8896\") " pod="openshift-marketplace/community-operators-bz42z" Oct 03 13:14:11 crc kubenswrapper[4962]: I1003 13:14:11.936686 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bz42z"] Oct 03 13:14:12 crc kubenswrapper[4962]: E1003 13:14:12.026229 4962 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b9e6c89_714e_4efd_9adc_c15cd5b3eb6b.slice\": RecentStats: unable to find data in memory cache]" Oct 03 13:14:12 crc kubenswrapper[4962]: I1003 13:14:12.029733 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b60070ef-b284-466e-b596-9ea66a2b8896-catalog-content\") pod \"community-operators-bz42z\" (UID: \"b60070ef-b284-466e-b596-9ea66a2b8896\") " pod="openshift-marketplace/community-operators-bz42z" Oct 03 13:14:12 crc kubenswrapper[4962]: I1003 13:14:12.029895 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfvqr\" (UniqueName: \"kubernetes.io/projected/b60070ef-b284-466e-b596-9ea66a2b8896-kube-api-access-mfvqr\") pod \"community-operators-bz42z\" (UID: \"b60070ef-b284-466e-b596-9ea66a2b8896\") " pod="openshift-marketplace/community-operators-bz42z" Oct 03 13:14:12 crc kubenswrapper[4962]: I1003 13:14:12.030027 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b60070ef-b284-466e-b596-9ea66a2b8896-utilities\") pod 
\"community-operators-bz42z\" (UID: \"b60070ef-b284-466e-b596-9ea66a2b8896\") " pod="openshift-marketplace/community-operators-bz42z" Oct 03 13:14:12 crc kubenswrapper[4962]: I1003 13:14:12.030557 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b60070ef-b284-466e-b596-9ea66a2b8896-utilities\") pod \"community-operators-bz42z\" (UID: \"b60070ef-b284-466e-b596-9ea66a2b8896\") " pod="openshift-marketplace/community-operators-bz42z" Oct 03 13:14:12 crc kubenswrapper[4962]: I1003 13:14:12.031057 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b60070ef-b284-466e-b596-9ea66a2b8896-catalog-content\") pod \"community-operators-bz42z\" (UID: \"b60070ef-b284-466e-b596-9ea66a2b8896\") " pod="openshift-marketplace/community-operators-bz42z" Oct 03 13:14:12 crc kubenswrapper[4962]: I1003 13:14:12.059252 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfvqr\" (UniqueName: \"kubernetes.io/projected/b60070ef-b284-466e-b596-9ea66a2b8896-kube-api-access-mfvqr\") pod \"community-operators-bz42z\" (UID: \"b60070ef-b284-466e-b596-9ea66a2b8896\") " pod="openshift-marketplace/community-operators-bz42z" Oct 03 13:14:12 crc kubenswrapper[4962]: I1003 13:14:12.313183 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bz42z" Oct 03 13:14:12 crc kubenswrapper[4962]: I1003 13:14:12.739830 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 13:14:12 crc kubenswrapper[4962]: I1003 13:14:12.852241 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xxh9\" (UniqueName: \"kubernetes.io/projected/7d947c9f-2cbc-428e-9464-ff2b560fe91c-kube-api-access-8xxh9\") pod \"7d947c9f-2cbc-428e-9464-ff2b560fe91c\" (UID: \"7d947c9f-2cbc-428e-9464-ff2b560fe91c\") " Oct 03 13:14:12 crc kubenswrapper[4962]: I1003 13:14:12.852343 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d947c9f-2cbc-428e-9464-ff2b560fe91c-config-data\") pod \"7d947c9f-2cbc-428e-9464-ff2b560fe91c\" (UID: \"7d947c9f-2cbc-428e-9464-ff2b560fe91c\") " Oct 03 13:14:12 crc kubenswrapper[4962]: I1003 13:14:12.852366 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d947c9f-2cbc-428e-9464-ff2b560fe91c-log-httpd\") pod \"7d947c9f-2cbc-428e-9464-ff2b560fe91c\" (UID: \"7d947c9f-2cbc-428e-9464-ff2b560fe91c\") " Oct 03 13:14:12 crc kubenswrapper[4962]: I1003 13:14:12.852402 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d947c9f-2cbc-428e-9464-ff2b560fe91c-scripts\") pod \"7d947c9f-2cbc-428e-9464-ff2b560fe91c\" (UID: \"7d947c9f-2cbc-428e-9464-ff2b560fe91c\") " Oct 03 13:14:12 crc kubenswrapper[4962]: I1003 13:14:12.852438 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7d947c9f-2cbc-428e-9464-ff2b560fe91c-sg-core-conf-yaml\") pod \"7d947c9f-2cbc-428e-9464-ff2b560fe91c\" (UID: \"7d947c9f-2cbc-428e-9464-ff2b560fe91c\") " Oct 03 13:14:12 crc kubenswrapper[4962]: I1003 13:14:12.852472 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d947c9f-2cbc-428e-9464-ff2b560fe91c-run-httpd\") pod \"7d947c9f-2cbc-428e-9464-ff2b560fe91c\" (UID: \"7d947c9f-2cbc-428e-9464-ff2b560fe91c\") " Oct 03 13:14:12 crc kubenswrapper[4962]: I1003 13:14:12.852503 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d947c9f-2cbc-428e-9464-ff2b560fe91c-combined-ca-bundle\") pod \"7d947c9f-2cbc-428e-9464-ff2b560fe91c\" (UID: \"7d947c9f-2cbc-428e-9464-ff2b560fe91c\") " Oct 03 13:14:12 crc kubenswrapper[4962]: I1003 13:14:12.854129 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d947c9f-2cbc-428e-9464-ff2b560fe91c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7d947c9f-2cbc-428e-9464-ff2b560fe91c" (UID: "7d947c9f-2cbc-428e-9464-ff2b560fe91c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:14:12 crc kubenswrapper[4962]: I1003 13:14:12.855349 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d947c9f-2cbc-428e-9464-ff2b560fe91c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7d947c9f-2cbc-428e-9464-ff2b560fe91c" (UID: "7d947c9f-2cbc-428e-9464-ff2b560fe91c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:14:12 crc kubenswrapper[4962]: I1003 13:14:12.860156 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d947c9f-2cbc-428e-9464-ff2b560fe91c-scripts" (OuterVolumeSpecName: "scripts") pod "7d947c9f-2cbc-428e-9464-ff2b560fe91c" (UID: "7d947c9f-2cbc-428e-9464-ff2b560fe91c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:14:12 crc kubenswrapper[4962]: I1003 13:14:12.874913 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d947c9f-2cbc-428e-9464-ff2b560fe91c-kube-api-access-8xxh9" (OuterVolumeSpecName: "kube-api-access-8xxh9") pod "7d947c9f-2cbc-428e-9464-ff2b560fe91c" (UID: "7d947c9f-2cbc-428e-9464-ff2b560fe91c"). InnerVolumeSpecName "kube-api-access-8xxh9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:14:12 crc kubenswrapper[4962]: I1003 13:14:12.884458 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d947c9f-2cbc-428e-9464-ff2b560fe91c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7d947c9f-2cbc-428e-9464-ff2b560fe91c" (UID: "7d947c9f-2cbc-428e-9464-ff2b560fe91c"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:14:12 crc kubenswrapper[4962]: I1003 13:14:12.954064 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cea3d32c-24c3-4a80-a1fb-ad65be7bbba6","Type":"ContainerStarted","Data":"7377486e2b45b77a0115044df6ada84243926afe00f2d9d4d7481b7f35b3b3cd"} Oct 03 13:14:12 crc kubenswrapper[4962]: I1003 13:14:12.955460 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xxh9\" (UniqueName: \"kubernetes.io/projected/7d947c9f-2cbc-428e-9464-ff2b560fe91c-kube-api-access-8xxh9\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:12 crc kubenswrapper[4962]: I1003 13:14:12.955485 4962 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d947c9f-2cbc-428e-9464-ff2b560fe91c-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:12 crc kubenswrapper[4962]: I1003 13:14:12.955493 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d947c9f-2cbc-428e-9464-ff2b560fe91c-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:12 crc kubenswrapper[4962]: I1003 13:14:12.955502 4962 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7d947c9f-2cbc-428e-9464-ff2b560fe91c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:12 crc kubenswrapper[4962]: I1003 13:14:12.955510 4962 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d947c9f-2cbc-428e-9464-ff2b560fe91c-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:12 crc kubenswrapper[4962]: W1003 13:14:12.957428 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb60070ef_b284_466e_b596_9ea66a2b8896.slice/crio-ded065bb223ef1e43793dd25e7d3a6e112a05eba87a19287fb95c67de9500154 WatchSource:0}: Error finding container ded065bb223ef1e43793dd25e7d3a6e112a05eba87a19287fb95c67de9500154: Status 404 returned error can't find the container with id ded065bb223ef1e43793dd25e7d3a6e112a05eba87a19287fb95c67de9500154 Oct 03 13:14:12 crc kubenswrapper[4962]: I1003 13:14:12.960144 4962 generic.go:334] "Generic (PLEG): container finished" podID="7d947c9f-2cbc-428e-9464-ff2b560fe91c" containerID="8553cc57ed01a9336dd8d6356f82c1369e896bc320f9f46fe4371766926005d8" exitCode=0 Oct 03 13:14:12 crc kubenswrapper[4962]: I1003 13:14:12.960199 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d947c9f-2cbc-428e-9464-ff2b560fe91c","Type":"ContainerDied","Data":"8553cc57ed01a9336dd8d6356f82c1369e896bc320f9f46fe4371766926005d8"} Oct 03 13:14:12 crc kubenswrapper[4962]: I1003 13:14:12.960224 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d947c9f-2cbc-428e-9464-ff2b560fe91c","Type":"ContainerDied","Data":"7d89596962906130eb75f2a94a53f91944508ebcfc318883bd8a1b020fae74f9"} Oct 03 13:14:12 crc kubenswrapper[4962]: I1003 13:14:12.960240 4962 scope.go:117] "RemoveContainer" containerID="401ac99d9fde3fe9f42593a4e4c45a252fcc44d73b6672127bacb6e49ba6089b" Oct 03 13:14:12 crc kubenswrapper[4962]: I1003 13:14:12.960427 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 13:14:12 crc kubenswrapper[4962]: I1003 13:14:12.960615 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d947c9f-2cbc-428e-9464-ff2b560fe91c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7d947c9f-2cbc-428e-9464-ff2b560fe91c" (UID: "7d947c9f-2cbc-428e-9464-ff2b560fe91c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:14:12 crc kubenswrapper[4962]: I1003 13:14:12.961571 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bz42z"] Oct 03 13:14:12 crc kubenswrapper[4962]: I1003 13:14:12.976481 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.97646589 podStartE2EDuration="2.97646589s" podCreationTimestamp="2025-10-03 13:14:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:14:12.975536015 +0000 UTC m=+1461.379433850" watchObservedRunningTime="2025-10-03 13:14:12.97646589 +0000 UTC m=+1461.380363725" Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.010229 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d947c9f-2cbc-428e-9464-ff2b560fe91c-config-data" (OuterVolumeSpecName: "config-data") pod "7d947c9f-2cbc-428e-9464-ff2b560fe91c" (UID: "7d947c9f-2cbc-428e-9464-ff2b560fe91c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.057465 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d947c9f-2cbc-428e-9464-ff2b560fe91c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.057510 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d947c9f-2cbc-428e-9464-ff2b560fe91c-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.062622 4962 scope.go:117] "RemoveContainer" containerID="4bc12a8ce5d674870d7a9cecfd018f28f1578b48e4cd38d0fd8d23c91191ef01" Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.088758 4962 scope.go:117] "RemoveContainer" containerID="5a284e290c5063b0cd126fa801f84c0313614338f12abf7857a941fc01e24c85" Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.116283 4962 scope.go:117] "RemoveContainer" containerID="8553cc57ed01a9336dd8d6356f82c1369e896bc320f9f46fe4371766926005d8" Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.135315 4962 scope.go:117] "RemoveContainer" containerID="401ac99d9fde3fe9f42593a4e4c45a252fcc44d73b6672127bacb6e49ba6089b" Oct 03 13:14:13 crc kubenswrapper[4962]: E1003 13:14:13.135701 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"401ac99d9fde3fe9f42593a4e4c45a252fcc44d73b6672127bacb6e49ba6089b\": container with ID starting with 401ac99d9fde3fe9f42593a4e4c45a252fcc44d73b6672127bacb6e49ba6089b not found: ID does not exist" containerID="401ac99d9fde3fe9f42593a4e4c45a252fcc44d73b6672127bacb6e49ba6089b" Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.135743 4962 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"401ac99d9fde3fe9f42593a4e4c45a252fcc44d73b6672127bacb6e49ba6089b"} err="failed to get container status \"401ac99d9fde3fe9f42593a4e4c45a252fcc44d73b6672127bacb6e49ba6089b\": rpc error: code = NotFound desc = could not find container \"401ac99d9fde3fe9f42593a4e4c45a252fcc44d73b6672127bacb6e49ba6089b\": container with ID starting with 401ac99d9fde3fe9f42593a4e4c45a252fcc44d73b6672127bacb6e49ba6089b not found: ID does not exist" Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.135774 4962 scope.go:117] "RemoveContainer" containerID="4bc12a8ce5d674870d7a9cecfd018f28f1578b48e4cd38d0fd8d23c91191ef01" Oct 03 13:14:13 crc kubenswrapper[4962]: E1003 13:14:13.136167 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bc12a8ce5d674870d7a9cecfd018f28f1578b48e4cd38d0fd8d23c91191ef01\": container with ID starting with 4bc12a8ce5d674870d7a9cecfd018f28f1578b48e4cd38d0fd8d23c91191ef01 not found: ID does not exist" containerID="4bc12a8ce5d674870d7a9cecfd018f28f1578b48e4cd38d0fd8d23c91191ef01" Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.136210 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bc12a8ce5d674870d7a9cecfd018f28f1578b48e4cd38d0fd8d23c91191ef01"} err="failed to get container status \"4bc12a8ce5d674870d7a9cecfd018f28f1578b48e4cd38d0fd8d23c91191ef01\": rpc error: code = NotFound desc = could not find container \"4bc12a8ce5d674870d7a9cecfd018f28f1578b48e4cd38d0fd8d23c91191ef01\": container with ID starting with 4bc12a8ce5d674870d7a9cecfd018f28f1578b48e4cd38d0fd8d23c91191ef01 not found: ID does not exist" Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.136233 4962 scope.go:117] "RemoveContainer" containerID="5a284e290c5063b0cd126fa801f84c0313614338f12abf7857a941fc01e24c85" Oct 03 13:14:13 crc kubenswrapper[4962]: E1003 13:14:13.136557 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a284e290c5063b0cd126fa801f84c0313614338f12abf7857a941fc01e24c85\": container with ID starting with 5a284e290c5063b0cd126fa801f84c0313614338f12abf7857a941fc01e24c85 not found: ID does not exist" containerID="5a284e290c5063b0cd126fa801f84c0313614338f12abf7857a941fc01e24c85" Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.136578 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a284e290c5063b0cd126fa801f84c0313614338f12abf7857a941fc01e24c85"} err="failed to get container status \"5a284e290c5063b0cd126fa801f84c0313614338f12abf7857a941fc01e24c85\": rpc error: code = NotFound desc = could not find container \"5a284e290c5063b0cd126fa801f84c0313614338f12abf7857a941fc01e24c85\": container with ID starting with 5a284e290c5063b0cd126fa801f84c0313614338f12abf7857a941fc01e24c85 not found: ID does not exist" Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.136590 4962 scope.go:117] "RemoveContainer" containerID="8553cc57ed01a9336dd8d6356f82c1369e896bc320f9f46fe4371766926005d8" Oct 03 13:14:13 crc kubenswrapper[4962]: E1003 13:14:13.138187 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8553cc57ed01a9336dd8d6356f82c1369e896bc320f9f46fe4371766926005d8\": container with ID starting with 8553cc57ed01a9336dd8d6356f82c1369e896bc320f9f46fe4371766926005d8 not found: ID does not exist" 
containerID="8553cc57ed01a9336dd8d6356f82c1369e896bc320f9f46fe4371766926005d8" Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.138294 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8553cc57ed01a9336dd8d6356f82c1369e896bc320f9f46fe4371766926005d8"} err="failed to get container status \"8553cc57ed01a9336dd8d6356f82c1369e896bc320f9f46fe4371766926005d8\": rpc error: code = NotFound desc = could not find container \"8553cc57ed01a9336dd8d6356f82c1369e896bc320f9f46fe4371766926005d8\": container with ID starting with 8553cc57ed01a9336dd8d6356f82c1369e896bc320f9f46fe4371766926005d8 not found: ID does not exist" Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.296021 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.310687 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.320378 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 13:14:13 crc kubenswrapper[4962]: E1003 13:14:13.320807 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d947c9f-2cbc-428e-9464-ff2b560fe91c" containerName="sg-core" Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.320824 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d947c9f-2cbc-428e-9464-ff2b560fe91c" containerName="sg-core" Oct 03 13:14:13 crc kubenswrapper[4962]: E1003 13:14:13.320842 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d947c9f-2cbc-428e-9464-ff2b560fe91c" containerName="ceilometer-notification-agent" Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.320850 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d947c9f-2cbc-428e-9464-ff2b560fe91c" containerName="ceilometer-notification-agent" Oct 03 13:14:13 crc kubenswrapper[4962]: E1003 13:14:13.320864 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d947c9f-2cbc-428e-9464-ff2b560fe91c" containerName="ceilometer-central-agent" Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.320872 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d947c9f-2cbc-428e-9464-ff2b560fe91c" containerName="ceilometer-central-agent" Oct 03 13:14:13 crc kubenswrapper[4962]: E1003 13:14:13.320888 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d947c9f-2cbc-428e-9464-ff2b560fe91c" containerName="proxy-httpd" Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.320894 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d947c9f-2cbc-428e-9464-ff2b560fe91c" containerName="proxy-httpd" Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.321058 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d947c9f-2cbc-428e-9464-ff2b560fe91c" containerName="ceilometer-central-agent" Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.321080 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d947c9f-2cbc-428e-9464-ff2b560fe91c" containerName="ceilometer-notification-agent" Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.321092 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d947c9f-2cbc-428e-9464-ff2b560fe91c" containerName="sg-core" Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.321098 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d947c9f-2cbc-428e-9464-ff2b560fe91c" containerName="proxy-httpd" Oct 03 
13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.322701 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.325379 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.325981 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.329969 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.466588 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b3333ff-a67c-47f6-8805-c8fca4490391-log-httpd\") pod \"ceilometer-0\" (UID: \"9b3333ff-a67c-47f6-8805-c8fca4490391\") " pod="openstack/ceilometer-0" Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.466636 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlxdz\" (UniqueName: \"kubernetes.io/projected/9b3333ff-a67c-47f6-8805-c8fca4490391-kube-api-access-jlxdz\") pod \"ceilometer-0\" (UID: \"9b3333ff-a67c-47f6-8805-c8fca4490391\") " pod="openstack/ceilometer-0" Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.466694 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9b3333ff-a67c-47f6-8805-c8fca4490391-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9b3333ff-a67c-47f6-8805-c8fca4490391\") " pod="openstack/ceilometer-0" Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.466743 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b3333ff-a67c-47f6-8805-c8fca4490391-scripts\") pod \"ceilometer-0\" (UID: \"9b3333ff-a67c-47f6-8805-c8fca4490391\") " pod="openstack/ceilometer-0" Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.466771 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b3333ff-a67c-47f6-8805-c8fca4490391-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9b3333ff-a67c-47f6-8805-c8fca4490391\") " pod="openstack/ceilometer-0" Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.466818 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b3333ff-a67c-47f6-8805-c8fca4490391-run-httpd\") pod \"ceilometer-0\" (UID: \"9b3333ff-a67c-47f6-8805-c8fca4490391\") " pod="openstack/ceilometer-0" Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.466842 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b3333ff-a67c-47f6-8805-c8fca4490391-config-data\") pod \"ceilometer-0\" (UID: \"9b3333ff-a67c-47f6-8805-c8fca4490391\") " pod="openstack/ceilometer-0" Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.568547 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b3333ff-a67c-47f6-8805-c8fca4490391-config-data\") pod \"ceilometer-0\" (UID: 
\"9b3333ff-a67c-47f6-8805-c8fca4490391\") " pod="openstack/ceilometer-0" Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.569069 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b3333ff-a67c-47f6-8805-c8fca4490391-log-httpd\") pod \"ceilometer-0\" (UID: \"9b3333ff-a67c-47f6-8805-c8fca4490391\") " pod="openstack/ceilometer-0" Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.569101 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlxdz\" (UniqueName: \"kubernetes.io/projected/9b3333ff-a67c-47f6-8805-c8fca4490391-kube-api-access-jlxdz\") pod \"ceilometer-0\" (UID: \"9b3333ff-a67c-47f6-8805-c8fca4490391\") " pod="openstack/ceilometer-0" Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.569142 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9b3333ff-a67c-47f6-8805-c8fca4490391-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9b3333ff-a67c-47f6-8805-c8fca4490391\") " pod="openstack/ceilometer-0" Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.569192 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b3333ff-a67c-47f6-8805-c8fca4490391-scripts\") pod \"ceilometer-0\" (UID: \"9b3333ff-a67c-47f6-8805-c8fca4490391\") " pod="openstack/ceilometer-0" Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.569221 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b3333ff-a67c-47f6-8805-c8fca4490391-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9b3333ff-a67c-47f6-8805-c8fca4490391\") " pod="openstack/ceilometer-0" Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.569273 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b3333ff-a67c-47f6-8805-c8fca4490391-run-httpd\") pod \"ceilometer-0\" (UID: \"9b3333ff-a67c-47f6-8805-c8fca4490391\") " pod="openstack/ceilometer-0" Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.570376 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b3333ff-a67c-47f6-8805-c8fca4490391-log-httpd\") pod \"ceilometer-0\" (UID: \"9b3333ff-a67c-47f6-8805-c8fca4490391\") " pod="openstack/ceilometer-0" Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.570410 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b3333ff-a67c-47f6-8805-c8fca4490391-run-httpd\") pod \"ceilometer-0\" (UID: \"9b3333ff-a67c-47f6-8805-c8fca4490391\") " pod="openstack/ceilometer-0" Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.573209 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b3333ff-a67c-47f6-8805-c8fca4490391-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9b3333ff-a67c-47f6-8805-c8fca4490391\") " pod="openstack/ceilometer-0" Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.574094 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b3333ff-a67c-47f6-8805-c8fca4490391-config-data\") pod \"ceilometer-0\" (UID: \"9b3333ff-a67c-47f6-8805-c8fca4490391\") " 
pod="openstack/ceilometer-0" Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.574333 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b3333ff-a67c-47f6-8805-c8fca4490391-scripts\") pod \"ceilometer-0\" (UID: \"9b3333ff-a67c-47f6-8805-c8fca4490391\") " pod="openstack/ceilometer-0" Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.574405 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9b3333ff-a67c-47f6-8805-c8fca4490391-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9b3333ff-a67c-47f6-8805-c8fca4490391\") " pod="openstack/ceilometer-0" Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.588084 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlxdz\" (UniqueName: \"kubernetes.io/projected/9b3333ff-a67c-47f6-8805-c8fca4490391-kube-api-access-jlxdz\") pod \"ceilometer-0\" (UID: \"9b3333ff-a67c-47f6-8805-c8fca4490391\") " pod="openstack/ceilometer-0" Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.638734 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.969990 4962 generic.go:334] "Generic (PLEG): container finished" podID="b60070ef-b284-466e-b596-9ea66a2b8896" containerID="b7f72a7b32061e066fb4208e5ec51ccf97b4710abbb277d6651bfe67548f4f05" exitCode=0 Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.970039 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bz42z" event={"ID":"b60070ef-b284-466e-b596-9ea66a2b8896","Type":"ContainerDied","Data":"b7f72a7b32061e066fb4208e5ec51ccf97b4710abbb277d6651bfe67548f4f05"} Oct 03 13:14:13 crc kubenswrapper[4962]: I1003 13:14:13.970314 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bz42z" event={"ID":"b60070ef-b284-466e-b596-9ea66a2b8896","Type":"ContainerStarted","Data":"ded065bb223ef1e43793dd25e7d3a6e112a05eba87a19287fb95c67de9500154"} Oct 03 13:14:14 crc kubenswrapper[4962]: I1003 13:14:14.125231 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 13:14:14 crc kubenswrapper[4962]: I1003 13:14:14.265111 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d947c9f-2cbc-428e-9464-ff2b560fe91c" path="/var/lib/kubelet/pods/7d947c9f-2cbc-428e-9464-ff2b560fe91c/volumes" Oct 03 13:14:14 crc kubenswrapper[4962]: I1003 13:14:14.857675 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 13:14:14 crc kubenswrapper[4962]: I1003 13:14:14.983126 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bz42z" event={"ID":"b60070ef-b284-466e-b596-9ea66a2b8896","Type":"ContainerStarted","Data":"e72edc8d2cf384cfe5aa782722765c7fcd18f7135e0164d96ea8a353982a07ea"} Oct 03 13:14:14 crc kubenswrapper[4962]: I1003 13:14:14.986978 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9b3333ff-a67c-47f6-8805-c8fca4490391","Type":"ContainerStarted","Data":"afeae27701e2392705238b84ac6dd78fe0f664e305649d2d10e4ae08abff2124"} Oct 03 13:14:14 crc kubenswrapper[4962]: I1003 13:14:14.987029 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"9b3333ff-a67c-47f6-8805-c8fca4490391","Type":"ContainerStarted","Data":"d1e3e02355ee653e4b300355a19874e20536eb1c692817430098cef11a5a7920"} Oct 03 13:14:15 crc kubenswrapper[4962]: I1003 13:14:15.998574 4962 generic.go:334] "Generic (PLEG): container finished" podID="b60070ef-b284-466e-b596-9ea66a2b8896" containerID="e72edc8d2cf384cfe5aa782722765c7fcd18f7135e0164d96ea8a353982a07ea" exitCode=0 Oct 03 13:14:15 crc kubenswrapper[4962]: I1003 13:14:15.998670 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bz42z" event={"ID":"b60070ef-b284-466e-b596-9ea66a2b8896","Type":"ContainerDied","Data":"e72edc8d2cf384cfe5aa782722765c7fcd18f7135e0164d96ea8a353982a07ea"} Oct 03 13:14:16 crc kubenswrapper[4962]: I1003 13:14:16.003448 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9b3333ff-a67c-47f6-8805-c8fca4490391","Type":"ContainerStarted","Data":"b7ac0b38f7434d87f494b7d7ef5ac14815e6dda4af34aee9ddc66049c6444e15"} Oct 03 13:14:16 crc kubenswrapper[4962]: I1003 13:14:16.315070 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 03 13:14:16 crc kubenswrapper[4962]: I1003 13:14:16.719947 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mrtt5"] Oct 03 13:14:16 crc kubenswrapper[4962]: I1003 13:14:16.721263 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mrtt5" Oct 03 13:14:16 crc kubenswrapper[4962]: I1003 13:14:16.723650 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 03 13:14:16 crc kubenswrapper[4962]: I1003 13:14:16.723922 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-z92tv" Oct 03 13:14:16 crc kubenswrapper[4962]: I1003 13:14:16.726565 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 03 13:14:16 crc kubenswrapper[4962]: I1003 13:14:16.731358 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mrtt5"] Oct 03 13:14:16 crc kubenswrapper[4962]: I1003 13:14:16.836790 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48bf959f-6a35-4204-bd4d-6e0a62a2a7db-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-mrtt5\" (UID: \"48bf959f-6a35-4204-bd4d-6e0a62a2a7db\") " pod="openstack/nova-cell0-conductor-db-sync-mrtt5" Oct 03 13:14:16 crc kubenswrapper[4962]: I1003 13:14:16.836926 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4vd8\" (UniqueName: \"kubernetes.io/projected/48bf959f-6a35-4204-bd4d-6e0a62a2a7db-kube-api-access-z4vd8\") pod \"nova-cell0-conductor-db-sync-mrtt5\" (UID: \"48bf959f-6a35-4204-bd4d-6e0a62a2a7db\") " pod="openstack/nova-cell0-conductor-db-sync-mrtt5" Oct 03 13:14:16 crc kubenswrapper[4962]: I1003 13:14:16.836948 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48bf959f-6a35-4204-bd4d-6e0a62a2a7db-scripts\") pod \"nova-cell0-conductor-db-sync-mrtt5\" (UID: \"48bf959f-6a35-4204-bd4d-6e0a62a2a7db\") " pod="openstack/nova-cell0-conductor-db-sync-mrtt5" Oct 03 13:14:16 crc kubenswrapper[4962]: 
I1003 13:14:16.837132 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48bf959f-6a35-4204-bd4d-6e0a62a2a7db-config-data\") pod \"nova-cell0-conductor-db-sync-mrtt5\" (UID: \"48bf959f-6a35-4204-bd4d-6e0a62a2a7db\") " pod="openstack/nova-cell0-conductor-db-sync-mrtt5" Oct 03 13:14:16 crc kubenswrapper[4962]: I1003 13:14:16.938625 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48bf959f-6a35-4204-bd4d-6e0a62a2a7db-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-mrtt5\" (UID: \"48bf959f-6a35-4204-bd4d-6e0a62a2a7db\") " pod="openstack/nova-cell0-conductor-db-sync-mrtt5" Oct 03 13:14:16 crc kubenswrapper[4962]: I1003 13:14:16.939654 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4vd8\" (UniqueName: \"kubernetes.io/projected/48bf959f-6a35-4204-bd4d-6e0a62a2a7db-kube-api-access-z4vd8\") pod \"nova-cell0-conductor-db-sync-mrtt5\" (UID: \"48bf959f-6a35-4204-bd4d-6e0a62a2a7db\") " pod="openstack/nova-cell0-conductor-db-sync-mrtt5" Oct 03 13:14:16 crc kubenswrapper[4962]: I1003 13:14:16.939756 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48bf959f-6a35-4204-bd4d-6e0a62a2a7db-scripts\") pod \"nova-cell0-conductor-db-sync-mrtt5\" (UID: \"48bf959f-6a35-4204-bd4d-6e0a62a2a7db\") " pod="openstack/nova-cell0-conductor-db-sync-mrtt5" Oct 03 13:14:16 crc kubenswrapper[4962]: I1003 13:14:16.939900 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48bf959f-6a35-4204-bd4d-6e0a62a2a7db-config-data\") pod \"nova-cell0-conductor-db-sync-mrtt5\" (UID: \"48bf959f-6a35-4204-bd4d-6e0a62a2a7db\") " pod="openstack/nova-cell0-conductor-db-sync-mrtt5" Oct 03 13:14:16 crc kubenswrapper[4962]: I1003 13:14:16.947503 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48bf959f-6a35-4204-bd4d-6e0a62a2a7db-config-data\") pod \"nova-cell0-conductor-db-sync-mrtt5\" (UID: \"48bf959f-6a35-4204-bd4d-6e0a62a2a7db\") " pod="openstack/nova-cell0-conductor-db-sync-mrtt5" Oct 03 13:14:16 crc kubenswrapper[4962]: I1003 13:14:16.951078 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48bf959f-6a35-4204-bd4d-6e0a62a2a7db-scripts\") pod \"nova-cell0-conductor-db-sync-mrtt5\" (UID: \"48bf959f-6a35-4204-bd4d-6e0a62a2a7db\") " pod="openstack/nova-cell0-conductor-db-sync-mrtt5" Oct 03 13:14:16 crc kubenswrapper[4962]: I1003 13:14:16.952237 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48bf959f-6a35-4204-bd4d-6e0a62a2a7db-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-mrtt5\" (UID: \"48bf959f-6a35-4204-bd4d-6e0a62a2a7db\") " pod="openstack/nova-cell0-conductor-db-sync-mrtt5" Oct 03 13:14:16 crc kubenswrapper[4962]: I1003 13:14:16.960386 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4vd8\" (UniqueName: \"kubernetes.io/projected/48bf959f-6a35-4204-bd4d-6e0a62a2a7db-kube-api-access-z4vd8\") pod \"nova-cell0-conductor-db-sync-mrtt5\" (UID: \"48bf959f-6a35-4204-bd4d-6e0a62a2a7db\") " pod="openstack/nova-cell0-conductor-db-sync-mrtt5" Oct 03 13:14:17 crc 
kubenswrapper[4962]: I1003 13:14:17.023975 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9b3333ff-a67c-47f6-8805-c8fca4490391","Type":"ContainerStarted","Data":"5a8e43544a97f6761681dbd26673869cd71dac61e354530df7eaa74858d31f96"} Oct 03 13:14:17 crc kubenswrapper[4962]: I1003 13:14:17.026829 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bz42z" event={"ID":"b60070ef-b284-466e-b596-9ea66a2b8896","Type":"ContainerStarted","Data":"beb45caa0c59e894be38f7d7be2e77f4ab22727dc15cc852884e65cee70f25b1"} Oct 03 13:14:17 crc kubenswrapper[4962]: I1003 13:14:17.035453 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mrtt5" Oct 03 13:14:17 crc kubenswrapper[4962]: I1003 13:14:17.052039 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bz42z" podStartSLOduration=3.5701860869999997 podStartE2EDuration="6.05201704s" podCreationTimestamp="2025-10-03 13:14:11 +0000 UTC" firstStartedPulling="2025-10-03 13:14:13.973098599 +0000 UTC m=+1462.376996454" lastFinishedPulling="2025-10-03 13:14:16.454929572 +0000 UTC m=+1464.858827407" observedRunningTime="2025-10-03 13:14:17.047698114 +0000 UTC m=+1465.451595949" watchObservedRunningTime="2025-10-03 13:14:17.05201704 +0000 UTC m=+1465.455914885" Oct 03 13:14:17 crc kubenswrapper[4962]: I1003 13:14:17.521856 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mrtt5"] Oct 03 13:14:17 crc kubenswrapper[4962]: W1003 13:14:17.525752 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48bf959f_6a35_4204_bd4d_6e0a62a2a7db.slice/crio-b09acdbd8d77e3ae226872916bd1cc2f05c93ff72083dabb2bd42ae9868454b2 WatchSource:0}: Error finding container b09acdbd8d77e3ae226872916bd1cc2f05c93ff72083dabb2bd42ae9868454b2: Status 404 returned error can't find the container with id b09acdbd8d77e3ae226872916bd1cc2f05c93ff72083dabb2bd42ae9868454b2 Oct 03 13:14:18 crc kubenswrapper[4962]: I1003 13:14:18.039812 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9b3333ff-a67c-47f6-8805-c8fca4490391","Type":"ContainerStarted","Data":"356280c2756f819cd649a7676b0b8befdad38f7c7eda4ae9893947a5b73b5f11"} Oct 03 13:14:18 crc kubenswrapper[4962]: I1003 13:14:18.040048 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9b3333ff-a67c-47f6-8805-c8fca4490391" containerName="ceilometer-central-agent" containerID="cri-o://afeae27701e2392705238b84ac6dd78fe0f664e305649d2d10e4ae08abff2124" gracePeriod=30 Oct 03 13:14:18 crc kubenswrapper[4962]: I1003 13:14:18.040217 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 13:14:18 crc kubenswrapper[4962]: I1003 13:14:18.040278 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9b3333ff-a67c-47f6-8805-c8fca4490391" containerName="proxy-httpd" containerID="cri-o://356280c2756f819cd649a7676b0b8befdad38f7c7eda4ae9893947a5b73b5f11" gracePeriod=30 Oct 03 13:14:18 crc kubenswrapper[4962]: I1003 13:14:18.040338 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9b3333ff-a67c-47f6-8805-c8fca4490391" containerName="sg-core" 
containerID="cri-o://5a8e43544a97f6761681dbd26673869cd71dac61e354530df7eaa74858d31f96" gracePeriod=30 Oct 03 13:14:18 crc kubenswrapper[4962]: I1003 13:14:18.040399 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9b3333ff-a67c-47f6-8805-c8fca4490391" containerName="ceilometer-notification-agent" containerID="cri-o://b7ac0b38f7434d87f494b7d7ef5ac14815e6dda4af34aee9ddc66049c6444e15" gracePeriod=30 Oct 03 13:14:18 crc kubenswrapper[4962]: I1003 13:14:18.041526 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mrtt5" event={"ID":"48bf959f-6a35-4204-bd4d-6e0a62a2a7db","Type":"ContainerStarted","Data":"b09acdbd8d77e3ae226872916bd1cc2f05c93ff72083dabb2bd42ae9868454b2"} Oct 03 13:14:18 crc kubenswrapper[4962]: I1003 13:14:18.112232 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.956310589 podStartE2EDuration="5.112213886s" podCreationTimestamp="2025-10-03 13:14:13 +0000 UTC" firstStartedPulling="2025-10-03 13:14:14.12540311 +0000 UTC m=+1462.529300945" lastFinishedPulling="2025-10-03 13:14:17.281306407 +0000 UTC m=+1465.685204242" observedRunningTime="2025-10-03 13:14:18.057677981 +0000 UTC m=+1466.461575826" watchObservedRunningTime="2025-10-03 13:14:18.112213886 +0000 UTC m=+1466.516111721" Oct 03 13:14:18 crc kubenswrapper[4962]: I1003 13:14:18.258755 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 03 13:14:18 crc kubenswrapper[4962]: I1003 13:14:18.259088 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 03 13:14:18 crc kubenswrapper[4962]: I1003 13:14:18.300871 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 03 13:14:18 crc kubenswrapper[4962]: I1003 13:14:18.314551 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 03 13:14:19 crc kubenswrapper[4962]: I1003 13:14:19.055281 4962 generic.go:334] "Generic (PLEG): container finished" podID="9b3333ff-a67c-47f6-8805-c8fca4490391" containerID="356280c2756f819cd649a7676b0b8befdad38f7c7eda4ae9893947a5b73b5f11" exitCode=0 Oct 03 13:14:19 crc kubenswrapper[4962]: I1003 13:14:19.055319 4962 generic.go:334] "Generic (PLEG): container finished" podID="9b3333ff-a67c-47f6-8805-c8fca4490391" containerID="5a8e43544a97f6761681dbd26673869cd71dac61e354530df7eaa74858d31f96" exitCode=2 Oct 03 13:14:19 crc kubenswrapper[4962]: I1003 13:14:19.055334 4962 generic.go:334] "Generic (PLEG): container finished" podID="9b3333ff-a67c-47f6-8805-c8fca4490391" containerID="b7ac0b38f7434d87f494b7d7ef5ac14815e6dda4af34aee9ddc66049c6444e15" exitCode=0 Oct 03 13:14:19 crc kubenswrapper[4962]: I1003 13:14:19.056752 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9b3333ff-a67c-47f6-8805-c8fca4490391","Type":"ContainerDied","Data":"356280c2756f819cd649a7676b0b8befdad38f7c7eda4ae9893947a5b73b5f11"} Oct 03 13:14:19 crc kubenswrapper[4962]: I1003 13:14:19.056790 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9b3333ff-a67c-47f6-8805-c8fca4490391","Type":"ContainerDied","Data":"5a8e43544a97f6761681dbd26673869cd71dac61e354530df7eaa74858d31f96"} Oct 03 13:14:19 crc kubenswrapper[4962]: I1003 13:14:19.056820 
4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 03 13:14:19 crc kubenswrapper[4962]: I1003 13:14:19.056833 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9b3333ff-a67c-47f6-8805-c8fca4490391","Type":"ContainerDied","Data":"b7ac0b38f7434d87f494b7d7ef5ac14815e6dda4af34aee9ddc66049c6444e15"} Oct 03 13:14:19 crc kubenswrapper[4962]: I1003 13:14:19.056991 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 03 13:14:20 crc kubenswrapper[4962]: I1003 13:14:20.428368 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 03 13:14:20 crc kubenswrapper[4962]: I1003 13:14:20.428703 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 03 13:14:20 crc kubenswrapper[4962]: I1003 13:14:20.476315 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 03 13:14:20 crc kubenswrapper[4962]: I1003 13:14:20.480117 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 03 13:14:21 crc kubenswrapper[4962]: I1003 13:14:21.066294 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 03 13:14:21 crc kubenswrapper[4962]: I1003 13:14:21.068738 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 03 13:14:21 crc kubenswrapper[4962]: I1003 13:14:21.077392 4962 generic.go:334] "Generic (PLEG): container finished" podID="9b3333ff-a67c-47f6-8805-c8fca4490391" containerID="afeae27701e2392705238b84ac6dd78fe0f664e305649d2d10e4ae08abff2124" exitCode=0 Oct 03 13:14:21 crc kubenswrapper[4962]: I1003 13:14:21.077468 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9b3333ff-a67c-47f6-8805-c8fca4490391","Type":"ContainerDied","Data":"afeae27701e2392705238b84ac6dd78fe0f664e305649d2d10e4ae08abff2124"} Oct 03 13:14:21 crc kubenswrapper[4962]: I1003 13:14:21.077892 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 03 13:14:21 crc kubenswrapper[4962]: I1003 13:14:21.077921 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 03 13:14:22 crc kubenswrapper[4962]: E1003 13:14:22.295921 4962 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b9e6c89_714e_4efd_9adc_c15cd5b3eb6b.slice\": RecentStats: unable to find data in memory cache]" Oct 03 13:14:22 crc kubenswrapper[4962]: I1003 13:14:22.314507 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bz42z" Oct 03 13:14:22 crc kubenswrapper[4962]: I1003 13:14:22.315091 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bz42z" Oct 03 13:14:22 crc kubenswrapper[4962]: I1003 13:14:22.380945 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bz42z" Oct 03 13:14:23 crc kubenswrapper[4962]: I1003 
13:14:23.212113 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bz42z" Oct 03 13:14:23 crc kubenswrapper[4962]: I1003 13:14:23.305833 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bz42z"] Oct 03 13:14:23 crc kubenswrapper[4962]: I1003 13:14:23.430316 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 03 13:14:23 crc kubenswrapper[4962]: I1003 13:14:23.430425 4962 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 13:14:23 crc kubenswrapper[4962]: I1003 13:14:23.433490 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 03 13:14:24 crc kubenswrapper[4962]: I1003 13:14:24.659685 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 13:14:24 crc kubenswrapper[4962]: I1003 13:14:24.659830 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.069211 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.111740 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b3333ff-a67c-47f6-8805-c8fca4490391-run-httpd\") pod \"9b3333ff-a67c-47f6-8805-c8fca4490391\" (UID: \"9b3333ff-a67c-47f6-8805-c8fca4490391\") " Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.112899 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b3333ff-a67c-47f6-8805-c8fca4490391-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9b3333ff-a67c-47f6-8805-c8fca4490391" (UID: "9b3333ff-a67c-47f6-8805-c8fca4490391"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.113000 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b3333ff-a67c-47f6-8805-c8fca4490391-config-data\") pod \"9b3333ff-a67c-47f6-8805-c8fca4490391\" (UID: \"9b3333ff-a67c-47f6-8805-c8fca4490391\") " Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.113069 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b3333ff-a67c-47f6-8805-c8fca4490391-log-httpd\") pod \"9b3333ff-a67c-47f6-8805-c8fca4490391\" (UID: \"9b3333ff-a67c-47f6-8805-c8fca4490391\") " Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.113094 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9b3333ff-a67c-47f6-8805-c8fca4490391-sg-core-conf-yaml\") pod \"9b3333ff-a67c-47f6-8805-c8fca4490391\" (UID: \"9b3333ff-a67c-47f6-8805-c8fca4490391\") " Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.113132 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlxdz\" (UniqueName: \"kubernetes.io/projected/9b3333ff-a67c-47f6-8805-c8fca4490391-kube-api-access-jlxdz\") pod \"9b3333ff-a67c-47f6-8805-c8fca4490391\" (UID: \"9b3333ff-a67c-47f6-8805-c8fca4490391\") " Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.113164 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b3333ff-a67c-47f6-8805-c8fca4490391-scripts\") pod \"9b3333ff-a67c-47f6-8805-c8fca4490391\" (UID: \"9b3333ff-a67c-47f6-8805-c8fca4490391\") " Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.113212 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b3333ff-a67c-47f6-8805-c8fca4490391-combined-ca-bundle\") pod \"9b3333ff-a67c-47f6-8805-c8fca4490391\" (UID: \"9b3333ff-a67c-47f6-8805-c8fca4490391\") " Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.113788 4962 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b3333ff-a67c-47f6-8805-c8fca4490391-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.115264 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b3333ff-a67c-47f6-8805-c8fca4490391-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9b3333ff-a67c-47f6-8805-c8fca4490391" (UID: "9b3333ff-a67c-47f6-8805-c8fca4490391"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.118608 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b3333ff-a67c-47f6-8805-c8fca4490391-kube-api-access-jlxdz" (OuterVolumeSpecName: "kube-api-access-jlxdz") pod "9b3333ff-a67c-47f6-8805-c8fca4490391" (UID: "9b3333ff-a67c-47f6-8805-c8fca4490391"). InnerVolumeSpecName "kube-api-access-jlxdz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.124004 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b3333ff-a67c-47f6-8805-c8fca4490391-scripts" (OuterVolumeSpecName: "scripts") pod "9b3333ff-a67c-47f6-8805-c8fca4490391" (UID: "9b3333ff-a67c-47f6-8805-c8fca4490391"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.127360 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9b3333ff-a67c-47f6-8805-c8fca4490391","Type":"ContainerDied","Data":"d1e3e02355ee653e4b300355a19874e20536eb1c692817430098cef11a5a7920"} Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.127412 4962 scope.go:117] "RemoveContainer" containerID="356280c2756f819cd649a7676b0b8befdad38f7c7eda4ae9893947a5b73b5f11" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.127419 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bz42z" podUID="b60070ef-b284-466e-b596-9ea66a2b8896" containerName="registry-server" containerID="cri-o://beb45caa0c59e894be38f7d7be2e77f4ab22727dc15cc852884e65cee70f25b1" gracePeriod=2 Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.127765 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.156693 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b3333ff-a67c-47f6-8805-c8fca4490391-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9b3333ff-a67c-47f6-8805-c8fca4490391" (UID: "9b3333ff-a67c-47f6-8805-c8fca4490391"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.200270 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b3333ff-a67c-47f6-8805-c8fca4490391-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b3333ff-a67c-47f6-8805-c8fca4490391" (UID: "9b3333ff-a67c-47f6-8805-c8fca4490391"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.215786 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b3333ff-a67c-47f6-8805-c8fca4490391-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.215815 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b3333ff-a67c-47f6-8805-c8fca4490391-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.215826 4962 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b3333ff-a67c-47f6-8805-c8fca4490391-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.215835 4962 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9b3333ff-a67c-47f6-8805-c8fca4490391-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.215844 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlxdz\" (UniqueName: \"kubernetes.io/projected/9b3333ff-a67c-47f6-8805-c8fca4490391-kube-api-access-jlxdz\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.232568 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b3333ff-a67c-47f6-8805-c8fca4490391-config-data" (OuterVolumeSpecName: "config-data") pod "9b3333ff-a67c-47f6-8805-c8fca4490391" (UID: "9b3333ff-a67c-47f6-8805-c8fca4490391"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.233515 4962 scope.go:117] "RemoveContainer" containerID="5a8e43544a97f6761681dbd26673869cd71dac61e354530df7eaa74858d31f96" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.250481 4962 scope.go:117] "RemoveContainer" containerID="b7ac0b38f7434d87f494b7d7ef5ac14815e6dda4af34aee9ddc66049c6444e15" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.279778 4962 scope.go:117] "RemoveContainer" containerID="afeae27701e2392705238b84ac6dd78fe0f664e305649d2d10e4ae08abff2124" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.317458 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b3333ff-a67c-47f6-8805-c8fca4490391-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.485855 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.498799 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.508380 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 13:14:25 crc kubenswrapper[4962]: E1003 13:14:25.508812 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b3333ff-a67c-47f6-8805-c8fca4490391" containerName="proxy-httpd" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.508832 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b3333ff-a67c-47f6-8805-c8fca4490391" containerName="proxy-httpd" Oct 03 13:14:25 crc kubenswrapper[4962]: E1003 13:14:25.508847 4962 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="9b3333ff-a67c-47f6-8805-c8fca4490391" containerName="ceilometer-central-agent" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.508854 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b3333ff-a67c-47f6-8805-c8fca4490391" containerName="ceilometer-central-agent" Oct 03 13:14:25 crc kubenswrapper[4962]: E1003 13:14:25.508873 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b3333ff-a67c-47f6-8805-c8fca4490391" containerName="ceilometer-notification-agent" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.508879 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b3333ff-a67c-47f6-8805-c8fca4490391" containerName="ceilometer-notification-agent" Oct 03 13:14:25 crc kubenswrapper[4962]: E1003 13:14:25.508892 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b3333ff-a67c-47f6-8805-c8fca4490391" containerName="sg-core" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.508897 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b3333ff-a67c-47f6-8805-c8fca4490391" containerName="sg-core" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.509051 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b3333ff-a67c-47f6-8805-c8fca4490391" containerName="ceilometer-notification-agent" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.509062 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b3333ff-a67c-47f6-8805-c8fca4490391" containerName="ceilometer-central-agent" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.509080 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b3333ff-a67c-47f6-8805-c8fca4490391" containerName="proxy-httpd" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.509092 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b3333ff-a67c-47f6-8805-c8fca4490391" containerName="sg-core" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.511481 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.515150 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.516277 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.530093 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.563668 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bz42z" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.622868 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b60070ef-b284-466e-b596-9ea66a2b8896-utilities\") pod \"b60070ef-b284-466e-b596-9ea66a2b8896\" (UID: \"b60070ef-b284-466e-b596-9ea66a2b8896\") " Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.623015 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b60070ef-b284-466e-b596-9ea66a2b8896-catalog-content\") pod \"b60070ef-b284-466e-b596-9ea66a2b8896\" (UID: \"b60070ef-b284-466e-b596-9ea66a2b8896\") " Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.623059 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfvqr\" (UniqueName: \"kubernetes.io/projected/b60070ef-b284-466e-b596-9ea66a2b8896-kube-api-access-mfvqr\") pod \"b60070ef-b284-466e-b596-9ea66a2b8896\" (UID: \"b60070ef-b284-466e-b596-9ea66a2b8896\") " Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.623320 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/907629fb-4025-4486-be7b-c511c22fc6c1-run-httpd\") pod \"ceilometer-0\" (UID: \"907629fb-4025-4486-be7b-c511c22fc6c1\") " pod="openstack/ceilometer-0" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.623344 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/907629fb-4025-4486-be7b-c511c22fc6c1-config-data\") pod \"ceilometer-0\" (UID: \"907629fb-4025-4486-be7b-c511c22fc6c1\") " pod="openstack/ceilometer-0" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.623371 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/907629fb-4025-4486-be7b-c511c22fc6c1-scripts\") pod \"ceilometer-0\" (UID: \"907629fb-4025-4486-be7b-c511c22fc6c1\") " pod="openstack/ceilometer-0" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.623444 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/907629fb-4025-4486-be7b-c511c22fc6c1-log-httpd\") pod \"ceilometer-0\" (UID: \"907629fb-4025-4486-be7b-c511c22fc6c1\") " pod="openstack/ceilometer-0" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.623464 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/907629fb-4025-4486-be7b-c511c22fc6c1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"907629fb-4025-4486-be7b-c511c22fc6c1\") " pod="openstack/ceilometer-0" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.623482 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/907629fb-4025-4486-be7b-c511c22fc6c1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"907629fb-4025-4486-be7b-c511c22fc6c1\") " pod="openstack/ceilometer-0" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.623497 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5lgf\" 
(UniqueName: \"kubernetes.io/projected/907629fb-4025-4486-be7b-c511c22fc6c1-kube-api-access-w5lgf\") pod \"ceilometer-0\" (UID: \"907629fb-4025-4486-be7b-c511c22fc6c1\") " pod="openstack/ceilometer-0" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.623703 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b60070ef-b284-466e-b596-9ea66a2b8896-utilities" (OuterVolumeSpecName: "utilities") pod "b60070ef-b284-466e-b596-9ea66a2b8896" (UID: "b60070ef-b284-466e-b596-9ea66a2b8896"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.627344 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b60070ef-b284-466e-b596-9ea66a2b8896-kube-api-access-mfvqr" (OuterVolumeSpecName: "kube-api-access-mfvqr") pod "b60070ef-b284-466e-b596-9ea66a2b8896" (UID: "b60070ef-b284-466e-b596-9ea66a2b8896"). InnerVolumeSpecName "kube-api-access-mfvqr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.670142 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b60070ef-b284-466e-b596-9ea66a2b8896-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b60070ef-b284-466e-b596-9ea66a2b8896" (UID: "b60070ef-b284-466e-b596-9ea66a2b8896"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.724592 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/907629fb-4025-4486-be7b-c511c22fc6c1-run-httpd\") pod \"ceilometer-0\" (UID: \"907629fb-4025-4486-be7b-c511c22fc6c1\") " pod="openstack/ceilometer-0" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.724664 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/907629fb-4025-4486-be7b-c511c22fc6c1-config-data\") pod \"ceilometer-0\" (UID: \"907629fb-4025-4486-be7b-c511c22fc6c1\") " pod="openstack/ceilometer-0" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.724699 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/907629fb-4025-4486-be7b-c511c22fc6c1-scripts\") pod \"ceilometer-0\" (UID: \"907629fb-4025-4486-be7b-c511c22fc6c1\") " pod="openstack/ceilometer-0" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.724802 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/907629fb-4025-4486-be7b-c511c22fc6c1-log-httpd\") pod \"ceilometer-0\" (UID: \"907629fb-4025-4486-be7b-c511c22fc6c1\") " pod="openstack/ceilometer-0" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.724829 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/907629fb-4025-4486-be7b-c511c22fc6c1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"907629fb-4025-4486-be7b-c511c22fc6c1\") " pod="openstack/ceilometer-0" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.724856 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/907629fb-4025-4486-be7b-c511c22fc6c1-combined-ca-bundle\") pod 
\"ceilometer-0\" (UID: \"907629fb-4025-4486-be7b-c511c22fc6c1\") " pod="openstack/ceilometer-0" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.724880 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5lgf\" (UniqueName: \"kubernetes.io/projected/907629fb-4025-4486-be7b-c511c22fc6c1-kube-api-access-w5lgf\") pod \"ceilometer-0\" (UID: \"907629fb-4025-4486-be7b-c511c22fc6c1\") " pod="openstack/ceilometer-0" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.725002 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfvqr\" (UniqueName: \"kubernetes.io/projected/b60070ef-b284-466e-b596-9ea66a2b8896-kube-api-access-mfvqr\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.725020 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b60070ef-b284-466e-b596-9ea66a2b8896-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.725033 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b60070ef-b284-466e-b596-9ea66a2b8896-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.725251 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/907629fb-4025-4486-be7b-c511c22fc6c1-log-httpd\") pod \"ceilometer-0\" (UID: \"907629fb-4025-4486-be7b-c511c22fc6c1\") " pod="openstack/ceilometer-0" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.725430 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/907629fb-4025-4486-be7b-c511c22fc6c1-run-httpd\") pod \"ceilometer-0\" (UID: \"907629fb-4025-4486-be7b-c511c22fc6c1\") " pod="openstack/ceilometer-0" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.728096 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/907629fb-4025-4486-be7b-c511c22fc6c1-config-data\") pod \"ceilometer-0\" (UID: \"907629fb-4025-4486-be7b-c511c22fc6c1\") " pod="openstack/ceilometer-0" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.728249 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/907629fb-4025-4486-be7b-c511c22fc6c1-scripts\") pod \"ceilometer-0\" (UID: \"907629fb-4025-4486-be7b-c511c22fc6c1\") " pod="openstack/ceilometer-0" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.729380 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/907629fb-4025-4486-be7b-c511c22fc6c1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"907629fb-4025-4486-be7b-c511c22fc6c1\") " pod="openstack/ceilometer-0" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.729490 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/907629fb-4025-4486-be7b-c511c22fc6c1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"907629fb-4025-4486-be7b-c511c22fc6c1\") " pod="openstack/ceilometer-0" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.740989 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5lgf\" (UniqueName: 
\"kubernetes.io/projected/907629fb-4025-4486-be7b-c511c22fc6c1-kube-api-access-w5lgf\") pod \"ceilometer-0\" (UID: \"907629fb-4025-4486-be7b-c511c22fc6c1\") " pod="openstack/ceilometer-0" Oct 03 13:14:25 crc kubenswrapper[4962]: I1003 13:14:25.839422 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 13:14:26 crc kubenswrapper[4962]: I1003 13:14:26.137277 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mrtt5" event={"ID":"48bf959f-6a35-4204-bd4d-6e0a62a2a7db","Type":"ContainerStarted","Data":"65b272d77ba32c51dd9d80e7600ed36904dbcfecda99b28940a7c118dbf10fee"} Oct 03 13:14:26 crc kubenswrapper[4962]: I1003 13:14:26.140680 4962 generic.go:334] "Generic (PLEG): container finished" podID="b60070ef-b284-466e-b596-9ea66a2b8896" containerID="beb45caa0c59e894be38f7d7be2e77f4ab22727dc15cc852884e65cee70f25b1" exitCode=0 Oct 03 13:14:26 crc kubenswrapper[4962]: I1003 13:14:26.140729 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bz42z" event={"ID":"b60070ef-b284-466e-b596-9ea66a2b8896","Type":"ContainerDied","Data":"beb45caa0c59e894be38f7d7be2e77f4ab22727dc15cc852884e65cee70f25b1"} Oct 03 13:14:26 crc kubenswrapper[4962]: I1003 13:14:26.140785 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bz42z" event={"ID":"b60070ef-b284-466e-b596-9ea66a2b8896","Type":"ContainerDied","Data":"ded065bb223ef1e43793dd25e7d3a6e112a05eba87a19287fb95c67de9500154"} Oct 03 13:14:26 crc kubenswrapper[4962]: I1003 13:14:26.140809 4962 scope.go:117] "RemoveContainer" containerID="beb45caa0c59e894be38f7d7be2e77f4ab22727dc15cc852884e65cee70f25b1" Oct 03 13:14:26 crc kubenswrapper[4962]: I1003 13:14:26.140745 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bz42z" Oct 03 13:14:26 crc kubenswrapper[4962]: I1003 13:14:26.154677 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-mrtt5" podStartSLOduration=2.747069799 podStartE2EDuration="10.154620505s" podCreationTimestamp="2025-10-03 13:14:16 +0000 UTC" firstStartedPulling="2025-10-03 13:14:17.530235574 +0000 UTC m=+1465.934133409" lastFinishedPulling="2025-10-03 13:14:24.93778626 +0000 UTC m=+1473.341684115" observedRunningTime="2025-10-03 13:14:26.154149332 +0000 UTC m=+1474.558047167" watchObservedRunningTime="2025-10-03 13:14:26.154620505 +0000 UTC m=+1474.558518340" Oct 03 13:14:26 crc kubenswrapper[4962]: I1003 13:14:26.164593 4962 scope.go:117] "RemoveContainer" containerID="e72edc8d2cf384cfe5aa782722765c7fcd18f7135e0164d96ea8a353982a07ea" Oct 03 13:14:26 crc kubenswrapper[4962]: I1003 13:14:26.192175 4962 scope.go:117] "RemoveContainer" containerID="b7f72a7b32061e066fb4208e5ec51ccf97b4710abbb277d6651bfe67548f4f05" Oct 03 13:14:26 crc kubenswrapper[4962]: I1003 13:14:26.192820 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bz42z"] Oct 03 13:14:26 crc kubenswrapper[4962]: I1003 13:14:26.207657 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bz42z"] Oct 03 13:14:26 crc kubenswrapper[4962]: I1003 13:14:26.238236 4962 scope.go:117] "RemoveContainer" containerID="beb45caa0c59e894be38f7d7be2e77f4ab22727dc15cc852884e65cee70f25b1" Oct 03 13:14:26 crc kubenswrapper[4962]: I1003 13:14:26.238414 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b3333ff-a67c-47f6-8805-c8fca4490391" path="/var/lib/kubelet/pods/9b3333ff-a67c-47f6-8805-c8fca4490391/volumes" Oct 03 13:14:26 crc kubenswrapper[4962]: I1003 13:14:26.239113 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b60070ef-b284-466e-b596-9ea66a2b8896" path="/var/lib/kubelet/pods/b60070ef-b284-466e-b596-9ea66a2b8896/volumes" Oct 03 13:14:26 crc kubenswrapper[4962]: E1003 13:14:26.239940 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"beb45caa0c59e894be38f7d7be2e77f4ab22727dc15cc852884e65cee70f25b1\": container with ID starting with beb45caa0c59e894be38f7d7be2e77f4ab22727dc15cc852884e65cee70f25b1 not found: ID does not exist" containerID="beb45caa0c59e894be38f7d7be2e77f4ab22727dc15cc852884e65cee70f25b1" Oct 03 13:14:26 crc kubenswrapper[4962]: I1003 13:14:26.239969 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"beb45caa0c59e894be38f7d7be2e77f4ab22727dc15cc852884e65cee70f25b1"} err="failed to get container status \"beb45caa0c59e894be38f7d7be2e77f4ab22727dc15cc852884e65cee70f25b1\": rpc error: code = NotFound desc = could not find container \"beb45caa0c59e894be38f7d7be2e77f4ab22727dc15cc852884e65cee70f25b1\": container with ID starting with beb45caa0c59e894be38f7d7be2e77f4ab22727dc15cc852884e65cee70f25b1 not found: ID does not exist" Oct 03 13:14:26 crc kubenswrapper[4962]: I1003 13:14:26.239988 4962 scope.go:117] "RemoveContainer" containerID="e72edc8d2cf384cfe5aa782722765c7fcd18f7135e0164d96ea8a353982a07ea" Oct 03 13:14:26 crc kubenswrapper[4962]: E1003 13:14:26.240211 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e72edc8d2cf384cfe5aa782722765c7fcd18f7135e0164d96ea8a353982a07ea\": container with ID starting with e72edc8d2cf384cfe5aa782722765c7fcd18f7135e0164d96ea8a353982a07ea not found: ID does not exist" containerID="e72edc8d2cf384cfe5aa782722765c7fcd18f7135e0164d96ea8a353982a07ea" Oct 03 13:14:26 crc kubenswrapper[4962]: I1003 13:14:26.240230 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e72edc8d2cf384cfe5aa782722765c7fcd18f7135e0164d96ea8a353982a07ea"} err="failed to get container status \"e72edc8d2cf384cfe5aa782722765c7fcd18f7135e0164d96ea8a353982a07ea\": rpc error: code = NotFound desc = could not find container \"e72edc8d2cf384cfe5aa782722765c7fcd18f7135e0164d96ea8a353982a07ea\": container with ID starting with e72edc8d2cf384cfe5aa782722765c7fcd18f7135e0164d96ea8a353982a07ea not found: ID does not exist" Oct 03 13:14:26 crc kubenswrapper[4962]: I1003 13:14:26.240245 4962 scope.go:117] "RemoveContainer" containerID="b7f72a7b32061e066fb4208e5ec51ccf97b4710abbb277d6651bfe67548f4f05" Oct 03 13:14:26 crc kubenswrapper[4962]: E1003 13:14:26.240512 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7f72a7b32061e066fb4208e5ec51ccf97b4710abbb277d6651bfe67548f4f05\": container with ID starting with b7f72a7b32061e066fb4208e5ec51ccf97b4710abbb277d6651bfe67548f4f05 not found: ID does not exist" containerID="b7f72a7b32061e066fb4208e5ec51ccf97b4710abbb277d6651bfe67548f4f05" Oct 03 13:14:26 crc kubenswrapper[4962]: I1003 13:14:26.240551 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7f72a7b32061e066fb4208e5ec51ccf97b4710abbb277d6651bfe67548f4f05"} err="failed to get container status \"b7f72a7b32061e066fb4208e5ec51ccf97b4710abbb277d6651bfe67548f4f05\": rpc error: code = NotFound desc = could not find container \"b7f72a7b32061e066fb4208e5ec51ccf97b4710abbb277d6651bfe67548f4f05\": container with ID starting with b7f72a7b32061e066fb4208e5ec51ccf97b4710abbb277d6651bfe67548f4f05 not found: ID does not exist" Oct 03 13:14:26 crc kubenswrapper[4962]: I1003 13:14:26.288674 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 13:14:26 crc kubenswrapper[4962]: W1003 13:14:26.299180 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod907629fb_4025_4486_be7b_c511c22fc6c1.slice/crio-13fbdbe7f154d3138a3d07bf1a1c7880686ab6ec24311f10e66c82f6424f4ee2 WatchSource:0}: Error finding container 13fbdbe7f154d3138a3d07bf1a1c7880686ab6ec24311f10e66c82f6424f4ee2: Status 404 returned error can't find the container with id 13fbdbe7f154d3138a3d07bf1a1c7880686ab6ec24311f10e66c82f6424f4ee2 Oct 03 13:14:27 crc kubenswrapper[4962]: I1003 13:14:27.149862 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"907629fb-4025-4486-be7b-c511c22fc6c1","Type":"ContainerStarted","Data":"5a5902c05c8e657a20b775cb27fd17cc36f8f83f75daa55624bc15babc995776"} Oct 03 13:14:27 crc kubenswrapper[4962]: I1003 13:14:27.150162 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"907629fb-4025-4486-be7b-c511c22fc6c1","Type":"ContainerStarted","Data":"13fbdbe7f154d3138a3d07bf1a1c7880686ab6ec24311f10e66c82f6424f4ee2"} Oct 03 13:14:28 crc kubenswrapper[4962]: I1003 13:14:28.161977 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"907629fb-4025-4486-be7b-c511c22fc6c1","Type":"ContainerStarted","Data":"a46165c3bc28aa8609cfce450481aa24eee37022bd65ea99246753d66b764c40"} Oct 03 13:14:28 crc kubenswrapper[4962]: I1003 13:14:28.162591 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"907629fb-4025-4486-be7b-c511c22fc6c1","Type":"ContainerStarted","Data":"3c88043411cf3aeda492736b07400572227f3a8837c21ef313ad45fdb1d831d3"} Oct 03 13:14:30 crc kubenswrapper[4962]: I1003 13:14:30.181055 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"907629fb-4025-4486-be7b-c511c22fc6c1","Type":"ContainerStarted","Data":"60f2bcc003ebe0b2d772a16ef7e46868e468152e0a2969be9e595bcf65025779"} Oct 03 13:14:30 crc kubenswrapper[4962]: I1003 13:14:30.181703 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 13:14:30 crc kubenswrapper[4962]: I1003 13:14:30.211518 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.3003897 podStartE2EDuration="5.211488172s" podCreationTimestamp="2025-10-03 13:14:25 +0000 UTC" firstStartedPulling="2025-10-03 13:14:26.301400447 +0000 UTC m=+1474.705298282" lastFinishedPulling="2025-10-03 13:14:29.212498919 +0000 UTC m=+1477.616396754" observedRunningTime="2025-10-03 13:14:30.205723547 +0000 UTC m=+1478.609621432" watchObservedRunningTime="2025-10-03 13:14:30.211488172 +0000 UTC m=+1478.615386017" Oct 03 13:14:32 crc kubenswrapper[4962]: E1003 13:14:32.549464 4962 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b9e6c89_714e_4efd_9adc_c15cd5b3eb6b.slice\": RecentStats: unable to find data in memory cache]" Oct 03 13:14:34 crc kubenswrapper[4962]: I1003 13:14:34.218959 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mrtt5" event={"ID":"48bf959f-6a35-4204-bd4d-6e0a62a2a7db","Type":"ContainerDied","Data":"65b272d77ba32c51dd9d80e7600ed36904dbcfecda99b28940a7c118dbf10fee"} Oct 03 13:14:34 crc kubenswrapper[4962]: I1003 13:14:34.219032 4962 generic.go:334] "Generic (PLEG): container finished" podID="48bf959f-6a35-4204-bd4d-6e0a62a2a7db" containerID="65b272d77ba32c51dd9d80e7600ed36904dbcfecda99b28940a7c118dbf10fee" exitCode=0 Oct 03 13:14:35 crc kubenswrapper[4962]: I1003 13:14:35.589173 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mrtt5" Oct 03 13:14:35 crc kubenswrapper[4962]: I1003 13:14:35.749729 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4vd8\" (UniqueName: \"kubernetes.io/projected/48bf959f-6a35-4204-bd4d-6e0a62a2a7db-kube-api-access-z4vd8\") pod \"48bf959f-6a35-4204-bd4d-6e0a62a2a7db\" (UID: \"48bf959f-6a35-4204-bd4d-6e0a62a2a7db\") " Oct 03 13:14:35 crc kubenswrapper[4962]: I1003 13:14:35.749882 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48bf959f-6a35-4204-bd4d-6e0a62a2a7db-scripts\") pod \"48bf959f-6a35-4204-bd4d-6e0a62a2a7db\" (UID: \"48bf959f-6a35-4204-bd4d-6e0a62a2a7db\") " Oct 03 13:14:35 crc kubenswrapper[4962]: I1003 13:14:35.750085 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48bf959f-6a35-4204-bd4d-6e0a62a2a7db-config-data\") pod \"48bf959f-6a35-4204-bd4d-6e0a62a2a7db\" (UID: \"48bf959f-6a35-4204-bd4d-6e0a62a2a7db\") " Oct 03 13:14:35 crc kubenswrapper[4962]: I1003 13:14:35.750193 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48bf959f-6a35-4204-bd4d-6e0a62a2a7db-combined-ca-bundle\") pod \"48bf959f-6a35-4204-bd4d-6e0a62a2a7db\" (UID: \"48bf959f-6a35-4204-bd4d-6e0a62a2a7db\") " Oct 03 13:14:35 crc kubenswrapper[4962]: I1003 13:14:35.757355 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48bf959f-6a35-4204-bd4d-6e0a62a2a7db-kube-api-access-z4vd8" (OuterVolumeSpecName: "kube-api-access-z4vd8") pod "48bf959f-6a35-4204-bd4d-6e0a62a2a7db" (UID: "48bf959f-6a35-4204-bd4d-6e0a62a2a7db"). InnerVolumeSpecName "kube-api-access-z4vd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:14:35 crc kubenswrapper[4962]: I1003 13:14:35.757860 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48bf959f-6a35-4204-bd4d-6e0a62a2a7db-scripts" (OuterVolumeSpecName: "scripts") pod "48bf959f-6a35-4204-bd4d-6e0a62a2a7db" (UID: "48bf959f-6a35-4204-bd4d-6e0a62a2a7db"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:14:35 crc kubenswrapper[4962]: I1003 13:14:35.789303 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48bf959f-6a35-4204-bd4d-6e0a62a2a7db-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48bf959f-6a35-4204-bd4d-6e0a62a2a7db" (UID: "48bf959f-6a35-4204-bd4d-6e0a62a2a7db"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:14:35 crc kubenswrapper[4962]: I1003 13:14:35.809709 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48bf959f-6a35-4204-bd4d-6e0a62a2a7db-config-data" (OuterVolumeSpecName: "config-data") pod "48bf959f-6a35-4204-bd4d-6e0a62a2a7db" (UID: "48bf959f-6a35-4204-bd4d-6e0a62a2a7db"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:14:35 crc kubenswrapper[4962]: I1003 13:14:35.852614 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48bf959f-6a35-4204-bd4d-6e0a62a2a7db-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:35 crc kubenswrapper[4962]: I1003 13:14:35.852693 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48bf959f-6a35-4204-bd4d-6e0a62a2a7db-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:35 crc kubenswrapper[4962]: I1003 13:14:35.852715 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48bf959f-6a35-4204-bd4d-6e0a62a2a7db-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:35 crc kubenswrapper[4962]: I1003 13:14:35.852737 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4vd8\" (UniqueName: \"kubernetes.io/projected/48bf959f-6a35-4204-bd4d-6e0a62a2a7db-kube-api-access-z4vd8\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:36 crc kubenswrapper[4962]: I1003 13:14:36.236447 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mrtt5" Oct 03 13:14:36 crc kubenswrapper[4962]: I1003 13:14:36.237917 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mrtt5" event={"ID":"48bf959f-6a35-4204-bd4d-6e0a62a2a7db","Type":"ContainerDied","Data":"b09acdbd8d77e3ae226872916bd1cc2f05c93ff72083dabb2bd42ae9868454b2"} Oct 03 13:14:36 crc kubenswrapper[4962]: I1003 13:14:36.237966 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b09acdbd8d77e3ae226872916bd1cc2f05c93ff72083dabb2bd42ae9868454b2" Oct 03 13:14:36 crc kubenswrapper[4962]: I1003 13:14:36.320789 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 13:14:36 crc kubenswrapper[4962]: E1003 13:14:36.321172 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b60070ef-b284-466e-b596-9ea66a2b8896" containerName="extract-content" Oct 03 13:14:36 crc kubenswrapper[4962]: I1003 13:14:36.321189 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b60070ef-b284-466e-b596-9ea66a2b8896" containerName="extract-content" Oct 03 13:14:36 crc kubenswrapper[4962]: E1003 13:14:36.321207 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48bf959f-6a35-4204-bd4d-6e0a62a2a7db" containerName="nova-cell0-conductor-db-sync" Oct 03 13:14:36 crc kubenswrapper[4962]: I1003 13:14:36.321213 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="48bf959f-6a35-4204-bd4d-6e0a62a2a7db" containerName="nova-cell0-conductor-db-sync" Oct 03 13:14:36 crc kubenswrapper[4962]: E1003 13:14:36.321227 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b60070ef-b284-466e-b596-9ea66a2b8896" containerName="registry-server" Oct 03 13:14:36 crc kubenswrapper[4962]: I1003 13:14:36.321238 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b60070ef-b284-466e-b596-9ea66a2b8896" containerName="registry-server" Oct 03 13:14:36 crc kubenswrapper[4962]: E1003 13:14:36.321263 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b60070ef-b284-466e-b596-9ea66a2b8896" containerName="extract-utilities" Oct 03 13:14:36 crc kubenswrapper[4962]: I1003 13:14:36.321269 4962 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b60070ef-b284-466e-b596-9ea66a2b8896" containerName="extract-utilities" Oct 03 13:14:36 crc kubenswrapper[4962]: I1003 13:14:36.321446 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="48bf959f-6a35-4204-bd4d-6e0a62a2a7db" containerName="nova-cell0-conductor-db-sync" Oct 03 13:14:36 crc kubenswrapper[4962]: I1003 13:14:36.321485 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b60070ef-b284-466e-b596-9ea66a2b8896" containerName="registry-server" Oct 03 13:14:36 crc kubenswrapper[4962]: I1003 13:14:36.322106 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 03 13:14:36 crc kubenswrapper[4962]: I1003 13:14:36.324003 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-z92tv" Oct 03 13:14:36 crc kubenswrapper[4962]: I1003 13:14:36.324771 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 03 13:14:36 crc kubenswrapper[4962]: I1003 13:14:36.330619 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 13:14:36 crc kubenswrapper[4962]: I1003 13:14:36.468035 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hplr\" (UniqueName: \"kubernetes.io/projected/85ea0653-966b-47ff-b8aa-b6ad2b5810ca-kube-api-access-4hplr\") pod \"nova-cell0-conductor-0\" (UID: \"85ea0653-966b-47ff-b8aa-b6ad2b5810ca\") " pod="openstack/nova-cell0-conductor-0" Oct 03 13:14:36 crc kubenswrapper[4962]: I1003 13:14:36.468089 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85ea0653-966b-47ff-b8aa-b6ad2b5810ca-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"85ea0653-966b-47ff-b8aa-b6ad2b5810ca\") " pod="openstack/nova-cell0-conductor-0" Oct 03 13:14:36 crc kubenswrapper[4962]: I1003 13:14:36.468265 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85ea0653-966b-47ff-b8aa-b6ad2b5810ca-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"85ea0653-966b-47ff-b8aa-b6ad2b5810ca\") " pod="openstack/nova-cell0-conductor-0" Oct 03 13:14:36 crc kubenswrapper[4962]: I1003 13:14:36.570318 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85ea0653-966b-47ff-b8aa-b6ad2b5810ca-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"85ea0653-966b-47ff-b8aa-b6ad2b5810ca\") " pod="openstack/nova-cell0-conductor-0" Oct 03 13:14:36 crc kubenswrapper[4962]: I1003 13:14:36.570480 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85ea0653-966b-47ff-b8aa-b6ad2b5810ca-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"85ea0653-966b-47ff-b8aa-b6ad2b5810ca\") " pod="openstack/nova-cell0-conductor-0" Oct 03 13:14:36 crc kubenswrapper[4962]: I1003 13:14:36.570520 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hplr\" (UniqueName: \"kubernetes.io/projected/85ea0653-966b-47ff-b8aa-b6ad2b5810ca-kube-api-access-4hplr\") pod \"nova-cell0-conductor-0\" (UID: \"85ea0653-966b-47ff-b8aa-b6ad2b5810ca\") " pod="openstack/nova-cell0-conductor-0" Oct 03 13:14:36 crc 
kubenswrapper[4962]: I1003 13:14:36.577578 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85ea0653-966b-47ff-b8aa-b6ad2b5810ca-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"85ea0653-966b-47ff-b8aa-b6ad2b5810ca\") " pod="openstack/nova-cell0-conductor-0" Oct 03 13:14:36 crc kubenswrapper[4962]: I1003 13:14:36.582134 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85ea0653-966b-47ff-b8aa-b6ad2b5810ca-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"85ea0653-966b-47ff-b8aa-b6ad2b5810ca\") " pod="openstack/nova-cell0-conductor-0" Oct 03 13:14:36 crc kubenswrapper[4962]: I1003 13:14:36.592198 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hplr\" (UniqueName: \"kubernetes.io/projected/85ea0653-966b-47ff-b8aa-b6ad2b5810ca-kube-api-access-4hplr\") pod \"nova-cell0-conductor-0\" (UID: \"85ea0653-966b-47ff-b8aa-b6ad2b5810ca\") " pod="openstack/nova-cell0-conductor-0" Oct 03 13:14:36 crc kubenswrapper[4962]: I1003 13:14:36.675894 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 03 13:14:37 crc kubenswrapper[4962]: I1003 13:14:37.133812 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 13:14:37 crc kubenswrapper[4962]: I1003 13:14:37.245451 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"85ea0653-966b-47ff-b8aa-b6ad2b5810ca","Type":"ContainerStarted","Data":"fb0d919568b2eec75f54551c357ef63bb039e62f2156dd76893d99b06463a883"} Oct 03 13:14:38 crc kubenswrapper[4962]: I1003 13:14:38.261471 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"85ea0653-966b-47ff-b8aa-b6ad2b5810ca","Type":"ContainerStarted","Data":"c9c8ed01d13ca0f8ff902a9439408e51baa90fc268f710fa26c3c09fc4aeec3c"} Oct 03 13:14:38 crc kubenswrapper[4962]: I1003 13:14:38.261929 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 03 13:14:42 crc kubenswrapper[4962]: E1003 13:14:42.782956 4962 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b9e6c89_714e_4efd_9adc_c15cd5b3eb6b.slice\": RecentStats: unable to find data in memory cache]" Oct 03 13:14:46 crc kubenswrapper[4962]: I1003 13:14:46.715664 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 03 13:14:46 crc kubenswrapper[4962]: I1003 13:14:46.742086 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=10.742068322 podStartE2EDuration="10.742068322s" podCreationTimestamp="2025-10-03 13:14:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:14:38.286752684 +0000 UTC m=+1486.690650529" watchObservedRunningTime="2025-10-03 13:14:46.742068322 +0000 UTC m=+1495.145966157" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.188349 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-c6gvp"] Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.189444 4962 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-c6gvp" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.191618 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.191856 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.212542 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-c6gvp"] Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.260053 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7768c654-4c5b-44f0-944d-4c4507a252b3-config-data\") pod \"nova-cell0-cell-mapping-c6gvp\" (UID: \"7768c654-4c5b-44f0-944d-4c4507a252b3\") " pod="openstack/nova-cell0-cell-mapping-c6gvp" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.260126 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7768c654-4c5b-44f0-944d-4c4507a252b3-scripts\") pod \"nova-cell0-cell-mapping-c6gvp\" (UID: \"7768c654-4c5b-44f0-944d-4c4507a252b3\") " pod="openstack/nova-cell0-cell-mapping-c6gvp" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.260161 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7768c654-4c5b-44f0-944d-4c4507a252b3-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-c6gvp\" (UID: \"7768c654-4c5b-44f0-944d-4c4507a252b3\") " pod="openstack/nova-cell0-cell-mapping-c6gvp" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.260426 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slkpv\" (UniqueName: \"kubernetes.io/projected/7768c654-4c5b-44f0-944d-4c4507a252b3-kube-api-access-slkpv\") pod \"nova-cell0-cell-mapping-c6gvp\" (UID: \"7768c654-4c5b-44f0-944d-4c4507a252b3\") " pod="openstack/nova-cell0-cell-mapping-c6gvp" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.349782 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.351862 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.352035 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.353160 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.355453 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.362230 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7768c654-4c5b-44f0-944d-4c4507a252b3-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-c6gvp\" (UID: \"7768c654-4c5b-44f0-944d-4c4507a252b3\") " pod="openstack/nova-cell0-cell-mapping-c6gvp" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.362299 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slkpv\" (UniqueName: \"kubernetes.io/projected/7768c654-4c5b-44f0-944d-4c4507a252b3-kube-api-access-slkpv\") pod \"nova-cell0-cell-mapping-c6gvp\" (UID: \"7768c654-4c5b-44f0-944d-4c4507a252b3\") " pod="openstack/nova-cell0-cell-mapping-c6gvp" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.362370 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7768c654-4c5b-44f0-944d-4c4507a252b3-config-data\") pod \"nova-cell0-cell-mapping-c6gvp\" (UID: \"7768c654-4c5b-44f0-944d-4c4507a252b3\") " pod="openstack/nova-cell0-cell-mapping-c6gvp" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.362412 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7768c654-4c5b-44f0-944d-4c4507a252b3-scripts\") pod \"nova-cell0-cell-mapping-c6gvp\" (UID: \"7768c654-4c5b-44f0-944d-4c4507a252b3\") " pod="openstack/nova-cell0-cell-mapping-c6gvp" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.370592 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7768c654-4c5b-44f0-944d-4c4507a252b3-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-c6gvp\" (UID: \"7768c654-4c5b-44f0-944d-4c4507a252b3\") " pod="openstack/nova-cell0-cell-mapping-c6gvp" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.371823 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.372422 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.380521 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7768c654-4c5b-44f0-944d-4c4507a252b3-config-data\") pod \"nova-cell0-cell-mapping-c6gvp\" (UID: \"7768c654-4c5b-44f0-944d-4c4507a252b3\") " pod="openstack/nova-cell0-cell-mapping-c6gvp" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.384205 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7768c654-4c5b-44f0-944d-4c4507a252b3-scripts\") pod \"nova-cell0-cell-mapping-c6gvp\" (UID: \"7768c654-4c5b-44f0-944d-4c4507a252b3\") " pod="openstack/nova-cell0-cell-mapping-c6gvp" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.392832 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.407391 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slkpv\" (UniqueName: 
\"kubernetes.io/projected/7768c654-4c5b-44f0-944d-4c4507a252b3-kube-api-access-slkpv\") pod \"nova-cell0-cell-mapping-c6gvp\" (UID: \"7768c654-4c5b-44f0-944d-4c4507a252b3\") " pod="openstack/nova-cell0-cell-mapping-c6gvp" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.433173 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.434737 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.439732 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.468223 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0863fcf4-ac99-4685-a482-08e2a3409f4d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0863fcf4-ac99-4685-a482-08e2a3409f4d\") " pod="openstack/nova-api-0" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.468430 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8199ec71-3fba-4ae9-9b74-6e41edd368aa-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8199ec71-3fba-4ae9-9b74-6e41edd368aa\") " pod="openstack/nova-scheduler-0" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.468879 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x7mg\" (UniqueName: \"kubernetes.io/projected/0863fcf4-ac99-4685-a482-08e2a3409f4d-kube-api-access-8x7mg\") pod \"nova-api-0\" (UID: \"0863fcf4-ac99-4685-a482-08e2a3409f4d\") " pod="openstack/nova-api-0" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.469023 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8199ec71-3fba-4ae9-9b74-6e41edd368aa-config-data\") pod \"nova-scheduler-0\" (UID: \"8199ec71-3fba-4ae9-9b74-6e41edd368aa\") " pod="openstack/nova-scheduler-0" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.469140 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0863fcf4-ac99-4685-a482-08e2a3409f4d-logs\") pod \"nova-api-0\" (UID: \"0863fcf4-ac99-4685-a482-08e2a3409f4d\") " pod="openstack/nova-api-0" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.469345 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0863fcf4-ac99-4685-a482-08e2a3409f4d-config-data\") pod \"nova-api-0\" (UID: \"0863fcf4-ac99-4685-a482-08e2a3409f4d\") " pod="openstack/nova-api-0" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.469459 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwtpl\" (UniqueName: \"kubernetes.io/projected/8199ec71-3fba-4ae9-9b74-6e41edd368aa-kube-api-access-rwtpl\") pod \"nova-scheduler-0\" (UID: \"8199ec71-3fba-4ae9-9b74-6e41edd368aa\") " pod="openstack/nova-scheduler-0" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.492879 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 13:14:47 crc 
kubenswrapper[4962]: I1003 13:14:47.520296 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-c6gvp" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.550596 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.552288 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.554721 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.565989 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.573566 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwtpl\" (UniqueName: \"kubernetes.io/projected/8199ec71-3fba-4ae9-9b74-6e41edd368aa-kube-api-access-rwtpl\") pod \"nova-scheduler-0\" (UID: \"8199ec71-3fba-4ae9-9b74-6e41edd368aa\") " pod="openstack/nova-scheduler-0" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.573657 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0863fcf4-ac99-4685-a482-08e2a3409f4d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0863fcf4-ac99-4685-a482-08e2a3409f4d\") " pod="openstack/nova-api-0" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.573682 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8199ec71-3fba-4ae9-9b74-6e41edd368aa-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8199ec71-3fba-4ae9-9b74-6e41edd368aa\") " pod="openstack/nova-scheduler-0" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.573711 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6570f82b-7149-41ef-8bb6-aea99de50975-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6570f82b-7149-41ef-8bb6-aea99de50975\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.573743 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6570f82b-7149-41ef-8bb6-aea99de50975-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6570f82b-7149-41ef-8bb6-aea99de50975\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.573809 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2pmn\" (UniqueName: \"kubernetes.io/projected/6570f82b-7149-41ef-8bb6-aea99de50975-kube-api-access-g2pmn\") pod \"nova-cell1-novncproxy-0\" (UID: \"6570f82b-7149-41ef-8bb6-aea99de50975\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.573834 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x7mg\" (UniqueName: \"kubernetes.io/projected/0863fcf4-ac99-4685-a482-08e2a3409f4d-kube-api-access-8x7mg\") pod \"nova-api-0\" (UID: \"0863fcf4-ac99-4685-a482-08e2a3409f4d\") " pod="openstack/nova-api-0" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.573858 4962 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8199ec71-3fba-4ae9-9b74-6e41edd368aa-config-data\") pod \"nova-scheduler-0\" (UID: \"8199ec71-3fba-4ae9-9b74-6e41edd368aa\") " pod="openstack/nova-scheduler-0" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.573876 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0863fcf4-ac99-4685-a482-08e2a3409f4d-logs\") pod \"nova-api-0\" (UID: \"0863fcf4-ac99-4685-a482-08e2a3409f4d\") " pod="openstack/nova-api-0" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.573910 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0863fcf4-ac99-4685-a482-08e2a3409f4d-config-data\") pod \"nova-api-0\" (UID: \"0863fcf4-ac99-4685-a482-08e2a3409f4d\") " pod="openstack/nova-api-0" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.576617 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0863fcf4-ac99-4685-a482-08e2a3409f4d-logs\") pod \"nova-api-0\" (UID: \"0863fcf4-ac99-4685-a482-08e2a3409f4d\") " pod="openstack/nova-api-0" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.578082 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8199ec71-3fba-4ae9-9b74-6e41edd368aa-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8199ec71-3fba-4ae9-9b74-6e41edd368aa\") " pod="openstack/nova-scheduler-0" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.579148 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0863fcf4-ac99-4685-a482-08e2a3409f4d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0863fcf4-ac99-4685-a482-08e2a3409f4d\") " pod="openstack/nova-api-0" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.589862 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8199ec71-3fba-4ae9-9b74-6e41edd368aa-config-data\") pod \"nova-scheduler-0\" (UID: \"8199ec71-3fba-4ae9-9b74-6e41edd368aa\") " pod="openstack/nova-scheduler-0" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.591882 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0863fcf4-ac99-4685-a482-08e2a3409f4d-config-data\") pod \"nova-api-0\" (UID: \"0863fcf4-ac99-4685-a482-08e2a3409f4d\") " pod="openstack/nova-api-0" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.598438 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwtpl\" (UniqueName: \"kubernetes.io/projected/8199ec71-3fba-4ae9-9b74-6e41edd368aa-kube-api-access-rwtpl\") pod \"nova-scheduler-0\" (UID: \"8199ec71-3fba-4ae9-9b74-6e41edd368aa\") " pod="openstack/nova-scheduler-0" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.613205 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x7mg\" (UniqueName: \"kubernetes.io/projected/0863fcf4-ac99-4685-a482-08e2a3409f4d-kube-api-access-8x7mg\") pod \"nova-api-0\" (UID: \"0863fcf4-ac99-4685-a482-08e2a3409f4d\") " pod="openstack/nova-api-0" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.646422 4962 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-845d6d6f59-rq9zm"] Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.650779 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-rq9zm" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.663526 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-rq9zm"] Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.675070 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a3b21d4-fd07-4c20-8004-6271bae734d6-logs\") pod \"nova-metadata-0\" (UID: \"6a3b21d4-fd07-4c20-8004-6271bae734d6\") " pod="openstack/nova-metadata-0" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.675342 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l46s\" (UniqueName: \"kubernetes.io/projected/6a3b21d4-fd07-4c20-8004-6271bae734d6-kube-api-access-6l46s\") pod \"nova-metadata-0\" (UID: \"6a3b21d4-fd07-4c20-8004-6271bae734d6\") " pod="openstack/nova-metadata-0" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.675367 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6570f82b-7149-41ef-8bb6-aea99de50975-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6570f82b-7149-41ef-8bb6-aea99de50975\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.675396 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6570f82b-7149-41ef-8bb6-aea99de50975-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6570f82b-7149-41ef-8bb6-aea99de50975\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.675421 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a3b21d4-fd07-4c20-8004-6271bae734d6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6a3b21d4-fd07-4c20-8004-6271bae734d6\") " pod="openstack/nova-metadata-0" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.675482 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a3b21d4-fd07-4c20-8004-6271bae734d6-config-data\") pod \"nova-metadata-0\" (UID: \"6a3b21d4-fd07-4c20-8004-6271bae734d6\") " pod="openstack/nova-metadata-0" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.675501 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2pmn\" (UniqueName: \"kubernetes.io/projected/6570f82b-7149-41ef-8bb6-aea99de50975-kube-api-access-g2pmn\") pod \"nova-cell1-novncproxy-0\" (UID: \"6570f82b-7149-41ef-8bb6-aea99de50975\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.682343 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6570f82b-7149-41ef-8bb6-aea99de50975-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6570f82b-7149-41ef-8bb6-aea99de50975\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.696201 4962 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6570f82b-7149-41ef-8bb6-aea99de50975-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6570f82b-7149-41ef-8bb6-aea99de50975\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.702840 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2pmn\" (UniqueName: \"kubernetes.io/projected/6570f82b-7149-41ef-8bb6-aea99de50975-kube-api-access-g2pmn\") pod \"nova-cell1-novncproxy-0\" (UID: \"6570f82b-7149-41ef-8bb6-aea99de50975\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.759898 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.773353 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.776735 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a3b21d4-fd07-4c20-8004-6271bae734d6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6a3b21d4-fd07-4c20-8004-6271bae734d6\") " pod="openstack/nova-metadata-0" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.776803 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0260ca36-4b03-4dfc-b212-39121ca7ceb1-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-rq9zm\" (UID: \"0260ca36-4b03-4dfc-b212-39121ca7ceb1\") " pod="openstack/dnsmasq-dns-845d6d6f59-rq9zm" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.776839 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a3b21d4-fd07-4c20-8004-6271bae734d6-config-data\") pod \"nova-metadata-0\" (UID: \"6a3b21d4-fd07-4c20-8004-6271bae734d6\") " pod="openstack/nova-metadata-0" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.776892 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0260ca36-4b03-4dfc-b212-39121ca7ceb1-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-rq9zm\" (UID: \"0260ca36-4b03-4dfc-b212-39121ca7ceb1\") " pod="openstack/dnsmasq-dns-845d6d6f59-rq9zm" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.777515 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0260ca36-4b03-4dfc-b212-39121ca7ceb1-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-rq9zm\" (UID: \"0260ca36-4b03-4dfc-b212-39121ca7ceb1\") " pod="openstack/dnsmasq-dns-845d6d6f59-rq9zm" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.777546 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0260ca36-4b03-4dfc-b212-39121ca7ceb1-config\") pod \"dnsmasq-dns-845d6d6f59-rq9zm\" (UID: \"0260ca36-4b03-4dfc-b212-39121ca7ceb1\") " pod="openstack/dnsmasq-dns-845d6d6f59-rq9zm" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.777567 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a3b21d4-fd07-4c20-8004-6271bae734d6-logs\") 
pod \"nova-metadata-0\" (UID: \"6a3b21d4-fd07-4c20-8004-6271bae734d6\") " pod="openstack/nova-metadata-0" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.777600 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0260ca36-4b03-4dfc-b212-39121ca7ceb1-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-rq9zm\" (UID: \"0260ca36-4b03-4dfc-b212-39121ca7ceb1\") " pod="openstack/dnsmasq-dns-845d6d6f59-rq9zm" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.777626 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6l46s\" (UniqueName: \"kubernetes.io/projected/6a3b21d4-fd07-4c20-8004-6271bae734d6-kube-api-access-6l46s\") pod \"nova-metadata-0\" (UID: \"6a3b21d4-fd07-4c20-8004-6271bae734d6\") " pod="openstack/nova-metadata-0" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.777677 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmw9l\" (UniqueName: \"kubernetes.io/projected/0260ca36-4b03-4dfc-b212-39121ca7ceb1-kube-api-access-lmw9l\") pod \"dnsmasq-dns-845d6d6f59-rq9zm\" (UID: \"0260ca36-4b03-4dfc-b212-39121ca7ceb1\") " pod="openstack/dnsmasq-dns-845d6d6f59-rq9zm" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.784022 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a3b21d4-fd07-4c20-8004-6271bae734d6-logs\") pod \"nova-metadata-0\" (UID: \"6a3b21d4-fd07-4c20-8004-6271bae734d6\") " pod="openstack/nova-metadata-0" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.786244 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.786492 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a3b21d4-fd07-4c20-8004-6271bae734d6-config-data\") pod \"nova-metadata-0\" (UID: \"6a3b21d4-fd07-4c20-8004-6271bae734d6\") " pod="openstack/nova-metadata-0" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.798442 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a3b21d4-fd07-4c20-8004-6271bae734d6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6a3b21d4-fd07-4c20-8004-6271bae734d6\") " pod="openstack/nova-metadata-0" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.819224 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l46s\" (UniqueName: \"kubernetes.io/projected/6a3b21d4-fd07-4c20-8004-6271bae734d6-kube-api-access-6l46s\") pod \"nova-metadata-0\" (UID: \"6a3b21d4-fd07-4c20-8004-6271bae734d6\") " pod="openstack/nova-metadata-0" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.879620 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0260ca36-4b03-4dfc-b212-39121ca7ceb1-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-rq9zm\" (UID: \"0260ca36-4b03-4dfc-b212-39121ca7ceb1\") " pod="openstack/dnsmasq-dns-845d6d6f59-rq9zm" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.879754 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0260ca36-4b03-4dfc-b212-39121ca7ceb1-ovsdbserver-sb\") pod 
\"dnsmasq-dns-845d6d6f59-rq9zm\" (UID: \"0260ca36-4b03-4dfc-b212-39121ca7ceb1\") " pod="openstack/dnsmasq-dns-845d6d6f59-rq9zm" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.879802 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0260ca36-4b03-4dfc-b212-39121ca7ceb1-config\") pod \"dnsmasq-dns-845d6d6f59-rq9zm\" (UID: \"0260ca36-4b03-4dfc-b212-39121ca7ceb1\") " pod="openstack/dnsmasq-dns-845d6d6f59-rq9zm" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.879849 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0260ca36-4b03-4dfc-b212-39121ca7ceb1-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-rq9zm\" (UID: \"0260ca36-4b03-4dfc-b212-39121ca7ceb1\") " pod="openstack/dnsmasq-dns-845d6d6f59-rq9zm" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.879899 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmw9l\" (UniqueName: \"kubernetes.io/projected/0260ca36-4b03-4dfc-b212-39121ca7ceb1-kube-api-access-lmw9l\") pod \"dnsmasq-dns-845d6d6f59-rq9zm\" (UID: \"0260ca36-4b03-4dfc-b212-39121ca7ceb1\") " pod="openstack/dnsmasq-dns-845d6d6f59-rq9zm" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.879989 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0260ca36-4b03-4dfc-b212-39121ca7ceb1-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-rq9zm\" (UID: \"0260ca36-4b03-4dfc-b212-39121ca7ceb1\") " pod="openstack/dnsmasq-dns-845d6d6f59-rq9zm" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.880970 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0260ca36-4b03-4dfc-b212-39121ca7ceb1-config\") pod \"dnsmasq-dns-845d6d6f59-rq9zm\" (UID: \"0260ca36-4b03-4dfc-b212-39121ca7ceb1\") " pod="openstack/dnsmasq-dns-845d6d6f59-rq9zm" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.881019 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0260ca36-4b03-4dfc-b212-39121ca7ceb1-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-rq9zm\" (UID: \"0260ca36-4b03-4dfc-b212-39121ca7ceb1\") " pod="openstack/dnsmasq-dns-845d6d6f59-rq9zm" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.881191 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0260ca36-4b03-4dfc-b212-39121ca7ceb1-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-rq9zm\" (UID: \"0260ca36-4b03-4dfc-b212-39121ca7ceb1\") " pod="openstack/dnsmasq-dns-845d6d6f59-rq9zm" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.881872 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0260ca36-4b03-4dfc-b212-39121ca7ceb1-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-rq9zm\" (UID: \"0260ca36-4b03-4dfc-b212-39121ca7ceb1\") " pod="openstack/dnsmasq-dns-845d6d6f59-rq9zm" Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.881952 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0260ca36-4b03-4dfc-b212-39121ca7ceb1-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-rq9zm\" (UID: \"0260ca36-4b03-4dfc-b212-39121ca7ceb1\") " pod="openstack/dnsmasq-dns-845d6d6f59-rq9zm" 
Oct 03 13:14:47 crc kubenswrapper[4962]: I1003 13:14:47.901703 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmw9l\" (UniqueName: \"kubernetes.io/projected/0260ca36-4b03-4dfc-b212-39121ca7ceb1-kube-api-access-lmw9l\") pod \"dnsmasq-dns-845d6d6f59-rq9zm\" (UID: \"0260ca36-4b03-4dfc-b212-39121ca7ceb1\") " pod="openstack/dnsmasq-dns-845d6d6f59-rq9zm"
Oct 03 13:14:48 crc kubenswrapper[4962]: I1003 13:14:48.025614 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 03 13:14:48 crc kubenswrapper[4962]: I1003 13:14:48.034958 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-rq9zm"
Oct 03 13:14:48 crc kubenswrapper[4962]: I1003 13:14:48.105104 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-c6gvp"]
Oct 03 13:14:48 crc kubenswrapper[4962]: I1003 13:14:48.222768 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tkw44"]
Oct 03 13:14:48 crc kubenswrapper[4962]: I1003 13:14:48.223995 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-tkw44"
Oct 03 13:14:48 crc kubenswrapper[4962]: I1003 13:14:48.228912 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts"
Oct 03 13:14:48 crc kubenswrapper[4962]: I1003 13:14:48.229041 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Oct 03 13:14:48 crc kubenswrapper[4962]: I1003 13:14:48.282775 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tkw44"]
Oct 03 13:14:48 crc kubenswrapper[4962]: I1003 13:14:48.291693 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67400081-5ed3-48dc-be64-b3cf19bcf3c4-scripts\") pod \"nova-cell1-conductor-db-sync-tkw44\" (UID: \"67400081-5ed3-48dc-be64-b3cf19bcf3c4\") " pod="openstack/nova-cell1-conductor-db-sync-tkw44"
Oct 03 13:14:48 crc kubenswrapper[4962]: I1003 13:14:48.291857 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9xlm\" (UniqueName: \"kubernetes.io/projected/67400081-5ed3-48dc-be64-b3cf19bcf3c4-kube-api-access-k9xlm\") pod \"nova-cell1-conductor-db-sync-tkw44\" (UID: \"67400081-5ed3-48dc-be64-b3cf19bcf3c4\") " pod="openstack/nova-cell1-conductor-db-sync-tkw44"
Oct 03 13:14:48 crc kubenswrapper[4962]: I1003 13:14:48.291925 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67400081-5ed3-48dc-be64-b3cf19bcf3c4-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-tkw44\" (UID: \"67400081-5ed3-48dc-be64-b3cf19bcf3c4\") " pod="openstack/nova-cell1-conductor-db-sync-tkw44"
Oct 03 13:14:48 crc kubenswrapper[4962]: I1003 13:14:48.291955 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67400081-5ed3-48dc-be64-b3cf19bcf3c4-config-data\") pod \"nova-cell1-conductor-db-sync-tkw44\" (UID: \"67400081-5ed3-48dc-be64-b3cf19bcf3c4\") " pod="openstack/nova-cell1-conductor-db-sync-tkw44"
Oct 03 13:14:48 crc kubenswrapper[4962]: I1003 13:14:48.383119 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-c6gvp" event={"ID":"7768c654-4c5b-44f0-944d-4c4507a252b3","Type":"ContainerStarted","Data":"71ac8cc085f4132583c17249b9f84cc4f8703d1f85ce678333204137c65ea8bc"}
Oct 03 13:14:48 crc kubenswrapper[4962]: I1003 13:14:48.395332 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9xlm\" (UniqueName: \"kubernetes.io/projected/67400081-5ed3-48dc-be64-b3cf19bcf3c4-kube-api-access-k9xlm\") pod \"nova-cell1-conductor-db-sync-tkw44\" (UID: \"67400081-5ed3-48dc-be64-b3cf19bcf3c4\") " pod="openstack/nova-cell1-conductor-db-sync-tkw44"
Oct 03 13:14:48 crc kubenswrapper[4962]: I1003 13:14:48.395404 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67400081-5ed3-48dc-be64-b3cf19bcf3c4-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-tkw44\" (UID: \"67400081-5ed3-48dc-be64-b3cf19bcf3c4\") " pod="openstack/nova-cell1-conductor-db-sync-tkw44"
Oct 03 13:14:48 crc kubenswrapper[4962]: I1003 13:14:48.395436 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67400081-5ed3-48dc-be64-b3cf19bcf3c4-config-data\") pod \"nova-cell1-conductor-db-sync-tkw44\" (UID: \"67400081-5ed3-48dc-be64-b3cf19bcf3c4\") " pod="openstack/nova-cell1-conductor-db-sync-tkw44"
Oct 03 13:14:48 crc kubenswrapper[4962]: I1003 13:14:48.395506 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67400081-5ed3-48dc-be64-b3cf19bcf3c4-scripts\") pod \"nova-cell1-conductor-db-sync-tkw44\" (UID: \"67400081-5ed3-48dc-be64-b3cf19bcf3c4\") " pod="openstack/nova-cell1-conductor-db-sync-tkw44"
Oct 03 13:14:48 crc kubenswrapper[4962]: W1003 13:14:48.402370 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0863fcf4_ac99_4685_a482_08e2a3409f4d.slice/crio-ecceab8ccb4ab53fde48a94a89980965537c7e79234c7918613201b637413bc0 WatchSource:0}: Error finding container ecceab8ccb4ab53fde48a94a89980965537c7e79234c7918613201b637413bc0: Status 404 returned error can't find the container with id ecceab8ccb4ab53fde48a94a89980965537c7e79234c7918613201b637413bc0
Oct 03 13:14:48 crc kubenswrapper[4962]: I1003 13:14:48.418076 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 03 13:14:48 crc kubenswrapper[4962]: I1003 13:14:48.424090 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67400081-5ed3-48dc-be64-b3cf19bcf3c4-scripts\") pod \"nova-cell1-conductor-db-sync-tkw44\" (UID: \"67400081-5ed3-48dc-be64-b3cf19bcf3c4\") " pod="openstack/nova-cell1-conductor-db-sync-tkw44"
Oct 03 13:14:48 crc kubenswrapper[4962]: I1003 13:14:48.424469 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67400081-5ed3-48dc-be64-b3cf19bcf3c4-config-data\") pod \"nova-cell1-conductor-db-sync-tkw44\" (UID: \"67400081-5ed3-48dc-be64-b3cf19bcf3c4\") " pod="openstack/nova-cell1-conductor-db-sync-tkw44"
Oct 03 13:14:48 crc kubenswrapper[4962]: I1003 13:14:48.425084 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67400081-5ed3-48dc-be64-b3cf19bcf3c4-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-tkw44\" (UID: \"67400081-5ed3-48dc-be64-b3cf19bcf3c4\") " pod="openstack/nova-cell1-conductor-db-sync-tkw44"
Oct 03 13:14:48 crc kubenswrapper[4962]: I1003 13:14:48.440034 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9xlm\" (UniqueName: \"kubernetes.io/projected/67400081-5ed3-48dc-be64-b3cf19bcf3c4-kube-api-access-k9xlm\") pod \"nova-cell1-conductor-db-sync-tkw44\" (UID: \"67400081-5ed3-48dc-be64-b3cf19bcf3c4\") " pod="openstack/nova-cell1-conductor-db-sync-tkw44"
Oct 03 13:14:48 crc kubenswrapper[4962]: I1003 13:14:48.447565 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 03 13:14:48 crc kubenswrapper[4962]: I1003 13:14:48.560594 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 03 13:14:48 crc kubenswrapper[4962]: I1003 13:14:48.658813 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-tkw44"
Oct 03 13:14:48 crc kubenswrapper[4962]: I1003 13:14:48.728956 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 03 13:14:48 crc kubenswrapper[4962]: W1003 13:14:48.739898 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a3b21d4_fd07_4c20_8004_6271bae734d6.slice/crio-082f028347f55fa958bb1f2ef5b937a07867f9e1bf255d2d26d20717871d5286 WatchSource:0}: Error finding container 082f028347f55fa958bb1f2ef5b937a07867f9e1bf255d2d26d20717871d5286: Status 404 returned error can't find the container with id 082f028347f55fa958bb1f2ef5b937a07867f9e1bf255d2d26d20717871d5286
Oct 03 13:14:48 crc kubenswrapper[4962]: I1003 13:14:48.811033 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-rq9zm"]
Oct 03 13:14:48 crc kubenswrapper[4962]: W1003 13:14:48.822261 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0260ca36_4b03_4dfc_b212_39121ca7ceb1.slice/crio-6a644532d69307b1e7fa8cd5fb3f60c3e298f374984f572787aaf71f631361db WatchSource:0}: Error finding container 6a644532d69307b1e7fa8cd5fb3f60c3e298f374984f572787aaf71f631361db: Status 404 returned error can't find the container with id 6a644532d69307b1e7fa8cd5fb3f60c3e298f374984f572787aaf71f631361db
Oct 03 13:14:49 crc kubenswrapper[4962]: I1003 13:14:49.104317 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tkw44"]
Oct 03 13:14:49 crc kubenswrapper[4962]: W1003 13:14:49.122152 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67400081_5ed3_48dc_be64_b3cf19bcf3c4.slice/crio-587d624057241591ff6368b0c2e78f5d5a1358265413d54a89ed1db573030ed1 WatchSource:0}: Error finding container 587d624057241591ff6368b0c2e78f5d5a1358265413d54a89ed1db573030ed1: Status 404 returned error can't find the container with id 587d624057241591ff6368b0c2e78f5d5a1358265413d54a89ed1db573030ed1
Oct 03 13:14:49 crc kubenswrapper[4962]: I1003 13:14:49.407491 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6570f82b-7149-41ef-8bb6-aea99de50975","Type":"ContainerStarted","Data":"87cf1dafe8eb7ef84ea4d306941fe1f2f211b222c414d94c9c71eace20337ab8"}
Oct 03 13:14:49 crc kubenswrapper[4962]: I1003 13:14:49.410645 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6a3b21d4-fd07-4c20-8004-6271bae734d6","Type":"ContainerStarted","Data":"082f028347f55fa958bb1f2ef5b937a07867f9e1bf255d2d26d20717871d5286"}
Oct 03 13:14:49 crc kubenswrapper[4962]: I1003 13:14:49.413929 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0863fcf4-ac99-4685-a482-08e2a3409f4d","Type":"ContainerStarted","Data":"ecceab8ccb4ab53fde48a94a89980965537c7e79234c7918613201b637413bc0"}
Oct 03 13:14:49 crc kubenswrapper[4962]: I1003 13:14:49.415426 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8199ec71-3fba-4ae9-9b74-6e41edd368aa","Type":"ContainerStarted","Data":"d68a03ee6dd9708dd198083e44f6720a95fd141deabd0bd6b3aef796f6834b59"}
Oct 03 13:14:49 crc kubenswrapper[4962]: I1003 13:14:49.417859 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-tkw44" event={"ID":"67400081-5ed3-48dc-be64-b3cf19bcf3c4","Type":"ContainerStarted","Data":"0135cfd571ce2874c84e154f88d2566b922232ee0a4046bc559e2dcc8e3cb67b"}
Oct 03 13:14:49 crc kubenswrapper[4962]: I1003 13:14:49.417895 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-tkw44" event={"ID":"67400081-5ed3-48dc-be64-b3cf19bcf3c4","Type":"ContainerStarted","Data":"587d624057241591ff6368b0c2e78f5d5a1358265413d54a89ed1db573030ed1"}
Oct 03 13:14:49 crc kubenswrapper[4962]: I1003 13:14:49.433773 4962 generic.go:334] "Generic (PLEG): container finished" podID="0260ca36-4b03-4dfc-b212-39121ca7ceb1" containerID="d1b68cda7d5a20cfec8f59c429355eeb96cc67c121429fe51cbdf95084657390" exitCode=0
Oct 03 13:14:49 crc kubenswrapper[4962]: I1003 13:14:49.434360 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-rq9zm" event={"ID":"0260ca36-4b03-4dfc-b212-39121ca7ceb1","Type":"ContainerDied","Data":"d1b68cda7d5a20cfec8f59c429355eeb96cc67c121429fe51cbdf95084657390"}
Oct 03 13:14:49 crc kubenswrapper[4962]: I1003 13:14:49.434783 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-rq9zm" event={"ID":"0260ca36-4b03-4dfc-b212-39121ca7ceb1","Type":"ContainerStarted","Data":"6a644532d69307b1e7fa8cd5fb3f60c3e298f374984f572787aaf71f631361db"}
Oct 03 13:14:49 crc kubenswrapper[4962]: I1003 13:14:49.450852 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-c6gvp" event={"ID":"7768c654-4c5b-44f0-944d-4c4507a252b3","Type":"ContainerStarted","Data":"289ae55c3810fc3fdbc0b8d2aaf50cec4df133730555ad763e84eaf124304306"}
Oct 03 13:14:49 crc kubenswrapper[4962]: I1003 13:14:49.457979 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-tkw44" podStartSLOduration=1.457955641 podStartE2EDuration="1.457955641s" podCreationTimestamp="2025-10-03 13:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:14:49.451482437 +0000 UTC m=+1497.855380272" watchObservedRunningTime="2025-10-03 13:14:49.457955641 +0000 UTC m=+1497.861853476"
Oct 03 13:14:49 crc kubenswrapper[4962]: I1003 13:14:49.483400 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-c6gvp" podStartSLOduration=2.483364153 podStartE2EDuration="2.483364153s" podCreationTimestamp="2025-10-03 13:14:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:14:49.474966308 +0000 UTC m=+1497.878864143" watchObservedRunningTime="2025-10-03 13:14:49.483364153 +0000 UTC m=+1497.887261978"
Oct 03 13:14:50 crc kubenswrapper[4962]: I1003 13:14:50.464180 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-rq9zm" event={"ID":"0260ca36-4b03-4dfc-b212-39121ca7ceb1","Type":"ContainerStarted","Data":"e13e76ed7b2b5957c583bc0dec45a956affda6e0e4eec93c073dc4e352bb1689"}
Oct 03 13:14:50 crc kubenswrapper[4962]: I1003 13:14:50.465018 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-845d6d6f59-rq9zm"
Oct 03 13:14:50 crc kubenswrapper[4962]: I1003 13:14:50.499648 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-845d6d6f59-rq9zm" podStartSLOduration=3.499610621 podStartE2EDuration="3.499610621s" podCreationTimestamp="2025-10-03 13:14:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:14:50.492946942 +0000 UTC m=+1498.896844787" watchObservedRunningTime="2025-10-03 13:14:50.499610621 +0000 UTC m=+1498.903508456"
Oct 03 13:14:51 crc kubenswrapper[4962]: I1003 13:14:51.084043 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Oct 03 13:14:51 crc kubenswrapper[4962]: I1003 13:14:51.097272 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 03 13:14:52 crc kubenswrapper[4962]: I1003 13:14:52.483393 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6570f82b-7149-41ef-8bb6-aea99de50975","Type":"ContainerStarted","Data":"d5b438272fd9ea8e008747c0ff7a8b1b9b5b1b5f4fe43eb21b9d5a0e0c093018"}
Oct 03 13:14:52 crc kubenswrapper[4962]: I1003 13:14:52.483715 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="6570f82b-7149-41ef-8bb6-aea99de50975" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://d5b438272fd9ea8e008747c0ff7a8b1b9b5b1b5f4fe43eb21b9d5a0e0c093018" gracePeriod=30
Oct 03 13:14:52 crc kubenswrapper[4962]: I1003 13:14:52.487058 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6a3b21d4-fd07-4c20-8004-6271bae734d6","Type":"ContainerStarted","Data":"4fc008478a21e51efcd772d61178aba3de9d30470c7e4a46707095cc1e70f3d2"}
Oct 03 13:14:52 crc kubenswrapper[4962]: I1003 13:14:52.487124 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6a3b21d4-fd07-4c20-8004-6271bae734d6","Type":"ContainerStarted","Data":"89e74aede1b966c44015404bf89779791e340a93bbd902f592ff52b3fc627101"}
Oct 03 13:14:52 crc kubenswrapper[4962]: I1003 13:14:52.487309 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6a3b21d4-fd07-4c20-8004-6271bae734d6" containerName="nova-metadata-log" containerID="cri-o://89e74aede1b966c44015404bf89779791e340a93bbd902f592ff52b3fc627101" gracePeriod=30
Oct 03 13:14:52 crc kubenswrapper[4962]: I1003 13:14:52.487472 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6a3b21d4-fd07-4c20-8004-6271bae734d6" containerName="nova-metadata-metadata" containerID="cri-o://4fc008478a21e51efcd772d61178aba3de9d30470c7e4a46707095cc1e70f3d2" gracePeriod=30
Oct 03 13:14:52 crc kubenswrapper[4962]: I1003 13:14:52.489962 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0863fcf4-ac99-4685-a482-08e2a3409f4d","Type":"ContainerStarted","Data":"44155ad59a3bdf5bb854fbb4c0744bb3fd660fd870b7e075d9e28970f5dfa273"}
Oct 03 13:14:52 crc kubenswrapper[4962]: I1003 13:14:52.490002 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0863fcf4-ac99-4685-a482-08e2a3409f4d","Type":"ContainerStarted","Data":"4d07108b0f72ed3f3f47d8bf2c579a7794c61bf89739b389232682126f154d11"}
Oct 03 13:14:52 crc kubenswrapper[4962]: I1003 13:14:52.495064 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8199ec71-3fba-4ae9-9b74-6e41edd368aa","Type":"ContainerStarted","Data":"0f3e23b7882dfbb7d83de826edf2882cca09869217f1b3a126da17c0af86ecef"}
Oct 03 13:14:52 crc kubenswrapper[4962]: I1003 13:14:52.509722 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.775523552 podStartE2EDuration="5.509703763s" podCreationTimestamp="2025-10-03 13:14:47 +0000 UTC" firstStartedPulling="2025-10-03 13:14:48.556823797 +0000 UTC m=+1496.960721622" lastFinishedPulling="2025-10-03 13:14:51.291003988 +0000 UTC m=+1499.694901833" observedRunningTime="2025-10-03 13:14:52.508194333 +0000 UTC m=+1500.912092158" watchObservedRunningTime="2025-10-03 13:14:52.509703763 +0000 UTC m=+1500.913601598"
Oct 03 13:14:52 crc kubenswrapper[4962]: I1003 13:14:52.526306 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.685658118 podStartE2EDuration="5.526290689s" podCreationTimestamp="2025-10-03 13:14:47 +0000 UTC" firstStartedPulling="2025-10-03 13:14:48.42774442 +0000 UTC m=+1496.831642255" lastFinishedPulling="2025-10-03 13:14:51.268376991 +0000 UTC m=+1499.672274826" observedRunningTime="2025-10-03 13:14:52.522667001 +0000 UTC m=+1500.926564856" watchObservedRunningTime="2025-10-03 13:14:52.526290689 +0000 UTC m=+1500.930188524"
Oct 03 13:14:52 crc kubenswrapper[4962]: I1003 13:14:52.545179 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.9916708979999997 podStartE2EDuration="5.545164596s" podCreationTimestamp="2025-10-03 13:14:47 +0000 UTC" firstStartedPulling="2025-10-03 13:14:48.741339983 +0000 UTC m=+1497.145237818" lastFinishedPulling="2025-10-03 13:14:51.294833671 +0000 UTC m=+1499.698731516" observedRunningTime="2025-10-03 13:14:52.538758573 +0000 UTC m=+1500.942656428" watchObservedRunningTime="2025-10-03 13:14:52.545164596 +0000 UTC m=+1500.949062421"
Oct 03 13:14:52 crc kubenswrapper[4962]: I1003 13:14:52.557178 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.688936757 podStartE2EDuration="5.557162328s" podCreationTimestamp="2025-10-03 13:14:47 +0000 UTC" firstStartedPulling="2025-10-03 13:14:48.403858919 +0000 UTC m=+1496.807756754" lastFinishedPulling="2025-10-03 13:14:51.27208449 +0000 UTC m=+1499.675982325" observedRunningTime="2025-10-03 13:14:52.55240047 +0000 UTC m=+1500.956298305" watchObservedRunningTime="2025-10-03 13:14:52.557162328 +0000 UTC m=+1500.961060163"
Oct 03 13:14:52 crc kubenswrapper[4962]: I1003 13:14:52.774441 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
(probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 03 13:14:52 crc kubenswrapper[4962]: I1003 13:14:52.787713 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.026142 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.026191 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.124562 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.220595 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a3b21d4-fd07-4c20-8004-6271bae734d6-config-data\") pod \"6a3b21d4-fd07-4c20-8004-6271bae734d6\" (UID: \"6a3b21d4-fd07-4c20-8004-6271bae734d6\") " Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.220665 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a3b21d4-fd07-4c20-8004-6271bae734d6-logs\") pod \"6a3b21d4-fd07-4c20-8004-6271bae734d6\" (UID: \"6a3b21d4-fd07-4c20-8004-6271bae734d6\") " Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.220918 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a3b21d4-fd07-4c20-8004-6271bae734d6-combined-ca-bundle\") pod \"6a3b21d4-fd07-4c20-8004-6271bae734d6\" (UID: \"6a3b21d4-fd07-4c20-8004-6271bae734d6\") " Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.221064 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6l46s\" (UniqueName: \"kubernetes.io/projected/6a3b21d4-fd07-4c20-8004-6271bae734d6-kube-api-access-6l46s\") pod \"6a3b21d4-fd07-4c20-8004-6271bae734d6\" (UID: \"6a3b21d4-fd07-4c20-8004-6271bae734d6\") " Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.221305 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a3b21d4-fd07-4c20-8004-6271bae734d6-logs" (OuterVolumeSpecName: "logs") pod "6a3b21d4-fd07-4c20-8004-6271bae734d6" (UID: "6a3b21d4-fd07-4c20-8004-6271bae734d6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.221573 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a3b21d4-fd07-4c20-8004-6271bae734d6-logs\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.225304 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a3b21d4-fd07-4c20-8004-6271bae734d6-kube-api-access-6l46s" (OuterVolumeSpecName: "kube-api-access-6l46s") pod "6a3b21d4-fd07-4c20-8004-6271bae734d6" (UID: "6a3b21d4-fd07-4c20-8004-6271bae734d6"). InnerVolumeSpecName "kube-api-access-6l46s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.245744 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a3b21d4-fd07-4c20-8004-6271bae734d6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a3b21d4-fd07-4c20-8004-6271bae734d6" (UID: "6a3b21d4-fd07-4c20-8004-6271bae734d6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.246759 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a3b21d4-fd07-4c20-8004-6271bae734d6-config-data" (OuterVolumeSpecName: "config-data") pod "6a3b21d4-fd07-4c20-8004-6271bae734d6" (UID: "6a3b21d4-fd07-4c20-8004-6271bae734d6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.323507 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a3b21d4-fd07-4c20-8004-6271bae734d6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.323554 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6l46s\" (UniqueName: \"kubernetes.io/projected/6a3b21d4-fd07-4c20-8004-6271bae734d6-kube-api-access-6l46s\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.323566 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a3b21d4-fd07-4c20-8004-6271bae734d6-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.506790 4962 generic.go:334] "Generic (PLEG): container finished" podID="6a3b21d4-fd07-4c20-8004-6271bae734d6" containerID="4fc008478a21e51efcd772d61178aba3de9d30470c7e4a46707095cc1e70f3d2" exitCode=0 Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.506828 4962 generic.go:334] "Generic (PLEG): container finished" podID="6a3b21d4-fd07-4c20-8004-6271bae734d6" containerID="89e74aede1b966c44015404bf89779791e340a93bbd902f592ff52b3fc627101" exitCode=143 Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.506866 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.506895 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6a3b21d4-fd07-4c20-8004-6271bae734d6","Type":"ContainerDied","Data":"4fc008478a21e51efcd772d61178aba3de9d30470c7e4a46707095cc1e70f3d2"} Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.506941 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6a3b21d4-fd07-4c20-8004-6271bae734d6","Type":"ContainerDied","Data":"89e74aede1b966c44015404bf89779791e340a93bbd902f592ff52b3fc627101"} Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.506955 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6a3b21d4-fd07-4c20-8004-6271bae734d6","Type":"ContainerDied","Data":"082f028347f55fa958bb1f2ef5b937a07867f9e1bf255d2d26d20717871d5286"} Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.506972 4962 scope.go:117] "RemoveContainer" containerID="4fc008478a21e51efcd772d61178aba3de9d30470c7e4a46707095cc1e70f3d2" Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.543017 4962 scope.go:117] "RemoveContainer" containerID="89e74aede1b966c44015404bf89779791e340a93bbd902f592ff52b3fc627101" Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.554725 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.576062 4962 scope.go:117] "RemoveContainer" containerID="4fc008478a21e51efcd772d61178aba3de9d30470c7e4a46707095cc1e70f3d2" Oct 03 13:14:53 crc kubenswrapper[4962]: E1003 13:14:53.576890 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fc008478a21e51efcd772d61178aba3de9d30470c7e4a46707095cc1e70f3d2\": container with ID starting with 4fc008478a21e51efcd772d61178aba3de9d30470c7e4a46707095cc1e70f3d2 not found: ID does not exist" containerID="4fc008478a21e51efcd772d61178aba3de9d30470c7e4a46707095cc1e70f3d2" Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.576921 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fc008478a21e51efcd772d61178aba3de9d30470c7e4a46707095cc1e70f3d2"} err="failed to get container status \"4fc008478a21e51efcd772d61178aba3de9d30470c7e4a46707095cc1e70f3d2\": rpc error: code = NotFound desc = could not find container \"4fc008478a21e51efcd772d61178aba3de9d30470c7e4a46707095cc1e70f3d2\": container with ID starting with 4fc008478a21e51efcd772d61178aba3de9d30470c7e4a46707095cc1e70f3d2 not found: ID does not exist" Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.576941 4962 scope.go:117] "RemoveContainer" containerID="89e74aede1b966c44015404bf89779791e340a93bbd902f592ff52b3fc627101" Oct 03 13:14:53 crc kubenswrapper[4962]: E1003 13:14:53.577268 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89e74aede1b966c44015404bf89779791e340a93bbd902f592ff52b3fc627101\": container with ID starting with 89e74aede1b966c44015404bf89779791e340a93bbd902f592ff52b3fc627101 not found: ID does not exist" containerID="89e74aede1b966c44015404bf89779791e340a93bbd902f592ff52b3fc627101" Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.577293 4962 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"89e74aede1b966c44015404bf89779791e340a93bbd902f592ff52b3fc627101"} err="failed to get container status \"89e74aede1b966c44015404bf89779791e340a93bbd902f592ff52b3fc627101\": rpc error: code = NotFound desc = could not find container \"89e74aede1b966c44015404bf89779791e340a93bbd902f592ff52b3fc627101\": container with ID starting with 89e74aede1b966c44015404bf89779791e340a93bbd902f592ff52b3fc627101 not found: ID does not exist" Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.577311 4962 scope.go:117] "RemoveContainer" containerID="4fc008478a21e51efcd772d61178aba3de9d30470c7e4a46707095cc1e70f3d2" Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.577568 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fc008478a21e51efcd772d61178aba3de9d30470c7e4a46707095cc1e70f3d2"} err="failed to get container status \"4fc008478a21e51efcd772d61178aba3de9d30470c7e4a46707095cc1e70f3d2\": rpc error: code = NotFound desc = could not find container \"4fc008478a21e51efcd772d61178aba3de9d30470c7e4a46707095cc1e70f3d2\": container with ID starting with 4fc008478a21e51efcd772d61178aba3de9d30470c7e4a46707095cc1e70f3d2 not found: ID does not exist" Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.577616 4962 scope.go:117] "RemoveContainer" containerID="89e74aede1b966c44015404bf89779791e340a93bbd902f592ff52b3fc627101" Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.577893 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89e74aede1b966c44015404bf89779791e340a93bbd902f592ff52b3fc627101"} err="failed to get container status \"89e74aede1b966c44015404bf89779791e340a93bbd902f592ff52b3fc627101\": rpc error: code = NotFound desc = could not find container \"89e74aede1b966c44015404bf89779791e340a93bbd902f592ff52b3fc627101\": container with ID starting with 89e74aede1b966c44015404bf89779791e340a93bbd902f592ff52b3fc627101 not found: ID does not exist" Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.578554 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.594305 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 03 13:14:53 crc kubenswrapper[4962]: E1003 13:14:53.594747 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a3b21d4-fd07-4c20-8004-6271bae734d6" containerName="nova-metadata-metadata" Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.594768 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a3b21d4-fd07-4c20-8004-6271bae734d6" containerName="nova-metadata-metadata" Oct 03 13:14:53 crc kubenswrapper[4962]: E1003 13:14:53.594799 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a3b21d4-fd07-4c20-8004-6271bae734d6" containerName="nova-metadata-log" Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.594806 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a3b21d4-fd07-4c20-8004-6271bae734d6" containerName="nova-metadata-log" Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.594983 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a3b21d4-fd07-4c20-8004-6271bae734d6" containerName="nova-metadata-log" Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.595017 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a3b21d4-fd07-4c20-8004-6271bae734d6" containerName="nova-metadata-metadata" Oct 03 13:14:53 crc 
kubenswrapper[4962]: I1003 13:14:53.596068 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.600726 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.604080 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.604120 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.730644 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d-logs\") pod \"nova-metadata-0\" (UID: \"2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d\") " pod="openstack/nova-metadata-0" Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.731023 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht9wh\" (UniqueName: \"kubernetes.io/projected/2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d-kube-api-access-ht9wh\") pod \"nova-metadata-0\" (UID: \"2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d\") " pod="openstack/nova-metadata-0" Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.731103 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d\") " pod="openstack/nova-metadata-0" Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.731129 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d\") " pod="openstack/nova-metadata-0" Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.731279 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d-config-data\") pod \"nova-metadata-0\" (UID: \"2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d\") " pod="openstack/nova-metadata-0" Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.833338 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d-config-data\") pod \"nova-metadata-0\" (UID: \"2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d\") " pod="openstack/nova-metadata-0" Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.833409 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d-logs\") pod \"nova-metadata-0\" (UID: \"2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d\") " pod="openstack/nova-metadata-0" Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.833457 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht9wh\" (UniqueName: \"kubernetes.io/projected/2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d-kube-api-access-ht9wh\") pod \"nova-metadata-0\" 
(UID: \"2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d\") " pod="openstack/nova-metadata-0" Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.833490 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d\") " pod="openstack/nova-metadata-0" Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.833524 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d\") " pod="openstack/nova-metadata-0" Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.834261 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d-logs\") pod \"nova-metadata-0\" (UID: \"2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d\") " pod="openstack/nova-metadata-0" Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.837315 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d-config-data\") pod \"nova-metadata-0\" (UID: \"2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d\") " pod="openstack/nova-metadata-0" Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.838110 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d\") " pod="openstack/nova-metadata-0" Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.839066 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d\") " pod="openstack/nova-metadata-0" Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.862579 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht9wh\" (UniqueName: \"kubernetes.io/projected/2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d-kube-api-access-ht9wh\") pod \"nova-metadata-0\" (UID: \"2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d\") " pod="openstack/nova-metadata-0" Oct 03 13:14:53 crc kubenswrapper[4962]: I1003 13:14:53.919020 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 13:14:54 crc kubenswrapper[4962]: I1003 13:14:54.249773 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a3b21d4-fd07-4c20-8004-6271bae734d6" path="/var/lib/kubelet/pods/6a3b21d4-fd07-4c20-8004-6271bae734d6/volumes" Oct 03 13:14:54 crc kubenswrapper[4962]: I1003 13:14:54.361289 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 13:14:54 crc kubenswrapper[4962]: W1003 13:14:54.371153 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2efa1ef6_5df5_4f5c_8a9d_e32460f07a7d.slice/crio-25f20a693b00f3a36989991c1e280d0ed2b975f52ab55938671e5219b97a3a15 WatchSource:0}: Error finding container 25f20a693b00f3a36989991c1e280d0ed2b975f52ab55938671e5219b97a3a15: Status 404 returned error can't find the container with id 25f20a693b00f3a36989991c1e280d0ed2b975f52ab55938671e5219b97a3a15 Oct 03 13:14:54 crc kubenswrapper[4962]: I1003 13:14:54.514695 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d","Type":"ContainerStarted","Data":"25f20a693b00f3a36989991c1e280d0ed2b975f52ab55938671e5219b97a3a15"} Oct 03 13:14:54 crc kubenswrapper[4962]: I1003 13:14:54.660127 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 13:14:54 crc kubenswrapper[4962]: I1003 13:14:54.660194 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 13:14:54 crc kubenswrapper[4962]: I1003 13:14:54.660250 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-46vck" Oct 03 13:14:54 crc kubenswrapper[4962]: I1003 13:14:54.660987 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a5bec38aec5fb9d5911cd2ecc57792eefe10d43cef3b74d0517cf307d832bc5c"} pod="openshift-machine-config-operator/machine-config-daemon-46vck" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 13:14:54 crc kubenswrapper[4962]: I1003 13:14:54.661040 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" containerID="cri-o://a5bec38aec5fb9d5911cd2ecc57792eefe10d43cef3b74d0517cf307d832bc5c" gracePeriod=600 Oct 03 13:14:54 crc kubenswrapper[4962]: E1003 13:14:54.798511 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" 
podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:14:55 crc kubenswrapper[4962]: I1003 13:14:55.528271 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d","Type":"ContainerStarted","Data":"8d7e466af7add0ec2a7f42f27b591c61fbe1ab586139abb2dc3f35a3d6904f57"} Oct 03 13:14:55 crc kubenswrapper[4962]: I1003 13:14:55.528708 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d","Type":"ContainerStarted","Data":"cb892e4cf68977e199a8ede645fb1658a56a41006d664a84b9c8b34cefb35894"} Oct 03 13:14:55 crc kubenswrapper[4962]: I1003 13:14:55.532365 4962 generic.go:334] "Generic (PLEG): container finished" podID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerID="a5bec38aec5fb9d5911cd2ecc57792eefe10d43cef3b74d0517cf307d832bc5c" exitCode=0 Oct 03 13:14:55 crc kubenswrapper[4962]: I1003 13:14:55.532403 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerDied","Data":"a5bec38aec5fb9d5911cd2ecc57792eefe10d43cef3b74d0517cf307d832bc5c"} Oct 03 13:14:55 crc kubenswrapper[4962]: I1003 13:14:55.532460 4962 scope.go:117] "RemoveContainer" containerID="ca8ebb170cb5bf8155325e7ea7c1aa3487bc412d9472e208bb48495e13806d06" Oct 03 13:14:55 crc kubenswrapper[4962]: I1003 13:14:55.533201 4962 scope.go:117] "RemoveContainer" containerID="a5bec38aec5fb9d5911cd2ecc57792eefe10d43cef3b74d0517cf307d832bc5c" Oct 03 13:14:55 crc kubenswrapper[4962]: E1003 13:14:55.533530 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:14:55 crc kubenswrapper[4962]: I1003 13:14:55.565935 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.565912372 podStartE2EDuration="2.565912372s" podCreationTimestamp="2025-10-03 13:14:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:14:55.557539518 +0000 UTC m=+1503.961437373" watchObservedRunningTime="2025-10-03 13:14:55.565912372 +0000 UTC m=+1503.969810217" Oct 03 13:14:55 crc kubenswrapper[4962]: I1003 13:14:55.845026 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 03 13:14:56 crc kubenswrapper[4962]: I1003 13:14:56.542194 4962 generic.go:334] "Generic (PLEG): container finished" podID="7768c654-4c5b-44f0-944d-4c4507a252b3" containerID="289ae55c3810fc3fdbc0b8d2aaf50cec4df133730555ad763e84eaf124304306" exitCode=0 Oct 03 13:14:56 crc kubenswrapper[4962]: I1003 13:14:56.542526 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-c6gvp" event={"ID":"7768c654-4c5b-44f0-944d-4c4507a252b3","Type":"ContainerDied","Data":"289ae55c3810fc3fdbc0b8d2aaf50cec4df133730555ad763e84eaf124304306"} Oct 03 13:14:56 crc kubenswrapper[4962]: I1003 13:14:56.547127 4962 generic.go:334] "Generic (PLEG): container finished" podID="67400081-5ed3-48dc-be64-b3cf19bcf3c4" 
containerID="0135cfd571ce2874c84e154f88d2566b922232ee0a4046bc559e2dcc8e3cb67b" exitCode=0 Oct 03 13:14:56 crc kubenswrapper[4962]: I1003 13:14:56.547337 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-tkw44" event={"ID":"67400081-5ed3-48dc-be64-b3cf19bcf3c4","Type":"ContainerDied","Data":"0135cfd571ce2874c84e154f88d2566b922232ee0a4046bc559e2dcc8e3cb67b"} Oct 03 13:14:57 crc kubenswrapper[4962]: I1003 13:14:57.760377 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 13:14:57 crc kubenswrapper[4962]: I1003 13:14:57.760793 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 13:14:57 crc kubenswrapper[4962]: I1003 13:14:57.773588 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 03 13:14:57 crc kubenswrapper[4962]: I1003 13:14:57.808940 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.037860 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-845d6d6f59-rq9zm" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.072130 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-c6gvp" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.085101 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-tkw44" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.116884 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-tgc99"] Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.117172 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5784cf869f-tgc99" podUID="143264d1-7bcc-47d8-aa73-047017954ff4" containerName="dnsmasq-dns" containerID="cri-o://11088df8f83d28c23c00a477e48a59fc59a6392d7d1dfee3bdd6b3d0b252f17d" gracePeriod=10 Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.232892 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67400081-5ed3-48dc-be64-b3cf19bcf3c4-config-data\") pod \"67400081-5ed3-48dc-be64-b3cf19bcf3c4\" (UID: \"67400081-5ed3-48dc-be64-b3cf19bcf3c4\") " Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.232994 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7768c654-4c5b-44f0-944d-4c4507a252b3-config-data\") pod \"7768c654-4c5b-44f0-944d-4c4507a252b3\" (UID: \"7768c654-4c5b-44f0-944d-4c4507a252b3\") " Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.233036 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7768c654-4c5b-44f0-944d-4c4507a252b3-combined-ca-bundle\") pod \"7768c654-4c5b-44f0-944d-4c4507a252b3\" (UID: \"7768c654-4c5b-44f0-944d-4c4507a252b3\") " Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.233155 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9xlm\" (UniqueName: \"kubernetes.io/projected/67400081-5ed3-48dc-be64-b3cf19bcf3c4-kube-api-access-k9xlm\") pod \"67400081-5ed3-48dc-be64-b3cf19bcf3c4\" (UID: 
\"67400081-5ed3-48dc-be64-b3cf19bcf3c4\") " Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.233182 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67400081-5ed3-48dc-be64-b3cf19bcf3c4-scripts\") pod \"67400081-5ed3-48dc-be64-b3cf19bcf3c4\" (UID: \"67400081-5ed3-48dc-be64-b3cf19bcf3c4\") " Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.233259 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67400081-5ed3-48dc-be64-b3cf19bcf3c4-combined-ca-bundle\") pod \"67400081-5ed3-48dc-be64-b3cf19bcf3c4\" (UID: \"67400081-5ed3-48dc-be64-b3cf19bcf3c4\") " Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.233302 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slkpv\" (UniqueName: \"kubernetes.io/projected/7768c654-4c5b-44f0-944d-4c4507a252b3-kube-api-access-slkpv\") pod \"7768c654-4c5b-44f0-944d-4c4507a252b3\" (UID: \"7768c654-4c5b-44f0-944d-4c4507a252b3\") " Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.233355 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7768c654-4c5b-44f0-944d-4c4507a252b3-scripts\") pod \"7768c654-4c5b-44f0-944d-4c4507a252b3\" (UID: \"7768c654-4c5b-44f0-944d-4c4507a252b3\") " Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.246995 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67400081-5ed3-48dc-be64-b3cf19bcf3c4-kube-api-access-k9xlm" (OuterVolumeSpecName: "kube-api-access-k9xlm") pod "67400081-5ed3-48dc-be64-b3cf19bcf3c4" (UID: "67400081-5ed3-48dc-be64-b3cf19bcf3c4"). InnerVolumeSpecName "kube-api-access-k9xlm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.271871 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7768c654-4c5b-44f0-944d-4c4507a252b3-scripts" (OuterVolumeSpecName: "scripts") pod "7768c654-4c5b-44f0-944d-4c4507a252b3" (UID: "7768c654-4c5b-44f0-944d-4c4507a252b3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.276079 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67400081-5ed3-48dc-be64-b3cf19bcf3c4-scripts" (OuterVolumeSpecName: "scripts") pod "67400081-5ed3-48dc-be64-b3cf19bcf3c4" (UID: "67400081-5ed3-48dc-be64-b3cf19bcf3c4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.276377 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7768c654-4c5b-44f0-944d-4c4507a252b3-kube-api-access-slkpv" (OuterVolumeSpecName: "kube-api-access-slkpv") pod "7768c654-4c5b-44f0-944d-4c4507a252b3" (UID: "7768c654-4c5b-44f0-944d-4c4507a252b3"). InnerVolumeSpecName "kube-api-access-slkpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.310860 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67400081-5ed3-48dc-be64-b3cf19bcf3c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "67400081-5ed3-48dc-be64-b3cf19bcf3c4" (UID: "67400081-5ed3-48dc-be64-b3cf19bcf3c4"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.313184 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67400081-5ed3-48dc-be64-b3cf19bcf3c4-config-data" (OuterVolumeSpecName: "config-data") pod "67400081-5ed3-48dc-be64-b3cf19bcf3c4" (UID: "67400081-5ed3-48dc-be64-b3cf19bcf3c4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.336627 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9xlm\" (UniqueName: \"kubernetes.io/projected/67400081-5ed3-48dc-be64-b3cf19bcf3c4-kube-api-access-k9xlm\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.336671 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67400081-5ed3-48dc-be64-b3cf19bcf3c4-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.336681 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67400081-5ed3-48dc-be64-b3cf19bcf3c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.336690 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slkpv\" (UniqueName: \"kubernetes.io/projected/7768c654-4c5b-44f0-944d-4c4507a252b3-kube-api-access-slkpv\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.336700 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7768c654-4c5b-44f0-944d-4c4507a252b3-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.336710 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67400081-5ed3-48dc-be64-b3cf19bcf3c4-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.337999 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7768c654-4c5b-44f0-944d-4c4507a252b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7768c654-4c5b-44f0-944d-4c4507a252b3" (UID: "7768c654-4c5b-44f0-944d-4c4507a252b3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.351586 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7768c654-4c5b-44f0-944d-4c4507a252b3-config-data" (OuterVolumeSpecName: "config-data") pod "7768c654-4c5b-44f0-944d-4c4507a252b3" (UID: "7768c654-4c5b-44f0-944d-4c4507a252b3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.438350 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7768c654-4c5b-44f0-944d-4c4507a252b3-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.438385 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7768c654-4c5b-44f0-944d-4c4507a252b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.579091 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-tkw44" event={"ID":"67400081-5ed3-48dc-be64-b3cf19bcf3c4","Type":"ContainerDied","Data":"587d624057241591ff6368b0c2e78f5d5a1358265413d54a89ed1db573030ed1"} Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.579133 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="587d624057241591ff6368b0c2e78f5d5a1358265413d54a89ed1db573030ed1" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.579338 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-tkw44" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.588266 4962 generic.go:334] "Generic (PLEG): container finished" podID="143264d1-7bcc-47d8-aa73-047017954ff4" containerID="11088df8f83d28c23c00a477e48a59fc59a6392d7d1dfee3bdd6b3d0b252f17d" exitCode=0 Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.588338 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-tgc99" event={"ID":"143264d1-7bcc-47d8-aa73-047017954ff4","Type":"ContainerDied","Data":"11088df8f83d28c23c00a477e48a59fc59a6392d7d1dfee3bdd6b3d0b252f17d"} Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.591880 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-c6gvp" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.595420 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-c6gvp" event={"ID":"7768c654-4c5b-44f0-944d-4c4507a252b3","Type":"ContainerDied","Data":"71ac8cc085f4132583c17249b9f84cc4f8703d1f85ce678333204137c65ea8bc"} Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.595463 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71ac8cc085f4132583c17249b9f84cc4f8703d1f85ce678333204137c65ea8bc" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.634846 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.638678 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-tgc99" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.689947 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 13:14:58 crc kubenswrapper[4962]: E1003 13:14:58.690598 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67400081-5ed3-48dc-be64-b3cf19bcf3c4" containerName="nova-cell1-conductor-db-sync" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.690615 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="67400081-5ed3-48dc-be64-b3cf19bcf3c4" containerName="nova-cell1-conductor-db-sync" Oct 03 13:14:58 crc kubenswrapper[4962]: E1003 13:14:58.690626 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="143264d1-7bcc-47d8-aa73-047017954ff4" containerName="init" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.690647 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="143264d1-7bcc-47d8-aa73-047017954ff4" containerName="init" Oct 03 13:14:58 crc kubenswrapper[4962]: E1003 13:14:58.690666 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7768c654-4c5b-44f0-944d-4c4507a252b3" containerName="nova-manage" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.690672 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="7768c654-4c5b-44f0-944d-4c4507a252b3" containerName="nova-manage" Oct 03 13:14:58 crc kubenswrapper[4962]: E1003 13:14:58.690689 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="143264d1-7bcc-47d8-aa73-047017954ff4" containerName="dnsmasq-dns" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.690695 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="143264d1-7bcc-47d8-aa73-047017954ff4" containerName="dnsmasq-dns" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.690865 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="143264d1-7bcc-47d8-aa73-047017954ff4" containerName="dnsmasq-dns" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.690879 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="7768c654-4c5b-44f0-944d-4c4507a252b3" containerName="nova-manage" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.690900 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="67400081-5ed3-48dc-be64-b3cf19bcf3c4" containerName="nova-cell1-conductor-db-sync" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.691446 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.699238 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.723717 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.743052 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7p2n\" (UniqueName: \"kubernetes.io/projected/143264d1-7bcc-47d8-aa73-047017954ff4-kube-api-access-f7p2n\") pod \"143264d1-7bcc-47d8-aa73-047017954ff4\" (UID: \"143264d1-7bcc-47d8-aa73-047017954ff4\") " Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.743165 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/143264d1-7bcc-47d8-aa73-047017954ff4-ovsdbserver-nb\") pod \"143264d1-7bcc-47d8-aa73-047017954ff4\" (UID: \"143264d1-7bcc-47d8-aa73-047017954ff4\") " Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.743261 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/143264d1-7bcc-47d8-aa73-047017954ff4-ovsdbserver-sb\") pod \"143264d1-7bcc-47d8-aa73-047017954ff4\" (UID: \"143264d1-7bcc-47d8-aa73-047017954ff4\") " Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.743283 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/143264d1-7bcc-47d8-aa73-047017954ff4-config\") pod \"143264d1-7bcc-47d8-aa73-047017954ff4\" (UID: \"143264d1-7bcc-47d8-aa73-047017954ff4\") " Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.743302 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/143264d1-7bcc-47d8-aa73-047017954ff4-dns-swift-storage-0\") pod \"143264d1-7bcc-47d8-aa73-047017954ff4\" (UID: \"143264d1-7bcc-47d8-aa73-047017954ff4\") " Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.743378 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/143264d1-7bcc-47d8-aa73-047017954ff4-dns-svc\") pod \"143264d1-7bcc-47d8-aa73-047017954ff4\" (UID: \"143264d1-7bcc-47d8-aa73-047017954ff4\") " Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.756077 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/143264d1-7bcc-47d8-aa73-047017954ff4-kube-api-access-f7p2n" (OuterVolumeSpecName: "kube-api-access-f7p2n") pod "143264d1-7bcc-47d8-aa73-047017954ff4" (UID: "143264d1-7bcc-47d8-aa73-047017954ff4"). InnerVolumeSpecName "kube-api-access-f7p2n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.808849 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0863fcf4-ac99-4685-a482-08e2a3409f4d" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.180:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.811715 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/143264d1-7bcc-47d8-aa73-047017954ff4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "143264d1-7bcc-47d8-aa73-047017954ff4" (UID: "143264d1-7bcc-47d8-aa73-047017954ff4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.817405 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/143264d1-7bcc-47d8-aa73-047017954ff4-config" (OuterVolumeSpecName: "config") pod "143264d1-7bcc-47d8-aa73-047017954ff4" (UID: "143264d1-7bcc-47d8-aa73-047017954ff4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.818292 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/143264d1-7bcc-47d8-aa73-047017954ff4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "143264d1-7bcc-47d8-aa73-047017954ff4" (UID: "143264d1-7bcc-47d8-aa73-047017954ff4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.837428 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/143264d1-7bcc-47d8-aa73-047017954ff4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "143264d1-7bcc-47d8-aa73-047017954ff4" (UID: "143264d1-7bcc-47d8-aa73-047017954ff4"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.845823 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d22955d6-a957-458f-8181-5fea18cedc90-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"d22955d6-a957-458f-8181-5fea18cedc90\") " pod="openstack/nova-cell1-conductor-0" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.845953 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d22955d6-a957-458f-8181-5fea18cedc90-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"d22955d6-a957-458f-8181-5fea18cedc90\") " pod="openstack/nova-cell1-conductor-0" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.845984 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwbm5\" (UniqueName: \"kubernetes.io/projected/d22955d6-a957-458f-8181-5fea18cedc90-kube-api-access-rwbm5\") pod \"nova-cell1-conductor-0\" (UID: \"d22955d6-a957-458f-8181-5fea18cedc90\") " pod="openstack/nova-cell1-conductor-0" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.846073 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/143264d1-7bcc-47d8-aa73-047017954ff4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.846086 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/143264d1-7bcc-47d8-aa73-047017954ff4-config\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.846095 4962 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/143264d1-7bcc-47d8-aa73-047017954ff4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.846103 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/143264d1-7bcc-47d8-aa73-047017954ff4-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.846111 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7p2n\" (UniqueName: \"kubernetes.io/projected/143264d1-7bcc-47d8-aa73-047017954ff4-kube-api-access-f7p2n\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.846289 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/143264d1-7bcc-47d8-aa73-047017954ff4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "143264d1-7bcc-47d8-aa73-047017954ff4" (UID: "143264d1-7bcc-47d8-aa73-047017954ff4"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.851232 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0863fcf4-ac99-4685-a482-08e2a3409f4d" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.180:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.891558 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.891804 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0863fcf4-ac99-4685-a482-08e2a3409f4d" containerName="nova-api-log" containerID="cri-o://4d07108b0f72ed3f3f47d8bf2c579a7794c61bf89739b389232682126f154d11" gracePeriod=30 Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.891898 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0863fcf4-ac99-4685-a482-08e2a3409f4d" containerName="nova-api-api" containerID="cri-o://44155ad59a3bdf5bb854fbb4c0744bb3fd660fd870b7e075d9e28970f5dfa273" gracePeriod=30 Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.919451 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.919506 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.929276 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.947600 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d22955d6-a957-458f-8181-5fea18cedc90-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"d22955d6-a957-458f-8181-5fea18cedc90\") " pod="openstack/nova-cell1-conductor-0" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.947735 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwbm5\" (UniqueName: \"kubernetes.io/projected/d22955d6-a957-458f-8181-5fea18cedc90-kube-api-access-rwbm5\") pod \"nova-cell1-conductor-0\" (UID: \"d22955d6-a957-458f-8181-5fea18cedc90\") " pod="openstack/nova-cell1-conductor-0" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.947825 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d22955d6-a957-458f-8181-5fea18cedc90-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"d22955d6-a957-458f-8181-5fea18cedc90\") " pod="openstack/nova-cell1-conductor-0" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.947879 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/143264d1-7bcc-47d8-aa73-047017954ff4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.954381 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d22955d6-a957-458f-8181-5fea18cedc90-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"d22955d6-a957-458f-8181-5fea18cedc90\") " pod="openstack/nova-cell1-conductor-0" Oct 03 13:14:58 crc kubenswrapper[4962]: 
I1003 13:14:58.964466 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d22955d6-a957-458f-8181-5fea18cedc90-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"d22955d6-a957-458f-8181-5fea18cedc90\") " pod="openstack/nova-cell1-conductor-0" Oct 03 13:14:58 crc kubenswrapper[4962]: I1003 13:14:58.966859 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwbm5\" (UniqueName: \"kubernetes.io/projected/d22955d6-a957-458f-8181-5fea18cedc90-kube-api-access-rwbm5\") pod \"nova-cell1-conductor-0\" (UID: \"d22955d6-a957-458f-8181-5fea18cedc90\") " pod="openstack/nova-cell1-conductor-0" Oct 03 13:14:59 crc kubenswrapper[4962]: I1003 13:14:59.017296 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 03 13:14:59 crc kubenswrapper[4962]: I1003 13:14:59.461318 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 13:14:59 crc kubenswrapper[4962]: I1003 13:14:59.603706 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-tgc99" event={"ID":"143264d1-7bcc-47d8-aa73-047017954ff4","Type":"ContainerDied","Data":"da873234757532eac321f4323009fc887086ba08c6dca1ac0f713be079b2c000"} Oct 03 13:14:59 crc kubenswrapper[4962]: I1003 13:14:59.603760 4962 scope.go:117] "RemoveContainer" containerID="11088df8f83d28c23c00a477e48a59fc59a6392d7d1dfee3bdd6b3d0b252f17d" Oct 03 13:14:59 crc kubenswrapper[4962]: I1003 13:14:59.603892 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-tgc99" Oct 03 13:14:59 crc kubenswrapper[4962]: I1003 13:14:59.619794 4962 generic.go:334] "Generic (PLEG): container finished" podID="0863fcf4-ac99-4685-a482-08e2a3409f4d" containerID="4d07108b0f72ed3f3f47d8bf2c579a7794c61bf89739b389232682126f154d11" exitCode=143 Oct 03 13:14:59 crc kubenswrapper[4962]: I1003 13:14:59.619855 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0863fcf4-ac99-4685-a482-08e2a3409f4d","Type":"ContainerDied","Data":"4d07108b0f72ed3f3f47d8bf2c579a7794c61bf89739b389232682126f154d11"} Oct 03 13:14:59 crc kubenswrapper[4962]: I1003 13:14:59.620198 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d" containerName="nova-metadata-log" containerID="cri-o://cb892e4cf68977e199a8ede645fb1658a56a41006d664a84b9c8b34cefb35894" gracePeriod=30 Oct 03 13:14:59 crc kubenswrapper[4962]: I1003 13:14:59.621379 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d" containerName="nova-metadata-metadata" containerID="cri-o://8d7e466af7add0ec2a7f42f27b591c61fbe1ab586139abb2dc3f35a3d6904f57" gracePeriod=30 Oct 03 13:14:59 crc kubenswrapper[4962]: I1003 13:14:59.644469 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 13:14:59 crc kubenswrapper[4962]: I1003 13:14:59.654289 4962 scope.go:117] "RemoveContainer" containerID="183854e4e4df735b62ad5a86e902e0394b30db0b0f4ce13143287ec7a88d4fca" Oct 03 13:14:59 crc kubenswrapper[4962]: I1003 13:14:59.702975 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-tgc99"] Oct 03 13:14:59 crc kubenswrapper[4962]: I1003 13:14:59.719198 4962 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-tgc99"] Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.155655 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324955-9lcjk"] Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.157891 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324955-9lcjk" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.159746 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.160066 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.193695 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324955-9lcjk"] Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.202285 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.202507 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="dcbfa307-0a06-44c4-97f5-e25b6fdc50d5" containerName="kube-state-metrics" containerID="cri-o://0f9f90f31810e9383bb8a842268d8dbcd02ea5d95f5c3bb8342e9bd294e72801" gracePeriod=30 Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.253874 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="143264d1-7bcc-47d8-aa73-047017954ff4" path="/var/lib/kubelet/pods/143264d1-7bcc-47d8-aa73-047017954ff4/volumes" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.301343 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a089db49-8a99-46c5-99ca-732f3aa1168d-config-volume\") pod \"collect-profiles-29324955-9lcjk\" (UID: \"a089db49-8a99-46c5-99ca-732f3aa1168d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324955-9lcjk" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.301600 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wmnm\" (UniqueName: \"kubernetes.io/projected/a089db49-8a99-46c5-99ca-732f3aa1168d-kube-api-access-7wmnm\") pod \"collect-profiles-29324955-9lcjk\" (UID: \"a089db49-8a99-46c5-99ca-732f3aa1168d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324955-9lcjk" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.301943 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a089db49-8a99-46c5-99ca-732f3aa1168d-secret-volume\") pod \"collect-profiles-29324955-9lcjk\" (UID: \"a089db49-8a99-46c5-99ca-732f3aa1168d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324955-9lcjk" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.359718 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.404075 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wmnm\" (UniqueName: \"kubernetes.io/projected/a089db49-8a99-46c5-99ca-732f3aa1168d-kube-api-access-7wmnm\") pod \"collect-profiles-29324955-9lcjk\" (UID: \"a089db49-8a99-46c5-99ca-732f3aa1168d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324955-9lcjk" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.404200 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a089db49-8a99-46c5-99ca-732f3aa1168d-secret-volume\") pod \"collect-profiles-29324955-9lcjk\" (UID: \"a089db49-8a99-46c5-99ca-732f3aa1168d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324955-9lcjk" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.404273 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a089db49-8a99-46c5-99ca-732f3aa1168d-config-volume\") pod \"collect-profiles-29324955-9lcjk\" (UID: \"a089db49-8a99-46c5-99ca-732f3aa1168d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324955-9lcjk" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.407620 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a089db49-8a99-46c5-99ca-732f3aa1168d-config-volume\") pod \"collect-profiles-29324955-9lcjk\" (UID: \"a089db49-8a99-46c5-99ca-732f3aa1168d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324955-9lcjk" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.418728 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a089db49-8a99-46c5-99ca-732f3aa1168d-secret-volume\") pod \"collect-profiles-29324955-9lcjk\" (UID: \"a089db49-8a99-46c5-99ca-732f3aa1168d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324955-9lcjk" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.430166 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wmnm\" (UniqueName: \"kubernetes.io/projected/a089db49-8a99-46c5-99ca-732f3aa1168d-kube-api-access-7wmnm\") pod \"collect-profiles-29324955-9lcjk\" (UID: \"a089db49-8a99-46c5-99ca-732f3aa1168d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324955-9lcjk" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.506517 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ht9wh\" (UniqueName: \"kubernetes.io/projected/2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d-kube-api-access-ht9wh\") pod \"2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d\" (UID: \"2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d\") " Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.506790 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d-combined-ca-bundle\") pod \"2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d\" (UID: \"2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d\") " Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.506887 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d-nova-metadata-tls-certs\") pod \"2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d\" (UID: \"2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d\") " Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.507018 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d-logs\") pod \"2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d\" (UID: \"2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d\") " Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.507089 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d-config-data\") pod \"2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d\" (UID: \"2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d\") " Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.510776 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d-logs" (OuterVolumeSpecName: "logs") pod "2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d" (UID: "2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.522217 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d-kube-api-access-ht9wh" (OuterVolumeSpecName: "kube-api-access-ht9wh") pod "2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d" (UID: "2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d"). InnerVolumeSpecName "kube-api-access-ht9wh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.537382 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d" (UID: "2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.547869 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d-config-data" (OuterVolumeSpecName: "config-data") pod "2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d" (UID: "2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.570219 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d" (UID: "2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.595280 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.613906 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.615881 4962 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.615905 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d-logs\") on node \"crc\" DevicePath \"\"" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.615939 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.615955 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ht9wh\" (UniqueName: \"kubernetes.io/projected/2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d-kube-api-access-ht9wh\") on node \"crc\" DevicePath \"\"" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.643939 4962 generic.go:334] "Generic (PLEG): container finished" podID="dcbfa307-0a06-44c4-97f5-e25b6fdc50d5" containerID="0f9f90f31810e9383bb8a842268d8dbcd02ea5d95f5c3bb8342e9bd294e72801" exitCode=2 Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.644007 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"dcbfa307-0a06-44c4-97f5-e25b6fdc50d5","Type":"ContainerDied","Data":"0f9f90f31810e9383bb8a842268d8dbcd02ea5d95f5c3bb8342e9bd294e72801"} Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.644041 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"dcbfa307-0a06-44c4-97f5-e25b6fdc50d5","Type":"ContainerDied","Data":"a78da3cf57fe279ef38d3f8756941cadfb67487891686452ed9deb81f7028826"} Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.644062 4962 scope.go:117] "RemoveContainer" containerID="0f9f90f31810e9383bb8a842268d8dbcd02ea5d95f5c3bb8342e9bd294e72801" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.644183 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.653583 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"d22955d6-a957-458f-8181-5fea18cedc90","Type":"ContainerStarted","Data":"0ce8b4a4ac9d0b8cc21e8feaa51be48f8d9e21fc13fb4c98b22efe982cb9565b"} Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.653625 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"d22955d6-a957-458f-8181-5fea18cedc90","Type":"ContainerStarted","Data":"f72fdcf624171197a72f081f96097a602fb6bd22864a9a806672b4ec38418965"} Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.654509 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.666110 4962 generic.go:334] "Generic (PLEG): container finished" podID="2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d" containerID="8d7e466af7add0ec2a7f42f27b591c61fbe1ab586139abb2dc3f35a3d6904f57" exitCode=0 Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.666162 4962 generic.go:334] "Generic (PLEG): container finished" podID="2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d" containerID="cb892e4cf68977e199a8ede645fb1658a56a41006d664a84b9c8b34cefb35894" exitCode=143 Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.666355 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="8199ec71-3fba-4ae9-9b74-6e41edd368aa" containerName="nova-scheduler-scheduler" containerID="cri-o://0f3e23b7882dfbb7d83de826edf2882cca09869217f1b3a126da17c0af86ecef" gracePeriod=30 Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.666883 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.668803 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d","Type":"ContainerDied","Data":"8d7e466af7add0ec2a7f42f27b591c61fbe1ab586139abb2dc3f35a3d6904f57"} Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.668864 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d","Type":"ContainerDied","Data":"cb892e4cf68977e199a8ede645fb1658a56a41006d664a84b9c8b34cefb35894"} Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.668876 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d","Type":"ContainerDied","Data":"25f20a693b00f3a36989991c1e280d0ed2b975f52ab55938671e5219b97a3a15"} Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.675181 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324955-9lcjk" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.682277 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.682262957 podStartE2EDuration="2.682262957s" podCreationTimestamp="2025-10-03 13:14:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:15:00.680327075 +0000 UTC m=+1509.084224910" watchObservedRunningTime="2025-10-03 13:15:00.682262957 +0000 UTC m=+1509.086160792" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.682403 4962 scope.go:117] "RemoveContainer" containerID="0f9f90f31810e9383bb8a842268d8dbcd02ea5d95f5c3bb8342e9bd294e72801" Oct 03 13:15:00 crc kubenswrapper[4962]: E1003 13:15:00.684417 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f9f90f31810e9383bb8a842268d8dbcd02ea5d95f5c3bb8342e9bd294e72801\": container with ID starting with 0f9f90f31810e9383bb8a842268d8dbcd02ea5d95f5c3bb8342e9bd294e72801 not found: ID does not exist" containerID="0f9f90f31810e9383bb8a842268d8dbcd02ea5d95f5c3bb8342e9bd294e72801" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.684446 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f9f90f31810e9383bb8a842268d8dbcd02ea5d95f5c3bb8342e9bd294e72801"} err="failed to get container status \"0f9f90f31810e9383bb8a842268d8dbcd02ea5d95f5c3bb8342e9bd294e72801\": rpc error: code = NotFound desc = could not find container \"0f9f90f31810e9383bb8a842268d8dbcd02ea5d95f5c3bb8342e9bd294e72801\": container with ID starting with 0f9f90f31810e9383bb8a842268d8dbcd02ea5d95f5c3bb8342e9bd294e72801 not found: ID does not exist" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.684472 4962 scope.go:117] "RemoveContainer" containerID="8d7e466af7add0ec2a7f42f27b591c61fbe1ab586139abb2dc3f35a3d6904f57" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.731845 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzrnq\" (UniqueName: \"kubernetes.io/projected/dcbfa307-0a06-44c4-97f5-e25b6fdc50d5-kube-api-access-mzrnq\") pod \"dcbfa307-0a06-44c4-97f5-e25b6fdc50d5\" (UID: \"dcbfa307-0a06-44c4-97f5-e25b6fdc50d5\") " Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.740784 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcbfa307-0a06-44c4-97f5-e25b6fdc50d5-kube-api-access-mzrnq" (OuterVolumeSpecName: "kube-api-access-mzrnq") pod "dcbfa307-0a06-44c4-97f5-e25b6fdc50d5" (UID: "dcbfa307-0a06-44c4-97f5-e25b6fdc50d5"). InnerVolumeSpecName "kube-api-access-mzrnq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.743745 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.748815 4962 scope.go:117] "RemoveContainer" containerID="cb892e4cf68977e199a8ede645fb1658a56a41006d664a84b9c8b34cefb35894" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.757752 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.778691 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 03 13:15:00 crc kubenswrapper[4962]: E1003 13:15:00.779094 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d" containerName="nova-metadata-metadata" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.779109 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d" containerName="nova-metadata-metadata" Oct 03 13:15:00 crc kubenswrapper[4962]: E1003 13:15:00.779124 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcbfa307-0a06-44c4-97f5-e25b6fdc50d5" containerName="kube-state-metrics" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.779130 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcbfa307-0a06-44c4-97f5-e25b6fdc50d5" containerName="kube-state-metrics" Oct 03 13:15:00 crc kubenswrapper[4962]: E1003 13:15:00.779166 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d" containerName="nova-metadata-log" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.779172 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d" containerName="nova-metadata-log" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.779330 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcbfa307-0a06-44c4-97f5-e25b6fdc50d5" containerName="kube-state-metrics" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.779349 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d" containerName="nova-metadata-metadata" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.779361 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d" containerName="nova-metadata-log" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.780372 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.783307 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.783616 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.788351 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.796369 4962 scope.go:117] "RemoveContainer" containerID="8d7e466af7add0ec2a7f42f27b591c61fbe1ab586139abb2dc3f35a3d6904f57" Oct 03 13:15:00 crc kubenswrapper[4962]: E1003 13:15:00.801605 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d7e466af7add0ec2a7f42f27b591c61fbe1ab586139abb2dc3f35a3d6904f57\": container with ID starting with 8d7e466af7add0ec2a7f42f27b591c61fbe1ab586139abb2dc3f35a3d6904f57 not found: ID does not exist" containerID="8d7e466af7add0ec2a7f42f27b591c61fbe1ab586139abb2dc3f35a3d6904f57" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.801657 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d7e466af7add0ec2a7f42f27b591c61fbe1ab586139abb2dc3f35a3d6904f57"} err="failed to get container status \"8d7e466af7add0ec2a7f42f27b591c61fbe1ab586139abb2dc3f35a3d6904f57\": rpc error: code = NotFound desc = could not find container \"8d7e466af7add0ec2a7f42f27b591c61fbe1ab586139abb2dc3f35a3d6904f57\": container with ID starting with 8d7e466af7add0ec2a7f42f27b591c61fbe1ab586139abb2dc3f35a3d6904f57 not found: ID does not exist" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.801702 4962 scope.go:117] "RemoveContainer" containerID="cb892e4cf68977e199a8ede645fb1658a56a41006d664a84b9c8b34cefb35894" Oct 03 13:15:00 crc kubenswrapper[4962]: E1003 13:15:00.804564 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb892e4cf68977e199a8ede645fb1658a56a41006d664a84b9c8b34cefb35894\": container with ID starting with cb892e4cf68977e199a8ede645fb1658a56a41006d664a84b9c8b34cefb35894 not found: ID does not exist" containerID="cb892e4cf68977e199a8ede645fb1658a56a41006d664a84b9c8b34cefb35894" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.804605 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb892e4cf68977e199a8ede645fb1658a56a41006d664a84b9c8b34cefb35894"} err="failed to get container status \"cb892e4cf68977e199a8ede645fb1658a56a41006d664a84b9c8b34cefb35894\": rpc error: code = NotFound desc = could not find container \"cb892e4cf68977e199a8ede645fb1658a56a41006d664a84b9c8b34cefb35894\": container with ID starting with cb892e4cf68977e199a8ede645fb1658a56a41006d664a84b9c8b34cefb35894 not found: ID does not exist" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.804672 4962 scope.go:117] "RemoveContainer" containerID="8d7e466af7add0ec2a7f42f27b591c61fbe1ab586139abb2dc3f35a3d6904f57" Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.806420 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d7e466af7add0ec2a7f42f27b591c61fbe1ab586139abb2dc3f35a3d6904f57"} err="failed to get container status \"8d7e466af7add0ec2a7f42f27b591c61fbe1ab586139abb2dc3f35a3d6904f57\": rpc error: 
code = NotFound desc = could not find container \"8d7e466af7add0ec2a7f42f27b591c61fbe1ab586139abb2dc3f35a3d6904f57\": container with ID starting with 8d7e466af7add0ec2a7f42f27b591c61fbe1ab586139abb2dc3f35a3d6904f57 not found: ID does not exist"
Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.806453 4962 scope.go:117] "RemoveContainer" containerID="cb892e4cf68977e199a8ede645fb1658a56a41006d664a84b9c8b34cefb35894"
Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.810765 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb892e4cf68977e199a8ede645fb1658a56a41006d664a84b9c8b34cefb35894"} err="failed to get container status \"cb892e4cf68977e199a8ede645fb1658a56a41006d664a84b9c8b34cefb35894\": rpc error: code = NotFound desc = could not find container \"cb892e4cf68977e199a8ede645fb1658a56a41006d664a84b9c8b34cefb35894\": container with ID starting with cb892e4cf68977e199a8ede645fb1658a56a41006d664a84b9c8b34cefb35894 not found: ID does not exist"
Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.835055 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzrnq\" (UniqueName: \"kubernetes.io/projected/dcbfa307-0a06-44c4-97f5-e25b6fdc50d5-kube-api-access-mzrnq\") on node \"crc\" DevicePath \"\""
Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.941817 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/870a8837-baa4-44c0-a740-32468cee8d28-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"870a8837-baa4-44c0-a740-32468cee8d28\") " pod="openstack/nova-metadata-0"
Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.942118 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/870a8837-baa4-44c0-a740-32468cee8d28-config-data\") pod \"nova-metadata-0\" (UID: \"870a8837-baa4-44c0-a740-32468cee8d28\") " pod="openstack/nova-metadata-0"
Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.942143 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsdp6\" (UniqueName: \"kubernetes.io/projected/870a8837-baa4-44c0-a740-32468cee8d28-kube-api-access-rsdp6\") pod \"nova-metadata-0\" (UID: \"870a8837-baa4-44c0-a740-32468cee8d28\") " pod="openstack/nova-metadata-0"
Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.942190 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/870a8837-baa4-44c0-a740-32468cee8d28-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"870a8837-baa4-44c0-a740-32468cee8d28\") " pod="openstack/nova-metadata-0"
Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.942221 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/870a8837-baa4-44c0-a740-32468cee8d28-logs\") pod \"nova-metadata-0\" (UID: \"870a8837-baa4-44c0-a740-32468cee8d28\") " pod="openstack/nova-metadata-0"
Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.983058 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 03 13:15:00 crc kubenswrapper[4962]: I1003 13:15:00.995007 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 03 13:15:01 crc kubenswrapper[4962]: I1003 13:15:01.017580 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 03 13:15:01 crc kubenswrapper[4962]: I1003 13:15:01.019516 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Oct 03 13:15:01 crc kubenswrapper[4962]: I1003 13:15:01.022534 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc"
Oct 03 13:15:01 crc kubenswrapper[4962]: I1003 13:15:01.022766 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config"
Oct 03 13:15:01 crc kubenswrapper[4962]: I1003 13:15:01.026952 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 03 13:15:01 crc kubenswrapper[4962]: I1003 13:15:01.044090 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/870a8837-baa4-44c0-a740-32468cee8d28-config-data\") pod \"nova-metadata-0\" (UID: \"870a8837-baa4-44c0-a740-32468cee8d28\") " pod="openstack/nova-metadata-0"
Oct 03 13:15:01 crc kubenswrapper[4962]: I1003 13:15:01.044128 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsdp6\" (UniqueName: \"kubernetes.io/projected/870a8837-baa4-44c0-a740-32468cee8d28-kube-api-access-rsdp6\") pod \"nova-metadata-0\" (UID: \"870a8837-baa4-44c0-a740-32468cee8d28\") " pod="openstack/nova-metadata-0"
Oct 03 13:15:01 crc kubenswrapper[4962]: I1003 13:15:01.044177 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/870a8837-baa4-44c0-a740-32468cee8d28-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"870a8837-baa4-44c0-a740-32468cee8d28\") " pod="openstack/nova-metadata-0"
Oct 03 13:15:01 crc kubenswrapper[4962]: I1003 13:15:01.044212 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/870a8837-baa4-44c0-a740-32468cee8d28-logs\") pod \"nova-metadata-0\" (UID: \"870a8837-baa4-44c0-a740-32468cee8d28\") " pod="openstack/nova-metadata-0"
Oct 03 13:15:01 crc kubenswrapper[4962]: I1003 13:15:01.044269 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/870a8837-baa4-44c0-a740-32468cee8d28-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"870a8837-baa4-44c0-a740-32468cee8d28\") " pod="openstack/nova-metadata-0"
Oct 03 13:15:01 crc kubenswrapper[4962]: I1003 13:15:01.048483 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/870a8837-baa4-44c0-a740-32468cee8d28-logs\") pod \"nova-metadata-0\" (UID: \"870a8837-baa4-44c0-a740-32468cee8d28\") " pod="openstack/nova-metadata-0"
Oct 03 13:15:01 crc kubenswrapper[4962]: I1003 13:15:01.062402 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/870a8837-baa4-44c0-a740-32468cee8d28-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"870a8837-baa4-44c0-a740-32468cee8d28\") " pod="openstack/nova-metadata-0"
Oct 03 13:15:01 crc kubenswrapper[4962]: I1003 13:15:01.068275 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsdp6\" (UniqueName: \"kubernetes.io/projected/870a8837-baa4-44c0-a740-32468cee8d28-kube-api-access-rsdp6\") pod \"nova-metadata-0\" (UID: \"870a8837-baa4-44c0-a740-32468cee8d28\") " pod="openstack/nova-metadata-0"
Oct 03 13:15:01 crc kubenswrapper[4962]: I1003 13:15:01.071572 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/870a8837-baa4-44c0-a740-32468cee8d28-config-data\") pod \"nova-metadata-0\" (UID: \"870a8837-baa4-44c0-a740-32468cee8d28\") " pod="openstack/nova-metadata-0"
Oct 03 13:15:01 crc kubenswrapper[4962]: I1003 13:15:01.075583 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/870a8837-baa4-44c0-a740-32468cee8d28-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"870a8837-baa4-44c0-a740-32468cee8d28\") " pod="openstack/nova-metadata-0"
Oct 03 13:15:01 crc kubenswrapper[4962]: I1003 13:15:01.076873 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324955-9lcjk"]
Oct 03 13:15:01 crc kubenswrapper[4962]: I1003 13:15:01.106379 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 03 13:15:01 crc kubenswrapper[4962]: I1003 13:15:01.147039 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/72df0792-9904-4b64-9c70-37cb982fe24b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"72df0792-9904-4b64-9c70-37cb982fe24b\") " pod="openstack/kube-state-metrics-0"
Oct 03 13:15:01 crc kubenswrapper[4962]: I1003 13:15:01.147106 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srwdq\" (UniqueName: \"kubernetes.io/projected/72df0792-9904-4b64-9c70-37cb982fe24b-kube-api-access-srwdq\") pod \"kube-state-metrics-0\" (UID: \"72df0792-9904-4b64-9c70-37cb982fe24b\") " pod="openstack/kube-state-metrics-0"
Oct 03 13:15:01 crc kubenswrapper[4962]: I1003 13:15:01.147135 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72df0792-9904-4b64-9c70-37cb982fe24b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"72df0792-9904-4b64-9c70-37cb982fe24b\") " pod="openstack/kube-state-metrics-0"
Oct 03 13:15:01 crc kubenswrapper[4962]: I1003 13:15:01.147197 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/72df0792-9904-4b64-9c70-37cb982fe24b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"72df0792-9904-4b64-9c70-37cb982fe24b\") " pod="openstack/kube-state-metrics-0"
Oct 03 13:15:01 crc kubenswrapper[4962]: I1003 13:15:01.248744 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/72df0792-9904-4b64-9c70-37cb982fe24b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"72df0792-9904-4b64-9c70-37cb982fe24b\") " pod="openstack/kube-state-metrics-0"
Oct 03 13:15:01 crc kubenswrapper[4962]: I1003 13:15:01.248792 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srwdq\" (UniqueName: \"kubernetes.io/projected/72df0792-9904-4b64-9c70-37cb982fe24b-kube-api-access-srwdq\") pod \"kube-state-metrics-0\" (UID: \"72df0792-9904-4b64-9c70-37cb982fe24b\") " pod="openstack/kube-state-metrics-0"
Oct 03 13:15:01 crc kubenswrapper[4962]: I1003 13:15:01.248822 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72df0792-9904-4b64-9c70-37cb982fe24b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"72df0792-9904-4b64-9c70-37cb982fe24b\") " pod="openstack/kube-state-metrics-0"
Oct 03 13:15:01 crc kubenswrapper[4962]: I1003 13:15:01.248880 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/72df0792-9904-4b64-9c70-37cb982fe24b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"72df0792-9904-4b64-9c70-37cb982fe24b\") " pod="openstack/kube-state-metrics-0"
Oct 03 13:15:01 crc kubenswrapper[4962]: I1003 13:15:01.254469 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72df0792-9904-4b64-9c70-37cb982fe24b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"72df0792-9904-4b64-9c70-37cb982fe24b\") " pod="openstack/kube-state-metrics-0"
Oct 03 13:15:01 crc kubenswrapper[4962]: I1003 13:15:01.255308 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/72df0792-9904-4b64-9c70-37cb982fe24b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"72df0792-9904-4b64-9c70-37cb982fe24b\") " pod="openstack/kube-state-metrics-0"
Oct 03 13:15:01 crc kubenswrapper[4962]: I1003 13:15:01.255463 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/72df0792-9904-4b64-9c70-37cb982fe24b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"72df0792-9904-4b64-9c70-37cb982fe24b\") " pod="openstack/kube-state-metrics-0"
Oct 03 13:15:01 crc kubenswrapper[4962]: I1003 13:15:01.285614 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srwdq\" (UniqueName: \"kubernetes.io/projected/72df0792-9904-4b64-9c70-37cb982fe24b-kube-api-access-srwdq\") pod \"kube-state-metrics-0\" (UID: \"72df0792-9904-4b64-9c70-37cb982fe24b\") " pod="openstack/kube-state-metrics-0"
Oct 03 13:15:01 crc kubenswrapper[4962]: I1003 13:15:01.347051 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Oct 03 13:15:01 crc kubenswrapper[4962]: I1003 13:15:01.627560 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 03 13:15:01 crc kubenswrapper[4962]: W1003 13:15:01.635218 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod870a8837_baa4_44c0_a740_32468cee8d28.slice/crio-565a8e6dbb4986224b8f1b15f699085bdcd0496de1cd19fb57eb5bed1dd78949 WatchSource:0}: Error finding container 565a8e6dbb4986224b8f1b15f699085bdcd0496de1cd19fb57eb5bed1dd78949: Status 404 returned error can't find the container with id 565a8e6dbb4986224b8f1b15f699085bdcd0496de1cd19fb57eb5bed1dd78949
Oct 03 13:15:01 crc kubenswrapper[4962]: I1003 13:15:01.681088 4962 generic.go:334] "Generic (PLEG): container finished" podID="a089db49-8a99-46c5-99ca-732f3aa1168d" containerID="910b846a7bb4ec67328350d581d76033e7a5b5f7605498fd753370a867f94313" exitCode=0
Oct 03 13:15:01 crc kubenswrapper[4962]: I1003 13:15:01.681531 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324955-9lcjk" event={"ID":"a089db49-8a99-46c5-99ca-732f3aa1168d","Type":"ContainerDied","Data":"910b846a7bb4ec67328350d581d76033e7a5b5f7605498fd753370a867f94313"}
Oct 03 13:15:01 crc kubenswrapper[4962]: I1003 13:15:01.681562 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324955-9lcjk" event={"ID":"a089db49-8a99-46c5-99ca-732f3aa1168d","Type":"ContainerStarted","Data":"3041aa2b01c9d697d73560ccb736bbc98ce81177bdde4c407d765984cce10500"}
Oct 03 13:15:01 crc kubenswrapper[4962]: I1003 13:15:01.686401 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"870a8837-baa4-44c0-a740-32468cee8d28","Type":"ContainerStarted","Data":"565a8e6dbb4986224b8f1b15f699085bdcd0496de1cd19fb57eb5bed1dd78949"}
Oct 03 13:15:01 crc kubenswrapper[4962]: I1003 13:15:01.854610 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 03 13:15:01 crc kubenswrapper[4962]: W1003 13:15:01.863818 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72df0792_9904_4b64_9c70_37cb982fe24b.slice/crio-9d94f0c4c9cc4d34a04158bb731745ee950f7d1d6523896dd0e51db3ffc6ff97 WatchSource:0}: Error finding container 9d94f0c4c9cc4d34a04158bb731745ee950f7d1d6523896dd0e51db3ffc6ff97: Status 404 returned error can't find the container with id 9d94f0c4c9cc4d34a04158bb731745ee950f7d1d6523896dd0e51db3ffc6ff97
Oct 03 13:15:02 crc kubenswrapper[4962]: I1003 13:15:02.092524 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 03 13:15:02 crc kubenswrapper[4962]: I1003 13:15:02.092880 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="907629fb-4025-4486-be7b-c511c22fc6c1" containerName="ceilometer-central-agent" containerID="cri-o://5a5902c05c8e657a20b775cb27fd17cc36f8f83f75daa55624bc15babc995776" gracePeriod=30
Oct 03 13:15:02 crc kubenswrapper[4962]: I1003 13:15:02.092935 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="907629fb-4025-4486-be7b-c511c22fc6c1" containerName="ceilometer-notification-agent" containerID="cri-o://3c88043411cf3aeda492736b07400572227f3a8837c21ef313ad45fdb1d831d3" gracePeriod=30
Oct 03 13:15:02 crc kubenswrapper[4962]: I1003 13:15:02.092973 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="907629fb-4025-4486-be7b-c511c22fc6c1" containerName="proxy-httpd" containerID="cri-o://60f2bcc003ebe0b2d772a16ef7e46868e468152e0a2969be9e595bcf65025779" gracePeriod=30
Oct 03 13:15:02 crc kubenswrapper[4962]: I1003 13:15:02.092939 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="907629fb-4025-4486-be7b-c511c22fc6c1" containerName="sg-core" containerID="cri-o://a46165c3bc28aa8609cfce450481aa24eee37022bd65ea99246753d66b764c40" gracePeriod=30
Oct 03 13:15:02 crc kubenswrapper[4962]: I1003 13:15:02.245248 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d" path="/var/lib/kubelet/pods/2efa1ef6-5df5-4f5c-8a9d-e32460f07a7d/volumes"
Oct 03 13:15:02 crc kubenswrapper[4962]: I1003 13:15:02.246072 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcbfa307-0a06-44c4-97f5-e25b6fdc50d5" path="/var/lib/kubelet/pods/dcbfa307-0a06-44c4-97f5-e25b6fdc50d5/volumes"
Oct 03 13:15:02 crc kubenswrapper[4962]: I1003 13:15:02.697266 4962 generic.go:334] "Generic (PLEG): container finished" podID="907629fb-4025-4486-be7b-c511c22fc6c1" containerID="60f2bcc003ebe0b2d772a16ef7e46868e468152e0a2969be9e595bcf65025779" exitCode=0
Oct 03 13:15:02 crc kubenswrapper[4962]: I1003 13:15:02.697504 4962 generic.go:334] "Generic (PLEG): container finished" podID="907629fb-4025-4486-be7b-c511c22fc6c1" containerID="a46165c3bc28aa8609cfce450481aa24eee37022bd65ea99246753d66b764c40" exitCode=2
Oct 03 13:15:02 crc kubenswrapper[4962]: I1003 13:15:02.697345 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"907629fb-4025-4486-be7b-c511c22fc6c1","Type":"ContainerDied","Data":"60f2bcc003ebe0b2d772a16ef7e46868e468152e0a2969be9e595bcf65025779"}
Oct 03 13:15:02 crc kubenswrapper[4962]: I1003 13:15:02.697566 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"907629fb-4025-4486-be7b-c511c22fc6c1","Type":"ContainerDied","Data":"a46165c3bc28aa8609cfce450481aa24eee37022bd65ea99246753d66b764c40"}
Oct 03 13:15:02 crc kubenswrapper[4962]: I1003 13:15:02.699517 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"72df0792-9904-4b64-9c70-37cb982fe24b","Type":"ContainerStarted","Data":"46f6912f8f25e900b01e11da327d2482ee9d18c20a5ec8e5af15a90619d45612"}
Oct 03 13:15:02 crc kubenswrapper[4962]: I1003 13:15:02.699541 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"72df0792-9904-4b64-9c70-37cb982fe24b","Type":"ContainerStarted","Data":"9d94f0c4c9cc4d34a04158bb731745ee950f7d1d6523896dd0e51db3ffc6ff97"}
Oct 03 13:15:02 crc kubenswrapper[4962]: I1003 13:15:02.699734 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Oct 03 13:15:02 crc kubenswrapper[4962]: I1003 13:15:02.701665 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"870a8837-baa4-44c0-a740-32468cee8d28","Type":"ContainerStarted","Data":"b1ff4e9661dfc352561a1c4f50b9451aa8310173ac9d8a6a8048800aab24a0bd"}
Oct 03 13:15:02 crc kubenswrapper[4962]: I1003 13:15:02.701705 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"870a8837-baa4-44c0-a740-32468cee8d28","Type":"ContainerStarted","Data":"cd97b10a939c3080e4854ca11db0af3e71ed9e191670fb384068096bf40f0d17"}
Oct 03 13:15:02 crc kubenswrapper[4962]: I1003 13:15:02.723201 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.306241277 podStartE2EDuration="2.723183196s" podCreationTimestamp="2025-10-03 13:15:00 +0000 UTC" firstStartedPulling="2025-10-03 13:15:01.866415974 +0000 UTC m=+1510.270313809" lastFinishedPulling="2025-10-03 13:15:02.283357893 +0000 UTC m=+1510.687255728" observedRunningTime="2025-10-03 13:15:02.715067068 +0000 UTC m=+1511.118964903" watchObservedRunningTime="2025-10-03 13:15:02.723183196 +0000 UTC m=+1511.127081031"
Oct 03 13:15:02 crc kubenswrapper[4962]: I1003 13:15:02.736076 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.736050432 podStartE2EDuration="2.736050432s" podCreationTimestamp="2025-10-03 13:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:15:02.733469593 +0000 UTC m=+1511.137367438" watchObservedRunningTime="2025-10-03 13:15:02.736050432 +0000 UTC m=+1511.139948267"
Oct 03 13:15:02 crc kubenswrapper[4962]: E1003 13:15:02.775943 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0f3e23b7882dfbb7d83de826edf2882cca09869217f1b3a126da17c0af86ecef" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Oct 03 13:15:02 crc kubenswrapper[4962]: E1003 13:15:02.777444 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0f3e23b7882dfbb7d83de826edf2882cca09869217f1b3a126da17c0af86ecef" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Oct 03 13:15:02 crc kubenswrapper[4962]: E1003 13:15:02.778930 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0f3e23b7882dfbb7d83de826edf2882cca09869217f1b3a126da17c0af86ecef" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Oct 03 13:15:02 crc kubenswrapper[4962]: E1003 13:15:02.778967 4962 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="8199ec71-3fba-4ae9-9b74-6e41edd368aa" containerName="nova-scheduler-scheduler"
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.309857 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324955-9lcjk"
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.398093 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wmnm\" (UniqueName: \"kubernetes.io/projected/a089db49-8a99-46c5-99ca-732f3aa1168d-kube-api-access-7wmnm\") pod \"a089db49-8a99-46c5-99ca-732f3aa1168d\" (UID: \"a089db49-8a99-46c5-99ca-732f3aa1168d\") "
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.398220 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a089db49-8a99-46c5-99ca-732f3aa1168d-secret-volume\") pod \"a089db49-8a99-46c5-99ca-732f3aa1168d\" (UID: \"a089db49-8a99-46c5-99ca-732f3aa1168d\") "
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.398295 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a089db49-8a99-46c5-99ca-732f3aa1168d-config-volume\") pod \"a089db49-8a99-46c5-99ca-732f3aa1168d\" (UID: \"a089db49-8a99-46c5-99ca-732f3aa1168d\") "
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.399418 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a089db49-8a99-46c5-99ca-732f3aa1168d-config-volume" (OuterVolumeSpecName: "config-volume") pod "a089db49-8a99-46c5-99ca-732f3aa1168d" (UID: "a089db49-8a99-46c5-99ca-732f3aa1168d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.403322 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a089db49-8a99-46c5-99ca-732f3aa1168d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a089db49-8a99-46c5-99ca-732f3aa1168d" (UID: "a089db49-8a99-46c5-99ca-732f3aa1168d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.404455 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a089db49-8a99-46c5-99ca-732f3aa1168d-kube-api-access-7wmnm" (OuterVolumeSpecName: "kube-api-access-7wmnm") pod "a089db49-8a99-46c5-99ca-732f3aa1168d" (UID: "a089db49-8a99-46c5-99ca-732f3aa1168d"). InnerVolumeSpecName "kube-api-access-7wmnm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.440124 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 03 13:15:03 crc kubenswrapper[4962]: E1003 13:15:03.473605 4962 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod907629fb_4025_4486_be7b_c511c22fc6c1.slice/crio-conmon-3c88043411cf3aeda492736b07400572227f3a8837c21ef313ad45fdb1d831d3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod907629fb_4025_4486_be7b_c511c22fc6c1.slice/crio-3c88043411cf3aeda492736b07400572227f3a8837c21ef313ad45fdb1d831d3.scope\": RecentStats: unable to find data in memory cache]"
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.500821 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wmnm\" (UniqueName: \"kubernetes.io/projected/a089db49-8a99-46c5-99ca-732f3aa1168d-kube-api-access-7wmnm\") on node \"crc\" DevicePath \"\""
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.500863 4962 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a089db49-8a99-46c5-99ca-732f3aa1168d-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.500873 4962 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a089db49-8a99-46c5-99ca-732f3aa1168d-config-volume\") on node \"crc\" DevicePath \"\""
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.568854 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.601910 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwtpl\" (UniqueName: \"kubernetes.io/projected/8199ec71-3fba-4ae9-9b74-6e41edd368aa-kube-api-access-rwtpl\") pod \"8199ec71-3fba-4ae9-9b74-6e41edd368aa\" (UID: \"8199ec71-3fba-4ae9-9b74-6e41edd368aa\") "
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.602028 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8199ec71-3fba-4ae9-9b74-6e41edd368aa-combined-ca-bundle\") pod \"8199ec71-3fba-4ae9-9b74-6e41edd368aa\" (UID: \"8199ec71-3fba-4ae9-9b74-6e41edd368aa\") "
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.602178 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8199ec71-3fba-4ae9-9b74-6e41edd368aa-config-data\") pod \"8199ec71-3fba-4ae9-9b74-6e41edd368aa\" (UID: \"8199ec71-3fba-4ae9-9b74-6e41edd368aa\") "
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.617541 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8199ec71-3fba-4ae9-9b74-6e41edd368aa-kube-api-access-rwtpl" (OuterVolumeSpecName: "kube-api-access-rwtpl") pod "8199ec71-3fba-4ae9-9b74-6e41edd368aa" (UID: "8199ec71-3fba-4ae9-9b74-6e41edd368aa"). InnerVolumeSpecName "kube-api-access-rwtpl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.633764 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8199ec71-3fba-4ae9-9b74-6e41edd368aa-config-data" (OuterVolumeSpecName: "config-data") pod "8199ec71-3fba-4ae9-9b74-6e41edd368aa" (UID: "8199ec71-3fba-4ae9-9b74-6e41edd368aa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.639808 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8199ec71-3fba-4ae9-9b74-6e41edd368aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8199ec71-3fba-4ae9-9b74-6e41edd368aa" (UID: "8199ec71-3fba-4ae9-9b74-6e41edd368aa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.703364 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/907629fb-4025-4486-be7b-c511c22fc6c1-run-httpd\") pod \"907629fb-4025-4486-be7b-c511c22fc6c1\" (UID: \"907629fb-4025-4486-be7b-c511c22fc6c1\") "
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.703433 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/907629fb-4025-4486-be7b-c511c22fc6c1-log-httpd\") pod \"907629fb-4025-4486-be7b-c511c22fc6c1\" (UID: \"907629fb-4025-4486-be7b-c511c22fc6c1\") "
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.703468 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/907629fb-4025-4486-be7b-c511c22fc6c1-sg-core-conf-yaml\") pod \"907629fb-4025-4486-be7b-c511c22fc6c1\" (UID: \"907629fb-4025-4486-be7b-c511c22fc6c1\") "
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.703568 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/907629fb-4025-4486-be7b-c511c22fc6c1-config-data\") pod \"907629fb-4025-4486-be7b-c511c22fc6c1\" (UID: \"907629fb-4025-4486-be7b-c511c22fc6c1\") "
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.703623 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5lgf\" (UniqueName: \"kubernetes.io/projected/907629fb-4025-4486-be7b-c511c22fc6c1-kube-api-access-w5lgf\") pod \"907629fb-4025-4486-be7b-c511c22fc6c1\" (UID: \"907629fb-4025-4486-be7b-c511c22fc6c1\") "
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.703692 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/907629fb-4025-4486-be7b-c511c22fc6c1-scripts\") pod \"907629fb-4025-4486-be7b-c511c22fc6c1\" (UID: \"907629fb-4025-4486-be7b-c511c22fc6c1\") "
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.703744 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/907629fb-4025-4486-be7b-c511c22fc6c1-combined-ca-bundle\") pod \"907629fb-4025-4486-be7b-c511c22fc6c1\" (UID: \"907629fb-4025-4486-be7b-c511c22fc6c1\") "
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.703938 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/907629fb-4025-4486-be7b-c511c22fc6c1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "907629fb-4025-4486-be7b-c511c22fc6c1" (UID: "907629fb-4025-4486-be7b-c511c22fc6c1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.704071 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/907629fb-4025-4486-be7b-c511c22fc6c1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "907629fb-4025-4486-be7b-c511c22fc6c1" (UID: "907629fb-4025-4486-be7b-c511c22fc6c1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.704594 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8199ec71-3fba-4ae9-9b74-6e41edd368aa-config-data\") on node \"crc\" DevicePath \"\""
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.704613 4962 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/907629fb-4025-4486-be7b-c511c22fc6c1-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.704624 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwtpl\" (UniqueName: \"kubernetes.io/projected/8199ec71-3fba-4ae9-9b74-6e41edd368aa-kube-api-access-rwtpl\") on node \"crc\" DevicePath \"\""
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.704653 4962 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/907629fb-4025-4486-be7b-c511c22fc6c1-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.704663 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8199ec71-3fba-4ae9-9b74-6e41edd368aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.708437 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/907629fb-4025-4486-be7b-c511c22fc6c1-scripts" (OuterVolumeSpecName: "scripts") pod "907629fb-4025-4486-be7b-c511c22fc6c1" (UID: "907629fb-4025-4486-be7b-c511c22fc6c1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.716772 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/907629fb-4025-4486-be7b-c511c22fc6c1-kube-api-access-w5lgf" (OuterVolumeSpecName: "kube-api-access-w5lgf") pod "907629fb-4025-4486-be7b-c511c22fc6c1" (UID: "907629fb-4025-4486-be7b-c511c22fc6c1"). InnerVolumeSpecName "kube-api-access-w5lgf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.724662 4962 generic.go:334] "Generic (PLEG): container finished" podID="907629fb-4025-4486-be7b-c511c22fc6c1" containerID="3c88043411cf3aeda492736b07400572227f3a8837c21ef313ad45fdb1d831d3" exitCode=0
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.724704 4962 generic.go:334] "Generic (PLEG): container finished" podID="907629fb-4025-4486-be7b-c511c22fc6c1" containerID="5a5902c05c8e657a20b775cb27fd17cc36f8f83f75daa55624bc15babc995776" exitCode=0
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.724730 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.724726 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"907629fb-4025-4486-be7b-c511c22fc6c1","Type":"ContainerDied","Data":"3c88043411cf3aeda492736b07400572227f3a8837c21ef313ad45fdb1d831d3"}
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.724865 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"907629fb-4025-4486-be7b-c511c22fc6c1","Type":"ContainerDied","Data":"5a5902c05c8e657a20b775cb27fd17cc36f8f83f75daa55624bc15babc995776"}
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.724887 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"907629fb-4025-4486-be7b-c511c22fc6c1","Type":"ContainerDied","Data":"13fbdbe7f154d3138a3d07bf1a1c7880686ab6ec24311f10e66c82f6424f4ee2"}
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.724909 4962 scope.go:117] "RemoveContainer" containerID="60f2bcc003ebe0b2d772a16ef7e46868e468152e0a2969be9e595bcf65025779"
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.726257 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324955-9lcjk"
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.726270 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324955-9lcjk" event={"ID":"a089db49-8a99-46c5-99ca-732f3aa1168d","Type":"ContainerDied","Data":"3041aa2b01c9d697d73560ccb736bbc98ce81177bdde4c407d765984cce10500"}
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.726778 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3041aa2b01c9d697d73560ccb736bbc98ce81177bdde4c407d765984cce10500"
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.733792 4962 generic.go:334] "Generic (PLEG): container finished" podID="8199ec71-3fba-4ae9-9b74-6e41edd368aa" containerID="0f3e23b7882dfbb7d83de826edf2882cca09869217f1b3a126da17c0af86ecef" exitCode=0
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.733871 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.733891 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8199ec71-3fba-4ae9-9b74-6e41edd368aa","Type":"ContainerDied","Data":"0f3e23b7882dfbb7d83de826edf2882cca09869217f1b3a126da17c0af86ecef"}
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.734326 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8199ec71-3fba-4ae9-9b74-6e41edd368aa","Type":"ContainerDied","Data":"d68a03ee6dd9708dd198083e44f6720a95fd141deabd0bd6b3aef796f6834b59"}
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.747034 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/907629fb-4025-4486-be7b-c511c22fc6c1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "907629fb-4025-4486-be7b-c511c22fc6c1" (UID: "907629fb-4025-4486-be7b-c511c22fc6c1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.751181 4962 scope.go:117] "RemoveContainer" containerID="a46165c3bc28aa8609cfce450481aa24eee37022bd65ea99246753d66b764c40"
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.780873 4962 scope.go:117] "RemoveContainer" containerID="3c88043411cf3aeda492736b07400572227f3a8837c21ef313ad45fdb1d831d3"
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.787072 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.800626 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.806668 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5lgf\" (UniqueName: \"kubernetes.io/projected/907629fb-4025-4486-be7b-c511c22fc6c1-kube-api-access-w5lgf\") on node \"crc\" DevicePath \"\""
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.806696 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/907629fb-4025-4486-be7b-c511c22fc6c1-scripts\") on node \"crc\" DevicePath \"\""
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.806707 4962 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/907629fb-4025-4486-be7b-c511c22fc6c1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.811042 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/907629fb-4025-4486-be7b-c511c22fc6c1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "907629fb-4025-4486-be7b-c511c22fc6c1" (UID: "907629fb-4025-4486-be7b-c511c22fc6c1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.814156 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Oct 03 13:15:03 crc kubenswrapper[4962]: E1003 13:15:03.814577 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="907629fb-4025-4486-be7b-c511c22fc6c1" containerName="proxy-httpd"
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.814594 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="907629fb-4025-4486-be7b-c511c22fc6c1" containerName="proxy-httpd"
Oct 03 13:15:03 crc kubenswrapper[4962]: E1003 13:15:03.814616 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="907629fb-4025-4486-be7b-c511c22fc6c1" containerName="sg-core"
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.814623 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="907629fb-4025-4486-be7b-c511c22fc6c1" containerName="sg-core"
Oct 03 13:15:03 crc kubenswrapper[4962]: E1003 13:15:03.814653 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a089db49-8a99-46c5-99ca-732f3aa1168d" containerName="collect-profiles"
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.814661 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a089db49-8a99-46c5-99ca-732f3aa1168d" containerName="collect-profiles"
Oct 03 13:15:03 crc kubenswrapper[4962]: E1003 13:15:03.814685 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8199ec71-3fba-4ae9-9b74-6e41edd368aa" containerName="nova-scheduler-scheduler"
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.814692 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8199ec71-3fba-4ae9-9b74-6e41edd368aa" containerName="nova-scheduler-scheduler"
Oct 03 13:15:03 crc kubenswrapper[4962]: E1003 13:15:03.814711 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="907629fb-4025-4486-be7b-c511c22fc6c1" containerName="ceilometer-central-agent"
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.814718 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="907629fb-4025-4486-be7b-c511c22fc6c1" containerName="ceilometer-central-agent"
Oct 03 13:15:03 crc kubenswrapper[4962]: E1003 13:15:03.814732 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="907629fb-4025-4486-be7b-c511c22fc6c1" containerName="ceilometer-notification-agent"
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.814739 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="907629fb-4025-4486-be7b-c511c22fc6c1" containerName="ceilometer-notification-agent"
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.814855 4962 scope.go:117] "RemoveContainer" containerID="5a5902c05c8e657a20b775cb27fd17cc36f8f83f75daa55624bc15babc995776"
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.814919 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="907629fb-4025-4486-be7b-c511c22fc6c1" containerName="ceilometer-central-agent"
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.814929 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="907629fb-4025-4486-be7b-c511c22fc6c1" containerName="proxy-httpd"
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.814937 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="8199ec71-3fba-4ae9-9b74-6e41edd368aa" containerName="nova-scheduler-scheduler"
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.814954 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="907629fb-4025-4486-be7b-c511c22fc6c1" containerName="ceilometer-notification-agent"
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.814962 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a089db49-8a99-46c5-99ca-732f3aa1168d" containerName="collect-profiles"
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.814970 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="907629fb-4025-4486-be7b-c511c22fc6c1" containerName="sg-core"
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.815554 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.817372 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.825939 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.858215 4962 scope.go:117] "RemoveContainer" containerID="60f2bcc003ebe0b2d772a16ef7e46868e468152e0a2969be9e595bcf65025779"
Oct 03 13:15:03 crc kubenswrapper[4962]: E1003 13:15:03.859330 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60f2bcc003ebe0b2d772a16ef7e46868e468152e0a2969be9e595bcf65025779\": container with ID starting with 60f2bcc003ebe0b2d772a16ef7e46868e468152e0a2969be9e595bcf65025779 not found: ID does not exist" containerID="60f2bcc003ebe0b2d772a16ef7e46868e468152e0a2969be9e595bcf65025779"
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.859385 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60f2bcc003ebe0b2d772a16ef7e46868e468152e0a2969be9e595bcf65025779"} err="failed to get container status \"60f2bcc003ebe0b2d772a16ef7e46868e468152e0a2969be9e595bcf65025779\": rpc error: code = NotFound desc = could not find container \"60f2bcc003ebe0b2d772a16ef7e46868e468152e0a2969be9e595bcf65025779\": container with ID starting with 60f2bcc003ebe0b2d772a16ef7e46868e468152e0a2969be9e595bcf65025779 not found: ID does not exist"
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.859418 4962 scope.go:117] "RemoveContainer" containerID="a46165c3bc28aa8609cfce450481aa24eee37022bd65ea99246753d66b764c40"
Oct 03 13:15:03 crc kubenswrapper[4962]: E1003 13:15:03.859890 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a46165c3bc28aa8609cfce450481aa24eee37022bd65ea99246753d66b764c40\": container with ID starting with a46165c3bc28aa8609cfce450481aa24eee37022bd65ea99246753d66b764c40 not found: ID does not exist" containerID="a46165c3bc28aa8609cfce450481aa24eee37022bd65ea99246753d66b764c40"
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.859918 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a46165c3bc28aa8609cfce450481aa24eee37022bd65ea99246753d66b764c40"} err="failed to get container status \"a46165c3bc28aa8609cfce450481aa24eee37022bd65ea99246753d66b764c40\": rpc error: code = NotFound desc = could not find container \"a46165c3bc28aa8609cfce450481aa24eee37022bd65ea99246753d66b764c40\": container with ID starting with a46165c3bc28aa8609cfce450481aa24eee37022bd65ea99246753d66b764c40 not found: ID does not exist"
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.859937 4962 scope.go:117] "RemoveContainer" containerID="3c88043411cf3aeda492736b07400572227f3a8837c21ef313ad45fdb1d831d3"
Oct 03 13:15:03 crc kubenswrapper[4962]: E1003 13:15:03.860233 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c88043411cf3aeda492736b07400572227f3a8837c21ef313ad45fdb1d831d3\": container with ID starting with 3c88043411cf3aeda492736b07400572227f3a8837c21ef313ad45fdb1d831d3 not found: ID does not exist" containerID="3c88043411cf3aeda492736b07400572227f3a8837c21ef313ad45fdb1d831d3"
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.860322 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c88043411cf3aeda492736b07400572227f3a8837c21ef313ad45fdb1d831d3"} err="failed to get container status \"3c88043411cf3aeda492736b07400572227f3a8837c21ef313ad45fdb1d831d3\": rpc error: code = NotFound desc = could not find container \"3c88043411cf3aeda492736b07400572227f3a8837c21ef313ad45fdb1d831d3\": container with ID starting with 3c88043411cf3aeda492736b07400572227f3a8837c21ef313ad45fdb1d831d3 not found: ID does not exist"
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.860399 4962 scope.go:117] "RemoveContainer" containerID="5a5902c05c8e657a20b775cb27fd17cc36f8f83f75daa55624bc15babc995776"
Oct 03 13:15:03 crc kubenswrapper[4962]: E1003 13:15:03.860778 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a5902c05c8e657a20b775cb27fd17cc36f8f83f75daa55624bc15babc995776\": container with ID starting with 5a5902c05c8e657a20b775cb27fd17cc36f8f83f75daa55624bc15babc995776 not found: ID does not exist" containerID="5a5902c05c8e657a20b775cb27fd17cc36f8f83f75daa55624bc15babc995776"
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.860875 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a5902c05c8e657a20b775cb27fd17cc36f8f83f75daa55624bc15babc995776"} err="failed to get container status \"5a5902c05c8e657a20b775cb27fd17cc36f8f83f75daa55624bc15babc995776\": rpc error: code = NotFound desc = could not find container \"5a5902c05c8e657a20b775cb27fd17cc36f8f83f75daa55624bc15babc995776\": container with ID starting with 5a5902c05c8e657a20b775cb27fd17cc36f8f83f75daa55624bc15babc995776 not found: ID does not exist"
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.860947 4962 scope.go:117] "RemoveContainer" containerID="60f2bcc003ebe0b2d772a16ef7e46868e468152e0a2969be9e595bcf65025779"
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.861388 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60f2bcc003ebe0b2d772a16ef7e46868e468152e0a2969be9e595bcf65025779"} err="failed to get container status \"60f2bcc003ebe0b2d772a16ef7e46868e468152e0a2969be9e595bcf65025779\": rpc error: code = NotFound desc = could not find container \"60f2bcc003ebe0b2d772a16ef7e46868e468152e0a2969be9e595bcf65025779\": container with ID starting with 60f2bcc003ebe0b2d772a16ef7e46868e468152e0a2969be9e595bcf65025779 not found: ID does not exist"
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.861431 4962 scope.go:117] "RemoveContainer" containerID="a46165c3bc28aa8609cfce450481aa24eee37022bd65ea99246753d66b764c40"
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.861822 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a46165c3bc28aa8609cfce450481aa24eee37022bd65ea99246753d66b764c40"} err="failed to get container status \"a46165c3bc28aa8609cfce450481aa24eee37022bd65ea99246753d66b764c40\": rpc error: code = NotFound desc = could not find container \"a46165c3bc28aa8609cfce450481aa24eee37022bd65ea99246753d66b764c40\": container with ID starting with a46165c3bc28aa8609cfce450481aa24eee37022bd65ea99246753d66b764c40 not found: ID does not exist"
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.861844 4962 scope.go:117] "RemoveContainer" containerID="3c88043411cf3aeda492736b07400572227f3a8837c21ef313ad45fdb1d831d3"
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.862069 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c88043411cf3aeda492736b07400572227f3a8837c21ef313ad45fdb1d831d3"} err="failed to get container status \"3c88043411cf3aeda492736b07400572227f3a8837c21ef313ad45fdb1d831d3\": rpc error: code = NotFound desc = could not find container \"3c88043411cf3aeda492736b07400572227f3a8837c21ef313ad45fdb1d831d3\": container with ID starting with 3c88043411cf3aeda492736b07400572227f3a8837c21ef313ad45fdb1d831d3 not found: ID does not exist"
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.862090 4962 scope.go:117] "RemoveContainer" containerID="5a5902c05c8e657a20b775cb27fd17cc36f8f83f75daa55624bc15babc995776"
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.862281 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a5902c05c8e657a20b775cb27fd17cc36f8f83f75daa55624bc15babc995776"} err="failed to get container status \"5a5902c05c8e657a20b775cb27fd17cc36f8f83f75daa55624bc15babc995776\": rpc error: code = NotFound desc = could not find container \"5a5902c05c8e657a20b775cb27fd17cc36f8f83f75daa55624bc15babc995776\": container with ID starting with 5a5902c05c8e657a20b775cb27fd17cc36f8f83f75daa55624bc15babc995776 not found: ID does not exist"
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.862299 4962 scope.go:117] "RemoveContainer" containerID="0f3e23b7882dfbb7d83de826edf2882cca09869217f1b3a126da17c0af86ecef"
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.863902 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/907629fb-4025-4486-be7b-c511c22fc6c1-config-data" (OuterVolumeSpecName: "config-data") pod "907629fb-4025-4486-be7b-c511c22fc6c1" (UID: "907629fb-4025-4486-be7b-c511c22fc6c1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.883850 4962 scope.go:117] "RemoveContainer" containerID="0f3e23b7882dfbb7d83de826edf2882cca09869217f1b3a126da17c0af86ecef"
Oct 03 13:15:03 crc kubenswrapper[4962]: E1003 13:15:03.884342 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f3e23b7882dfbb7d83de826edf2882cca09869217f1b3a126da17c0af86ecef\": container with ID starting with 0f3e23b7882dfbb7d83de826edf2882cca09869217f1b3a126da17c0af86ecef not found: ID does not exist" containerID="0f3e23b7882dfbb7d83de826edf2882cca09869217f1b3a126da17c0af86ecef"
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.884377 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f3e23b7882dfbb7d83de826edf2882cca09869217f1b3a126da17c0af86ecef"} err="failed to get container status \"0f3e23b7882dfbb7d83de826edf2882cca09869217f1b3a126da17c0af86ecef\": rpc error: code = NotFound desc = could not find container \"0f3e23b7882dfbb7d83de826edf2882cca09869217f1b3a126da17c0af86ecef\": container with ID starting with 0f3e23b7882dfbb7d83de826edf2882cca09869217f1b3a126da17c0af86ecef not found: ID does not exist"
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.908404 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21112027-f328-4fb2-b35a-1d14ac85a5ca-config-data\") pod \"nova-scheduler-0\" (UID: \"21112027-f328-4fb2-b35a-1d14ac85a5ca\") " pod="openstack/nova-scheduler-0"
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.908470 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21112027-f328-4fb2-b35a-1d14ac85a5ca-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"21112027-f328-4fb2-b35a-1d14ac85a5ca\") " pod="openstack/nova-scheduler-0"
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.908562 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8cjf\" (UniqueName: \"kubernetes.io/projected/21112027-f328-4fb2-b35a-1d14ac85a5ca-kube-api-access-f8cjf\") pod \"nova-scheduler-0\" (UID: \"21112027-f328-4fb2-b35a-1d14ac85a5ca\") " pod="openstack/nova-scheduler-0"
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.908610 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/907629fb-4025-4486-be7b-c511c22fc6c1-config-data\") on node \"crc\" DevicePath \"\""
Oct 03 13:15:03 crc kubenswrapper[4962]: I1003 13:15:03.908621 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/907629fb-4025-4486-be7b-c511c22fc6c1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 13:15:04 crc kubenswrapper[4962]: I1003 13:15:04.009590 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21112027-f328-4fb2-b35a-1d14ac85a5ca-config-data\") pod \"nova-scheduler-0\" (UID: \"21112027-f328-4fb2-b35a-1d14ac85a5ca\") " pod="openstack/nova-scheduler-0"
Oct 03 13:15:04 crc kubenswrapper[4962]: I1003 13:15:04.009671 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21112027-f328-4fb2-b35a-1d14ac85a5ca-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"21112027-f328-4fb2-b35a-1d14ac85a5ca\") " pod="openstack/nova-scheduler-0"
Oct 03 13:15:04 crc kubenswrapper[4962]: I1003 13:15:04.009761 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8cjf\" (UniqueName: \"kubernetes.io/projected/21112027-f328-4fb2-b35a-1d14ac85a5ca-kube-api-access-f8cjf\") pod \"nova-scheduler-0\" (UID: \"21112027-f328-4fb2-b35a-1d14ac85a5ca\") " pod="openstack/nova-scheduler-0"
Oct 03 13:15:04 crc kubenswrapper[4962]: I1003 13:15:04.013872 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21112027-f328-4fb2-b35a-1d14ac85a5ca-config-data\") pod \"nova-scheduler-0\" (UID: \"21112027-f328-4fb2-b35a-1d14ac85a5ca\") " pod="openstack/nova-scheduler-0"
Oct 03 13:15:04 crc kubenswrapper[4962]: I1003 13:15:04.023472 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21112027-f328-4fb2-b35a-1d14ac85a5ca-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"21112027-f328-4fb2-b35a-1d14ac85a5ca\") " pod="openstack/nova-scheduler-0"
Oct 03 13:15:04 crc kubenswrapper[4962]: I1003 13:15:04.025667 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8cjf\" (UniqueName: \"kubernetes.io/projected/21112027-f328-4fb2-b35a-1d14ac85a5ca-kube-api-access-f8cjf\") pod \"nova-scheduler-0\" (UID: \"21112027-f328-4fb2-b35a-1d14ac85a5ca\") " pod="openstack/nova-scheduler-0"
Oct 03 13:15:04 crc kubenswrapper[4962]: I1003 13:15:04.049120 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Oct 03 13:15:04 crc kubenswrapper[4962]: I1003 13:15:04.121480 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 03 13:15:04 crc kubenswrapper[4962]: I1003 13:15:04.131859 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Oct 03 13:15:04 crc kubenswrapper[4962]: I1003 13:15:04.136885 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 03 13:15:04 crc kubenswrapper[4962]: I1003 13:15:04.142912 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Oct 03 13:15:04 crc kubenswrapper[4962]: I1003 13:15:04.145229 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 03 13:15:04 crc kubenswrapper[4962]: I1003 13:15:04.147782 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Oct 03 13:15:04 crc kubenswrapper[4962]: I1003 13:15:04.147938 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Oct 03 13:15:04 crc kubenswrapper[4962]: I1003 13:15:04.148068 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Oct 03 13:15:04 crc kubenswrapper[4962]: I1003 13:15:04.158390 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 03 13:15:04 crc kubenswrapper[4962]: I1003 13:15:04.213022 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzbt6\" (UniqueName: \"kubernetes.io/projected/c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e-kube-api-access-vzbt6\") pod \"ceilometer-0\" (UID: \"c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e\") " pod="openstack/ceilometer-0"
Oct 03 13:15:04 crc kubenswrapper[4962]: I1003 13:15:04.213104 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e-scripts\") pod \"ceilometer-0\" (UID: \"c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e\") " pod="openstack/ceilometer-0"
Oct 03 13:15:04 crc kubenswrapper[4962]: I1003 13:15:04.213324 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e-config-data\") pod \"ceilometer-0\" (UID: \"c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e\") " pod="openstack/ceilometer-0"
Oct 03 13:15:04 crc kubenswrapper[4962]: I1003 13:15:04.213379 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e\") " pod="openstack/ceilometer-0"
Oct 03 13:15:04 crc kubenswrapper[4962]: I1003 13:15:04.213409 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e-run-httpd\") pod \"ceilometer-0\" (UID: \"c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e\") " pod="openstack/ceilometer-0"
Oct 03 13:15:04 crc kubenswrapper[4962]: I1003 13:15:04.213428 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e\") " pod="openstack/ceilometer-0"
Oct 03 13:15:04 crc kubenswrapper[4962]: I1003 13:15:04.213487 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e-log-httpd\") pod \"ceilometer-0\" (UID: \"c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e\") " pod="openstack/ceilometer-0"
Oct 03 13:15:04 crc kubenswrapper[4962]: I1003 13:15:04.213711 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e\") " pod="openstack/ceilometer-0"
Oct 03 13:15:04 crc kubenswrapper[4962]: I1003 13:15:04.238693 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8199ec71-3fba-4ae9-9b74-6e41edd368aa" path="/var/lib/kubelet/pods/8199ec71-3fba-4ae9-9b74-6e41edd368aa/volumes"
Oct 03 13:15:04 crc kubenswrapper[4962]: I1003 13:15:04.239472 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="907629fb-4025-4486-be7b-c511c22fc6c1" path="/var/lib/kubelet/pods/907629fb-4025-4486-be7b-c511c22fc6c1/volumes"
Oct 03 13:15:04 crc kubenswrapper[4962]: I1003 13:15:04.315755 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e-config-data\") pod \"ceilometer-0\" (UID: \"c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e\") " pod="openstack/ceilometer-0"
Oct 03 13:15:04 crc kubenswrapper[4962]: I1003 13:15:04.315978 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e\") " pod="openstack/ceilometer-0"
Oct 03 13:15:04 crc kubenswrapper[4962]: I1003 13:15:04.315997 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e-run-httpd\") pod \"ceilometer-0\" (UID: \"c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e\") " pod="openstack/ceilometer-0"
Oct 03 13:15:04 crc kubenswrapper[4962]: I1003 13:15:04.316012 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e\") " pod="openstack/ceilometer-0"
Oct 03 13:15:04 crc kubenswrapper[4962]: I1003 13:15:04.316042 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e-log-httpd\") pod \"ceilometer-0\" (UID: \"c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e\") " pod="openstack/ceilometer-0"
Oct 03 13:15:04 crc kubenswrapper[4962]: I1003 13:15:04.316904 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e\") " pod="openstack/ceilometer-0"
Oct 03 13:15:04 crc kubenswrapper[4962]: I1003 13:15:04.316341 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e-log-httpd\") pod \"ceilometer-0\" (UID: \"c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e\") " pod="openstack/ceilometer-0"
Oct 03 13:15:04 crc kubenswrapper[4962]: I1003 13:15:04.316998 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzbt6\" (UniqueName: \"kubernetes.io/projected/c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e-kube-api-access-vzbt6\") pod \"ceilometer-0\" (UID: \"c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e\") " pod="openstack/ceilometer-0"
Oct 03 13:15:04 crc kubenswrapper[4962]: I1003 13:15:04.317100 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e-scripts\") pod \"ceilometer-0\" (UID: \"c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e\") " pod="openstack/ceilometer-0"
Oct 03 13:15:04 crc kubenswrapper[4962]: I1003 13:15:04.316330 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e-run-httpd\") pod \"ceilometer-0\" (UID: \"c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e\") " pod="openstack/ceilometer-0"
Oct 03 13:15:04 crc kubenswrapper[4962]: I1003 13:15:04.321269 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e\") " pod="openstack/ceilometer-0"
Oct 03 13:15:04 crc kubenswrapper[4962]: I1003 13:15:04.321488 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e-scripts\") pod \"ceilometer-0\" (UID: \"c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e\") " pod="openstack/ceilometer-0"
Oct 03 13:15:04 crc kubenswrapper[4962]: I1003 13:15:04.321612 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e\") " pod="openstack/ceilometer-0"
Oct 03 13:15:04 crc kubenswrapper[4962]: I1003 13:15:04.323184 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e-config-data\") pod \"ceilometer-0\" (UID: \"c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e\") " pod="openstack/ceilometer-0"
Oct 03 13:15:04 crc kubenswrapper[4962]: I1003 13:15:04.325113 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e\") " pod="openstack/ceilometer-0"
Oct 03 13:15:04 crc kubenswrapper[4962]: I1003 13:15:04.338820 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzbt6\" (UniqueName: \"kubernetes.io/projected/c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e-kube-api-access-vzbt6\") pod \"ceilometer-0\" (UID: \"c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e\") " pod="openstack/ceilometer-0"
Oct 03 13:15:04 crc kubenswrapper[4962]: I1003 13:15:04.565625 4962 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 13:15:04 crc kubenswrapper[4962]: I1003 13:15:04.600538 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 13:15:04 crc kubenswrapper[4962]: W1003 13:15:04.603645 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21112027_f328_4fb2_b35a_1d14ac85a5ca.slice/crio-784acff432d9d0ceea2eecb2fcde0508ab251f4cb5aeed569bc7f1f4de9d1a2a WatchSource:0}: Error finding container 784acff432d9d0ceea2eecb2fcde0508ab251f4cb5aeed569bc7f1f4de9d1a2a: Status 404 returned error can't find the container with id 784acff432d9d0ceea2eecb2fcde0508ab251f4cb5aeed569bc7f1f4de9d1a2a Oct 03 13:15:04 crc kubenswrapper[4962]: I1003 13:15:04.753284 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"21112027-f328-4fb2-b35a-1d14ac85a5ca","Type":"ContainerStarted","Data":"784acff432d9d0ceea2eecb2fcde0508ab251f4cb5aeed569bc7f1f4de9d1a2a"} Oct 03 13:15:05 crc kubenswrapper[4962]: I1003 13:15:05.058451 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 13:15:05 crc kubenswrapper[4962]: W1003 13:15:05.061548 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8aab2bd_24a7_4d0e_9a40_d2f5aa30028e.slice/crio-7099cb3c5ce97811d143ef0576668b0fa1c144347635397f0a49c066eed22ed1 WatchSource:0}: Error finding container 7099cb3c5ce97811d143ef0576668b0fa1c144347635397f0a49c066eed22ed1: Status 404 returned error can't find the container with id 7099cb3c5ce97811d143ef0576668b0fa1c144347635397f0a49c066eed22ed1 Oct 03 13:15:05 crc kubenswrapper[4962]: I1003 13:15:05.777140 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"21112027-f328-4fb2-b35a-1d14ac85a5ca","Type":"ContainerStarted","Data":"a8a6f40dc6127a6a933cf6d7715114b36ef34c0faa873e7229276b216a7ed0b8"} Oct 03 13:15:05 crc kubenswrapper[4962]: I1003 13:15:05.779645 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e","Type":"ContainerStarted","Data":"7099cb3c5ce97811d143ef0576668b0fa1c144347635397f0a49c066eed22ed1"} Oct 03 13:15:05 crc kubenswrapper[4962]: I1003 13:15:05.801435 4962 generic.go:334] "Generic (PLEG): container finished" podID="0863fcf4-ac99-4685-a482-08e2a3409f4d" containerID="44155ad59a3bdf5bb854fbb4c0744bb3fd660fd870b7e075d9e28970f5dfa273" exitCode=0 Oct 03 13:15:05 crc kubenswrapper[4962]: I1003 13:15:05.801484 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0863fcf4-ac99-4685-a482-08e2a3409f4d","Type":"ContainerDied","Data":"44155ad59a3bdf5bb854fbb4c0744bb3fd660fd870b7e075d9e28970f5dfa273"} Oct 03 13:15:05 crc kubenswrapper[4962]: I1003 13:15:05.815482 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 13:15:05 crc kubenswrapper[4962]: I1003 13:15:05.835697 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.835657607 podStartE2EDuration="2.835657607s" podCreationTimestamp="2025-10-03 13:15:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:15:05.797004759 +0000 UTC m=+1514.200902594" watchObservedRunningTime="2025-10-03 13:15:05.835657607 +0000 UTC m=+1514.239555442" Oct 03 13:15:05 crc kubenswrapper[4962]: I1003 13:15:05.950473 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0863fcf4-ac99-4685-a482-08e2a3409f4d-config-data\") pod \"0863fcf4-ac99-4685-a482-08e2a3409f4d\" (UID: \"0863fcf4-ac99-4685-a482-08e2a3409f4d\") " Oct 03 13:15:05 crc kubenswrapper[4962]: I1003 13:15:05.950574 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8x7mg\" (UniqueName: \"kubernetes.io/projected/0863fcf4-ac99-4685-a482-08e2a3409f4d-kube-api-access-8x7mg\") pod \"0863fcf4-ac99-4685-a482-08e2a3409f4d\" (UID: \"0863fcf4-ac99-4685-a482-08e2a3409f4d\") " Oct 03 13:15:05 crc kubenswrapper[4962]: I1003 13:15:05.950615 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0863fcf4-ac99-4685-a482-08e2a3409f4d-combined-ca-bundle\") pod \"0863fcf4-ac99-4685-a482-08e2a3409f4d\" (UID: \"0863fcf4-ac99-4685-a482-08e2a3409f4d\") " Oct 03 13:15:05 crc kubenswrapper[4962]: I1003 13:15:05.950731 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0863fcf4-ac99-4685-a482-08e2a3409f4d-logs\") pod \"0863fcf4-ac99-4685-a482-08e2a3409f4d\" (UID: \"0863fcf4-ac99-4685-a482-08e2a3409f4d\") " Oct 03 13:15:05 crc kubenswrapper[4962]: I1003 13:15:05.951305 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0863fcf4-ac99-4685-a482-08e2a3409f4d-logs" (OuterVolumeSpecName: "logs") pod "0863fcf4-ac99-4685-a482-08e2a3409f4d" (UID: "0863fcf4-ac99-4685-a482-08e2a3409f4d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:15:05 crc kubenswrapper[4962]: I1003 13:15:05.958345 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0863fcf4-ac99-4685-a482-08e2a3409f4d-kube-api-access-8x7mg" (OuterVolumeSpecName: "kube-api-access-8x7mg") pod "0863fcf4-ac99-4685-a482-08e2a3409f4d" (UID: "0863fcf4-ac99-4685-a482-08e2a3409f4d"). InnerVolumeSpecName "kube-api-access-8x7mg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:15:05 crc kubenswrapper[4962]: I1003 13:15:05.980420 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0863fcf4-ac99-4685-a482-08e2a3409f4d-config-data" (OuterVolumeSpecName: "config-data") pod "0863fcf4-ac99-4685-a482-08e2a3409f4d" (UID: "0863fcf4-ac99-4685-a482-08e2a3409f4d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:15:05 crc kubenswrapper[4962]: I1003 13:15:05.986766 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0863fcf4-ac99-4685-a482-08e2a3409f4d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0863fcf4-ac99-4685-a482-08e2a3409f4d" (UID: "0863fcf4-ac99-4685-a482-08e2a3409f4d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:15:06 crc kubenswrapper[4962]: I1003 13:15:06.052658 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0863fcf4-ac99-4685-a482-08e2a3409f4d-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:15:06 crc kubenswrapper[4962]: I1003 13:15:06.052836 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8x7mg\" (UniqueName: \"kubernetes.io/projected/0863fcf4-ac99-4685-a482-08e2a3409f4d-kube-api-access-8x7mg\") on node \"crc\" DevicePath \"\"" Oct 03 13:15:06 crc kubenswrapper[4962]: I1003 13:15:06.052896 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0863fcf4-ac99-4685-a482-08e2a3409f4d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:15:06 crc kubenswrapper[4962]: I1003 13:15:06.052984 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0863fcf4-ac99-4685-a482-08e2a3409f4d-logs\") on node \"crc\" DevicePath \"\"" Oct 03 13:15:06 crc kubenswrapper[4962]: I1003 13:15:06.107121 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 13:15:06 crc kubenswrapper[4962]: I1003 13:15:06.107701 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 13:15:06 crc kubenswrapper[4962]: I1003 13:15:06.819752 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e","Type":"ContainerStarted","Data":"147de82c1ca65a941493e0333da530b847e2e106635a47db9755ade0e2617f3b"} Oct 03 13:15:06 crc kubenswrapper[4962]: I1003 13:15:06.820158 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e","Type":"ContainerStarted","Data":"b4dab9cbdec1c0b8605ca9ef4c5f06f491c80493fb9a89de8bbbe618d4cf4ab9"} Oct 03 13:15:06 crc kubenswrapper[4962]: I1003 13:15:06.823277 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0863fcf4-ac99-4685-a482-08e2a3409f4d","Type":"ContainerDied","Data":"ecceab8ccb4ab53fde48a94a89980965537c7e79234c7918613201b637413bc0"} Oct 03 13:15:06 crc kubenswrapper[4962]: I1003 13:15:06.823372 4962 scope.go:117] "RemoveContainer" containerID="44155ad59a3bdf5bb854fbb4c0744bb3fd660fd870b7e075d9e28970f5dfa273" Oct 03 13:15:06 crc kubenswrapper[4962]: I1003 13:15:06.823581 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 13:15:06 crc kubenswrapper[4962]: I1003 13:15:06.851082 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 13:15:06 crc kubenswrapper[4962]: I1003 13:15:06.854839 4962 scope.go:117] "RemoveContainer" containerID="4d07108b0f72ed3f3f47d8bf2c579a7794c61bf89739b389232682126f154d11" Oct 03 13:15:06 crc kubenswrapper[4962]: I1003 13:15:06.871671 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 03 13:15:06 crc kubenswrapper[4962]: I1003 13:15:06.881754 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 03 13:15:06 crc kubenswrapper[4962]: E1003 13:15:06.882291 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0863fcf4-ac99-4685-a482-08e2a3409f4d" containerName="nova-api-api" Oct 03 13:15:06 crc kubenswrapper[4962]: I1003 13:15:06.882314 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="0863fcf4-ac99-4685-a482-08e2a3409f4d" containerName="nova-api-api" Oct 03 13:15:06 crc kubenswrapper[4962]: E1003 13:15:06.882362 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0863fcf4-ac99-4685-a482-08e2a3409f4d" containerName="nova-api-log" Oct 03 13:15:06 crc kubenswrapper[4962]: I1003 13:15:06.882371 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="0863fcf4-ac99-4685-a482-08e2a3409f4d" containerName="nova-api-log" Oct 03 13:15:06 crc kubenswrapper[4962]: I1003 13:15:06.882908 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="0863fcf4-ac99-4685-a482-08e2a3409f4d" containerName="nova-api-log" Oct 03 13:15:06 crc kubenswrapper[4962]: I1003 13:15:06.882999 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="0863fcf4-ac99-4685-a482-08e2a3409f4d" containerName="nova-api-api" Oct 03 13:15:06 crc kubenswrapper[4962]: I1003 13:15:06.884764 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 13:15:06 crc kubenswrapper[4962]: I1003 13:15:06.888974 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 03 13:15:06 crc kubenswrapper[4962]: I1003 13:15:06.911384 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 13:15:06 crc kubenswrapper[4962]: I1003 13:15:06.970344 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpc9k\" (UniqueName: \"kubernetes.io/projected/d699135f-ef5e-43df-8f1d-37bddf62896a-kube-api-access-bpc9k\") pod \"nova-api-0\" (UID: \"d699135f-ef5e-43df-8f1d-37bddf62896a\") " pod="openstack/nova-api-0" Oct 03 13:15:06 crc kubenswrapper[4962]: I1003 13:15:06.970469 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d699135f-ef5e-43df-8f1d-37bddf62896a-config-data\") pod \"nova-api-0\" (UID: \"d699135f-ef5e-43df-8f1d-37bddf62896a\") " pod="openstack/nova-api-0" Oct 03 13:15:06 crc kubenswrapper[4962]: I1003 13:15:06.970526 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d699135f-ef5e-43df-8f1d-37bddf62896a-logs\") pod \"nova-api-0\" (UID: \"d699135f-ef5e-43df-8f1d-37bddf62896a\") " pod="openstack/nova-api-0" Oct 03 13:15:06 crc kubenswrapper[4962]: I1003 13:15:06.970754 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d699135f-ef5e-43df-8f1d-37bddf62896a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d699135f-ef5e-43df-8f1d-37bddf62896a\") " pod="openstack/nova-api-0" Oct 03 13:15:07 crc kubenswrapper[4962]: I1003 13:15:07.073071 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpc9k\" (UniqueName: \"kubernetes.io/projected/d699135f-ef5e-43df-8f1d-37bddf62896a-kube-api-access-bpc9k\") pod \"nova-api-0\" (UID: \"d699135f-ef5e-43df-8f1d-37bddf62896a\") " pod="openstack/nova-api-0" Oct 03 13:15:07 crc kubenswrapper[4962]: I1003 13:15:07.073233 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d699135f-ef5e-43df-8f1d-37bddf62896a-config-data\") pod \"nova-api-0\" (UID: \"d699135f-ef5e-43df-8f1d-37bddf62896a\") " pod="openstack/nova-api-0" Oct 03 13:15:07 crc kubenswrapper[4962]: I1003 13:15:07.073295 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d699135f-ef5e-43df-8f1d-37bddf62896a-logs\") pod \"nova-api-0\" (UID: \"d699135f-ef5e-43df-8f1d-37bddf62896a\") " pod="openstack/nova-api-0" Oct 03 13:15:07 crc kubenswrapper[4962]: I1003 13:15:07.073432 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d699135f-ef5e-43df-8f1d-37bddf62896a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d699135f-ef5e-43df-8f1d-37bddf62896a\") " pod="openstack/nova-api-0" Oct 03 13:15:07 crc kubenswrapper[4962]: I1003 13:15:07.074085 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d699135f-ef5e-43df-8f1d-37bddf62896a-logs\") pod \"nova-api-0\" (UID: \"d699135f-ef5e-43df-8f1d-37bddf62896a\") " 
pod="openstack/nova-api-0" Oct 03 13:15:07 crc kubenswrapper[4962]: I1003 13:15:07.077141 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d699135f-ef5e-43df-8f1d-37bddf62896a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d699135f-ef5e-43df-8f1d-37bddf62896a\") " pod="openstack/nova-api-0" Oct 03 13:15:07 crc kubenswrapper[4962]: I1003 13:15:07.078898 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d699135f-ef5e-43df-8f1d-37bddf62896a-config-data\") pod \"nova-api-0\" (UID: \"d699135f-ef5e-43df-8f1d-37bddf62896a\") " pod="openstack/nova-api-0" Oct 03 13:15:07 crc kubenswrapper[4962]: I1003 13:15:07.090903 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpc9k\" (UniqueName: \"kubernetes.io/projected/d699135f-ef5e-43df-8f1d-37bddf62896a-kube-api-access-bpc9k\") pod \"nova-api-0\" (UID: \"d699135f-ef5e-43df-8f1d-37bddf62896a\") " pod="openstack/nova-api-0" Oct 03 13:15:07 crc kubenswrapper[4962]: I1003 13:15:07.251866 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 13:15:07 crc kubenswrapper[4962]: W1003 13:15:07.765358 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd699135f_ef5e_43df_8f1d_37bddf62896a.slice/crio-b81bb1b8c0c35d39a601737a0766b1f5b6b995072d9fb390481b834f06c1d40f WatchSource:0}: Error finding container b81bb1b8c0c35d39a601737a0766b1f5b6b995072d9fb390481b834f06c1d40f: Status 404 returned error can't find the container with id b81bb1b8c0c35d39a601737a0766b1f5b6b995072d9fb390481b834f06c1d40f Oct 03 13:15:07 crc kubenswrapper[4962]: I1003 13:15:07.772494 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 13:15:07 crc kubenswrapper[4962]: I1003 13:15:07.833245 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d699135f-ef5e-43df-8f1d-37bddf62896a","Type":"ContainerStarted","Data":"b81bb1b8c0c35d39a601737a0766b1f5b6b995072d9fb390481b834f06c1d40f"} Oct 03 13:15:07 crc kubenswrapper[4962]: I1003 13:15:07.841791 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e","Type":"ContainerStarted","Data":"ddc9e981305f02b70b561994038a524d37b88d4986eb17152248c822f5abd63d"} Oct 03 13:15:08 crc kubenswrapper[4962]: I1003 13:15:08.227168 4962 scope.go:117] "RemoveContainer" containerID="a5bec38aec5fb9d5911cd2ecc57792eefe10d43cef3b74d0517cf307d832bc5c" Oct 03 13:15:08 crc kubenswrapper[4962]: E1003 13:15:08.227442 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:15:08 crc kubenswrapper[4962]: I1003 13:15:08.240295 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0863fcf4-ac99-4685-a482-08e2a3409f4d" path="/var/lib/kubelet/pods/0863fcf4-ac99-4685-a482-08e2a3409f4d/volumes" Oct 03 13:15:08 crc kubenswrapper[4962]: I1003 13:15:08.855187 4962 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/nova-api-0" event={"ID":"d699135f-ef5e-43df-8f1d-37bddf62896a","Type":"ContainerStarted","Data":"31034190c6f0fb85ffd2001001c9583edc1bf8c3c4320ef4261ab18054dd3002"} Oct 03 13:15:08 crc kubenswrapper[4962]: I1003 13:15:08.855629 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d699135f-ef5e-43df-8f1d-37bddf62896a","Type":"ContainerStarted","Data":"441d9ce86b858a09b092c5d8f26aa60be9c2f67831378e82058a1d6d70c9afca"} Oct 03 13:15:08 crc kubenswrapper[4962]: I1003 13:15:08.871190 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.871166341 podStartE2EDuration="2.871166341s" podCreationTimestamp="2025-10-03 13:15:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:15:08.869452225 +0000 UTC m=+1517.273350070" watchObservedRunningTime="2025-10-03 13:15:08.871166341 +0000 UTC m=+1517.275064206" Oct 03 13:15:09 crc kubenswrapper[4962]: I1003 13:15:09.138261 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 03 13:15:09 crc kubenswrapper[4962]: I1003 13:15:09.864902 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e","Type":"ContainerStarted","Data":"eb474aae149f0a4df3bc97c37daa95ac9552c67b926e36d4c2011796fef523c2"} Oct 03 13:15:10 crc kubenswrapper[4962]: I1003 13:15:10.876033 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 13:15:11 crc kubenswrapper[4962]: I1003 13:15:11.106961 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 03 13:15:11 crc kubenswrapper[4962]: I1003 13:15:11.107019 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 03 13:15:11 crc kubenswrapper[4962]: I1003 13:15:11.360859 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 03 13:15:11 crc kubenswrapper[4962]: I1003 13:15:11.379913 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.153087912 podStartE2EDuration="7.379894645s" podCreationTimestamp="2025-10-03 13:15:04 +0000 UTC" firstStartedPulling="2025-10-03 13:15:05.063702112 +0000 UTC m=+1513.467599947" lastFinishedPulling="2025-10-03 13:15:09.290508845 +0000 UTC m=+1517.694406680" observedRunningTime="2025-10-03 13:15:09.888081525 +0000 UTC m=+1518.291979370" watchObservedRunningTime="2025-10-03 13:15:11.379894645 +0000 UTC m=+1519.783792480" Oct 03 13:15:12 crc kubenswrapper[4962]: I1003 13:15:12.118842 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="870a8837-baa4-44c0-a740-32468cee8d28" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.189:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 13:15:12 crc kubenswrapper[4962]: I1003 13:15:12.118848 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="870a8837-baa4-44c0-a740-32468cee8d28" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.189:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 13:15:14 
crc kubenswrapper[4962]: I1003 13:15:14.138330 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 03 13:15:14 crc kubenswrapper[4962]: I1003 13:15:14.179042 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 03 13:15:14 crc kubenswrapper[4962]: I1003 13:15:14.966551 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 03 13:15:17 crc kubenswrapper[4962]: I1003 13:15:17.252867 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 13:15:17 crc kubenswrapper[4962]: I1003 13:15:17.253373 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 13:15:18 crc kubenswrapper[4962]: I1003 13:15:18.339975 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d699135f-ef5e-43df-8f1d-37bddf62896a" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 13:15:18 crc kubenswrapper[4962]: I1003 13:15:18.339990 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d699135f-ef5e-43df-8f1d-37bddf62896a" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 13:15:20 crc kubenswrapper[4962]: I1003 13:15:20.227162 4962 scope.go:117] "RemoveContainer" containerID="a5bec38aec5fb9d5911cd2ecc57792eefe10d43cef3b74d0517cf307d832bc5c" Oct 03 13:15:20 crc kubenswrapper[4962]: E1003 13:15:20.227838 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:15:21 crc kubenswrapper[4962]: I1003 13:15:21.113839 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 03 13:15:21 crc kubenswrapper[4962]: I1003 13:15:21.118403 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 03 13:15:21 crc kubenswrapper[4962]: I1003 13:15:21.121561 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 03 13:15:21 crc kubenswrapper[4962]: I1003 13:15:21.982508 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 03 13:15:22 crc kubenswrapper[4962]: I1003 13:15:22.812538 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 13:15:22 crc kubenswrapper[4962]: I1003 13:15:22.871386 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2pmn\" (UniqueName: \"kubernetes.io/projected/6570f82b-7149-41ef-8bb6-aea99de50975-kube-api-access-g2pmn\") pod \"6570f82b-7149-41ef-8bb6-aea99de50975\" (UID: \"6570f82b-7149-41ef-8bb6-aea99de50975\") " Oct 03 13:15:22 crc kubenswrapper[4962]: I1003 13:15:22.871739 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6570f82b-7149-41ef-8bb6-aea99de50975-combined-ca-bundle\") pod \"6570f82b-7149-41ef-8bb6-aea99de50975\" (UID: \"6570f82b-7149-41ef-8bb6-aea99de50975\") " Oct 03 13:15:22 crc kubenswrapper[4962]: I1003 13:15:22.871948 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6570f82b-7149-41ef-8bb6-aea99de50975-config-data\") pod \"6570f82b-7149-41ef-8bb6-aea99de50975\" (UID: \"6570f82b-7149-41ef-8bb6-aea99de50975\") " Oct 03 13:15:22 crc kubenswrapper[4962]: I1003 13:15:22.877321 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6570f82b-7149-41ef-8bb6-aea99de50975-kube-api-access-g2pmn" (OuterVolumeSpecName: "kube-api-access-g2pmn") pod "6570f82b-7149-41ef-8bb6-aea99de50975" (UID: "6570f82b-7149-41ef-8bb6-aea99de50975"). InnerVolumeSpecName "kube-api-access-g2pmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:15:22 crc kubenswrapper[4962]: I1003 13:15:22.903957 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6570f82b-7149-41ef-8bb6-aea99de50975-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6570f82b-7149-41ef-8bb6-aea99de50975" (UID: "6570f82b-7149-41ef-8bb6-aea99de50975"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:15:22 crc kubenswrapper[4962]: I1003 13:15:22.908404 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6570f82b-7149-41ef-8bb6-aea99de50975-config-data" (OuterVolumeSpecName: "config-data") pod "6570f82b-7149-41ef-8bb6-aea99de50975" (UID: "6570f82b-7149-41ef-8bb6-aea99de50975"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:15:22 crc kubenswrapper[4962]: I1003 13:15:22.974434 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6570f82b-7149-41ef-8bb6-aea99de50975-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:15:22 crc kubenswrapper[4962]: I1003 13:15:22.974465 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6570f82b-7149-41ef-8bb6-aea99de50975-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:15:22 crc kubenswrapper[4962]: I1003 13:15:22.974478 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2pmn\" (UniqueName: \"kubernetes.io/projected/6570f82b-7149-41ef-8bb6-aea99de50975-kube-api-access-g2pmn\") on node \"crc\" DevicePath \"\"" Oct 03 13:15:22 crc kubenswrapper[4962]: I1003 13:15:22.987776 4962 generic.go:334] "Generic (PLEG): container finished" podID="6570f82b-7149-41ef-8bb6-aea99de50975" containerID="d5b438272fd9ea8e008747c0ff7a8b1b9b5b1b5f4fe43eb21b9d5a0e0c093018" exitCode=137 Oct 03 13:15:22 crc kubenswrapper[4962]: I1003 13:15:22.987875 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 13:15:22 crc kubenswrapper[4962]: I1003 13:15:22.987857 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6570f82b-7149-41ef-8bb6-aea99de50975","Type":"ContainerDied","Data":"d5b438272fd9ea8e008747c0ff7a8b1b9b5b1b5f4fe43eb21b9d5a0e0c093018"} Oct 03 13:15:22 crc kubenswrapper[4962]: I1003 13:15:22.987947 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6570f82b-7149-41ef-8bb6-aea99de50975","Type":"ContainerDied","Data":"87cf1dafe8eb7ef84ea4d306941fe1f2f211b222c414d94c9c71eace20337ab8"} Oct 03 13:15:22 crc kubenswrapper[4962]: I1003 13:15:22.987980 4962 scope.go:117] "RemoveContainer" containerID="d5b438272fd9ea8e008747c0ff7a8b1b9b5b1b5f4fe43eb21b9d5a0e0c093018" Oct 03 13:15:23 crc kubenswrapper[4962]: I1003 13:15:23.008927 4962 scope.go:117] "RemoveContainer" containerID="d5b438272fd9ea8e008747c0ff7a8b1b9b5b1b5f4fe43eb21b9d5a0e0c093018" Oct 03 13:15:23 crc kubenswrapper[4962]: E1003 13:15:23.009254 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5b438272fd9ea8e008747c0ff7a8b1b9b5b1b5f4fe43eb21b9d5a0e0c093018\": container with ID starting with d5b438272fd9ea8e008747c0ff7a8b1b9b5b1b5f4fe43eb21b9d5a0e0c093018 not found: ID does not exist" containerID="d5b438272fd9ea8e008747c0ff7a8b1b9b5b1b5f4fe43eb21b9d5a0e0c093018" Oct 03 13:15:23 crc kubenswrapper[4962]: I1003 13:15:23.009304 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5b438272fd9ea8e008747c0ff7a8b1b9b5b1b5f4fe43eb21b9d5a0e0c093018"} err="failed to get container status \"d5b438272fd9ea8e008747c0ff7a8b1b9b5b1b5f4fe43eb21b9d5a0e0c093018\": rpc error: code = NotFound desc = could not find container \"d5b438272fd9ea8e008747c0ff7a8b1b9b5b1b5f4fe43eb21b9d5a0e0c093018\": container with ID starting with d5b438272fd9ea8e008747c0ff7a8b1b9b5b1b5f4fe43eb21b9d5a0e0c093018 not found: ID does not exist" Oct 03 13:15:23 crc kubenswrapper[4962]: I1003 13:15:23.028294 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 13:15:23 crc kubenswrapper[4962]: I1003 
13:15:23.044708 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 13:15:23 crc kubenswrapper[4962]: I1003 13:15:23.060808 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 13:15:23 crc kubenswrapper[4962]: E1003 13:15:23.061391 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6570f82b-7149-41ef-8bb6-aea99de50975" containerName="nova-cell1-novncproxy-novncproxy" Oct 03 13:15:23 crc kubenswrapper[4962]: I1003 13:15:23.061408 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="6570f82b-7149-41ef-8bb6-aea99de50975" containerName="nova-cell1-novncproxy-novncproxy" Oct 03 13:15:23 crc kubenswrapper[4962]: I1003 13:15:23.061750 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="6570f82b-7149-41ef-8bb6-aea99de50975" containerName="nova-cell1-novncproxy-novncproxy" Oct 03 13:15:23 crc kubenswrapper[4962]: I1003 13:15:23.062522 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 13:15:23 crc kubenswrapper[4962]: I1003 13:15:23.065180 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 03 13:15:23 crc kubenswrapper[4962]: I1003 13:15:23.065333 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 03 13:15:23 crc kubenswrapper[4962]: I1003 13:15:23.065485 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 03 13:15:23 crc kubenswrapper[4962]: I1003 13:15:23.068685 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 13:15:23 crc kubenswrapper[4962]: I1003 13:15:23.178779 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85710c21-98fe-4148-8ef1-ec9f4e9ef311-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"85710c21-98fe-4148-8ef1-ec9f4e9ef311\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 13:15:23 crc kubenswrapper[4962]: I1003 13:15:23.178898 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56wrb\" (UniqueName: \"kubernetes.io/projected/85710c21-98fe-4148-8ef1-ec9f4e9ef311-kube-api-access-56wrb\") pod \"nova-cell1-novncproxy-0\" (UID: \"85710c21-98fe-4148-8ef1-ec9f4e9ef311\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 13:15:23 crc kubenswrapper[4962]: I1003 13:15:23.178921 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/85710c21-98fe-4148-8ef1-ec9f4e9ef311-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"85710c21-98fe-4148-8ef1-ec9f4e9ef311\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 13:15:23 crc kubenswrapper[4962]: I1003 13:15:23.178946 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/85710c21-98fe-4148-8ef1-ec9f4e9ef311-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"85710c21-98fe-4148-8ef1-ec9f4e9ef311\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 13:15:23 crc kubenswrapper[4962]: I1003 13:15:23.178991 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85710c21-98fe-4148-8ef1-ec9f4e9ef311-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"85710c21-98fe-4148-8ef1-ec9f4e9ef311\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 13:15:23 crc kubenswrapper[4962]: I1003 13:15:23.280923 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85710c21-98fe-4148-8ef1-ec9f4e9ef311-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"85710c21-98fe-4148-8ef1-ec9f4e9ef311\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 13:15:23 crc kubenswrapper[4962]: I1003 13:15:23.281111 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56wrb\" (UniqueName: \"kubernetes.io/projected/85710c21-98fe-4148-8ef1-ec9f4e9ef311-kube-api-access-56wrb\") pod \"nova-cell1-novncproxy-0\" (UID: \"85710c21-98fe-4148-8ef1-ec9f4e9ef311\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 13:15:23 crc kubenswrapper[4962]: I1003 13:15:23.281147 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/85710c21-98fe-4148-8ef1-ec9f4e9ef311-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"85710c21-98fe-4148-8ef1-ec9f4e9ef311\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 13:15:23 crc kubenswrapper[4962]: I1003 13:15:23.281181 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/85710c21-98fe-4148-8ef1-ec9f4e9ef311-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"85710c21-98fe-4148-8ef1-ec9f4e9ef311\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 13:15:23 crc kubenswrapper[4962]: I1003 13:15:23.281274 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85710c21-98fe-4148-8ef1-ec9f4e9ef311-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"85710c21-98fe-4148-8ef1-ec9f4e9ef311\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 13:15:23 crc kubenswrapper[4962]: I1003 13:15:23.284908 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/85710c21-98fe-4148-8ef1-ec9f4e9ef311-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"85710c21-98fe-4148-8ef1-ec9f4e9ef311\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 13:15:23 crc kubenswrapper[4962]: I1003 13:15:23.285130 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85710c21-98fe-4148-8ef1-ec9f4e9ef311-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"85710c21-98fe-4148-8ef1-ec9f4e9ef311\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 13:15:23 crc kubenswrapper[4962]: I1003 13:15:23.286879 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/85710c21-98fe-4148-8ef1-ec9f4e9ef311-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"85710c21-98fe-4148-8ef1-ec9f4e9ef311\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 13:15:23 crc kubenswrapper[4962]: I1003 13:15:23.287119 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/85710c21-98fe-4148-8ef1-ec9f4e9ef311-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"85710c21-98fe-4148-8ef1-ec9f4e9ef311\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 13:15:23 crc kubenswrapper[4962]: I1003 13:15:23.298896 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56wrb\" (UniqueName: \"kubernetes.io/projected/85710c21-98fe-4148-8ef1-ec9f4e9ef311-kube-api-access-56wrb\") pod \"nova-cell1-novncproxy-0\" (UID: \"85710c21-98fe-4148-8ef1-ec9f4e9ef311\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 13:15:23 crc kubenswrapper[4962]: I1003 13:15:23.386017 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 13:15:23 crc kubenswrapper[4962]: I1003 13:15:23.801853 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 13:15:23 crc kubenswrapper[4962]: W1003 13:15:23.806099 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85710c21_98fe_4148_8ef1_ec9f4e9ef311.slice/crio-806dd82677fa3c9d4b0fca14d063230ffd0dd47696756580d168858f6efe8e8c WatchSource:0}: Error finding container 806dd82677fa3c9d4b0fca14d063230ffd0dd47696756580d168858f6efe8e8c: Status 404 returned error can't find the container with id 806dd82677fa3c9d4b0fca14d063230ffd0dd47696756580d168858f6efe8e8c Oct 03 13:15:23 crc kubenswrapper[4962]: I1003 13:15:23.999419 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"85710c21-98fe-4148-8ef1-ec9f4e9ef311","Type":"ContainerStarted","Data":"806dd82677fa3c9d4b0fca14d063230ffd0dd47696756580d168858f6efe8e8c"} Oct 03 13:15:24 crc kubenswrapper[4962]: I1003 13:15:24.237912 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6570f82b-7149-41ef-8bb6-aea99de50975" path="/var/lib/kubelet/pods/6570f82b-7149-41ef-8bb6-aea99de50975/volumes" Oct 03 13:15:25 crc kubenswrapper[4962]: I1003 13:15:25.017318 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"85710c21-98fe-4148-8ef1-ec9f4e9ef311","Type":"ContainerStarted","Data":"cce61d9b927002bd5d5d741af0d9a03b88958f9b494160698ba0a870465f6ee7"} Oct 03 13:15:27 crc kubenswrapper[4962]: I1003 13:15:27.255919 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 03 13:15:27 crc kubenswrapper[4962]: I1003 13:15:27.256392 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 03 13:15:27 crc kubenswrapper[4962]: I1003 13:15:27.256707 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 03 13:15:27 crc kubenswrapper[4962]: I1003 13:15:27.256733 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 03 13:15:27 crc kubenswrapper[4962]: I1003 13:15:27.260628 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 03 13:15:27 crc kubenswrapper[4962]: I1003 13:15:27.263236 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 03 13:15:27 crc kubenswrapper[4962]: I1003 13:15:27.282725 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=4.282706464 podStartE2EDuration="4.282706464s" 
podCreationTimestamp="2025-10-03 13:15:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:15:25.038883766 +0000 UTC m=+1533.442781601" watchObservedRunningTime="2025-10-03 13:15:27.282706464 +0000 UTC m=+1535.686604299" Oct 03 13:15:27 crc kubenswrapper[4962]: I1003 13:15:27.496040 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-94dnt"] Oct 03 13:15:27 crc kubenswrapper[4962]: I1003 13:15:27.497625 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-94dnt" Oct 03 13:15:27 crc kubenswrapper[4962]: I1003 13:15:27.531312 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-94dnt"] Oct 03 13:15:27 crc kubenswrapper[4962]: I1003 13:15:27.562975 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eb190059-74a6-4ffe-88a4-5fcfd46812a0-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-94dnt\" (UID: \"eb190059-74a6-4ffe-88a4-5fcfd46812a0\") " pod="openstack/dnsmasq-dns-59cf4bdb65-94dnt" Oct 03 13:15:27 crc kubenswrapper[4962]: I1003 13:15:27.563078 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb9b8\" (UniqueName: \"kubernetes.io/projected/eb190059-74a6-4ffe-88a4-5fcfd46812a0-kube-api-access-qb9b8\") pod \"dnsmasq-dns-59cf4bdb65-94dnt\" (UID: \"eb190059-74a6-4ffe-88a4-5fcfd46812a0\") " pod="openstack/dnsmasq-dns-59cf4bdb65-94dnt" Oct 03 13:15:27 crc kubenswrapper[4962]: I1003 13:15:27.563276 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb190059-74a6-4ffe-88a4-5fcfd46812a0-config\") pod \"dnsmasq-dns-59cf4bdb65-94dnt\" (UID: \"eb190059-74a6-4ffe-88a4-5fcfd46812a0\") " pod="openstack/dnsmasq-dns-59cf4bdb65-94dnt" Oct 03 13:15:27 crc kubenswrapper[4962]: I1003 13:15:27.563382 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb190059-74a6-4ffe-88a4-5fcfd46812a0-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-94dnt\" (UID: \"eb190059-74a6-4ffe-88a4-5fcfd46812a0\") " pod="openstack/dnsmasq-dns-59cf4bdb65-94dnt" Oct 03 13:15:27 crc kubenswrapper[4962]: I1003 13:15:27.563466 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb190059-74a6-4ffe-88a4-5fcfd46812a0-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-94dnt\" (UID: \"eb190059-74a6-4ffe-88a4-5fcfd46812a0\") " pod="openstack/dnsmasq-dns-59cf4bdb65-94dnt" Oct 03 13:15:27 crc kubenswrapper[4962]: I1003 13:15:27.563527 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb190059-74a6-4ffe-88a4-5fcfd46812a0-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-94dnt\" (UID: \"eb190059-74a6-4ffe-88a4-5fcfd46812a0\") " pod="openstack/dnsmasq-dns-59cf4bdb65-94dnt" Oct 03 13:15:27 crc kubenswrapper[4962]: I1003 13:15:27.665475 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eb190059-74a6-4ffe-88a4-5fcfd46812a0-dns-swift-storage-0\") pod 
\"dnsmasq-dns-59cf4bdb65-94dnt\" (UID: \"eb190059-74a6-4ffe-88a4-5fcfd46812a0\") " pod="openstack/dnsmasq-dns-59cf4bdb65-94dnt" Oct 03 13:15:27 crc kubenswrapper[4962]: I1003 13:15:27.666553 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eb190059-74a6-4ffe-88a4-5fcfd46812a0-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-94dnt\" (UID: \"eb190059-74a6-4ffe-88a4-5fcfd46812a0\") " pod="openstack/dnsmasq-dns-59cf4bdb65-94dnt" Oct 03 13:15:27 crc kubenswrapper[4962]: I1003 13:15:27.666654 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qb9b8\" (UniqueName: \"kubernetes.io/projected/eb190059-74a6-4ffe-88a4-5fcfd46812a0-kube-api-access-qb9b8\") pod \"dnsmasq-dns-59cf4bdb65-94dnt\" (UID: \"eb190059-74a6-4ffe-88a4-5fcfd46812a0\") " pod="openstack/dnsmasq-dns-59cf4bdb65-94dnt" Oct 03 13:15:27 crc kubenswrapper[4962]: I1003 13:15:27.666784 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb190059-74a6-4ffe-88a4-5fcfd46812a0-config\") pod \"dnsmasq-dns-59cf4bdb65-94dnt\" (UID: \"eb190059-74a6-4ffe-88a4-5fcfd46812a0\") " pod="openstack/dnsmasq-dns-59cf4bdb65-94dnt" Oct 03 13:15:27 crc kubenswrapper[4962]: I1003 13:15:27.666857 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb190059-74a6-4ffe-88a4-5fcfd46812a0-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-94dnt\" (UID: \"eb190059-74a6-4ffe-88a4-5fcfd46812a0\") " pod="openstack/dnsmasq-dns-59cf4bdb65-94dnt" Oct 03 13:15:27 crc kubenswrapper[4962]: I1003 13:15:27.666911 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb190059-74a6-4ffe-88a4-5fcfd46812a0-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-94dnt\" (UID: \"eb190059-74a6-4ffe-88a4-5fcfd46812a0\") " pod="openstack/dnsmasq-dns-59cf4bdb65-94dnt" Oct 03 13:15:27 crc kubenswrapper[4962]: I1003 13:15:27.666952 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb190059-74a6-4ffe-88a4-5fcfd46812a0-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-94dnt\" (UID: \"eb190059-74a6-4ffe-88a4-5fcfd46812a0\") " pod="openstack/dnsmasq-dns-59cf4bdb65-94dnt" Oct 03 13:15:27 crc kubenswrapper[4962]: I1003 13:15:27.667822 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb190059-74a6-4ffe-88a4-5fcfd46812a0-config\") pod \"dnsmasq-dns-59cf4bdb65-94dnt\" (UID: \"eb190059-74a6-4ffe-88a4-5fcfd46812a0\") " pod="openstack/dnsmasq-dns-59cf4bdb65-94dnt" Oct 03 13:15:27 crc kubenswrapper[4962]: I1003 13:15:27.667892 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb190059-74a6-4ffe-88a4-5fcfd46812a0-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-94dnt\" (UID: \"eb190059-74a6-4ffe-88a4-5fcfd46812a0\") " pod="openstack/dnsmasq-dns-59cf4bdb65-94dnt" Oct 03 13:15:27 crc kubenswrapper[4962]: I1003 13:15:27.668055 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb190059-74a6-4ffe-88a4-5fcfd46812a0-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-94dnt\" (UID: \"eb190059-74a6-4ffe-88a4-5fcfd46812a0\") " 
pod="openstack/dnsmasq-dns-59cf4bdb65-94dnt" Oct 03 13:15:27 crc kubenswrapper[4962]: I1003 13:15:27.668120 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb190059-74a6-4ffe-88a4-5fcfd46812a0-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-94dnt\" (UID: \"eb190059-74a6-4ffe-88a4-5fcfd46812a0\") " pod="openstack/dnsmasq-dns-59cf4bdb65-94dnt" Oct 03 13:15:27 crc kubenswrapper[4962]: I1003 13:15:27.687345 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb9b8\" (UniqueName: \"kubernetes.io/projected/eb190059-74a6-4ffe-88a4-5fcfd46812a0-kube-api-access-qb9b8\") pod \"dnsmasq-dns-59cf4bdb65-94dnt\" (UID: \"eb190059-74a6-4ffe-88a4-5fcfd46812a0\") " pod="openstack/dnsmasq-dns-59cf4bdb65-94dnt" Oct 03 13:15:27 crc kubenswrapper[4962]: I1003 13:15:27.815448 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-94dnt" Oct 03 13:15:28 crc kubenswrapper[4962]: I1003 13:15:28.307202 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-94dnt"] Oct 03 13:15:28 crc kubenswrapper[4962]: I1003 13:15:28.387136 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 03 13:15:29 crc kubenswrapper[4962]: I1003 13:15:29.067097 4962 generic.go:334] "Generic (PLEG): container finished" podID="eb190059-74a6-4ffe-88a4-5fcfd46812a0" containerID="0be4d75f8ba790fbc1d9607dd70ea29bd8aa5bfdf1d636b6a55c54afecb20827" exitCode=0 Oct 03 13:15:29 crc kubenswrapper[4962]: I1003 13:15:29.067679 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-94dnt" event={"ID":"eb190059-74a6-4ffe-88a4-5fcfd46812a0","Type":"ContainerDied","Data":"0be4d75f8ba790fbc1d9607dd70ea29bd8aa5bfdf1d636b6a55c54afecb20827"} Oct 03 13:15:29 crc kubenswrapper[4962]: I1003 13:15:29.067734 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-94dnt" event={"ID":"eb190059-74a6-4ffe-88a4-5fcfd46812a0","Type":"ContainerStarted","Data":"479934aec1053530e9faa0a7258b4f7a2080621861a3348a32af527c06535648"} Oct 03 13:15:29 crc kubenswrapper[4962]: I1003 13:15:29.295356 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 13:15:29 crc kubenswrapper[4962]: I1003 13:15:29.295646 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e" containerName="ceilometer-central-agent" containerID="cri-o://b4dab9cbdec1c0b8605ca9ef4c5f06f491c80493fb9a89de8bbbe618d4cf4ab9" gracePeriod=30 Oct 03 13:15:29 crc kubenswrapper[4962]: I1003 13:15:29.295774 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e" containerName="ceilometer-notification-agent" containerID="cri-o://147de82c1ca65a941493e0333da530b847e2e106635a47db9755ade0e2617f3b" gracePeriod=30 Oct 03 13:15:29 crc kubenswrapper[4962]: I1003 13:15:29.295786 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e" containerName="sg-core" containerID="cri-o://ddc9e981305f02b70b561994038a524d37b88d4986eb17152248c822f5abd63d" gracePeriod=30 Oct 03 13:15:29 crc kubenswrapper[4962]: I1003 13:15:29.295918 4962 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e" containerName="proxy-httpd" containerID="cri-o://eb474aae149f0a4df3bc97c37daa95ac9552c67b926e36d4c2011796fef523c2" gracePeriod=30 Oct 03 13:15:29 crc kubenswrapper[4962]: I1003 13:15:29.300540 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.192:3000/\": EOF" Oct 03 13:15:30 crc kubenswrapper[4962]: I1003 13:15:30.078484 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-94dnt" event={"ID":"eb190059-74a6-4ffe-88a4-5fcfd46812a0","Type":"ContainerStarted","Data":"ad57452006db6a8d5f23250d941709e3e1778f52c203a43209011704610ba216"} Oct 03 13:15:30 crc kubenswrapper[4962]: I1003 13:15:30.078931 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59cf4bdb65-94dnt" Oct 03 13:15:30 crc kubenswrapper[4962]: I1003 13:15:30.080769 4962 generic.go:334] "Generic (PLEG): container finished" podID="c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e" containerID="eb474aae149f0a4df3bc97c37daa95ac9552c67b926e36d4c2011796fef523c2" exitCode=0 Oct 03 13:15:30 crc kubenswrapper[4962]: I1003 13:15:30.080799 4962 generic.go:334] "Generic (PLEG): container finished" podID="c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e" containerID="ddc9e981305f02b70b561994038a524d37b88d4986eb17152248c822f5abd63d" exitCode=2 Oct 03 13:15:30 crc kubenswrapper[4962]: I1003 13:15:30.080810 4962 generic.go:334] "Generic (PLEG): container finished" podID="c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e" containerID="b4dab9cbdec1c0b8605ca9ef4c5f06f491c80493fb9a89de8bbbe618d4cf4ab9" exitCode=0 Oct 03 13:15:30 crc kubenswrapper[4962]: I1003 13:15:30.080831 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e","Type":"ContainerDied","Data":"eb474aae149f0a4df3bc97c37daa95ac9552c67b926e36d4c2011796fef523c2"} Oct 03 13:15:30 crc kubenswrapper[4962]: I1003 13:15:30.080853 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e","Type":"ContainerDied","Data":"ddc9e981305f02b70b561994038a524d37b88d4986eb17152248c822f5abd63d"} Oct 03 13:15:30 crc kubenswrapper[4962]: I1003 13:15:30.080867 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e","Type":"ContainerDied","Data":"b4dab9cbdec1c0b8605ca9ef4c5f06f491c80493fb9a89de8bbbe618d4cf4ab9"} Oct 03 13:15:30 crc kubenswrapper[4962]: I1003 13:15:30.094801 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59cf4bdb65-94dnt" podStartSLOduration=3.094783347 podStartE2EDuration="3.094783347s" podCreationTimestamp="2025-10-03 13:15:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:15:30.093937244 +0000 UTC m=+1538.497835079" watchObservedRunningTime="2025-10-03 13:15:30.094783347 +0000 UTC m=+1538.498681182" Oct 03 13:15:30 crc kubenswrapper[4962]: I1003 13:15:30.130205 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 13:15:30 crc kubenswrapper[4962]: I1003 13:15:30.130405 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" 
podUID="d699135f-ef5e-43df-8f1d-37bddf62896a" containerName="nova-api-log" containerID="cri-o://441d9ce86b858a09b092c5d8f26aa60be9c2f67831378e82058a1d6d70c9afca" gracePeriod=30 Oct 03 13:15:30 crc kubenswrapper[4962]: I1003 13:15:30.130479 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d699135f-ef5e-43df-8f1d-37bddf62896a" containerName="nova-api-api" containerID="cri-o://31034190c6f0fb85ffd2001001c9583edc1bf8c3c4320ef4261ab18054dd3002" gracePeriod=30 Oct 03 13:15:31 crc kubenswrapper[4962]: I1003 13:15:31.091134 4962 generic.go:334] "Generic (PLEG): container finished" podID="d699135f-ef5e-43df-8f1d-37bddf62896a" containerID="441d9ce86b858a09b092c5d8f26aa60be9c2f67831378e82058a1d6d70c9afca" exitCode=143 Oct 03 13:15:31 crc kubenswrapper[4962]: I1003 13:15:31.091193 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d699135f-ef5e-43df-8f1d-37bddf62896a","Type":"ContainerDied","Data":"441d9ce86b858a09b092c5d8f26aa60be9c2f67831378e82058a1d6d70c9afca"} Oct 03 13:15:32 crc kubenswrapper[4962]: I1003 13:15:32.233209 4962 scope.go:117] "RemoveContainer" containerID="a5bec38aec5fb9d5911cd2ecc57792eefe10d43cef3b74d0517cf307d832bc5c" Oct 03 13:15:32 crc kubenswrapper[4962]: E1003 13:15:32.233587 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:15:32 crc kubenswrapper[4962]: I1003 13:15:32.683390 4962 util.go:48] "No ready sandbox for pod can be found. 
Oct 03 13:15:32 crc kubenswrapper[4962]: I1003 13:15:32.762409 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzbt6\" (UniqueName: \"kubernetes.io/projected/c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e-kube-api-access-vzbt6\") pod \"c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e\" (UID: \"c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e\") "
Oct 03 13:15:32 crc kubenswrapper[4962]: I1003 13:15:32.762449 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e-ceilometer-tls-certs\") pod \"c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e\" (UID: \"c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e\") "
Oct 03 13:15:32 crc kubenswrapper[4962]: I1003 13:15:32.762544 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e-log-httpd\") pod \"c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e\" (UID: \"c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e\") "
Oct 03 13:15:32 crc kubenswrapper[4962]: I1003 13:15:32.762593 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e-scripts\") pod \"c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e\" (UID: \"c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e\") "
Oct 03 13:15:32 crc kubenswrapper[4962]: I1003 13:15:32.762682 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e-sg-core-conf-yaml\") pod \"c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e\" (UID: \"c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e\") "
Oct 03 13:15:32 crc kubenswrapper[4962]: I1003 13:15:32.762713 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e-combined-ca-bundle\") pod \"c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e\" (UID: \"c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e\") "
Oct 03 13:15:32 crc kubenswrapper[4962]: I1003 13:15:32.762764 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e-run-httpd\") pod \"c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e\" (UID: \"c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e\") "
Oct 03 13:15:32 crc kubenswrapper[4962]: I1003 13:15:32.762791 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e-config-data\") pod \"c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e\" (UID: \"c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e\") "
Oct 03 13:15:32 crc kubenswrapper[4962]: I1003 13:15:32.763954 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e" (UID: "c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 13:15:32 crc kubenswrapper[4962]: I1003 13:15:32.763978 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e" (UID: "c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 13:15:32 crc kubenswrapper[4962]: I1003 13:15:32.772699 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e-scripts" (OuterVolumeSpecName: "scripts") pod "c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e" (UID: "c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 13:15:32 crc kubenswrapper[4962]: I1003 13:15:32.772869 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e-kube-api-access-vzbt6" (OuterVolumeSpecName: "kube-api-access-vzbt6") pod "c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e" (UID: "c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e"). InnerVolumeSpecName "kube-api-access-vzbt6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 13:15:32 crc kubenswrapper[4962]: I1003 13:15:32.788723 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e" (UID: "c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 13:15:32 crc kubenswrapper[4962]: I1003 13:15:32.813154 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e" (UID: "c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 13:15:32 crc kubenswrapper[4962]: I1003 13:15:32.839311 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e" (UID: "c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 13:15:32 crc kubenswrapper[4962]: I1003 13:15:32.848764 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e-config-data" (OuterVolumeSpecName: "config-data") pod "c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e" (UID: "c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:15:32 crc kubenswrapper[4962]: I1003 13:15:32.864328 4962 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 13:15:32 crc kubenswrapper[4962]: I1003 13:15:32.864359 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 13:15:32 crc kubenswrapper[4962]: I1003 13:15:32.864369 4962 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 13:15:32 crc kubenswrapper[4962]: I1003 13:15:32.864380 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:15:32 crc kubenswrapper[4962]: I1003 13:15:32.864427 4962 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 13:15:32 crc kubenswrapper[4962]: I1003 13:15:32.864435 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:15:32 crc kubenswrapper[4962]: I1003 13:15:32.864445 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzbt6\" (UniqueName: \"kubernetes.io/projected/c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e-kube-api-access-vzbt6\") on node \"crc\" DevicePath \"\"" Oct 03 13:15:32 crc kubenswrapper[4962]: I1003 13:15:32.864454 4962 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.116978 4962 generic.go:334] "Generic (PLEG): container finished" podID="c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e" containerID="147de82c1ca65a941493e0333da530b847e2e106635a47db9755ade0e2617f3b" exitCode=0 Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.117023 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e","Type":"ContainerDied","Data":"147de82c1ca65a941493e0333da530b847e2e106635a47db9755ade0e2617f3b"} Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.117048 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e","Type":"ContainerDied","Data":"7099cb3c5ce97811d143ef0576668b0fa1c144347635397f0a49c066eed22ed1"} Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.117065 4962 scope.go:117] "RemoveContainer" containerID="eb474aae149f0a4df3bc97c37daa95ac9552c67b926e36d4c2011796fef523c2" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.117085 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.153334 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.158392 4962 scope.go:117] "RemoveContainer" containerID="ddc9e981305f02b70b561994038a524d37b88d4986eb17152248c822f5abd63d" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.167912 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.176531 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 13:15:33 crc kubenswrapper[4962]: E1003 13:15:33.177105 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e" containerName="ceilometer-notification-agent" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.177123 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e" containerName="ceilometer-notification-agent" Oct 03 13:15:33 crc kubenswrapper[4962]: E1003 13:15:33.177145 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e" containerName="proxy-httpd" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.177150 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e" containerName="proxy-httpd" Oct 03 13:15:33 crc kubenswrapper[4962]: E1003 13:15:33.177162 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e" containerName="ceilometer-central-agent" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.177169 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e" containerName="ceilometer-central-agent" Oct 03 13:15:33 crc kubenswrapper[4962]: E1003 13:15:33.177180 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e" containerName="sg-core" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.177185 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e" containerName="sg-core" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.177352 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e" containerName="sg-core" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.177368 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e" containerName="proxy-httpd" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.177380 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e" containerName="ceilometer-notification-agent" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.177393 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e" containerName="ceilometer-central-agent" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.179458 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.184433 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.184612 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.189037 4962 scope.go:117] "RemoveContainer" containerID="147de82c1ca65a941493e0333da530b847e2e106635a47db9755ade0e2617f3b" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.189227 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.198552 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.225508 4962 scope.go:117] "RemoveContainer" containerID="b4dab9cbdec1c0b8605ca9ef4c5f06f491c80493fb9a89de8bbbe618d4cf4ab9" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.246585 4962 scope.go:117] "RemoveContainer" containerID="eb474aae149f0a4df3bc97c37daa95ac9552c67b926e36d4c2011796fef523c2" Oct 03 13:15:33 crc kubenswrapper[4962]: E1003 13:15:33.247515 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb474aae149f0a4df3bc97c37daa95ac9552c67b926e36d4c2011796fef523c2\": container with ID starting with eb474aae149f0a4df3bc97c37daa95ac9552c67b926e36d4c2011796fef523c2 not found: ID does not exist" containerID="eb474aae149f0a4df3bc97c37daa95ac9552c67b926e36d4c2011796fef523c2" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.247554 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb474aae149f0a4df3bc97c37daa95ac9552c67b926e36d4c2011796fef523c2"} err="failed to get container status \"eb474aae149f0a4df3bc97c37daa95ac9552c67b926e36d4c2011796fef523c2\": rpc error: code = NotFound desc = could not find container \"eb474aae149f0a4df3bc97c37daa95ac9552c67b926e36d4c2011796fef523c2\": container with ID starting with eb474aae149f0a4df3bc97c37daa95ac9552c67b926e36d4c2011796fef523c2 not found: ID does not exist" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.247584 4962 scope.go:117] "RemoveContainer" containerID="ddc9e981305f02b70b561994038a524d37b88d4986eb17152248c822f5abd63d" Oct 03 13:15:33 crc kubenswrapper[4962]: E1003 13:15:33.249067 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddc9e981305f02b70b561994038a524d37b88d4986eb17152248c822f5abd63d\": container with ID starting with ddc9e981305f02b70b561994038a524d37b88d4986eb17152248c822f5abd63d not found: ID does not exist" containerID="ddc9e981305f02b70b561994038a524d37b88d4986eb17152248c822f5abd63d" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.249088 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddc9e981305f02b70b561994038a524d37b88d4986eb17152248c822f5abd63d"} err="failed to get container status \"ddc9e981305f02b70b561994038a524d37b88d4986eb17152248c822f5abd63d\": rpc error: code = NotFound desc = could not find container \"ddc9e981305f02b70b561994038a524d37b88d4986eb17152248c822f5abd63d\": container with ID starting with ddc9e981305f02b70b561994038a524d37b88d4986eb17152248c822f5abd63d not found: ID does not exist" Oct 03 13:15:33 
crc kubenswrapper[4962]: I1003 13:15:33.249102 4962 scope.go:117] "RemoveContainer" containerID="147de82c1ca65a941493e0333da530b847e2e106635a47db9755ade0e2617f3b" Oct 03 13:15:33 crc kubenswrapper[4962]: E1003 13:15:33.249283 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"147de82c1ca65a941493e0333da530b847e2e106635a47db9755ade0e2617f3b\": container with ID starting with 147de82c1ca65a941493e0333da530b847e2e106635a47db9755ade0e2617f3b not found: ID does not exist" containerID="147de82c1ca65a941493e0333da530b847e2e106635a47db9755ade0e2617f3b" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.249307 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"147de82c1ca65a941493e0333da530b847e2e106635a47db9755ade0e2617f3b"} err="failed to get container status \"147de82c1ca65a941493e0333da530b847e2e106635a47db9755ade0e2617f3b\": rpc error: code = NotFound desc = could not find container \"147de82c1ca65a941493e0333da530b847e2e106635a47db9755ade0e2617f3b\": container with ID starting with 147de82c1ca65a941493e0333da530b847e2e106635a47db9755ade0e2617f3b not found: ID does not exist" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.249323 4962 scope.go:117] "RemoveContainer" containerID="b4dab9cbdec1c0b8605ca9ef4c5f06f491c80493fb9a89de8bbbe618d4cf4ab9" Oct 03 13:15:33 crc kubenswrapper[4962]: E1003 13:15:33.249503 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4dab9cbdec1c0b8605ca9ef4c5f06f491c80493fb9a89de8bbbe618d4cf4ab9\": container with ID starting with b4dab9cbdec1c0b8605ca9ef4c5f06f491c80493fb9a89de8bbbe618d4cf4ab9 not found: ID does not exist" containerID="b4dab9cbdec1c0b8605ca9ef4c5f06f491c80493fb9a89de8bbbe618d4cf4ab9" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.249525 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4dab9cbdec1c0b8605ca9ef4c5f06f491c80493fb9a89de8bbbe618d4cf4ab9"} err="failed to get container status \"b4dab9cbdec1c0b8605ca9ef4c5f06f491c80493fb9a89de8bbbe618d4cf4ab9\": rpc error: code = NotFound desc = could not find container \"b4dab9cbdec1c0b8605ca9ef4c5f06f491c80493fb9a89de8bbbe618d4cf4ab9\": container with ID starting with b4dab9cbdec1c0b8605ca9ef4c5f06f491c80493fb9a89de8bbbe618d4cf4ab9 not found: ID does not exist" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.287537 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/15ad9c69-05d8-4b75-82cc-f23f6303d7d7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"15ad9c69-05d8-4b75-82cc-f23f6303d7d7\") " pod="openstack/ceilometer-0" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.287630 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15ad9c69-05d8-4b75-82cc-f23f6303d7d7-run-httpd\") pod \"ceilometer-0\" (UID: \"15ad9c69-05d8-4b75-82cc-f23f6303d7d7\") " pod="openstack/ceilometer-0" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.287664 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15ad9c69-05d8-4b75-82cc-f23f6303d7d7-log-httpd\") pod \"ceilometer-0\" (UID: \"15ad9c69-05d8-4b75-82cc-f23f6303d7d7\") " 
pod="openstack/ceilometer-0" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.287697 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15ad9c69-05d8-4b75-82cc-f23f6303d7d7-scripts\") pod \"ceilometer-0\" (UID: \"15ad9c69-05d8-4b75-82cc-f23f6303d7d7\") " pod="openstack/ceilometer-0" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.287797 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15ad9c69-05d8-4b75-82cc-f23f6303d7d7-config-data\") pod \"ceilometer-0\" (UID: \"15ad9c69-05d8-4b75-82cc-f23f6303d7d7\") " pod="openstack/ceilometer-0" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.287843 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/15ad9c69-05d8-4b75-82cc-f23f6303d7d7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"15ad9c69-05d8-4b75-82cc-f23f6303d7d7\") " pod="openstack/ceilometer-0" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.287876 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22hzv\" (UniqueName: \"kubernetes.io/projected/15ad9c69-05d8-4b75-82cc-f23f6303d7d7-kube-api-access-22hzv\") pod \"ceilometer-0\" (UID: \"15ad9c69-05d8-4b75-82cc-f23f6303d7d7\") " pod="openstack/ceilometer-0" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.287933 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15ad9c69-05d8-4b75-82cc-f23f6303d7d7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"15ad9c69-05d8-4b75-82cc-f23f6303d7d7\") " pod="openstack/ceilometer-0" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.387749 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.390397 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/15ad9c69-05d8-4b75-82cc-f23f6303d7d7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"15ad9c69-05d8-4b75-82cc-f23f6303d7d7\") " pod="openstack/ceilometer-0" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.390512 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22hzv\" (UniqueName: \"kubernetes.io/projected/15ad9c69-05d8-4b75-82cc-f23f6303d7d7-kube-api-access-22hzv\") pod \"ceilometer-0\" (UID: \"15ad9c69-05d8-4b75-82cc-f23f6303d7d7\") " pod="openstack/ceilometer-0" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.390578 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15ad9c69-05d8-4b75-82cc-f23f6303d7d7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"15ad9c69-05d8-4b75-82cc-f23f6303d7d7\") " pod="openstack/ceilometer-0" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.390609 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/15ad9c69-05d8-4b75-82cc-f23f6303d7d7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"15ad9c69-05d8-4b75-82cc-f23f6303d7d7\") " pod="openstack/ceilometer-0" Oct 03 
13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.390666 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15ad9c69-05d8-4b75-82cc-f23f6303d7d7-log-httpd\") pod \"ceilometer-0\" (UID: \"15ad9c69-05d8-4b75-82cc-f23f6303d7d7\") " pod="openstack/ceilometer-0" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.390685 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15ad9c69-05d8-4b75-82cc-f23f6303d7d7-run-httpd\") pod \"ceilometer-0\" (UID: \"15ad9c69-05d8-4b75-82cc-f23f6303d7d7\") " pod="openstack/ceilometer-0" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.390710 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15ad9c69-05d8-4b75-82cc-f23f6303d7d7-scripts\") pod \"ceilometer-0\" (UID: \"15ad9c69-05d8-4b75-82cc-f23f6303d7d7\") " pod="openstack/ceilometer-0" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.390761 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15ad9c69-05d8-4b75-82cc-f23f6303d7d7-config-data\") pod \"ceilometer-0\" (UID: \"15ad9c69-05d8-4b75-82cc-f23f6303d7d7\") " pod="openstack/ceilometer-0" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.391390 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15ad9c69-05d8-4b75-82cc-f23f6303d7d7-run-httpd\") pod \"ceilometer-0\" (UID: \"15ad9c69-05d8-4b75-82cc-f23f6303d7d7\") " pod="openstack/ceilometer-0" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.391595 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15ad9c69-05d8-4b75-82cc-f23f6303d7d7-log-httpd\") pod \"ceilometer-0\" (UID: \"15ad9c69-05d8-4b75-82cc-f23f6303d7d7\") " pod="openstack/ceilometer-0" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.395149 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/15ad9c69-05d8-4b75-82cc-f23f6303d7d7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"15ad9c69-05d8-4b75-82cc-f23f6303d7d7\") " pod="openstack/ceilometer-0" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.395282 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15ad9c69-05d8-4b75-82cc-f23f6303d7d7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"15ad9c69-05d8-4b75-82cc-f23f6303d7d7\") " pod="openstack/ceilometer-0" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.395379 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15ad9c69-05d8-4b75-82cc-f23f6303d7d7-scripts\") pod \"ceilometer-0\" (UID: \"15ad9c69-05d8-4b75-82cc-f23f6303d7d7\") " pod="openstack/ceilometer-0" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.395469 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/15ad9c69-05d8-4b75-82cc-f23f6303d7d7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"15ad9c69-05d8-4b75-82cc-f23f6303d7d7\") " pod="openstack/ceilometer-0" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.404346 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15ad9c69-05d8-4b75-82cc-f23f6303d7d7-config-data\") pod \"ceilometer-0\" (UID: \"15ad9c69-05d8-4b75-82cc-f23f6303d7d7\") " pod="openstack/ceilometer-0" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.407173 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22hzv\" (UniqueName: \"kubernetes.io/projected/15ad9c69-05d8-4b75-82cc-f23f6303d7d7-kube-api-access-22hzv\") pod \"ceilometer-0\" (UID: \"15ad9c69-05d8-4b75-82cc-f23f6303d7d7\") " pod="openstack/ceilometer-0" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.416582 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.505175 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.629058 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.699081 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpc9k\" (UniqueName: \"kubernetes.io/projected/d699135f-ef5e-43df-8f1d-37bddf62896a-kube-api-access-bpc9k\") pod \"d699135f-ef5e-43df-8f1d-37bddf62896a\" (UID: \"d699135f-ef5e-43df-8f1d-37bddf62896a\") " Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.699167 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d699135f-ef5e-43df-8f1d-37bddf62896a-combined-ca-bundle\") pod \"d699135f-ef5e-43df-8f1d-37bddf62896a\" (UID: \"d699135f-ef5e-43df-8f1d-37bddf62896a\") " Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.699204 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d699135f-ef5e-43df-8f1d-37bddf62896a-logs\") pod \"d699135f-ef5e-43df-8f1d-37bddf62896a\" (UID: \"d699135f-ef5e-43df-8f1d-37bddf62896a\") " Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.699234 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d699135f-ef5e-43df-8f1d-37bddf62896a-config-data\") pod \"d699135f-ef5e-43df-8f1d-37bddf62896a\" (UID: \"d699135f-ef5e-43df-8f1d-37bddf62896a\") " Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.700235 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d699135f-ef5e-43df-8f1d-37bddf62896a-logs" (OuterVolumeSpecName: "logs") pod "d699135f-ef5e-43df-8f1d-37bddf62896a" (UID: "d699135f-ef5e-43df-8f1d-37bddf62896a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.705340 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d699135f-ef5e-43df-8f1d-37bddf62896a-kube-api-access-bpc9k" (OuterVolumeSpecName: "kube-api-access-bpc9k") pod "d699135f-ef5e-43df-8f1d-37bddf62896a" (UID: "d699135f-ef5e-43df-8f1d-37bddf62896a"). InnerVolumeSpecName "kube-api-access-bpc9k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.731432 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d699135f-ef5e-43df-8f1d-37bddf62896a-config-data" (OuterVolumeSpecName: "config-data") pod "d699135f-ef5e-43df-8f1d-37bddf62896a" (UID: "d699135f-ef5e-43df-8f1d-37bddf62896a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.735921 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d699135f-ef5e-43df-8f1d-37bddf62896a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d699135f-ef5e-43df-8f1d-37bddf62896a" (UID: "d699135f-ef5e-43df-8f1d-37bddf62896a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.806833 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpc9k\" (UniqueName: \"kubernetes.io/projected/d699135f-ef5e-43df-8f1d-37bddf62896a-kube-api-access-bpc9k\") on node \"crc\" DevicePath \"\"" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.806878 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d699135f-ef5e-43df-8f1d-37bddf62896a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.806893 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d699135f-ef5e-43df-8f1d-37bddf62896a-logs\") on node \"crc\" DevicePath \"\"" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.806906 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d699135f-ef5e-43df-8f1d-37bddf62896a-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:15:33 crc kubenswrapper[4962]: I1003 13:15:33.938951 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 13:15:33 crc kubenswrapper[4962]: W1003 13:15:33.942455 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15ad9c69_05d8_4b75_82cc_f23f6303d7d7.slice/crio-c94483489b7c42395b9d1b98c978e40f89fa9feba64d23ec0422a827060d8dac WatchSource:0}: Error finding container c94483489b7c42395b9d1b98c978e40f89fa9feba64d23ec0422a827060d8dac: Status 404 returned error can't find the container with id c94483489b7c42395b9d1b98c978e40f89fa9feba64d23ec0422a827060d8dac Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.129084 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15ad9c69-05d8-4b75-82cc-f23f6303d7d7","Type":"ContainerStarted","Data":"c94483489b7c42395b9d1b98c978e40f89fa9feba64d23ec0422a827060d8dac"} Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.134954 4962 generic.go:334] "Generic (PLEG): container finished" podID="d699135f-ef5e-43df-8f1d-37bddf62896a" containerID="31034190c6f0fb85ffd2001001c9583edc1bf8c3c4320ef4261ab18054dd3002" exitCode=0 Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.136431 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.144986 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d699135f-ef5e-43df-8f1d-37bddf62896a","Type":"ContainerDied","Data":"31034190c6f0fb85ffd2001001c9583edc1bf8c3c4320ef4261ab18054dd3002"} Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.145040 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d699135f-ef5e-43df-8f1d-37bddf62896a","Type":"ContainerDied","Data":"b81bb1b8c0c35d39a601737a0766b1f5b6b995072d9fb390481b834f06c1d40f"} Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.145064 4962 scope.go:117] "RemoveContainer" containerID="31034190c6f0fb85ffd2001001c9583edc1bf8c3c4320ef4261ab18054dd3002" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.158543 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.176876 4962 scope.go:117] "RemoveContainer" containerID="441d9ce86b858a09b092c5d8f26aa60be9c2f67831378e82058a1d6d70c9afca" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.218894 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.218935 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.230047 4962 scope.go:117] "RemoveContainer" containerID="31034190c6f0fb85ffd2001001c9583edc1bf8c3c4320ef4261ab18054dd3002" Oct 03 13:15:34 crc kubenswrapper[4962]: E1003 13:15:34.230842 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31034190c6f0fb85ffd2001001c9583edc1bf8c3c4320ef4261ab18054dd3002\": container with ID starting with 31034190c6f0fb85ffd2001001c9583edc1bf8c3c4320ef4261ab18054dd3002 not found: ID does not exist" containerID="31034190c6f0fb85ffd2001001c9583edc1bf8c3c4320ef4261ab18054dd3002" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.230888 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31034190c6f0fb85ffd2001001c9583edc1bf8c3c4320ef4261ab18054dd3002"} err="failed to get container status \"31034190c6f0fb85ffd2001001c9583edc1bf8c3c4320ef4261ab18054dd3002\": rpc error: code = NotFound desc = could not find container \"31034190c6f0fb85ffd2001001c9583edc1bf8c3c4320ef4261ab18054dd3002\": container with ID starting with 31034190c6f0fb85ffd2001001c9583edc1bf8c3c4320ef4261ab18054dd3002 not found: ID does not exist" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.230915 4962 scope.go:117] "RemoveContainer" containerID="441d9ce86b858a09b092c5d8f26aa60be9c2f67831378e82058a1d6d70c9afca" Oct 03 13:15:34 crc kubenswrapper[4962]: E1003 13:15:34.231405 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"441d9ce86b858a09b092c5d8f26aa60be9c2f67831378e82058a1d6d70c9afca\": container with ID starting with 441d9ce86b858a09b092c5d8f26aa60be9c2f67831378e82058a1d6d70c9afca not found: ID does not exist" containerID="441d9ce86b858a09b092c5d8f26aa60be9c2f67831378e82058a1d6d70c9afca" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.231433 4962 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"441d9ce86b858a09b092c5d8f26aa60be9c2f67831378e82058a1d6d70c9afca"} err="failed to get container status \"441d9ce86b858a09b092c5d8f26aa60be9c2f67831378e82058a1d6d70c9afca\": rpc error: code = NotFound desc = could not find container \"441d9ce86b858a09b092c5d8f26aa60be9c2f67831378e82058a1d6d70c9afca\": container with ID starting with 441d9ce86b858a09b092c5d8f26aa60be9c2f67831378e82058a1d6d70c9afca not found: ID does not exist" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.259471 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e" path="/var/lib/kubelet/pods/c8aab2bd-24a7-4d0e-9a40-d2f5aa30028e/volumes" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.260443 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d699135f-ef5e-43df-8f1d-37bddf62896a" path="/var/lib/kubelet/pods/d699135f-ef5e-43df-8f1d-37bddf62896a/volumes" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.261015 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 03 13:15:34 crc kubenswrapper[4962]: E1003 13:15:34.261302 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d699135f-ef5e-43df-8f1d-37bddf62896a" containerName="nova-api-log" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.261321 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="d699135f-ef5e-43df-8f1d-37bddf62896a" containerName="nova-api-log" Oct 03 13:15:34 crc kubenswrapper[4962]: E1003 13:15:34.261337 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d699135f-ef5e-43df-8f1d-37bddf62896a" containerName="nova-api-api" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.261343 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="d699135f-ef5e-43df-8f1d-37bddf62896a" containerName="nova-api-api" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.261535 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="d699135f-ef5e-43df-8f1d-37bddf62896a" containerName="nova-api-log" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.261555 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="d699135f-ef5e-43df-8f1d-37bddf62896a" containerName="nova-api-api" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.264111 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.268003 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.268180 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.268284 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.279514 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.323344 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrdfh\" (UniqueName: \"kubernetes.io/projected/e52ec7fd-c43a-4bae-bffc-c63a327f248f-kube-api-access-qrdfh\") pod \"nova-api-0\" (UID: \"e52ec7fd-c43a-4bae-bffc-c63a327f248f\") " pod="openstack/nova-api-0" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.323576 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e52ec7fd-c43a-4bae-bffc-c63a327f248f-public-tls-certs\") pod \"nova-api-0\" (UID: \"e52ec7fd-c43a-4bae-bffc-c63a327f248f\") " pod="openstack/nova-api-0" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.323615 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e52ec7fd-c43a-4bae-bffc-c63a327f248f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e52ec7fd-c43a-4bae-bffc-c63a327f248f\") " pod="openstack/nova-api-0" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.323678 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e52ec7fd-c43a-4bae-bffc-c63a327f248f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e52ec7fd-c43a-4bae-bffc-c63a327f248f\") " pod="openstack/nova-api-0" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.323710 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e52ec7fd-c43a-4bae-bffc-c63a327f248f-logs\") pod \"nova-api-0\" (UID: \"e52ec7fd-c43a-4bae-bffc-c63a327f248f\") " pod="openstack/nova-api-0" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.323749 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e52ec7fd-c43a-4bae-bffc-c63a327f248f-config-data\") pod \"nova-api-0\" (UID: \"e52ec7fd-c43a-4bae-bffc-c63a327f248f\") " pod="openstack/nova-api-0" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.341194 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-hnv5m"] Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.342676 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hnv5m" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.346940 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.347349 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.358194 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-hnv5m"] Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.425536 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e52ec7fd-c43a-4bae-bffc-c63a327f248f-config-data\") pod \"nova-api-0\" (UID: \"e52ec7fd-c43a-4bae-bffc-c63a327f248f\") " pod="openstack/nova-api-0" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.425615 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrdfh\" (UniqueName: \"kubernetes.io/projected/e52ec7fd-c43a-4bae-bffc-c63a327f248f-kube-api-access-qrdfh\") pod \"nova-api-0\" (UID: \"e52ec7fd-c43a-4bae-bffc-c63a327f248f\") " pod="openstack/nova-api-0" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.425738 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32f6cf3d-edc9-48a9-8c78-0732b6693293-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-hnv5m\" (UID: \"32f6cf3d-edc9-48a9-8c78-0732b6693293\") " pod="openstack/nova-cell1-cell-mapping-hnv5m" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.425769 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32f6cf3d-edc9-48a9-8c78-0732b6693293-config-data\") pod \"nova-cell1-cell-mapping-hnv5m\" (UID: \"32f6cf3d-edc9-48a9-8c78-0732b6693293\") " pod="openstack/nova-cell1-cell-mapping-hnv5m" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.425792 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e52ec7fd-c43a-4bae-bffc-c63a327f248f-public-tls-certs\") pod \"nova-api-0\" (UID: \"e52ec7fd-c43a-4bae-bffc-c63a327f248f\") " pod="openstack/nova-api-0" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.425814 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl7k6\" (UniqueName: \"kubernetes.io/projected/32f6cf3d-edc9-48a9-8c78-0732b6693293-kube-api-access-pl7k6\") pod \"nova-cell1-cell-mapping-hnv5m\" (UID: \"32f6cf3d-edc9-48a9-8c78-0732b6693293\") " pod="openstack/nova-cell1-cell-mapping-hnv5m" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.425836 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e52ec7fd-c43a-4bae-bffc-c63a327f248f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e52ec7fd-c43a-4bae-bffc-c63a327f248f\") " pod="openstack/nova-api-0" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.425864 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32f6cf3d-edc9-48a9-8c78-0732b6693293-scripts\") pod \"nova-cell1-cell-mapping-hnv5m\" (UID: 
\"32f6cf3d-edc9-48a9-8c78-0732b6693293\") " pod="openstack/nova-cell1-cell-mapping-hnv5m" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.425881 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e52ec7fd-c43a-4bae-bffc-c63a327f248f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e52ec7fd-c43a-4bae-bffc-c63a327f248f\") " pod="openstack/nova-api-0" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.425899 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e52ec7fd-c43a-4bae-bffc-c63a327f248f-logs\") pod \"nova-api-0\" (UID: \"e52ec7fd-c43a-4bae-bffc-c63a327f248f\") " pod="openstack/nova-api-0" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.426426 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e52ec7fd-c43a-4bae-bffc-c63a327f248f-logs\") pod \"nova-api-0\" (UID: \"e52ec7fd-c43a-4bae-bffc-c63a327f248f\") " pod="openstack/nova-api-0" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.431194 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e52ec7fd-c43a-4bae-bffc-c63a327f248f-public-tls-certs\") pod \"nova-api-0\" (UID: \"e52ec7fd-c43a-4bae-bffc-c63a327f248f\") " pod="openstack/nova-api-0" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.431748 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e52ec7fd-c43a-4bae-bffc-c63a327f248f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e52ec7fd-c43a-4bae-bffc-c63a327f248f\") " pod="openstack/nova-api-0" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.431892 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e52ec7fd-c43a-4bae-bffc-c63a327f248f-config-data\") pod \"nova-api-0\" (UID: \"e52ec7fd-c43a-4bae-bffc-c63a327f248f\") " pod="openstack/nova-api-0" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.432364 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e52ec7fd-c43a-4bae-bffc-c63a327f248f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e52ec7fd-c43a-4bae-bffc-c63a327f248f\") " pod="openstack/nova-api-0" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.448089 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrdfh\" (UniqueName: \"kubernetes.io/projected/e52ec7fd-c43a-4bae-bffc-c63a327f248f-kube-api-access-qrdfh\") pod \"nova-api-0\" (UID: \"e52ec7fd-c43a-4bae-bffc-c63a327f248f\") " pod="openstack/nova-api-0" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.527988 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32f6cf3d-edc9-48a9-8c78-0732b6693293-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-hnv5m\" (UID: \"32f6cf3d-edc9-48a9-8c78-0732b6693293\") " pod="openstack/nova-cell1-cell-mapping-hnv5m" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.528336 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32f6cf3d-edc9-48a9-8c78-0732b6693293-config-data\") pod \"nova-cell1-cell-mapping-hnv5m\" (UID: 
\"32f6cf3d-edc9-48a9-8c78-0732b6693293\") " pod="openstack/nova-cell1-cell-mapping-hnv5m" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.528367 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl7k6\" (UniqueName: \"kubernetes.io/projected/32f6cf3d-edc9-48a9-8c78-0732b6693293-kube-api-access-pl7k6\") pod \"nova-cell1-cell-mapping-hnv5m\" (UID: \"32f6cf3d-edc9-48a9-8c78-0732b6693293\") " pod="openstack/nova-cell1-cell-mapping-hnv5m" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.528404 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32f6cf3d-edc9-48a9-8c78-0732b6693293-scripts\") pod \"nova-cell1-cell-mapping-hnv5m\" (UID: \"32f6cf3d-edc9-48a9-8c78-0732b6693293\") " pod="openstack/nova-cell1-cell-mapping-hnv5m" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.532194 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32f6cf3d-edc9-48a9-8c78-0732b6693293-config-data\") pod \"nova-cell1-cell-mapping-hnv5m\" (UID: \"32f6cf3d-edc9-48a9-8c78-0732b6693293\") " pod="openstack/nova-cell1-cell-mapping-hnv5m" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.532726 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32f6cf3d-edc9-48a9-8c78-0732b6693293-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-hnv5m\" (UID: \"32f6cf3d-edc9-48a9-8c78-0732b6693293\") " pod="openstack/nova-cell1-cell-mapping-hnv5m" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.541048 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32f6cf3d-edc9-48a9-8c78-0732b6693293-scripts\") pod \"nova-cell1-cell-mapping-hnv5m\" (UID: \"32f6cf3d-edc9-48a9-8c78-0732b6693293\") " pod="openstack/nova-cell1-cell-mapping-hnv5m" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.543626 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl7k6\" (UniqueName: \"kubernetes.io/projected/32f6cf3d-edc9-48a9-8c78-0732b6693293-kube-api-access-pl7k6\") pod \"nova-cell1-cell-mapping-hnv5m\" (UID: \"32f6cf3d-edc9-48a9-8c78-0732b6693293\") " pod="openstack/nova-cell1-cell-mapping-hnv5m" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.585256 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 13:15:34 crc kubenswrapper[4962]: I1003 13:15:34.697621 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hnv5m" Oct 03 13:15:35 crc kubenswrapper[4962]: I1003 13:15:35.010518 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 13:15:35 crc kubenswrapper[4962]: W1003 13:15:35.012155 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode52ec7fd_c43a_4bae_bffc_c63a327f248f.slice/crio-83cb5f3829a313918585a3a4c6d1b2545b07083264c50baf68ac86e7637491c1 WatchSource:0}: Error finding container 83cb5f3829a313918585a3a4c6d1b2545b07083264c50baf68ac86e7637491c1: Status 404 returned error can't find the container with id 83cb5f3829a313918585a3a4c6d1b2545b07083264c50baf68ac86e7637491c1 Oct 03 13:15:35 crc kubenswrapper[4962]: I1003 13:15:35.145314 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-hnv5m"] Oct 03 13:15:35 crc kubenswrapper[4962]: I1003 13:15:35.153196 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15ad9c69-05d8-4b75-82cc-f23f6303d7d7","Type":"ContainerStarted","Data":"5a1f2f720e928bb6b6acb1c0a85af470d0667de88a2c7df54ede75a00de60204"} Oct 03 13:15:35 crc kubenswrapper[4962]: I1003 13:15:35.154485 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e52ec7fd-c43a-4bae-bffc-c63a327f248f","Type":"ContainerStarted","Data":"83cb5f3829a313918585a3a4c6d1b2545b07083264c50baf68ac86e7637491c1"} Oct 03 13:15:35 crc kubenswrapper[4962]: W1003 13:15:35.154898 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32f6cf3d_edc9_48a9_8c78_0732b6693293.slice/crio-15ebb5cde7b119499931979b71089dea2efc76ddd4264e019a5968be33f3d159 WatchSource:0}: Error finding container 15ebb5cde7b119499931979b71089dea2efc76ddd4264e019a5968be33f3d159: Status 404 returned error can't find the container with id 15ebb5cde7b119499931979b71089dea2efc76ddd4264e019a5968be33f3d159 Oct 03 13:15:36 crc kubenswrapper[4962]: I1003 13:15:36.167156 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hnv5m" event={"ID":"32f6cf3d-edc9-48a9-8c78-0732b6693293","Type":"ContainerStarted","Data":"78e474fd6d91ce286900fc2f5b10cfa7e118e740653453d45c91f5fde6337be2"} Oct 03 13:15:36 crc kubenswrapper[4962]: I1003 13:15:36.167696 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hnv5m" event={"ID":"32f6cf3d-edc9-48a9-8c78-0732b6693293","Type":"ContainerStarted","Data":"15ebb5cde7b119499931979b71089dea2efc76ddd4264e019a5968be33f3d159"} Oct 03 13:15:36 crc kubenswrapper[4962]: I1003 13:15:36.172747 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15ad9c69-05d8-4b75-82cc-f23f6303d7d7","Type":"ContainerStarted","Data":"0daf5bb19acb882ee4245cd2958c71e3acf61abcdee9f27764c1a937ef9e54d3"} Oct 03 13:15:36 crc kubenswrapper[4962]: I1003 13:15:36.175386 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e52ec7fd-c43a-4bae-bffc-c63a327f248f","Type":"ContainerStarted","Data":"2e1f4937f56a133891958f5beaf35e7e0f634307db69c2ac32e12de71ad5f95d"} Oct 03 13:15:36 crc kubenswrapper[4962]: I1003 13:15:36.175414 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"e52ec7fd-c43a-4bae-bffc-c63a327f248f","Type":"ContainerStarted","Data":"83042c652f4037413abf62ed9c764c49bde0351ee466150498a8dd507a3d6bf1"} Oct 03 13:15:36 crc kubenswrapper[4962]: I1003 13:15:36.185440 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-hnv5m" podStartSLOduration=2.1854186110000002 podStartE2EDuration="2.185418611s" podCreationTimestamp="2025-10-03 13:15:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:15:36.183353356 +0000 UTC m=+1544.587251231" watchObservedRunningTime="2025-10-03 13:15:36.185418611 +0000 UTC m=+1544.589316456" Oct 03 13:15:36 crc kubenswrapper[4962]: I1003 13:15:36.205136 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.20511685 podStartE2EDuration="2.20511685s" podCreationTimestamp="2025-10-03 13:15:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:15:36.202157081 +0000 UTC m=+1544.606054936" watchObservedRunningTime="2025-10-03 13:15:36.20511685 +0000 UTC m=+1544.609014695" Oct 03 13:15:37 crc kubenswrapper[4962]: I1003 13:15:37.185886 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15ad9c69-05d8-4b75-82cc-f23f6303d7d7","Type":"ContainerStarted","Data":"f257bc79eebd262ca3fa0048575136a5f530237c8f848c61fcbe3df34711993b"} Oct 03 13:15:37 crc kubenswrapper[4962]: I1003 13:15:37.816796 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59cf4bdb65-94dnt" Oct 03 13:15:37 crc kubenswrapper[4962]: I1003 13:15:37.882745 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-rq9zm"] Oct 03 13:15:37 crc kubenswrapper[4962]: I1003 13:15:37.883300 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-845d6d6f59-rq9zm" podUID="0260ca36-4b03-4dfc-b212-39121ca7ceb1" containerName="dnsmasq-dns" containerID="cri-o://e13e76ed7b2b5957c583bc0dec45a956affda6e0e4eec93c073dc4e352bb1689" gracePeriod=10 Oct 03 13:15:38 crc kubenswrapper[4962]: I1003 13:15:38.036835 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-845d6d6f59-rq9zm" podUID="0260ca36-4b03-4dfc-b212-39121ca7ceb1" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.184:5353: connect: connection refused" Oct 03 13:15:38 crc kubenswrapper[4962]: I1003 13:15:38.207605 4962 generic.go:334] "Generic (PLEG): container finished" podID="0260ca36-4b03-4dfc-b212-39121ca7ceb1" containerID="e13e76ed7b2b5957c583bc0dec45a956affda6e0e4eec93c073dc4e352bb1689" exitCode=0 Oct 03 13:15:38 crc kubenswrapper[4962]: I1003 13:15:38.207943 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-rq9zm" event={"ID":"0260ca36-4b03-4dfc-b212-39121ca7ceb1","Type":"ContainerDied","Data":"e13e76ed7b2b5957c583bc0dec45a956affda6e0e4eec93c073dc4e352bb1689"} Oct 03 13:15:38 crc kubenswrapper[4962]: I1003 13:15:38.217243 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15ad9c69-05d8-4b75-82cc-f23f6303d7d7","Type":"ContainerStarted","Data":"cb12fdcf72cb818439ed6c57d7c01f490985bcb4351a8ce3800533ddd1e0259f"} Oct 03 13:15:38 crc kubenswrapper[4962]: I1003 13:15:38.218247 4962 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 13:15:38 crc kubenswrapper[4962]: I1003 13:15:38.242728 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.988473951 podStartE2EDuration="5.24270404s" podCreationTimestamp="2025-10-03 13:15:33 +0000 UTC" firstStartedPulling="2025-10-03 13:15:33.94448772 +0000 UTC m=+1542.348385555" lastFinishedPulling="2025-10-03 13:15:37.198717809 +0000 UTC m=+1545.602615644" observedRunningTime="2025-10-03 13:15:38.237535502 +0000 UTC m=+1546.641433357" watchObservedRunningTime="2025-10-03 13:15:38.24270404 +0000 UTC m=+1546.646601905" Oct 03 13:15:38 crc kubenswrapper[4962]: I1003 13:15:38.409380 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-rq9zm" Oct 03 13:15:38 crc kubenswrapper[4962]: I1003 13:15:38.511324 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0260ca36-4b03-4dfc-b212-39121ca7ceb1-ovsdbserver-nb\") pod \"0260ca36-4b03-4dfc-b212-39121ca7ceb1\" (UID: \"0260ca36-4b03-4dfc-b212-39121ca7ceb1\") " Oct 03 13:15:38 crc kubenswrapper[4962]: I1003 13:15:38.511427 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0260ca36-4b03-4dfc-b212-39121ca7ceb1-dns-swift-storage-0\") pod \"0260ca36-4b03-4dfc-b212-39121ca7ceb1\" (UID: \"0260ca36-4b03-4dfc-b212-39121ca7ceb1\") " Oct 03 13:15:38 crc kubenswrapper[4962]: I1003 13:15:38.511466 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmw9l\" (UniqueName: \"kubernetes.io/projected/0260ca36-4b03-4dfc-b212-39121ca7ceb1-kube-api-access-lmw9l\") pod \"0260ca36-4b03-4dfc-b212-39121ca7ceb1\" (UID: \"0260ca36-4b03-4dfc-b212-39121ca7ceb1\") " Oct 03 13:15:38 crc kubenswrapper[4962]: I1003 13:15:38.511519 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0260ca36-4b03-4dfc-b212-39121ca7ceb1-dns-svc\") pod \"0260ca36-4b03-4dfc-b212-39121ca7ceb1\" (UID: \"0260ca36-4b03-4dfc-b212-39121ca7ceb1\") " Oct 03 13:15:38 crc kubenswrapper[4962]: I1003 13:15:38.511591 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0260ca36-4b03-4dfc-b212-39121ca7ceb1-config\") pod \"0260ca36-4b03-4dfc-b212-39121ca7ceb1\" (UID: \"0260ca36-4b03-4dfc-b212-39121ca7ceb1\") " Oct 03 13:15:38 crc kubenswrapper[4962]: I1003 13:15:38.511650 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0260ca36-4b03-4dfc-b212-39121ca7ceb1-ovsdbserver-sb\") pod \"0260ca36-4b03-4dfc-b212-39121ca7ceb1\" (UID: \"0260ca36-4b03-4dfc-b212-39121ca7ceb1\") " Oct 03 13:15:38 crc kubenswrapper[4962]: I1003 13:15:38.517827 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0260ca36-4b03-4dfc-b212-39121ca7ceb1-kube-api-access-lmw9l" (OuterVolumeSpecName: "kube-api-access-lmw9l") pod "0260ca36-4b03-4dfc-b212-39121ca7ceb1" (UID: "0260ca36-4b03-4dfc-b212-39121ca7ceb1"). InnerVolumeSpecName "kube-api-access-lmw9l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:15:38 crc kubenswrapper[4962]: I1003 13:15:38.563613 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0260ca36-4b03-4dfc-b212-39121ca7ceb1-config" (OuterVolumeSpecName: "config") pod "0260ca36-4b03-4dfc-b212-39121ca7ceb1" (UID: "0260ca36-4b03-4dfc-b212-39121ca7ceb1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:15:38 crc kubenswrapper[4962]: I1003 13:15:38.570511 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0260ca36-4b03-4dfc-b212-39121ca7ceb1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0260ca36-4b03-4dfc-b212-39121ca7ceb1" (UID: "0260ca36-4b03-4dfc-b212-39121ca7ceb1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:15:38 crc kubenswrapper[4962]: I1003 13:15:38.570577 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0260ca36-4b03-4dfc-b212-39121ca7ceb1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0260ca36-4b03-4dfc-b212-39121ca7ceb1" (UID: "0260ca36-4b03-4dfc-b212-39121ca7ceb1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:15:38 crc kubenswrapper[4962]: I1003 13:15:38.573455 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0260ca36-4b03-4dfc-b212-39121ca7ceb1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0260ca36-4b03-4dfc-b212-39121ca7ceb1" (UID: "0260ca36-4b03-4dfc-b212-39121ca7ceb1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:15:38 crc kubenswrapper[4962]: I1003 13:15:38.577629 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0260ca36-4b03-4dfc-b212-39121ca7ceb1-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0260ca36-4b03-4dfc-b212-39121ca7ceb1" (UID: "0260ca36-4b03-4dfc-b212-39121ca7ceb1"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:15:38 crc kubenswrapper[4962]: I1003 13:15:38.613563 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0260ca36-4b03-4dfc-b212-39121ca7ceb1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 13:15:38 crc kubenswrapper[4962]: I1003 13:15:38.613597 4962 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0260ca36-4b03-4dfc-b212-39121ca7ceb1-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 13:15:38 crc kubenswrapper[4962]: I1003 13:15:38.613620 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmw9l\" (UniqueName: \"kubernetes.io/projected/0260ca36-4b03-4dfc-b212-39121ca7ceb1-kube-api-access-lmw9l\") on node \"crc\" DevicePath \"\"" Oct 03 13:15:38 crc kubenswrapper[4962]: I1003 13:15:38.613631 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0260ca36-4b03-4dfc-b212-39121ca7ceb1-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 13:15:38 crc kubenswrapper[4962]: I1003 13:15:38.613651 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0260ca36-4b03-4dfc-b212-39121ca7ceb1-config\") on node \"crc\" DevicePath \"\"" Oct 03 13:15:38 crc kubenswrapper[4962]: I1003 13:15:38.613658 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0260ca36-4b03-4dfc-b212-39121ca7ceb1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 13:15:39 crc kubenswrapper[4962]: I1003 13:15:39.236800 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-rq9zm" Oct 03 13:15:39 crc kubenswrapper[4962]: I1003 13:15:39.237690 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-rq9zm" event={"ID":"0260ca36-4b03-4dfc-b212-39121ca7ceb1","Type":"ContainerDied","Data":"6a644532d69307b1e7fa8cd5fb3f60c3e298f374984f572787aaf71f631361db"} Oct 03 13:15:39 crc kubenswrapper[4962]: I1003 13:15:39.237749 4962 scope.go:117] "RemoveContainer" containerID="e13e76ed7b2b5957c583bc0dec45a956affda6e0e4eec93c073dc4e352bb1689" Oct 03 13:15:39 crc kubenswrapper[4962]: I1003 13:15:39.270244 4962 scope.go:117] "RemoveContainer" containerID="d1b68cda7d5a20cfec8f59c429355eeb96cc67c121429fe51cbdf95084657390" Oct 03 13:15:39 crc kubenswrapper[4962]: I1003 13:15:39.272751 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-rq9zm"] Oct 03 13:15:39 crc kubenswrapper[4962]: I1003 13:15:39.280326 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-rq9zm"] Oct 03 13:15:40 crc kubenswrapper[4962]: I1003 13:15:40.240925 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0260ca36-4b03-4dfc-b212-39121ca7ceb1" path="/var/lib/kubelet/pods/0260ca36-4b03-4dfc-b212-39121ca7ceb1/volumes" Oct 03 13:15:41 crc kubenswrapper[4962]: I1003 13:15:41.277366 4962 generic.go:334] "Generic (PLEG): container finished" podID="32f6cf3d-edc9-48a9-8c78-0732b6693293" containerID="78e474fd6d91ce286900fc2f5b10cfa7e118e740653453d45c91f5fde6337be2" exitCode=0 Oct 03 13:15:41 crc kubenswrapper[4962]: I1003 13:15:41.277479 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hnv5m" 
event={"ID":"32f6cf3d-edc9-48a9-8c78-0732b6693293","Type":"ContainerDied","Data":"78e474fd6d91ce286900fc2f5b10cfa7e118e740653453d45c91f5fde6337be2"} Oct 03 13:15:43 crc kubenswrapper[4962]: I1003 13:15:42.686504 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hnv5m" Oct 03 13:15:43 crc kubenswrapper[4962]: I1003 13:15:42.795409 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32f6cf3d-edc9-48a9-8c78-0732b6693293-scripts\") pod \"32f6cf3d-edc9-48a9-8c78-0732b6693293\" (UID: \"32f6cf3d-edc9-48a9-8c78-0732b6693293\") " Oct 03 13:15:43 crc kubenswrapper[4962]: I1003 13:15:42.795457 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pl7k6\" (UniqueName: \"kubernetes.io/projected/32f6cf3d-edc9-48a9-8c78-0732b6693293-kube-api-access-pl7k6\") pod \"32f6cf3d-edc9-48a9-8c78-0732b6693293\" (UID: \"32f6cf3d-edc9-48a9-8c78-0732b6693293\") " Oct 03 13:15:43 crc kubenswrapper[4962]: I1003 13:15:42.795493 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32f6cf3d-edc9-48a9-8c78-0732b6693293-combined-ca-bundle\") pod \"32f6cf3d-edc9-48a9-8c78-0732b6693293\" (UID: \"32f6cf3d-edc9-48a9-8c78-0732b6693293\") " Oct 03 13:15:43 crc kubenswrapper[4962]: I1003 13:15:42.795652 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32f6cf3d-edc9-48a9-8c78-0732b6693293-config-data\") pod \"32f6cf3d-edc9-48a9-8c78-0732b6693293\" (UID: \"32f6cf3d-edc9-48a9-8c78-0732b6693293\") " Oct 03 13:15:43 crc kubenswrapper[4962]: I1003 13:15:42.801863 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32f6cf3d-edc9-48a9-8c78-0732b6693293-scripts" (OuterVolumeSpecName: "scripts") pod "32f6cf3d-edc9-48a9-8c78-0732b6693293" (UID: "32f6cf3d-edc9-48a9-8c78-0732b6693293"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:15:43 crc kubenswrapper[4962]: I1003 13:15:42.801869 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32f6cf3d-edc9-48a9-8c78-0732b6693293-kube-api-access-pl7k6" (OuterVolumeSpecName: "kube-api-access-pl7k6") pod "32f6cf3d-edc9-48a9-8c78-0732b6693293" (UID: "32f6cf3d-edc9-48a9-8c78-0732b6693293"). InnerVolumeSpecName "kube-api-access-pl7k6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:15:43 crc kubenswrapper[4962]: I1003 13:15:42.824917 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32f6cf3d-edc9-48a9-8c78-0732b6693293-config-data" (OuterVolumeSpecName: "config-data") pod "32f6cf3d-edc9-48a9-8c78-0732b6693293" (UID: "32f6cf3d-edc9-48a9-8c78-0732b6693293"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:15:43 crc kubenswrapper[4962]: I1003 13:15:42.829059 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32f6cf3d-edc9-48a9-8c78-0732b6693293-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32f6cf3d-edc9-48a9-8c78-0732b6693293" (UID: "32f6cf3d-edc9-48a9-8c78-0732b6693293"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:15:43 crc kubenswrapper[4962]: I1003 13:15:42.902245 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32f6cf3d-edc9-48a9-8c78-0732b6693293-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:15:43 crc kubenswrapper[4962]: I1003 13:15:42.902412 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32f6cf3d-edc9-48a9-8c78-0732b6693293-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:15:43 crc kubenswrapper[4962]: I1003 13:15:42.902422 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32f6cf3d-edc9-48a9-8c78-0732b6693293-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 13:15:43 crc kubenswrapper[4962]: I1003 13:15:42.902432 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pl7k6\" (UniqueName: \"kubernetes.io/projected/32f6cf3d-edc9-48a9-8c78-0732b6693293-kube-api-access-pl7k6\") on node \"crc\" DevicePath \"\"" Oct 03 13:15:43 crc kubenswrapper[4962]: I1003 13:15:43.303371 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hnv5m" event={"ID":"32f6cf3d-edc9-48a9-8c78-0732b6693293","Type":"ContainerDied","Data":"15ebb5cde7b119499931979b71089dea2efc76ddd4264e019a5968be33f3d159"} Oct 03 13:15:43 crc kubenswrapper[4962]: I1003 13:15:43.303437 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hnv5m" Oct 03 13:15:43 crc kubenswrapper[4962]: I1003 13:15:43.303481 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15ebb5cde7b119499931979b71089dea2efc76ddd4264e019a5968be33f3d159" Oct 03 13:15:43 crc kubenswrapper[4962]: I1003 13:15:43.489381 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 13:15:43 crc kubenswrapper[4962]: I1003 13:15:43.489677 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e52ec7fd-c43a-4bae-bffc-c63a327f248f" containerName="nova-api-log" containerID="cri-o://83042c652f4037413abf62ed9c764c49bde0351ee466150498a8dd507a3d6bf1" gracePeriod=30 Oct 03 13:15:43 crc kubenswrapper[4962]: I1003 13:15:43.490157 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e52ec7fd-c43a-4bae-bffc-c63a327f248f" containerName="nova-api-api" containerID="cri-o://2e1f4937f56a133891958f5beaf35e7e0f634307db69c2ac32e12de71ad5f95d" gracePeriod=30 Oct 03 13:15:43 crc kubenswrapper[4962]: I1003 13:15:43.499751 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 13:15:43 crc kubenswrapper[4962]: I1003 13:15:43.500071 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="21112027-f328-4fb2-b35a-1d14ac85a5ca" containerName="nova-scheduler-scheduler" containerID="cri-o://a8a6f40dc6127a6a933cf6d7715114b36ef34c0faa873e7229276b216a7ed0b8" gracePeriod=30 Oct 03 13:15:43 crc kubenswrapper[4962]: I1003 13:15:43.524585 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 13:15:43 crc kubenswrapper[4962]: I1003 13:15:43.524814 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="870a8837-baa4-44c0-a740-32468cee8d28" 
containerName="nova-metadata-log" containerID="cri-o://cd97b10a939c3080e4854ca11db0af3e71ed9e191670fb384068096bf40f0d17" gracePeriod=30 Oct 03 13:15:43 crc kubenswrapper[4962]: I1003 13:15:43.525177 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="870a8837-baa4-44c0-a740-32468cee8d28" containerName="nova-metadata-metadata" containerID="cri-o://b1ff4e9661dfc352561a1c4f50b9451aa8310173ac9d8a6a8048800aab24a0bd" gracePeriod=30 Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.033372 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.130672 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e52ec7fd-c43a-4bae-bffc-c63a327f248f-public-tls-certs\") pod \"e52ec7fd-c43a-4bae-bffc-c63a327f248f\" (UID: \"e52ec7fd-c43a-4bae-bffc-c63a327f248f\") " Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.130806 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrdfh\" (UniqueName: \"kubernetes.io/projected/e52ec7fd-c43a-4bae-bffc-c63a327f248f-kube-api-access-qrdfh\") pod \"e52ec7fd-c43a-4bae-bffc-c63a327f248f\" (UID: \"e52ec7fd-c43a-4bae-bffc-c63a327f248f\") " Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.131002 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e52ec7fd-c43a-4bae-bffc-c63a327f248f-combined-ca-bundle\") pod \"e52ec7fd-c43a-4bae-bffc-c63a327f248f\" (UID: \"e52ec7fd-c43a-4bae-bffc-c63a327f248f\") " Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.131040 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e52ec7fd-c43a-4bae-bffc-c63a327f248f-config-data\") pod \"e52ec7fd-c43a-4bae-bffc-c63a327f248f\" (UID: \"e52ec7fd-c43a-4bae-bffc-c63a327f248f\") " Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.131072 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e52ec7fd-c43a-4bae-bffc-c63a327f248f-logs\") pod \"e52ec7fd-c43a-4bae-bffc-c63a327f248f\" (UID: \"e52ec7fd-c43a-4bae-bffc-c63a327f248f\") " Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.131322 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e52ec7fd-c43a-4bae-bffc-c63a327f248f-internal-tls-certs\") pod \"e52ec7fd-c43a-4bae-bffc-c63a327f248f\" (UID: \"e52ec7fd-c43a-4bae-bffc-c63a327f248f\") " Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.133418 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e52ec7fd-c43a-4bae-bffc-c63a327f248f-logs" (OuterVolumeSpecName: "logs") pod "e52ec7fd-c43a-4bae-bffc-c63a327f248f" (UID: "e52ec7fd-c43a-4bae-bffc-c63a327f248f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.135886 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e52ec7fd-c43a-4bae-bffc-c63a327f248f-kube-api-access-qrdfh" (OuterVolumeSpecName: "kube-api-access-qrdfh") pod "e52ec7fd-c43a-4bae-bffc-c63a327f248f" (UID: "e52ec7fd-c43a-4bae-bffc-c63a327f248f"). InnerVolumeSpecName "kube-api-access-qrdfh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:15:44 crc kubenswrapper[4962]: E1003 13:15:44.143164 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a8a6f40dc6127a6a933cf6d7715114b36ef34c0faa873e7229276b216a7ed0b8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 13:15:44 crc kubenswrapper[4962]: E1003 13:15:44.144482 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a8a6f40dc6127a6a933cf6d7715114b36ef34c0faa873e7229276b216a7ed0b8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 13:15:44 crc kubenswrapper[4962]: E1003 13:15:44.146084 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a8a6f40dc6127a6a933cf6d7715114b36ef34c0faa873e7229276b216a7ed0b8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 13:15:44 crc kubenswrapper[4962]: E1003 13:15:44.146154 4962 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="21112027-f328-4fb2-b35a-1d14ac85a5ca" containerName="nova-scheduler-scheduler" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.159895 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e52ec7fd-c43a-4bae-bffc-c63a327f248f-config-data" (OuterVolumeSpecName: "config-data") pod "e52ec7fd-c43a-4bae-bffc-c63a327f248f" (UID: "e52ec7fd-c43a-4bae-bffc-c63a327f248f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.161840 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e52ec7fd-c43a-4bae-bffc-c63a327f248f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e52ec7fd-c43a-4bae-bffc-c63a327f248f" (UID: "e52ec7fd-c43a-4bae-bffc-c63a327f248f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.183363 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e52ec7fd-c43a-4bae-bffc-c63a327f248f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e52ec7fd-c43a-4bae-bffc-c63a327f248f" (UID: "e52ec7fd-c43a-4bae-bffc-c63a327f248f"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.191810 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e52ec7fd-c43a-4bae-bffc-c63a327f248f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e52ec7fd-c43a-4bae-bffc-c63a327f248f" (UID: "e52ec7fd-c43a-4bae-bffc-c63a327f248f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.234264 4962 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e52ec7fd-c43a-4bae-bffc-c63a327f248f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.234316 4962 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e52ec7fd-c43a-4bae-bffc-c63a327f248f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.234359 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrdfh\" (UniqueName: \"kubernetes.io/projected/e52ec7fd-c43a-4bae-bffc-c63a327f248f-kube-api-access-qrdfh\") on node \"crc\" DevicePath \"\"" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.234374 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e52ec7fd-c43a-4bae-bffc-c63a327f248f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.234389 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e52ec7fd-c43a-4bae-bffc-c63a327f248f-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.234400 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e52ec7fd-c43a-4bae-bffc-c63a327f248f-logs\") on node \"crc\" DevicePath \"\"" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.314984 4962 generic.go:334] "Generic (PLEG): container finished" podID="870a8837-baa4-44c0-a740-32468cee8d28" containerID="cd97b10a939c3080e4854ca11db0af3e71ed9e191670fb384068096bf40f0d17" exitCode=143 Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.315063 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"870a8837-baa4-44c0-a740-32468cee8d28","Type":"ContainerDied","Data":"cd97b10a939c3080e4854ca11db0af3e71ed9e191670fb384068096bf40f0d17"} Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.316840 4962 generic.go:334] "Generic (PLEG): container finished" podID="e52ec7fd-c43a-4bae-bffc-c63a327f248f" containerID="2e1f4937f56a133891958f5beaf35e7e0f634307db69c2ac32e12de71ad5f95d" exitCode=0 Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.316868 4962 generic.go:334] "Generic (PLEG): container finished" podID="e52ec7fd-c43a-4bae-bffc-c63a327f248f" containerID="83042c652f4037413abf62ed9c764c49bde0351ee466150498a8dd507a3d6bf1" exitCode=143 Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.316887 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e52ec7fd-c43a-4bae-bffc-c63a327f248f","Type":"ContainerDied","Data":"2e1f4937f56a133891958f5beaf35e7e0f634307db69c2ac32e12de71ad5f95d"} Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.316907 4962 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e52ec7fd-c43a-4bae-bffc-c63a327f248f","Type":"ContainerDied","Data":"83042c652f4037413abf62ed9c764c49bde0351ee466150498a8dd507a3d6bf1"} Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.316920 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e52ec7fd-c43a-4bae-bffc-c63a327f248f","Type":"ContainerDied","Data":"83cb5f3829a313918585a3a4c6d1b2545b07083264c50baf68ac86e7637491c1"} Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.316940 4962 scope.go:117] "RemoveContainer" containerID="2e1f4937f56a133891958f5beaf35e7e0f634307db69c2ac32e12de71ad5f95d" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.317078 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.347509 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.356955 4962 scope.go:117] "RemoveContainer" containerID="83042c652f4037413abf62ed9c764c49bde0351ee466150498a8dd507a3d6bf1" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.366554 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.380524 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 03 13:15:44 crc kubenswrapper[4962]: E1003 13:15:44.381007 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e52ec7fd-c43a-4bae-bffc-c63a327f248f" containerName="nova-api-api" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.381029 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="e52ec7fd-c43a-4bae-bffc-c63a327f248f" containerName="nova-api-api" Oct 03 13:15:44 crc kubenswrapper[4962]: E1003 13:15:44.381046 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e52ec7fd-c43a-4bae-bffc-c63a327f248f" containerName="nova-api-log" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.381057 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="e52ec7fd-c43a-4bae-bffc-c63a327f248f" containerName="nova-api-log" Oct 03 13:15:44 crc kubenswrapper[4962]: E1003 13:15:44.381076 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0260ca36-4b03-4dfc-b212-39121ca7ceb1" containerName="dnsmasq-dns" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.381086 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="0260ca36-4b03-4dfc-b212-39121ca7ceb1" containerName="dnsmasq-dns" Oct 03 13:15:44 crc kubenswrapper[4962]: E1003 13:15:44.381115 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32f6cf3d-edc9-48a9-8c78-0732b6693293" containerName="nova-manage" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.381123 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="32f6cf3d-edc9-48a9-8c78-0732b6693293" containerName="nova-manage" Oct 03 13:15:44 crc kubenswrapper[4962]: E1003 13:15:44.381158 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0260ca36-4b03-4dfc-b212-39121ca7ceb1" containerName="init" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.381166 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="0260ca36-4b03-4dfc-b212-39121ca7ceb1" containerName="init" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.381664 4962 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="32f6cf3d-edc9-48a9-8c78-0732b6693293" containerName="nova-manage" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.381693 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="e52ec7fd-c43a-4bae-bffc-c63a327f248f" containerName="nova-api-log" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.381715 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="0260ca36-4b03-4dfc-b212-39121ca7ceb1" containerName="dnsmasq-dns" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.381739 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="e52ec7fd-c43a-4bae-bffc-c63a327f248f" containerName="nova-api-api" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.382994 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.394135 4962 scope.go:117] "RemoveContainer" containerID="2e1f4937f56a133891958f5beaf35e7e0f634307db69c2ac32e12de71ad5f95d" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.394564 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.395747 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.396014 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 03 13:15:44 crc kubenswrapper[4962]: E1003 13:15:44.397861 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e1f4937f56a133891958f5beaf35e7e0f634307db69c2ac32e12de71ad5f95d\": container with ID starting with 2e1f4937f56a133891958f5beaf35e7e0f634307db69c2ac32e12de71ad5f95d not found: ID does not exist" containerID="2e1f4937f56a133891958f5beaf35e7e0f634307db69c2ac32e12de71ad5f95d" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.398008 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e1f4937f56a133891958f5beaf35e7e0f634307db69c2ac32e12de71ad5f95d"} err="failed to get container status \"2e1f4937f56a133891958f5beaf35e7e0f634307db69c2ac32e12de71ad5f95d\": rpc error: code = NotFound desc = could not find container \"2e1f4937f56a133891958f5beaf35e7e0f634307db69c2ac32e12de71ad5f95d\": container with ID starting with 2e1f4937f56a133891958f5beaf35e7e0f634307db69c2ac32e12de71ad5f95d not found: ID does not exist" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.398106 4962 scope.go:117] "RemoveContainer" containerID="83042c652f4037413abf62ed9c764c49bde0351ee466150498a8dd507a3d6bf1" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.412842 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 13:15:44 crc kubenswrapper[4962]: E1003 13:15:44.413438 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83042c652f4037413abf62ed9c764c49bde0351ee466150498a8dd507a3d6bf1\": container with ID starting with 83042c652f4037413abf62ed9c764c49bde0351ee466150498a8dd507a3d6bf1 not found: ID does not exist" containerID="83042c652f4037413abf62ed9c764c49bde0351ee466150498a8dd507a3d6bf1" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.413695 4962 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"83042c652f4037413abf62ed9c764c49bde0351ee466150498a8dd507a3d6bf1"} err="failed to get container status \"83042c652f4037413abf62ed9c764c49bde0351ee466150498a8dd507a3d6bf1\": rpc error: code = NotFound desc = could not find container \"83042c652f4037413abf62ed9c764c49bde0351ee466150498a8dd507a3d6bf1\": container with ID starting with 83042c652f4037413abf62ed9c764c49bde0351ee466150498a8dd507a3d6bf1 not found: ID does not exist" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.413797 4962 scope.go:117] "RemoveContainer" containerID="2e1f4937f56a133891958f5beaf35e7e0f634307db69c2ac32e12de71ad5f95d" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.420095 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e1f4937f56a133891958f5beaf35e7e0f634307db69c2ac32e12de71ad5f95d"} err="failed to get container status \"2e1f4937f56a133891958f5beaf35e7e0f634307db69c2ac32e12de71ad5f95d\": rpc error: code = NotFound desc = could not find container \"2e1f4937f56a133891958f5beaf35e7e0f634307db69c2ac32e12de71ad5f95d\": container with ID starting with 2e1f4937f56a133891958f5beaf35e7e0f634307db69c2ac32e12de71ad5f95d not found: ID does not exist" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.420209 4962 scope.go:117] "RemoveContainer" containerID="83042c652f4037413abf62ed9c764c49bde0351ee466150498a8dd507a3d6bf1" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.420925 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83042c652f4037413abf62ed9c764c49bde0351ee466150498a8dd507a3d6bf1"} err="failed to get container status \"83042c652f4037413abf62ed9c764c49bde0351ee466150498a8dd507a3d6bf1\": rpc error: code = NotFound desc = could not find container \"83042c652f4037413abf62ed9c764c49bde0351ee466150498a8dd507a3d6bf1\": container with ID starting with 83042c652f4037413abf62ed9c764c49bde0351ee466150498a8dd507a3d6bf1 not found: ID does not exist" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.540992 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d329c4da-aa05-4c80-ab30-622eac56428a-config-data\") pod \"nova-api-0\" (UID: \"d329c4da-aa05-4c80-ab30-622eac56428a\") " pod="openstack/nova-api-0" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.541401 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d329c4da-aa05-4c80-ab30-622eac56428a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d329c4da-aa05-4c80-ab30-622eac56428a\") " pod="openstack/nova-api-0" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.541440 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d329c4da-aa05-4c80-ab30-622eac56428a-public-tls-certs\") pod \"nova-api-0\" (UID: \"d329c4da-aa05-4c80-ab30-622eac56428a\") " pod="openstack/nova-api-0" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.541482 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d329c4da-aa05-4c80-ab30-622eac56428a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d329c4da-aa05-4c80-ab30-622eac56428a\") " pod="openstack/nova-api-0" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 
13:15:44.541511 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d329c4da-aa05-4c80-ab30-622eac56428a-logs\") pod \"nova-api-0\" (UID: \"d329c4da-aa05-4c80-ab30-622eac56428a\") " pod="openstack/nova-api-0" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.541567 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjthg\" (UniqueName: \"kubernetes.io/projected/d329c4da-aa05-4c80-ab30-622eac56428a-kube-api-access-xjthg\") pod \"nova-api-0\" (UID: \"d329c4da-aa05-4c80-ab30-622eac56428a\") " pod="openstack/nova-api-0" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.643772 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d329c4da-aa05-4c80-ab30-622eac56428a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d329c4da-aa05-4c80-ab30-622eac56428a\") " pod="openstack/nova-api-0" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.643833 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d329c4da-aa05-4c80-ab30-622eac56428a-public-tls-certs\") pod \"nova-api-0\" (UID: \"d329c4da-aa05-4c80-ab30-622eac56428a\") " pod="openstack/nova-api-0" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.643886 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d329c4da-aa05-4c80-ab30-622eac56428a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d329c4da-aa05-4c80-ab30-622eac56428a\") " pod="openstack/nova-api-0" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.643916 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d329c4da-aa05-4c80-ab30-622eac56428a-logs\") pod \"nova-api-0\" (UID: \"d329c4da-aa05-4c80-ab30-622eac56428a\") " pod="openstack/nova-api-0" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.644471 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d329c4da-aa05-4c80-ab30-622eac56428a-logs\") pod \"nova-api-0\" (UID: \"d329c4da-aa05-4c80-ab30-622eac56428a\") " pod="openstack/nova-api-0" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.645053 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjthg\" (UniqueName: \"kubernetes.io/projected/d329c4da-aa05-4c80-ab30-622eac56428a-kube-api-access-xjthg\") pod \"nova-api-0\" (UID: \"d329c4da-aa05-4c80-ab30-622eac56428a\") " pod="openstack/nova-api-0" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.645151 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d329c4da-aa05-4c80-ab30-622eac56428a-config-data\") pod \"nova-api-0\" (UID: \"d329c4da-aa05-4c80-ab30-622eac56428a\") " pod="openstack/nova-api-0" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.651203 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d329c4da-aa05-4c80-ab30-622eac56428a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d329c4da-aa05-4c80-ab30-622eac56428a\") " pod="openstack/nova-api-0" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.651247 4962 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d329c4da-aa05-4c80-ab30-622eac56428a-config-data\") pod \"nova-api-0\" (UID: \"d329c4da-aa05-4c80-ab30-622eac56428a\") " pod="openstack/nova-api-0" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.652145 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d329c4da-aa05-4c80-ab30-622eac56428a-public-tls-certs\") pod \"nova-api-0\" (UID: \"d329c4da-aa05-4c80-ab30-622eac56428a\") " pod="openstack/nova-api-0" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.652782 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d329c4da-aa05-4c80-ab30-622eac56428a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d329c4da-aa05-4c80-ab30-622eac56428a\") " pod="openstack/nova-api-0" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.666864 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjthg\" (UniqueName: \"kubernetes.io/projected/d329c4da-aa05-4c80-ab30-622eac56428a-kube-api-access-xjthg\") pod \"nova-api-0\" (UID: \"d329c4da-aa05-4c80-ab30-622eac56428a\") " pod="openstack/nova-api-0" Oct 03 13:15:44 crc kubenswrapper[4962]: I1003 13:15:44.724246 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 13:15:45 crc kubenswrapper[4962]: I1003 13:15:45.136292 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 13:15:45 crc kubenswrapper[4962]: W1003 13:15:45.140741 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd329c4da_aa05_4c80_ab30_622eac56428a.slice/crio-b2f488bcb242ea953b34e7a282f39d58dc3b72f8a69993c23384f95155a5300f WatchSource:0}: Error finding container b2f488bcb242ea953b34e7a282f39d58dc3b72f8a69993c23384f95155a5300f: Status 404 returned error can't find the container with id b2f488bcb242ea953b34e7a282f39d58dc3b72f8a69993c23384f95155a5300f Oct 03 13:15:45 crc kubenswrapper[4962]: I1003 13:15:45.227329 4962 scope.go:117] "RemoveContainer" containerID="a5bec38aec5fb9d5911cd2ecc57792eefe10d43cef3b74d0517cf307d832bc5c" Oct 03 13:15:45 crc kubenswrapper[4962]: E1003 13:15:45.227672 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:15:45 crc kubenswrapper[4962]: I1003 13:15:45.331250 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d329c4da-aa05-4c80-ab30-622eac56428a","Type":"ContainerStarted","Data":"024390e769ef5aec351b0a25e277cb3becb27c08b4c0917b93bfe0e1e2975164"} Oct 03 13:15:45 crc kubenswrapper[4962]: I1003 13:15:45.331457 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d329c4da-aa05-4c80-ab30-622eac56428a","Type":"ContainerStarted","Data":"b2f488bcb242ea953b34e7a282f39d58dc3b72f8a69993c23384f95155a5300f"} Oct 03 13:15:46 crc kubenswrapper[4962]: I1003 13:15:46.251336 4962 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="e52ec7fd-c43a-4bae-bffc-c63a327f248f" path="/var/lib/kubelet/pods/e52ec7fd-c43a-4bae-bffc-c63a327f248f/volumes" Oct 03 13:15:46 crc kubenswrapper[4962]: I1003 13:15:46.343698 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d329c4da-aa05-4c80-ab30-622eac56428a","Type":"ContainerStarted","Data":"05b9f6f45511688aab50c27ba2558d43869979d7395ad2e599ba9c06301661e8"} Oct 03 13:15:46 crc kubenswrapper[4962]: I1003 13:15:46.376032 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.376010351 podStartE2EDuration="2.376010351s" podCreationTimestamp="2025-10-03 13:15:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:15:46.368823468 +0000 UTC m=+1554.772721313" watchObservedRunningTime="2025-10-03 13:15:46.376010351 +0000 UTC m=+1554.779908196" Oct 03 13:15:46 crc kubenswrapper[4962]: I1003 13:15:46.681578 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="870a8837-baa4-44c0-a740-32468cee8d28" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.189:8775/\": read tcp 10.217.0.2:44538->10.217.0.189:8775: read: connection reset by peer" Oct 03 13:15:46 crc kubenswrapper[4962]: I1003 13:15:46.681869 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="870a8837-baa4-44c0-a740-32468cee8d28" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.189:8775/\": read tcp 10.217.0.2:44522->10.217.0.189:8775: read: connection reset by peer" Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.224193 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.298321 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/870a8837-baa4-44c0-a740-32468cee8d28-config-data\") pod \"870a8837-baa4-44c0-a740-32468cee8d28\" (UID: \"870a8837-baa4-44c0-a740-32468cee8d28\") " Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.298399 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsdp6\" (UniqueName: \"kubernetes.io/projected/870a8837-baa4-44c0-a740-32468cee8d28-kube-api-access-rsdp6\") pod \"870a8837-baa4-44c0-a740-32468cee8d28\" (UID: \"870a8837-baa4-44c0-a740-32468cee8d28\") " Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.298471 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/870a8837-baa4-44c0-a740-32468cee8d28-logs\") pod \"870a8837-baa4-44c0-a740-32468cee8d28\" (UID: \"870a8837-baa4-44c0-a740-32468cee8d28\") " Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.298499 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/870a8837-baa4-44c0-a740-32468cee8d28-nova-metadata-tls-certs\") pod \"870a8837-baa4-44c0-a740-32468cee8d28\" (UID: \"870a8837-baa4-44c0-a740-32468cee8d28\") " Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.298536 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/870a8837-baa4-44c0-a740-32468cee8d28-combined-ca-bundle\") pod \"870a8837-baa4-44c0-a740-32468cee8d28\" (UID: \"870a8837-baa4-44c0-a740-32468cee8d28\") " Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.299238 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/870a8837-baa4-44c0-a740-32468cee8d28-logs" (OuterVolumeSpecName: "logs") pod "870a8837-baa4-44c0-a740-32468cee8d28" (UID: "870a8837-baa4-44c0-a740-32468cee8d28"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.299996 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/870a8837-baa4-44c0-a740-32468cee8d28-logs\") on node \"crc\" DevicePath \"\"" Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.305786 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/870a8837-baa4-44c0-a740-32468cee8d28-kube-api-access-rsdp6" (OuterVolumeSpecName: "kube-api-access-rsdp6") pod "870a8837-baa4-44c0-a740-32468cee8d28" (UID: "870a8837-baa4-44c0-a740-32468cee8d28"). InnerVolumeSpecName "kube-api-access-rsdp6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.344514 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/870a8837-baa4-44c0-a740-32468cee8d28-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "870a8837-baa4-44c0-a740-32468cee8d28" (UID: "870a8837-baa4-44c0-a740-32468cee8d28"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.367772 4962 generic.go:334] "Generic (PLEG): container finished" podID="870a8837-baa4-44c0-a740-32468cee8d28" containerID="b1ff4e9661dfc352561a1c4f50b9451aa8310173ac9d8a6a8048800aab24a0bd" exitCode=0 Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.368771 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.370799 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"870a8837-baa4-44c0-a740-32468cee8d28","Type":"ContainerDied","Data":"b1ff4e9661dfc352561a1c4f50b9451aa8310173ac9d8a6a8048800aab24a0bd"} Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.370840 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"870a8837-baa4-44c0-a740-32468cee8d28","Type":"ContainerDied","Data":"565a8e6dbb4986224b8f1b15f699085bdcd0496de1cd19fb57eb5bed1dd78949"} Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.370862 4962 scope.go:117] "RemoveContainer" containerID="b1ff4e9661dfc352561a1c4f50b9451aa8310173ac9d8a6a8048800aab24a0bd" Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.382781 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/870a8837-baa4-44c0-a740-32468cee8d28-config-data" (OuterVolumeSpecName: "config-data") pod "870a8837-baa4-44c0-a740-32468cee8d28" (UID: "870a8837-baa4-44c0-a740-32468cee8d28"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.382906 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/870a8837-baa4-44c0-a740-32468cee8d28-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "870a8837-baa4-44c0-a740-32468cee8d28" (UID: "870a8837-baa4-44c0-a740-32468cee8d28"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.408746 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsdp6\" (UniqueName: \"kubernetes.io/projected/870a8837-baa4-44c0-a740-32468cee8d28-kube-api-access-rsdp6\") on node \"crc\" DevicePath \"\"" Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.408779 4962 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/870a8837-baa4-44c0-a740-32468cee8d28-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.408788 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/870a8837-baa4-44c0-a740-32468cee8d28-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.408801 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/870a8837-baa4-44c0-a740-32468cee8d28-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.434292 4962 scope.go:117] "RemoveContainer" containerID="cd97b10a939c3080e4854ca11db0af3e71ed9e191670fb384068096bf40f0d17" Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.461786 4962 scope.go:117] "RemoveContainer" containerID="b1ff4e9661dfc352561a1c4f50b9451aa8310173ac9d8a6a8048800aab24a0bd" Oct 03 13:15:47 crc kubenswrapper[4962]: E1003 13:15:47.462212 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1ff4e9661dfc352561a1c4f50b9451aa8310173ac9d8a6a8048800aab24a0bd\": container with ID starting with b1ff4e9661dfc352561a1c4f50b9451aa8310173ac9d8a6a8048800aab24a0bd not found: ID does not exist" containerID="b1ff4e9661dfc352561a1c4f50b9451aa8310173ac9d8a6a8048800aab24a0bd" Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.462243 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1ff4e9661dfc352561a1c4f50b9451aa8310173ac9d8a6a8048800aab24a0bd"} err="failed to get container status \"b1ff4e9661dfc352561a1c4f50b9451aa8310173ac9d8a6a8048800aab24a0bd\": rpc error: code = NotFound desc = could not find container \"b1ff4e9661dfc352561a1c4f50b9451aa8310173ac9d8a6a8048800aab24a0bd\": container with ID starting with b1ff4e9661dfc352561a1c4f50b9451aa8310173ac9d8a6a8048800aab24a0bd not found: ID does not exist" Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.462262 4962 scope.go:117] "RemoveContainer" containerID="cd97b10a939c3080e4854ca11db0af3e71ed9e191670fb384068096bf40f0d17" Oct 03 13:15:47 crc kubenswrapper[4962]: E1003 13:15:47.462531 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd97b10a939c3080e4854ca11db0af3e71ed9e191670fb384068096bf40f0d17\": container with ID starting with cd97b10a939c3080e4854ca11db0af3e71ed9e191670fb384068096bf40f0d17 not found: ID does not exist" containerID="cd97b10a939c3080e4854ca11db0af3e71ed9e191670fb384068096bf40f0d17" Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.462570 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd97b10a939c3080e4854ca11db0af3e71ed9e191670fb384068096bf40f0d17"} err="failed to get container status \"cd97b10a939c3080e4854ca11db0af3e71ed9e191670fb384068096bf40f0d17\": rpc 
error: code = NotFound desc = could not find container \"cd97b10a939c3080e4854ca11db0af3e71ed9e191670fb384068096bf40f0d17\": container with ID starting with cd97b10a939c3080e4854ca11db0af3e71ed9e191670fb384068096bf40f0d17 not found: ID does not exist" Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.705406 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.715071 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.728877 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 03 13:15:47 crc kubenswrapper[4962]: E1003 13:15:47.729463 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="870a8837-baa4-44c0-a740-32468cee8d28" containerName="nova-metadata-metadata" Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.729533 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="870a8837-baa4-44c0-a740-32468cee8d28" containerName="nova-metadata-metadata" Oct 03 13:15:47 crc kubenswrapper[4962]: E1003 13:15:47.729614 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="870a8837-baa4-44c0-a740-32468cee8d28" containerName="nova-metadata-log" Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.729732 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="870a8837-baa4-44c0-a740-32468cee8d28" containerName="nova-metadata-log" Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.729970 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="870a8837-baa4-44c0-a740-32468cee8d28" containerName="nova-metadata-log" Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.730033 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="870a8837-baa4-44c0-a740-32468cee8d28" containerName="nova-metadata-metadata" Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.731369 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.733754 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.733980 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.746708 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.815273 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd269d6d-5aa2-43c0-a23b-e76b52699d59-config-data\") pod \"nova-metadata-0\" (UID: \"dd269d6d-5aa2-43c0-a23b-e76b52699d59\") " pod="openstack/nova-metadata-0" Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.815334 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz9nn\" (UniqueName: \"kubernetes.io/projected/dd269d6d-5aa2-43c0-a23b-e76b52699d59-kube-api-access-dz9nn\") pod \"nova-metadata-0\" (UID: \"dd269d6d-5aa2-43c0-a23b-e76b52699d59\") " pod="openstack/nova-metadata-0" Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.815361 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd269d6d-5aa2-43c0-a23b-e76b52699d59-logs\") pod \"nova-metadata-0\" (UID: \"dd269d6d-5aa2-43c0-a23b-e76b52699d59\") " pod="openstack/nova-metadata-0" Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.815387 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd269d6d-5aa2-43c0-a23b-e76b52699d59-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dd269d6d-5aa2-43c0-a23b-e76b52699d59\") " pod="openstack/nova-metadata-0" Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.815425 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd269d6d-5aa2-43c0-a23b-e76b52699d59-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"dd269d6d-5aa2-43c0-a23b-e76b52699d59\") " pod="openstack/nova-metadata-0" Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.917155 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd269d6d-5aa2-43c0-a23b-e76b52699d59-config-data\") pod \"nova-metadata-0\" (UID: \"dd269d6d-5aa2-43c0-a23b-e76b52699d59\") " pod="openstack/nova-metadata-0" Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.917523 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dz9nn\" (UniqueName: \"kubernetes.io/projected/dd269d6d-5aa2-43c0-a23b-e76b52699d59-kube-api-access-dz9nn\") pod \"nova-metadata-0\" (UID: \"dd269d6d-5aa2-43c0-a23b-e76b52699d59\") " pod="openstack/nova-metadata-0" Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.917768 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd269d6d-5aa2-43c0-a23b-e76b52699d59-logs\") pod \"nova-metadata-0\" (UID: \"dd269d6d-5aa2-43c0-a23b-e76b52699d59\") " pod="openstack/nova-metadata-0" Oct 03 
13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.917927 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd269d6d-5aa2-43c0-a23b-e76b52699d59-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dd269d6d-5aa2-43c0-a23b-e76b52699d59\") " pod="openstack/nova-metadata-0" Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.918105 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd269d6d-5aa2-43c0-a23b-e76b52699d59-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"dd269d6d-5aa2-43c0-a23b-e76b52699d59\") " pod="openstack/nova-metadata-0" Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.918289 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd269d6d-5aa2-43c0-a23b-e76b52699d59-logs\") pod \"nova-metadata-0\" (UID: \"dd269d6d-5aa2-43c0-a23b-e76b52699d59\") " pod="openstack/nova-metadata-0" Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.923020 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd269d6d-5aa2-43c0-a23b-e76b52699d59-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dd269d6d-5aa2-43c0-a23b-e76b52699d59\") " pod="openstack/nova-metadata-0" Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.926816 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd269d6d-5aa2-43c0-a23b-e76b52699d59-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"dd269d6d-5aa2-43c0-a23b-e76b52699d59\") " pod="openstack/nova-metadata-0" Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.929873 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd269d6d-5aa2-43c0-a23b-e76b52699d59-config-data\") pod \"nova-metadata-0\" (UID: \"dd269d6d-5aa2-43c0-a23b-e76b52699d59\") " pod="openstack/nova-metadata-0" Oct 03 13:15:47 crc kubenswrapper[4962]: I1003 13:15:47.935086 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dz9nn\" (UniqueName: \"kubernetes.io/projected/dd269d6d-5aa2-43c0-a23b-e76b52699d59-kube-api-access-dz9nn\") pod \"nova-metadata-0\" (UID: \"dd269d6d-5aa2-43c0-a23b-e76b52699d59\") " pod="openstack/nova-metadata-0" Oct 03 13:15:48 crc kubenswrapper[4962]: I1003 13:15:48.046783 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 13:15:48 crc kubenswrapper[4962]: I1003 13:15:48.246816 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="870a8837-baa4-44c0-a740-32468cee8d28" path="/var/lib/kubelet/pods/870a8837-baa4-44c0-a740-32468cee8d28/volumes" Oct 03 13:15:48 crc kubenswrapper[4962]: I1003 13:15:48.380983 4962 generic.go:334] "Generic (PLEG): container finished" podID="21112027-f328-4fb2-b35a-1d14ac85a5ca" containerID="a8a6f40dc6127a6a933cf6d7715114b36ef34c0faa873e7229276b216a7ed0b8" exitCode=0 Oct 03 13:15:48 crc kubenswrapper[4962]: I1003 13:15:48.381083 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"21112027-f328-4fb2-b35a-1d14ac85a5ca","Type":"ContainerDied","Data":"a8a6f40dc6127a6a933cf6d7715114b36ef34c0faa873e7229276b216a7ed0b8"} Oct 03 13:15:48 crc kubenswrapper[4962]: I1003 13:15:48.382977 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 13:15:48 crc kubenswrapper[4962]: I1003 13:15:48.437543 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21112027-f328-4fb2-b35a-1d14ac85a5ca-combined-ca-bundle\") pod \"21112027-f328-4fb2-b35a-1d14ac85a5ca\" (UID: \"21112027-f328-4fb2-b35a-1d14ac85a5ca\") " Oct 03 13:15:48 crc kubenswrapper[4962]: I1003 13:15:48.438070 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8cjf\" (UniqueName: \"kubernetes.io/projected/21112027-f328-4fb2-b35a-1d14ac85a5ca-kube-api-access-f8cjf\") pod \"21112027-f328-4fb2-b35a-1d14ac85a5ca\" (UID: \"21112027-f328-4fb2-b35a-1d14ac85a5ca\") " Oct 03 13:15:48 crc kubenswrapper[4962]: I1003 13:15:48.438284 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21112027-f328-4fb2-b35a-1d14ac85a5ca-config-data\") pod \"21112027-f328-4fb2-b35a-1d14ac85a5ca\" (UID: \"21112027-f328-4fb2-b35a-1d14ac85a5ca\") " Oct 03 13:15:48 crc kubenswrapper[4962]: I1003 13:15:48.443794 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21112027-f328-4fb2-b35a-1d14ac85a5ca-kube-api-access-f8cjf" (OuterVolumeSpecName: "kube-api-access-f8cjf") pod "21112027-f328-4fb2-b35a-1d14ac85a5ca" (UID: "21112027-f328-4fb2-b35a-1d14ac85a5ca"). InnerVolumeSpecName "kube-api-access-f8cjf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:15:48 crc kubenswrapper[4962]: I1003 13:15:48.467870 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21112027-f328-4fb2-b35a-1d14ac85a5ca-config-data" (OuterVolumeSpecName: "config-data") pod "21112027-f328-4fb2-b35a-1d14ac85a5ca" (UID: "21112027-f328-4fb2-b35a-1d14ac85a5ca"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:15:48 crc kubenswrapper[4962]: I1003 13:15:48.470677 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21112027-f328-4fb2-b35a-1d14ac85a5ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "21112027-f328-4fb2-b35a-1d14ac85a5ca" (UID: "21112027-f328-4fb2-b35a-1d14ac85a5ca"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:15:48 crc kubenswrapper[4962]: I1003 13:15:48.540910 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21112027-f328-4fb2-b35a-1d14ac85a5ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:15:48 crc kubenswrapper[4962]: I1003 13:15:48.540939 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8cjf\" (UniqueName: \"kubernetes.io/projected/21112027-f328-4fb2-b35a-1d14ac85a5ca-kube-api-access-f8cjf\") on node \"crc\" DevicePath \"\"" Oct 03 13:15:48 crc kubenswrapper[4962]: I1003 13:15:48.540950 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21112027-f328-4fb2-b35a-1d14ac85a5ca-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:15:48 crc kubenswrapper[4962]: I1003 13:15:48.583914 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 13:15:49 crc kubenswrapper[4962]: I1003 13:15:49.391613 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"21112027-f328-4fb2-b35a-1d14ac85a5ca","Type":"ContainerDied","Data":"784acff432d9d0ceea2eecb2fcde0508ab251f4cb5aeed569bc7f1f4de9d1a2a"} Oct 03 13:15:49 crc kubenswrapper[4962]: I1003 13:15:49.391703 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 13:15:49 crc kubenswrapper[4962]: I1003 13:15:49.392003 4962 scope.go:117] "RemoveContainer" containerID="a8a6f40dc6127a6a933cf6d7715114b36ef34c0faa873e7229276b216a7ed0b8" Oct 03 13:15:49 crc kubenswrapper[4962]: I1003 13:15:49.393756 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dd269d6d-5aa2-43c0-a23b-e76b52699d59","Type":"ContainerStarted","Data":"3e39d9d9f9ef98752b8331bca21aeea43f0d1658741d17c5c7ad0d95aa684075"} Oct 03 13:15:49 crc kubenswrapper[4962]: I1003 13:15:49.393797 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dd269d6d-5aa2-43c0-a23b-e76b52699d59","Type":"ContainerStarted","Data":"ae65644f1732d90a709b491cf3a8a7c7ca3acd9268609750aa5824cea960f345"} Oct 03 13:15:49 crc kubenswrapper[4962]: I1003 13:15:49.393808 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dd269d6d-5aa2-43c0-a23b-e76b52699d59","Type":"ContainerStarted","Data":"066b514190e18f43574fae09b13806ce1c06fb24598a4d1db574de6345243bdc"} Oct 03 13:15:49 crc kubenswrapper[4962]: I1003 13:15:49.411367 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.41135095 podStartE2EDuration="2.41135095s" podCreationTimestamp="2025-10-03 13:15:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:15:49.409711236 +0000 UTC m=+1557.813609091" watchObservedRunningTime="2025-10-03 13:15:49.41135095 +0000 UTC m=+1557.815248785" Oct 03 13:15:49 crc kubenswrapper[4962]: I1003 13:15:49.432720 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 13:15:49 crc kubenswrapper[4962]: I1003 13:15:49.456346 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 13:15:49 crc kubenswrapper[4962]: I1003 13:15:49.465673 4962 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-scheduler-0"] Oct 03 13:15:49 crc kubenswrapper[4962]: E1003 13:15:49.466144 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21112027-f328-4fb2-b35a-1d14ac85a5ca" containerName="nova-scheduler-scheduler" Oct 03 13:15:49 crc kubenswrapper[4962]: I1003 13:15:49.466166 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="21112027-f328-4fb2-b35a-1d14ac85a5ca" containerName="nova-scheduler-scheduler" Oct 03 13:15:49 crc kubenswrapper[4962]: I1003 13:15:49.466423 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="21112027-f328-4fb2-b35a-1d14ac85a5ca" containerName="nova-scheduler-scheduler" Oct 03 13:15:49 crc kubenswrapper[4962]: I1003 13:15:49.467316 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 13:15:49 crc kubenswrapper[4962]: I1003 13:15:49.479220 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 13:15:49 crc kubenswrapper[4962]: I1003 13:15:49.479779 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 03 13:15:49 crc kubenswrapper[4962]: I1003 13:15:49.662345 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d36308a0-1b17-4986-adb2-2833b444a239-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d36308a0-1b17-4986-adb2-2833b444a239\") " pod="openstack/nova-scheduler-0" Oct 03 13:15:49 crc kubenswrapper[4962]: I1003 13:15:49.662471 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d36308a0-1b17-4986-adb2-2833b444a239-config-data\") pod \"nova-scheduler-0\" (UID: \"d36308a0-1b17-4986-adb2-2833b444a239\") " pod="openstack/nova-scheduler-0" Oct 03 13:15:49 crc kubenswrapper[4962]: I1003 13:15:49.662536 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xb25\" (UniqueName: \"kubernetes.io/projected/d36308a0-1b17-4986-adb2-2833b444a239-kube-api-access-5xb25\") pod \"nova-scheduler-0\" (UID: \"d36308a0-1b17-4986-adb2-2833b444a239\") " pod="openstack/nova-scheduler-0" Oct 03 13:15:49 crc kubenswrapper[4962]: I1003 13:15:49.764194 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d36308a0-1b17-4986-adb2-2833b444a239-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d36308a0-1b17-4986-adb2-2833b444a239\") " pod="openstack/nova-scheduler-0" Oct 03 13:15:49 crc kubenswrapper[4962]: I1003 13:15:49.764310 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d36308a0-1b17-4986-adb2-2833b444a239-config-data\") pod \"nova-scheduler-0\" (UID: \"d36308a0-1b17-4986-adb2-2833b444a239\") " pod="openstack/nova-scheduler-0" Oct 03 13:15:49 crc kubenswrapper[4962]: I1003 13:15:49.764351 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xb25\" (UniqueName: \"kubernetes.io/projected/d36308a0-1b17-4986-adb2-2833b444a239-kube-api-access-5xb25\") pod \"nova-scheduler-0\" (UID: \"d36308a0-1b17-4986-adb2-2833b444a239\") " pod="openstack/nova-scheduler-0" Oct 03 13:15:49 crc kubenswrapper[4962]: I1003 13:15:49.769973 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d36308a0-1b17-4986-adb2-2833b444a239-config-data\") pod \"nova-scheduler-0\" (UID: \"d36308a0-1b17-4986-adb2-2833b444a239\") " pod="openstack/nova-scheduler-0" Oct 03 13:15:49 crc kubenswrapper[4962]: I1003 13:15:49.770767 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d36308a0-1b17-4986-adb2-2833b444a239-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d36308a0-1b17-4986-adb2-2833b444a239\") " pod="openstack/nova-scheduler-0" Oct 03 13:15:49 crc kubenswrapper[4962]: I1003 13:15:49.779661 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xb25\" (UniqueName: \"kubernetes.io/projected/d36308a0-1b17-4986-adb2-2833b444a239-kube-api-access-5xb25\") pod \"nova-scheduler-0\" (UID: \"d36308a0-1b17-4986-adb2-2833b444a239\") " pod="openstack/nova-scheduler-0" Oct 03 13:15:49 crc kubenswrapper[4962]: I1003 13:15:49.796597 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 13:15:50 crc kubenswrapper[4962]: I1003 13:15:50.219111 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 13:15:50 crc kubenswrapper[4962]: W1003 13:15:50.222257 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd36308a0_1b17_4986_adb2_2833b444a239.slice/crio-6af4999c095bb3a40cc8d568790a2ef5ff6b6f5d269b92f1cd85d0e38a47af5d WatchSource:0}: Error finding container 6af4999c095bb3a40cc8d568790a2ef5ff6b6f5d269b92f1cd85d0e38a47af5d: Status 404 returned error can't find the container with id 6af4999c095bb3a40cc8d568790a2ef5ff6b6f5d269b92f1cd85d0e38a47af5d Oct 03 13:15:50 crc kubenswrapper[4962]: I1003 13:15:50.242553 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21112027-f328-4fb2-b35a-1d14ac85a5ca" path="/var/lib/kubelet/pods/21112027-f328-4fb2-b35a-1d14ac85a5ca/volumes" Oct 03 13:15:50 crc kubenswrapper[4962]: I1003 13:15:50.405133 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d36308a0-1b17-4986-adb2-2833b444a239","Type":"ContainerStarted","Data":"d52557ff30e196f1185add245797713d5c9f8bef9d3167e6d50a37017d0126f3"} Oct 03 13:15:50 crc kubenswrapper[4962]: I1003 13:15:50.405190 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d36308a0-1b17-4986-adb2-2833b444a239","Type":"ContainerStarted","Data":"6af4999c095bb3a40cc8d568790a2ef5ff6b6f5d269b92f1cd85d0e38a47af5d"} Oct 03 13:15:50 crc kubenswrapper[4962]: I1003 13:15:50.426449 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.426432624 podStartE2EDuration="1.426432624s" podCreationTimestamp="2025-10-03 13:15:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:15:50.418709707 +0000 UTC m=+1558.822607542" watchObservedRunningTime="2025-10-03 13:15:50.426432624 +0000 UTC m=+1558.830330459" Oct 03 13:15:53 crc kubenswrapper[4962]: I1003 13:15:53.046903 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 13:15:53 crc kubenswrapper[4962]: I1003 13:15:53.047483 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-metadata-0" Oct 03 13:15:54 crc kubenswrapper[4962]: I1003 13:15:54.725562 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 13:15:54 crc kubenswrapper[4962]: I1003 13:15:54.725855 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 13:15:54 crc kubenswrapper[4962]: I1003 13:15:54.797142 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 03 13:15:55 crc kubenswrapper[4962]: I1003 13:15:55.739408 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d329c4da-aa05-4c80-ab30-622eac56428a" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.199:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 13:15:55 crc kubenswrapper[4962]: I1003 13:15:55.741553 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d329c4da-aa05-4c80-ab30-622eac56428a" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.199:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 13:15:58 crc kubenswrapper[4962]: I1003 13:15:58.047560 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 03 13:15:58 crc kubenswrapper[4962]: I1003 13:15:58.048832 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 03 13:15:58 crc kubenswrapper[4962]: I1003 13:15:58.226897 4962 scope.go:117] "RemoveContainer" containerID="a5bec38aec5fb9d5911cd2ecc57792eefe10d43cef3b74d0517cf307d832bc5c" Oct 03 13:15:58 crc kubenswrapper[4962]: E1003 13:15:58.227133 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:15:59 crc kubenswrapper[4962]: I1003 13:15:59.064836 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="dd269d6d-5aa2-43c0-a23b-e76b52699d59" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 13:15:59 crc kubenswrapper[4962]: I1003 13:15:59.064849 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="dd269d6d-5aa2-43c0-a23b-e76b52699d59" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 13:15:59 crc kubenswrapper[4962]: I1003 13:15:59.796849 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 03 13:15:59 crc kubenswrapper[4962]: I1003 13:15:59.827785 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 03 13:16:00 crc kubenswrapper[4962]: I1003 13:16:00.562314 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 03 13:16:03 crc 
kubenswrapper[4962]: I1003 13:16:03.514335 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 03 13:16:04 crc kubenswrapper[4962]: I1003 13:16:04.735082 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 03 13:16:04 crc kubenswrapper[4962]: I1003 13:16:04.736317 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 03 13:16:04 crc kubenswrapper[4962]: I1003 13:16:04.737539 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 03 13:16:04 crc kubenswrapper[4962]: I1003 13:16:04.743045 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 03 13:16:05 crc kubenswrapper[4962]: I1003 13:16:05.596990 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 03 13:16:05 crc kubenswrapper[4962]: I1003 13:16:05.677066 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 03 13:16:08 crc kubenswrapper[4962]: I1003 13:16:08.052341 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 03 13:16:08 crc kubenswrapper[4962]: I1003 13:16:08.053195 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 03 13:16:08 crc kubenswrapper[4962]: I1003 13:16:08.057347 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 03 13:16:08 crc kubenswrapper[4962]: I1003 13:16:08.627761 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 03 13:16:13 crc kubenswrapper[4962]: I1003 13:16:13.228144 4962 scope.go:117] "RemoveContainer" containerID="a5bec38aec5fb9d5911cd2ecc57792eefe10d43cef3b74d0517cf307d832bc5c" Oct 03 13:16:13 crc kubenswrapper[4962]: E1003 13:16:13.228742 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:16:27 crc kubenswrapper[4962]: I1003 13:16:27.227729 4962 scope.go:117] "RemoveContainer" containerID="a5bec38aec5fb9d5911cd2ecc57792eefe10d43cef3b74d0517cf307d832bc5c" Oct 03 13:16:27 crc kubenswrapper[4962]: E1003 13:16:27.228494 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:16:28 crc kubenswrapper[4962]: I1003 13:16:28.988883 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 13:16:28 crc kubenswrapper[4962]: I1003 13:16:28.989369 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="009b2959-1113-4574-a2ec-90bbe2d8f8ef" containerName="cinder-scheduler" 
containerID="cri-o://8cc49fd9ef4981aeae1d009b88725c8dbebd9a3f0713241e71542b8306508bad" gracePeriod=30 Oct 03 13:16:28 crc kubenswrapper[4962]: I1003 13:16:28.989857 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="009b2959-1113-4574-a2ec-90bbe2d8f8ef" containerName="probe" containerID="cri-o://ef900e494d8a0abde750256fdbb2b7f39a5a8f037757e8bc1a381d77522fc261" gracePeriod=30 Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.005972 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.030993 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.031279 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="6ae29e17-1d99-4401-a317-9c8b7be58a3c" containerName="cinder-api-log" containerID="cri-o://1ffd9ed0756445b7f118da7e647ddacf67d26cbedb3b89a2b3074c6bedfe80b2" gracePeriod=30 Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.031723 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="6ae29e17-1d99-4401-a317-9c8b7be58a3c" containerName="cinder-api" containerID="cri-o://5338fa6e89e4bd15f9e4db940f00a7a5b8fdf7c12ab81b1de2fe6747f81ea20d" gracePeriod=30 Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.054561 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-c97f5c65f-s279k"] Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.056256 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-c97f5c65f-s279k" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.088385 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-api-0" podUID="6ae29e17-1d99-4401-a317-9c8b7be58a3c" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.166:8776/healthcheck\": EOF" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.089482 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-59bf856dfd-t86xg"] Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.091129 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-59bf856dfd-t86xg" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.136345 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-c97f5c65f-s279k"] Oct 03 13:16:29 crc kubenswrapper[4962]: E1003 13:16:29.171534 4962 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 03 13:16:29 crc kubenswrapper[4962]: E1003 13:16:29.171607 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/221bdd26-0fec-49e5-86ec-c2aefe7a5902-config-data podName:221bdd26-0fec-49e5-86ec-c2aefe7a5902 nodeName:}" failed. No retries permitted until 2025-10-03 13:16:29.671585629 +0000 UTC m=+1598.075483464 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/221bdd26-0fec-49e5-86ec-c2aefe7a5902-config-data") pod "rabbitmq-cell1-server-0" (UID: "221bdd26-0fec-49e5-86ec-c2aefe7a5902") : configmap "rabbitmq-cell1-config-data" not found Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.225706 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-59bf856dfd-t86xg"] Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.274680 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-logs\") pod \"barbican-keystone-listener-59bf856dfd-t86xg\" (UID: \"e6f0fc0a-ae8e-445e-ad05-591b7ab00886\") " pod="openstack/barbican-keystone-listener-59bf856dfd-t86xg" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.274742 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ae87940-f07d-4213-bc0b-da0b3a2bba84-config-data\") pod \"barbican-worker-c97f5c65f-s279k\" (UID: \"0ae87940-f07d-4213-bc0b-da0b3a2bba84\") " pod="openstack/barbican-worker-c97f5c65f-s279k" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.274804 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9xsp\" (UniqueName: \"kubernetes.io/projected/0ae87940-f07d-4213-bc0b-da0b3a2bba84-kube-api-access-l9xsp\") pod \"barbican-worker-c97f5c65f-s279k\" (UID: \"0ae87940-f07d-4213-bc0b-da0b3a2bba84\") " pod="openstack/barbican-worker-c97f5c65f-s279k" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.274845 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ae87940-f07d-4213-bc0b-da0b3a2bba84-config-data-custom\") pod \"barbican-worker-c97f5c65f-s279k\" (UID: \"0ae87940-f07d-4213-bc0b-da0b3a2bba84\") " pod="openstack/barbican-worker-c97f5c65f-s279k" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.274878 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ae87940-f07d-4213-bc0b-da0b3a2bba84-combined-ca-bundle\") pod \"barbican-worker-c97f5c65f-s279k\" (UID: \"0ae87940-f07d-4213-bc0b-da0b3a2bba84\") " pod="openstack/barbican-worker-c97f5c65f-s279k" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.274972 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-config-data\") pod \"barbican-keystone-listener-59bf856dfd-t86xg\" (UID: \"e6f0fc0a-ae8e-445e-ad05-591b7ab00886\") " pod="openstack/barbican-keystone-listener-59bf856dfd-t86xg" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.275016 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ae87940-f07d-4213-bc0b-da0b3a2bba84-logs\") pod \"barbican-worker-c97f5c65f-s279k\" (UID: \"0ae87940-f07d-4213-bc0b-da0b3a2bba84\") " pod="openstack/barbican-worker-c97f5c65f-s279k" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.275066 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cr49\" 
(UniqueName: \"kubernetes.io/projected/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-kube-api-access-7cr49\") pod \"barbican-keystone-listener-59bf856dfd-t86xg\" (UID: \"e6f0fc0a-ae8e-445e-ad05-591b7ab00886\") " pod="openstack/barbican-keystone-listener-59bf856dfd-t86xg" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.275119 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-combined-ca-bundle\") pod \"barbican-keystone-listener-59bf856dfd-t86xg\" (UID: \"e6f0fc0a-ae8e-445e-ad05-591b7ab00886\") " pod="openstack/barbican-keystone-listener-59bf856dfd-t86xg" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.275145 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-config-data-custom\") pod \"barbican-keystone-listener-59bf856dfd-t86xg\" (UID: \"e6f0fc0a-ae8e-445e-ad05-591b7ab00886\") " pod="openstack/barbican-keystone-listener-59bf856dfd-t86xg" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.275295 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.275526 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="75210a15-c36f-4be9-9709-ceb4eb2c4646" containerName="openstackclient" containerID="cri-o://03c2d3455408832765cd2d68dafdea03cfe1c3257309db0900840a113095f0b9" gracePeriod=2 Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.332730 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.363993 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement453e-account-delete-v68p5"] Oct 03 13:16:29 crc kubenswrapper[4962]: E1003 13:16:29.364421 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75210a15-c36f-4be9-9709-ceb4eb2c4646" containerName="openstackclient" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.364438 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="75210a15-c36f-4be9-9709-ceb4eb2c4646" containerName="openstackclient" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.364666 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="75210a15-c36f-4be9-9709-ceb4eb2c4646" containerName="openstackclient" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.365433 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement453e-account-delete-v68p5" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.378566 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.378963 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="6313803e-1bf1-4a99-8af7-cb80c0e6321c" containerName="openstack-network-exporter" containerID="cri-o://d2fb6e730baadf5cce5c8d3a7e70507b921c0b75296b297850b5803cf0722b8a" gracePeriod=300 Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.380277 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9xsp\" (UniqueName: \"kubernetes.io/projected/0ae87940-f07d-4213-bc0b-da0b3a2bba84-kube-api-access-l9xsp\") pod \"barbican-worker-c97f5c65f-s279k\" (UID: \"0ae87940-f07d-4213-bc0b-da0b3a2bba84\") " pod="openstack/barbican-worker-c97f5c65f-s279k" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.380330 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ae87940-f07d-4213-bc0b-da0b3a2bba84-config-data-custom\") pod \"barbican-worker-c97f5c65f-s279k\" (UID: \"0ae87940-f07d-4213-bc0b-da0b3a2bba84\") " pod="openstack/barbican-worker-c97f5c65f-s279k" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.380354 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ae87940-f07d-4213-bc0b-da0b3a2bba84-combined-ca-bundle\") pod \"barbican-worker-c97f5c65f-s279k\" (UID: \"0ae87940-f07d-4213-bc0b-da0b3a2bba84\") " pod="openstack/barbican-worker-c97f5c65f-s279k" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.380447 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-config-data\") pod \"barbican-keystone-listener-59bf856dfd-t86xg\" (UID: \"e6f0fc0a-ae8e-445e-ad05-591b7ab00886\") " pod="openstack/barbican-keystone-listener-59bf856dfd-t86xg" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.380483 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ae87940-f07d-4213-bc0b-da0b3a2bba84-logs\") pod \"barbican-worker-c97f5c65f-s279k\" (UID: \"0ae87940-f07d-4213-bc0b-da0b3a2bba84\") " pod="openstack/barbican-worker-c97f5c65f-s279k" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.380515 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cr49\" (UniqueName: \"kubernetes.io/projected/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-kube-api-access-7cr49\") pod \"barbican-keystone-listener-59bf856dfd-t86xg\" (UID: \"e6f0fc0a-ae8e-445e-ad05-591b7ab00886\") " pod="openstack/barbican-keystone-listener-59bf856dfd-t86xg" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.380557 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-combined-ca-bundle\") pod \"barbican-keystone-listener-59bf856dfd-t86xg\" (UID: \"e6f0fc0a-ae8e-445e-ad05-591b7ab00886\") " pod="openstack/barbican-keystone-listener-59bf856dfd-t86xg" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.380578 4962 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-config-data-custom\") pod \"barbican-keystone-listener-59bf856dfd-t86xg\" (UID: \"e6f0fc0a-ae8e-445e-ad05-591b7ab00886\") " pod="openstack/barbican-keystone-listener-59bf856dfd-t86xg" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.380613 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-logs\") pod \"barbican-keystone-listener-59bf856dfd-t86xg\" (UID: \"e6f0fc0a-ae8e-445e-ad05-591b7ab00886\") " pod="openstack/barbican-keystone-listener-59bf856dfd-t86xg" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.380660 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ae87940-f07d-4213-bc0b-da0b3a2bba84-config-data\") pod \"barbican-worker-c97f5c65f-s279k\" (UID: \"0ae87940-f07d-4213-bc0b-da0b3a2bba84\") " pod="openstack/barbican-worker-c97f5c65f-s279k" Oct 03 13:16:29 crc kubenswrapper[4962]: E1003 13:16:29.392199 4962 configmap.go:193] Couldn't get configMap openstack/ovncontroller-scripts: configmap "ovncontroller-scripts" not found Oct 03 13:16:29 crc kubenswrapper[4962]: E1003 13:16:29.392267 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6d6f62dd-0720-46b6-b0a8-497490f052a8-scripts podName:6d6f62dd-0720-46b6-b0a8-497490f052a8 nodeName:}" failed. No retries permitted until 2025-10-03 13:16:29.892246676 +0000 UTC m=+1598.296144501 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/configmap/6d6f62dd-0720-46b6-b0a8-497490f052a8-scripts") pod "ovn-controller-6sqdm" (UID: "6d6f62dd-0720-46b6-b0a8-497490f052a8") : configmap "ovncontroller-scripts" not found Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.392952 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ae87940-f07d-4213-bc0b-da0b3a2bba84-logs\") pod \"barbican-worker-c97f5c65f-s279k\" (UID: \"0ae87940-f07d-4213-bc0b-da0b3a2bba84\") " pod="openstack/barbican-worker-c97f5c65f-s279k" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.394203 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-logs\") pod \"barbican-keystone-listener-59bf856dfd-t86xg\" (UID: \"e6f0fc0a-ae8e-445e-ad05-591b7ab00886\") " pod="openstack/barbican-keystone-listener-59bf856dfd-t86xg" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.422697 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-combined-ca-bundle\") pod \"barbican-keystone-listener-59bf856dfd-t86xg\" (UID: \"e6f0fc0a-ae8e-445e-ad05-591b7ab00886\") " pod="openstack/barbican-keystone-listener-59bf856dfd-t86xg" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.423955 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-config-data\") pod \"barbican-keystone-listener-59bf856dfd-t86xg\" (UID: \"e6f0fc0a-ae8e-445e-ad05-591b7ab00886\") " pod="openstack/barbican-keystone-listener-59bf856dfd-t86xg" Oct 03 13:16:29 crc kubenswrapper[4962]: 
I1003 13:16:29.436770 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ae87940-f07d-4213-bc0b-da0b3a2bba84-config-data-custom\") pod \"barbican-worker-c97f5c65f-s279k\" (UID: \"0ae87940-f07d-4213-bc0b-da0b3a2bba84\") " pod="openstack/barbican-worker-c97f5c65f-s279k" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.441297 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ae87940-f07d-4213-bc0b-da0b3a2bba84-config-data\") pod \"barbican-worker-c97f5c65f-s279k\" (UID: \"0ae87940-f07d-4213-bc0b-da0b3a2bba84\") " pod="openstack/barbican-worker-c97f5c65f-s279k" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.441337 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ae87940-f07d-4213-bc0b-da0b3a2bba84-combined-ca-bundle\") pod \"barbican-worker-c97f5c65f-s279k\" (UID: \"0ae87940-f07d-4213-bc0b-da0b3a2bba84\") " pod="openstack/barbican-worker-c97f5c65f-s279k" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.442161 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cr49\" (UniqueName: \"kubernetes.io/projected/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-kube-api-access-7cr49\") pod \"barbican-keystone-listener-59bf856dfd-t86xg\" (UID: \"e6f0fc0a-ae8e-445e-ad05-591b7ab00886\") " pod="openstack/barbican-keystone-listener-59bf856dfd-t86xg" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.443138 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-config-data-custom\") pod \"barbican-keystone-listener-59bf856dfd-t86xg\" (UID: \"e6f0fc0a-ae8e-445e-ad05-591b7ab00886\") " pod="openstack/barbican-keystone-listener-59bf856dfd-t86xg" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.455328 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement453e-account-delete-v68p5"] Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.476312 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9xsp\" (UniqueName: \"kubernetes.io/projected/0ae87940-f07d-4213-bc0b-da0b3a2bba84-kube-api-access-l9xsp\") pod \"barbican-worker-c97f5c65f-s279k\" (UID: \"0ae87940-f07d-4213-bc0b-da0b3a2bba84\") " pod="openstack/barbican-worker-c97f5c65f-s279k" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.483159 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv9w7\" (UniqueName: \"kubernetes.io/projected/b42a368b-6dd4-4bb0-83a8-d79138605ec9-kube-api-access-sv9w7\") pod \"placement453e-account-delete-v68p5\" (UID: \"b42a368b-6dd4-4bb0-83a8-d79138605ec9\") " pod="openstack/placement453e-account-delete-v68p5" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.495742 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7c44799d88-mmmm6"] Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.500835 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7c44799d88-mmmm6" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.549533 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7c44799d88-mmmm6"] Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.584997 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dab0e7ec-9c64-491d-a655-027098042378-logs\") pod \"barbican-api-7c44799d88-mmmm6\" (UID: \"dab0e7ec-9c64-491d-a655-027098042378\") " pod="openstack/barbican-api-7c44799d88-mmmm6" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.585033 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-config-data\") pod \"barbican-api-7c44799d88-mmmm6\" (UID: \"dab0e7ec-9c64-491d-a655-027098042378\") " pod="openstack/barbican-api-7c44799d88-mmmm6" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.585063 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bx2f\" (UniqueName: \"kubernetes.io/projected/dab0e7ec-9c64-491d-a655-027098042378-kube-api-access-9bx2f\") pod \"barbican-api-7c44799d88-mmmm6\" (UID: \"dab0e7ec-9c64-491d-a655-027098042378\") " pod="openstack/barbican-api-7c44799d88-mmmm6" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.585082 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-combined-ca-bundle\") pod \"barbican-api-7c44799d88-mmmm6\" (UID: \"dab0e7ec-9c64-491d-a655-027098042378\") " pod="openstack/barbican-api-7c44799d88-mmmm6" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.585114 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-internal-tls-certs\") pod \"barbican-api-7c44799d88-mmmm6\" (UID: \"dab0e7ec-9c64-491d-a655-027098042378\") " pod="openstack/barbican-api-7c44799d88-mmmm6" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.585166 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sv9w7\" (UniqueName: \"kubernetes.io/projected/b42a368b-6dd4-4bb0-83a8-d79138605ec9-kube-api-access-sv9w7\") pod \"placement453e-account-delete-v68p5\" (UID: \"b42a368b-6dd4-4bb0-83a8-d79138605ec9\") " pod="openstack/placement453e-account-delete-v68p5" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.585217 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-config-data-custom\") pod \"barbican-api-7c44799d88-mmmm6\" (UID: \"dab0e7ec-9c64-491d-a655-027098042378\") " pod="openstack/barbican-api-7c44799d88-mmmm6" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.585239 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-public-tls-certs\") pod \"barbican-api-7c44799d88-mmmm6\" (UID: \"dab0e7ec-9c64-491d-a655-027098042378\") " pod="openstack/barbican-api-7c44799d88-mmmm6" Oct 03 13:16:29 crc 
kubenswrapper[4962]: I1003 13:16:29.597659 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance2f72-account-delete-wdbvb"] Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.599265 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance2f72-account-delete-wdbvb" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.621707 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.622153 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="2af174c7-cf23-452c-bc13-ecda2775d58d" containerName="openstack-network-exporter" containerID="cri-o://706277ad3b96d3b0d8687160b11c07b80babd5d0827c39c2e0bc9cc9e42f7d03" gracePeriod=300 Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.632304 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance2f72-account-delete-wdbvb"] Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.636621 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv9w7\" (UniqueName: \"kubernetes.io/projected/b42a368b-6dd4-4bb0-83a8-d79138605ec9-kube-api-access-sv9w7\") pod \"placement453e-account-delete-v68p5\" (UID: \"b42a368b-6dd4-4bb0-83a8-d79138605ec9\") " pod="openstack/placement453e-account-delete-v68p5" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.646323 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.646525 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="5c876ef6-c8ab-44d1-9ba4-07f0b5e07695" containerName="ovn-northd" containerID="cri-o://d519de371641e2951bd9f81ed67c53fa2f69a9d44a2a9b5275e2a6772663e005" gracePeriod=30 Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.646654 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="5c876ef6-c8ab-44d1-9ba4-07f0b5e07695" containerName="openstack-network-exporter" containerID="cri-o://c10da2ac06df5b8b854f495ec36dfbffd1281d8e886e7d01348cf8b99da08700" gracePeriod=30 Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.687145 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npxmv\" (UniqueName: \"kubernetes.io/projected/cfeca1b1-fa87-4490-9e99-38e60d421138-kube-api-access-npxmv\") pod \"glance2f72-account-delete-wdbvb\" (UID: \"cfeca1b1-fa87-4490-9e99-38e60d421138\") " pod="openstack/glance2f72-account-delete-wdbvb" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.687212 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dab0e7ec-9c64-491d-a655-027098042378-logs\") pod \"barbican-api-7c44799d88-mmmm6\" (UID: \"dab0e7ec-9c64-491d-a655-027098042378\") " pod="openstack/barbican-api-7c44799d88-mmmm6" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.687238 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-config-data\") pod \"barbican-api-7c44799d88-mmmm6\" (UID: \"dab0e7ec-9c64-491d-a655-027098042378\") " pod="openstack/barbican-api-7c44799d88-mmmm6" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.687267 4962 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9bx2f\" (UniqueName: \"kubernetes.io/projected/dab0e7ec-9c64-491d-a655-027098042378-kube-api-access-9bx2f\") pod \"barbican-api-7c44799d88-mmmm6\" (UID: \"dab0e7ec-9c64-491d-a655-027098042378\") " pod="openstack/barbican-api-7c44799d88-mmmm6" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.687283 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-combined-ca-bundle\") pod \"barbican-api-7c44799d88-mmmm6\" (UID: \"dab0e7ec-9c64-491d-a655-027098042378\") " pod="openstack/barbican-api-7c44799d88-mmmm6" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.687316 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-internal-tls-certs\") pod \"barbican-api-7c44799d88-mmmm6\" (UID: \"dab0e7ec-9c64-491d-a655-027098042378\") " pod="openstack/barbican-api-7c44799d88-mmmm6" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.687385 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-config-data-custom\") pod \"barbican-api-7c44799d88-mmmm6\" (UID: \"dab0e7ec-9c64-491d-a655-027098042378\") " pod="openstack/barbican-api-7c44799d88-mmmm6" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.687406 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-public-tls-certs\") pod \"barbican-api-7c44799d88-mmmm6\" (UID: \"dab0e7ec-9c64-491d-a655-027098042378\") " pod="openstack/barbican-api-7c44799d88-mmmm6" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.695061 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dab0e7ec-9c64-491d-a655-027098042378-logs\") pod \"barbican-api-7c44799d88-mmmm6\" (UID: \"dab0e7ec-9c64-491d-a655-027098042378\") " pod="openstack/barbican-api-7c44799d88-mmmm6" Oct 03 13:16:29 crc kubenswrapper[4962]: E1003 13:16:29.695523 4962 secret.go:188] Couldn't get secret openstack/barbican-config-data: secret "barbican-config-data" not found Oct 03 13:16:29 crc kubenswrapper[4962]: E1003 13:16:29.695582 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-config-data podName:dab0e7ec-9c64-491d-a655-027098042378 nodeName:}" failed. No retries permitted until 2025-10-03 13:16:30.195565702 +0000 UTC m=+1598.599463537 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-config-data") pod "barbican-api-7c44799d88-mmmm6" (UID: "dab0e7ec-9c64-491d-a655-027098042378") : secret "barbican-config-data" not found Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.695702 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-public-tls-certs\") pod \"barbican-api-7c44799d88-mmmm6\" (UID: \"dab0e7ec-9c64-491d-a655-027098042378\") " pod="openstack/barbican-api-7c44799d88-mmmm6" Oct 03 13:16:29 crc kubenswrapper[4962]: E1003 13:16:29.695767 4962 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 03 13:16:29 crc kubenswrapper[4962]: E1003 13:16:29.695794 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/221bdd26-0fec-49e5-86ec-c2aefe7a5902-config-data podName:221bdd26-0fec-49e5-86ec-c2aefe7a5902 nodeName:}" failed. No retries permitted until 2025-10-03 13:16:30.695784898 +0000 UTC m=+1599.099682733 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/221bdd26-0fec-49e5-86ec-c2aefe7a5902-config-data") pod "rabbitmq-cell1-server-0" (UID: "221bdd26-0fec-49e5-86ec-c2aefe7a5902") : configmap "rabbitmq-cell1-config-data" not found Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.705468 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-config-data-custom\") pod \"barbican-api-7c44799d88-mmmm6\" (UID: \"dab0e7ec-9c64-491d-a655-027098042378\") " pod="openstack/barbican-api-7c44799d88-mmmm6" Oct 03 13:16:29 crc kubenswrapper[4962]: E1003 13:16:29.707079 4962 projected.go:194] Error preparing data for projected volume kube-api-access-9bx2f for pod openstack/barbican-api-7c44799d88-mmmm6: failed to fetch token: serviceaccounts "barbican-barbican" not found Oct 03 13:16:29 crc kubenswrapper[4962]: E1003 13:16:29.707130 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dab0e7ec-9c64-491d-a655-027098042378-kube-api-access-9bx2f podName:dab0e7ec-9c64-491d-a655-027098042378 nodeName:}" failed. No retries permitted until 2025-10-03 13:16:30.207112772 +0000 UTC m=+1598.611010607 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-9bx2f" (UniqueName: "kubernetes.io/projected/dab0e7ec-9c64-491d-a655-027098042378-kube-api-access-9bx2f") pod "barbican-api-7c44799d88-mmmm6" (UID: "dab0e7ec-9c64-491d-a655-027098042378") : failed to fetch token: serviceaccounts "barbican-barbican" not found Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.707577 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-combined-ca-bundle\") pod \"barbican-api-7c44799d88-mmmm6\" (UID: \"dab0e7ec-9c64-491d-a655-027098042378\") " pod="openstack/barbican-api-7c44799d88-mmmm6" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.789550 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npxmv\" (UniqueName: \"kubernetes.io/projected/cfeca1b1-fa87-4490-9e99-38e60d421138-kube-api-access-npxmv\") pod \"glance2f72-account-delete-wdbvb\" (UID: \"cfeca1b1-fa87-4490-9e99-38e60d421138\") " pod="openstack/glance2f72-account-delete-wdbvb" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.797924 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron8f54-account-delete-pznh9"] Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.802837 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-internal-tls-certs\") pod \"barbican-api-7c44799d88-mmmm6\" (UID: \"dab0e7ec-9c64-491d-a655-027098042378\") " pod="openstack/barbican-api-7c44799d88-mmmm6" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.804010 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron8f54-account-delete-pznh9" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.837923 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-c97f5c65f-s279k" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.849206 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npxmv\" (UniqueName: \"kubernetes.io/projected/cfeca1b1-fa87-4490-9e99-38e60d421138-kube-api-access-npxmv\") pod \"glance2f72-account-delete-wdbvb\" (UID: \"cfeca1b1-fa87-4490-9e99-38e60d421138\") " pod="openstack/glance2f72-account-delete-wdbvb" Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.889299 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron8f54-account-delete-pznh9"] Oct 03 13:16:29 crc kubenswrapper[4962]: E1003 13:16:29.893744 4962 configmap.go:193] Couldn't get configMap openstack/ovncontroller-scripts: configmap "ovncontroller-scripts" not found Oct 03 13:16:29 crc kubenswrapper[4962]: E1003 13:16:29.893825 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6d6f62dd-0720-46b6-b0a8-497490f052a8-scripts podName:6d6f62dd-0720-46b6-b0a8-497490f052a8 nodeName:}" failed. No retries permitted until 2025-10-03 13:16:30.893810577 +0000 UTC m=+1599.297708412 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/configmap/6d6f62dd-0720-46b6-b0a8-497490f052a8-scripts") pod "ovn-controller-6sqdm" (UID: "6d6f62dd-0720-46b6-b0a8-497490f052a8") : configmap "ovncontroller-scripts" not found Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.897368 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="6313803e-1bf1-4a99-8af7-cb80c0e6321c" containerName="ovsdbserver-sb" containerID="cri-o://40661f3cc34a0a76e5aee737f7eb31eca4d4e1e703df5b2bf55b0cef327c7f85" gracePeriod=300 Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.899018 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="2af174c7-cf23-452c-bc13-ecda2775d58d" containerName="ovsdbserver-nb" containerID="cri-o://cc095d6f6d8b5824a32ad688e66fd5a34700bd68a8048d5bb8c9727930860221" gracePeriod=300 Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.925053 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-2865v"] Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.942204 4962 generic.go:334] "Generic (PLEG): container finished" podID="5c876ef6-c8ab-44d1-9ba4-07f0b5e07695" containerID="c10da2ac06df5b8b854f495ec36dfbffd1281d8e886e7d01348cf8b99da08700" exitCode=2 Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.942294 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"5c876ef6-c8ab-44d1-9ba4-07f0b5e07695","Type":"ContainerDied","Data":"c10da2ac06df5b8b854f495ec36dfbffd1281d8e886e7d01348cf8b99da08700"} Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.946706 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-2865v"] Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.951335 4962 generic.go:334] "Generic (PLEG): container finished" podID="2af174c7-cf23-452c-bc13-ecda2775d58d" containerID="706277ad3b96d3b0d8687160b11c07b80babd5d0827c39c2e0bc9cc9e42f7d03" exitCode=2 Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.951622 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2af174c7-cf23-452c-bc13-ecda2775d58d","Type":"ContainerDied","Data":"706277ad3b96d3b0d8687160b11c07b80babd5d0827c39c2e0bc9cc9e42f7d03"} Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.978435 4962 generic.go:334] "Generic (PLEG): container finished" podID="6ae29e17-1d99-4401-a317-9c8b7be58a3c" containerID="1ffd9ed0756445b7f118da7e647ddacf67d26cbedb3b89a2b3074c6bedfe80b2" exitCode=143 Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.978528 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6ae29e17-1d99-4401-a317-9c8b7be58a3c","Type":"ContainerDied","Data":"1ffd9ed0756445b7f118da7e647ddacf67d26cbedb3b89a2b3074c6bedfe80b2"} Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.988007 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder4662-account-delete-kl2sp"] Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.993805 4962 generic.go:334] "Generic (PLEG): container finished" podID="6313803e-1bf1-4a99-8af7-cb80c0e6321c" containerID="d2fb6e730baadf5cce5c8d3a7e70507b921c0b75296b297850b5803cf0722b8a" exitCode=2 Oct 03 13:16:29 crc kubenswrapper[4962]: I1003 13:16:29.994902 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v8rg\" 
(UniqueName: \"kubernetes.io/projected/56923e91-36c0-432d-8042-138d2e89eb3b-kube-api-access-6v8rg\") pod \"neutron8f54-account-delete-pznh9\" (UID: \"56923e91-36c0-432d-8042-138d2e89eb3b\") " pod="openstack/neutron8f54-account-delete-pznh9" Oct 03 13:16:29 crc kubenswrapper[4962]: E1003 13:16:29.996108 4962 secret.go:188] Couldn't get secret openstack/barbican-config-data: secret "barbican-config-data" not found Oct 03 13:16:29 crc kubenswrapper[4962]: E1003 13:16:29.996361 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ae87940-f07d-4213-bc0b-da0b3a2bba84-config-data podName:0ae87940-f07d-4213-bc0b-da0b3a2bba84 nodeName:}" failed. No retries permitted until 2025-10-03 13:16:30.496346241 +0000 UTC m=+1598.900244076 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/0ae87940-f07d-4213-bc0b-da0b3a2bba84-config-data") pod "barbican-worker-c97f5c65f-s279k" (UID: "0ae87940-f07d-4213-bc0b-da0b3a2bba84") : secret "barbican-config-data" not found Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.002194 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"6313803e-1bf1-4a99-8af7-cb80c0e6321c","Type":"ContainerDied","Data":"d2fb6e730baadf5cce5c8d3a7e70507b921c0b75296b297850b5803cf0722b8a"} Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.002315 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder4662-account-delete-kl2sp" Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.034405 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-wrqgr"] Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.089817 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-wrqgr"] Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.097142 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v8rg\" (UniqueName: \"kubernetes.io/projected/56923e91-36c0-432d-8042-138d2e89eb3b-kube-api-access-6v8rg\") pod \"neutron8f54-account-delete-pznh9\" (UID: \"56923e91-36c0-432d-8042-138d2e89eb3b\") " pod="openstack/neutron8f54-account-delete-pznh9" Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.097241 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fb8j\" (UniqueName: \"kubernetes.io/projected/8e098e6f-ec3b-41e6-b179-6c196ad1fe49-kube-api-access-4fb8j\") pod \"cinder4662-account-delete-kl2sp\" (UID: \"8e098e6f-ec3b-41e6-b179-6c196ad1fe49\") " pod="openstack/cinder4662-account-delete-kl2sp" Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.122709 4962 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/barbican-keystone-listener-59bf856dfd-t86xg" secret="" err="secret \"barbican-barbican-dockercfg-pk28p\" not found" Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.122767 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-59bf856dfd-t86xg" Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.153992 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v8rg\" (UniqueName: \"kubernetes.io/projected/56923e91-36c0-432d-8042-138d2e89eb3b-kube-api-access-6v8rg\") pod \"neutron8f54-account-delete-pznh9\" (UID: \"56923e91-36c0-432d-8042-138d2e89eb3b\") " pod="openstack/neutron8f54-account-delete-pznh9" Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.161857 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement453e-account-delete-v68p5" Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.205838 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder4662-account-delete-kl2sp"] Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.208171 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fb8j\" (UniqueName: \"kubernetes.io/projected/8e098e6f-ec3b-41e6-b179-6c196ad1fe49-kube-api-access-4fb8j\") pod \"cinder4662-account-delete-kl2sp\" (UID: \"8e098e6f-ec3b-41e6-b179-6c196ad1fe49\") " pod="openstack/cinder4662-account-delete-kl2sp" Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.208246 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-config-data\") pod \"barbican-api-7c44799d88-mmmm6\" (UID: \"dab0e7ec-9c64-491d-a655-027098042378\") " pod="openstack/barbican-api-7c44799d88-mmmm6" Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.208291 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bx2f\" (UniqueName: \"kubernetes.io/projected/dab0e7ec-9c64-491d-a655-027098042378-kube-api-access-9bx2f\") pod \"barbican-api-7c44799d88-mmmm6\" (UID: \"dab0e7ec-9c64-491d-a655-027098042378\") " pod="openstack/barbican-api-7c44799d88-mmmm6" Oct 03 13:16:30 crc kubenswrapper[4962]: E1003 13:16:30.208539 4962 secret.go:188] Couldn't get secret openstack/barbican-keystone-listener-config-data: secret "barbican-keystone-listener-config-data" not found Oct 03 13:16:30 crc kubenswrapper[4962]: E1003 13:16:30.208594 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-config-data-custom podName:e6f0fc0a-ae8e-445e-ad05-591b7ab00886 nodeName:}" failed. No retries permitted until 2025-10-03 13:16:30.708580262 +0000 UTC m=+1599.112478097 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-config-data-custom") pod "barbican-keystone-listener-59bf856dfd-t86xg" (UID: "e6f0fc0a-ae8e-445e-ad05-591b7ab00886") : secret "barbican-keystone-listener-config-data" not found Oct 03 13:16:30 crc kubenswrapper[4962]: E1003 13:16:30.208950 4962 secret.go:188] Couldn't get secret openstack/barbican-config-data: secret "barbican-config-data" not found Oct 03 13:16:30 crc kubenswrapper[4962]: E1003 13:16:30.208974 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-config-data podName:dab0e7ec-9c64-491d-a655-027098042378 nodeName:}" failed. No retries permitted until 2025-10-03 13:16:31.208966222 +0000 UTC m=+1599.612864057 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-config-data") pod "barbican-api-7c44799d88-mmmm6" (UID: "dab0e7ec-9c64-491d-a655-027098042378") : secret "barbican-config-data" not found Oct 03 13:16:30 crc kubenswrapper[4962]: E1003 13:16:30.209004 4962 secret.go:188] Couldn't get secret openstack/barbican-config-data: secret "barbican-config-data" not found Oct 03 13:16:30 crc kubenswrapper[4962]: E1003 13:16:30.209024 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-config-data podName:e6f0fc0a-ae8e-445e-ad05-591b7ab00886 nodeName:}" failed. No retries permitted until 2025-10-03 13:16:30.709018574 +0000 UTC m=+1599.112916409 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-config-data") pod "barbican-keystone-listener-59bf856dfd-t86xg" (UID: "e6f0fc0a-ae8e-445e-ad05-591b7ab00886") : secret "barbican-config-data" not found Oct 03 13:16:30 crc kubenswrapper[4962]: E1003 13:16:30.216550 4962 projected.go:194] Error preparing data for projected volume kube-api-access-9bx2f for pod openstack/barbican-api-7c44799d88-mmmm6: failed to fetch token: serviceaccounts "barbican-barbican" not found Oct 03 13:16:30 crc kubenswrapper[4962]: E1003 13:16:30.216752 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dab0e7ec-9c64-491d-a655-027098042378-kube-api-access-9bx2f podName:dab0e7ec-9c64-491d-a655-027098042378 nodeName:}" failed. No retries permitted until 2025-10-03 13:16:31.216726001 +0000 UTC m=+1599.620623836 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-9bx2f" (UniqueName: "kubernetes.io/projected/dab0e7ec-9c64-491d-a655-027098042378-kube-api-access-9bx2f") pod "barbican-api-7c44799d88-mmmm6" (UID: "dab0e7ec-9c64-491d-a655-027098042378") : failed to fetch token: serviceaccounts "barbican-barbican" not found Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.224486 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novacell0b903-account-delete-dqf57"] Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.231660 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell0b903-account-delete-dqf57"] Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.239590 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fb8j\" (UniqueName: \"kubernetes.io/projected/8e098e6f-ec3b-41e6-b179-6c196ad1fe49-kube-api-access-4fb8j\") pod \"cinder4662-account-delete-kl2sp\" (UID: \"8e098e6f-ec3b-41e6-b179-6c196ad1fe49\") " pod="openstack/cinder4662-account-delete-kl2sp" Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.239714 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell0b903-account-delete-dqf57" Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.311596 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd26c\" (UniqueName: \"kubernetes.io/projected/e09f26ad-247c-477a-9d73-a2a0f8df91e8-kube-api-access-hd26c\") pod \"novacell0b903-account-delete-dqf57\" (UID: \"e09f26ad-247c-477a-9d73-a2a0f8df91e8\") " pod="openstack/novacell0b903-account-delete-dqf57" Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.363703 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29c86a63-d2f6-4f22-9d04-d6128fa7c31a" path="/var/lib/kubelet/pods/29c86a63-d2f6-4f22-9d04-d6128fa7c31a/volumes" Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.365115 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56" path="/var/lib/kubelet/pods/f4e7b0e6-13e7-4614-b7ee-23c87a2f4d56/volumes" Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.365693 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novaapi6963-account-delete-mg78g"] Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.367031 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapi6963-account-delete-mg78g"] Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.367049 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-6sqdm"] Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.367061 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-wvjpm"] Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.367082 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-9h8s9"] Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.367093 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-94dnt"] Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.367186 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi6963-account-delete-mg78g" Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.367254 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59cf4bdb65-94dnt" podUID="eb190059-74a6-4ffe-88a4-5fcfd46812a0" containerName="dnsmasq-dns" containerID="cri-o://ad57452006db6a8d5f23250d941709e3e1778f52c203a43209011704610ba216" gracePeriod=10 Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.368096 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-9h8s9" podUID="67b77bc9-27ae-4994-86c2-614e48ad33c6" containerName="openstack-network-exporter" containerID="cri-o://2733a21e59053a9fe777da7da96a6b1c13acb88fa95f66b9a9cc3889e027399a" gracePeriod=30 Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.376199 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-gkbr2"] Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.393913 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance2f72-account-delete-wdbvb" Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.428107 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hd26c\" (UniqueName: \"kubernetes.io/projected/e09f26ad-247c-477a-9d73-a2a0f8df91e8-kube-api-access-hd26c\") pod \"novacell0b903-account-delete-dqf57\" (UID: \"e09f26ad-247c-477a-9d73-a2a0f8df91e8\") " pod="openstack/novacell0b903-account-delete-dqf57" Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.431360 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron8f54-account-delete-pznh9" Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.463477 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder4662-account-delete-kl2sp" Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.477219 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.487686 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd26c\" (UniqueName: \"kubernetes.io/projected/e09f26ad-247c-477a-9d73-a2a0f8df91e8-kube-api-access-hd26c\") pod \"novacell0b903-account-delete-dqf57\" (UID: \"e09f26ad-247c-477a-9d73-a2a0f8df91e8\") " pod="openstack/novacell0b903-account-delete-dqf57" Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.546690 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-gkbr2"] Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.547826 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpv8s\" (UniqueName: \"kubernetes.io/projected/1b763061-bb23-4c23-a4ec-bebac231c603-kube-api-access-dpv8s\") pod \"novaapi6963-account-delete-mg78g\" (UID: \"1b763061-bb23-4c23-a4ec-bebac231c603\") " pod="openstack/novaapi6963-account-delete-mg78g" Oct 03 13:16:30 crc kubenswrapper[4962]: E1003 13:16:30.548520 4962 secret.go:188] Couldn't get secret openstack/barbican-api-config-data: secret "barbican-api-config-data" not found Oct 03 13:16:30 crc kubenswrapper[4962]: E1003 13:16:30.548564 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-config-data-custom podName:dab0e7ec-9c64-491d-a655-027098042378 nodeName:}" failed. No retries permitted until 2025-10-03 13:16:31.048550644 +0000 UTC m=+1599.452448479 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-config-data-custom") pod "barbican-api-7c44799d88-mmmm6" (UID: "dab0e7ec-9c64-491d-a655-027098042378") : secret "barbican-api-config-data" not found Oct 03 13:16:30 crc kubenswrapper[4962]: E1003 13:16:30.548603 4962 secret.go:188] Couldn't get secret openstack/barbican-config-data: secret "barbican-config-data" not found Oct 03 13:16:30 crc kubenswrapper[4962]: E1003 13:16:30.548622 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ae87940-f07d-4213-bc0b-da0b3a2bba84-config-data podName:0ae87940-f07d-4213-bc0b-da0b3a2bba84 nodeName:}" failed. No retries permitted until 2025-10-03 13:16:31.548616305 +0000 UTC m=+1599.952514140 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/0ae87940-f07d-4213-bc0b-da0b3a2bba84-config-data") pod "barbican-worker-c97f5c65f-s279k" (UID: "0ae87940-f07d-4213-bc0b-da0b3a2bba84") : secret "barbican-config-data" not found Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.578239 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-kz6b9"] Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.607032 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell0b903-account-delete-dqf57" Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.618909 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-kz6b9"] Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.651335 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpv8s\" (UniqueName: \"kubernetes.io/projected/1b763061-bb23-4c23-a4ec-bebac231c603-kube-api-access-dpv8s\") pod \"novaapi6963-account-delete-mg78g\" (UID: \"1b763061-bb23-4c23-a4ec-bebac231c603\") " pod="openstack/novaapi6963-account-delete-mg78g" Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.651713 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-hrx4m"] Oct 03 13:16:30 crc kubenswrapper[4962]: E1003 13:16:30.652803 4962 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 03 13:16:30 crc kubenswrapper[4962]: E1003 13:16:30.652873 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/862ad9df-af58-4304-9ad5-7faba334e2d9-config-data podName:862ad9df-af58-4304-9ad5-7faba334e2d9 nodeName:}" failed. No retries permitted until 2025-10-03 13:16:31.152857525 +0000 UTC m=+1599.556755350 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/862ad9df-af58-4304-9ad5-7faba334e2d9-config-data") pod "rabbitmq-server-0" (UID: "862ad9df-af58-4304-9ad5-7faba334e2d9") : configmap "rabbitmq-config-data" not found Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.667356 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6456949cf6-r4n9q"] Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.667620 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6456949cf6-r4n9q" podUID="1289d443-56d2-4f63-8802-66bcd0569b3b" containerName="placement-log" containerID="cri-o://e9b4cb84ef4c21a8595bf182936df461ef5cc7e4bb630f5cdc4490a12d404462" gracePeriod=30 Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.668037 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6456949cf6-r4n9q" podUID="1289d443-56d2-4f63-8802-66bcd0569b3b" containerName="placement-api" containerID="cri-o://b92f6a632cc9c0dff9f450965eee31724d28f079d91c0b8852b080e2ed919e29" gracePeriod=30 Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.688811 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-hrx4m"] Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.689951 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpv8s\" (UniqueName: \"kubernetes.io/projected/1b763061-bb23-4c23-a4ec-bebac231c603-kube-api-access-dpv8s\") pod \"novaapi6963-account-delete-mg78g\" (UID: \"1b763061-bb23-4c23-a4ec-bebac231c603\") " pod="openstack/novaapi6963-account-delete-mg78g" Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.735756 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.736588 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="account-server" containerID="cri-o://1953908fb8f3a3d9cd983f3f51df79d091442b3b2351b35d0c858fe9e4a4b278" gracePeriod=30 Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.737108 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="swift-recon-cron" containerID="cri-o://972fd0b549604163530a4df17ba0265931587abd268311d752380aa374952bb0" gracePeriod=30 Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.737153 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="rsync" containerID="cri-o://06a77f3c0c79be2df9a379f7d27bcc5d75db28ce32e20e774dd964566de558be" gracePeriod=30 Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.737187 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="object-expirer" containerID="cri-o://255dd1c4cb38e6b82f47f9c570da57cc07f7f5e8c11c54bb9966d8c730771ef6" gracePeriod=30 Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.737215 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="object-updater" 
containerID="cri-o://054512bf0c273e329a55f18a262ffbcb7dd5abaded475341723d2b4dc5e849fb" gracePeriod=30 Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.737246 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="object-auditor" containerID="cri-o://7e76ff2eb3cf5160a1fdce8ab7db2a70edda0c5fb436d79cb130e11be846580e" gracePeriod=30 Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.737301 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="object-replicator" containerID="cri-o://f0c27459819cd1d481d672fbdd91f735b40d36cc170361880628b5b806924c13" gracePeriod=30 Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.737331 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="object-server" containerID="cri-o://a8806f325247419ebf9ee453e77f3493ec2be61562010341eec779899b644330" gracePeriod=30 Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.737369 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="container-updater" containerID="cri-o://bccc65019a49e86470400d5863a1f0f3a4c53c7dead1edd7e4226173d7443ed2" gracePeriod=30 Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.737397 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="container-auditor" containerID="cri-o://7ce66520fa57254d5157448844d739ec610586be59fb60789632b9b85bd02222" gracePeriod=30 Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.737426 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="container-replicator" containerID="cri-o://d07a3c21e9f7cc962ded5767c572003a88953fe191cf331895a5e9e48103288b" gracePeriod=30 Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.737454 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="container-server" containerID="cri-o://0027d40b3fd7f4cac601a15c7999e155ece2e4687617a83b85e72dd63015f85e" gracePeriod=30 Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.737484 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="account-reaper" containerID="cri-o://5674271fdefce19a13c9b336c52d013de2b33a9a9124fca33358f8e3a0cf5881" gracePeriod=30 Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.737512 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="account-auditor" containerID="cri-o://770e06f348aaf3989bf45ae8703e1cff216acdce48c3de88da4323e4ade168ff" gracePeriod=30 Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.737538 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="account-replicator" 
containerID="cri-o://33714c39a9c3c16769f323fb660866cd0b5a9c6bf72751670fa0465d513c70cb" gracePeriod=30 Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.750770 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-c6gvp"] Oct 03 13:16:30 crc kubenswrapper[4962]: E1003 13:16:30.754936 4962 secret.go:188] Couldn't get secret openstack/barbican-keystone-listener-config-data: secret "barbican-keystone-listener-config-data" not found Oct 03 13:16:30 crc kubenswrapper[4962]: E1003 13:16:30.755004 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-config-data-custom podName:e6f0fc0a-ae8e-445e-ad05-591b7ab00886 nodeName:}" failed. No retries permitted until 2025-10-03 13:16:31.754988329 +0000 UTC m=+1600.158886154 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-config-data-custom") pod "barbican-keystone-listener-59bf856dfd-t86xg" (UID: "e6f0fc0a-ae8e-445e-ad05-591b7ab00886") : secret "barbican-keystone-listener-config-data" not found Oct 03 13:16:30 crc kubenswrapper[4962]: E1003 13:16:30.755923 4962 secret.go:188] Couldn't get secret openstack/barbican-config-data: secret "barbican-config-data" not found Oct 03 13:16:30 crc kubenswrapper[4962]: E1003 13:16:30.755951 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-config-data podName:e6f0fc0a-ae8e-445e-ad05-591b7ab00886 nodeName:}" failed. No retries permitted until 2025-10-03 13:16:31.755941964 +0000 UTC m=+1600.159839799 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-config-data") pod "barbican-keystone-listener-59bf856dfd-t86xg" (UID: "e6f0fc0a-ae8e-445e-ad05-591b7ab00886") : secret "barbican-config-data" not found Oct 03 13:16:30 crc kubenswrapper[4962]: E1003 13:16:30.755981 4962 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 03 13:16:30 crc kubenswrapper[4962]: E1003 13:16:30.756001 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/221bdd26-0fec-49e5-86ec-c2aefe7a5902-config-data podName:221bdd26-0fec-49e5-86ec-c2aefe7a5902 nodeName:}" failed. No retries permitted until 2025-10-03 13:16:32.755992716 +0000 UTC m=+1601.159890551 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/221bdd26-0fec-49e5-86ec-c2aefe7a5902-config-data") pod "rabbitmq-cell1-server-0" (UID: "221bdd26-0fec-49e5-86ec-c2aefe7a5902") : configmap "rabbitmq-cell1-config-data" not found Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.767343 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-c6gvp"] Oct 03 13:16:30 crc kubenswrapper[4962]: E1003 13:16:30.772793 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 40661f3cc34a0a76e5aee737f7eb31eca4d4e1e703df5b2bf55b0cef327c7f85 is running failed: container process not found" containerID="40661f3cc34a0a76e5aee737f7eb31eca4d4e1e703df5b2bf55b0cef327c7f85" cmd=["/usr/bin/pidof","ovsdb-server"] Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.778107 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-hnv5m"] Oct 03 13:16:30 crc kubenswrapper[4962]: E1003 13:16:30.784084 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 40661f3cc34a0a76e5aee737f7eb31eca4d4e1e703df5b2bf55b0cef327c7f85 is running failed: container process not found" containerID="40661f3cc34a0a76e5aee737f7eb31eca4d4e1e703df5b2bf55b0cef327c7f85" cmd=["/usr/bin/pidof","ovsdb-server"] Oct 03 13:16:30 crc kubenswrapper[4962]: E1003 13:16:30.786517 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 40661f3cc34a0a76e5aee737f7eb31eca4d4e1e703df5b2bf55b0cef327c7f85 is running failed: container process not found" containerID="40661f3cc34a0a76e5aee737f7eb31eca4d4e1e703df5b2bf55b0cef327c7f85" cmd=["/usr/bin/pidof","ovsdb-server"] Oct 03 13:16:30 crc kubenswrapper[4962]: E1003 13:16:30.786581 4962 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 40661f3cc34a0a76e5aee737f7eb31eca4d4e1e703df5b2bf55b0cef327c7f85 is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-sb-0" podUID="6313803e-1bf1-4a99-8af7-cb80c0e6321c" containerName="ovsdbserver-sb" Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.878911 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-hnv5m"] Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.934805 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi6963-account-delete-mg78g" Oct 03 13:16:30 crc kubenswrapper[4962]: I1003 13:16:30.935860 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-2wz97"] Oct 03 13:16:30 crc kubenswrapper[4962]: E1003 13:16:30.985984 4962 configmap.go:193] Couldn't get configMap openstack/ovncontroller-scripts: configmap "ovncontroller-scripts" not found Oct 03 13:16:30 crc kubenswrapper[4962]: E1003 13:16:30.986305 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6d6f62dd-0720-46b6-b0a8-497490f052a8-scripts podName:6d6f62dd-0720-46b6-b0a8-497490f052a8 nodeName:}" failed. No retries permitted until 2025-10-03 13:16:32.986288811 +0000 UTC m=+1601.390186636 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/configmap/6d6f62dd-0720-46b6-b0a8-497490f052a8-scripts") pod "ovn-controller-6sqdm" (UID: "6d6f62dd-0720-46b6-b0a8-497490f052a8") : configmap "ovncontroller-scripts" not found Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.010698 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-2wz97"] Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.043666 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_2af174c7-cf23-452c-bc13-ecda2775d58d/ovsdbserver-nb/0.log" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.043723 4962 generic.go:334] "Generic (PLEG): container finished" podID="2af174c7-cf23-452c-bc13-ecda2775d58d" containerID="cc095d6f6d8b5824a32ad688e66fd5a34700bd68a8048d5bb8c9727930860221" exitCode=143 Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.043793 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2af174c7-cf23-452c-bc13-ecda2775d58d","Type":"ContainerDied","Data":"cc095d6f6d8b5824a32ad688e66fd5a34700bd68a8048d5bb8c9727930860221"} Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.043820 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2af174c7-cf23-452c-bc13-ecda2775d58d","Type":"ContainerDied","Data":"f7edaafcf712e9901b44555a1a8056a3daf3444e4bf017a4a0cf844d35572fdb"} Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.043830 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7edaafcf712e9901b44555a1a8056a3daf3444e4bf017a4a0cf844d35572fdb" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.053416 4962 generic.go:334] "Generic (PLEG): container finished" podID="1289d443-56d2-4f63-8802-66bcd0569b3b" containerID="e9b4cb84ef4c21a8595bf182936df461ef5cc7e4bb630f5cdc4490a12d404462" exitCode=143 Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.053475 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6456949cf6-r4n9q" event={"ID":"1289d443-56d2-4f63-8802-66bcd0569b3b","Type":"ContainerDied","Data":"e9b4cb84ef4c21a8595bf182936df461ef5cc7e4bb630f5cdc4490a12d404462"} Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.059361 4962 generic.go:334] "Generic (PLEG): container finished" podID="eb190059-74a6-4ffe-88a4-5fcfd46812a0" containerID="ad57452006db6a8d5f23250d941709e3e1778f52c203a43209011704610ba216" exitCode=0 Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.059427 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-94dnt" event={"ID":"eb190059-74a6-4ffe-88a4-5fcfd46812a0","Type":"ContainerDied","Data":"ad57452006db6a8d5f23250d941709e3e1778f52c203a43209011704610ba216"} Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.068668 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_6313803e-1bf1-4a99-8af7-cb80c0e6321c/ovsdbserver-sb/0.log" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.068717 4962 generic.go:334] "Generic (PLEG): container finished" podID="6313803e-1bf1-4a99-8af7-cb80c0e6321c" containerID="40661f3cc34a0a76e5aee737f7eb31eca4d4e1e703df5b2bf55b0cef327c7f85" exitCode=143 Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.068796 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"6313803e-1bf1-4a99-8af7-cb80c0e6321c","Type":"ContainerDied","Data":"40661f3cc34a0a76e5aee737f7eb31eca4d4e1e703df5b2bf55b0cef327c7f85"} Oct 03 13:16:31 crc kubenswrapper[4962]: E1003 13:16:31.088314 4962 secret.go:188] Couldn't get secret openstack/barbican-api-config-data: secret "barbican-api-config-data" not found Oct 03 13:16:31 crc kubenswrapper[4962]: E1003 13:16:31.088385 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-config-data-custom podName:dab0e7ec-9c64-491d-a655-027098042378 nodeName:}" failed. No retries permitted until 2025-10-03 13:16:32.088367843 +0000 UTC m=+1600.492265668 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-config-data-custom") pod "barbican-api-7c44799d88-mmmm6" (UID: "dab0e7ec-9c64-491d-a655-027098042378") : secret "barbican-api-config-data" not found Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.109803 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.110236 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b0da1427-1e89-42d6-beb2-55f292945177" containerName="glance-log" containerID="cri-o://b27da2f01290ecf61072efad87218e000a6819ad0aac516d4d56189f22787d6c" gracePeriod=30 Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.110626 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b0da1427-1e89-42d6-beb2-55f292945177" containerName="glance-httpd" containerID="cri-o://c5177f0f305f7d8efd50064ca1ae9320ecca80819662d7f114a735ed39509584" gracePeriod=30 Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.178772 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-wvjpm" podUID="7cb4dab0-1ffc-49d4-a229-1862a33d4caa" containerName="ovs-vswitchd" containerID="cri-o://923838580de863a9a099f0b6d817f70b752e252d832275cca5df8e77ca46e3ac" gracePeriod=30 Oct 03 13:16:31 crc kubenswrapper[4962]: E1003 13:16:31.190012 4962 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 03 13:16:31 crc kubenswrapper[4962]: E1003 13:16:31.190084 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/862ad9df-af58-4304-9ad5-7faba334e2d9-config-data podName:862ad9df-af58-4304-9ad5-7faba334e2d9 nodeName:}" failed. No retries permitted until 2025-10-03 13:16:32.190069165 +0000 UTC m=+1600.593967000 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/862ad9df-af58-4304-9ad5-7faba334e2d9-config-data") pod "rabbitmq-server-0" (UID: "862ad9df-af58-4304-9ad5-7faba334e2d9") : configmap "rabbitmq-config-data" not found Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.201617 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.201903 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="cea3d32c-24c3-4a80-a1fb-ad65be7bbba6" containerName="glance-log" containerID="cri-o://ac73c42dc924c54bc8c349e88f72de0cd595955349b0de465efb0b5629a1c596" gracePeriod=30 Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.202391 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="cea3d32c-24c3-4a80-a1fb-ad65be7bbba6" containerName="glance-httpd" containerID="cri-o://7377486e2b45b77a0115044df6ada84243926afe00f2d9d4d7481b7f35b3b3cd" gracePeriod=30 Oct 03 13:16:31 crc kubenswrapper[4962]: E1003 13:16:31.218565 4962 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Oct 03 13:16:31 crc kubenswrapper[4962]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Oct 03 13:16:31 crc kubenswrapper[4962]: + source /usr/local/bin/container-scripts/functions Oct 03 13:16:31 crc kubenswrapper[4962]: ++ OVNBridge=br-int Oct 03 13:16:31 crc kubenswrapper[4962]: ++ OVNRemote=tcp:localhost:6642 Oct 03 13:16:31 crc kubenswrapper[4962]: ++ OVNEncapType=geneve Oct 03 13:16:31 crc kubenswrapper[4962]: ++ OVNAvailabilityZones= Oct 03 13:16:31 crc kubenswrapper[4962]: ++ EnableChassisAsGateway=true Oct 03 13:16:31 crc kubenswrapper[4962]: ++ PhysicalNetworks= Oct 03 13:16:31 crc kubenswrapper[4962]: ++ OVNHostName= Oct 03 13:16:31 crc kubenswrapper[4962]: ++ DB_FILE=/etc/openvswitch/conf.db Oct 03 13:16:31 crc kubenswrapper[4962]: ++ ovs_dir=/var/lib/openvswitch Oct 03 13:16:31 crc kubenswrapper[4962]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Oct 03 13:16:31 crc kubenswrapper[4962]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Oct 03 13:16:31 crc kubenswrapper[4962]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 03 13:16:31 crc kubenswrapper[4962]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 03 13:16:31 crc kubenswrapper[4962]: + sleep 0.5 Oct 03 13:16:31 crc kubenswrapper[4962]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 03 13:16:31 crc kubenswrapper[4962]: + cleanup_ovsdb_server_semaphore Oct 03 13:16:31 crc kubenswrapper[4962]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 03 13:16:31 crc kubenswrapper[4962]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Oct 03 13:16:31 crc kubenswrapper[4962]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-wvjpm" message=< Oct 03 13:16:31 crc kubenswrapper[4962]: Exiting ovsdb-server (5) [ OK ] Oct 03 13:16:31 crc kubenswrapper[4962]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Oct 03 13:16:31 crc kubenswrapper[4962]: + source /usr/local/bin/container-scripts/functions Oct 03 13:16:31 crc kubenswrapper[4962]: ++ OVNBridge=br-int Oct 03 13:16:31 crc kubenswrapper[4962]: ++ OVNRemote=tcp:localhost:6642 Oct 03 13:16:31 crc kubenswrapper[4962]: ++ OVNEncapType=geneve Oct 03 13:16:31 crc kubenswrapper[4962]: ++ OVNAvailabilityZones= Oct 03 13:16:31 crc kubenswrapper[4962]: ++ EnableChassisAsGateway=true Oct 03 13:16:31 crc kubenswrapper[4962]: ++ PhysicalNetworks= Oct 03 13:16:31 crc kubenswrapper[4962]: ++ OVNHostName= Oct 03 13:16:31 crc kubenswrapper[4962]: ++ DB_FILE=/etc/openvswitch/conf.db Oct 03 13:16:31 crc kubenswrapper[4962]: ++ ovs_dir=/var/lib/openvswitch Oct 03 13:16:31 crc kubenswrapper[4962]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Oct 03 13:16:31 crc kubenswrapper[4962]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Oct 03 13:16:31 crc kubenswrapper[4962]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 03 13:16:31 crc kubenswrapper[4962]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 03 13:16:31 crc kubenswrapper[4962]: + sleep 0.5 Oct 03 13:16:31 crc kubenswrapper[4962]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 03 13:16:31 crc kubenswrapper[4962]: + cleanup_ovsdb_server_semaphore Oct 03 13:16:31 crc kubenswrapper[4962]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 03 13:16:31 crc kubenswrapper[4962]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Oct 03 13:16:31 crc kubenswrapper[4962]: > Oct 03 13:16:31 crc kubenswrapper[4962]: E1003 13:16:31.218609 4962 kuberuntime_container.go:691] "PreStop hook failed" err=< Oct 03 13:16:31 crc kubenswrapper[4962]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Oct 03 13:16:31 crc kubenswrapper[4962]: + source /usr/local/bin/container-scripts/functions Oct 03 13:16:31 crc kubenswrapper[4962]: ++ OVNBridge=br-int Oct 03 13:16:31 crc kubenswrapper[4962]: ++ OVNRemote=tcp:localhost:6642 Oct 03 13:16:31 crc kubenswrapper[4962]: ++ OVNEncapType=geneve Oct 03 13:16:31 crc kubenswrapper[4962]: ++ OVNAvailabilityZones= Oct 03 13:16:31 crc kubenswrapper[4962]: ++ EnableChassisAsGateway=true Oct 03 13:16:31 crc kubenswrapper[4962]: ++ PhysicalNetworks= Oct 03 13:16:31 crc kubenswrapper[4962]: ++ OVNHostName= Oct 03 13:16:31 crc kubenswrapper[4962]: ++ DB_FILE=/etc/openvswitch/conf.db Oct 03 13:16:31 crc kubenswrapper[4962]: ++ ovs_dir=/var/lib/openvswitch Oct 03 13:16:31 crc kubenswrapper[4962]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Oct 03 13:16:31 crc kubenswrapper[4962]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Oct 03 13:16:31 crc kubenswrapper[4962]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 03 13:16:31 crc kubenswrapper[4962]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 03 13:16:31 crc kubenswrapper[4962]: + sleep 0.5 Oct 03 13:16:31 crc kubenswrapper[4962]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 03 13:16:31 crc kubenswrapper[4962]: + cleanup_ovsdb_server_semaphore Oct 03 13:16:31 crc kubenswrapper[4962]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 03 13:16:31 crc kubenswrapper[4962]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Oct 03 13:16:31 crc kubenswrapper[4962]: > pod="openstack/ovn-controller-ovs-wvjpm" podUID="7cb4dab0-1ffc-49d4-a229-1862a33d4caa" containerName="ovsdb-server" containerID="cri-o://34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.218667 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-wvjpm" podUID="7cb4dab0-1ffc-49d4-a229-1862a33d4caa" containerName="ovsdb-server" containerID="cri-o://34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9" gracePeriod=30 Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.251745 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5f745c6cff-9rkw7"] Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.252011 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5f745c6cff-9rkw7" podUID="40dc7e17-4436-4452-a266-65d57a67779d" containerName="neutron-api" containerID="cri-o://262e24f9113e8184b611ad4bd820a4085b8b793192569d8c58e7e70a54d8433c" gracePeriod=30 Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.252476 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5f745c6cff-9rkw7" podUID="40dc7e17-4436-4452-a266-65d57a67779d" containerName="neutron-httpd" containerID="cri-o://5d3d1dc44ccbb08890a3ce1b240bf10ed759ba813d70174df4fde7b16fbc8eff" gracePeriod=30 Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.260003 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_2af174c7-cf23-452c-bc13-ecda2775d58d/ovsdbserver-nb/0.log" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.260068 4962 util.go:48] "No ready sandbox for pod can be found. 
Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.260068 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.265653 4962 generic.go:334] "Generic (PLEG): container finished" podID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerID="054512bf0c273e329a55f18a262ffbcb7dd5abaded475341723d2b4dc5e849fb" exitCode=0
Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.265683 4962 generic.go:334] "Generic (PLEG): container finished" podID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerID="7e76ff2eb3cf5160a1fdce8ab7db2a70edda0c5fb436d79cb130e11be846580e" exitCode=0
Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.265690 4962 generic.go:334] "Generic (PLEG): container finished" podID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerID="f0c27459819cd1d481d672fbdd91f735b40d36cc170361880628b5b806924c13" exitCode=0
Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.265697 4962 generic.go:334] "Generic (PLEG): container finished" podID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerID="bccc65019a49e86470400d5863a1f0f3a4c53c7dead1edd7e4226173d7443ed2" exitCode=0
Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.265704 4962 generic.go:334] "Generic (PLEG): container finished" podID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerID="7ce66520fa57254d5157448844d739ec610586be59fb60789632b9b85bd02222" exitCode=0
Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.265710 4962 generic.go:334] "Generic (PLEG): container finished" podID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerID="d07a3c21e9f7cc962ded5767c572003a88953fe191cf331895a5e9e48103288b" exitCode=0
Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.265720 4962 generic.go:334] "Generic (PLEG): container finished" podID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerID="5674271fdefce19a13c9b336c52d013de2b33a9a9124fca33358f8e3a0cf5881" exitCode=0
Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.265727 4962 generic.go:334] "Generic (PLEG): container finished" podID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerID="33714c39a9c3c16769f323fb660866cd0b5a9c6bf72751670fa0465d513c70cb" exitCode=0
Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.265773 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b4b582ce-b618-4911-b554-f5cae9bcee91","Type":"ContainerDied","Data":"054512bf0c273e329a55f18a262ffbcb7dd5abaded475341723d2b4dc5e849fb"}
Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.265795 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b4b582ce-b618-4911-b554-f5cae9bcee91","Type":"ContainerDied","Data":"7e76ff2eb3cf5160a1fdce8ab7db2a70edda0c5fb436d79cb130e11be846580e"}
Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.265806 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b4b582ce-b618-4911-b554-f5cae9bcee91","Type":"ContainerDied","Data":"f0c27459819cd1d481d672fbdd91f735b40d36cc170361880628b5b806924c13"}
Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.265815 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b4b582ce-b618-4911-b554-f5cae9bcee91","Type":"ContainerDied","Data":"bccc65019a49e86470400d5863a1f0f3a4c53c7dead1edd7e4226173d7443ed2"}
Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.265824 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0"
event={"ID":"b4b582ce-b618-4911-b554-f5cae9bcee91","Type":"ContainerDied","Data":"7ce66520fa57254d5157448844d739ec610586be59fb60789632b9b85bd02222"} Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.265832 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b4b582ce-b618-4911-b554-f5cae9bcee91","Type":"ContainerDied","Data":"d07a3c21e9f7cc962ded5767c572003a88953fe191cf331895a5e9e48103288b"} Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.265841 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b4b582ce-b618-4911-b554-f5cae9bcee91","Type":"ContainerDied","Data":"5674271fdefce19a13c9b336c52d013de2b33a9a9124fca33358f8e3a0cf5881"} Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.265849 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b4b582ce-b618-4911-b554-f5cae9bcee91","Type":"ContainerDied","Data":"33714c39a9c3c16769f323fb660866cd0b5a9c6bf72751670fa0465d513c70cb"} Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.296030 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2af174c7-cf23-452c-bc13-ecda2775d58d-config\") pod \"2af174c7-cf23-452c-bc13-ecda2775d58d\" (UID: \"2af174c7-cf23-452c-bc13-ecda2775d58d\") " Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.296314 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"2af174c7-cf23-452c-bc13-ecda2775d58d\" (UID: \"2af174c7-cf23-452c-bc13-ecda2775d58d\") " Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.296356 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2af174c7-cf23-452c-bc13-ecda2775d58d-ovsdbserver-nb-tls-certs\") pod \"2af174c7-cf23-452c-bc13-ecda2775d58d\" (UID: \"2af174c7-cf23-452c-bc13-ecda2775d58d\") " Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.296484 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2af174c7-cf23-452c-bc13-ecda2775d58d-combined-ca-bundle\") pod \"2af174c7-cf23-452c-bc13-ecda2775d58d\" (UID: \"2af174c7-cf23-452c-bc13-ecda2775d58d\") " Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.296528 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2af174c7-cf23-452c-bc13-ecda2775d58d-scripts\") pod \"2af174c7-cf23-452c-bc13-ecda2775d58d\" (UID: \"2af174c7-cf23-452c-bc13-ecda2775d58d\") " Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.296564 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2af174c7-cf23-452c-bc13-ecda2775d58d-ovsdb-rundir\") pod \"2af174c7-cf23-452c-bc13-ecda2775d58d\" (UID: \"2af174c7-cf23-452c-bc13-ecda2775d58d\") " Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.296594 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2af174c7-cf23-452c-bc13-ecda2775d58d-metrics-certs-tls-certs\") pod \"2af174c7-cf23-452c-bc13-ecda2775d58d\" (UID: \"2af174c7-cf23-452c-bc13-ecda2775d58d\") " Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 
13:16:31.296663 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4slpf\" (UniqueName: \"kubernetes.io/projected/2af174c7-cf23-452c-bc13-ecda2775d58d-kube-api-access-4slpf\") pod \"2af174c7-cf23-452c-bc13-ecda2775d58d\" (UID: \"2af174c7-cf23-452c-bc13-ecda2775d58d\") " Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.296870 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2af174c7-cf23-452c-bc13-ecda2775d58d-config" (OuterVolumeSpecName: "config") pod "2af174c7-cf23-452c-bc13-ecda2775d58d" (UID: "2af174c7-cf23-452c-bc13-ecda2775d58d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.297125 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-config-data\") pod \"barbican-api-7c44799d88-mmmm6\" (UID: \"dab0e7ec-9c64-491d-a655-027098042378\") " pod="openstack/barbican-api-7c44799d88-mmmm6" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.297166 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bx2f\" (UniqueName: \"kubernetes.io/projected/dab0e7ec-9c64-491d-a655-027098042378-kube-api-access-9bx2f\") pod \"barbican-api-7c44799d88-mmmm6\" (UID: \"dab0e7ec-9c64-491d-a655-027098042378\") " pod="openstack/barbican-api-7c44799d88-mmmm6" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.297209 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2af174c7-cf23-452c-bc13-ecda2775d58d-config\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.298206 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2af174c7-cf23-452c-bc13-ecda2775d58d-scripts" (OuterVolumeSpecName: "scripts") pod "2af174c7-cf23-452c-bc13-ecda2775d58d" (UID: "2af174c7-cf23-452c-bc13-ecda2775d58d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.298909 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2af174c7-cf23-452c-bc13-ecda2775d58d-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "2af174c7-cf23-452c-bc13-ecda2775d58d" (UID: "2af174c7-cf23-452c-bc13-ecda2775d58d"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:16:31 crc kubenswrapper[4962]: E1003 13:16:31.299719 4962 secret.go:188] Couldn't get secret openstack/barbican-config-data: secret "barbican-config-data" not found Oct 03 13:16:31 crc kubenswrapper[4962]: E1003 13:16:31.299797 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-config-data podName:dab0e7ec-9c64-491d-a655-027098042378 nodeName:}" failed. No retries permitted until 2025-10-03 13:16:33.299778952 +0000 UTC m=+1601.703676777 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-config-data") pod "barbican-api-7c44799d88-mmmm6" (UID: "dab0e7ec-9c64-491d-a655-027098042378") : secret "barbican-config-data" not found Oct 03 13:16:31 crc kubenswrapper[4962]: E1003 13:16:31.305518 4962 projected.go:194] Error preparing data for projected volume kube-api-access-9bx2f for pod openstack/barbican-api-7c44799d88-mmmm6: failed to fetch token: serviceaccounts "barbican-barbican" not found Oct 03 13:16:31 crc kubenswrapper[4962]: E1003 13:16:31.305583 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dab0e7ec-9c64-491d-a655-027098042378-kube-api-access-9bx2f podName:dab0e7ec-9c64-491d-a655-027098042378 nodeName:}" failed. No retries permitted until 2025-10-03 13:16:33.305565257 +0000 UTC m=+1601.709463082 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-9bx2f" (UniqueName: "kubernetes.io/projected/dab0e7ec-9c64-491d-a655-027098042378-kube-api-access-9bx2f") pod "barbican-api-7c44799d88-mmmm6" (UID: "dab0e7ec-9c64-491d-a655-027098042378") : failed to fetch token: serviceaccounts "barbican-barbican" not found Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.383144 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.383462 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="dd269d6d-5aa2-43c0-a23b-e76b52699d59" containerName="nova-metadata-log" containerID="cri-o://ae65644f1732d90a709b491cf3a8a7c7ca3acd9268609750aa5824cea960f345" gracePeriod=30 Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.384033 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="dd269d6d-5aa2-43c0-a23b-e76b52699d59" containerName="nova-metadata-metadata" containerID="cri-o://3e39d9d9f9ef98752b8331bca21aeea43f0d1658741d17c5c7ad0d95aa684075" gracePeriod=30 Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.385427 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.385671 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d329c4da-aa05-4c80-ab30-622eac56428a" containerName="nova-api-log" containerID="cri-o://024390e769ef5aec351b0a25e277cb3becb27c08b4c0917b93bfe0e1e2975164" gracePeriod=30 Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.387173 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d329c4da-aa05-4c80-ab30-622eac56428a" containerName="nova-api-api" containerID="cri-o://05b9f6f45511688aab50c27ba2558d43869979d7395ad2e599ba9c06301661e8" gracePeriod=30 Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.391450 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2af174c7-cf23-452c-bc13-ecda2775d58d-kube-api-access-4slpf" (OuterVolumeSpecName: "kube-api-access-4slpf") pod "2af174c7-cf23-452c-bc13-ecda2775d58d" (UID: "2af174c7-cf23-452c-bc13-ecda2775d58d"). InnerVolumeSpecName "kube-api-access-4slpf". 
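Both mount failures above reference objects that no longer exist (the Secret barbican-config-data and the ServiceAccount barbican-barbican), which is consistent with the namespace-wide teardown in progress: the kubelet parks each failed mount and retries after the logged durationBeforeRetry of 2s until the pod object itself is deleted. A hypothetical triage sequence from the API side, assuming an authenticated oc client (kubectl behaves the same):

```bash
# Confirm the objects named in the FailedMount errors are really absent.
oc -n openstack get secret barbican-config-data          # expect: NotFound
oc -n openstack get serviceaccount barbican-barbican     # expect: NotFound

# The same failures surface as FailedMount events on the pending pod.
oc -n openstack describe pod barbican-api-7c44799d88-mmmm6 | grep -A2 FailedMount
```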
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.397822 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.400117 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2af174c7-cf23-452c-bc13-ecda2775d58d-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.400440 4962 generic.go:334] "Generic (PLEG): container finished" podID="009b2959-1113-4574-a2ec-90bbe2d8f8ef" containerID="ef900e494d8a0abde750256fdbb2b7f39a5a8f037757e8bc1a381d77522fc261" exitCode=0 Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.400530 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"009b2959-1113-4574-a2ec-90bbe2d8f8ef","Type":"ContainerDied","Data":"ef900e494d8a0abde750256fdbb2b7f39a5a8f037757e8bc1a381d77522fc261"} Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.400543 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2af174c7-cf23-452c-bc13-ecda2775d58d-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.400589 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4slpf\" (UniqueName: \"kubernetes.io/projected/2af174c7-cf23-452c-bc13-ecda2775d58d-kube-api-access-4slpf\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:31 crc kubenswrapper[4962]: E1003 13:16:31.410140 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9 is running failed: container process not found" containerID="34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.410299 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-djwxp"] Oct 03 13:16:31 crc kubenswrapper[4962]: E1003 13:16:31.410618 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9 is running failed: container process not found" containerID="34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 13:16:31 crc kubenswrapper[4962]: E1003 13:16:31.411110 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="923838580de863a9a099f0b6d817f70b752e252d832275cca5df8e77ca46e3ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 13:16:31 crc kubenswrapper[4962]: E1003 13:16:31.411169 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9 is running failed: container process not found" containerID="34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9" 
cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 13:16:31 crc kubenswrapper[4962]: E1003 13:16:31.411186 4962 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-wvjpm" podUID="7cb4dab0-1ffc-49d4-a229-1862a33d4caa" containerName="ovsdb-server" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.413175 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-9h8s9_67b77bc9-27ae-4994-86c2-614e48ad33c6/openstack-network-exporter/0.log" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.413220 4962 generic.go:334] "Generic (PLEG): container finished" podID="67b77bc9-27ae-4994-86c2-614e48ad33c6" containerID="2733a21e59053a9fe777da7da96a6b1c13acb88fa95f66b9a9cc3889e027399a" exitCode=2 Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.413250 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-9h8s9" event={"ID":"67b77bc9-27ae-4994-86c2-614e48ad33c6","Type":"ContainerDied","Data":"2733a21e59053a9fe777da7da96a6b1c13acb88fa95f66b9a9cc3889e027399a"} Oct 03 13:16:31 crc kubenswrapper[4962]: E1003 13:16:31.413290 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="923838580de863a9a099f0b6d817f70b752e252d832275cca5df8e77ca46e3ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 13:16:31 crc kubenswrapper[4962]: E1003 13:16:31.414401 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="923838580de863a9a099f0b6d817f70b752e252d832275cca5df8e77ca46e3ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 13:16:31 crc kubenswrapper[4962]: E1003 13:16:31.414427 4962 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-wvjpm" podUID="7cb4dab0-1ffc-49d4-a229-1862a33d4caa" containerName="ovs-vswitchd" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.418358 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "2af174c7-cf23-452c-bc13-ecda2775d58d" (UID: "2af174c7-cf23-452c-bc13-ecda2775d58d"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.426956 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-djwxp"] Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.449746 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2af174c7-cf23-452c-bc13-ecda2775d58d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2af174c7-cf23-452c-bc13-ecda2775d58d" (UID: "2af174c7-cf23-452c-bc13-ecda2775d58d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.452418 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-d10f-account-create-nxl99"] Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.461884 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-d10f-account-create-nxl99"] Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.480758 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-q9c2h"] Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.492877 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-q9c2h"] Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.503374 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.509998 4962 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.510033 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2af174c7-cf23-452c-bc13-ecda2775d58d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.513756 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder4662-account-delete-kl2sp"] Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.527566 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-4662-account-create-85cwj"] Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.534520 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-4662-account-create-85cwj"] Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.541399 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-b2fwt"] Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.548407 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-ace4-account-create-qg6md"] Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.566036 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_6313803e-1bf1-4a99-8af7-cb80c0e6321c/ovsdbserver-sb/0.log" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.566125 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.569610 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-b2fwt"] Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.576870 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-ace4-account-create-qg6md"] Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.588497 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-9lwfl"] Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.595594 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-94dnt" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.602352 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-9h8s9_67b77bc9-27ae-4994-86c2-614e48ad33c6/openstack-network-exporter/0.log" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.602425 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-9h8s9" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.602490 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-b903-account-create-gkqkx"] Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.614968 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6313803e-1bf1-4a99-8af7-cb80c0e6321c-config\") pod \"6313803e-1bf1-4a99-8af7-cb80c0e6321c\" (UID: \"6313803e-1bf1-4a99-8af7-cb80c0e6321c\") " Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.615079 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"6313803e-1bf1-4a99-8af7-cb80c0e6321c\" (UID: \"6313803e-1bf1-4a99-8af7-cb80c0e6321c\") " Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.615119 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6313803e-1bf1-4a99-8af7-cb80c0e6321c-metrics-certs-tls-certs\") pod \"6313803e-1bf1-4a99-8af7-cb80c0e6321c\" (UID: \"6313803e-1bf1-4a99-8af7-cb80c0e6321c\") " Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.615161 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6313803e-1bf1-4a99-8af7-cb80c0e6321c-scripts\") pod \"6313803e-1bf1-4a99-8af7-cb80c0e6321c\" (UID: \"6313803e-1bf1-4a99-8af7-cb80c0e6321c\") " Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.615190 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6313803e-1bf1-4a99-8af7-cb80c0e6321c-ovsdb-rundir\") pod \"6313803e-1bf1-4a99-8af7-cb80c0e6321c\" (UID: \"6313803e-1bf1-4a99-8af7-cb80c0e6321c\") " Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.615221 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6313803e-1bf1-4a99-8af7-cb80c0e6321c-combined-ca-bundle\") pod \"6313803e-1bf1-4a99-8af7-cb80c0e6321c\" (UID: \"6313803e-1bf1-4a99-8af7-cb80c0e6321c\") " Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.615272 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6313803e-1bf1-4a99-8af7-cb80c0e6321c-ovsdbserver-sb-tls-certs\") pod \"6313803e-1bf1-4a99-8af7-cb80c0e6321c\" (UID: \"6313803e-1bf1-4a99-8af7-cb80c0e6321c\") " Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.615337 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2c7fm\" (UniqueName: \"kubernetes.io/projected/6313803e-1bf1-4a99-8af7-cb80c0e6321c-kube-api-access-2c7fm\") pod \"6313803e-1bf1-4a99-8af7-cb80c0e6321c\" (UID: \"6313803e-1bf1-4a99-8af7-cb80c0e6321c\") " Oct 03 13:16:31 crc kubenswrapper[4962]: E1003 
13:16:31.615814 4962 secret.go:188] Couldn't get secret openstack/barbican-config-data: secret "barbican-config-data" not found Oct 03 13:16:31 crc kubenswrapper[4962]: E1003 13:16:31.615852 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ae87940-f07d-4213-bc0b-da0b3a2bba84-config-data podName:0ae87940-f07d-4213-bc0b-da0b3a2bba84 nodeName:}" failed. No retries permitted until 2025-10-03 13:16:33.615839681 +0000 UTC m=+1602.019737516 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/0ae87940-f07d-4213-bc0b-da0b3a2bba84-config-data") pod "barbican-worker-c97f5c65f-s279k" (UID: "0ae87940-f07d-4213-bc0b-da0b3a2bba84") : secret "barbican-config-data" not found Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.616589 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6313803e-1bf1-4a99-8af7-cb80c0e6321c-config" (OuterVolumeSpecName: "config") pod "6313803e-1bf1-4a99-8af7-cb80c0e6321c" (UID: "6313803e-1bf1-4a99-8af7-cb80c0e6321c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.620085 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "6313803e-1bf1-4a99-8af7-cb80c0e6321c" (UID: "6313803e-1bf1-4a99-8af7-cb80c0e6321c"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.629613 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6313803e-1bf1-4a99-8af7-cb80c0e6321c-scripts" (OuterVolumeSpecName: "scripts") pod "6313803e-1bf1-4a99-8af7-cb80c0e6321c" (UID: "6313803e-1bf1-4a99-8af7-cb80c0e6321c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.629743 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-b903-account-create-gkqkx"] Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.630255 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6313803e-1bf1-4a99-8af7-cb80c0e6321c-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "6313803e-1bf1-4a99-8af7-cb80c0e6321c" (UID: "6313803e-1bf1-4a99-8af7-cb80c0e6321c"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.631669 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-9lwfl"] Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.633687 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6313803e-1bf1-4a99-8af7-cb80c0e6321c-kube-api-access-2c7fm" (OuterVolumeSpecName: "kube-api-access-2c7fm") pod "6313803e-1bf1-4a99-8af7-cb80c0e6321c" (UID: "6313803e-1bf1-4a99-8af7-cb80c0e6321c"). InnerVolumeSpecName "kube-api-access-2c7fm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.642904 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell0b903-account-delete-dqf57"] Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.650094 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-7xjlw"] Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.654872 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-6963-account-create-gqhqw"] Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.667207 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="221bdd26-0fec-49e5-86ec-c2aefe7a5902" containerName="rabbitmq" containerID="cri-o://ea53a4ccfd30918132162b300d37c963556517040abcef111b31c53d24dd2493" gracePeriod=604800 Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.682967 4962 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.692915 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-6963-account-create-gqhqw"] Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.705143 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6313803e-1bf1-4a99-8af7-cb80c0e6321c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6313803e-1bf1-4a99-8af7-cb80c0e6321c" (UID: "6313803e-1bf1-4a99-8af7-cb80c0e6321c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.718485 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/67b77bc9-27ae-4994-86c2-614e48ad33c6-ovs-rundir\") pod \"67b77bc9-27ae-4994-86c2-614e48ad33c6\" (UID: \"67b77bc9-27ae-4994-86c2-614e48ad33c6\") " Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.718578 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb190059-74a6-4ffe-88a4-5fcfd46812a0-config\") pod \"eb190059-74a6-4ffe-88a4-5fcfd46812a0\" (UID: \"eb190059-74a6-4ffe-88a4-5fcfd46812a0\") " Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.718716 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67b77bc9-27ae-4994-86c2-614e48ad33c6-combined-ca-bundle\") pod \"67b77bc9-27ae-4994-86c2-614e48ad33c6\" (UID: \"67b77bc9-27ae-4994-86c2-614e48ad33c6\") " Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.718733 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb190059-74a6-4ffe-88a4-5fcfd46812a0-dns-svc\") pod \"eb190059-74a6-4ffe-88a4-5fcfd46812a0\" (UID: \"eb190059-74a6-4ffe-88a4-5fcfd46812a0\") " Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.718761 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/67b77bc9-27ae-4994-86c2-614e48ad33c6-metrics-certs-tls-certs\") pod \"67b77bc9-27ae-4994-86c2-614e48ad33c6\" (UID: \"67b77bc9-27ae-4994-86c2-614e48ad33c6\") " Oct 03 13:16:31 crc kubenswrapper[4962]: 
I1003 13:16:31.718790 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/67b77bc9-27ae-4994-86c2-614e48ad33c6-ovn-rundir\") pod \"67b77bc9-27ae-4994-86c2-614e48ad33c6\" (UID: \"67b77bc9-27ae-4994-86c2-614e48ad33c6\") " Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.718817 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eb190059-74a6-4ffe-88a4-5fcfd46812a0-dns-swift-storage-0\") pod \"eb190059-74a6-4ffe-88a4-5fcfd46812a0\" (UID: \"eb190059-74a6-4ffe-88a4-5fcfd46812a0\") " Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.718848 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qb9b8\" (UniqueName: \"kubernetes.io/projected/eb190059-74a6-4ffe-88a4-5fcfd46812a0-kube-api-access-qb9b8\") pod \"eb190059-74a6-4ffe-88a4-5fcfd46812a0\" (UID: \"eb190059-74a6-4ffe-88a4-5fcfd46812a0\") " Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.718867 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67b77bc9-27ae-4994-86c2-614e48ad33c6-config\") pod \"67b77bc9-27ae-4994-86c2-614e48ad33c6\" (UID: \"67b77bc9-27ae-4994-86c2-614e48ad33c6\") " Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.718892 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb190059-74a6-4ffe-88a4-5fcfd46812a0-ovsdbserver-nb\") pod \"eb190059-74a6-4ffe-88a4-5fcfd46812a0\" (UID: \"eb190059-74a6-4ffe-88a4-5fcfd46812a0\") " Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.718935 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jldg6\" (UniqueName: \"kubernetes.io/projected/67b77bc9-27ae-4994-86c2-614e48ad33c6-kube-api-access-jldg6\") pod \"67b77bc9-27ae-4994-86c2-614e48ad33c6\" (UID: \"67b77bc9-27ae-4994-86c2-614e48ad33c6\") " Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.718981 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb190059-74a6-4ffe-88a4-5fcfd46812a0-ovsdbserver-sb\") pod \"eb190059-74a6-4ffe-88a4-5fcfd46812a0\" (UID: \"eb190059-74a6-4ffe-88a4-5fcfd46812a0\") " Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.719375 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2c7fm\" (UniqueName: \"kubernetes.io/projected/6313803e-1bf1-4a99-8af7-cb80c0e6321c-kube-api-access-2c7fm\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.719387 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6313803e-1bf1-4a99-8af7-cb80c0e6321c-config\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.719396 4962 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.719414 4962 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.719423 4962 
reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6313803e-1bf1-4a99-8af7-cb80c0e6321c-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.719431 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6313803e-1bf1-4a99-8af7-cb80c0e6321c-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.719440 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6313803e-1bf1-4a99-8af7-cb80c0e6321c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.719870 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/67b77bc9-27ae-4994-86c2-614e48ad33c6-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "67b77bc9-27ae-4994-86c2-614e48ad33c6" (UID: "67b77bc9-27ae-4994-86c2-614e48ad33c6"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.719922 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/67b77bc9-27ae-4994-86c2-614e48ad33c6-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "67b77bc9-27ae-4994-86c2-614e48ad33c6" (UID: "67b77bc9-27ae-4994-86c2-614e48ad33c6"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.722451 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67b77bc9-27ae-4994-86c2-614e48ad33c6-config" (OuterVolumeSpecName: "config") pod "67b77bc9-27ae-4994-86c2-614e48ad33c6" (UID: "67b77bc9-27ae-4994-86c2-614e48ad33c6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.729280 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb190059-74a6-4ffe-88a4-5fcfd46812a0-kube-api-access-qb9b8" (OuterVolumeSpecName: "kube-api-access-qb9b8") pod "eb190059-74a6-4ffe-88a4-5fcfd46812a0" (UID: "eb190059-74a6-4ffe-88a4-5fcfd46812a0"). InnerVolumeSpecName "kube-api-access-qb9b8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.733332 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67b77bc9-27ae-4994-86c2-614e48ad33c6-kube-api-access-jldg6" (OuterVolumeSpecName: "kube-api-access-jldg6") pod "67b77bc9-27ae-4994-86c2-614e48ad33c6" (UID: "67b77bc9-27ae-4994-86c2-614e48ad33c6"). InnerVolumeSpecName "kube-api-access-jldg6". 
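Note the outlier a few records up (13:16:31.667207): rabbitmq-cell1-server-0 is killed with gracePeriod=604800, that is seven days, against the 30s used by almost every other pod in this log. The value is taken from the pod's terminationGracePeriodSeconds, presumably set this high by the RabbitMQ operator so the broker can shut down cleanly. A quick check; the jsonpath query is an illustrative assumption, the field name is standard:

```bash
# 604800 seconds expressed in days
echo $((604800 / 86400))   # -> 7

# Where the grace period comes from on the pod spec
oc -n openstack get pod rabbitmq-cell1-server-0 \
  -o jsonpath='{.spec.terminationGracePeriodSeconds}'
```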
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.743824 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="438da193-7b02-4101-a45c-9e0f83c41051" containerName="galera" containerID="cri-o://975c9c39028f01f58f2aea68725568502425600e3e03782630767e28394af41f" gracePeriod=30 Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.743952 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-7xjlw"] Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.747624 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapi6963-account-delete-mg78g"] Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.752729 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2af174c7-cf23-452c-bc13-ecda2775d58d-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "2af174c7-cf23-452c-bc13-ecda2775d58d" (UID: "2af174c7-cf23-452c-bc13-ecda2775d58d"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.758169 4962 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.772425 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.772868 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="85710c21-98fe-4148-8ef1-ec9f4e9ef311" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://cce61d9b927002bd5d5d741af0d9a03b88958f9b494160698ba0a870465f6ee7" gracePeriod=30 Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.779062 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6b7fd754f4-rx9k9"] Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.779386 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6b7fd754f4-rx9k9" podUID="c56a1d1d-7e30-4bb8-a5a7-068afc055cb8" containerName="barbican-api-log" containerID="cri-o://2fe1fb6596fd4bab23e8f5b8fffa9f204b8288c3c24b70ea1583257b39048287" gracePeriod=30 Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.779538 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6b7fd754f4-rx9k9" podUID="c56a1d1d-7e30-4bb8-a5a7-068afc055cb8" containerName="barbican-api" containerID="cri-o://efdd11f8bd8386aa2fc051d59f9344ed094988bb97638532765b2b52ec56a7ba" gracePeriod=30 Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.791513 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7c44799d88-mmmm6"] Oct 03 13:16:31 crc kubenswrapper[4962]: E1003 13:16:31.793624 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config-data kube-api-access-9bx2f], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/barbican-api-7c44799d88-mmmm6" podUID="dab0e7ec-9c64-491d-a655-027098042378" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.807260 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/2af174c7-cf23-452c-bc13-ecda2775d58d-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "2af174c7-cf23-452c-bc13-ecda2775d58d" (UID: "2af174c7-cf23-452c-bc13-ecda2775d58d"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.824767 4962 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/67b77bc9-27ae-4994-86c2-614e48ad33c6-ovs-rundir\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.824958 4962 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2af174c7-cf23-452c-bc13-ecda2775d58d-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.825024 4962 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/67b77bc9-27ae-4994-86c2-614e48ad33c6-ovn-rundir\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.825078 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qb9b8\" (UniqueName: \"kubernetes.io/projected/eb190059-74a6-4ffe-88a4-5fcfd46812a0-kube-api-access-qb9b8\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:31 crc kubenswrapper[4962]: E1003 13:16:31.825127 4962 secret.go:188] Couldn't get secret openstack/barbican-config-data: secret "barbican-config-data" not found Oct 03 13:16:31 crc kubenswrapper[4962]: E1003 13:16:31.825234 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-config-data podName:e6f0fc0a-ae8e-445e-ad05-591b7ab00886 nodeName:}" failed. No retries permitted until 2025-10-03 13:16:33.825203025 +0000 UTC m=+1602.229100860 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-config-data") pod "barbican-keystone-listener-59bf856dfd-t86xg" (UID: "e6f0fc0a-ae8e-445e-ad05-591b7ab00886") : secret "barbican-config-data" not found Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.824991 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-59bf856dfd-t86xg"] Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.825131 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67b77bc9-27ae-4994-86c2-614e48ad33c6-config\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.825308 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2af174c7-cf23-452c-bc13-ecda2775d58d-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.825326 4962 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.825339 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jldg6\" (UniqueName: \"kubernetes.io/projected/67b77bc9-27ae-4994-86c2-614e48ad33c6-kube-api-access-jldg6\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:31 crc kubenswrapper[4962]: E1003 13:16:31.825438 4962 secret.go:188] Couldn't get secret openstack/barbican-keystone-listener-config-data: secret "barbican-keystone-listener-config-data" not found Oct 03 13:16:31 crc kubenswrapper[4962]: E1003 13:16:31.825583 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-config-data-custom podName:e6f0fc0a-ae8e-445e-ad05-591b7ab00886 nodeName:}" failed. No retries permitted until 2025-10-03 13:16:33.825563645 +0000 UTC m=+1602.229461480 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-config-data-custom") pod "barbican-keystone-listener-59bf856dfd-t86xg" (UID: "e6f0fc0a-ae8e-445e-ad05-591b7ab00886") : secret "barbican-keystone-listener-config-data" not found Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.882399 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-bd6989694-qnv2s"] Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.882825 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-bd6989694-qnv2s" podUID="3c111271-43ed-48b3-b6ed-a6d02efb9113" containerName="barbican-keystone-listener-log" containerID="cri-o://86b2ecabe6ed78973d278885e95f62a78704e8d2b70f094254e52932b6e6c618" gracePeriod=30 Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.883473 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-bd6989694-qnv2s" podUID="3c111271-43ed-48b3-b6ed-a6d02efb9113" containerName="barbican-keystone-listener" containerID="cri-o://31ec554c86926fa60a6d1b72601dfc4ca004a83f21bd45f79846d05688388ebf" gracePeriod=30 Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.927569 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-77d888f4df-52rjc"] Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.927816 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-77d888f4df-52rjc" podUID="2ecb3944-c441-4879-8220-aa32d7436c1f" containerName="barbican-worker-log" containerID="cri-o://b5849a5ee0085259fce2aeb3f161a304d34082d7cf3935fa4904ae128a9bdf59" gracePeriod=30 Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.928149 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-77d888f4df-52rjc" podUID="2ecb3944-c441-4879-8220-aa32d7436c1f" containerName="barbican-worker" containerID="cri-o://eddb0fb647a83d84a8dbf9083c9e196483cf75a4bb08e36e17e23de03fd5c34b" gracePeriod=30 Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.939833 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6313803e-1bf1-4a99-8af7-cb80c0e6321c-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "6313803e-1bf1-4a99-8af7-cb80c0e6321c" (UID: "6313803e-1bf1-4a99-8af7-cb80c0e6321c"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.944434 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-c97f5c65f-s279k"] Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.951011 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.952055 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb190059-74a6-4ffe-88a4-5fcfd46812a0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "eb190059-74a6-4ffe-88a4-5fcfd46812a0" (UID: "eb190059-74a6-4ffe-88a4-5fcfd46812a0"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.968987 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67b77bc9-27ae-4994-86c2-614e48ad33c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "67b77bc9-27ae-4994-86c2-614e48ad33c6" (UID: "67b77bc9-27ae-4994-86c2-614e48ad33c6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.969599 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 13:16:31 crc kubenswrapper[4962]: I1003 13:16:31.969857 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="d36308a0-1b17-4986-adb2-2833b444a239" containerName="nova-scheduler-scheduler" containerID="cri-o://d52557ff30e196f1185add245797713d5c9f8bef9d3167e6d50a37017d0126f3" gracePeriod=30 Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.004008 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-c97f5c65f-s279k"] Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.008490 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb190059-74a6-4ffe-88a4-5fcfd46812a0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "eb190059-74a6-4ffe-88a4-5fcfd46812a0" (UID: "eb190059-74a6-4ffe-88a4-5fcfd46812a0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.014269 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="862ad9df-af58-4304-9ad5-7faba334e2d9" containerName="rabbitmq" containerID="cri-o://ffd50edd39ffcd28008bcc1779cfab62afd89dc665b9322ffc084495ca1c56d2" gracePeriod=604800 Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.030961 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb190059-74a6-4ffe-88a4-5fcfd46812a0-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.030988 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67b77bc9-27ae-4994-86c2-614e48ad33c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.030999 4962 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6313803e-1bf1-4a99-8af7-cb80c0e6321c-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.031007 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb190059-74a6-4ffe-88a4-5fcfd46812a0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.056700 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-59bf856dfd-t86xg"] Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.067126 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement453e-account-delete-v68p5"] Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.068556 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/eb190059-74a6-4ffe-88a4-5fcfd46812a0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "eb190059-74a6-4ffe-88a4-5fcfd46812a0" (UID: "eb190059-74a6-4ffe-88a4-5fcfd46812a0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.071510 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb190059-74a6-4ffe-88a4-5fcfd46812a0-config" (OuterVolumeSpecName: "config") pod "eb190059-74a6-4ffe-88a4-5fcfd46812a0" (UID: "eb190059-74a6-4ffe-88a4-5fcfd46812a0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.077167 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.077411 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="d22955d6-a957-458f-8181-5fea18cedc90" containerName="nova-cell1-conductor-conductor" containerID="cri-o://0ce8b4a4ac9d0b8cc21e8feaa51be48f8d9e21fc13fb4c98b22efe982cb9565b" gracePeriod=30 Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.085600 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb190059-74a6-4ffe-88a4-5fcfd46812a0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "eb190059-74a6-4ffe-88a4-5fcfd46812a0" (UID: "eb190059-74a6-4ffe-88a4-5fcfd46812a0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.085684 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tkw44"] Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.097060 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tkw44"] Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.113569 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mrtt5"] Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.124649 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mrtt5"] Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.131648 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.131986 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="85ea0653-966b-47ff-b8aa-b6ad2b5810ca" containerName="nova-cell0-conductor-conductor" containerID="cri-o://c9c8ed01d13ca0f8ff902a9439408e51baa90fc268f710fa26c3c09fc4aeec3c" gracePeriod=30 Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.132941 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb190059-74a6-4ffe-88a4-5fcfd46812a0-config\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.132962 4962 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eb190059-74a6-4ffe-88a4-5fcfd46812a0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.132971 4962 reconciler_common.go:293] "Volume 
detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb190059-74a6-4ffe-88a4-5fcfd46812a0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:32 crc kubenswrapper[4962]: E1003 13:16:32.133236 4962 secret.go:188] Couldn't get secret openstack/barbican-api-config-data: secret "barbican-api-config-data" not found Oct 03 13:16:32 crc kubenswrapper[4962]: E1003 13:16:32.133341 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-config-data-custom podName:dab0e7ec-9c64-491d-a655-027098042378 nodeName:}" failed. No retries permitted until 2025-10-03 13:16:34.133320581 +0000 UTC m=+1602.537218416 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-config-data-custom") pod "barbican-api-7c44799d88-mmmm6" (UID: "dab0e7ec-9c64-491d-a655-027098042378") : secret "barbican-api-config-data" not found Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.141394 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron8f54-account-delete-pznh9"] Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.150660 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance2f72-account-delete-wdbvb"] Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.154954 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67b77bc9-27ae-4994-86c2-614e48ad33c6-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "67b77bc9-27ae-4994-86c2-614e48ad33c6" (UID: "67b77bc9-27ae-4994-86c2-614e48ad33c6"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.189446 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6313803e-1bf1-4a99-8af7-cb80c0e6321c-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "6313803e-1bf1-4a99-8af7-cb80c0e6321c" (UID: "6313803e-1bf1-4a99-8af7-cb80c0e6321c"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.235285 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6313803e-1bf1-4a99-8af7-cb80c0e6321c-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:32 crc kubenswrapper[4962]: E1003 13:16:32.235391 4962 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.235434 4962 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/67b77bc9-27ae-4994-86c2-614e48ad33c6-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:32 crc kubenswrapper[4962]: E1003 13:16:32.235500 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/862ad9df-af58-4304-9ad5-7faba334e2d9-config-data podName:862ad9df-af58-4304-9ad5-7faba334e2d9 nodeName:}" failed. No retries permitted until 2025-10-03 13:16:34.235480645 +0000 UTC m=+1602.639378480 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/862ad9df-af58-4304-9ad5-7faba334e2d9-config-data") pod "rabbitmq-server-0" (UID: "862ad9df-af58-4304-9ad5-7faba334e2d9") : configmap "rabbitmq-config-data" not found Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.261974 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b9e6c89-714e-4efd-9adc-c15cd5b3eb6b" path="/var/lib/kubelet/pods/0b9e6c89-714e-4efd-9adc-c15cd5b3eb6b/volumes" Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.263058 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fbdbc1d-a3bc-4bd3-80aa-a590977a4329" path="/var/lib/kubelet/pods/0fbdbc1d-a3bc-4bd3-80aa-a590977a4329/volumes" Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.263595 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="164dbb03-b802-46d3-8dbd-1b6dc90cda51" path="/var/lib/kubelet/pods/164dbb03-b802-46d3-8dbd-1b6dc90cda51/volumes" Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.264240 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22874ecf-641f-46a1-bbb5-4d27b38bf001" path="/var/lib/kubelet/pods/22874ecf-641f-46a1-bbb5-4d27b38bf001/volumes" Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.269795 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="280fc068-9a62-474f-a81f-fc5a28a7e722" path="/var/lib/kubelet/pods/280fc068-9a62-474f-a81f-fc5a28a7e722/volumes" Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.273649 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32f6cf3d-edc9-48a9-8c78-0732b6693293" path="/var/lib/kubelet/pods/32f6cf3d-edc9-48a9-8c78-0732b6693293/volumes" Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.274523 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4129f89f-a08d-4374-9661-15e30b59b01f" path="/var/lib/kubelet/pods/4129f89f-a08d-4374-9661-15e30b59b01f/volumes" Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.282613 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48bf959f-6a35-4204-bd4d-6e0a62a2a7db" path="/var/lib/kubelet/pods/48bf959f-6a35-4204-bd4d-6e0a62a2a7db/volumes" Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.284341 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61cfe20f-97f2-444b-9a56-00a3c22d7ba7" path="/var/lib/kubelet/pods/61cfe20f-97f2-444b-9a56-00a3c22d7ba7/volumes" Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.285780 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62ffa74e-e3cd-4a30-8197-12bd37d64a65" path="/var/lib/kubelet/pods/62ffa74e-e3cd-4a30-8197-12bd37d64a65/volumes" Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.291117 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67400081-5ed3-48dc-be64-b3cf19bcf3c4" path="/var/lib/kubelet/pods/67400081-5ed3-48dc-be64-b3cf19bcf3c4/volumes" Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.291676 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68b7fa90-3c0c-4c9d-9709-fdd39699b685" path="/var/lib/kubelet/pods/68b7fa90-3c0c-4c9d-9709-fdd39699b685/volumes" Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.292238 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d5982a3-b38c-43bf-9edf-4e0216fb3374" path="/var/lib/kubelet/pods/6d5982a3-b38c-43bf-9edf-4e0216fb3374/volumes" Oct 03 13:16:32 crc kubenswrapper[4962]: 
I1003 13:16:32.293918 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72bb7aab-b5f7-46ba-bf39-471de4e5090f" path="/var/lib/kubelet/pods/72bb7aab-b5f7-46ba-bf39-471de4e5090f/volumes" Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.299214 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7768c654-4c5b-44f0-944d-4c4507a252b3" path="/var/lib/kubelet/pods/7768c654-4c5b-44f0-944d-4c4507a252b3/volumes" Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.299959 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="866c8e6b-3fdd-442c-98d4-cf44b6ef098c" path="/var/lib/kubelet/pods/866c8e6b-3fdd-442c-98d4-cf44b6ef098c/volumes" Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.299979 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.311205 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8c59dc3-1749-4e12-aa01-09b6ed9934d8" path="/var/lib/kubelet/pods/d8c59dc3-1749-4e12-aa01-09b6ed9934d8/volumes" Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.332834 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7872ef4-f86f-430d-b77e-3249dbda3a80" path="/var/lib/kubelet/pods/f7872ef4-f86f-430d-b77e-3249dbda3a80/volumes" Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.336148 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9dzd\" (UniqueName: \"kubernetes.io/projected/75210a15-c36f-4be9-9709-ceb4eb2c4646-kube-api-access-r9dzd\") pod \"75210a15-c36f-4be9-9709-ceb4eb2c4646\" (UID: \"75210a15-c36f-4be9-9709-ceb4eb2c4646\") " Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.336227 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/75210a15-c36f-4be9-9709-ceb4eb2c4646-openstack-config-secret\") pod \"75210a15-c36f-4be9-9709-ceb4eb2c4646\" (UID: \"75210a15-c36f-4be9-9709-ceb4eb2c4646\") " Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.336300 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75210a15-c36f-4be9-9709-ceb4eb2c4646-combined-ca-bundle\") pod \"75210a15-c36f-4be9-9709-ceb4eb2c4646\" (UID: \"75210a15-c36f-4be9-9709-ceb4eb2c4646\") " Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.336419 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/75210a15-c36f-4be9-9709-ceb4eb2c4646-openstack-config\") pod \"75210a15-c36f-4be9-9709-ceb4eb2c4646\" (UID: \"75210a15-c36f-4be9-9709-ceb4eb2c4646\") " Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.376043 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.397566 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75210a15-c36f-4be9-9709-ceb4eb2c4646-kube-api-access-r9dzd" (OuterVolumeSpecName: "kube-api-access-r9dzd") pod "75210a15-c36f-4be9-9709-ceb4eb2c4646" (UID: "75210a15-c36f-4be9-9709-ceb4eb2c4646"). InnerVolumeSpecName "kube-api-access-r9dzd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.401302 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-68b6c975-4cb8j"] Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.401545 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-68b6c975-4cb8j" podUID="05f2e935-e9b5-49ab-8a2a-30b15840bae9" containerName="proxy-httpd" containerID="cri-o://fde2842fbb361eeef96c1f37f5ca7accd441dd0d4530fb4fc8d6c4be4392db6f" gracePeriod=30 Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.401607 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-68b6c975-4cb8j" podUID="05f2e935-e9b5-49ab-8a2a-30b15840bae9" containerName="proxy-server" containerID="cri-o://036194e68dbb515945afa3ad089ad8f4474610c770e29c2e3ac03647eae66d7d" gracePeriod=30 Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.438751 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/009b2959-1113-4574-a2ec-90bbe2d8f8ef-combined-ca-bundle\") pod \"009b2959-1113-4574-a2ec-90bbe2d8f8ef\" (UID: \"009b2959-1113-4574-a2ec-90bbe2d8f8ef\") " Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.438870 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/009b2959-1113-4574-a2ec-90bbe2d8f8ef-config-data-custom\") pod \"009b2959-1113-4574-a2ec-90bbe2d8f8ef\" (UID: \"009b2959-1113-4574-a2ec-90bbe2d8f8ef\") " Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.439017 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/009b2959-1113-4574-a2ec-90bbe2d8f8ef-etc-machine-id\") pod \"009b2959-1113-4574-a2ec-90bbe2d8f8ef\" (UID: \"009b2959-1113-4574-a2ec-90bbe2d8f8ef\") " Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.439072 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ld8cq\" (UniqueName: \"kubernetes.io/projected/009b2959-1113-4574-a2ec-90bbe2d8f8ef-kube-api-access-ld8cq\") pod \"009b2959-1113-4574-a2ec-90bbe2d8f8ef\" (UID: \"009b2959-1113-4574-a2ec-90bbe2d8f8ef\") " Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.439111 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/009b2959-1113-4574-a2ec-90bbe2d8f8ef-scripts\") pod \"009b2959-1113-4574-a2ec-90bbe2d8f8ef\" (UID: \"009b2959-1113-4574-a2ec-90bbe2d8f8ef\") " Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.439338 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/009b2959-1113-4574-a2ec-90bbe2d8f8ef-config-data\") pod \"009b2959-1113-4574-a2ec-90bbe2d8f8ef\" (UID: \"009b2959-1113-4574-a2ec-90bbe2d8f8ef\") " Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.440279 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9dzd\" (UniqueName: \"kubernetes.io/projected/75210a15-c36f-4be9-9709-ceb4eb2c4646-kube-api-access-r9dzd\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.446814 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/009b2959-1113-4574-a2ec-90bbe2d8f8ef-etc-machine-id" 
(OuterVolumeSpecName: "etc-machine-id") pod "009b2959-1113-4574-a2ec-90bbe2d8f8ef" (UID: "009b2959-1113-4574-a2ec-90bbe2d8f8ef"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.473906 4962 generic.go:334] "Generic (PLEG): container finished" podID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerID="06a77f3c0c79be2df9a379f7d27bcc5d75db28ce32e20e774dd964566de558be" exitCode=0 Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.473944 4962 generic.go:334] "Generic (PLEG): container finished" podID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerID="255dd1c4cb38e6b82f47f9c570da57cc07f7f5e8c11c54bb9966d8c730771ef6" exitCode=0 Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.473954 4962 generic.go:334] "Generic (PLEG): container finished" podID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerID="a8806f325247419ebf9ee453e77f3493ec2be61562010341eec779899b644330" exitCode=0 Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.473962 4962 generic.go:334] "Generic (PLEG): container finished" podID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerID="0027d40b3fd7f4cac601a15c7999e155ece2e4687617a83b85e72dd63015f85e" exitCode=0 Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.473970 4962 generic.go:334] "Generic (PLEG): container finished" podID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerID="770e06f348aaf3989bf45ae8703e1cff216acdce48c3de88da4323e4ade168ff" exitCode=0 Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.473981 4962 generic.go:334] "Generic (PLEG): container finished" podID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerID="1953908fb8f3a3d9cd983f3f51df79d091442b3b2351b35d0c858fe9e4a4b278" exitCode=0 Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.474039 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b4b582ce-b618-4911-b554-f5cae9bcee91","Type":"ContainerDied","Data":"06a77f3c0c79be2df9a379f7d27bcc5d75db28ce32e20e774dd964566de558be"} Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.474073 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b4b582ce-b618-4911-b554-f5cae9bcee91","Type":"ContainerDied","Data":"255dd1c4cb38e6b82f47f9c570da57cc07f7f5e8c11c54bb9966d8c730771ef6"} Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.474087 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b4b582ce-b618-4911-b554-f5cae9bcee91","Type":"ContainerDied","Data":"a8806f325247419ebf9ee453e77f3493ec2be61562010341eec779899b644330"} Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.474098 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b4b582ce-b618-4911-b554-f5cae9bcee91","Type":"ContainerDied","Data":"0027d40b3fd7f4cac601a15c7999e155ece2e4687617a83b85e72dd63015f85e"} Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.474108 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b4b582ce-b618-4911-b554-f5cae9bcee91","Type":"ContainerDied","Data":"770e06f348aaf3989bf45ae8703e1cff216acdce48c3de88da4323e4ade168ff"} Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.474118 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b4b582ce-b618-4911-b554-f5cae9bcee91","Type":"ContainerDied","Data":"1953908fb8f3a3d9cd983f3f51df79d091442b3b2351b35d0c858fe9e4a4b278"} 
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.475983 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/009b2959-1113-4574-a2ec-90bbe2d8f8ef-kube-api-access-ld8cq" (OuterVolumeSpecName: "kube-api-access-ld8cq") pod "009b2959-1113-4574-a2ec-90bbe2d8f8ef" (UID: "009b2959-1113-4574-a2ec-90bbe2d8f8ef"). InnerVolumeSpecName "kube-api-access-ld8cq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.478070 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/009b2959-1113-4574-a2ec-90bbe2d8f8ef-scripts" (OuterVolumeSpecName: "scripts") pod "009b2959-1113-4574-a2ec-90bbe2d8f8ef" (UID: "009b2959-1113-4574-a2ec-90bbe2d8f8ef"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.506284 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder4662-account-delete-kl2sp"]
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.508973 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-9h8s9_67b77bc9-27ae-4994-86c2-614e48ad33c6/openstack-network-exporter/0.log"
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.509090 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-9h8s9" event={"ID":"67b77bc9-27ae-4994-86c2-614e48ad33c6","Type":"ContainerDied","Data":"af26d22b1028d0209d5f2b95d69442c9d5417dbdbc47dd91ddd4362dee014076"}
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.509790 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-9h8s9"
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.510369 4962 scope.go:117] "RemoveContainer" containerID="2733a21e59053a9fe777da7da96a6b1c13acb88fa95f66b9a9cc3889e027399a"
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.513901 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/009b2959-1113-4574-a2ec-90bbe2d8f8ef-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "009b2959-1113-4574-a2ec-90bbe2d8f8ef" (UID: "009b2959-1113-4574-a2ec-90bbe2d8f8ef"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.520856 4962 generic.go:334] "Generic (PLEG): container finished" podID="dd269d6d-5aa2-43c0-a23b-e76b52699d59" containerID="ae65644f1732d90a709b491cf3a8a7c7ca3acd9268609750aa5824cea960f345" exitCode=143
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.520936 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dd269d6d-5aa2-43c0-a23b-e76b52699d59","Type":"ContainerDied","Data":"ae65644f1732d90a709b491cf3a8a7c7ca3acd9268609750aa5824cea960f345"}
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.527711 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell0b903-account-delete-dqf57"]
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.539533 4962 generic.go:334] "Generic (PLEG): container finished" podID="2ecb3944-c441-4879-8220-aa32d7436c1f" containerID="b5849a5ee0085259fce2aeb3f161a304d34082d7cf3935fa4904ae128a9bdf59" exitCode=143
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.539670 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-77d888f4df-52rjc" event={"ID":"2ecb3944-c441-4879-8220-aa32d7436c1f","Type":"ContainerDied","Data":"b5849a5ee0085259fce2aeb3f161a304d34082d7cf3935fa4904ae128a9bdf59"}
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.542730 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/009b2959-1113-4574-a2ec-90bbe2d8f8ef-config-data-custom\") on node \"crc\" DevicePath \"\""
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.542764 4962 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/009b2959-1113-4574-a2ec-90bbe2d8f8ef-etc-machine-id\") on node \"crc\" DevicePath \"\""
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.542775 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ld8cq\" (UniqueName: \"kubernetes.io/projected/009b2959-1113-4574-a2ec-90bbe2d8f8ef-kube-api-access-ld8cq\") on node \"crc\" DevicePath \"\""
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.542784 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/009b2959-1113-4574-a2ec-90bbe2d8f8ef-scripts\") on node \"crc\" DevicePath \"\""
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.565321 4962 generic.go:334] "Generic (PLEG): container finished" podID="7cb4dab0-1ffc-49d4-a229-1862a33d4caa" containerID="34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9" exitCode=0
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.565375 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-wvjpm" event={"ID":"7cb4dab0-1ffc-49d4-a229-1862a33d4caa","Type":"ContainerDied","Data":"34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9"}
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.570045 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement453e-account-delete-v68p5" event={"ID":"b42a368b-6dd4-4bb0-83a8-d79138605ec9","Type":"ContainerStarted","Data":"8cc993a8ce67c8187a388f922b918d9316e02673b97d3cbfb45dea600eb87635"}
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.573312 4962 generic.go:334] "Generic (PLEG): container finished" podID="d329c4da-aa05-4c80-ab30-622eac56428a" containerID="024390e769ef5aec351b0a25e277cb3becb27c08b4c0917b93bfe0e1e2975164" exitCode=143
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.573376 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-9h8s9"]
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.573397 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d329c4da-aa05-4c80-ab30-622eac56428a","Type":"ContainerDied","Data":"024390e769ef5aec351b0a25e277cb3becb27c08b4c0917b93bfe0e1e2975164"}
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.575727 4962 generic.go:334] "Generic (PLEG): container finished" podID="c56a1d1d-7e30-4bb8-a5a7-068afc055cb8" containerID="2fe1fb6596fd4bab23e8f5b8fffa9f204b8288c3c24b70ea1583257b39048287" exitCode=143
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.575764 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b7fd754f4-rx9k9" event={"ID":"c56a1d1d-7e30-4bb8-a5a7-068afc055cb8","Type":"ContainerDied","Data":"2fe1fb6596fd4bab23e8f5b8fffa9f204b8288c3c24b70ea1583257b39048287"}
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.578372 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_6313803e-1bf1-4a99-8af7-cb80c0e6321c/ovsdbserver-sb/0.log"
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.578423 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"6313803e-1bf1-4a99-8af7-cb80c0e6321c","Type":"ContainerDied","Data":"b9a76d889ebb424a0ed720a6e50124c449e1a1c1dd26e18bc04643ed5691d0e6"}
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.578498 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.588871 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-59bf856dfd-t86xg" event={"ID":"e6f0fc0a-ae8e-445e-ad05-591b7ab00886","Type":"ContainerStarted","Data":"06506c18461a638a0a9db0ec818f7f15103331fa5960e8e6311c11eeb4b075e7"}
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.590659 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-c97f5c65f-s279k" event={"ID":"0ae87940-f07d-4213-bc0b-da0b3a2bba84","Type":"ContainerStarted","Data":"8428c1bd276857bc0ab39f071664e6f6e2b7c216f27e36d161897444b38331fd"}
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.593147 4962 generic.go:334] "Generic (PLEG): container finished" podID="40dc7e17-4436-4452-a266-65d57a67779d" containerID="5d3d1dc44ccbb08890a3ce1b240bf10ed759ba813d70174df4fde7b16fbc8eff" exitCode=0
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.593182 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5f745c6cff-9rkw7" event={"ID":"40dc7e17-4436-4452-a266-65d57a67779d","Type":"ContainerDied","Data":"5d3d1dc44ccbb08890a3ce1b240bf10ed759ba813d70174df4fde7b16fbc8eff"}
Oct 03 13:16:32 crc kubenswrapper[4962]: W1003 13:16:32.596164 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode09f26ad_247c_477a_9d73_a2a0f8df91e8.slice/crio-225b6cc28a0704912e8149bfafa358990703f143e731a1f8550a182b3c5d8e78 WatchSource:0}: Error finding container 225b6cc28a0704912e8149bfafa358990703f143e731a1f8550a182b3c5d8e78: Status 404 returned error can't find the container with id 225b6cc28a0704912e8149bfafa358990703f143e731a1f8550a182b3c5d8e78
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.623823 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-9h8s9"]
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.628300 4962 generic.go:334] "Generic (PLEG): container finished" podID="009b2959-1113-4574-a2ec-90bbe2d8f8ef" containerID="8cc49fd9ef4981aeae1d009b88725c8dbebd9a3f0713241e71542b8306508bad" exitCode=0
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.632360 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.628384 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"009b2959-1113-4574-a2ec-90bbe2d8f8ef","Type":"ContainerDied","Data":"8cc49fd9ef4981aeae1d009b88725c8dbebd9a3f0713241e71542b8306508bad"}
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.638382 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"009b2959-1113-4574-a2ec-90bbe2d8f8ef","Type":"ContainerDied","Data":"639573e4e77df17686b8a4d8a374499b9c547d73dad7ab52276d6246560eb3cc"}
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.649475 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron8f54-account-delete-pznh9" event={"ID":"56923e91-36c0-432d-8042-138d2e89eb3b","Type":"ContainerStarted","Data":"31fff28172e9a5433374b5fa939f109a51e1fc927c4a63e107621de14afc61fe"}
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.657829 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.662563 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.672202 4962 generic.go:334] "Generic (PLEG): container finished" podID="cea3d32c-24c3-4a80-a1fb-ad65be7bbba6" containerID="ac73c42dc924c54bc8c349e88f72de0cd595955349b0de465efb0b5629a1c596" exitCode=143
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.672351 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cea3d32c-24c3-4a80-a1fb-ad65be7bbba6","Type":"ContainerDied","Data":"ac73c42dc924c54bc8c349e88f72de0cd595955349b0de465efb0b5629a1c596"}
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.684961 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance2f72-account-delete-wdbvb" event={"ID":"cfeca1b1-fa87-4490-9e99-38e60d421138","Type":"ContainerStarted","Data":"c8626f8f86f5035f420dc9fa6d84b13cad4be61ca2d245dca16886f08955bc63"}
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.695950 4962 generic.go:334] "Generic (PLEG): container finished" podID="75210a15-c36f-4be9-9709-ceb4eb2c4646" containerID="03c2d3455408832765cd2d68dafdea03cfe1c3257309db0900840a113095f0b9" exitCode=137
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.696174 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.702054 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance2f72-account-delete-wdbvb" podStartSLOduration=3.702040117 podStartE2EDuration="3.702040117s" podCreationTimestamp="2025-10-03 13:16:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:16:32.700528436 +0000 UTC m=+1601.104426261" watchObservedRunningTime="2025-10-03 13:16:32.702040117 +0000 UTC m=+1601.105937952"
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.709568 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-94dnt"
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.710988 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-94dnt" event={"ID":"eb190059-74a6-4ffe-88a4-5fcfd46812a0","Type":"ContainerDied","Data":"479934aec1053530e9faa0a7258b4f7a2080621861a3348a32af527c06535648"}
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.713900 4962 scope.go:117] "RemoveContainer" containerID="d2fb6e730baadf5cce5c8d3a7e70507b921c0b75296b297850b5803cf0722b8a"
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.725187 4962 generic.go:334] "Generic (PLEG): container finished" podID="3c111271-43ed-48b3-b6ed-a6d02efb9113" containerID="86b2ecabe6ed78973d278885e95f62a78704e8d2b70f094254e52932b6e6c618" exitCode=143
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.725282 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-bd6989694-qnv2s" event={"ID":"3c111271-43ed-48b3-b6ed-a6d02efb9113","Type":"ContainerDied","Data":"86b2ecabe6ed78973d278885e95f62a78704e8d2b70f094254e52932b6e6c618"}
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.734483 4962 generic.go:334] "Generic (PLEG): container finished" podID="b0da1427-1e89-42d6-beb2-55f292945177" containerID="b27da2f01290ecf61072efad87218e000a6819ad0aac516d4d56189f22787d6c" exitCode=143
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.734588 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b0da1427-1e89-42d6-beb2-55f292945177","Type":"ContainerDied","Data":"b27da2f01290ecf61072efad87218e000a6819ad0aac516d4d56189f22787d6c"}
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.734611 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7c44799d88-mmmm6"
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.734673 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.800224 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapi6963-account-delete-mg78g"]
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.845133 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75210a15-c36f-4be9-9709-ceb4eb2c4646-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "75210a15-c36f-4be9-9709-ceb4eb2c4646" (UID: "75210a15-c36f-4be9-9709-ceb4eb2c4646"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 13:16:32 crc kubenswrapper[4962]: E1003 13:16:32.866844 4962 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.873104 4962 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/75210a15-c36f-4be9-9709-ceb4eb2c4646-openstack-config\") on node \"crc\" DevicePath \"\""
Oct 03 13:16:32 crc kubenswrapper[4962]: E1003 13:16:32.873248 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/221bdd26-0fec-49e5-86ec-c2aefe7a5902-config-data podName:221bdd26-0fec-49e5-86ec-c2aefe7a5902 nodeName:}" failed. No retries permitted until 2025-10-03 13:16:36.873218565 +0000 UTC m=+1605.277116400 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/221bdd26-0fec-49e5-86ec-c2aefe7a5902-config-data") pod "rabbitmq-cell1-server-0" (UID: "221bdd26-0fec-49e5-86ec-c2aefe7a5902") : configmap "rabbitmq-cell1-config-data" not found
Oct 03 13:16:32 crc kubenswrapper[4962]: E1003 13:16:32.914743 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d519de371641e2951bd9f81ed67c53fa2f69a9d44a2a9b5275e2a6772663e005" cmd=["/usr/local/bin/container-scripts/status_check.sh"]
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.920857 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-68b6c975-4cb8j" podUID="05f2e935-e9b5-49ab-8a2a-30b15840bae9" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.158:8080/healthcheck\": dial tcp 10.217.0.158:8080: connect: connection refused"
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.920949 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-68b6c975-4cb8j" podUID="05f2e935-e9b5-49ab-8a2a-30b15840bae9" containerName="proxy-server" probeResult="failure" output="Get \"https://10.217.0.158:8080/healthcheck\": dial tcp 10.217.0.158:8080: connect: connection refused"
Oct 03 13:16:32 crc kubenswrapper[4962]: E1003 13:16:32.937088 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d519de371641e2951bd9f81ed67c53fa2f69a9d44a2a9b5275e2a6772663e005" cmd=["/usr/local/bin/container-scripts/status_check.sh"]
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.950822 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75210a15-c36f-4be9-9709-ceb4eb2c4646-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "75210a15-c36f-4be9-9709-ceb4eb2c4646" (UID: "75210a15-c36f-4be9-9709-ceb4eb2c4646"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.951721 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75210a15-c36f-4be9-9709-ceb4eb2c4646-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "75210a15-c36f-4be9-9709-ceb4eb2c4646" (UID: "75210a15-c36f-4be9-9709-ceb4eb2c4646"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 13:16:32 crc kubenswrapper[4962]: E1003 13:16:32.966893 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d519de371641e2951bd9f81ed67c53fa2f69a9d44a2a9b5275e2a6772663e005" cmd=["/usr/local/bin/container-scripts/status_check.sh"]
Oct 03 13:16:32 crc kubenswrapper[4962]: E1003 13:16:32.966959 4962 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="5c876ef6-c8ab-44d1-9ba4-07f0b5e07695" containerName="ovn-northd"
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.980864 4962 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/75210a15-c36f-4be9-9709-ceb4eb2c4646-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Oct 03 13:16:32 crc kubenswrapper[4962]: I1003 13:16:32.980904 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75210a15-c36f-4be9-9709-ceb4eb2c4646-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.065492 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/009b2959-1113-4574-a2ec-90bbe2d8f8ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "009b2959-1113-4574-a2ec-90bbe2d8f8ef" (UID: "009b2959-1113-4574-a2ec-90bbe2d8f8ef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.083431 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/009b2959-1113-4574-a2ec-90bbe2d8f8ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 13:16:33 crc kubenswrapper[4962]: E1003 13:16:33.083528 4962 configmap.go:193] Couldn't get configMap openstack/ovncontroller-scripts: configmap "ovncontroller-scripts" not found
Oct 03 13:16:33 crc kubenswrapper[4962]: E1003 13:16:33.083586 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6d6f62dd-0720-46b6-b0a8-497490f052a8-scripts podName:6d6f62dd-0720-46b6-b0a8-497490f052a8 nodeName:}" failed. No retries permitted until 2025-10-03 13:16:37.083567065 +0000 UTC m=+1605.487464900 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/configmap/6d6f62dd-0720-46b6-b0a8-497490f052a8-scripts") pod "ovn-controller-6sqdm" (UID: "6d6f62dd-0720-46b6-b0a8-497490f052a8") : configmap "ovncontroller-scripts" not found
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.093459 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/009b2959-1113-4574-a2ec-90bbe2d8f8ef-config-data" (OuterVolumeSpecName: "config-data") pod "009b2959-1113-4574-a2ec-90bbe2d8f8ef" (UID: "009b2959-1113-4574-a2ec-90bbe2d8f8ef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.191783 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/009b2959-1113-4574-a2ec-90bbe2d8f8ef-config-data\") on node \"crc\" DevicePath \"\""
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.372601 4962 scope.go:117] "RemoveContainer" containerID="40661f3cc34a0a76e5aee737f7eb31eca4d4e1e703df5b2bf55b0cef327c7f85"
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.398917 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-config-data\") pod \"barbican-api-7c44799d88-mmmm6\" (UID: \"dab0e7ec-9c64-491d-a655-027098042378\") " pod="openstack/barbican-api-7c44799d88-mmmm6"
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.398994 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bx2f\" (UniqueName: \"kubernetes.io/projected/dab0e7ec-9c64-491d-a655-027098042378-kube-api-access-9bx2f\") pod \"barbican-api-7c44799d88-mmmm6\" (UID: \"dab0e7ec-9c64-491d-a655-027098042378\") " pod="openstack/barbican-api-7c44799d88-mmmm6"
Oct 03 13:16:33 crc kubenswrapper[4962]: E1003 13:16:33.399680 4962 secret.go:188] Couldn't get secret openstack/barbican-config-data: secret "barbican-config-data" not found
Oct 03 13:16:33 crc kubenswrapper[4962]: E1003 13:16:33.399740 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-config-data podName:dab0e7ec-9c64-491d-a655-027098042378 nodeName:}" failed. No retries permitted until 2025-10-03 13:16:37.399725576 +0000 UTC m=+1605.803623411 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-config-data") pod "barbican-api-7c44799d88-mmmm6" (UID: "dab0e7ec-9c64-491d-a655-027098042378") : secret "barbican-config-data" not found
Oct 03 13:16:33 crc kubenswrapper[4962]: E1003 13:16:33.403394 4962 projected.go:194] Error preparing data for projected volume kube-api-access-9bx2f for pod openstack/barbican-api-7c44799d88-mmmm6: failed to fetch token: serviceaccounts "barbican-barbican" not found
Oct 03 13:16:33 crc kubenswrapper[4962]: E1003 13:16:33.403469 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dab0e7ec-9c64-491d-a655-027098042378-kube-api-access-9bx2f podName:dab0e7ec-9c64-491d-a655-027098042378 nodeName:}" failed. No retries permitted until 2025-10-03 13:16:37.403450376 +0000 UTC m=+1605.807348211 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-9bx2f" (UniqueName: "kubernetes.io/projected/dab0e7ec-9c64-491d-a655-027098042378-kube-api-access-9bx2f") pod "barbican-api-7c44799d88-mmmm6" (UID: "dab0e7ec-9c64-491d-a655-027098042378") : failed to fetch token: serviceaccounts "barbican-barbican" not found
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.459646 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7c44799d88-mmmm6"
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.479786 4962 scope.go:117] "RemoveContainer" containerID="ef900e494d8a0abde750256fdbb2b7f39a5a8f037757e8bc1a381d77522fc261"
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.497679 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.500238 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-internal-tls-certs\") pod \"dab0e7ec-9c64-491d-a655-027098042378\" (UID: \"dab0e7ec-9c64-491d-a655-027098042378\") "
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.500321 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dab0e7ec-9c64-491d-a655-027098042378-logs\") pod \"dab0e7ec-9c64-491d-a655-027098042378\" (UID: \"dab0e7ec-9c64-491d-a655-027098042378\") "
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.500369 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-combined-ca-bundle\") pod \"dab0e7ec-9c64-491d-a655-027098042378\" (UID: \"dab0e7ec-9c64-491d-a655-027098042378\") "
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.500434 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-public-tls-certs\") pod \"dab0e7ec-9c64-491d-a655-027098042378\" (UID: \"dab0e7ec-9c64-491d-a655-027098042378\") "
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.500466 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-config-data-custom\") pod \"dab0e7ec-9c64-491d-a655-027098042378\" (UID: \"dab0e7ec-9c64-491d-a655-027098042378\") "
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.501138 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dab0e7ec-9c64-491d-a655-027098042378-logs" (OuterVolumeSpecName: "logs") pod "dab0e7ec-9c64-491d-a655-027098042378" (UID: "dab0e7ec-9c64-491d-a655-027098042378"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.525747 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "dab0e7ec-9c64-491d-a655-027098042378" (UID: "dab0e7ec-9c64-491d-a655-027098042378"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.529899 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "dab0e7ec-9c64-491d-a655-027098042378" (UID: "dab0e7ec-9c64-491d-a655-027098042378"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.534721 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "dab0e7ec-9c64-491d-a655-027098042378" (UID: "dab0e7ec-9c64-491d-a655-027098042378"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.546912 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.562474 4962 scope.go:117] "RemoveContainer" containerID="8cc49fd9ef4981aeae1d009b88725c8dbebd9a3f0713241e71542b8306508bad"
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.566765 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dab0e7ec-9c64-491d-a655-027098042378" (UID: "dab0e7ec-9c64-491d-a655-027098042378"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.574793 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.581409 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.601533 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85710c21-98fe-4148-8ef1-ec9f4e9ef311-combined-ca-bundle\") pod \"85710c21-98fe-4148-8ef1-ec9f4e9ef311\" (UID: \"85710c21-98fe-4148-8ef1-ec9f4e9ef311\") "
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.601686 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/85710c21-98fe-4148-8ef1-ec9f4e9ef311-vencrypt-tls-certs\") pod \"85710c21-98fe-4148-8ef1-ec9f4e9ef311\" (UID: \"85710c21-98fe-4148-8ef1-ec9f4e9ef311\") "
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.602031 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56wrb\" (UniqueName: \"kubernetes.io/projected/85710c21-98fe-4148-8ef1-ec9f4e9ef311-kube-api-access-56wrb\") pod \"85710c21-98fe-4148-8ef1-ec9f4e9ef311\" (UID: \"85710c21-98fe-4148-8ef1-ec9f4e9ef311\") "
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.602072 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85710c21-98fe-4148-8ef1-ec9f4e9ef311-config-data\") pod \"85710c21-98fe-4148-8ef1-ec9f4e9ef311\" (UID: \"85710c21-98fe-4148-8ef1-ec9f4e9ef311\") "
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.602098 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/85710c21-98fe-4148-8ef1-ec9f4e9ef311-nova-novncproxy-tls-certs\") pod \"85710c21-98fe-4148-8ef1-ec9f4e9ef311\" (UID: \"85710c21-98fe-4148-8ef1-ec9f4e9ef311\") "
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.602583 4962 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.602601 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dab0e7ec-9c64-491d-a655-027098042378-logs\") on node \"crc\" DevicePath \"\""
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.602616 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.602627 4962 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-public-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.602658 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-config-data-custom\") on node \"crc\" DevicePath \"\""
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.604426 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.617129 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-77d888f4df-52rjc"
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.619787 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-94dnt"]
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.624081 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85710c21-98fe-4148-8ef1-ec9f4e9ef311-kube-api-access-56wrb" (OuterVolumeSpecName: "kube-api-access-56wrb") pod "85710c21-98fe-4148-8ef1-ec9f4e9ef311" (UID: "85710c21-98fe-4148-8ef1-ec9f4e9ef311"). InnerVolumeSpecName "kube-api-access-56wrb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.631381 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-94dnt"]
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.670878 4962 scope.go:117] "RemoveContainer" containerID="ef900e494d8a0abde750256fdbb2b7f39a5a8f037757e8bc1a381d77522fc261"
Oct 03 13:16:33 crc kubenswrapper[4962]: E1003 13:16:33.671276 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef900e494d8a0abde750256fdbb2b7f39a5a8f037757e8bc1a381d77522fc261\": container with ID starting with ef900e494d8a0abde750256fdbb2b7f39a5a8f037757e8bc1a381d77522fc261 not found: ID does not exist" containerID="ef900e494d8a0abde750256fdbb2b7f39a5a8f037757e8bc1a381d77522fc261"
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.671334 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef900e494d8a0abde750256fdbb2b7f39a5a8f037757e8bc1a381d77522fc261"} err="failed to get container status \"ef900e494d8a0abde750256fdbb2b7f39a5a8f037757e8bc1a381d77522fc261\": rpc error: code = NotFound desc = could not find container \"ef900e494d8a0abde750256fdbb2b7f39a5a8f037757e8bc1a381d77522fc261\": container with ID starting with ef900e494d8a0abde750256fdbb2b7f39a5a8f037757e8bc1a381d77522fc261 not found: ID does not exist"
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.671377 4962 scope.go:117] "RemoveContainer" containerID="8cc49fd9ef4981aeae1d009b88725c8dbebd9a3f0713241e71542b8306508bad"
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.671341 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Oct 03 13:16:33 crc kubenswrapper[4962]: E1003 13:16:33.672150 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cc49fd9ef4981aeae1d009b88725c8dbebd9a3f0713241e71542b8306508bad\": container with ID starting with 8cc49fd9ef4981aeae1d009b88725c8dbebd9a3f0713241e71542b8306508bad not found: ID does not exist" containerID="8cc49fd9ef4981aeae1d009b88725c8dbebd9a3f0713241e71542b8306508bad"
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.672179 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cc49fd9ef4981aeae1d009b88725c8dbebd9a3f0713241e71542b8306508bad"} err="failed to get container status \"8cc49fd9ef4981aeae1d009b88725c8dbebd9a3f0713241e71542b8306508bad\": rpc error: code = NotFound desc = could not find container \"8cc49fd9ef4981aeae1d009b88725c8dbebd9a3f0713241e71542b8306508bad\": container with ID starting with 8cc49fd9ef4981aeae1d009b88725c8dbebd9a3f0713241e71542b8306508bad not found: ID does not exist"
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.672195 4962 scope.go:117] "RemoveContainer" containerID="03c2d3455408832765cd2d68dafdea03cfe1c3257309db0900840a113095f0b9"
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.708093 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56wrb\" (UniqueName: \"kubernetes.io/projected/85710c21-98fe-4148-8ef1-ec9f4e9ef311-kube-api-access-56wrb\") on node \"crc\" DevicePath \"\""
Oct 03 13:16:33 crc kubenswrapper[4962]: E1003 13:16:33.708189 4962 secret.go:188] Couldn't get secret openstack/barbican-config-data: secret "barbican-config-data" not found
Oct 03 13:16:33 crc kubenswrapper[4962]: E1003 13:16:33.708232 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ae87940-f07d-4213-bc0b-da0b3a2bba84-config-data podName:0ae87940-f07d-4213-bc0b-da0b3a2bba84 nodeName:}" failed. No retries permitted until 2025-10-03 13:16:37.708217522 +0000 UTC m=+1606.112115357 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/0ae87940-f07d-4213-bc0b-da0b3a2bba84-config-data") pod "barbican-worker-c97f5c65f-s279k" (UID: "0ae87940-f07d-4213-bc0b-da0b3a2bba84") : secret "barbican-config-data" not found
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.743725 4962 scope.go:117] "RemoveContainer" containerID="03c2d3455408832765cd2d68dafdea03cfe1c3257309db0900840a113095f0b9"
Oct 03 13:16:33 crc kubenswrapper[4962]: E1003 13:16:33.747254 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03c2d3455408832765cd2d68dafdea03cfe1c3257309db0900840a113095f0b9\": container with ID starting with 03c2d3455408832765cd2d68dafdea03cfe1c3257309db0900840a113095f0b9 not found: ID does not exist" containerID="03c2d3455408832765cd2d68dafdea03cfe1c3257309db0900840a113095f0b9"
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.747388 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03c2d3455408832765cd2d68dafdea03cfe1c3257309db0900840a113095f0b9"} err="failed to get container status \"03c2d3455408832765cd2d68dafdea03cfe1c3257309db0900840a113095f0b9\": rpc error: code = NotFound desc = could not find container \"03c2d3455408832765cd2d68dafdea03cfe1c3257309db0900840a113095f0b9\": container with ID starting with 03c2d3455408832765cd2d68dafdea03cfe1c3257309db0900840a113095f0b9 not found: ID does not exist"
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.747464 4962 scope.go:117] "RemoveContainer" containerID="ad57452006db6a8d5f23250d941709e3e1778f52c203a43209011704610ba216"
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.750037 4962 generic.go:334] "Generic (PLEG): container finished" podID="d36308a0-1b17-4986-adb2-2833b444a239" containerID="d52557ff30e196f1185add245797713d5c9f8bef9d3167e6d50a37017d0126f3" exitCode=0
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.750099 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d36308a0-1b17-4986-adb2-2833b444a239","Type":"ContainerDied","Data":"d52557ff30e196f1185add245797713d5c9f8bef9d3167e6d50a37017d0126f3"}
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.751867 4962 generic.go:334] "Generic (PLEG): container finished" podID="2ecb3944-c441-4879-8220-aa32d7436c1f" containerID="eddb0fb647a83d84a8dbf9083c9e196483cf75a4bb08e36e17e23de03fd5c34b" exitCode=0
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.751910 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-77d888f4df-52rjc" event={"ID":"2ecb3944-c441-4879-8220-aa32d7436c1f","Type":"ContainerDied","Data":"eddb0fb647a83d84a8dbf9083c9e196483cf75a4bb08e36e17e23de03fd5c34b"}
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.751926 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-77d888f4df-52rjc" event={"ID":"2ecb3944-c441-4879-8220-aa32d7436c1f","Type":"ContainerDied","Data":"2735c0850d8792835e645f3117d147b10140f11b42d72148725ebea36cd84d25"}
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.751973 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-77d888f4df-52rjc"
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.753458 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder4662-account-delete-kl2sp" event={"ID":"8e098e6f-ec3b-41e6-b179-6c196ad1fe49","Type":"ContainerStarted","Data":"a09641ede0249c5cad544c7f58a811fadfd856c6fc2863b05d5cd799ddd811c5"}
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.754458 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0b903-account-delete-dqf57" event={"ID":"e09f26ad-247c-477a-9d73-a2a0f8df91e8","Type":"ContainerStarted","Data":"2240cfada8ca479d27ece7a6b38dc1fd2113be8f62a10285d1e6b5342298ed72"}
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.754482 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0b903-account-delete-dqf57" event={"ID":"e09f26ad-247c-477a-9d73-a2a0f8df91e8","Type":"ContainerStarted","Data":"225b6cc28a0704912e8149bfafa358990703f143e731a1f8550a182b3c5d8e78"}
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.754684 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/novacell0b903-account-delete-dqf57" podUID="e09f26ad-247c-477a-9d73-a2a0f8df91e8" containerName="mariadb-account-delete" containerID="cri-o://2240cfada8ca479d27ece7a6b38dc1fd2113be8f62a10285d1e6b5342298ed72" gracePeriod=30
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.760794 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="6ae29e17-1d99-4401-a317-9c8b7be58a3c" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.166:8776/healthcheck\": read tcp 10.217.0.2:54694->10.217.0.166:8776: read: connection reset by peer"
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.765455 4962 generic.go:334] "Generic (PLEG): container finished" podID="56923e91-36c0-432d-8042-138d2e89eb3b" containerID="31aad6feef087c421df422ff28c19e306f421979c28e952513ba254522aac910" exitCode=0
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.765514 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron8f54-account-delete-pznh9" event={"ID":"56923e91-36c0-432d-8042-138d2e89eb3b","Type":"ContainerDied","Data":"31aad6feef087c421df422ff28c19e306f421979c28e952513ba254522aac910"}
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.772784 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-59bf856dfd-t86xg" event={"ID":"e6f0fc0a-ae8e-445e-ad05-591b7ab00886","Type":"ContainerStarted","Data":"e9d5e1c76351743f9337f95aaf1cd1c6eb52345b3f44a6356123c4643e580e73"}
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.781304 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-c97f5c65f-s279k" event={"ID":"0ae87940-f07d-4213-bc0b-da0b3a2bba84","Type":"ContainerStarted","Data":"ebd303224ee1ba6219baa644b264ad83840c2dfc146c5958bf9efcdb7bba20b4"}
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.785080 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.785358 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="15ad9c69-05d8-4b75-82cc-f23f6303d7d7" containerName="ceilometer-central-agent" containerID="cri-o://5a1f2f720e928bb6b6acb1c0a85af470d0667de88a2c7df54ede75a00de60204" gracePeriod=30
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.785470 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="15ad9c69-05d8-4b75-82cc-f23f6303d7d7" containerName="proxy-httpd" containerID="cri-o://cb12fdcf72cb818439ed6c57d7c01f490985bcb4351a8ce3800533ddd1e0259f" gracePeriod=30
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.785525 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="15ad9c69-05d8-4b75-82cc-f23f6303d7d7" containerName="sg-core" containerID="cri-o://f257bc79eebd262ca3fa0048575136a5f530237c8f848c61fcbe3df34711993b" gracePeriod=30
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.785568 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="15ad9c69-05d8-4b75-82cc-f23f6303d7d7" containerName="ceilometer-notification-agent" containerID="cri-o://0daf5bb19acb882ee4245cd2958c71e3acf61abcdee9f27764c1a937ef9e54d3" gracePeriod=30
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.790207 4962 generic.go:334] "Generic (PLEG): container finished" podID="85710c21-98fe-4148-8ef1-ec9f4e9ef311" containerID="cce61d9b927002bd5d5d741af0d9a03b88958f9b494160698ba0a870465f6ee7" exitCode=0
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.790290 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"85710c21-98fe-4148-8ef1-ec9f4e9ef311","Type":"ContainerDied","Data":"cce61d9b927002bd5d5d741af0d9a03b88958f9b494160698ba0a870465f6ee7"}
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.790318 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"85710c21-98fe-4148-8ef1-ec9f4e9ef311","Type":"ContainerDied","Data":"806dd82677fa3c9d4b0fca14d063230ffd0dd47696756580d168858f6efe8e8c"}
Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.790405 4962 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.793689 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/novacell0b903-account-delete-dqf57" podStartSLOduration=4.793672997 podStartE2EDuration="4.793672997s" podCreationTimestamp="2025-10-03 13:16:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:16:33.792014033 +0000 UTC m=+1602.195911868" watchObservedRunningTime="2025-10-03 13:16:33.793672997 +0000 UTC m=+1602.197570832" Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.818196 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgw4z\" (UniqueName: \"kubernetes.io/projected/2ecb3944-c441-4879-8220-aa32d7436c1f-kube-api-access-bgw4z\") pod \"2ecb3944-c441-4879-8220-aa32d7436c1f\" (UID: \"2ecb3944-c441-4879-8220-aa32d7436c1f\") " Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.818241 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ecb3944-c441-4879-8220-aa32d7436c1f-combined-ca-bundle\") pod \"2ecb3944-c441-4879-8220-aa32d7436c1f\" (UID: \"2ecb3944-c441-4879-8220-aa32d7436c1f\") " Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.818269 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/438da193-7b02-4101-a45c-9e0f83c41051-kolla-config\") pod \"438da193-7b02-4101-a45c-9e0f83c41051\" (UID: \"438da193-7b02-4101-a45c-9e0f83c41051\") " Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.818289 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2ecb3944-c441-4879-8220-aa32d7436c1f-config-data-custom\") pod \"2ecb3944-c441-4879-8220-aa32d7436c1f\" (UID: \"2ecb3944-c441-4879-8220-aa32d7436c1f\") " Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.818309 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ecb3944-c441-4879-8220-aa32d7436c1f-config-data\") pod \"2ecb3944-c441-4879-8220-aa32d7436c1f\" (UID: \"2ecb3944-c441-4879-8220-aa32d7436c1f\") " Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.818389 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/438da193-7b02-4101-a45c-9e0f83c41051-galera-tls-certs\") pod \"438da193-7b02-4101-a45c-9e0f83c41051\" (UID: \"438da193-7b02-4101-a45c-9e0f83c41051\") " Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.818434 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/438da193-7b02-4101-a45c-9e0f83c41051-config-data-default\") pod \"438da193-7b02-4101-a45c-9e0f83c41051\" (UID: \"438da193-7b02-4101-a45c-9e0f83c41051\") " Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.818478 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/438da193-7b02-4101-a45c-9e0f83c41051-operator-scripts\") pod \"438da193-7b02-4101-a45c-9e0f83c41051\" (UID: \"438da193-7b02-4101-a45c-9e0f83c41051\") " Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 
13:16:33.818509 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/438da193-7b02-4101-a45c-9e0f83c41051-secrets\") pod \"438da193-7b02-4101-a45c-9e0f83c41051\" (UID: \"438da193-7b02-4101-a45c-9e0f83c41051\") " Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.818528 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"438da193-7b02-4101-a45c-9e0f83c41051\" (UID: \"438da193-7b02-4101-a45c-9e0f83c41051\") " Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.818595 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/438da193-7b02-4101-a45c-9e0f83c41051-config-data-generated\") pod \"438da193-7b02-4101-a45c-9e0f83c41051\" (UID: \"438da193-7b02-4101-a45c-9e0f83c41051\") " Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.818623 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvjbg\" (UniqueName: \"kubernetes.io/projected/438da193-7b02-4101-a45c-9e0f83c41051-kube-api-access-vvjbg\") pod \"438da193-7b02-4101-a45c-9e0f83c41051\" (UID: \"438da193-7b02-4101-a45c-9e0f83c41051\") " Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.818675 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438da193-7b02-4101-a45c-9e0f83c41051-combined-ca-bundle\") pod \"438da193-7b02-4101-a45c-9e0f83c41051\" (UID: \"438da193-7b02-4101-a45c-9e0f83c41051\") " Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.818736 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ecb3944-c441-4879-8220-aa32d7436c1f-logs\") pod \"2ecb3944-c441-4879-8220-aa32d7436c1f\" (UID: \"2ecb3944-c441-4879-8220-aa32d7436c1f\") " Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.821320 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ecb3944-c441-4879-8220-aa32d7436c1f-logs" (OuterVolumeSpecName: "logs") pod "2ecb3944-c441-4879-8220-aa32d7436c1f" (UID: "2ecb3944-c441-4879-8220-aa32d7436c1f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.823288 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/438da193-7b02-4101-a45c-9e0f83c41051-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "438da193-7b02-4101-a45c-9e0f83c41051" (UID: "438da193-7b02-4101-a45c-9e0f83c41051"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.823521 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/438da193-7b02-4101-a45c-9e0f83c41051-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "438da193-7b02-4101-a45c-9e0f83c41051" (UID: "438da193-7b02-4101-a45c-9e0f83c41051"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.828172 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/438da193-7b02-4101-a45c-9e0f83c41051-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "438da193-7b02-4101-a45c-9e0f83c41051" (UID: "438da193-7b02-4101-a45c-9e0f83c41051"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.833857 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/438da193-7b02-4101-a45c-9e0f83c41051-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "438da193-7b02-4101-a45c-9e0f83c41051" (UID: "438da193-7b02-4101-a45c-9e0f83c41051"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.844231 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ecb3944-c441-4879-8220-aa32d7436c1f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2ecb3944-c441-4879-8220-aa32d7436c1f" (UID: "2ecb3944-c441-4879-8220-aa32d7436c1f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.844773 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/438da193-7b02-4101-a45c-9e0f83c41051-secrets" (OuterVolumeSpecName: "secrets") pod "438da193-7b02-4101-a45c-9e0f83c41051" (UID: "438da193-7b02-4101-a45c-9e0f83c41051"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.847214 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ecb3944-c441-4879-8220-aa32d7436c1f-kube-api-access-bgw4z" (OuterVolumeSpecName: "kube-api-access-bgw4z") pod "2ecb3944-c441-4879-8220-aa32d7436c1f" (UID: "2ecb3944-c441-4879-8220-aa32d7436c1f"). InnerVolumeSpecName "kube-api-access-bgw4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.847857 4962 generic.go:334] "Generic (PLEG): container finished" podID="cfeca1b1-fa87-4490-9e99-38e60d421138" containerID="cbf328789db561021439da2a08d5d4001980c89f6b0a094a5fc6f703782ccc9f" exitCode=0 Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.847937 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance2f72-account-delete-wdbvb" event={"ID":"cfeca1b1-fa87-4490-9e99-38e60d421138","Type":"ContainerDied","Data":"cbf328789db561021439da2a08d5d4001980c89f6b0a094a5fc6f703782ccc9f"} Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.853330 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/438da193-7b02-4101-a45c-9e0f83c41051-kube-api-access-vvjbg" (OuterVolumeSpecName: "kube-api-access-vvjbg") pod "438da193-7b02-4101-a45c-9e0f83c41051" (UID: "438da193-7b02-4101-a45c-9e0f83c41051"). InnerVolumeSpecName "kube-api-access-vvjbg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.871232 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "mysql-db") pod "438da193-7b02-4101-a45c-9e0f83c41051" (UID: "438da193-7b02-4101-a45c-9e0f83c41051"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.871330 4962 generic.go:334] "Generic (PLEG): container finished" podID="438da193-7b02-4101-a45c-9e0f83c41051" containerID="975c9c39028f01f58f2aea68725568502425600e3e03782630767e28394af41f" exitCode=0 Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.871462 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.871520 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"438da193-7b02-4101-a45c-9e0f83c41051","Type":"ContainerDied","Data":"975c9c39028f01f58f2aea68725568502425600e3e03782630767e28394af41f"} Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.871542 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"438da193-7b02-4101-a45c-9e0f83c41051","Type":"ContainerDied","Data":"c9f6d0d07695070d1b3ee0e356d4ee2c319299ae22d961918c739d6bca0a5f50"} Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.920575 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ecb3944-c441-4879-8220-aa32d7436c1f-logs\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.920601 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgw4z\" (UniqueName: \"kubernetes.io/projected/2ecb3944-c441-4879-8220-aa32d7436c1f-kube-api-access-bgw4z\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.920611 4962 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/438da193-7b02-4101-a45c-9e0f83c41051-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.920621 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2ecb3944-c441-4879-8220-aa32d7436c1f-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.920645 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/438da193-7b02-4101-a45c-9e0f83c41051-config-data-default\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.920654 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/438da193-7b02-4101-a45c-9e0f83c41051-operator-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.920662 4962 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/438da193-7b02-4101-a45c-9e0f83c41051-secrets\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.920682 4962 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume 
\"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.920691 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/438da193-7b02-4101-a45c-9e0f83c41051-config-data-generated\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.920700 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvjbg\" (UniqueName: \"kubernetes.io/projected/438da193-7b02-4101-a45c-9e0f83c41051-kube-api-access-vvjbg\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.922312 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.922606 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="72df0792-9904-4b64-9c70-37cb982fe24b" containerName="kube-state-metrics" containerID="cri-o://46f6912f8f25e900b01e11da327d2482ee9d18c20a5ec8e5af15a90619d45612" gracePeriod=30 Oct 03 13:16:33 crc kubenswrapper[4962]: E1003 13:16:33.929405 4962 secret.go:188] Couldn't get secret openstack/barbican-config-data: secret "barbican-config-data" not found Oct 03 13:16:33 crc kubenswrapper[4962]: E1003 13:16:33.929491 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-config-data podName:e6f0fc0a-ae8e-445e-ad05-591b7ab00886 nodeName:}" failed. No retries permitted until 2025-10-03 13:16:37.929469075 +0000 UTC m=+1606.333366910 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-config-data") pod "barbican-keystone-listener-59bf856dfd-t86xg" (UID: "e6f0fc0a-ae8e-445e-ad05-591b7ab00886") : secret "barbican-config-data" not found Oct 03 13:16:33 crc kubenswrapper[4962]: E1003 13:16:33.929566 4962 secret.go:188] Couldn't get secret openstack/barbican-keystone-listener-config-data: secret "barbican-keystone-listener-config-data" not found Oct 03 13:16:33 crc kubenswrapper[4962]: E1003 13:16:33.929599 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-config-data-custom podName:e6f0fc0a-ae8e-445e-ad05-591b7ab00886 nodeName:}" failed. No retries permitted until 2025-10-03 13:16:37.929589828 +0000 UTC m=+1606.333487673 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-config-data-custom") pod "barbican-keystone-listener-59bf856dfd-t86xg" (UID: "e6f0fc0a-ae8e-445e-ad05-591b7ab00886") : secret "barbican-keystone-listener-config-data" not found Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.929917 4962 generic.go:334] "Generic (PLEG): container finished" podID="05f2e935-e9b5-49ab-8a2a-30b15840bae9" containerID="036194e68dbb515945afa3ad089ad8f4474610c770e29c2e3ac03647eae66d7d" exitCode=0 Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.929941 4962 generic.go:334] "Generic (PLEG): container finished" podID="05f2e935-e9b5-49ab-8a2a-30b15840bae9" containerID="fde2842fbb361eeef96c1f37f5ca7accd441dd0d4530fb4fc8d6c4be4392db6f" exitCode=0 Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.930016 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-68b6c975-4cb8j" event={"ID":"05f2e935-e9b5-49ab-8a2a-30b15840bae9","Type":"ContainerDied","Data":"036194e68dbb515945afa3ad089ad8f4474610c770e29c2e3ac03647eae66d7d"} Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.930047 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-68b6c975-4cb8j" event={"ID":"05f2e935-e9b5-49ab-8a2a-30b15840bae9","Type":"ContainerDied","Data":"fde2842fbb361eeef96c1f37f5ca7accd441dd0d4530fb4fc8d6c4be4392db6f"} Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.932842 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi6963-account-delete-mg78g" event={"ID":"1b763061-bb23-4c23-a4ec-bebac231c603","Type":"ContainerStarted","Data":"0a87238a1f9f9b006b3fcededc25d64aae5534de53bb9a57f44dd86afd263a1c"} Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.940143 4962 generic.go:334] "Generic (PLEG): container finished" podID="b42a368b-6dd4-4bb0-83a8-d79138605ec9" containerID="c5d4d124b101cf04b221f4e6263424d582db30137ac6b55d395398aca8b14c29" exitCode=0 Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.941624 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7c44799d88-mmmm6" Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.940810 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement453e-account-delete-v68p5" event={"ID":"b42a368b-6dd4-4bb0-83a8-d79138605ec9","Type":"ContainerDied","Data":"c5d4d124b101cf04b221f4e6263424d582db30137ac6b55d395398aca8b14c29"} Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.942207 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Oct 03 13:16:33 crc kubenswrapper[4962]: I1003 13:16:33.942959 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="32e6592a-d206-4931-aa99-a84e041b05e4" containerName="memcached" containerID="cri-o://016f210b8c91d8e9c5f93eeed437c69115693182129858f3582cdcd912c4dd79" gracePeriod=30 Oct 03 13:16:34 crc kubenswrapper[4962]: I1003 13:16:34.012886 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-xgdv7"] Oct 03 13:16:34 crc kubenswrapper[4962]: I1003 13:16:34.034783 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-xgdv7"] Oct 03 13:16:34 crc kubenswrapper[4962]: I1003 13:16:34.046042 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-4fvqx"] Oct 03 13:16:34 crc kubenswrapper[4962]: E1003 13:16:34.046693 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0ce8b4a4ac9d0b8cc21e8feaa51be48f8d9e21fc13fb4c98b22efe982cb9565b" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 03 13:16:34 crc kubenswrapper[4962]: E1003 13:16:34.053401 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0ce8b4a4ac9d0b8cc21e8feaa51be48f8d9e21fc13fb4c98b22efe982cb9565b" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 03 13:16:34 crc kubenswrapper[4962]: I1003 13:16:34.053551 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-4fvqx"] Oct 03 13:16:34 crc kubenswrapper[4962]: E1003 13:16:34.058707 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0ce8b4a4ac9d0b8cc21e8feaa51be48f8d9e21fc13fb4c98b22efe982cb9565b" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 03 13:16:34 crc kubenswrapper[4962]: E1003 13:16:34.058751 4962 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="d22955d6-a957-458f-8181-5fea18cedc90" containerName="nova-cell1-conductor-conductor" Oct 03 13:16:34 crc kubenswrapper[4962]: I1003 13:16:34.061106 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-58d67d97c8-pnjp8"] Oct 03 13:16:34 crc kubenswrapper[4962]: I1003 13:16:34.061394 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-58d67d97c8-pnjp8" podUID="a6cba65d-0ae5-4a81-88c1-da4e07d7a803" containerName="keystone-api" 
containerID="cri-o://3108a39b0723e787d2db8b185df6254591ba7ffd8691d08c951e273ac8405e51" gracePeriod=30 Oct 03 13:16:34 crc kubenswrapper[4962]: I1003 13:16:34.075914 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Oct 03 13:16:34 crc kubenswrapper[4962]: I1003 13:16:34.082767 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-hxdfm"] Oct 03 13:16:34 crc kubenswrapper[4962]: I1003 13:16:34.092736 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-hxdfm"] Oct 03 13:16:34 crc kubenswrapper[4962]: I1003 13:16:34.100657 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-2734-account-create-kbjj5"] Oct 03 13:16:34 crc kubenswrapper[4962]: I1003 13:16:34.107068 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-2734-account-create-kbjj5"] Oct 03 13:16:34 crc kubenswrapper[4962]: I1003 13:16:34.247298 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85710c21-98fe-4148-8ef1-ec9f4e9ef311-config-data" (OuterVolumeSpecName: "config-data") pod "85710c21-98fe-4148-8ef1-ec9f4e9ef311" (UID: "85710c21-98fe-4148-8ef1-ec9f4e9ef311"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:34 crc kubenswrapper[4962]: I1003 13:16:34.247687 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="009b2959-1113-4574-a2ec-90bbe2d8f8ef" path="/var/lib/kubelet/pods/009b2959-1113-4574-a2ec-90bbe2d8f8ef/volumes" Oct 03 13:16:34 crc kubenswrapper[4962]: I1003 13:16:34.249368 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2af174c7-cf23-452c-bc13-ecda2775d58d" path="/var/lib/kubelet/pods/2af174c7-cf23-452c-bc13-ecda2775d58d/volumes" Oct 03 13:16:34 crc kubenswrapper[4962]: I1003 13:16:34.250408 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6313803e-1bf1-4a99-8af7-cb80c0e6321c" path="/var/lib/kubelet/pods/6313803e-1bf1-4a99-8af7-cb80c0e6321c/volumes" Oct 03 13:16:34 crc kubenswrapper[4962]: I1003 13:16:34.253051 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67b77bc9-27ae-4994-86c2-614e48ad33c6" path="/var/lib/kubelet/pods/67b77bc9-27ae-4994-86c2-614e48ad33c6/volumes" Oct 03 13:16:34 crc kubenswrapper[4962]: I1003 13:16:34.259166 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bb7b0d2-9599-4942-90e3-5bfb712724e0" path="/var/lib/kubelet/pods/6bb7b0d2-9599-4942-90e3-5bfb712724e0/volumes" Oct 03 13:16:34 crc kubenswrapper[4962]: I1003 13:16:34.261077 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75210a15-c36f-4be9-9709-ceb4eb2c4646" path="/var/lib/kubelet/pods/75210a15-c36f-4be9-9709-ceb4eb2c4646/volumes" Oct 03 13:16:34 crc kubenswrapper[4962]: I1003 13:16:34.261592 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693" path="/var/lib/kubelet/pods/8e1d50ce-6e3a-4c9f-ae0a-37e698dcc693/volumes" Oct 03 13:16:34 crc kubenswrapper[4962]: I1003 13:16:34.262223 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a79e40bc-548c-44b8-8457-c0b8195c6436" path="/var/lib/kubelet/pods/a79e40bc-548c-44b8-8457-c0b8195c6436/volumes" Oct 03 13:16:34 crc kubenswrapper[4962]: I1003 13:16:34.263143 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3133c1f-476d-440b-977a-9642e1284622" 
path="/var/lib/kubelet/pods/b3133c1f-476d-440b-977a-9642e1284622/volumes" Oct 03 13:16:34 crc kubenswrapper[4962]: I1003 13:16:34.264281 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb190059-74a6-4ffe-88a4-5fcfd46812a0" path="/var/lib/kubelet/pods/eb190059-74a6-4ffe-88a4-5fcfd46812a0/volumes" Oct 03 13:16:34 crc kubenswrapper[4962]: I1003 13:16:34.278854 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ecb3944-c441-4879-8220-aa32d7436c1f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ecb3944-c441-4879-8220-aa32d7436c1f" (UID: "2ecb3944-c441-4879-8220-aa32d7436c1f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:34 crc kubenswrapper[4962]: I1003 13:16:34.340571 4962 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Oct 03 13:16:34 crc kubenswrapper[4962]: I1003 13:16:34.342666 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ecb3944-c441-4879-8220-aa32d7436c1f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:34 crc kubenswrapper[4962]: I1003 13:16:34.342707 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85710c21-98fe-4148-8ef1-ec9f4e9ef311-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:34 crc kubenswrapper[4962]: I1003 13:16:34.342728 4962 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:34 crc kubenswrapper[4962]: E1003 13:16:34.343888 4962 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 03 13:16:34 crc kubenswrapper[4962]: E1003 13:16:34.344012 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/862ad9df-af58-4304-9ad5-7faba334e2d9-config-data podName:862ad9df-af58-4304-9ad5-7faba334e2d9 nodeName:}" failed. No retries permitted until 2025-10-03 13:16:38.343946118 +0000 UTC m=+1606.747843963 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/862ad9df-af58-4304-9ad5-7faba334e2d9-config-data") pod "rabbitmq-server-0" (UID: "862ad9df-af58-4304-9ad5-7faba334e2d9") : configmap "rabbitmq-config-data" not found Oct 03 13:16:34 crc kubenswrapper[4962]: I1003 13:16:34.425009 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85710c21-98fe-4148-8ef1-ec9f4e9ef311-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "85710c21-98fe-4148-8ef1-ec9f4e9ef311" (UID: "85710c21-98fe-4148-8ef1-ec9f4e9ef311"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:34 crc kubenswrapper[4962]: I1003 13:16:34.447012 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85710c21-98fe-4148-8ef1-ec9f4e9ef311-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:34 crc kubenswrapper[4962]: I1003 13:16:34.499752 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85710c21-98fe-4148-8ef1-ec9f4e9ef311-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "85710c21-98fe-4148-8ef1-ec9f4e9ef311" (UID: "85710c21-98fe-4148-8ef1-ec9f4e9ef311"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:34 crc kubenswrapper[4962]: I1003 13:16:34.505399 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/438da193-7b02-4101-a45c-9e0f83c41051-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "438da193-7b02-4101-a45c-9e0f83c41051" (UID: "438da193-7b02-4101-a45c-9e0f83c41051"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:34 crc kubenswrapper[4962]: I1003 13:16:34.536085 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85710c21-98fe-4148-8ef1-ec9f4e9ef311-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "85710c21-98fe-4148-8ef1-ec9f4e9ef311" (UID: "85710c21-98fe-4148-8ef1-ec9f4e9ef311"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:34 crc kubenswrapper[4962]: I1003 13:16:34.549290 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ecb3944-c441-4879-8220-aa32d7436c1f-config-data" (OuterVolumeSpecName: "config-data") pod "2ecb3944-c441-4879-8220-aa32d7436c1f" (UID: "2ecb3944-c441-4879-8220-aa32d7436c1f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:34 crc kubenswrapper[4962]: I1003 13:16:34.549527 4962 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/85710c21-98fe-4148-8ef1-ec9f4e9ef311-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:34 crc kubenswrapper[4962]: I1003 13:16:34.549554 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438da193-7b02-4101-a45c-9e0f83c41051-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:34 crc kubenswrapper[4962]: I1003 13:16:34.549566 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ecb3944-c441-4879-8220-aa32d7436c1f-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:34 crc kubenswrapper[4962]: I1003 13:16:34.549575 4962 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/85710c21-98fe-4148-8ef1-ec9f4e9ef311-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:34 crc kubenswrapper[4962]: I1003 13:16:34.563013 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/438da193-7b02-4101-a45c-9e0f83c41051-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "438da193-7b02-4101-a45c-9e0f83c41051" (UID: "438da193-7b02-4101-a45c-9e0f83c41051"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:34 crc kubenswrapper[4962]: I1003 13:16:34.615988 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="dd269d6d-5aa2-43c0-a23b-e76b52699d59" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": read tcp 10.217.0.2:39262->10.217.0.200:8775: read: connection reset by peer" Oct 03 13:16:34 crc kubenswrapper[4962]: I1003 13:16:34.616334 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="dd269d6d-5aa2-43c0-a23b-e76b52699d59" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": read tcp 10.217.0.2:39250->10.217.0.200:8775: read: connection reset by peer" Oct 03 13:16:34 crc kubenswrapper[4962]: I1003 13:16:34.651368 4962 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/438da193-7b02-4101-a45c-9e0f83c41051-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:34 crc kubenswrapper[4962]: I1003 13:16:34.726553 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="a3fb0456-394e-4041-829b-57c162966b2b" containerName="galera" containerID="cri-o://b8c45a2afce07209a956e277c7796fab60709250ab0d2e737f207d8293e6abac" gracePeriod=30 Oct 03 13:16:34 crc kubenswrapper[4962]: E1003 13:16:34.814133 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d52557ff30e196f1185add245797713d5c9f8bef9d3167e6d50a37017d0126f3 is running failed: container process not found" containerID="d52557ff30e196f1185add245797713d5c9f8bef9d3167e6d50a37017d0126f3" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 13:16:34 crc kubenswrapper[4962]: E1003 13:16:34.818001 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc 
error: code = NotFound desc = container is not created or running: checking if PID of d52557ff30e196f1185add245797713d5c9f8bef9d3167e6d50a37017d0126f3 is running failed: container process not found" containerID="d52557ff30e196f1185add245797713d5c9f8bef9d3167e6d50a37017d0126f3" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 13:16:34 crc kubenswrapper[4962]: E1003 13:16:34.822969 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d52557ff30e196f1185add245797713d5c9f8bef9d3167e6d50a37017d0126f3 is running failed: container process not found" containerID="d52557ff30e196f1185add245797713d5c9f8bef9d3167e6d50a37017d0126f3" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 13:16:34 crc kubenswrapper[4962]: E1003 13:16:34.823001 4962 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d52557ff30e196f1185add245797713d5c9f8bef9d3167e6d50a37017d0126f3 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="d36308a0-1b17-4986-adb2-2833b444a239" containerName="nova-scheduler-scheduler" Oct 03 13:16:34 crc kubenswrapper[4962]: I1003 13:16:34.971657 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/novaapi6963-account-delete-mg78g" podUID="1b763061-bb23-4c23-a4ec-bebac231c603" containerName="mariadb-account-delete" containerID="cri-o://a1e1574d72e2bf15c562df1b3f61c5008f261e0102ffde7db224300931897e5d" gracePeriod=30 Oct 03 13:16:34 crc kubenswrapper[4962]: I1003 13:16:34.989774 4962 generic.go:334] "Generic (PLEG): container finished" podID="1289d443-56d2-4f63-8802-66bcd0569b3b" containerID="b92f6a632cc9c0dff9f450965eee31724d28f079d91c0b8852b080e2ed919e29" exitCode=0 Oct 03 13:16:34 crc kubenswrapper[4962]: I1003 13:16:34.992354 4962 generic.go:334] "Generic (PLEG): container finished" podID="cea3d32c-24c3-4a80-a1fb-ad65be7bbba6" containerID="7377486e2b45b77a0115044df6ada84243926afe00f2d9d4d7481b7f35b3b3cd" exitCode=0 Oct 03 13:16:34 crc kubenswrapper[4962]: I1003 13:16:34.995351 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/novaapi6963-account-delete-mg78g" podStartSLOduration=5.995332784 podStartE2EDuration="5.995332784s" podCreationTimestamp="2025-10-03 13:16:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:16:34.986930259 +0000 UTC m=+1603.390828114" watchObservedRunningTime="2025-10-03 13:16:34.995332784 +0000 UTC m=+1603.399230619" Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.004972 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-c97f5c65f-s279k" podUID="0ae87940-f07d-4213-bc0b-da0b3a2bba84" containerName="barbican-worker-log" containerID="cri-o://ebd303224ee1ba6219baa644b264ad83840c2dfc146c5958bf9efcdb7bba20b4" gracePeriod=30 Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.005474 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-c97f5c65f-s279k" podUID="0ae87940-f07d-4213-bc0b-da0b3a2bba84" containerName="barbican-worker" containerID="cri-o://442d38894b6e701167727dc5d7828dee1159daaa23c0b84571b21a9f1e1aac37" gracePeriod=30 Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.018036 4962 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/barbican-keystone-listener-59bf856dfd-t86xg" podUID="e6f0fc0a-ae8e-445e-ad05-591b7ab00886" containerName="barbican-keystone-listener-log" containerID="cri-o://e9d5e1c76351743f9337f95aaf1cd1c6eb52345b3f44a6356123c4643e580e73" gracePeriod=30 Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.018072 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-59bf856dfd-t86xg" podUID="e6f0fc0a-ae8e-445e-ad05-591b7ab00886" containerName="barbican-keystone-listener" containerID="cri-o://7c748f0bb3a7a35ffcd899979db48222f05a9821400ac3e3dccbbd992dcd6a1a" gracePeriod=30 Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.031855 4962 generic.go:334] "Generic (PLEG): container finished" podID="b0da1427-1e89-42d6-beb2-55f292945177" containerID="c5177f0f305f7d8efd50064ca1ae9320ecca80819662d7f114a735ed39509584" exitCode=0 Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.033676 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-c97f5c65f-s279k" podStartSLOduration=7.033655754 podStartE2EDuration="7.033655754s" podCreationTimestamp="2025-10-03 13:16:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:16:35.023236984 +0000 UTC m=+1603.427134839" watchObservedRunningTime="2025-10-03 13:16:35.033655754 +0000 UTC m=+1603.437553589" Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.044802 4962 generic.go:334] "Generic (PLEG): container finished" podID="72df0792-9904-4b64-9c70-37cb982fe24b" containerID="46f6912f8f25e900b01e11da327d2482ee9d18c20a5ec8e5af15a90619d45612" exitCode=2 Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.049253 4962 generic.go:334] "Generic (PLEG): container finished" podID="15ad9c69-05d8-4b75-82cc-f23f6303d7d7" containerID="cb12fdcf72cb818439ed6c57d7c01f490985bcb4351a8ce3800533ddd1e0259f" exitCode=0 Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.049304 4962 generic.go:334] "Generic (PLEG): container finished" podID="15ad9c69-05d8-4b75-82cc-f23f6303d7d7" containerID="f257bc79eebd262ca3fa0048575136a5f530237c8f848c61fcbe3df34711993b" exitCode=2 Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.049312 4962 generic.go:334] "Generic (PLEG): container finished" podID="15ad9c69-05d8-4b75-82cc-f23f6303d7d7" containerID="5a1f2f720e928bb6b6acb1c0a85af470d0667de88a2c7df54ede75a00de60204" exitCode=0 Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.053150 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-59bf856dfd-t86xg" podStartSLOduration=6.053131247 podStartE2EDuration="6.053131247s" podCreationTimestamp="2025-10-03 13:16:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 13:16:35.046394306 +0000 UTC m=+1603.450292131" watchObservedRunningTime="2025-10-03 13:16:35.053131247 +0000 UTC m=+1603.457029082" Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.055780 4962 generic.go:334] "Generic (PLEG): container finished" podID="6ae29e17-1d99-4401-a317-9c8b7be58a3c" containerID="5338fa6e89e4bd15f9e4db940f00a7a5b8fdf7c12ab81b1de2fe6747f81ea20d" exitCode=0 Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.070461 4962 generic.go:334] "Generic (PLEG): container finished" podID="8e098e6f-ec3b-41e6-b179-6c196ad1fe49" 
containerID="3c734b51211f1947d4f13bde1f8348ca138c6ca1d3cd8d9c8095f5f48ccbdbd0" exitCode=0 Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.122115 4962 generic.go:334] "Generic (PLEG): container finished" podID="d329c4da-aa05-4c80-ab30-622eac56428a" containerID="05b9f6f45511688aab50c27ba2558d43869979d7395ad2e599ba9c06301661e8" exitCode=0 Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.125572 4962 generic.go:334] "Generic (PLEG): container finished" podID="e09f26ad-247c-477a-9d73-a2a0f8df91e8" containerID="2240cfada8ca479d27ece7a6b38dc1fd2113be8f62a10285d1e6b5342298ed72" exitCode=0 Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.143053 4962 generic.go:334] "Generic (PLEG): container finished" podID="dd269d6d-5aa2-43c0-a23b-e76b52699d59" containerID="3e39d9d9f9ef98752b8331bca21aeea43f0d1658741d17c5c7ad0d95aa684075" exitCode=0 Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.382829 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6b7fd754f4-rx9k9" podUID="c56a1d1d-7e30-4bb8-a5a7-068afc055cb8" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.152:9311/healthcheck\": dial tcp 10.217.0.152:9311: connect: connection refused" Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.382832 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6b7fd754f4-rx9k9" podUID="c56a1d1d-7e30-4bb8-a5a7-068afc055cb8" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.152:9311/healthcheck\": dial tcp 10.217.0.152:9311: connect: connection refused" Oct 03 13:16:35 crc kubenswrapper[4962]: E1003 13:16:35.642042 4962 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.416s" Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.642074 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-shsr9"] Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.642098 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi6963-account-delete-mg78g" event={"ID":"1b763061-bb23-4c23-a4ec-bebac231c603","Type":"ContainerStarted","Data":"a1e1574d72e2bf15c562df1b3f61c5008f261e0102ffde7db224300931897e5d"} Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.642150 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6456949cf6-r4n9q" event={"ID":"1289d443-56d2-4f63-8802-66bcd0569b3b","Type":"ContainerDied","Data":"b92f6a632cc9c0dff9f450965eee31724d28f079d91c0b8852b080e2ed919e29"} Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.642167 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6456949cf6-r4n9q" event={"ID":"1289d443-56d2-4f63-8802-66bcd0569b3b","Type":"ContainerDied","Data":"d91e78e7691d71bfdbb7e92f17ddbf0c1bacccfe253711a3a3db928ce9630620"} Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.642177 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d91e78e7691d71bfdbb7e92f17ddbf0c1bacccfe253711a3a3db928ce9630620" Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.642195 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-shsr9"] Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.642210 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement453e-account-delete-v68p5"] Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.642220 4962 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cea3d32c-24c3-4a80-a1fb-ad65be7bbba6","Type":"ContainerDied","Data":"7377486e2b45b77a0115044df6ada84243926afe00f2d9d4d7481b7f35b3b3cd"} Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.642232 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d36308a0-1b17-4986-adb2-2833b444a239","Type":"ContainerDied","Data":"6af4999c095bb3a40cc8d568790a2ef5ff6b6f5d269b92f1cd85d0e38a47af5d"} Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.642242 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6af4999c095bb3a40cc8d568790a2ef5ff6b6f5d269b92f1cd85d0e38a47af5d" Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.642252 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-68b6c975-4cb8j" event={"ID":"05f2e935-e9b5-49ab-8a2a-30b15840bae9","Type":"ContainerDied","Data":"cac01411963bfa7be3e10db04acc0c3eea2c2ef096b36b436336abeb75d82df1"} Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.642261 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cac01411963bfa7be3e10db04acc0c3eea2c2ef096b36b436336abeb75d82df1" Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.642269 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-453e-account-create-krqfz"] Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.642282 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-453e-account-create-krqfz"] Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.642292 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-htqpm"] Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.642302 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-htqpm"] Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.642312 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-c97f5c65f-s279k" event={"ID":"0ae87940-f07d-4213-bc0b-da0b3a2bba84","Type":"ContainerStarted","Data":"442d38894b6e701167727dc5d7828dee1159daaa23c0b84571b21a9f1e1aac37"} Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.642323 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-2f72-account-create-zwf4b"] Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.642333 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-59bf856dfd-t86xg" event={"ID":"e6f0fc0a-ae8e-445e-ad05-591b7ab00886","Type":"ContainerStarted","Data":"7c748f0bb3a7a35ffcd899979db48222f05a9821400ac3e3dccbbd992dcd6a1a"} Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.642343 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b0da1427-1e89-42d6-beb2-55f292945177","Type":"ContainerDied","Data":"c5177f0f305f7d8efd50064ca1ae9320ecca80819662d7f114a735ed39509584"} Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.642356 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance2f72-account-delete-wdbvb"] Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.642367 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-2f72-account-create-zwf4b"] Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.642379 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"72df0792-9904-4b64-9c70-37cb982fe24b","Type":"ContainerDied","Data":"46f6912f8f25e900b01e11da327d2482ee9d18c20a5ec8e5af15a90619d45612"} Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.642390 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"72df0792-9904-4b64-9c70-37cb982fe24b","Type":"ContainerDied","Data":"9d94f0c4c9cc4d34a04158bb731745ee950f7d1d6523896dd0e51db3ffc6ff97"} Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.642397 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d94f0c4c9cc4d34a04158bb731745ee950f7d1d6523896dd0e51db3ffc6ff97" Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.642405 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron8f54-account-delete-pznh9"] Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.642416 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-8f54-account-create-c5m24"] Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.642425 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15ad9c69-05d8-4b75-82cc-f23f6303d7d7","Type":"ContainerDied","Data":"cb12fdcf72cb818439ed6c57d7c01f490985bcb4351a8ce3800533ddd1e0259f"} Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.642436 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-mkw7j"] Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.642448 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-8f54-account-create-c5m24"] Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.642458 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15ad9c69-05d8-4b75-82cc-f23f6303d7d7","Type":"ContainerDied","Data":"f257bc79eebd262ca3fa0048575136a5f530237c8f848c61fcbe3df34711993b"} Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.642467 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15ad9c69-05d8-4b75-82cc-f23f6303d7d7","Type":"ContainerDied","Data":"5a1f2f720e928bb6b6acb1c0a85af470d0667de88a2c7df54ede75a00de60204"} Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.642476 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6ae29e17-1d99-4401-a317-9c8b7be58a3c","Type":"ContainerDied","Data":"5338fa6e89e4bd15f9e4db940f00a7a5b8fdf7c12ab81b1de2fe6747f81ea20d"} Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.642488 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6ae29e17-1d99-4401-a317-9c8b7be58a3c","Type":"ContainerDied","Data":"5127db17298303c9fb33fa13540988e6f53ff4b60ecb93d3188ebd7aa931eadc"} Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.642495 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5127db17298303c9fb33fa13540988e6f53ff4b60ecb93d3188ebd7aa931eadc" Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.642502 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder4662-account-delete-kl2sp" event={"ID":"8e098e6f-ec3b-41e6-b179-6c196ad1fe49","Type":"ContainerDied","Data":"3c734b51211f1947d4f13bde1f8348ca138c6ca1d3cd8d9c8095f5f48ccbdbd0"} Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.642513 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"d329c4da-aa05-4c80-ab30-622eac56428a","Type":"ContainerDied","Data":"05b9f6f45511688aab50c27ba2558d43869979d7395ad2e599ba9c06301661e8"} Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.642528 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0b903-account-delete-dqf57" event={"ID":"e09f26ad-247c-477a-9d73-a2a0f8df91e8","Type":"ContainerDied","Data":"2240cfada8ca479d27ece7a6b38dc1fd2113be8f62a10285d1e6b5342298ed72"} Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.642540 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0b903-account-delete-dqf57" event={"ID":"e09f26ad-247c-477a-9d73-a2a0f8df91e8","Type":"ContainerDied","Data":"225b6cc28a0704912e8149bfafa358990703f143e731a1f8550a182b3c5d8e78"} Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.642548 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="225b6cc28a0704912e8149bfafa358990703f143e731a1f8550a182b3c5d8e78" Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.642556 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dd269d6d-5aa2-43c0-a23b-e76b52699d59","Type":"ContainerDied","Data":"3e39d9d9f9ef98752b8331bca21aeea43f0d1658741d17c5c7ad0d95aa684075"} Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.642568 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-mkw7j"] Oct 03 13:16:35 crc kubenswrapper[4962]: I1003 13:16:35.688793 4962 scope.go:117] "RemoveContainer" containerID="0be4d75f8ba790fbc1d9607dd70ea29bd8aa5bfdf1d636b6a55c54afecb20827" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.025790 4962 scope.go:117] "RemoveContainer" containerID="eddb0fb647a83d84a8dbf9083c9e196483cf75a4bb08e36e17e23de03fd5c34b" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.033879 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-68b6c975-4cb8j" Oct 03 13:16:36 crc kubenswrapper[4962]: E1003 13:16:36.072963 4962 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode09f26ad_247c_477a_9d73_a2a0f8df91e8.slice/crio-2240cfada8ca479d27ece7a6b38dc1fd2113be8f62a10285d1e6b5342298ed72.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddab0e7ec_9c64_491d_a655_027098042378.slice\": RecentStats: unable to find data in memory cache]" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.159577 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dd269d6d-5aa2-43c0-a23b-e76b52699d59","Type":"ContainerDied","Data":"066b514190e18f43574fae09b13806ce1c06fb24598a4d1db574de6345243bdc"} Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.160022 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="066b514190e18f43574fae09b13806ce1c06fb24598a4d1db574de6345243bdc" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.168726 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d329c4da-aa05-4c80-ab30-622eac56428a","Type":"ContainerDied","Data":"b2f488bcb242ea953b34e7a282f39d58dc3b72f8a69993c23384f95155a5300f"} Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.168759 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2f488bcb242ea953b34e7a282f39d58dc3b72f8a69993c23384f95155a5300f" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.171674 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement453e-account-delete-v68p5" event={"ID":"b42a368b-6dd4-4bb0-83a8-d79138605ec9","Type":"ContainerDied","Data":"8cc993a8ce67c8187a388f922b918d9316e02673b97d3cbfb45dea600eb87635"} Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.171732 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8cc993a8ce67c8187a388f922b918d9316e02673b97d3cbfb45dea600eb87635" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.173804 4962 generic.go:334] "Generic (PLEG): container finished" podID="32e6592a-d206-4931-aa99-a84e041b05e4" containerID="016f210b8c91d8e9c5f93eeed437c69115693182129858f3582cdcd912c4dd79" exitCode=0 Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.173945 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"32e6592a-d206-4931-aa99-a84e041b05e4","Type":"ContainerDied","Data":"016f210b8c91d8e9c5f93eeed437c69115693182129858f3582cdcd912c4dd79"} Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.173994 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"32e6592a-d206-4931-aa99-a84e041b05e4","Type":"ContainerDied","Data":"1a79561749c5439f97032e42cf2641ba2b025f913febe3f8c59942dab013b58b"} Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.174007 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a79561749c5439f97032e42cf2641ba2b025f913febe3f8c59942dab013b58b" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.177816 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder4662-account-delete-kl2sp" 
event={"ID":"8e098e6f-ec3b-41e6-b179-6c196ad1fe49","Type":"ContainerDied","Data":"a09641ede0249c5cad544c7f58a811fadfd856c6fc2863b05d5cd799ddd811c5"} Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.177925 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a09641ede0249c5cad544c7f58a811fadfd856c6fc2863b05d5cd799ddd811c5" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.179734 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cea3d32c-24c3-4a80-a1fb-ad65be7bbba6","Type":"ContainerDied","Data":"0c2c25b66b133fc697df6049c4530b5fe754ab1804e372662ca5bb9256497054"} Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.179765 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c2c25b66b133fc697df6049c4530b5fe754ab1804e372662ca5bb9256497054" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.197031 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.197499 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05f2e935-e9b5-49ab-8a2a-30b15840bae9-config-data\") pod \"05f2e935-e9b5-49ab-8a2a-30b15840bae9\" (UID: \"05f2e935-e9b5-49ab-8a2a-30b15840bae9\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.197613 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05f2e935-e9b5-49ab-8a2a-30b15840bae9-run-httpd\") pod \"05f2e935-e9b5-49ab-8a2a-30b15840bae9\" (UID: \"05f2e935-e9b5-49ab-8a2a-30b15840bae9\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.197661 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05f2e935-e9b5-49ab-8a2a-30b15840bae9-combined-ca-bundle\") pod \"05f2e935-e9b5-49ab-8a2a-30b15840bae9\" (UID: \"05f2e935-e9b5-49ab-8a2a-30b15840bae9\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.197755 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/05f2e935-e9b5-49ab-8a2a-30b15840bae9-public-tls-certs\") pod \"05f2e935-e9b5-49ab-8a2a-30b15840bae9\" (UID: \"05f2e935-e9b5-49ab-8a2a-30b15840bae9\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.197797 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzmlv\" (UniqueName: \"kubernetes.io/projected/05f2e935-e9b5-49ab-8a2a-30b15840bae9-kube-api-access-wzmlv\") pod \"05f2e935-e9b5-49ab-8a2a-30b15840bae9\" (UID: \"05f2e935-e9b5-49ab-8a2a-30b15840bae9\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.199064 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/05f2e935-e9b5-49ab-8a2a-30b15840bae9-internal-tls-certs\") pod \"05f2e935-e9b5-49ab-8a2a-30b15840bae9\" (UID: \"05f2e935-e9b5-49ab-8a2a-30b15840bae9\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.199291 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/05f2e935-e9b5-49ab-8a2a-30b15840bae9-etc-swift\") pod \"05f2e935-e9b5-49ab-8a2a-30b15840bae9\" (UID: \"05f2e935-e9b5-49ab-8a2a-30b15840bae9\") " Oct 03 13:16:36 crc 
kubenswrapper[4962]: I1003 13:16:36.199343 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05f2e935-e9b5-49ab-8a2a-30b15840bae9-log-httpd\") pod \"05f2e935-e9b5-49ab-8a2a-30b15840bae9\" (UID: \"05f2e935-e9b5-49ab-8a2a-30b15840bae9\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.199599 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05f2e935-e9b5-49ab-8a2a-30b15840bae9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "05f2e935-e9b5-49ab-8a2a-30b15840bae9" (UID: "05f2e935-e9b5-49ab-8a2a-30b15840bae9"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.200326 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05f2e935-e9b5-49ab-8a2a-30b15840bae9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "05f2e935-e9b5-49ab-8a2a-30b15840bae9" (UID: "05f2e935-e9b5-49ab-8a2a-30b15840bae9"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.201191 4962 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05f2e935-e9b5-49ab-8a2a-30b15840bae9-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.201310 4962 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05f2e935-e9b5-49ab-8a2a-30b15840bae9-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.202171 4962 generic.go:334] "Generic (PLEG): container finished" podID="e6f0fc0a-ae8e-445e-ad05-591b7ab00886" containerID="e9d5e1c76351743f9337f95aaf1cd1c6eb52345b3f44a6356123c4643e580e73" exitCode=143 Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.202261 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-59bf856dfd-t86xg" event={"ID":"e6f0fc0a-ae8e-445e-ad05-591b7ab00886","Type":"ContainerDied","Data":"e9d5e1c76351743f9337f95aaf1cd1c6eb52345b3f44a6356123c4643e580e73"} Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.203974 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.214035 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.215475 4962 scope.go:117] "RemoveContainer" containerID="b5849a5ee0085259fce2aeb3f161a304d34082d7cf3935fa4904ae128a9bdf59" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.220869 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.227702 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-77d888f4df-52rjc"] Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.237979 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05f2e935-e9b5-49ab-8a2a-30b15840bae9-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "05f2e935-e9b5-49ab-8a2a-30b15840bae9" (UID: "05f2e935-e9b5-49ab-8a2a-30b15840bae9"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.240870 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell0b903-account-delete-dqf57" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.241594 4962 generic.go:334] "Generic (PLEG): container finished" podID="c56a1d1d-7e30-4bb8-a5a7-068afc055cb8" containerID="efdd11f8bd8386aa2fc051d59f9344ed094988bb97638532765b2b52ec56a7ba" exitCode=0 Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.242566 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05f2e935-e9b5-49ab-8a2a-30b15840bae9-kube-api-access-wzmlv" (OuterVolumeSpecName: "kube-api-access-wzmlv") pod "05f2e935-e9b5-49ab-8a2a-30b15840bae9" (UID: "05f2e935-e9b5-49ab-8a2a-30b15840bae9"). InnerVolumeSpecName "kube-api-access-wzmlv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.247542 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f4639da-c3a1-47c8-baf4-155f6d7fdc5c" path="/var/lib/kubelet/pods/1f4639da-c3a1-47c8-baf4-155f6d7fdc5c/volumes" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.248053 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e7ff309-1e7f-4954-a08b-a5cf72982735" path="/var/lib/kubelet/pods/4e7ff309-1e7f-4954-a08b-a5cf72982735/volumes" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.248592 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="836b2b81-0dc8-4336-b9a8-2f52e8c5d4f5" path="/var/lib/kubelet/pods/836b2b81-0dc8-4336-b9a8-2f52e8c5d4f5/volumes" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.249188 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85710c21-98fe-4148-8ef1-ec9f4e9ef311" path="/var/lib/kubelet/pods/85710c21-98fe-4148-8ef1-ec9f4e9ef311/volumes" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.250289 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d28696d-ae2e-427d-b247-6ef22dde492f" path="/var/lib/kubelet/pods/8d28696d-ae2e-427d-b247-6ef22dde492f/volumes" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.250820 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0987fb4-1ae7-4caa-b2d8-4c3fba23e6d6" path="/var/lib/kubelet/pods/e0987fb4-1ae7-4caa-b2d8-4c3fba23e6d6/volumes" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.251277 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e19e07aa-3097-4c52-990e-c2738163f946" path="/var/lib/kubelet/pods/e19e07aa-3097-4c52-990e-c2738163f946/volumes" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.253143 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.253256 4962 generic.go:334] "Generic (PLEG): container finished" podID="3c111271-43ed-48b3-b6ed-a6d02efb9113" containerID="31ec554c86926fa60a6d1b72601dfc4ca004a83f21bd45f79846d05688388ebf" exitCode=0 Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.289783 4962 scope.go:117] "RemoveContainer" containerID="eddb0fb647a83d84a8dbf9083c9e196483cf75a4bb08e36e17e23de03fd5c34b" Oct 03 13:16:36 crc kubenswrapper[4962]: E1003 13:16:36.299981 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eddb0fb647a83d84a8dbf9083c9e196483cf75a4bb08e36e17e23de03fd5c34b\": container with ID starting with eddb0fb647a83d84a8dbf9083c9e196483cf75a4bb08e36e17e23de03fd5c34b not found: ID does not exist" containerID="eddb0fb647a83d84a8dbf9083c9e196483cf75a4bb08e36e17e23de03fd5c34b" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.300082 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eddb0fb647a83d84a8dbf9083c9e196483cf75a4bb08e36e17e23de03fd5c34b"} err="failed to get container status \"eddb0fb647a83d84a8dbf9083c9e196483cf75a4bb08e36e17e23de03fd5c34b\": rpc error: code = NotFound desc = could not find container \"eddb0fb647a83d84a8dbf9083c9e196483cf75a4bb08e36e17e23de03fd5c34b\": container with ID starting with eddb0fb647a83d84a8dbf9083c9e196483cf75a4bb08e36e17e23de03fd5c34b not found: ID does not exist" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.300162 4962 scope.go:117] "RemoveContainer" containerID="b5849a5ee0085259fce2aeb3f161a304d34082d7cf3935fa4904ae128a9bdf59" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.307332 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b7fd754f4-rx9k9" event={"ID":"c56a1d1d-7e30-4bb8-a5a7-068afc055cb8","Type":"ContainerDied","Data":"efdd11f8bd8386aa2fc051d59f9344ed094988bb97638532765b2b52ec56a7ba"} Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.307387 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-77d888f4df-52rjc"] Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.307409 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b7fd754f4-rx9k9" event={"ID":"c56a1d1d-7e30-4bb8-a5a7-068afc055cb8","Type":"ContainerDied","Data":"a9291b6ad0144fe4b4bf210a17913502b8ab701d3ea438dc75b634a55814a0af"} Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.307421 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9291b6ad0144fe4b4bf210a17913502b8ab701d3ea438dc75b634a55814a0af" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.307430 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-bd6989694-qnv2s" event={"ID":"3c111271-43ed-48b3-b6ed-a6d02efb9113","Type":"ContainerDied","Data":"31ec554c86926fa60a6d1b72601dfc4ca004a83f21bd45f79846d05688388ebf"} Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.307466 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-bd6989694-qnv2s" event={"ID":"3c111271-43ed-48b3-b6ed-a6d02efb9113","Type":"ContainerDied","Data":"4689223207e192d071dd8574b0bb78a7f6fedc890b8f3664d99710eb87142619"} Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.307477 4962 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="4689223207e192d071dd8574b0bb78a7f6fedc890b8f3664d99710eb87142619" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.311170 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xb25\" (UniqueName: \"kubernetes.io/projected/d36308a0-1b17-4986-adb2-2833b444a239-kube-api-access-5xb25\") pod \"d36308a0-1b17-4986-adb2-2833b444a239\" (UID: \"d36308a0-1b17-4986-adb2-2833b444a239\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.311355 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ae29e17-1d99-4401-a317-9c8b7be58a3c-public-tls-certs\") pod \"6ae29e17-1d99-4401-a317-9c8b7be58a3c\" (UID: \"6ae29e17-1d99-4401-a317-9c8b7be58a3c\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.315925 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sstsr\" (UniqueName: \"kubernetes.io/projected/6ae29e17-1d99-4401-a317-9c8b7be58a3c-kube-api-access-sstsr\") pod \"6ae29e17-1d99-4401-a317-9c8b7be58a3c\" (UID: \"6ae29e17-1d99-4401-a317-9c8b7be58a3c\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.315967 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d36308a0-1b17-4986-adb2-2833b444a239-config-data\") pod \"d36308a0-1b17-4986-adb2-2833b444a239\" (UID: \"d36308a0-1b17-4986-adb2-2833b444a239\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.316112 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6ae29e17-1d99-4401-a317-9c8b7be58a3c-etc-machine-id\") pod \"6ae29e17-1d99-4401-a317-9c8b7be58a3c\" (UID: \"6ae29e17-1d99-4401-a317-9c8b7be58a3c\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.316163 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ae29e17-1d99-4401-a317-9c8b7be58a3c-combined-ca-bundle\") pod \"6ae29e17-1d99-4401-a317-9c8b7be58a3c\" (UID: \"6ae29e17-1d99-4401-a317-9c8b7be58a3c\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.316200 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ae29e17-1d99-4401-a317-9c8b7be58a3c-config-data-custom\") pod \"6ae29e17-1d99-4401-a317-9c8b7be58a3c\" (UID: \"6ae29e17-1d99-4401-a317-9c8b7be58a3c\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.316260 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ae29e17-1d99-4401-a317-9c8b7be58a3c-config-data\") pod \"6ae29e17-1d99-4401-a317-9c8b7be58a3c\" (UID: \"6ae29e17-1d99-4401-a317-9c8b7be58a3c\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.316483 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ae29e17-1d99-4401-a317-9c8b7be58a3c-logs\") pod \"6ae29e17-1d99-4401-a317-9c8b7be58a3c\" (UID: \"6ae29e17-1d99-4401-a317-9c8b7be58a3c\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.316580 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ae29e17-1d99-4401-a317-9c8b7be58a3c-internal-tls-certs\") pod 
\"6ae29e17-1d99-4401-a317-9c8b7be58a3c\" (UID: \"6ae29e17-1d99-4401-a317-9c8b7be58a3c\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.316674 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ae29e17-1d99-4401-a317-9c8b7be58a3c-scripts\") pod \"6ae29e17-1d99-4401-a317-9c8b7be58a3c\" (UID: \"6ae29e17-1d99-4401-a317-9c8b7be58a3c\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.316809 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d36308a0-1b17-4986-adb2-2833b444a239-combined-ca-bundle\") pod \"d36308a0-1b17-4986-adb2-2833b444a239\" (UID: \"d36308a0-1b17-4986-adb2-2833b444a239\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.318486 4962 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/05f2e935-e9b5-49ab-8a2a-30b15840bae9-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.320071 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzmlv\" (UniqueName: \"kubernetes.io/projected/05f2e935-e9b5-49ab-8a2a-30b15840bae9-kube-api-access-wzmlv\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.325198 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 13:16:36 crc kubenswrapper[4962]: E1003 13:16:36.325805 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5849a5ee0085259fce2aeb3f161a304d34082d7cf3935fa4904ae128a9bdf59\": container with ID starting with b5849a5ee0085259fce2aeb3f161a304d34082d7cf3935fa4904ae128a9bdf59 not found: ID does not exist" containerID="b5849a5ee0085259fce2aeb3f161a304d34082d7cf3935fa4904ae128a9bdf59" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.325846 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5849a5ee0085259fce2aeb3f161a304d34082d7cf3935fa4904ae128a9bdf59"} err="failed to get container status \"b5849a5ee0085259fce2aeb3f161a304d34082d7cf3935fa4904ae128a9bdf59\": rpc error: code = NotFound desc = could not find container \"b5849a5ee0085259fce2aeb3f161a304d34082d7cf3935fa4904ae128a9bdf59\": container with ID starting with b5849a5ee0085259fce2aeb3f161a304d34082d7cf3935fa4904ae128a9bdf59 not found: ID does not exist" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.325909 4962 scope.go:117] "RemoveContainer" containerID="cce61d9b927002bd5d5d741af0d9a03b88958f9b494160698ba0a870465f6ee7" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.326127 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b0da1427-1e89-42d6-beb2-55f292945177","Type":"ContainerDied","Data":"f2a67da5b67156e26f78a02bc4d28c3835e8ad9566e2ba809f777e9f3545c5c8"} Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.326170 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2a67da5b67156e26f78a02bc4d28c3835e8ad9566e2ba809f777e9f3545c5c8" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.338343 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6ae29e17-1d99-4401-a317-9c8b7be58a3c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod 
"6ae29e17-1d99-4401-a317-9c8b7be58a3c" (UID: "6ae29e17-1d99-4401-a317-9c8b7be58a3c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.338535 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ae29e17-1d99-4401-a317-9c8b7be58a3c-logs" (OuterVolumeSpecName: "logs") pod "6ae29e17-1d99-4401-a317-9c8b7be58a3c" (UID: "6ae29e17-1d99-4401-a317-9c8b7be58a3c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.365380 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6456949cf6-r4n9q" Oct 03 13:16:36 crc kubenswrapper[4962]: E1003 13:16:36.394178 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="923838580de863a9a099f0b6d817f70b752e252d832275cca5df8e77ca46e3ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.394566 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance2f72-account-delete-wdbvb" event={"ID":"cfeca1b1-fa87-4490-9e99-38e60d421138","Type":"ContainerDied","Data":"c8626f8f86f5035f420dc9fa6d84b13cad4be61ca2d245dca16886f08955bc63"} Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.394598 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8626f8f86f5035f420dc9fa6d84b13cad4be61ca2d245dca16886f08955bc63" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.394770 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 13:16:36 crc kubenswrapper[4962]: E1003 13:16:36.394921 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9 is running failed: container process not found" containerID="34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.421315 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1289d443-56d2-4f63-8802-66bcd0569b3b-config-data\") pod \"1289d443-56d2-4f63-8802-66bcd0569b3b\" (UID: \"1289d443-56d2-4f63-8802-66bcd0569b3b\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.421427 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d329c4da-aa05-4c80-ab30-622eac56428a-internal-tls-certs\") pod \"d329c4da-aa05-4c80-ab30-622eac56428a\" (UID: \"d329c4da-aa05-4c80-ab30-622eac56428a\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.421502 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72df0792-9904-4b64-9c70-37cb982fe24b-combined-ca-bundle\") pod \"72df0792-9904-4b64-9c70-37cb982fe24b\" (UID: \"72df0792-9904-4b64-9c70-37cb982fe24b\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.421524 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/72df0792-9904-4b64-9c70-37cb982fe24b-kube-state-metrics-tls-certs\") pod \"72df0792-9904-4b64-9c70-37cb982fe24b\" (UID: \"72df0792-9904-4b64-9c70-37cb982fe24b\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.421548 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1289d443-56d2-4f63-8802-66bcd0569b3b-scripts\") pod \"1289d443-56d2-4f63-8802-66bcd0569b3b\" (UID: \"1289d443-56d2-4f63-8802-66bcd0569b3b\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.421625 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1289d443-56d2-4f63-8802-66bcd0569b3b-public-tls-certs\") pod \"1289d443-56d2-4f63-8802-66bcd0569b3b\" (UID: \"1289d443-56d2-4f63-8802-66bcd0569b3b\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.421708 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvhjw\" (UniqueName: \"kubernetes.io/projected/1289d443-56d2-4f63-8802-66bcd0569b3b-kube-api-access-pvhjw\") pod \"1289d443-56d2-4f63-8802-66bcd0569b3b\" (UID: \"1289d443-56d2-4f63-8802-66bcd0569b3b\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.421733 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1289d443-56d2-4f63-8802-66bcd0569b3b-logs\") pod \"1289d443-56d2-4f63-8802-66bcd0569b3b\" (UID: \"1289d443-56d2-4f63-8802-66bcd0569b3b\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.421789 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d329c4da-aa05-4c80-ab30-622eac56428a-combined-ca-bundle\") pod \"d329c4da-aa05-4c80-ab30-622eac56428a\" (UID: \"d329c4da-aa05-4c80-ab30-622eac56428a\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.421825 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/72df0792-9904-4b64-9c70-37cb982fe24b-kube-state-metrics-tls-config\") pod \"72df0792-9904-4b64-9c70-37cb982fe24b\" (UID: \"72df0792-9904-4b64-9c70-37cb982fe24b\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.421850 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hd26c\" (UniqueName: \"kubernetes.io/projected/e09f26ad-247c-477a-9d73-a2a0f8df91e8-kube-api-access-hd26c\") pod \"e09f26ad-247c-477a-9d73-a2a0f8df91e8\" (UID: \"e09f26ad-247c-477a-9d73-a2a0f8df91e8\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.421869 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1289d443-56d2-4f63-8802-66bcd0569b3b-internal-tls-certs\") pod \"1289d443-56d2-4f63-8802-66bcd0569b3b\" (UID: \"1289d443-56d2-4f63-8802-66bcd0569b3b\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.421905 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d329c4da-aa05-4c80-ab30-622eac56428a-config-data\") pod \"d329c4da-aa05-4c80-ab30-622eac56428a\" (UID: \"d329c4da-aa05-4c80-ab30-622eac56428a\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.423238 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjthg\" (UniqueName: \"kubernetes.io/projected/d329c4da-aa05-4c80-ab30-622eac56428a-kube-api-access-xjthg\") pod \"d329c4da-aa05-4c80-ab30-622eac56428a\" (UID: \"d329c4da-aa05-4c80-ab30-622eac56428a\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.423277 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d329c4da-aa05-4c80-ab30-622eac56428a-public-tls-certs\") pod \"d329c4da-aa05-4c80-ab30-622eac56428a\" (UID: \"d329c4da-aa05-4c80-ab30-622eac56428a\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.423299 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d329c4da-aa05-4c80-ab30-622eac56428a-logs\") pod \"d329c4da-aa05-4c80-ab30-622eac56428a\" (UID: \"d329c4da-aa05-4c80-ab30-622eac56428a\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.423340 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1289d443-56d2-4f63-8802-66bcd0569b3b-combined-ca-bundle\") pod \"1289d443-56d2-4f63-8802-66bcd0569b3b\" (UID: \"1289d443-56d2-4f63-8802-66bcd0569b3b\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.423420 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srwdq\" (UniqueName: \"kubernetes.io/projected/72df0792-9904-4b64-9c70-37cb982fe24b-kube-api-access-srwdq\") pod \"72df0792-9904-4b64-9c70-37cb982fe24b\" (UID: \"72df0792-9904-4b64-9c70-37cb982fe24b\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.424163 4962 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/6ae29e17-1d99-4401-a317-9c8b7be58a3c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.424179 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ae29e17-1d99-4401-a317-9c8b7be58a3c-logs\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.450970 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d36308a0-1b17-4986-adb2-2833b444a239-kube-api-access-5xb25" (OuterVolumeSpecName: "kube-api-access-5xb25") pod "d36308a0-1b17-4986-adb2-2833b444a239" (UID: "d36308a0-1b17-4986-adb2-2833b444a239"). InnerVolumeSpecName "kube-api-access-5xb25". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.451497 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ae29e17-1d99-4401-a317-9c8b7be58a3c-scripts" (OuterVolumeSpecName: "scripts") pod "6ae29e17-1d99-4401-a317-9c8b7be58a3c" (UID: "6ae29e17-1d99-4401-a317-9c8b7be58a3c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.452365 4962 scope.go:117] "RemoveContainer" containerID="cce61d9b927002bd5d5d741af0d9a03b88958f9b494160698ba0a870465f6ee7" Oct 03 13:16:36 crc kubenswrapper[4962]: E1003 13:16:36.452569 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9 is running failed: container process not found" containerID="34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 13:16:36 crc kubenswrapper[4962]: E1003 13:16:36.453027 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="923838580de863a9a099f0b6d817f70b752e252d832275cca5df8e77ca46e3ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.457864 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1289d443-56d2-4f63-8802-66bcd0569b3b-logs" (OuterVolumeSpecName: "logs") pod "1289d443-56d2-4f63-8802-66bcd0569b3b" (UID: "1289d443-56d2-4f63-8802-66bcd0569b3b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: E1003 13:16:36.470353 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cce61d9b927002bd5d5d741af0d9a03b88958f9b494160698ba0a870465f6ee7\": container with ID starting with cce61d9b927002bd5d5d741af0d9a03b88958f9b494160698ba0a870465f6ee7 not found: ID does not exist" containerID="cce61d9b927002bd5d5d741af0d9a03b88958f9b494160698ba0a870465f6ee7" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.470398 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cce61d9b927002bd5d5d741af0d9a03b88958f9b494160698ba0a870465f6ee7"} err="failed to get container status \"cce61d9b927002bd5d5d741af0d9a03b88958f9b494160698ba0a870465f6ee7\": rpc error: code = NotFound desc = could not find container \"cce61d9b927002bd5d5d741af0d9a03b88958f9b494160698ba0a870465f6ee7\": container with ID starting with cce61d9b927002bd5d5d741af0d9a03b88958f9b494160698ba0a870465f6ee7 not found: ID does not exist" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.470443 4962 scope.go:117] "RemoveContainer" containerID="975c9c39028f01f58f2aea68725568502425600e3e03782630767e28394af41f" Oct 03 13:16:36 crc kubenswrapper[4962]: E1003 13:16:36.470501 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9 is running failed: container process not found" containerID="34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 13:16:36 crc kubenswrapper[4962]: E1003 13:16:36.470533 4962 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-wvjpm" podUID="7cb4dab0-1ffc-49d4-a229-1862a33d4caa" containerName="ovsdb-server" Oct 03 13:16:36 crc kubenswrapper[4962]: E1003 13:16:36.476935 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="923838580de863a9a099f0b6d817f70b752e252d832275cca5df8e77ca46e3ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 13:16:36 crc kubenswrapper[4962]: E1003 13:16:36.477010 4962 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-wvjpm" podUID="7cb4dab0-1ffc-49d4-a229-1862a33d4caa" containerName="ovs-vswitchd" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.477301 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.496001 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"85ea0653-966b-47ff-b8aa-b6ad2b5810ca","Type":"ContainerDied","Data":"c9c8ed01d13ca0f8ff902a9439408e51baa90fc268f710fa26c3c09fc4aeec3c"} Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.496067 4962 generic.go:334] "Generic (PLEG): container finished" podID="85ea0653-966b-47ff-b8aa-b6ad2b5810ca" containerID="c9c8ed01d13ca0f8ff902a9439408e51baa90fc268f710fa26c3c09fc4aeec3c" exitCode=0 Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.496133 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"85ea0653-966b-47ff-b8aa-b6ad2b5810ca","Type":"ContainerDied","Data":"fb0d919568b2eec75f54551c357ef63bb039e62f2156dd76893d99b06463a883"} Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.496145 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb0d919568b2eec75f54551c357ef63bb039e62f2156dd76893d99b06463a883" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.496252 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05f2e935-e9b5-49ab-8a2a-30b15840bae9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "05f2e935-e9b5-49ab-8a2a-30b15840bae9" (UID: "05f2e935-e9b5-49ab-8a2a-30b15840bae9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.498538 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ae29e17-1d99-4401-a317-9c8b7be58a3c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6ae29e17-1d99-4401-a317-9c8b7be58a3c" (UID: "6ae29e17-1d99-4401-a317-9c8b7be58a3c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.498984 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d329c4da-aa05-4c80-ab30-622eac56428a-logs" (OuterVolumeSpecName: "logs") pod "d329c4da-aa05-4c80-ab30-622eac56428a" (UID: "d329c4da-aa05-4c80-ab30-622eac56428a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.499648 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72df0792-9904-4b64-9c70-37cb982fe24b-kube-api-access-srwdq" (OuterVolumeSpecName: "kube-api-access-srwdq") pod "72df0792-9904-4b64-9c70-37cb982fe24b" (UID: "72df0792-9904-4b64-9c70-37cb982fe24b"). InnerVolumeSpecName "kube-api-access-srwdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.502152 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.503962 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.508028 4962 generic.go:334] "Generic (PLEG): container finished" podID="d22955d6-a957-458f-8181-5fea18cedc90" containerID="0ce8b4a4ac9d0b8cc21e8feaa51be48f8d9e21fc13fb4c98b22efe982cb9565b" exitCode=0 Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.508054 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"d22955d6-a957-458f-8181-5fea18cedc90","Type":"ContainerDied","Data":"0ce8b4a4ac9d0b8cc21e8feaa51be48f8d9e21fc13fb4c98b22efe982cb9565b"} Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.510327 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"d22955d6-a957-458f-8181-5fea18cedc90","Type":"ContainerDied","Data":"f72fdcf624171197a72f081f96097a602fb6bd22864a9a806672b4ec38418965"} Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.510387 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f72fdcf624171197a72f081f96097a602fb6bd22864a9a806672b4ec38418965" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.526742 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dz9nn\" (UniqueName: \"kubernetes.io/projected/dd269d6d-5aa2-43c0-a23b-e76b52699d59-kube-api-access-dz9nn\") pod \"dd269d6d-5aa2-43c0-a23b-e76b52699d59\" (UID: \"dd269d6d-5aa2-43c0-a23b-e76b52699d59\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.526795 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpplp\" (UniqueName: \"kubernetes.io/projected/cea3d32c-24c3-4a80-a1fb-ad65be7bbba6-kube-api-access-lpplp\") pod \"cea3d32c-24c3-4a80-a1fb-ad65be7bbba6\" (UID: \"cea3d32c-24c3-4a80-a1fb-ad65be7bbba6\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.527614 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd269d6d-5aa2-43c0-a23b-e76b52699d59-nova-metadata-tls-certs\") pod \"dd269d6d-5aa2-43c0-a23b-e76b52699d59\" (UID: \"dd269d6d-5aa2-43c0-a23b-e76b52699d59\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.527981 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cea3d32c-24c3-4a80-a1fb-ad65be7bbba6-logs\") pod \"cea3d32c-24c3-4a80-a1fb-ad65be7bbba6\" (UID: \"cea3d32c-24c3-4a80-a1fb-ad65be7bbba6\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.528047 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cea3d32c-24c3-4a80-a1fb-ad65be7bbba6-scripts\") pod \"cea3d32c-24c3-4a80-a1fb-ad65be7bbba6\" (UID: \"cea3d32c-24c3-4a80-a1fb-ad65be7bbba6\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.528074 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd269d6d-5aa2-43c0-a23b-e76b52699d59-combined-ca-bundle\") pod \"dd269d6d-5aa2-43c0-a23b-e76b52699d59\" (UID: \"dd269d6d-5aa2-43c0-a23b-e76b52699d59\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.528146 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"cea3d32c-24c3-4a80-a1fb-ad65be7bbba6\" (UID: 
\"cea3d32c-24c3-4a80-a1fb-ad65be7bbba6\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.528169 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cea3d32c-24c3-4a80-a1fb-ad65be7bbba6-combined-ca-bundle\") pod \"cea3d32c-24c3-4a80-a1fb-ad65be7bbba6\" (UID: \"cea3d32c-24c3-4a80-a1fb-ad65be7bbba6\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.528190 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cea3d32c-24c3-4a80-a1fb-ad65be7bbba6-config-data\") pod \"cea3d32c-24c3-4a80-a1fb-ad65be7bbba6\" (UID: \"cea3d32c-24c3-4a80-a1fb-ad65be7bbba6\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.528220 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd269d6d-5aa2-43c0-a23b-e76b52699d59-logs\") pod \"dd269d6d-5aa2-43c0-a23b-e76b52699d59\" (UID: \"dd269d6d-5aa2-43c0-a23b-e76b52699d59\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.530160 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cea3d32c-24c3-4a80-a1fb-ad65be7bbba6-internal-tls-certs\") pod \"cea3d32c-24c3-4a80-a1fb-ad65be7bbba6\" (UID: \"cea3d32c-24c3-4a80-a1fb-ad65be7bbba6\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.530256 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cea3d32c-24c3-4a80-a1fb-ad65be7bbba6-httpd-run\") pod \"cea3d32c-24c3-4a80-a1fb-ad65be7bbba6\" (UID: \"cea3d32c-24c3-4a80-a1fb-ad65be7bbba6\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.530311 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd269d6d-5aa2-43c0-a23b-e76b52699d59-config-data\") pod \"dd269d6d-5aa2-43c0-a23b-e76b52699d59\" (UID: \"dd269d6d-5aa2-43c0-a23b-e76b52699d59\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.530611 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron8f54-account-delete-pznh9" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.531739 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd269d6d-5aa2-43c0-a23b-e76b52699d59-logs" (OuterVolumeSpecName: "logs") pod "dd269d6d-5aa2-43c0-a23b-e76b52699d59" (UID: "dd269d6d-5aa2-43c0-a23b-e76b52699d59"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.532120 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cea3d32c-24c3-4a80-a1fb-ad65be7bbba6-logs" (OuterVolumeSpecName: "logs") pod "cea3d32c-24c3-4a80-a1fb-ad65be7bbba6" (UID: "cea3d32c-24c3-4a80-a1fb-ad65be7bbba6"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.536404 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d329c4da-aa05-4c80-ab30-622eac56428a-logs\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.551516 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ae29e17-1d99-4401-a317-9c8b7be58a3c-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.551562 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srwdq\" (UniqueName: \"kubernetes.io/projected/72df0792-9904-4b64-9c70-37cb982fe24b-kube-api-access-srwdq\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.551579 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xb25\" (UniqueName: \"kubernetes.io/projected/d36308a0-1b17-4986-adb2-2833b444a239-kube-api-access-5xb25\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.551597 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ae29e17-1d99-4401-a317-9c8b7be58a3c-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.551612 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1289d443-56d2-4f63-8802-66bcd0569b3b-logs\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.551515 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cea3d32c-24c3-4a80-a1fb-ad65be7bbba6-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "cea3d32c-24c3-4a80-a1fb-ad65be7bbba6" (UID: "cea3d32c-24c3-4a80-a1fb-ad65be7bbba6"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.552171 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05f2e935-e9b5-49ab-8a2a-30b15840bae9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "05f2e935-e9b5-49ab-8a2a-30b15840bae9" (UID: "05f2e935-e9b5-49ab-8a2a-30b15840bae9"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.552197 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1289d443-56d2-4f63-8802-66bcd0569b3b-kube-api-access-pvhjw" (OuterVolumeSpecName: "kube-api-access-pvhjw") pod "1289d443-56d2-4f63-8802-66bcd0569b3b" (UID: "1289d443-56d2-4f63-8802-66bcd0569b3b"). InnerVolumeSpecName "kube-api-access-pvhjw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.552688 4962 generic.go:334] "Generic (PLEG): container finished" podID="0ae87940-f07d-4213-bc0b-da0b3a2bba84" containerID="ebd303224ee1ba6219baa644b264ad83840c2dfc146c5958bf9efcdb7bba20b4" exitCode=143 Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.552766 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-c97f5c65f-s279k" event={"ID":"0ae87940-f07d-4213-bc0b-da0b3a2bba84","Type":"ContainerDied","Data":"ebd303224ee1ba6219baa644b264ad83840c2dfc146c5958bf9efcdb7bba20b4"} Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.558173 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "cea3d32c-24c3-4a80-a1fb-ad65be7bbba6" (UID: "cea3d32c-24c3-4a80-a1fb-ad65be7bbba6"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.558230 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05f2e935-e9b5-49ab-8a2a-30b15840bae9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.562443 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ae29e17-1d99-4401-a317-9c8b7be58a3c-kube-api-access-sstsr" (OuterVolumeSpecName: "kube-api-access-sstsr") pod "6ae29e17-1d99-4401-a317-9c8b7be58a3c" (UID: "6ae29e17-1d99-4401-a317-9c8b7be58a3c"). InnerVolumeSpecName "kube-api-access-sstsr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.562686 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1289d443-56d2-4f63-8802-66bcd0569b3b-scripts" (OuterVolumeSpecName: "scripts") pod "1289d443-56d2-4f63-8802-66bcd0569b3b" (UID: "1289d443-56d2-4f63-8802-66bcd0569b3b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.562684 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ae29e17-1d99-4401-a317-9c8b7be58a3c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ae29e17-1d99-4401-a317-9c8b7be58a3c" (UID: "6ae29e17-1d99-4401-a317-9c8b7be58a3c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.562768 4962 scope.go:117] "RemoveContainer" containerID="d85522767c6e244b2b918d7bc1d422287f82dde9ac20d24a98eadf70a906aa02" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.563012 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d36308a0-1b17-4986-adb2-2833b444a239-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d36308a0-1b17-4986-adb2-2833b444a239" (UID: "d36308a0-1b17-4986-adb2-2833b444a239"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.564318 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cea3d32c-24c3-4a80-a1fb-ad65be7bbba6-kube-api-access-lpplp" (OuterVolumeSpecName: "kube-api-access-lpplp") pod "cea3d32c-24c3-4a80-a1fb-ad65be7bbba6" (UID: "cea3d32c-24c3-4a80-a1fb-ad65be7bbba6"). InnerVolumeSpecName "kube-api-access-lpplp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.565872 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d329c4da-aa05-4c80-ab30-622eac56428a-kube-api-access-xjthg" (OuterVolumeSpecName: "kube-api-access-xjthg") pod "d329c4da-aa05-4c80-ab30-622eac56428a" (UID: "d329c4da-aa05-4c80-ab30-622eac56428a"). InnerVolumeSpecName "kube-api-access-xjthg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.566120 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd269d6d-5aa2-43c0-a23b-e76b52699d59-kube-api-access-dz9nn" (OuterVolumeSpecName: "kube-api-access-dz9nn") pod "dd269d6d-5aa2-43c0-a23b-e76b52699d59" (UID: "dd269d6d-5aa2-43c0-a23b-e76b52699d59"). InnerVolumeSpecName "kube-api-access-dz9nn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.575082 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cea3d32c-24c3-4a80-a1fb-ad65be7bbba6-scripts" (OuterVolumeSpecName: "scripts") pod "cea3d32c-24c3-4a80-a1fb-ad65be7bbba6" (UID: "cea3d32c-24c3-4a80-a1fb-ad65be7bbba6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.577143 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance2f72-account-delete-wdbvb" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.581396 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement453e-account-delete-v68p5" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.581457 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e09f26ad-247c-477a-9d73-a2a0f8df91e8-kube-api-access-hd26c" (OuterVolumeSpecName: "kube-api-access-hd26c") pod "e09f26ad-247c-477a-9d73-a2a0f8df91e8" (UID: "e09f26ad-247c-477a-9d73-a2a0f8df91e8"). InnerVolumeSpecName "kube-api-access-hd26c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.582989 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.591037 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-68b6c975-4cb8j" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.593605 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron8f54-account-delete-pznh9" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.593881 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron8f54-account-delete-pznh9" event={"ID":"56923e91-36c0-432d-8042-138d2e89eb3b","Type":"ContainerDied","Data":"31fff28172e9a5433374b5fa939f109a51e1fc927c4a63e107621de14afc61fe"} Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.593915 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31fff28172e9a5433374b5fa939f109a51e1fc927c4a63e107621de14afc61fe" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.607903 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder4662-account-delete-kl2sp" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.631794 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05f2e935-e9b5-49ab-8a2a-30b15840bae9-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "05f2e935-e9b5-49ab-8a2a-30b15840bae9" (UID: "05f2e935-e9b5-49ab-8a2a-30b15840bae9"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.633219 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-bd6989694-qnv2s" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.633473 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6b7fd754f4-rx9k9" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.635977 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.662820 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/32e6592a-d206-4931-aa99-a84e041b05e4-config-data\") pod \"32e6592a-d206-4931-aa99-a84e041b05e4\" (UID: \"32e6592a-d206-4931-aa99-a84e041b05e4\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.662881 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sv9w7\" (UniqueName: \"kubernetes.io/projected/b42a368b-6dd4-4bb0-83a8-d79138605ec9-kube-api-access-sv9w7\") pod \"b42a368b-6dd4-4bb0-83a8-d79138605ec9\" (UID: \"b42a368b-6dd4-4bb0-83a8-d79138605ec9\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.662913 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d22955d6-a957-458f-8181-5fea18cedc90-combined-ca-bundle\") pod \"d22955d6-a957-458f-8181-5fea18cedc90\" (UID: \"d22955d6-a957-458f-8181-5fea18cedc90\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.662936 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tv7c\" (UniqueName: \"kubernetes.io/projected/32e6592a-d206-4931-aa99-a84e041b05e4-kube-api-access-8tv7c\") pod \"32e6592a-d206-4931-aa99-a84e041b05e4\" (UID: \"32e6592a-d206-4931-aa99-a84e041b05e4\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.663095 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wbh7\" (UniqueName: \"kubernetes.io/projected/b0da1427-1e89-42d6-beb2-55f292945177-kube-api-access-4wbh7\") pod 
\"b0da1427-1e89-42d6-beb2-55f292945177\" (UID: \"b0da1427-1e89-42d6-beb2-55f292945177\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.663150 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0da1427-1e89-42d6-beb2-55f292945177-config-data\") pod \"b0da1427-1e89-42d6-beb2-55f292945177\" (UID: \"b0da1427-1e89-42d6-beb2-55f292945177\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.663168 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwbm5\" (UniqueName: \"kubernetes.io/projected/d22955d6-a957-458f-8181-5fea18cedc90-kube-api-access-rwbm5\") pod \"d22955d6-a957-458f-8181-5fea18cedc90\" (UID: \"d22955d6-a957-458f-8181-5fea18cedc90\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.663213 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0da1427-1e89-42d6-beb2-55f292945177-public-tls-certs\") pod \"b0da1427-1e89-42d6-beb2-55f292945177\" (UID: \"b0da1427-1e89-42d6-beb2-55f292945177\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.663242 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npxmv\" (UniqueName: \"kubernetes.io/projected/cfeca1b1-fa87-4490-9e99-38e60d421138-kube-api-access-npxmv\") pod \"cfeca1b1-fa87-4490-9e99-38e60d421138\" (UID: \"cfeca1b1-fa87-4490-9e99-38e60d421138\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.663292 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/32e6592a-d206-4931-aa99-a84e041b05e4-kolla-config\") pod \"32e6592a-d206-4931-aa99-a84e041b05e4\" (UID: \"32e6592a-d206-4931-aa99-a84e041b05e4\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.663318 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0da1427-1e89-42d6-beb2-55f292945177-scripts\") pod \"b0da1427-1e89-42d6-beb2-55f292945177\" (UID: \"b0da1427-1e89-42d6-beb2-55f292945177\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.663336 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/32e6592a-d206-4931-aa99-a84e041b05e4-memcached-tls-certs\") pod \"32e6592a-d206-4931-aa99-a84e041b05e4\" (UID: \"32e6592a-d206-4931-aa99-a84e041b05e4\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.663402 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32e6592a-d206-4931-aa99-a84e041b05e4-combined-ca-bundle\") pod \"32e6592a-d206-4931-aa99-a84e041b05e4\" (UID: \"32e6592a-d206-4931-aa99-a84e041b05e4\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.663475 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b0da1427-1e89-42d6-beb2-55f292945177-httpd-run\") pod \"b0da1427-1e89-42d6-beb2-55f292945177\" (UID: \"b0da1427-1e89-42d6-beb2-55f292945177\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.663498 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"b0da1427-1e89-42d6-beb2-55f292945177\" (UID: 
\"b0da1427-1e89-42d6-beb2-55f292945177\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.663493 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32e6592a-d206-4931-aa99-a84e041b05e4-config-data" (OuterVolumeSpecName: "config-data") pod "32e6592a-d206-4931-aa99-a84e041b05e4" (UID: "32e6592a-d206-4931-aa99-a84e041b05e4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.663519 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0da1427-1e89-42d6-beb2-55f292945177-logs\") pod \"b0da1427-1e89-42d6-beb2-55f292945177\" (UID: \"b0da1427-1e89-42d6-beb2-55f292945177\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.663559 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6v8rg\" (UniqueName: \"kubernetes.io/projected/56923e91-36c0-432d-8042-138d2e89eb3b-kube-api-access-6v8rg\") pod \"56923e91-36c0-432d-8042-138d2e89eb3b\" (UID: \"56923e91-36c0-432d-8042-138d2e89eb3b\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.663621 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0da1427-1e89-42d6-beb2-55f292945177-combined-ca-bundle\") pod \"b0da1427-1e89-42d6-beb2-55f292945177\" (UID: \"b0da1427-1e89-42d6-beb2-55f292945177\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.663664 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d22955d6-a957-458f-8181-5fea18cedc90-config-data\") pod \"d22955d6-a957-458f-8181-5fea18cedc90\" (UID: \"d22955d6-a957-458f-8181-5fea18cedc90\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.664132 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjthg\" (UniqueName: \"kubernetes.io/projected/d329c4da-aa05-4c80-ab30-622eac56428a-kube-api-access-xjthg\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.664152 4962 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cea3d32c-24c3-4a80-a1fb-ad65be7bbba6-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.664163 4962 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/05f2e935-e9b5-49ab-8a2a-30b15840bae9-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.664174 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d36308a0-1b17-4986-adb2-2833b444a239-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.664182 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/32e6592a-d206-4931-aa99-a84e041b05e4-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.664194 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dz9nn\" (UniqueName: \"kubernetes.io/projected/dd269d6d-5aa2-43c0-a23b-e76b52699d59-kube-api-access-dz9nn\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: 
I1003 13:16:36.664203 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpplp\" (UniqueName: \"kubernetes.io/projected/cea3d32c-24c3-4a80-a1fb-ad65be7bbba6-kube-api-access-lpplp\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.664217 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1289d443-56d2-4f63-8802-66bcd0569b3b-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.664229 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sstsr\" (UniqueName: \"kubernetes.io/projected/6ae29e17-1d99-4401-a317-9c8b7be58a3c-kube-api-access-sstsr\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.664241 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ae29e17-1d99-4401-a317-9c8b7be58a3c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.664254 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvhjw\" (UniqueName: \"kubernetes.io/projected/1289d443-56d2-4f63-8802-66bcd0569b3b-kube-api-access-pvhjw\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.664264 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cea3d32c-24c3-4a80-a1fb-ad65be7bbba6-logs\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.664275 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cea3d32c-24c3-4a80-a1fb-ad65be7bbba6-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.664285 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hd26c\" (UniqueName: \"kubernetes.io/projected/e09f26ad-247c-477a-9d73-a2a0f8df91e8-kube-api-access-hd26c\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.664304 4962 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.664312 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd269d6d-5aa2-43c0-a23b-e76b52699d59-logs\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.664323 4962 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/05f2e935-e9b5-49ab-8a2a-30b15840bae9-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.666413 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0da1427-1e89-42d6-beb2-55f292945177-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b0da1427-1e89-42d6-beb2-55f292945177" (UID: "b0da1427-1e89-42d6-beb2-55f292945177"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.666829 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32e6592a-d206-4931-aa99-a84e041b05e4-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "32e6592a-d206-4931-aa99-a84e041b05e4" (UID: "32e6592a-d206-4931-aa99-a84e041b05e4"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.671849 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0da1427-1e89-42d6-beb2-55f292945177-logs" (OuterVolumeSpecName: "logs") pod "b0da1427-1e89-42d6-beb2-55f292945177" (UID: "b0da1427-1e89-42d6-beb2-55f292945177"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.688097 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0da1427-1e89-42d6-beb2-55f292945177-scripts" (OuterVolumeSpecName: "scripts") pod "b0da1427-1e89-42d6-beb2-55f292945177" (UID: "b0da1427-1e89-42d6-beb2-55f292945177"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.701711 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32e6592a-d206-4931-aa99-a84e041b05e4-kube-api-access-8tv7c" (OuterVolumeSpecName: "kube-api-access-8tv7c") pod "32e6592a-d206-4931-aa99-a84e041b05e4" (UID: "32e6592a-d206-4931-aa99-a84e041b05e4"). InnerVolumeSpecName "kube-api-access-8tv7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.701784 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b42a368b-6dd4-4bb0-83a8-d79138605ec9-kube-api-access-sv9w7" (OuterVolumeSpecName: "kube-api-access-sv9w7") pod "b42a368b-6dd4-4bb0-83a8-d79138605ec9" (UID: "b42a368b-6dd4-4bb0-83a8-d79138605ec9"). InnerVolumeSpecName "kube-api-access-sv9w7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.701852 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d22955d6-a957-458f-8181-5fea18cedc90-kube-api-access-rwbm5" (OuterVolumeSpecName: "kube-api-access-rwbm5") pod "d22955d6-a957-458f-8181-5fea18cedc90" (UID: "d22955d6-a957-458f-8181-5fea18cedc90"). InnerVolumeSpecName "kube-api-access-rwbm5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.702133 4962 scope.go:117] "RemoveContainer" containerID="975c9c39028f01f58f2aea68725568502425600e3e03782630767e28394af41f" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.702188 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "b0da1427-1e89-42d6-beb2-55f292945177" (UID: "b0da1427-1e89-42d6-beb2-55f292945177"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.702225 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfeca1b1-fa87-4490-9e99-38e60d421138-kube-api-access-npxmv" (OuterVolumeSpecName: "kube-api-access-npxmv") pod "cfeca1b1-fa87-4490-9e99-38e60d421138" (UID: "cfeca1b1-fa87-4490-9e99-38e60d421138"). InnerVolumeSpecName "kube-api-access-npxmv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.702606 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72df0792-9904-4b64-9c70-37cb982fe24b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72df0792-9904-4b64-9c70-37cb982fe24b" (UID: "72df0792-9904-4b64-9c70-37cb982fe24b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: E1003 13:16:36.703101 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"975c9c39028f01f58f2aea68725568502425600e3e03782630767e28394af41f\": container with ID starting with 975c9c39028f01f58f2aea68725568502425600e3e03782630767e28394af41f not found: ID does not exist" containerID="975c9c39028f01f58f2aea68725568502425600e3e03782630767e28394af41f" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.703164 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"975c9c39028f01f58f2aea68725568502425600e3e03782630767e28394af41f"} err="failed to get container status \"975c9c39028f01f58f2aea68725568502425600e3e03782630767e28394af41f\": rpc error: code = NotFound desc = could not find container \"975c9c39028f01f58f2aea68725568502425600e3e03782630767e28394af41f\": container with ID starting with 975c9c39028f01f58f2aea68725568502425600e3e03782630767e28394af41f not found: ID does not exist" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.703209 4962 scope.go:117] "RemoveContainer" containerID="d85522767c6e244b2b918d7bc1d422287f82dde9ac20d24a98eadf70a906aa02" Oct 03 13:16:36 crc kubenswrapper[4962]: E1003 13:16:36.705695 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d85522767c6e244b2b918d7bc1d422287f82dde9ac20d24a98eadf70a906aa02\": container with ID starting with d85522767c6e244b2b918d7bc1d422287f82dde9ac20d24a98eadf70a906aa02 not found: ID does not exist" containerID="d85522767c6e244b2b918d7bc1d422287f82dde9ac20d24a98eadf70a906aa02" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.705746 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d85522767c6e244b2b918d7bc1d422287f82dde9ac20d24a98eadf70a906aa02"} err="failed to get container status \"d85522767c6e244b2b918d7bc1d422287f82dde9ac20d24a98eadf70a906aa02\": rpc error: code = NotFound desc = could not find container \"d85522767c6e244b2b918d7bc1d422287f82dde9ac20d24a98eadf70a906aa02\": container with ID starting with d85522767c6e244b2b918d7bc1d422287f82dde9ac20d24a98eadf70a906aa02 not found: ID does not exist" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.727595 4962 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 
13:16:36.731895 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0da1427-1e89-42d6-beb2-55f292945177-kube-api-access-4wbh7" (OuterVolumeSpecName: "kube-api-access-4wbh7") pod "b0da1427-1e89-42d6-beb2-55f292945177" (UID: "b0da1427-1e89-42d6-beb2-55f292945177"). InnerVolumeSpecName "kube-api-access-4wbh7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.732014 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56923e91-36c0-432d-8042-138d2e89eb3b-kube-api-access-6v8rg" (OuterVolumeSpecName: "kube-api-access-6v8rg") pod "56923e91-36c0-432d-8042-138d2e89eb3b" (UID: "56923e91-36c0-432d-8042-138d2e89eb3b"). InnerVolumeSpecName "kube-api-access-6v8rg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.765974 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c111271-43ed-48b3-b6ed-a6d02efb9113-logs\") pod \"3c111271-43ed-48b3-b6ed-a6d02efb9113\" (UID: \"3c111271-43ed-48b3-b6ed-a6d02efb9113\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.770534 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c56a1d1d-7e30-4bb8-a5a7-068afc055cb8-internal-tls-certs\") pod \"c56a1d1d-7e30-4bb8-a5a7-068afc055cb8\" (UID: \"c56a1d1d-7e30-4bb8-a5a7-068afc055cb8\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.767862 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72df0792-9904-4b64-9c70-37cb982fe24b-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "72df0792-9904-4b64-9c70-37cb982fe24b" (UID: "72df0792-9904-4b64-9c70-37cb982fe24b"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.768216 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c111271-43ed-48b3-b6ed-a6d02efb9113-logs" (OuterVolumeSpecName: "logs") pod "3c111271-43ed-48b3-b6ed-a6d02efb9113" (UID: "3c111271-43ed-48b3-b6ed-a6d02efb9113"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.771239 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c56a1d1d-7e30-4bb8-a5a7-068afc055cb8-public-tls-certs\") pod \"c56a1d1d-7e30-4bb8-a5a7-068afc055cb8\" (UID: \"c56a1d1d-7e30-4bb8-a5a7-068afc055cb8\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.771347 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c56a1d1d-7e30-4bb8-a5a7-068afc055cb8-config-data-custom\") pod \"c56a1d1d-7e30-4bb8-a5a7-068afc055cb8\" (UID: \"c56a1d1d-7e30-4bb8-a5a7-068afc055cb8\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.771474 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c111271-43ed-48b3-b6ed-a6d02efb9113-combined-ca-bundle\") pod \"3c111271-43ed-48b3-b6ed-a6d02efb9113\" (UID: \"3c111271-43ed-48b3-b6ed-a6d02efb9113\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.771662 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c56a1d1d-7e30-4bb8-a5a7-068afc055cb8-logs\") pod \"c56a1d1d-7e30-4bb8-a5a7-068afc055cb8\" (UID: \"c56a1d1d-7e30-4bb8-a5a7-068afc055cb8\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.771755 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hplr\" (UniqueName: \"kubernetes.io/projected/85ea0653-966b-47ff-b8aa-b6ad2b5810ca-kube-api-access-4hplr\") pod \"85ea0653-966b-47ff-b8aa-b6ad2b5810ca\" (UID: \"85ea0653-966b-47ff-b8aa-b6ad2b5810ca\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.771830 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c56a1d1d-7e30-4bb8-a5a7-068afc055cb8-combined-ca-bundle\") pod \"c56a1d1d-7e30-4bb8-a5a7-068afc055cb8\" (UID: \"c56a1d1d-7e30-4bb8-a5a7-068afc055cb8\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.772035 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c111271-43ed-48b3-b6ed-a6d02efb9113-config-data\") pod \"3c111271-43ed-48b3-b6ed-a6d02efb9113\" (UID: \"3c111271-43ed-48b3-b6ed-a6d02efb9113\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.772119 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c56a1d1d-7e30-4bb8-a5a7-068afc055cb8-config-data\") pod \"c56a1d1d-7e30-4bb8-a5a7-068afc055cb8\" (UID: \"c56a1d1d-7e30-4bb8-a5a7-068afc055cb8\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.772211 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vn8zv\" (UniqueName: \"kubernetes.io/projected/3c111271-43ed-48b3-b6ed-a6d02efb9113-kube-api-access-vn8zv\") pod \"3c111271-43ed-48b3-b6ed-a6d02efb9113\" (UID: \"3c111271-43ed-48b3-b6ed-a6d02efb9113\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.772304 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvmrv\" (UniqueName: \"kubernetes.io/projected/c56a1d1d-7e30-4bb8-a5a7-068afc055cb8-kube-api-access-hvmrv\") pod 
\"c56a1d1d-7e30-4bb8-a5a7-068afc055cb8\" (UID: \"c56a1d1d-7e30-4bb8-a5a7-068afc055cb8\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.772397 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85ea0653-966b-47ff-b8aa-b6ad2b5810ca-config-data\") pod \"85ea0653-966b-47ff-b8aa-b6ad2b5810ca\" (UID: \"85ea0653-966b-47ff-b8aa-b6ad2b5810ca\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.772675 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c111271-43ed-48b3-b6ed-a6d02efb9113-config-data-custom\") pod \"3c111271-43ed-48b3-b6ed-a6d02efb9113\" (UID: \"3c111271-43ed-48b3-b6ed-a6d02efb9113\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.772753 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85ea0653-966b-47ff-b8aa-b6ad2b5810ca-combined-ca-bundle\") pod \"85ea0653-966b-47ff-b8aa-b6ad2b5810ca\" (UID: \"85ea0653-966b-47ff-b8aa-b6ad2b5810ca\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.772816 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fb8j\" (UniqueName: \"kubernetes.io/projected/8e098e6f-ec3b-41e6-b179-6c196ad1fe49-kube-api-access-4fb8j\") pod \"8e098e6f-ec3b-41e6-b179-6c196ad1fe49\" (UID: \"8e098e6f-ec3b-41e6-b179-6c196ad1fe49\") " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.774304 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sv9w7\" (UniqueName: \"kubernetes.io/projected/b42a368b-6dd4-4bb0-83a8-d79138605ec9-kube-api-access-sv9w7\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.774331 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tv7c\" (UniqueName: \"kubernetes.io/projected/32e6592a-d206-4931-aa99-a84e041b05e4-kube-api-access-8tv7c\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.774346 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72df0792-9904-4b64-9c70-37cb982fe24b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.774360 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wbh7\" (UniqueName: \"kubernetes.io/projected/b0da1427-1e89-42d6-beb2-55f292945177-kube-api-access-4wbh7\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.774373 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwbm5\" (UniqueName: \"kubernetes.io/projected/d22955d6-a957-458f-8181-5fea18cedc90-kube-api-access-rwbm5\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.774385 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npxmv\" (UniqueName: \"kubernetes.io/projected/cfeca1b1-fa87-4490-9e99-38e60d421138-kube-api-access-npxmv\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.774397 4962 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/32e6592a-d206-4931-aa99-a84e041b05e4-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.774410 4962 
reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0da1427-1e89-42d6-beb2-55f292945177-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.774422 4962 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.775940 4962 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b0da1427-1e89-42d6-beb2-55f292945177-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.775965 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0da1427-1e89-42d6-beb2-55f292945177-logs\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.775993 4962 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.776008 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6v8rg\" (UniqueName: \"kubernetes.io/projected/56923e91-36c0-432d-8042-138d2e89eb3b-kube-api-access-6v8rg\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.777291 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c56a1d1d-7e30-4bb8-a5a7-068afc055cb8-logs" (OuterVolumeSpecName: "logs") pod "c56a1d1d-7e30-4bb8-a5a7-068afc055cb8" (UID: "c56a1d1d-7e30-4bb8-a5a7-068afc055cb8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.786177 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-6sqdm" podUID="6d6f62dd-0720-46b6-b0a8-497490f052a8" containerName="ovn-controller" probeResult="failure" output="command timed out" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.791560 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c111271-43ed-48b3-b6ed-a6d02efb9113-kube-api-access-vn8zv" (OuterVolumeSpecName: "kube-api-access-vn8zv") pod "3c111271-43ed-48b3-b6ed-a6d02efb9113" (UID: "3c111271-43ed-48b3-b6ed-a6d02efb9113"). InnerVolumeSpecName "kube-api-access-vn8zv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.793204 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c56a1d1d-7e30-4bb8-a5a7-068afc055cb8-kube-api-access-hvmrv" (OuterVolumeSpecName: "kube-api-access-hvmrv") pod "c56a1d1d-7e30-4bb8-a5a7-068afc055cb8" (UID: "c56a1d1d-7e30-4bb8-a5a7-068afc055cb8"). InnerVolumeSpecName "kube-api-access-hvmrv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.795580 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd269d6d-5aa2-43c0-a23b-e76b52699d59-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd269d6d-5aa2-43c0-a23b-e76b52699d59" (UID: "dd269d6d-5aa2-43c0-a23b-e76b52699d59"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.795750 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c56a1d1d-7e30-4bb8-a5a7-068afc055cb8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c56a1d1d-7e30-4bb8-a5a7-068afc055cb8" (UID: "c56a1d1d-7e30-4bb8-a5a7-068afc055cb8"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.796492 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85ea0653-966b-47ff-b8aa-b6ad2b5810ca-kube-api-access-4hplr" (OuterVolumeSpecName: "kube-api-access-4hplr") pod "85ea0653-966b-47ff-b8aa-b6ad2b5810ca" (UID: "85ea0653-966b-47ff-b8aa-b6ad2b5810ca"). InnerVolumeSpecName "kube-api-access-4hplr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.799434 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c111271-43ed-48b3-b6ed-a6d02efb9113-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3c111271-43ed-48b3-b6ed-a6d02efb9113" (UID: "3c111271-43ed-48b3-b6ed-a6d02efb9113"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.830152 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e098e6f-ec3b-41e6-b179-6c196ad1fe49-kube-api-access-4fb8j" (OuterVolumeSpecName: "kube-api-access-4fb8j") pod "8e098e6f-ec3b-41e6-b179-6c196ad1fe49" (UID: "8e098e6f-ec3b-41e6-b179-6c196ad1fe49"). InnerVolumeSpecName "kube-api-access-4fb8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.835671 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cea3d32c-24c3-4a80-a1fb-ad65be7bbba6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cea3d32c-24c3-4a80-a1fb-ad65be7bbba6" (UID: "cea3d32c-24c3-4a80-a1fb-ad65be7bbba6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.867693 4962 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.870828 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-6sqdm" podUID="6d6f62dd-0720-46b6-b0a8-497490f052a8" containerName="ovn-controller" probeResult="failure" output=< Oct 03 13:16:36 crc kubenswrapper[4962]: ERROR - Failed to get connection status from ovn-controller, ovn-appctl exit status: 0 Oct 03 13:16:36 crc kubenswrapper[4962]: > Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.877807 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d329c4da-aa05-4c80-ab30-622eac56428a-config-data" (OuterVolumeSpecName: "config-data") pod "d329c4da-aa05-4c80-ab30-622eac56428a" (UID: "d329c4da-aa05-4c80-ab30-622eac56428a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.878236 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c111271-43ed-48b3-b6ed-a6d02efb9113-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.878260 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cea3d32c-24c3-4a80-a1fb-ad65be7bbba6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.878269 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d329c4da-aa05-4c80-ab30-622eac56428a-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.878281 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fb8j\" (UniqueName: \"kubernetes.io/projected/8e098e6f-ec3b-41e6-b179-6c196ad1fe49-kube-api-access-4fb8j\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.878290 4962 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.878299 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c111271-43ed-48b3-b6ed-a6d02efb9113-logs\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.878309 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c56a1d1d-7e30-4bb8-a5a7-068afc055cb8-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.878319 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c56a1d1d-7e30-4bb8-a5a7-068afc055cb8-logs\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.878329 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hplr\" (UniqueName: \"kubernetes.io/projected/85ea0653-966b-47ff-b8aa-b6ad2b5810ca-kube-api-access-4hplr\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.878338 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vn8zv\" (UniqueName: \"kubernetes.io/projected/3c111271-43ed-48b3-b6ed-a6d02efb9113-kube-api-access-vn8zv\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.878348 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvmrv\" (UniqueName: \"kubernetes.io/projected/c56a1d1d-7e30-4bb8-a5a7-068afc055cb8-kube-api-access-hvmrv\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.878359 4962 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/72df0792-9904-4b64-9c70-37cb982fe24b-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.878370 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd269d6d-5aa2-43c0-a23b-e76b52699d59-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: E1003 13:16:36.878448 4962 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 03 13:16:36 crc kubenswrapper[4962]: E1003 13:16:36.878484 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/221bdd26-0fec-49e5-86ec-c2aefe7a5902-config-data podName:221bdd26-0fec-49e5-86ec-c2aefe7a5902 nodeName:}" failed. No retries permitted until 2025-10-03 13:16:44.878470865 +0000 UTC m=+1613.282368700 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/221bdd26-0fec-49e5-86ec-c2aefe7a5902-config-data") pod "rabbitmq-cell1-server-0" (UID: "221bdd26-0fec-49e5-86ec-c2aefe7a5902") : configmap "rabbitmq-cell1-config-data" not found Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.914574 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0da1427-1e89-42d6-beb2-55f292945177-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b0da1427-1e89-42d6-beb2-55f292945177" (UID: "b0da1427-1e89-42d6-beb2-55f292945177"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.915972 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd269d6d-5aa2-43c0-a23b-e76b52699d59-config-data" (OuterVolumeSpecName: "config-data") pod "dd269d6d-5aa2-43c0-a23b-e76b52699d59" (UID: "dd269d6d-5aa2-43c0-a23b-e76b52699d59"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.937531 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cea3d32c-24c3-4a80-a1fb-ad65be7bbba6-config-data" (OuterVolumeSpecName: "config-data") pod "cea3d32c-24c3-4a80-a1fb-ad65be7bbba6" (UID: "cea3d32c-24c3-4a80-a1fb-ad65be7bbba6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.957084 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d329c4da-aa05-4c80-ab30-622eac56428a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d329c4da-aa05-4c80-ab30-622eac56428a" (UID: "d329c4da-aa05-4c80-ab30-622eac56428a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.958167 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c111271-43ed-48b3-b6ed-a6d02efb9113-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c111271-43ed-48b3-b6ed-a6d02efb9113" (UID: "3c111271-43ed-48b3-b6ed-a6d02efb9113"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.975321 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c56a1d1d-7e30-4bb8-a5a7-068afc055cb8-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c56a1d1d-7e30-4bb8-a5a7-068afc055cb8" (UID: "c56a1d1d-7e30-4bb8-a5a7-068afc055cb8"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.985000 4962 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c56a1d1d-7e30-4bb8-a5a7-068afc055cb8-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.985039 4962 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d329c4da-aa05-4c80-ab30-622eac56428a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.985052 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c111271-43ed-48b3-b6ed-a6d02efb9113-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.985065 4962 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0da1427-1e89-42d6-beb2-55f292945177-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.985076 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cea3d32c-24c3-4a80-a1fb-ad65be7bbba6-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:36 crc kubenswrapper[4962]: I1003 13:16:36.985086 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd269d6d-5aa2-43c0-a23b-e76b52699d59-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.008694 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05f2e935-e9b5-49ab-8a2a-30b15840bae9-config-data" (OuterVolumeSpecName: "config-data") pod "05f2e935-e9b5-49ab-8a2a-30b15840bae9" (UID: "05f2e935-e9b5-49ab-8a2a-30b15840bae9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.030783 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c56a1d1d-7e30-4bb8-a5a7-068afc055cb8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c56a1d1d-7e30-4bb8-a5a7-068afc055cb8" (UID: "c56a1d1d-7e30-4bb8-a5a7-068afc055cb8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.036334 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85ea0653-966b-47ff-b8aa-b6ad2b5810ca-config-data" (OuterVolumeSpecName: "config-data") pod "85ea0653-966b-47ff-b8aa-b6ad2b5810ca" (UID: "85ea0653-966b-47ff-b8aa-b6ad2b5810ca"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.049743 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d22955d6-a957-458f-8181-5fea18cedc90-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d22955d6-a957-458f-8181-5fea18cedc90" (UID: "d22955d6-a957-458f-8181-5fea18cedc90"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.066076 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0da1427-1e89-42d6-beb2-55f292945177-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0da1427-1e89-42d6-beb2-55f292945177" (UID: "b0da1427-1e89-42d6-beb2-55f292945177"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.082669 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32e6592a-d206-4931-aa99-a84e041b05e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32e6592a-d206-4931-aa99-a84e041b05e4" (UID: "32e6592a-d206-4931-aa99-a84e041b05e4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.086456 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d36308a0-1b17-4986-adb2-2833b444a239-config-data" (OuterVolumeSpecName: "config-data") pod "d36308a0-1b17-4986-adb2-2833b444a239" (UID: "d36308a0-1b17-4986-adb2-2833b444a239"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.088298 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c56a1d1d-7e30-4bb8-a5a7-068afc055cb8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.088358 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85ea0653-966b-47ff-b8aa-b6ad2b5810ca-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.088374 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32e6592a-d206-4931-aa99-a84e041b05e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.088391 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0da1427-1e89-42d6-beb2-55f292945177-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.088406 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d22955d6-a957-458f-8181-5fea18cedc90-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.088423 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05f2e935-e9b5-49ab-8a2a-30b15840bae9-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.088435 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d36308a0-1b17-4986-adb2-2833b444a239-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:37 crc kubenswrapper[4962]: E1003 13:16:37.088546 4962 configmap.go:193] Couldn't get configMap openstack/ovncontroller-scripts: configmap "ovncontroller-scripts" not found Oct 03 13:16:37 crc kubenswrapper[4962]: E1003 13:16:37.088677 4962 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/6d6f62dd-0720-46b6-b0a8-497490f052a8-scripts podName:6d6f62dd-0720-46b6-b0a8-497490f052a8 nodeName:}" failed. No retries permitted until 2025-10-03 13:16:45.088607009 +0000 UTC m=+1613.492504834 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/configmap/6d6f62dd-0720-46b6-b0a8-497490f052a8-scripts") pod "ovn-controller-6sqdm" (UID: "6d6f62dd-0720-46b6-b0a8-497490f052a8") : configmap "ovncontroller-scripts" not found Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.095148 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32e6592a-d206-4931-aa99-a84e041b05e4-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "32e6592a-d206-4931-aa99-a84e041b05e4" (UID: "32e6592a-d206-4931-aa99-a84e041b05e4"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.108716 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d329c4da-aa05-4c80-ab30-622eac56428a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d329c4da-aa05-4c80-ab30-622eac56428a" (UID: "d329c4da-aa05-4c80-ab30-622eac56428a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.122559 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ae29e17-1d99-4401-a317-9c8b7be58a3c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6ae29e17-1d99-4401-a317-9c8b7be58a3c" (UID: "6ae29e17-1d99-4401-a317-9c8b7be58a3c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.123812 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d329c4da-aa05-4c80-ab30-622eac56428a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d329c4da-aa05-4c80-ab30-622eac56428a" (UID: "d329c4da-aa05-4c80-ab30-622eac56428a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.159553 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72df0792-9904-4b64-9c70-37cb982fe24b-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "72df0792-9904-4b64-9c70-37cb982fe24b" (UID: "72df0792-9904-4b64-9c70-37cb982fe24b"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.166122 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1289d443-56d2-4f63-8802-66bcd0569b3b-config-data" (OuterVolumeSpecName: "config-data") pod "1289d443-56d2-4f63-8802-66bcd0569b3b" (UID: "1289d443-56d2-4f63-8802-66bcd0569b3b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.169838 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd269d6d-5aa2-43c0-a23b-e76b52699d59-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "dd269d6d-5aa2-43c0-a23b-e76b52699d59" (UID: "dd269d6d-5aa2-43c0-a23b-e76b52699d59"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.172058 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1289d443-56d2-4f63-8802-66bcd0569b3b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1289d443-56d2-4f63-8802-66bcd0569b3b" (UID: "1289d443-56d2-4f63-8802-66bcd0569b3b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.187282 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d22955d6-a957-458f-8181-5fea18cedc90-config-data" (OuterVolumeSpecName: "config-data") pod "d22955d6-a957-458f-8181-5fea18cedc90" (UID: "d22955d6-a957-458f-8181-5fea18cedc90"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.187960 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0da1427-1e89-42d6-beb2-55f292945177-config-data" (OuterVolumeSpecName: "config-data") pod "b0da1427-1e89-42d6-beb2-55f292945177" (UID: "b0da1427-1e89-42d6-beb2-55f292945177"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.189875 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d22955d6-a957-458f-8181-5fea18cedc90-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.189905 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1289d443-56d2-4f63-8802-66bcd0569b3b-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.189916 4962 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/72df0792-9904-4b64-9c70-37cb982fe24b-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.189925 4962 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd269d6d-5aa2-43c0-a23b-e76b52699d59-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.189934 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0da1427-1e89-42d6-beb2-55f292945177-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.189941 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d329c4da-aa05-4c80-ab30-622eac56428a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.189950 4962 reconciler_common.go:293] "Volume detached 
for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/32e6592a-d206-4931-aa99-a84e041b05e4-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.189958 4962 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1289d443-56d2-4f63-8802-66bcd0569b3b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.189967 4962 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d329c4da-aa05-4c80-ab30-622eac56428a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.189975 4962 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ae29e17-1d99-4401-a317-9c8b7be58a3c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.219719 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ae29e17-1d99-4401-a317-9c8b7be58a3c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6ae29e17-1d99-4401-a317-9c8b7be58a3c" (UID: "6ae29e17-1d99-4401-a317-9c8b7be58a3c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.219850 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85ea0653-966b-47ff-b8aa-b6ad2b5810ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "85ea0653-966b-47ff-b8aa-b6ad2b5810ca" (UID: "85ea0653-966b-47ff-b8aa-b6ad2b5810ca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.237941 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/keystone-58d67d97c8-pnjp8" podUID="a6cba65d-0ae5-4a81-88c1-da4e07d7a803" containerName="keystone-api" probeResult="failure" output="Get \"https://10.217.0.146:5000/v3\": read tcp 10.217.0.2:60774->10.217.0.146:5000: read: connection reset by peer" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.240213 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1289d443-56d2-4f63-8802-66bcd0569b3b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1289d443-56d2-4f63-8802-66bcd0569b3b" (UID: "1289d443-56d2-4f63-8802-66bcd0569b3b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.248036 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cea3d32c-24c3-4a80-a1fb-ad65be7bbba6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "cea3d32c-24c3-4a80-a1fb-ad65be7bbba6" (UID: "cea3d32c-24c3-4a80-a1fb-ad65be7bbba6"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.266351 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c56a1d1d-7e30-4bb8-a5a7-068afc055cb8-config-data" (OuterVolumeSpecName: "config-data") pod "c56a1d1d-7e30-4bb8-a5a7-068afc055cb8" (UID: "c56a1d1d-7e30-4bb8-a5a7-068afc055cb8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.268535 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c111271-43ed-48b3-b6ed-a6d02efb9113-config-data" (OuterVolumeSpecName: "config-data") pod "3c111271-43ed-48b3-b6ed-a6d02efb9113" (UID: "3c111271-43ed-48b3-b6ed-a6d02efb9113"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.292053 4962 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ae29e17-1d99-4401-a317-9c8b7be58a3c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.292084 4962 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1289d443-56d2-4f63-8802-66bcd0569b3b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.292094 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c111271-43ed-48b3-b6ed-a6d02efb9113-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.292104 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c56a1d1d-7e30-4bb8-a5a7-068afc055cb8-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.292112 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85ea0653-966b-47ff-b8aa-b6ad2b5810ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.292121 4962 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cea3d32c-24c3-4a80-a1fb-ad65be7bbba6-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.306323 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ae29e17-1d99-4401-a317-9c8b7be58a3c-config-data" (OuterVolumeSpecName: "config-data") pod "6ae29e17-1d99-4401-a317-9c8b7be58a3c" (UID: "6ae29e17-1d99-4401-a317-9c8b7be58a3c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.317087 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c56a1d1d-7e30-4bb8-a5a7-068afc055cb8-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c56a1d1d-7e30-4bb8-a5a7-068afc055cb8" (UID: "c56a1d1d-7e30-4bb8-a5a7-068afc055cb8"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.320979 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1289d443-56d2-4f63-8802-66bcd0569b3b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1289d443-56d2-4f63-8802-66bcd0569b3b" (UID: "1289d443-56d2-4f63-8802-66bcd0569b3b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.400444 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-config-data\") pod \"barbican-api-7c44799d88-mmmm6\" (UID: \"dab0e7ec-9c64-491d-a655-027098042378\") " pod="openstack/barbican-api-7c44799d88-mmmm6" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.400596 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1289d443-56d2-4f63-8802-66bcd0569b3b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.400612 4962 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c56a1d1d-7e30-4bb8-a5a7-068afc055cb8-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.400622 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ae29e17-1d99-4401-a317-9c8b7be58a3c-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:37 crc kubenswrapper[4962]: E1003 13:16:37.400687 4962 secret.go:188] Couldn't get secret openstack/barbican-config-data: secret "barbican-config-data" not found Oct 03 13:16:37 crc kubenswrapper[4962]: E1003 13:16:37.400778 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-config-data podName:dab0e7ec-9c64-491d-a655-027098042378 nodeName:}" failed. No retries permitted until 2025-10-03 13:16:45.400755234 +0000 UTC m=+1613.804653149 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-config-data") pod "barbican-api-7c44799d88-mmmm6" (UID: "dab0e7ec-9c64-491d-a655-027098042378") : secret "barbican-config-data" not found Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.440379 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.456708 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-68b6c975-4cb8j"] Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.468568 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-68b6c975-4cb8j"] Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.474319 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron8f54-account-delete-pznh9"] Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.483053 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron8f54-account-delete-pznh9"] Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.501069 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a3fb0456-394e-4041-829b-57c162966b2b-config-data-generated\") pod \"a3fb0456-394e-4041-829b-57c162966b2b\" (UID: \"a3fb0456-394e-4041-829b-57c162966b2b\") " Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.501115 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a3fb0456-394e-4041-829b-57c162966b2b-kolla-config\") pod \"a3fb0456-394e-4041-829b-57c162966b2b\" (UID: \"a3fb0456-394e-4041-829b-57c162966b2b\") " Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.501167 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwc6r\" (UniqueName: \"kubernetes.io/projected/a3fb0456-394e-4041-829b-57c162966b2b-kube-api-access-dwc6r\") pod \"a3fb0456-394e-4041-829b-57c162966b2b\" (UID: \"a3fb0456-394e-4041-829b-57c162966b2b\") " Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.501193 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3fb0456-394e-4041-829b-57c162966b2b-galera-tls-certs\") pod \"a3fb0456-394e-4041-829b-57c162966b2b\" (UID: \"a3fb0456-394e-4041-829b-57c162966b2b\") " Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.501230 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/a3fb0456-394e-4041-829b-57c162966b2b-secrets\") pod \"a3fb0456-394e-4041-829b-57c162966b2b\" (UID: \"a3fb0456-394e-4041-829b-57c162966b2b\") " Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.501275 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3fb0456-394e-4041-829b-57c162966b2b-operator-scripts\") pod \"a3fb0456-394e-4041-829b-57c162966b2b\" (UID: \"a3fb0456-394e-4041-829b-57c162966b2b\") " Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.501297 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3fb0456-394e-4041-829b-57c162966b2b-combined-ca-bundle\") pod \"a3fb0456-394e-4041-829b-57c162966b2b\" (UID: \"a3fb0456-394e-4041-829b-57c162966b2b\") " Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.501350 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"a3fb0456-394e-4041-829b-57c162966b2b\" (UID: \"a3fb0456-394e-4041-829b-57c162966b2b\") " Oct 03 13:16:37 crc 
kubenswrapper[4962]: I1003 13:16:37.501416 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a3fb0456-394e-4041-829b-57c162966b2b-config-data-default\") pod \"a3fb0456-394e-4041-829b-57c162966b2b\" (UID: \"a3fb0456-394e-4041-829b-57c162966b2b\") " Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.501705 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bx2f\" (UniqueName: \"kubernetes.io/projected/dab0e7ec-9c64-491d-a655-027098042378-kube-api-access-9bx2f\") pod \"barbican-api-7c44799d88-mmmm6\" (UID: \"dab0e7ec-9c64-491d-a655-027098042378\") " pod="openstack/barbican-api-7c44799d88-mmmm6" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.503377 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3fb0456-394e-4041-829b-57c162966b2b-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "a3fb0456-394e-4041-829b-57c162966b2b" (UID: "a3fb0456-394e-4041-829b-57c162966b2b"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.503956 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3fb0456-394e-4041-829b-57c162966b2b-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "a3fb0456-394e-4041-829b-57c162966b2b" (UID: "a3fb0456-394e-4041-829b-57c162966b2b"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.505229 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3fb0456-394e-4041-829b-57c162966b2b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a3fb0456-394e-4041-829b-57c162966b2b" (UID: "a3fb0456-394e-4041-829b-57c162966b2b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:16:37 crc kubenswrapper[4962]: E1003 13:16:37.510997 4962 projected.go:194] Error preparing data for projected volume kube-api-access-9bx2f for pod openstack/barbican-api-7c44799d88-mmmm6: failed to fetch token: serviceaccounts "barbican-barbican" not found Oct 03 13:16:37 crc kubenswrapper[4962]: E1003 13:16:37.511070 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dab0e7ec-9c64-491d-a655-027098042378-kube-api-access-9bx2f podName:dab0e7ec-9c64-491d-a655-027098042378 nodeName:}" failed. No retries permitted until 2025-10-03 13:16:45.511051466 +0000 UTC m=+1613.914949301 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-9bx2f" (UniqueName: "kubernetes.io/projected/dab0e7ec-9c64-491d-a655-027098042378-kube-api-access-9bx2f") pod "barbican-api-7c44799d88-mmmm6" (UID: "dab0e7ec-9c64-491d-a655-027098042378") : failed to fetch token: serviceaccounts "barbican-barbican" not found Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.516123 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3fb0456-394e-4041-829b-57c162966b2b-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "a3fb0456-394e-4041-829b-57c162966b2b" (UID: "a3fb0456-394e-4041-829b-57c162966b2b"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.517883 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3fb0456-394e-4041-829b-57c162966b2b-secrets" (OuterVolumeSpecName: "secrets") pod "a3fb0456-394e-4041-829b-57c162966b2b" (UID: "a3fb0456-394e-4041-829b-57c162966b2b"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.522966 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3fb0456-394e-4041-829b-57c162966b2b-kube-api-access-dwc6r" (OuterVolumeSpecName: "kube-api-access-dwc6r") pod "a3fb0456-394e-4041-829b-57c162966b2b" (UID: "a3fb0456-394e-4041-829b-57c162966b2b"). InnerVolumeSpecName "kube-api-access-dwc6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.526590 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3fb0456-394e-4041-829b-57c162966b2b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a3fb0456-394e-4041-829b-57c162966b2b" (UID: "a3fb0456-394e-4041-829b-57c162966b2b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.528703 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "mysql-db") pod "a3fb0456-394e-4041-829b-57c162966b2b" (UID: "a3fb0456-394e-4041-829b-57c162966b2b"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.554063 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3fb0456-394e-4041-829b-57c162966b2b-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "a3fb0456-394e-4041-829b-57c162966b2b" (UID: "a3fb0456-394e-4041-829b-57c162966b2b"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.603034 4962 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.603062 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a3fb0456-394e-4041-829b-57c162966b2b-config-data-default\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.603072 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a3fb0456-394e-4041-829b-57c162966b2b-config-data-generated\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.603082 4962 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a3fb0456-394e-4041-829b-57c162966b2b-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.603091 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwc6r\" (UniqueName: \"kubernetes.io/projected/a3fb0456-394e-4041-829b-57c162966b2b-kube-api-access-dwc6r\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.603101 4962 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3fb0456-394e-4041-829b-57c162966b2b-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.603109 4962 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/a3fb0456-394e-4041-829b-57c162966b2b-secrets\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.603117 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3fb0456-394e-4041-829b-57c162966b2b-operator-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.603124 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3fb0456-394e-4041-829b-57c162966b2b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.617606 4962 generic.go:334] "Generic (PLEG): container finished" podID="a6cba65d-0ae5-4a81-88c1-da4e07d7a803" containerID="3108a39b0723e787d2db8b185df6254591ba7ffd8691d08c951e273ac8405e51" exitCode=0 Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.617701 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-58d67d97c8-pnjp8" event={"ID":"a6cba65d-0ae5-4a81-88c1-da4e07d7a803","Type":"ContainerDied","Data":"3108a39b0723e787d2db8b185df6254591ba7ffd8691d08c951e273ac8405e51"} Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.620050 4962 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.624904 4962 generic.go:334] "Generic (PLEG): container finished" podID="a3fb0456-394e-4041-829b-57c162966b2b" containerID="b8c45a2afce07209a956e277c7796fab60709250ab0d2e737f207d8293e6abac" exitCode=0 Oct 03 13:16:37 crc 
kubenswrapper[4962]: I1003 13:16:37.625007 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a3fb0456-394e-4041-829b-57c162966b2b","Type":"ContainerDied","Data":"b8c45a2afce07209a956e277c7796fab60709250ab0d2e737f207d8293e6abac"} Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.625040 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a3fb0456-394e-4041-829b-57c162966b2b","Type":"ContainerDied","Data":"36d786ae84adb0baff5f0e6b75cba07aa00cfbafa3a40d1b7db2376679095f6c"} Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.625060 4962 scope.go:117] "RemoveContainer" containerID="b8c45a2afce07209a956e277c7796fab60709250ab0d2e737f207d8293e6abac" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.625185 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.627514 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.627534 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder4662-account-delete-kl2sp" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.627604 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.627652 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6456949cf6-r4n9q" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.628170 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.628218 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.628229 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.628241 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.628271 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.628298 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement453e-account-delete-v68p5" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.628302 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6b7fd754f4-rx9k9" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.628317 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance2f72-account-delete-wdbvb" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.628347 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-bd6989694-qnv2s" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.628376 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.628431 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.628474 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.649528 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell0b903-account-delete-dqf57" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.660563 4962 scope.go:117] "RemoveContainer" containerID="7ca803b6428733e9933bbbae21d5493fabe1c8a4711ed2063ab6748e8356642e" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.708318 4962 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.780224 4962 scope.go:117] "RemoveContainer" containerID="b8c45a2afce07209a956e277c7796fab60709250ab0d2e737f207d8293e6abac" Oct 03 13:16:37 crc kubenswrapper[4962]: E1003 13:16:37.788123 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8c45a2afce07209a956e277c7796fab60709250ab0d2e737f207d8293e6abac\": container with ID starting with b8c45a2afce07209a956e277c7796fab60709250ab0d2e737f207d8293e6abac not found: ID does not exist" containerID="b8c45a2afce07209a956e277c7796fab60709250ab0d2e737f207d8293e6abac" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.788168 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8c45a2afce07209a956e277c7796fab60709250ab0d2e737f207d8293e6abac"} err="failed to get container status \"b8c45a2afce07209a956e277c7796fab60709250ab0d2e737f207d8293e6abac\": rpc error: code = NotFound desc = could not find container \"b8c45a2afce07209a956e277c7796fab60709250ab0d2e737f207d8293e6abac\": container with ID starting with b8c45a2afce07209a956e277c7796fab60709250ab0d2e737f207d8293e6abac not found: ID does not exist" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.788196 4962 scope.go:117] "RemoveContainer" containerID="7ca803b6428733e9933bbbae21d5493fabe1c8a4711ed2063ab6748e8356642e" Oct 03 13:16:37 crc kubenswrapper[4962]: E1003 13:16:37.793603 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ca803b6428733e9933bbbae21d5493fabe1c8a4711ed2063ab6748e8356642e\": container with ID starting with 7ca803b6428733e9933bbbae21d5493fabe1c8a4711ed2063ab6748e8356642e not found: ID does not exist" containerID="7ca803b6428733e9933bbbae21d5493fabe1c8a4711ed2063ab6748e8356642e" Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.793703 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ca803b6428733e9933bbbae21d5493fabe1c8a4711ed2063ab6748e8356642e"} err="failed to get container status \"7ca803b6428733e9933bbbae21d5493fabe1c8a4711ed2063ab6748e8356642e\": rpc error: code = NotFound desc = could not find container \"7ca803b6428733e9933bbbae21d5493fabe1c8a4711ed2063ab6748e8356642e\": container with ID starting with 7ca803b6428733e9933bbbae21d5493fabe1c8a4711ed2063ab6748e8356642e not found: ID does not exist" Oct 03 13:16:37 crc 
kubenswrapper[4962]: E1003 13:16:37.815205 4962 secret.go:188] Couldn't get secret openstack/barbican-config-data: secret "barbican-config-data" not found Oct 03 13:16:37 crc kubenswrapper[4962]: E1003 13:16:37.815284 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ae87940-f07d-4213-bc0b-da0b3a2bba84-config-data podName:0ae87940-f07d-4213-bc0b-da0b3a2bba84 nodeName:}" failed. No retries permitted until 2025-10-03 13:16:45.815258227 +0000 UTC m=+1614.219156052 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/0ae87940-f07d-4213-bc0b-da0b3a2bba84-config-data") pod "barbican-worker-c97f5c65f-s279k" (UID: "0ae87940-f07d-4213-bc0b-da0b3a2bba84") : secret "barbican-config-data" not found Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.820918 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 13:16:37 crc kubenswrapper[4962]: I1003 13:16:37.844416 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 13:16:37 crc kubenswrapper[4962]: E1003 13:16:37.900967 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d519de371641e2951bd9f81ed67c53fa2f69a9d44a2a9b5275e2a6772663e005" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 03 13:16:37 crc kubenswrapper[4962]: E1003 13:16:37.906423 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d519de371641e2951bd9f81ed67c53fa2f69a9d44a2a9b5275e2a6772663e005" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 03 13:16:37 crc kubenswrapper[4962]: E1003 13:16:37.909377 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d519de371641e2951bd9f81ed67c53fa2f69a9d44a2a9b5275e2a6772663e005" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 03 13:16:37 crc kubenswrapper[4962]: E1003 13:16:37.909454 4962 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="5c876ef6-c8ab-44d1-9ba4-07f0b5e07695" containerName="ovn-northd" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.011893 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-58d67d97c8-pnjp8" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.017277 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="862ad9df-af58-4304-9ad5-7faba334e2d9" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.97:5671: connect: connection refused" Oct 03 13:16:38 crc kubenswrapper[4962]: E1003 13:16:38.021653 4962 secret.go:188] Couldn't get secret openstack/barbican-config-data: secret "barbican-config-data" not found Oct 03 13:16:38 crc kubenswrapper[4962]: E1003 13:16:38.022772 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-config-data podName:e6f0fc0a-ae8e-445e-ad05-591b7ab00886 nodeName:}" failed. No retries permitted until 2025-10-03 13:16:46.022751921 +0000 UTC m=+1614.426649756 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-config-data") pod "barbican-keystone-listener-59bf856dfd-t86xg" (UID: "e6f0fc0a-ae8e-445e-ad05-591b7ab00886") : secret "barbican-config-data" not found Oct 03 13:16:38 crc kubenswrapper[4962]: E1003 13:16:38.022871 4962 secret.go:188] Couldn't get secret openstack/barbican-keystone-listener-config-data: secret "barbican-keystone-listener-config-data" not found Oct 03 13:16:38 crc kubenswrapper[4962]: E1003 13:16:38.022904 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-config-data-custom podName:e6f0fc0a-ae8e-445e-ad05-591b7ab00886 nodeName:}" failed. No retries permitted until 2025-10-03 13:16:46.022892614 +0000 UTC m=+1614.426790449 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-config-data-custom") pod "barbican-keystone-listener-59bf856dfd-t86xg" (UID: "e6f0fc0a-ae8e-445e-ad05-591b7ab00886") : secret "barbican-keystone-listener-config-data" not found Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.040479 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.061153 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.078077 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.094134 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.122282 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a6cba65d-0ae5-4a81-88c1-da4e07d7a803-fernet-keys\") pod \"a6cba65d-0ae5-4a81-88c1-da4e07d7a803\" (UID: \"a6cba65d-0ae5-4a81-88c1-da4e07d7a803\") " Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.122357 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6cba65d-0ae5-4a81-88c1-da4e07d7a803-internal-tls-certs\") pod \"a6cba65d-0ae5-4a81-88c1-da4e07d7a803\" (UID: \"a6cba65d-0ae5-4a81-88c1-da4e07d7a803\") " Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.122538 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvkr2\" (UniqueName: \"kubernetes.io/projected/a6cba65d-0ae5-4a81-88c1-da4e07d7a803-kube-api-access-nvkr2\") pod \"a6cba65d-0ae5-4a81-88c1-da4e07d7a803\" (UID: \"a6cba65d-0ae5-4a81-88c1-da4e07d7a803\") " Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.122703 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6cba65d-0ae5-4a81-88c1-da4e07d7a803-combined-ca-bundle\") pod \"a6cba65d-0ae5-4a81-88c1-da4e07d7a803\" (UID: \"a6cba65d-0ae5-4a81-88c1-da4e07d7a803\") " Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.122733 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6cba65d-0ae5-4a81-88c1-da4e07d7a803-scripts\") pod \"a6cba65d-0ae5-4a81-88c1-da4e07d7a803\" (UID: \"a6cba65d-0ae5-4a81-88c1-da4e07d7a803\") " Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.122763 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a6cba65d-0ae5-4a81-88c1-da4e07d7a803-credential-keys\") pod \"a6cba65d-0ae5-4a81-88c1-da4e07d7a803\" (UID: \"a6cba65d-0ae5-4a81-88c1-da4e07d7a803\") " Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.123778 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6cba65d-0ae5-4a81-88c1-da4e07d7a803-public-tls-certs\") pod \"a6cba65d-0ae5-4a81-88c1-da4e07d7a803\" (UID: \"a6cba65d-0ae5-4a81-88c1-da4e07d7a803\") " Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.123828 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/a6cba65d-0ae5-4a81-88c1-da4e07d7a803-config-data\") pod \"a6cba65d-0ae5-4a81-88c1-da4e07d7a803\" (UID: \"a6cba65d-0ae5-4a81-88c1-da4e07d7a803\") " Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.134058 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6cba65d-0ae5-4a81-88c1-da4e07d7a803-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a6cba65d-0ae5-4a81-88c1-da4e07d7a803" (UID: "a6cba65d-0ae5-4a81-88c1-da4e07d7a803"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.134126 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell0b903-account-delete-dqf57"] Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.143912 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6cba65d-0ae5-4a81-88c1-da4e07d7a803-kube-api-access-nvkr2" (OuterVolumeSpecName: "kube-api-access-nvkr2") pod "a6cba65d-0ae5-4a81-88c1-da4e07d7a803" (UID: "a6cba65d-0ae5-4a81-88c1-da4e07d7a803"). InnerVolumeSpecName "kube-api-access-nvkr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.144849 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6cba65d-0ae5-4a81-88c1-da4e07d7a803-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "a6cba65d-0ae5-4a81-88c1-da4e07d7a803" (UID: "a6cba65d-0ae5-4a81-88c1-da4e07d7a803"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.146751 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6cba65d-0ae5-4a81-88c1-da4e07d7a803-scripts" (OuterVolumeSpecName: "scripts") pod "a6cba65d-0ae5-4a81-88c1-da4e07d7a803" (UID: "a6cba65d-0ae5-4a81-88c1-da4e07d7a803"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.147558 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novacell0b903-account-delete-dqf57"] Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.166242 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.172517 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.176381 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6cba65d-0ae5-4a81-88c1-da4e07d7a803-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a6cba65d-0ae5-4a81-88c1-da4e07d7a803" (UID: "a6cba65d-0ae5-4a81-88c1-da4e07d7a803"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.178772 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.200953 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.209669 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6cba65d-0ae5-4a81-88c1-da4e07d7a803-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a6cba65d-0ae5-4a81-88c1-da4e07d7a803" (UID: "a6cba65d-0ae5-4a81-88c1-da4e07d7a803"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.214262 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6cba65d-0ae5-4a81-88c1-da4e07d7a803-config-data" (OuterVolumeSpecName: "config-data") pod "a6cba65d-0ae5-4a81-88c1-da4e07d7a803" (UID: "a6cba65d-0ae5-4a81-88c1-da4e07d7a803"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.222449 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.226716 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6cba65d-0ae5-4a81-88c1-da4e07d7a803-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.226802 4962 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a6cba65d-0ae5-4a81-88c1-da4e07d7a803-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.226856 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvkr2\" (UniqueName: \"kubernetes.io/projected/a6cba65d-0ae5-4a81-88c1-da4e07d7a803-kube-api-access-nvkr2\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.226931 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6cba65d-0ae5-4a81-88c1-da4e07d7a803-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.227832 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6cba65d-0ae5-4a81-88c1-da4e07d7a803-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.228248 4962 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a6cba65d-0ae5-4a81-88c1-da4e07d7a803-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.228277 4962 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6cba65d-0ae5-4a81-88c1-da4e07d7a803-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.255184 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05f2e935-e9b5-49ab-8a2a-30b15840bae9" path="/var/lib/kubelet/pods/05f2e935-e9b5-49ab-8a2a-30b15840bae9/volumes" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 
13:16:38.256249 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ecb3944-c441-4879-8220-aa32d7436c1f" path="/var/lib/kubelet/pods/2ecb3944-c441-4879-8220-aa32d7436c1f/volumes" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.262320 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56923e91-36c0-432d-8042-138d2e89eb3b" path="/var/lib/kubelet/pods/56923e91-36c0-432d-8042-138d2e89eb3b/volumes" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.262543 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6cba65d-0ae5-4a81-88c1-da4e07d7a803-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a6cba65d-0ae5-4a81-88c1-da4e07d7a803" (UID: "a6cba65d-0ae5-4a81-88c1-da4e07d7a803"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.266340 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ae29e17-1d99-4401-a317-9c8b7be58a3c" path="/var/lib/kubelet/pods/6ae29e17-1d99-4401-a317-9c8b7be58a3c/volumes" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.267169 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72df0792-9904-4b64-9c70-37cb982fe24b" path="/var/lib/kubelet/pods/72df0792-9904-4b64-9c70-37cb982fe24b/volumes" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.267769 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3fb0456-394e-4041-829b-57c162966b2b" path="/var/lib/kubelet/pods/a3fb0456-394e-4041-829b-57c162966b2b/volumes" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.295848 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cea3d32c-24c3-4a80-a1fb-ad65be7bbba6" path="/var/lib/kubelet/pods/cea3d32c-24c3-4a80-a1fb-ad65be7bbba6/volumes" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.296371 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d22955d6-a957-458f-8181-5fea18cedc90" path="/var/lib/kubelet/pods/d22955d6-a957-458f-8181-5fea18cedc90/volumes" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.296807 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e09f26ad-247c-477a-9d73-a2a0f8df91e8" path="/var/lib/kubelet/pods/e09f26ad-247c-477a-9d73-a2a0f8df91e8/volumes" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.298402 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.298431 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.299677 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.307202 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6456949cf6-r4n9q"] Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.323675 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-6456949cf6-r4n9q"] Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.329835 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance2f72-account-delete-wdbvb"] Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.331419 4962 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a6cba65d-0ae5-4a81-88c1-da4e07d7a803-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.345355 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance2f72-account-delete-wdbvb"] Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.362426 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6b7fd754f4-rx9k9"] Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.362500 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6b7fd754f4-rx9k9"] Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.363312 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement453e-account-delete-v68p5"] Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.369689 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement453e-account-delete-v68p5"] Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.377725 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.385015 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.387011 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-cell1-novncproxy-0" podUID="85710c21-98fe-4148-8ef1-ec9f4e9ef311" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"https://10.217.0.194:6080/vnc_lite.html\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.390313 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.392827 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.399988 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.406855 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.411791 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.416542 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.426144 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 13:16:38 crc kubenswrapper[4962]: E1003 13:16:38.432608 4962 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 03 13:16:38 crc kubenswrapper[4962]: E1003 13:16:38.432707 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/862ad9df-af58-4304-9ad5-7faba334e2d9-config-data podName:862ad9df-af58-4304-9ad5-7faba334e2d9 nodeName:}" failed. No retries permitted until 2025-10-03 13:16:46.432684032 +0000 UTC m=+1614.836581867 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/862ad9df-af58-4304-9ad5-7faba334e2d9-config-data") pod "rabbitmq-server-0" (UID: "862ad9df-af58-4304-9ad5-7faba334e2d9") : configmap "rabbitmq-config-data" not found Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.442741 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder4662-account-delete-kl2sp"] Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.447312 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder4662-account-delete-kl2sp"] Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.460510 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-bd6989694-qnv2s"] Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.468982 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-bd6989694-qnv2s"] Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.533761 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/221bdd26-0fec-49e5-86ec-c2aefe7a5902-rabbitmq-erlang-cookie\") pod \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\" (UID: \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\") " Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.533825 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/221bdd26-0fec-49e5-86ec-c2aefe7a5902-erlang-cookie-secret\") pod \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\" (UID: \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\") " Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.533864 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/221bdd26-0fec-49e5-86ec-c2aefe7a5902-rabbitmq-plugins\") pod \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\" (UID: \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\") " Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.533921 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\" (UID: \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\") " Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.533984 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqtcv\" (UniqueName: \"kubernetes.io/projected/221bdd26-0fec-49e5-86ec-c2aefe7a5902-kube-api-access-lqtcv\") pod \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\" (UID: \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\") " Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.534020 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/221bdd26-0fec-49e5-86ec-c2aefe7a5902-plugins-conf\") pod \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\" (UID: \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\") " Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.534165 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/221bdd26-0fec-49e5-86ec-c2aefe7a5902-server-conf\") pod \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\" (UID: \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\") " Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.535319 4962 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/221bdd26-0fec-49e5-86ec-c2aefe7a5902-config-data\") pod \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\" (UID: \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\") " Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.535570 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/221bdd26-0fec-49e5-86ec-c2aefe7a5902-rabbitmq-tls\") pod \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\" (UID: \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\") " Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.535602 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/221bdd26-0fec-49e5-86ec-c2aefe7a5902-pod-info\") pod \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\" (UID: \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\") " Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.535661 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/221bdd26-0fec-49e5-86ec-c2aefe7a5902-rabbitmq-confd\") pod \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\" (UID: \"221bdd26-0fec-49e5-86ec-c2aefe7a5902\") " Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.535943 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/221bdd26-0fec-49e5-86ec-c2aefe7a5902-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "221bdd26-0fec-49e5-86ec-c2aefe7a5902" (UID: "221bdd26-0fec-49e5-86ec-c2aefe7a5902"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.535996 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/221bdd26-0fec-49e5-86ec-c2aefe7a5902-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "221bdd26-0fec-49e5-86ec-c2aefe7a5902" (UID: "221bdd26-0fec-49e5-86ec-c2aefe7a5902"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.537432 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/221bdd26-0fec-49e5-86ec-c2aefe7a5902-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "221bdd26-0fec-49e5-86ec-c2aefe7a5902" (UID: "221bdd26-0fec-49e5-86ec-c2aefe7a5902"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.537825 4962 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/221bdd26-0fec-49e5-86ec-c2aefe7a5902-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.537849 4962 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/221bdd26-0fec-49e5-86ec-c2aefe7a5902-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.537862 4962 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/221bdd26-0fec-49e5-86ec-c2aefe7a5902-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.540449 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/221bdd26-0fec-49e5-86ec-c2aefe7a5902-pod-info" (OuterVolumeSpecName: "pod-info") pod "221bdd26-0fec-49e5-86ec-c2aefe7a5902" (UID: "221bdd26-0fec-49e5-86ec-c2aefe7a5902"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.542687 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/221bdd26-0fec-49e5-86ec-c2aefe7a5902-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "221bdd26-0fec-49e5-86ec-c2aefe7a5902" (UID: "221bdd26-0fec-49e5-86ec-c2aefe7a5902"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.550870 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/221bdd26-0fec-49e5-86ec-c2aefe7a5902-kube-api-access-lqtcv" (OuterVolumeSpecName: "kube-api-access-lqtcv") pod "221bdd26-0fec-49e5-86ec-c2aefe7a5902" (UID: "221bdd26-0fec-49e5-86ec-c2aefe7a5902"). InnerVolumeSpecName "kube-api-access-lqtcv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.550981 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "221bdd26-0fec-49e5-86ec-c2aefe7a5902" (UID: "221bdd26-0fec-49e5-86ec-c2aefe7a5902"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.560471 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/221bdd26-0fec-49e5-86ec-c2aefe7a5902-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "221bdd26-0fec-49e5-86ec-c2aefe7a5902" (UID: "221bdd26-0fec-49e5-86ec-c2aefe7a5902"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.577852 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/221bdd26-0fec-49e5-86ec-c2aefe7a5902-config-data" (OuterVolumeSpecName: "config-data") pod "221bdd26-0fec-49e5-86ec-c2aefe7a5902" (UID: "221bdd26-0fec-49e5-86ec-c2aefe7a5902"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.596896 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/221bdd26-0fec-49e5-86ec-c2aefe7a5902-server-conf" (OuterVolumeSpecName: "server-conf") pod "221bdd26-0fec-49e5-86ec-c2aefe7a5902" (UID: "221bdd26-0fec-49e5-86ec-c2aefe7a5902"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.639498 4962 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/221bdd26-0fec-49e5-86ec-c2aefe7a5902-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.639535 4962 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/221bdd26-0fec-49e5-86ec-c2aefe7a5902-pod-info\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.639549 4962 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/221bdd26-0fec-49e5-86ec-c2aefe7a5902-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.639574 4962 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.639630 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqtcv\" (UniqueName: \"kubernetes.io/projected/221bdd26-0fec-49e5-86ec-c2aefe7a5902-kube-api-access-lqtcv\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.639658 4962 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/221bdd26-0fec-49e5-86ec-c2aefe7a5902-server-conf\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.639667 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/221bdd26-0fec-49e5-86ec-c2aefe7a5902-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.641768 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_5c876ef6-c8ab-44d1-9ba4-07f0b5e07695/ovn-northd/0.log" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.641809 4962 generic.go:334] "Generic (PLEG): container finished" podID="5c876ef6-c8ab-44d1-9ba4-07f0b5e07695" containerID="d519de371641e2951bd9f81ed67c53fa2f69a9d44a2a9b5275e2a6772663e005" exitCode=139 Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.641869 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"5c876ef6-c8ab-44d1-9ba4-07f0b5e07695","Type":"ContainerDied","Data":"d519de371641e2951bd9f81ed67c53fa2f69a9d44a2a9b5275e2a6772663e005"} Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.645244 4962 generic.go:334] "Generic (PLEG): container finished" podID="221bdd26-0fec-49e5-86ec-c2aefe7a5902" containerID="ea53a4ccfd30918132162b300d37c963556517040abcef111b31c53d24dd2493" exitCode=0 Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.645286 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"221bdd26-0fec-49e5-86ec-c2aefe7a5902","Type":"ContainerDied","Data":"ea53a4ccfd30918132162b300d37c963556517040abcef111b31c53d24dd2493"} Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.645303 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"221bdd26-0fec-49e5-86ec-c2aefe7a5902","Type":"ContainerDied","Data":"8cd05986b0cc552f2360426afb7b5dcd08e94c8b1adb6a4e2542cae4b84a372b"} Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.645319 4962 scope.go:117] "RemoveContainer" containerID="ea53a4ccfd30918132162b300d37c963556517040abcef111b31c53d24dd2493" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.645431 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.645830 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/221bdd26-0fec-49e5-86ec-c2aefe7a5902-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "221bdd26-0fec-49e5-86ec-c2aefe7a5902" (UID: "221bdd26-0fec-49e5-86ec-c2aefe7a5902"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.650445 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-58d67d97c8-pnjp8" event={"ID":"a6cba65d-0ae5-4a81-88c1-da4e07d7a803","Type":"ContainerDied","Data":"840fbb208720d4586b8c6a97e5617af496deb903b6b103ec0a9614b97aeeaeb9"} Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.650471 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-58d67d97c8-pnjp8" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.652386 4962 generic.go:334] "Generic (PLEG): container finished" podID="862ad9df-af58-4304-9ad5-7faba334e2d9" containerID="ffd50edd39ffcd28008bcc1779cfab62afd89dc665b9322ffc084495ca1c56d2" exitCode=0 Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.652432 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"862ad9df-af58-4304-9ad5-7faba334e2d9","Type":"ContainerDied","Data":"ffd50edd39ffcd28008bcc1779cfab62afd89dc665b9322ffc084495ca1c56d2"} Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.675446 4962 scope.go:117] "RemoveContainer" containerID="d83453301b65612ecfcc0cbeb8e61c9a2152a509e6b989048522d0b4d9e6955b" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.691599 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-58d67d97c8-pnjp8"] Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.696084 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-58d67d97c8-pnjp8"] Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.697383 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.699300 4962 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.700430 4962 scope.go:117] "RemoveContainer" containerID="ea53a4ccfd30918132162b300d37c963556517040abcef111b31c53d24dd2493" Oct 03 13:16:38 crc kubenswrapper[4962]: E1003 13:16:38.700691 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea53a4ccfd30918132162b300d37c963556517040abcef111b31c53d24dd2493\": container with ID starting with ea53a4ccfd30918132162b300d37c963556517040abcef111b31c53d24dd2493 not found: ID does not exist" containerID="ea53a4ccfd30918132162b300d37c963556517040abcef111b31c53d24dd2493" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.700717 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea53a4ccfd30918132162b300d37c963556517040abcef111b31c53d24dd2493"} err="failed to get container status \"ea53a4ccfd30918132162b300d37c963556517040abcef111b31c53d24dd2493\": rpc error: code = NotFound desc = could not find container \"ea53a4ccfd30918132162b300d37c963556517040abcef111b31c53d24dd2493\": container with ID starting with ea53a4ccfd30918132162b300d37c963556517040abcef111b31c53d24dd2493 not found: ID does not exist" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.700734 4962 scope.go:117] "RemoveContainer" containerID="d83453301b65612ecfcc0cbeb8e61c9a2152a509e6b989048522d0b4d9e6955b" Oct 03 13:16:38 crc kubenswrapper[4962]: E1003 13:16:38.700995 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d83453301b65612ecfcc0cbeb8e61c9a2152a509e6b989048522d0b4d9e6955b\": container with ID starting with d83453301b65612ecfcc0cbeb8e61c9a2152a509e6b989048522d0b4d9e6955b not found: ID does not exist" containerID="d83453301b65612ecfcc0cbeb8e61c9a2152a509e6b989048522d0b4d9e6955b" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.701040 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d83453301b65612ecfcc0cbeb8e61c9a2152a509e6b989048522d0b4d9e6955b"} err="failed to get container status \"d83453301b65612ecfcc0cbeb8e61c9a2152a509e6b989048522d0b4d9e6955b\": rpc error: code = NotFound desc = could not find container \"d83453301b65612ecfcc0cbeb8e61c9a2152a509e6b989048522d0b4d9e6955b\": container with ID starting with d83453301b65612ecfcc0cbeb8e61c9a2152a509e6b989048522d0b4d9e6955b not found: ID does not exist" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.701067 4962 scope.go:117] "RemoveContainer" containerID="3108a39b0723e787d2db8b185df6254591ba7ffd8691d08c951e273ac8405e51" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.753714 4962 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/221bdd26-0fec-49e5-86ec-c2aefe7a5902-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.753739 4962 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:38 crc kubenswrapper[4962]: E1003 13:16:38.811447 4962 handlers.go:78] "Exec lifecycle 
hook for Container in Pod failed" err=< Oct 03 13:16:38 crc kubenswrapper[4962]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2025-10-03T13:16:31Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Oct 03 13:16:38 crc kubenswrapper[4962]: /etc/init.d/functions: line 589: 379 Alarm clock "$@" Oct 03 13:16:38 crc kubenswrapper[4962]: > execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-6sqdm" message=< Oct 03 13:16:38 crc kubenswrapper[4962]: Exiting ovn-controller (1) [FAILED] Oct 03 13:16:38 crc kubenswrapper[4962]: Killing ovn-controller (1) [ OK ] Oct 03 13:16:38 crc kubenswrapper[4962]: Killing ovn-controller (1) with SIGKILL [ OK ] Oct 03 13:16:38 crc kubenswrapper[4962]: 2025-10-03T13:16:31Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Oct 03 13:16:38 crc kubenswrapper[4962]: /etc/init.d/functions: line 589: 379 Alarm clock "$@" Oct 03 13:16:38 crc kubenswrapper[4962]: > Oct 03 13:16:38 crc kubenswrapper[4962]: E1003 13:16:38.811488 4962 kuberuntime_container.go:691] "PreStop hook failed" err=< Oct 03 13:16:38 crc kubenswrapper[4962]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2025-10-03T13:16:31Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Oct 03 13:16:38 crc kubenswrapper[4962]: /etc/init.d/functions: line 589: 379 Alarm clock "$@" Oct 03 13:16:38 crc kubenswrapper[4962]: > pod="openstack/ovn-controller-6sqdm" podUID="6d6f62dd-0720-46b6-b0a8-497490f052a8" containerName="ovn-controller" containerID="cri-o://16fa61add98dec43a09d289dd22332e5367c8fb9a1453e95321b160b326d1d12" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.811523 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-6sqdm" podUID="6d6f62dd-0720-46b6-b0a8-497490f052a8" containerName="ovn-controller" containerID="cri-o://16fa61add98dec43a09d289dd22332e5367c8fb9a1453e95321b160b326d1d12" gracePeriod=22 Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.855158 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/862ad9df-af58-4304-9ad5-7faba334e2d9-rabbitmq-erlang-cookie\") pod \"862ad9df-af58-4304-9ad5-7faba334e2d9\" (UID: \"862ad9df-af58-4304-9ad5-7faba334e2d9\") " Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.855239 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/862ad9df-af58-4304-9ad5-7faba334e2d9-rabbitmq-plugins\") pod \"862ad9df-af58-4304-9ad5-7faba334e2d9\" (UID: \"862ad9df-af58-4304-9ad5-7faba334e2d9\") " Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.855320 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/862ad9df-af58-4304-9ad5-7faba334e2d9-erlang-cookie-secret\") pod \"862ad9df-af58-4304-9ad5-7faba334e2d9\" (UID: \"862ad9df-af58-4304-9ad5-7faba334e2d9\") " Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.855365 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/862ad9df-af58-4304-9ad5-7faba334e2d9-server-conf\") pod \"862ad9df-af58-4304-9ad5-7faba334e2d9\" (UID: \"862ad9df-af58-4304-9ad5-7faba334e2d9\") " Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.855421 4962 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/862ad9df-af58-4304-9ad5-7faba334e2d9-pod-info\") pod \"862ad9df-af58-4304-9ad5-7faba334e2d9\" (UID: \"862ad9df-af58-4304-9ad5-7faba334e2d9\") " Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.855575 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/862ad9df-af58-4304-9ad5-7faba334e2d9-config-data\") pod \"862ad9df-af58-4304-9ad5-7faba334e2d9\" (UID: \"862ad9df-af58-4304-9ad5-7faba334e2d9\") " Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.855670 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"862ad9df-af58-4304-9ad5-7faba334e2d9\" (UID: \"862ad9df-af58-4304-9ad5-7faba334e2d9\") " Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.855712 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/862ad9df-af58-4304-9ad5-7faba334e2d9-plugins-conf\") pod \"862ad9df-af58-4304-9ad5-7faba334e2d9\" (UID: \"862ad9df-af58-4304-9ad5-7faba334e2d9\") " Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.855775 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wb9gx\" (UniqueName: \"kubernetes.io/projected/862ad9df-af58-4304-9ad5-7faba334e2d9-kube-api-access-wb9gx\") pod \"862ad9df-af58-4304-9ad5-7faba334e2d9\" (UID: \"862ad9df-af58-4304-9ad5-7faba334e2d9\") " Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.855819 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/862ad9df-af58-4304-9ad5-7faba334e2d9-rabbitmq-confd\") pod \"862ad9df-af58-4304-9ad5-7faba334e2d9\" (UID: \"862ad9df-af58-4304-9ad5-7faba334e2d9\") " Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.855843 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/862ad9df-af58-4304-9ad5-7faba334e2d9-rabbitmq-tls\") pod \"862ad9df-af58-4304-9ad5-7faba334e2d9\" (UID: \"862ad9df-af58-4304-9ad5-7faba334e2d9\") " Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.855836 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/862ad9df-af58-4304-9ad5-7faba334e2d9-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "862ad9df-af58-4304-9ad5-7faba334e2d9" (UID: "862ad9df-af58-4304-9ad5-7faba334e2d9"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.856296 4962 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/862ad9df-af58-4304-9ad5-7faba334e2d9-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.857195 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/862ad9df-af58-4304-9ad5-7faba334e2d9-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "862ad9df-af58-4304-9ad5-7faba334e2d9" (UID: "862ad9df-af58-4304-9ad5-7faba334e2d9"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.857463 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/862ad9df-af58-4304-9ad5-7faba334e2d9-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "862ad9df-af58-4304-9ad5-7faba334e2d9" (UID: "862ad9df-af58-4304-9ad5-7faba334e2d9"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.860833 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "862ad9df-af58-4304-9ad5-7faba334e2d9" (UID: "862ad9df-af58-4304-9ad5-7faba334e2d9"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.861021 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/862ad9df-af58-4304-9ad5-7faba334e2d9-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "862ad9df-af58-4304-9ad5-7faba334e2d9" (UID: "862ad9df-af58-4304-9ad5-7faba334e2d9"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.862116 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/862ad9df-af58-4304-9ad5-7faba334e2d9-kube-api-access-wb9gx" (OuterVolumeSpecName: "kube-api-access-wb9gx") pod "862ad9df-af58-4304-9ad5-7faba334e2d9" (UID: "862ad9df-af58-4304-9ad5-7faba334e2d9"). InnerVolumeSpecName "kube-api-access-wb9gx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.863005 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/862ad9df-af58-4304-9ad5-7faba334e2d9-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "862ad9df-af58-4304-9ad5-7faba334e2d9" (UID: "862ad9df-af58-4304-9ad5-7faba334e2d9"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.863249 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/862ad9df-af58-4304-9ad5-7faba334e2d9-pod-info" (OuterVolumeSpecName: "pod-info") pod "862ad9df-af58-4304-9ad5-7faba334e2d9" (UID: "862ad9df-af58-4304-9ad5-7faba334e2d9"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.878240 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/862ad9df-af58-4304-9ad5-7faba334e2d9-config-data" (OuterVolumeSpecName: "config-data") pod "862ad9df-af58-4304-9ad5-7faba334e2d9" (UID: "862ad9df-af58-4304-9ad5-7faba334e2d9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.897965 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/862ad9df-af58-4304-9ad5-7faba334e2d9-server-conf" (OuterVolumeSpecName: "server-conf") pod "862ad9df-af58-4304-9ad5-7faba334e2d9" (UID: "862ad9df-af58-4304-9ad5-7faba334e2d9"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.958346 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wb9gx\" (UniqueName: \"kubernetes.io/projected/862ad9df-af58-4304-9ad5-7faba334e2d9-kube-api-access-wb9gx\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.958371 4962 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/862ad9df-af58-4304-9ad5-7faba334e2d9-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.958385 4962 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/862ad9df-af58-4304-9ad5-7faba334e2d9-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.958399 4962 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/862ad9df-af58-4304-9ad5-7faba334e2d9-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.958410 4962 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/862ad9df-af58-4304-9ad5-7faba334e2d9-server-conf\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.958420 4962 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/862ad9df-af58-4304-9ad5-7faba334e2d9-pod-info\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.958430 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/862ad9df-af58-4304-9ad5-7faba334e2d9-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.958453 4962 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.958465 4962 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/862ad9df-af58-4304-9ad5-7faba334e2d9-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.960813 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/862ad9df-af58-4304-9ad5-7faba334e2d9-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "862ad9df-af58-4304-9ad5-7faba334e2d9" (UID: "862ad9df-af58-4304-9ad5-7faba334e2d9"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:16:38 crc kubenswrapper[4962]: I1003 13:16:38.977970 4962 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.059710 4962 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/862ad9df-af58-4304-9ad5-7faba334e2d9-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.059750 4962 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.079050 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.093610 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.174189 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-6sqdm_6d6f62dd-0720-46b6-b0a8-497490f052a8/ovn-controller/0.log" Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.174259 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6sqdm" Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.262928 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d6f62dd-0720-46b6-b0a8-497490f052a8-combined-ca-bundle\") pod \"6d6f62dd-0720-46b6-b0a8-497490f052a8\" (UID: \"6d6f62dd-0720-46b6-b0a8-497490f052a8\") " Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.262975 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nw7hs\" (UniqueName: \"kubernetes.io/projected/6d6f62dd-0720-46b6-b0a8-497490f052a8-kube-api-access-nw7hs\") pod \"6d6f62dd-0720-46b6-b0a8-497490f052a8\" (UID: \"6d6f62dd-0720-46b6-b0a8-497490f052a8\") " Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.262992 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6d6f62dd-0720-46b6-b0a8-497490f052a8-var-log-ovn\") pod \"6d6f62dd-0720-46b6-b0a8-497490f052a8\" (UID: \"6d6f62dd-0720-46b6-b0a8-497490f052a8\") " Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.263016 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d6f62dd-0720-46b6-b0a8-497490f052a8-scripts\") pod \"6d6f62dd-0720-46b6-b0a8-497490f052a8\" (UID: \"6d6f62dd-0720-46b6-b0a8-497490f052a8\") " Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.263039 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6d6f62dd-0720-46b6-b0a8-497490f052a8-var-run-ovn\") pod \"6d6f62dd-0720-46b6-b0a8-497490f052a8\" (UID: \"6d6f62dd-0720-46b6-b0a8-497490f052a8\") " Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.263130 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6d6f62dd-0720-46b6-b0a8-497490f052a8-var-run\") pod 
\"6d6f62dd-0720-46b6-b0a8-497490f052a8\" (UID: \"6d6f62dd-0720-46b6-b0a8-497490f052a8\") " Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.263172 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d6f62dd-0720-46b6-b0a8-497490f052a8-ovn-controller-tls-certs\") pod \"6d6f62dd-0720-46b6-b0a8-497490f052a8\" (UID: \"6d6f62dd-0720-46b6-b0a8-497490f052a8\") " Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.263612 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6d6f62dd-0720-46b6-b0a8-497490f052a8-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "6d6f62dd-0720-46b6-b0a8-497490f052a8" (UID: "6d6f62dd-0720-46b6-b0a8-497490f052a8"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.264080 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6d6f62dd-0720-46b6-b0a8-497490f052a8-var-run" (OuterVolumeSpecName: "var-run") pod "6d6f62dd-0720-46b6-b0a8-497490f052a8" (UID: "6d6f62dd-0720-46b6-b0a8-497490f052a8"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.264156 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6d6f62dd-0720-46b6-b0a8-497490f052a8-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "6d6f62dd-0720-46b6-b0a8-497490f052a8" (UID: "6d6f62dd-0720-46b6-b0a8-497490f052a8"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.264794 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d6f62dd-0720-46b6-b0a8-497490f052a8-scripts" (OuterVolumeSpecName: "scripts") pod "6d6f62dd-0720-46b6-b0a8-497490f052a8" (UID: "6d6f62dd-0720-46b6-b0a8-497490f052a8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.266850 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d6f62dd-0720-46b6-b0a8-497490f052a8-kube-api-access-nw7hs" (OuterVolumeSpecName: "kube-api-access-nw7hs") pod "6d6f62dd-0720-46b6-b0a8-497490f052a8" (UID: "6d6f62dd-0720-46b6-b0a8-497490f052a8"). InnerVolumeSpecName "kube-api-access-nw7hs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.283437 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_5c876ef6-c8ab-44d1-9ba4-07f0b5e07695/ovn-northd/0.log" Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.283522 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.287030 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d6f62dd-0720-46b6-b0a8-497490f052a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d6f62dd-0720-46b6-b0a8-497490f052a8" (UID: "6d6f62dd-0720-46b6-b0a8-497490f052a8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.325199 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d6f62dd-0720-46b6-b0a8-497490f052a8-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "6d6f62dd-0720-46b6-b0a8-497490f052a8" (UID: "6d6f62dd-0720-46b6-b0a8-497490f052a8"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.364896 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5c876ef6-c8ab-44d1-9ba4-07f0b5e07695-scripts\") pod \"5c876ef6-c8ab-44d1-9ba4-07f0b5e07695\" (UID: \"5c876ef6-c8ab-44d1-9ba4-07f0b5e07695\") " Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.365505 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c876ef6-c8ab-44d1-9ba4-07f0b5e07695-scripts" (OuterVolumeSpecName: "scripts") pod "5c876ef6-c8ab-44d1-9ba4-07f0b5e07695" (UID: "5c876ef6-c8ab-44d1-9ba4-07f0b5e07695"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.365562 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92txr\" (UniqueName: \"kubernetes.io/projected/5c876ef6-c8ab-44d1-9ba4-07f0b5e07695-kube-api-access-92txr\") pod \"5c876ef6-c8ab-44d1-9ba4-07f0b5e07695\" (UID: \"5c876ef6-c8ab-44d1-9ba4-07f0b5e07695\") " Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.365580 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5c876ef6-c8ab-44d1-9ba4-07f0b5e07695-ovn-rundir\") pod \"5c876ef6-c8ab-44d1-9ba4-07f0b5e07695\" (UID: \"5c876ef6-c8ab-44d1-9ba4-07f0b5e07695\") " Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.365643 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c876ef6-c8ab-44d1-9ba4-07f0b5e07695-combined-ca-bundle\") pod \"5c876ef6-c8ab-44d1-9ba4-07f0b5e07695\" (UID: \"5c876ef6-c8ab-44d1-9ba4-07f0b5e07695\") " Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.366211 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c876ef6-c8ab-44d1-9ba4-07f0b5e07695-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "5c876ef6-c8ab-44d1-9ba4-07f0b5e07695" (UID: "5c876ef6-c8ab-44d1-9ba4-07f0b5e07695"). InnerVolumeSpecName "ovn-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.366387 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c876ef6-c8ab-44d1-9ba4-07f0b5e07695-metrics-certs-tls-certs\") pod \"5c876ef6-c8ab-44d1-9ba4-07f0b5e07695\" (UID: \"5c876ef6-c8ab-44d1-9ba4-07f0b5e07695\") " Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.366433 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c876ef6-c8ab-44d1-9ba4-07f0b5e07695-ovn-northd-tls-certs\") pod \"5c876ef6-c8ab-44d1-9ba4-07f0b5e07695\" (UID: \"5c876ef6-c8ab-44d1-9ba4-07f0b5e07695\") " Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.366476 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c876ef6-c8ab-44d1-9ba4-07f0b5e07695-config\") pod \"5c876ef6-c8ab-44d1-9ba4-07f0b5e07695\" (UID: \"5c876ef6-c8ab-44d1-9ba4-07f0b5e07695\") " Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.366966 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c876ef6-c8ab-44d1-9ba4-07f0b5e07695-config" (OuterVolumeSpecName: "config") pod "5c876ef6-c8ab-44d1-9ba4-07f0b5e07695" (UID: "5c876ef6-c8ab-44d1-9ba4-07f0b5e07695"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.368873 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nw7hs\" (UniqueName: \"kubernetes.io/projected/6d6f62dd-0720-46b6-b0a8-497490f052a8-kube-api-access-nw7hs\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.368906 4962 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6d6f62dd-0720-46b6-b0a8-497490f052a8-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.368919 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d6f62dd-0720-46b6-b0a8-497490f052a8-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.368930 4962 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6d6f62dd-0720-46b6-b0a8-497490f052a8-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.368941 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5c876ef6-c8ab-44d1-9ba4-07f0b5e07695-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.368951 4962 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5c876ef6-c8ab-44d1-9ba4-07f0b5e07695-ovn-rundir\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.368962 4962 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6d6f62dd-0720-46b6-b0a8-497490f052a8-var-run\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.368973 4962 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6d6f62dd-0720-46b6-b0a8-497490f052a8-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.368985 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c876ef6-c8ab-44d1-9ba4-07f0b5e07695-config\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.368996 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d6f62dd-0720-46b6-b0a8-497490f052a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.369818 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c876ef6-c8ab-44d1-9ba4-07f0b5e07695-kube-api-access-92txr" (OuterVolumeSpecName: "kube-api-access-92txr") pod "5c876ef6-c8ab-44d1-9ba4-07f0b5e07695" (UID: "5c876ef6-c8ab-44d1-9ba4-07f0b5e07695"). InnerVolumeSpecName "kube-api-access-92txr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.388628 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c876ef6-c8ab-44d1-9ba4-07f0b5e07695-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c876ef6-c8ab-44d1-9ba4-07f0b5e07695" (UID: "5c876ef6-c8ab-44d1-9ba4-07f0b5e07695"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.425146 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c876ef6-c8ab-44d1-9ba4-07f0b5e07695-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "5c876ef6-c8ab-44d1-9ba4-07f0b5e07695" (UID: "5c876ef6-c8ab-44d1-9ba4-07f0b5e07695"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.427306 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c876ef6-c8ab-44d1-9ba4-07f0b5e07695-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "5c876ef6-c8ab-44d1-9ba4-07f0b5e07695" (UID: "5c876ef6-c8ab-44d1-9ba4-07f0b5e07695"). InnerVolumeSpecName "ovn-northd-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.480553 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c876ef6-c8ab-44d1-9ba4-07f0b5e07695-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.480598 4962 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c876ef6-c8ab-44d1-9ba4-07f0b5e07695-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.480612 4962 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c876ef6-c8ab-44d1-9ba4-07f0b5e07695-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.480623 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92txr\" (UniqueName: \"kubernetes.io/projected/5c876ef6-c8ab-44d1-9ba4-07f0b5e07695-kube-api-access-92txr\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.662052 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_5c876ef6-c8ab-44d1-9ba4-07f0b5e07695/ovn-northd/0.log" Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.662518 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.662585 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"5c876ef6-c8ab-44d1-9ba4-07f0b5e07695","Type":"ContainerDied","Data":"d084860c719d4d830c5f8eaa5e004185d65925304682ec6cc942fe2e264fc4f9"} Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.662783 4962 scope.go:117] "RemoveContainer" containerID="c10da2ac06df5b8b854f495ec36dfbffd1281d8e886e7d01348cf8b99da08700" Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.669904 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"862ad9df-af58-4304-9ad5-7faba334e2d9","Type":"ContainerDied","Data":"3e73ccf3edd6cc4ebb1304265e6f6697b7c09c7e42268054a0ce04cc42476b09"} Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.669924 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.671315 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-6sqdm_6d6f62dd-0720-46b6-b0a8-497490f052a8/ovn-controller/0.log" Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.671363 4962 generic.go:334] "Generic (PLEG): container finished" podID="6d6f62dd-0720-46b6-b0a8-497490f052a8" containerID="16fa61add98dec43a09d289dd22332e5367c8fb9a1453e95321b160b326d1d12" exitCode=137 Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.671392 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6sqdm" event={"ID":"6d6f62dd-0720-46b6-b0a8-497490f052a8","Type":"ContainerDied","Data":"16fa61add98dec43a09d289dd22332e5367c8fb9a1453e95321b160b326d1d12"} Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.671414 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6sqdm" event={"ID":"6d6f62dd-0720-46b6-b0a8-497490f052a8","Type":"ContainerDied","Data":"2323181f6574e10b42ffdba92dc7bb0bae9b27ba783b4d5f5147922b762c9b9b"} Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.671457 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6sqdm" Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.698142 4962 scope.go:117] "RemoveContainer" containerID="d519de371641e2951bd9f81ed67c53fa2f69a9d44a2a9b5275e2a6772663e005" Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.699558 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.711569 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.722108 4962 scope.go:117] "RemoveContainer" containerID="ffd50edd39ffcd28008bcc1779cfab62afd89dc665b9322ffc084495ca1c56d2" Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.726318 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.734346 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.738833 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-6sqdm"] Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.743844 4962 scope.go:117] "RemoveContainer" containerID="db6803eb436ec4541ae5c4083b22b4f72e755e3175f11d228ed92e5f3fa9bc0c" Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.744154 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-6sqdm"] Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.760652 4962 scope.go:117] "RemoveContainer" containerID="16fa61add98dec43a09d289dd22332e5367c8fb9a1453e95321b160b326d1d12" Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.785783 4962 scope.go:117] "RemoveContainer" containerID="16fa61add98dec43a09d289dd22332e5367c8fb9a1453e95321b160b326d1d12" Oct 03 13:16:39 crc kubenswrapper[4962]: E1003 13:16:39.786422 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16fa61add98dec43a09d289dd22332e5367c8fb9a1453e95321b160b326d1d12\": container with ID starting with 16fa61add98dec43a09d289dd22332e5367c8fb9a1453e95321b160b326d1d12 not found: ID does not exist" 
containerID="16fa61add98dec43a09d289dd22332e5367c8fb9a1453e95321b160b326d1d12" Oct 03 13:16:39 crc kubenswrapper[4962]: I1003 13:16:39.786455 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16fa61add98dec43a09d289dd22332e5367c8fb9a1453e95321b160b326d1d12"} err="failed to get container status \"16fa61add98dec43a09d289dd22332e5367c8fb9a1453e95321b160b326d1d12\": rpc error: code = NotFound desc = could not find container \"16fa61add98dec43a09d289dd22332e5367c8fb9a1453e95321b160b326d1d12\": container with ID starting with 16fa61add98dec43a09d289dd22332e5367c8fb9a1453e95321b160b326d1d12 not found: ID does not exist" Oct 03 13:16:40 crc kubenswrapper[4962]: I1003 13:16:40.240208 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1289d443-56d2-4f63-8802-66bcd0569b3b" path="/var/lib/kubelet/pods/1289d443-56d2-4f63-8802-66bcd0569b3b/volumes" Oct 03 13:16:40 crc kubenswrapper[4962]: I1003 13:16:40.241337 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="221bdd26-0fec-49e5-86ec-c2aefe7a5902" path="/var/lib/kubelet/pods/221bdd26-0fec-49e5-86ec-c2aefe7a5902/volumes" Oct 03 13:16:40 crc kubenswrapper[4962]: I1003 13:16:40.242177 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32e6592a-d206-4931-aa99-a84e041b05e4" path="/var/lib/kubelet/pods/32e6592a-d206-4931-aa99-a84e041b05e4/volumes" Oct 03 13:16:40 crc kubenswrapper[4962]: I1003 13:16:40.243483 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c111271-43ed-48b3-b6ed-a6d02efb9113" path="/var/lib/kubelet/pods/3c111271-43ed-48b3-b6ed-a6d02efb9113/volumes" Oct 03 13:16:40 crc kubenswrapper[4962]: I1003 13:16:40.244363 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c876ef6-c8ab-44d1-9ba4-07f0b5e07695" path="/var/lib/kubelet/pods/5c876ef6-c8ab-44d1-9ba4-07f0b5e07695/volumes" Oct 03 13:16:40 crc kubenswrapper[4962]: I1003 13:16:40.245994 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d6f62dd-0720-46b6-b0a8-497490f052a8" path="/var/lib/kubelet/pods/6d6f62dd-0720-46b6-b0a8-497490f052a8/volumes" Oct 03 13:16:40 crc kubenswrapper[4962]: I1003 13:16:40.246708 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85ea0653-966b-47ff-b8aa-b6ad2b5810ca" path="/var/lib/kubelet/pods/85ea0653-966b-47ff-b8aa-b6ad2b5810ca/volumes" Oct 03 13:16:40 crc kubenswrapper[4962]: I1003 13:16:40.247575 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="862ad9df-af58-4304-9ad5-7faba334e2d9" path="/var/lib/kubelet/pods/862ad9df-af58-4304-9ad5-7faba334e2d9/volumes" Oct 03 13:16:40 crc kubenswrapper[4962]: I1003 13:16:40.248938 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e098e6f-ec3b-41e6-b179-6c196ad1fe49" path="/var/lib/kubelet/pods/8e098e6f-ec3b-41e6-b179-6c196ad1fe49/volumes" Oct 03 13:16:40 crc kubenswrapper[4962]: I1003 13:16:40.249553 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6cba65d-0ae5-4a81-88c1-da4e07d7a803" path="/var/lib/kubelet/pods/a6cba65d-0ae5-4a81-88c1-da4e07d7a803/volumes" Oct 03 13:16:40 crc kubenswrapper[4962]: I1003 13:16:40.250458 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0da1427-1e89-42d6-beb2-55f292945177" path="/var/lib/kubelet/pods/b0da1427-1e89-42d6-beb2-55f292945177/volumes" Oct 03 13:16:40 crc kubenswrapper[4962]: I1003 13:16:40.251901 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="b42a368b-6dd4-4bb0-83a8-d79138605ec9" path="/var/lib/kubelet/pods/b42a368b-6dd4-4bb0-83a8-d79138605ec9/volumes" Oct 03 13:16:40 crc kubenswrapper[4962]: I1003 13:16:40.252722 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c56a1d1d-7e30-4bb8-a5a7-068afc055cb8" path="/var/lib/kubelet/pods/c56a1d1d-7e30-4bb8-a5a7-068afc055cb8/volumes" Oct 03 13:16:40 crc kubenswrapper[4962]: I1003 13:16:40.253407 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfeca1b1-fa87-4490-9e99-38e60d421138" path="/var/lib/kubelet/pods/cfeca1b1-fa87-4490-9e99-38e60d421138/volumes" Oct 03 13:16:40 crc kubenswrapper[4962]: I1003 13:16:40.254743 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d329c4da-aa05-4c80-ab30-622eac56428a" path="/var/lib/kubelet/pods/d329c4da-aa05-4c80-ab30-622eac56428a/volumes" Oct 03 13:16:40 crc kubenswrapper[4962]: I1003 13:16:40.255549 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d36308a0-1b17-4986-adb2-2833b444a239" path="/var/lib/kubelet/pods/d36308a0-1b17-4986-adb2-2833b444a239/volumes" Oct 03 13:16:40 crc kubenswrapper[4962]: I1003 13:16:40.256605 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd269d6d-5aa2-43c0-a23b-e76b52699d59" path="/var/lib/kubelet/pods/dd269d6d-5aa2-43c0-a23b-e76b52699d59/volumes" Oct 03 13:16:40 crc kubenswrapper[4962]: I1003 13:16:40.691940 4962 generic.go:334] "Generic (PLEG): container finished" podID="15ad9c69-05d8-4b75-82cc-f23f6303d7d7" containerID="0daf5bb19acb882ee4245cd2958c71e3acf61abcdee9f27764c1a937ef9e54d3" exitCode=0 Oct 03 13:16:40 crc kubenswrapper[4962]: I1003 13:16:40.691993 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15ad9c69-05d8-4b75-82cc-f23f6303d7d7","Type":"ContainerDied","Data":"0daf5bb19acb882ee4245cd2958c71e3acf61abcdee9f27764c1a937ef9e54d3"} Oct 03 13:16:40 crc kubenswrapper[4962]: I1003 13:16:40.859679 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="6ae29e17-1d99-4401-a317-9c8b7be58a3c" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.166:8776/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 03 13:16:41 crc kubenswrapper[4962]: I1003 13:16:41.113413 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/memcached-0" podUID="32e6592a-d206-4931-aa99-a84e041b05e4" containerName="memcached" probeResult="failure" output="dial tcp 10.217.0.101:11211: i/o timeout" Oct 03 13:16:41 crc kubenswrapper[4962]: I1003 13:16:41.151152 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 13:16:41 crc kubenswrapper[4962]: I1003 13:16:41.313450 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22hzv\" (UniqueName: \"kubernetes.io/projected/15ad9c69-05d8-4b75-82cc-f23f6303d7d7-kube-api-access-22hzv\") pod \"15ad9c69-05d8-4b75-82cc-f23f6303d7d7\" (UID: \"15ad9c69-05d8-4b75-82cc-f23f6303d7d7\") " Oct 03 13:16:41 crc kubenswrapper[4962]: I1003 13:16:41.313516 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15ad9c69-05d8-4b75-82cc-f23f6303d7d7-combined-ca-bundle\") pod \"15ad9c69-05d8-4b75-82cc-f23f6303d7d7\" (UID: \"15ad9c69-05d8-4b75-82cc-f23f6303d7d7\") " Oct 03 13:16:41 crc kubenswrapper[4962]: I1003 13:16:41.313550 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15ad9c69-05d8-4b75-82cc-f23f6303d7d7-scripts\") pod \"15ad9c69-05d8-4b75-82cc-f23f6303d7d7\" (UID: \"15ad9c69-05d8-4b75-82cc-f23f6303d7d7\") " Oct 03 13:16:41 crc kubenswrapper[4962]: I1003 13:16:41.313575 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15ad9c69-05d8-4b75-82cc-f23f6303d7d7-log-httpd\") pod \"15ad9c69-05d8-4b75-82cc-f23f6303d7d7\" (UID: \"15ad9c69-05d8-4b75-82cc-f23f6303d7d7\") " Oct 03 13:16:41 crc kubenswrapper[4962]: I1003 13:16:41.314320 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15ad9c69-05d8-4b75-82cc-f23f6303d7d7-run-httpd\") pod \"15ad9c69-05d8-4b75-82cc-f23f6303d7d7\" (UID: \"15ad9c69-05d8-4b75-82cc-f23f6303d7d7\") " Oct 03 13:16:41 crc kubenswrapper[4962]: I1003 13:16:41.314354 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15ad9c69-05d8-4b75-82cc-f23f6303d7d7-config-data\") pod \"15ad9c69-05d8-4b75-82cc-f23f6303d7d7\" (UID: \"15ad9c69-05d8-4b75-82cc-f23f6303d7d7\") " Oct 03 13:16:41 crc kubenswrapper[4962]: I1003 13:16:41.314376 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/15ad9c69-05d8-4b75-82cc-f23f6303d7d7-ceilometer-tls-certs\") pod \"15ad9c69-05d8-4b75-82cc-f23f6303d7d7\" (UID: \"15ad9c69-05d8-4b75-82cc-f23f6303d7d7\") " Oct 03 13:16:41 crc kubenswrapper[4962]: I1003 13:16:41.314405 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/15ad9c69-05d8-4b75-82cc-f23f6303d7d7-sg-core-conf-yaml\") pod \"15ad9c69-05d8-4b75-82cc-f23f6303d7d7\" (UID: \"15ad9c69-05d8-4b75-82cc-f23f6303d7d7\") " Oct 03 13:16:41 crc kubenswrapper[4962]: I1003 13:16:41.314526 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15ad9c69-05d8-4b75-82cc-f23f6303d7d7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "15ad9c69-05d8-4b75-82cc-f23f6303d7d7" (UID: "15ad9c69-05d8-4b75-82cc-f23f6303d7d7"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:16:41 crc kubenswrapper[4962]: I1003 13:16:41.314615 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15ad9c69-05d8-4b75-82cc-f23f6303d7d7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "15ad9c69-05d8-4b75-82cc-f23f6303d7d7" (UID: "15ad9c69-05d8-4b75-82cc-f23f6303d7d7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:16:41 crc kubenswrapper[4962]: I1003 13:16:41.315089 4962 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15ad9c69-05d8-4b75-82cc-f23f6303d7d7-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:41 crc kubenswrapper[4962]: I1003 13:16:41.315114 4962 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15ad9c69-05d8-4b75-82cc-f23f6303d7d7-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:41 crc kubenswrapper[4962]: I1003 13:16:41.319278 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15ad9c69-05d8-4b75-82cc-f23f6303d7d7-scripts" (OuterVolumeSpecName: "scripts") pod "15ad9c69-05d8-4b75-82cc-f23f6303d7d7" (UID: "15ad9c69-05d8-4b75-82cc-f23f6303d7d7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:41 crc kubenswrapper[4962]: I1003 13:16:41.324789 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15ad9c69-05d8-4b75-82cc-f23f6303d7d7-kube-api-access-22hzv" (OuterVolumeSpecName: "kube-api-access-22hzv") pod "15ad9c69-05d8-4b75-82cc-f23f6303d7d7" (UID: "15ad9c69-05d8-4b75-82cc-f23f6303d7d7"). InnerVolumeSpecName "kube-api-access-22hzv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:16:41 crc kubenswrapper[4962]: I1003 13:16:41.336934 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15ad9c69-05d8-4b75-82cc-f23f6303d7d7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "15ad9c69-05d8-4b75-82cc-f23f6303d7d7" (UID: "15ad9c69-05d8-4b75-82cc-f23f6303d7d7"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:41 crc kubenswrapper[4962]: I1003 13:16:41.355815 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15ad9c69-05d8-4b75-82cc-f23f6303d7d7-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "15ad9c69-05d8-4b75-82cc-f23f6303d7d7" (UID: "15ad9c69-05d8-4b75-82cc-f23f6303d7d7"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:41 crc kubenswrapper[4962]: I1003 13:16:41.372213 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15ad9c69-05d8-4b75-82cc-f23f6303d7d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "15ad9c69-05d8-4b75-82cc-f23f6303d7d7" (UID: "15ad9c69-05d8-4b75-82cc-f23f6303d7d7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:41 crc kubenswrapper[4962]: E1003 13:16:41.378685 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9 is running failed: container process not found" containerID="34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 13:16:41 crc kubenswrapper[4962]: E1003 13:16:41.379100 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9 is running failed: container process not found" containerID="34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 13:16:41 crc kubenswrapper[4962]: E1003 13:16:41.379836 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9 is running failed: container process not found" containerID="34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 13:16:41 crc kubenswrapper[4962]: E1003 13:16:41.379874 4962 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-wvjpm" podUID="7cb4dab0-1ffc-49d4-a229-1862a33d4caa" containerName="ovsdb-server" Oct 03 13:16:41 crc kubenswrapper[4962]: E1003 13:16:41.381839 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="923838580de863a9a099f0b6d817f70b752e252d832275cca5df8e77ca46e3ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 13:16:41 crc kubenswrapper[4962]: E1003 13:16:41.383365 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="923838580de863a9a099f0b6d817f70b752e252d832275cca5df8e77ca46e3ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 13:16:41 crc kubenswrapper[4962]: E1003 13:16:41.387107 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="923838580de863a9a099f0b6d817f70b752e252d832275cca5df8e77ca46e3ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 13:16:41 crc kubenswrapper[4962]: E1003 13:16:41.387165 4962 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-wvjpm" podUID="7cb4dab0-1ffc-49d4-a229-1862a33d4caa" containerName="ovs-vswitchd" Oct 03 13:16:41 crc kubenswrapper[4962]: 
I1003 13:16:41.405915 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15ad9c69-05d8-4b75-82cc-f23f6303d7d7-config-data" (OuterVolumeSpecName: "config-data") pod "15ad9c69-05d8-4b75-82cc-f23f6303d7d7" (UID: "15ad9c69-05d8-4b75-82cc-f23f6303d7d7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:41 crc kubenswrapper[4962]: I1003 13:16:41.416837 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22hzv\" (UniqueName: \"kubernetes.io/projected/15ad9c69-05d8-4b75-82cc-f23f6303d7d7-kube-api-access-22hzv\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:41 crc kubenswrapper[4962]: I1003 13:16:41.417040 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15ad9c69-05d8-4b75-82cc-f23f6303d7d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:41 crc kubenswrapper[4962]: I1003 13:16:41.417098 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15ad9c69-05d8-4b75-82cc-f23f6303d7d7-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:41 crc kubenswrapper[4962]: I1003 13:16:41.417156 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15ad9c69-05d8-4b75-82cc-f23f6303d7d7-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:41 crc kubenswrapper[4962]: I1003 13:16:41.417208 4962 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/15ad9c69-05d8-4b75-82cc-f23f6303d7d7-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:41 crc kubenswrapper[4962]: I1003 13:16:41.417263 4962 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/15ad9c69-05d8-4b75-82cc-f23f6303d7d7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:41 crc kubenswrapper[4962]: I1003 13:16:41.706591 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15ad9c69-05d8-4b75-82cc-f23f6303d7d7","Type":"ContainerDied","Data":"c94483489b7c42395b9d1b98c978e40f89fa9feba64d23ec0422a827060d8dac"} Oct 03 13:16:41 crc kubenswrapper[4962]: I1003 13:16:41.706700 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 13:16:41 crc kubenswrapper[4962]: I1003 13:16:41.706742 4962 scope.go:117] "RemoveContainer" containerID="cb12fdcf72cb818439ed6c57d7c01f490985bcb4351a8ce3800533ddd1e0259f" Oct 03 13:16:41 crc kubenswrapper[4962]: I1003 13:16:41.748955 4962 scope.go:117] "RemoveContainer" containerID="f257bc79eebd262ca3fa0048575136a5f530237c8f848c61fcbe3df34711993b" Oct 03 13:16:41 crc kubenswrapper[4962]: I1003 13:16:41.757691 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 13:16:41 crc kubenswrapper[4962]: I1003 13:16:41.764342 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 13:16:41 crc kubenswrapper[4962]: I1003 13:16:41.770559 4962 scope.go:117] "RemoveContainer" containerID="0daf5bb19acb882ee4245cd2958c71e3acf61abcdee9f27764c1a937ef9e54d3" Oct 03 13:16:41 crc kubenswrapper[4962]: I1003 13:16:41.793335 4962 scope.go:117] "RemoveContainer" containerID="5a1f2f720e928bb6b6acb1c0a85af470d0667de88a2c7df54ede75a00de60204" Oct 03 13:16:42 crc kubenswrapper[4962]: I1003 13:16:42.233216 4962 scope.go:117] "RemoveContainer" containerID="a5bec38aec5fb9d5911cd2ecc57792eefe10d43cef3b74d0517cf307d832bc5c" Oct 03 13:16:42 crc kubenswrapper[4962]: E1003 13:16:42.233516 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:16:42 crc kubenswrapper[4962]: I1003 13:16:42.238272 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15ad9c69-05d8-4b75-82cc-f23f6303d7d7" path="/var/lib/kubelet/pods/15ad9c69-05d8-4b75-82cc-f23f6303d7d7/volumes" Oct 03 13:16:43 crc kubenswrapper[4962]: I1003 13:16:43.295023 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="221bdd26-0fec-49e5-86ec-c2aefe7a5902" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.98:5671: i/o timeout" Oct 03 13:16:45 crc kubenswrapper[4962]: I1003 13:16:45.487797 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-config-data\") pod \"barbican-api-7c44799d88-mmmm6\" (UID: \"dab0e7ec-9c64-491d-a655-027098042378\") " pod="openstack/barbican-api-7c44799d88-mmmm6" Oct 03 13:16:45 crc kubenswrapper[4962]: E1003 13:16:45.488490 4962 secret.go:188] Couldn't get secret openstack/barbican-config-data: secret "barbican-config-data" not found Oct 03 13:16:45 crc kubenswrapper[4962]: E1003 13:16:45.488598 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-config-data podName:dab0e7ec-9c64-491d-a655-027098042378 nodeName:}" failed. No retries permitted until 2025-10-03 13:17:01.488569043 +0000 UTC m=+1629.892466938 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-config-data") pod "barbican-api-7c44799d88-mmmm6" (UID: "dab0e7ec-9c64-491d-a655-027098042378") : secret "barbican-config-data" not found Oct 03 13:16:45 crc kubenswrapper[4962]: I1003 13:16:45.590018 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bx2f\" (UniqueName: \"kubernetes.io/projected/dab0e7ec-9c64-491d-a655-027098042378-kube-api-access-9bx2f\") pod \"barbican-api-7c44799d88-mmmm6\" (UID: \"dab0e7ec-9c64-491d-a655-027098042378\") " pod="openstack/barbican-api-7c44799d88-mmmm6" Oct 03 13:16:45 crc kubenswrapper[4962]: E1003 13:16:45.593484 4962 projected.go:194] Error preparing data for projected volume kube-api-access-9bx2f for pod openstack/barbican-api-7c44799d88-mmmm6: failed to fetch token: serviceaccounts "barbican-barbican" not found Oct 03 13:16:45 crc kubenswrapper[4962]: E1003 13:16:45.593550 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dab0e7ec-9c64-491d-a655-027098042378-kube-api-access-9bx2f podName:dab0e7ec-9c64-491d-a655-027098042378 nodeName:}" failed. No retries permitted until 2025-10-03 13:17:01.593535262 +0000 UTC m=+1629.997433097 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-9bx2f" (UniqueName: "kubernetes.io/projected/dab0e7ec-9c64-491d-a655-027098042378-kube-api-access-9bx2f") pod "barbican-api-7c44799d88-mmmm6" (UID: "dab0e7ec-9c64-491d-a655-027098042378") : failed to fetch token: serviceaccounts "barbican-barbican" not found Oct 03 13:16:45 crc kubenswrapper[4962]: E1003 13:16:45.898482 4962 secret.go:188] Couldn't get secret openstack/barbican-config-data: secret "barbican-config-data" not found Oct 03 13:16:45 crc kubenswrapper[4962]: E1003 13:16:45.898561 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ae87940-f07d-4213-bc0b-da0b3a2bba84-config-data podName:0ae87940-f07d-4213-bc0b-da0b3a2bba84 nodeName:}" failed. No retries permitted until 2025-10-03 13:17:01.898545495 +0000 UTC m=+1630.302443330 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/0ae87940-f07d-4213-bc0b-da0b3a2bba84-config-data") pod "barbican-worker-c97f5c65f-s279k" (UID: "0ae87940-f07d-4213-bc0b-da0b3a2bba84") : secret "barbican-config-data" not found Oct 03 13:16:46 crc kubenswrapper[4962]: E1003 13:16:46.101762 4962 secret.go:188] Couldn't get secret openstack/barbican-config-data: secret "barbican-config-data" not found Oct 03 13:16:46 crc kubenswrapper[4962]: E1003 13:16:46.101803 4962 secret.go:188] Couldn't get secret openstack/barbican-keystone-listener-config-data: secret "barbican-keystone-listener-config-data" not found Oct 03 13:16:46 crc kubenswrapper[4962]: E1003 13:16:46.101860 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-config-data podName:e6f0fc0a-ae8e-445e-ad05-591b7ab00886 nodeName:}" failed. No retries permitted until 2025-10-03 13:17:02.101839005 +0000 UTC m=+1630.505736850 (durationBeforeRetry 16s). 
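
Both failures in this stretch come from the kubelet waiting on API objects that do not exist yet: the secret "barbican-config-data" and the serviceaccount "barbican-barbican" (the projected kube-api-access-9bx2f volume needs a token issued for that serviceaccount). The mounts cannot succeed until something creates those objects; the kubelet just keeps retrying. A quick existence check from outside the node, sketched with client-go; the kubeconfig path is a placeholder, and the check itself is an assumed debugging step, not part of the log:

    package main

    import (
        "context"
        "fmt"

        apierrors "k8s.io/apimachinery/pkg/api/errors"
        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig") // placeholder path
        if err != nil {
            panic(err)
        }
        cs, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            panic(err)
        }
        ctx := context.Background()
        // The kubelet errors above name these two objects in the openstack namespace.
        _, err = cs.CoreV1().Secrets("openstack").Get(ctx, "barbican-config-data", metav1.GetOptions{})
        fmt.Println("secret barbican-config-data missing:", apierrors.IsNotFound(err))
        _, err = cs.CoreV1().ServiceAccounts("openstack").Get(ctx, "barbican-barbican", metav1.GetOptions{})
        fmt.Println("serviceaccount barbican-barbican missing:", apierrors.IsNotFound(err))
    }
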
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-config-data") pod "barbican-keystone-listener-59bf856dfd-t86xg" (UID: "e6f0fc0a-ae8e-445e-ad05-591b7ab00886") : secret "barbican-config-data" not found Oct 03 13:16:46 crc kubenswrapper[4962]: E1003 13:16:46.101885 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-config-data-custom podName:e6f0fc0a-ae8e-445e-ad05-591b7ab00886 nodeName:}" failed. No retries permitted until 2025-10-03 13:17:02.101874356 +0000 UTC m=+1630.505772201 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-config-data-custom") pod "barbican-keystone-listener-59bf856dfd-t86xg" (UID: "e6f0fc0a-ae8e-445e-ad05-591b7ab00886") : secret "barbican-keystone-listener-config-data" not found Oct 03 13:16:46 crc kubenswrapper[4962]: E1003 13:16:46.377713 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9 is running failed: container process not found" containerID="34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 13:16:46 crc kubenswrapper[4962]: E1003 13:16:46.378332 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9 is running failed: container process not found" containerID="34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 13:16:46 crc kubenswrapper[4962]: E1003 13:16:46.379147 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9 is running failed: container process not found" containerID="34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 13:16:46 crc kubenswrapper[4962]: E1003 13:16:46.379197 4962 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-wvjpm" podUID="7cb4dab0-1ffc-49d4-a229-1862a33d4caa" containerName="ovsdb-server" Oct 03 13:16:46 crc kubenswrapper[4962]: E1003 13:16:46.379868 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="923838580de863a9a099f0b6d817f70b752e252d832275cca5df8e77ca46e3ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 13:16:46 crc kubenswrapper[4962]: E1003 13:16:46.382628 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="923838580de863a9a099f0b6d817f70b752e252d832275cca5df8e77ca46e3ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 13:16:46 crc kubenswrapper[4962]: E1003 13:16:46.386542 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="923838580de863a9a099f0b6d817f70b752e252d832275cca5df8e77ca46e3ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 13:16:46 crc kubenswrapper[4962]: E1003 13:16:46.386622 4962 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-wvjpm" podUID="7cb4dab0-1ffc-49d4-a229-1862a33d4caa" containerName="ovs-vswitchd" Oct 03 13:16:47 crc kubenswrapper[4962]: I1003 13:16:47.660936 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5f745c6cff-9rkw7" Oct 03 13:16:47 crc kubenswrapper[4962]: I1003 13:16:47.769460 4962 generic.go:334] "Generic (PLEG): container finished" podID="40dc7e17-4436-4452-a266-65d57a67779d" containerID="262e24f9113e8184b611ad4bd820a4085b8b793192569d8c58e7e70a54d8433c" exitCode=0 Oct 03 13:16:47 crc kubenswrapper[4962]: I1003 13:16:47.769521 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5f745c6cff-9rkw7" Oct 03 13:16:47 crc kubenswrapper[4962]: I1003 13:16:47.769541 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5f745c6cff-9rkw7" event={"ID":"40dc7e17-4436-4452-a266-65d57a67779d","Type":"ContainerDied","Data":"262e24f9113e8184b611ad4bd820a4085b8b793192569d8c58e7e70a54d8433c"} Oct 03 13:16:47 crc kubenswrapper[4962]: I1003 13:16:47.769888 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5f745c6cff-9rkw7" event={"ID":"40dc7e17-4436-4452-a266-65d57a67779d","Type":"ContainerDied","Data":"0b998c5e68a1ddb1aa3aa627a98d80ddd737d19e65fa0642ea22a117e443ffec"} Oct 03 13:16:47 crc kubenswrapper[4962]: I1003 13:16:47.769908 4962 scope.go:117] "RemoveContainer" containerID="5d3d1dc44ccbb08890a3ce1b240bf10ed759ba813d70174df4fde7b16fbc8eff" Oct 03 13:16:47 crc kubenswrapper[4962]: I1003 13:16:47.789277 4962 scope.go:117] "RemoveContainer" containerID="262e24f9113e8184b611ad4bd820a4085b8b793192569d8c58e7e70a54d8433c" Oct 03 13:16:47 crc kubenswrapper[4962]: I1003 13:16:47.807592 4962 scope.go:117] "RemoveContainer" containerID="5d3d1dc44ccbb08890a3ce1b240bf10ed759ba813d70174df4fde7b16fbc8eff" Oct 03 13:16:47 crc kubenswrapper[4962]: E1003 13:16:47.808171 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d3d1dc44ccbb08890a3ce1b240bf10ed759ba813d70174df4fde7b16fbc8eff\": container with ID starting with 5d3d1dc44ccbb08890a3ce1b240bf10ed759ba813d70174df4fde7b16fbc8eff not found: ID does not exist" containerID="5d3d1dc44ccbb08890a3ce1b240bf10ed759ba813d70174df4fde7b16fbc8eff" Oct 03 13:16:47 crc kubenswrapper[4962]: I1003 13:16:47.808208 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d3d1dc44ccbb08890a3ce1b240bf10ed759ba813d70174df4fde7b16fbc8eff"} err="failed to get container status \"5d3d1dc44ccbb08890a3ce1b240bf10ed759ba813d70174df4fde7b16fbc8eff\": rpc error: code = NotFound desc = could not find 
container \"5d3d1dc44ccbb08890a3ce1b240bf10ed759ba813d70174df4fde7b16fbc8eff\": container with ID starting with 5d3d1dc44ccbb08890a3ce1b240bf10ed759ba813d70174df4fde7b16fbc8eff not found: ID does not exist" Oct 03 13:16:47 crc kubenswrapper[4962]: I1003 13:16:47.808234 4962 scope.go:117] "RemoveContainer" containerID="262e24f9113e8184b611ad4bd820a4085b8b793192569d8c58e7e70a54d8433c" Oct 03 13:16:47 crc kubenswrapper[4962]: E1003 13:16:47.808600 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"262e24f9113e8184b611ad4bd820a4085b8b793192569d8c58e7e70a54d8433c\": container with ID starting with 262e24f9113e8184b611ad4bd820a4085b8b793192569d8c58e7e70a54d8433c not found: ID does not exist" containerID="262e24f9113e8184b611ad4bd820a4085b8b793192569d8c58e7e70a54d8433c" Oct 03 13:16:47 crc kubenswrapper[4962]: I1003 13:16:47.808625 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"262e24f9113e8184b611ad4bd820a4085b8b793192569d8c58e7e70a54d8433c"} err="failed to get container status \"262e24f9113e8184b611ad4bd820a4085b8b793192569d8c58e7e70a54d8433c\": rpc error: code = NotFound desc = could not find container \"262e24f9113e8184b611ad4bd820a4085b8b793192569d8c58e7e70a54d8433c\": container with ID starting with 262e24f9113e8184b611ad4bd820a4085b8b793192569d8c58e7e70a54d8433c not found: ID does not exist" Oct 03 13:16:47 crc kubenswrapper[4962]: I1003 13:16:47.828757 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/40dc7e17-4436-4452-a266-65d57a67779d-internal-tls-certs\") pod \"40dc7e17-4436-4452-a266-65d57a67779d\" (UID: \"40dc7e17-4436-4452-a266-65d57a67779d\") " Oct 03 13:16:47 crc kubenswrapper[4962]: I1003 13:16:47.828822 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/40dc7e17-4436-4452-a266-65d57a67779d-config\") pod \"40dc7e17-4436-4452-a266-65d57a67779d\" (UID: \"40dc7e17-4436-4452-a266-65d57a67779d\") " Oct 03 13:16:47 crc kubenswrapper[4962]: I1003 13:16:47.828862 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dm8xw\" (UniqueName: \"kubernetes.io/projected/40dc7e17-4436-4452-a266-65d57a67779d-kube-api-access-dm8xw\") pod \"40dc7e17-4436-4452-a266-65d57a67779d\" (UID: \"40dc7e17-4436-4452-a266-65d57a67779d\") " Oct 03 13:16:47 crc kubenswrapper[4962]: I1003 13:16:47.828899 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/40dc7e17-4436-4452-a266-65d57a67779d-public-tls-certs\") pod \"40dc7e17-4436-4452-a266-65d57a67779d\" (UID: \"40dc7e17-4436-4452-a266-65d57a67779d\") " Oct 03 13:16:47 crc kubenswrapper[4962]: I1003 13:16:47.829813 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40dc7e17-4436-4452-a266-65d57a67779d-combined-ca-bundle\") pod \"40dc7e17-4436-4452-a266-65d57a67779d\" (UID: \"40dc7e17-4436-4452-a266-65d57a67779d\") " Oct 03 13:16:47 crc kubenswrapper[4962]: I1003 13:16:47.829854 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/40dc7e17-4436-4452-a266-65d57a67779d-httpd-config\") pod \"40dc7e17-4436-4452-a266-65d57a67779d\" (UID: 
\"40dc7e17-4436-4452-a266-65d57a67779d\") " Oct 03 13:16:47 crc kubenswrapper[4962]: I1003 13:16:47.829972 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/40dc7e17-4436-4452-a266-65d57a67779d-ovndb-tls-certs\") pod \"40dc7e17-4436-4452-a266-65d57a67779d\" (UID: \"40dc7e17-4436-4452-a266-65d57a67779d\") " Oct 03 13:16:47 crc kubenswrapper[4962]: I1003 13:16:47.841792 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40dc7e17-4436-4452-a266-65d57a67779d-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "40dc7e17-4436-4452-a266-65d57a67779d" (UID: "40dc7e17-4436-4452-a266-65d57a67779d"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:47 crc kubenswrapper[4962]: I1003 13:16:47.842813 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40dc7e17-4436-4452-a266-65d57a67779d-kube-api-access-dm8xw" (OuterVolumeSpecName: "kube-api-access-dm8xw") pod "40dc7e17-4436-4452-a266-65d57a67779d" (UID: "40dc7e17-4436-4452-a266-65d57a67779d"). InnerVolumeSpecName "kube-api-access-dm8xw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:16:47 crc kubenswrapper[4962]: I1003 13:16:47.865216 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40dc7e17-4436-4452-a266-65d57a67779d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "40dc7e17-4436-4452-a266-65d57a67779d" (UID: "40dc7e17-4436-4452-a266-65d57a67779d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:47 crc kubenswrapper[4962]: I1003 13:16:47.870954 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40dc7e17-4436-4452-a266-65d57a67779d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "40dc7e17-4436-4452-a266-65d57a67779d" (UID: "40dc7e17-4436-4452-a266-65d57a67779d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:47 crc kubenswrapper[4962]: I1003 13:16:47.874306 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40dc7e17-4436-4452-a266-65d57a67779d-config" (OuterVolumeSpecName: "config") pod "40dc7e17-4436-4452-a266-65d57a67779d" (UID: "40dc7e17-4436-4452-a266-65d57a67779d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:47 crc kubenswrapper[4962]: I1003 13:16:47.893160 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40dc7e17-4436-4452-a266-65d57a67779d-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "40dc7e17-4436-4452-a266-65d57a67779d" (UID: "40dc7e17-4436-4452-a266-65d57a67779d"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:47 crc kubenswrapper[4962]: I1003 13:16:47.897578 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40dc7e17-4436-4452-a266-65d57a67779d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "40dc7e17-4436-4452-a266-65d57a67779d" (UID: "40dc7e17-4436-4452-a266-65d57a67779d"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:16:47 crc kubenswrapper[4962]: I1003 13:16:47.931579 4962 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/40dc7e17-4436-4452-a266-65d57a67779d-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:47 crc kubenswrapper[4962]: I1003 13:16:47.931616 4962 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/40dc7e17-4436-4452-a266-65d57a67779d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:47 crc kubenswrapper[4962]: I1003 13:16:47.931628 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/40dc7e17-4436-4452-a266-65d57a67779d-config\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:47 crc kubenswrapper[4962]: I1003 13:16:47.931642 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dm8xw\" (UniqueName: \"kubernetes.io/projected/40dc7e17-4436-4452-a266-65d57a67779d-kube-api-access-dm8xw\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:47 crc kubenswrapper[4962]: I1003 13:16:47.931671 4962 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/40dc7e17-4436-4452-a266-65d57a67779d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:47 crc kubenswrapper[4962]: I1003 13:16:47.931681 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40dc7e17-4436-4452-a266-65d57a67779d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:47 crc kubenswrapper[4962]: I1003 13:16:47.931692 4962 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/40dc7e17-4436-4452-a266-65d57a67779d-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 03 13:16:48 crc kubenswrapper[4962]: I1003 13:16:48.100170 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5f745c6cff-9rkw7"] Oct 03 13:16:48 crc kubenswrapper[4962]: I1003 13:16:48.105136 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5f745c6cff-9rkw7"] Oct 03 13:16:48 crc kubenswrapper[4962]: I1003 13:16:48.237011 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40dc7e17-4436-4452-a266-65d57a67779d" path="/var/lib/kubelet/pods/40dc7e17-4436-4452-a266-65d57a67779d/volumes" Oct 03 13:16:51 crc kubenswrapper[4962]: E1003 13:16:51.377483 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9 is running failed: container process not found" containerID="34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 13:16:51 crc kubenswrapper[4962]: E1003 13:16:51.378907 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9 is running failed: container process not found" containerID="34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 13:16:51 crc kubenswrapper[4962]: E1003 13:16:51.380071 4962 log.go:32] "ExecSync cmd from runtime 
service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="923838580de863a9a099f0b6d817f70b752e252d832275cca5df8e77ca46e3ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 13:16:51 crc kubenswrapper[4962]: E1003 13:16:51.380081 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9 is running failed: container process not found" containerID="34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 13:16:51 crc kubenswrapper[4962]: E1003 13:16:51.380144 4962 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-wvjpm" podUID="7cb4dab0-1ffc-49d4-a229-1862a33d4caa" containerName="ovsdb-server" Oct 03 13:16:51 crc kubenswrapper[4962]: E1003 13:16:51.382270 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="923838580de863a9a099f0b6d817f70b752e252d832275cca5df8e77ca46e3ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 13:16:51 crc kubenswrapper[4962]: E1003 13:16:51.384535 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="923838580de863a9a099f0b6d817f70b752e252d832275cca5df8e77ca46e3ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 13:16:51 crc kubenswrapper[4962]: E1003 13:16:51.384613 4962 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-wvjpm" podUID="7cb4dab0-1ffc-49d4-a229-1862a33d4caa" containerName="ovs-vswitchd" Oct 03 13:16:55 crc kubenswrapper[4962]: I1003 13:16:55.228891 4962 scope.go:117] "RemoveContainer" containerID="a5bec38aec5fb9d5911cd2ecc57792eefe10d43cef3b74d0517cf307d832bc5c" Oct 03 13:16:55 crc kubenswrapper[4962]: E1003 13:16:55.230276 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:16:56 crc kubenswrapper[4962]: E1003 13:16:56.376796 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9 is running failed: container process not found" containerID="34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 
13:16:56 crc kubenswrapper[4962]: E1003 13:16:56.377947 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9 is running failed: container process not found" containerID="34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 13:16:56 crc kubenswrapper[4962]: E1003 13:16:56.378269 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9 is running failed: container process not found" containerID="34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 13:16:56 crc kubenswrapper[4962]: E1003 13:16:56.378350 4962 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-wvjpm" podUID="7cb4dab0-1ffc-49d4-a229-1862a33d4caa" containerName="ovsdb-server" Oct 03 13:16:56 crc kubenswrapper[4962]: E1003 13:16:56.378872 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="923838580de863a9a099f0b6d817f70b752e252d832275cca5df8e77ca46e3ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 13:16:56 crc kubenswrapper[4962]: E1003 13:16:56.380758 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="923838580de863a9a099f0b6d817f70b752e252d832275cca5df8e77ca46e3ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 13:16:56 crc kubenswrapper[4962]: E1003 13:16:56.382616 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="923838580de863a9a099f0b6d817f70b752e252d832275cca5df8e77ca46e3ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 13:16:56 crc kubenswrapper[4962]: E1003 13:16:56.382672 4962 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-wvjpm" podUID="7cb4dab0-1ffc-49d4-a229-1862a33d4caa" containerName="ovs-vswitchd" Oct 03 13:17:00 crc kubenswrapper[4962]: I1003 13:17:00.906868 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b4b582ce-b618-4911-b554-f5cae9bcee91","Type":"ContainerDied","Data":"972fd0b549604163530a4df17ba0265931587abd268311d752380aa374952bb0"} Oct 03 13:17:00 crc kubenswrapper[4962]: I1003 13:17:00.906870 4962 generic.go:334] "Generic (PLEG): container finished" podID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerID="972fd0b549604163530a4df17ba0265931587abd268311d752380aa374952bb0" exitCode=137 Oct 03 13:17:01 crc 
kubenswrapper[4962]: I1003 13:17:01.348681 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 03 13:17:01 crc kubenswrapper[4962]: E1003 13:17:01.385233 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9 is running failed: container process not found" containerID="34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 13:17:01 crc kubenswrapper[4962]: E1003 13:17:01.385386 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 923838580de863a9a099f0b6d817f70b752e252d832275cca5df8e77ca46e3ac is running failed: container process not found" containerID="923838580de863a9a099f0b6d817f70b752e252d832275cca5df8e77ca46e3ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 13:17:01 crc kubenswrapper[4962]: E1003 13:17:01.387694 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9 is running failed: container process not found" containerID="34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 13:17:01 crc kubenswrapper[4962]: E1003 13:17:01.387748 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 923838580de863a9a099f0b6d817f70b752e252d832275cca5df8e77ca46e3ac is running failed: container process not found" containerID="923838580de863a9a099f0b6d817f70b752e252d832275cca5df8e77ca46e3ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 13:17:01 crc kubenswrapper[4962]: E1003 13:17:01.387918 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9 is running failed: container process not found" containerID="34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 13:17:01 crc kubenswrapper[4962]: E1003 13:17:01.387952 4962 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-wvjpm" podUID="7cb4dab0-1ffc-49d4-a229-1862a33d4caa" containerName="ovsdb-server" Oct 03 13:17:01 crc kubenswrapper[4962]: E1003 13:17:01.388014 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 923838580de863a9a099f0b6d817f70b752e252d832275cca5df8e77ca46e3ac is running failed: container process not found" containerID="923838580de863a9a099f0b6d817f70b752e252d832275cca5df8e77ca46e3ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 13:17:01 crc kubenswrapper[4962]: E1003 13:17:01.388053 4962 prober.go:104] "Probe errored" err="rpc 
error: code = NotFound desc = container is not created or running: checking if PID of 923838580de863a9a099f0b6d817f70b752e252d832275cca5df8e77ca46e3ac is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-wvjpm" podUID="7cb4dab0-1ffc-49d4-a229-1862a33d4caa" containerName="ovs-vswitchd" Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.455035 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b4b582ce-b618-4911-b554-f5cae9bcee91-etc-swift\") pod \"b4b582ce-b618-4911-b554-f5cae9bcee91\" (UID: \"b4b582ce-b618-4911-b554-f5cae9bcee91\") " Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.455120 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"b4b582ce-b618-4911-b554-f5cae9bcee91\" (UID: \"b4b582ce-b618-4911-b554-f5cae9bcee91\") " Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.455151 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b4b582ce-b618-4911-b554-f5cae9bcee91-cache\") pod \"b4b582ce-b618-4911-b554-f5cae9bcee91\" (UID: \"b4b582ce-b618-4911-b554-f5cae9bcee91\") " Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.455278 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vh65s\" (UniqueName: \"kubernetes.io/projected/b4b582ce-b618-4911-b554-f5cae9bcee91-kube-api-access-vh65s\") pod \"b4b582ce-b618-4911-b554-f5cae9bcee91\" (UID: \"b4b582ce-b618-4911-b554-f5cae9bcee91\") " Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.455316 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b4b582ce-b618-4911-b554-f5cae9bcee91-lock\") pod \"b4b582ce-b618-4911-b554-f5cae9bcee91\" (UID: \"b4b582ce-b618-4911-b554-f5cae9bcee91\") " Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.456204 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4b582ce-b618-4911-b554-f5cae9bcee91-lock" (OuterVolumeSpecName: "lock") pod "b4b582ce-b618-4911-b554-f5cae9bcee91" (UID: "b4b582ce-b618-4911-b554-f5cae9bcee91"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.456841 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4b582ce-b618-4911-b554-f5cae9bcee91-cache" (OuterVolumeSpecName: "cache") pod "b4b582ce-b618-4911-b554-f5cae9bcee91" (UID: "b4b582ce-b618-4911-b554-f5cae9bcee91"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.460827 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4b582ce-b618-4911-b554-f5cae9bcee91-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "b4b582ce-b618-4911-b554-f5cae9bcee91" (UID: "b4b582ce-b618-4911-b554-f5cae9bcee91"). InnerVolumeSpecName "etc-swift". 
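
The readiness probes failing on ovn-controller-ovs-wvjpm throughout this stretch are exec probes: the kubelet runs a script inside the container, and once the container is stopping there is no process to exec into, hence the NotFound and "cannot register an exec PID" errors until the pod is finally torn down. The failures recur at five-second intervals (13:16:41, :46, :51, :56, 13:17:01), which suggests a probe period of 5s. A reconstruction of such a probe with the k8s.io/api types; only the script path comes from the log, the numeric fields are inferred or illustrative:

    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
    )

    func main() {
        probe := corev1.Probe{
            ProbeHandler: corev1.ProbeHandler{
                Exec: &corev1.ExecAction{
                    // Script path taken from the ExecSync errors above.
                    Command: []string{"/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"},
                },
            },
            PeriodSeconds:    5, // inferred from the five-second spacing of the failures
            FailureThreshold: 3, // illustrative
        }
        fmt.Printf("%+v\n", probe)
    }
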
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.461847 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "swift") pod "b4b582ce-b618-4911-b554-f5cae9bcee91" (UID: "b4b582ce-b618-4911-b554-f5cae9bcee91"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.462696 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4b582ce-b618-4911-b554-f5cae9bcee91-kube-api-access-vh65s" (OuterVolumeSpecName: "kube-api-access-vh65s") pod "b4b582ce-b618-4911-b554-f5cae9bcee91" (UID: "b4b582ce-b618-4911-b554-f5cae9bcee91"). InnerVolumeSpecName "kube-api-access-vh65s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.556951 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-config-data\") pod \"barbican-api-7c44799d88-mmmm6\" (UID: \"dab0e7ec-9c64-491d-a655-027098042378\") " pod="openstack/barbican-api-7c44799d88-mmmm6" Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.557111 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vh65s\" (UniqueName: \"kubernetes.io/projected/b4b582ce-b618-4911-b554-f5cae9bcee91-kube-api-access-vh65s\") on node \"crc\" DevicePath \"\"" Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.557122 4962 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b4b582ce-b618-4911-b554-f5cae9bcee91-lock\") on node \"crc\" DevicePath \"\"" Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.557131 4962 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b4b582ce-b618-4911-b554-f5cae9bcee91-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.557152 4962 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.557160 4962 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b4b582ce-b618-4911-b554-f5cae9bcee91-cache\") on node \"crc\" DevicePath \"\"" Oct 03 13:17:01 crc kubenswrapper[4962]: E1003 13:17:01.557267 4962 secret.go:188] Couldn't get secret openstack/barbican-config-data: secret "barbican-config-data" not found Oct 03 13:17:01 crc kubenswrapper[4962]: E1003 13:17:01.557392 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-config-data podName:dab0e7ec-9c64-491d-a655-027098042378 nodeName:}" failed. No retries permitted until 2025-10-03 13:17:33.557360614 +0000 UTC m=+1661.961258489 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-config-data") pod "barbican-api-7c44799d88-mmmm6" (UID: "dab0e7ec-9c64-491d-a655-027098042378") : secret "barbican-config-data" not found Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.573343 4962 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.659914 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bx2f\" (UniqueName: \"kubernetes.io/projected/dab0e7ec-9c64-491d-a655-027098042378-kube-api-access-9bx2f\") pod \"barbican-api-7c44799d88-mmmm6\" (UID: \"dab0e7ec-9c64-491d-a655-027098042378\") " pod="openstack/barbican-api-7c44799d88-mmmm6" Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.660080 4962 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Oct 03 13:17:01 crc kubenswrapper[4962]: E1003 13:17:01.663929 4962 projected.go:194] Error preparing data for projected volume kube-api-access-9bx2f for pod openstack/barbican-api-7c44799d88-mmmm6: failed to fetch token: serviceaccounts "barbican-barbican" not found Oct 03 13:17:01 crc kubenswrapper[4962]: E1003 13:17:01.664158 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dab0e7ec-9c64-491d-a655-027098042378-kube-api-access-9bx2f podName:dab0e7ec-9c64-491d-a655-027098042378 nodeName:}" failed. No retries permitted until 2025-10-03 13:17:33.664140376 +0000 UTC m=+1662.068038211 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-9bx2f" (UniqueName: "kubernetes.io/projected/dab0e7ec-9c64-491d-a655-027098042378-kube-api-access-9bx2f") pod "barbican-api-7c44799d88-mmmm6" (UID: "dab0e7ec-9c64-491d-a655-027098042378") : failed to fetch token: serviceaccounts "barbican-barbican" not found Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.692046 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-wvjpm_7cb4dab0-1ffc-49d4-a229-1862a33d4caa/ovs-vswitchd/0.log" Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.693016 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-wvjpm" Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.862961 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7cb4dab0-1ffc-49d4-a229-1862a33d4caa-var-run\") pod \"7cb4dab0-1ffc-49d4-a229-1862a33d4caa\" (UID: \"7cb4dab0-1ffc-49d4-a229-1862a33d4caa\") " Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.863027 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/7cb4dab0-1ffc-49d4-a229-1862a33d4caa-var-lib\") pod \"7cb4dab0-1ffc-49d4-a229-1862a33d4caa\" (UID: \"7cb4dab0-1ffc-49d4-a229-1862a33d4caa\") " Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.863115 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmq2z\" (UniqueName: \"kubernetes.io/projected/7cb4dab0-1ffc-49d4-a229-1862a33d4caa-kube-api-access-jmq2z\") pod \"7cb4dab0-1ffc-49d4-a229-1862a33d4caa\" (UID: \"7cb4dab0-1ffc-49d4-a229-1862a33d4caa\") " Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.863104 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7cb4dab0-1ffc-49d4-a229-1862a33d4caa-var-run" (OuterVolumeSpecName: "var-run") pod "7cb4dab0-1ffc-49d4-a229-1862a33d4caa" (UID: "7cb4dab0-1ffc-49d4-a229-1862a33d4caa"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.863136 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7cb4dab0-1ffc-49d4-a229-1862a33d4caa-var-lib" (OuterVolumeSpecName: "var-lib") pod "7cb4dab0-1ffc-49d4-a229-1862a33d4caa" (UID: "7cb4dab0-1ffc-49d4-a229-1862a33d4caa"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.863203 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/7cb4dab0-1ffc-49d4-a229-1862a33d4caa-etc-ovs\") pod \"7cb4dab0-1ffc-49d4-a229-1862a33d4caa\" (UID: \"7cb4dab0-1ffc-49d4-a229-1862a33d4caa\") " Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.863249 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7cb4dab0-1ffc-49d4-a229-1862a33d4caa-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "7cb4dab0-1ffc-49d4-a229-1862a33d4caa" (UID: "7cb4dab0-1ffc-49d4-a229-1862a33d4caa"). InnerVolumeSpecName "etc-ovs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.863288 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7cb4dab0-1ffc-49d4-a229-1862a33d4caa-var-log\") pod \"7cb4dab0-1ffc-49d4-a229-1862a33d4caa\" (UID: \"7cb4dab0-1ffc-49d4-a229-1862a33d4caa\") " Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.863341 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7cb4dab0-1ffc-49d4-a229-1862a33d4caa-scripts\") pod \"7cb4dab0-1ffc-49d4-a229-1862a33d4caa\" (UID: \"7cb4dab0-1ffc-49d4-a229-1862a33d4caa\") " Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.863383 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7cb4dab0-1ffc-49d4-a229-1862a33d4caa-var-log" (OuterVolumeSpecName: "var-log") pod "7cb4dab0-1ffc-49d4-a229-1862a33d4caa" (UID: "7cb4dab0-1ffc-49d4-a229-1862a33d4caa"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.863620 4962 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7cb4dab0-1ffc-49d4-a229-1862a33d4caa-var-log\") on node \"crc\" DevicePath \"\"" Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.863631 4962 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7cb4dab0-1ffc-49d4-a229-1862a33d4caa-var-run\") on node \"crc\" DevicePath \"\"" Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.863656 4962 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/7cb4dab0-1ffc-49d4-a229-1862a33d4caa-var-lib\") on node \"crc\" DevicePath \"\"" Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.863663 4962 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/7cb4dab0-1ffc-49d4-a229-1862a33d4caa-etc-ovs\") on node \"crc\" DevicePath \"\"" Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.864458 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cb4dab0-1ffc-49d4-a229-1862a33d4caa-scripts" (OuterVolumeSpecName: "scripts") pod "7cb4dab0-1ffc-49d4-a229-1862a33d4caa" (UID: "7cb4dab0-1ffc-49d4-a229-1862a33d4caa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.866331 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cb4dab0-1ffc-49d4-a229-1862a33d4caa-kube-api-access-jmq2z" (OuterVolumeSpecName: "kube-api-access-jmq2z") pod "7cb4dab0-1ffc-49d4-a229-1862a33d4caa" (UID: "7cb4dab0-1ffc-49d4-a229-1862a33d4caa"). InnerVolumeSpecName "kube-api-access-jmq2z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.919826 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-wvjpm_7cb4dab0-1ffc-49d4-a229-1862a33d4caa/ovs-vswitchd/0.log" Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.920847 4962 generic.go:334] "Generic (PLEG): container finished" podID="7cb4dab0-1ffc-49d4-a229-1862a33d4caa" containerID="923838580de863a9a099f0b6d817f70b752e252d832275cca5df8e77ca46e3ac" exitCode=137 Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.920871 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-wvjpm" event={"ID":"7cb4dab0-1ffc-49d4-a229-1862a33d4caa","Type":"ContainerDied","Data":"923838580de863a9a099f0b6d817f70b752e252d832275cca5df8e77ca46e3ac"} Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.920910 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-wvjpm" event={"ID":"7cb4dab0-1ffc-49d4-a229-1862a33d4caa","Type":"ContainerDied","Data":"1e3bc9bb11b62b93d85f27f8127b612b45f887313c177783796dc37ea59289b2"} Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.920926 4962 scope.go:117] "RemoveContainer" containerID="923838580de863a9a099f0b6d817f70b752e252d832275cca5df8e77ca46e3ac" Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.920934 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-wvjpm" Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.929587 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b4b582ce-b618-4911-b554-f5cae9bcee91","Type":"ContainerDied","Data":"90a3666d7ec243c20d6f9c6167ca1f8679d60343366a66f8a480d13c588be0eb"} Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.929728 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.953184 4962 scope.go:117] "RemoveContainer" containerID="34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9" Oct 03 13:17:01 crc kubenswrapper[4962]: E1003 13:17:01.965547 4962 secret.go:188] Couldn't get secret openstack/barbican-config-data: secret "barbican-config-data" not found Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.965687 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmq2z\" (UniqueName: \"kubernetes.io/projected/7cb4dab0-1ffc-49d4-a229-1862a33d4caa-kube-api-access-jmq2z\") on node \"crc\" DevicePath \"\"" Oct 03 13:17:01 crc kubenswrapper[4962]: E1003 13:17:01.965784 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ae87940-f07d-4213-bc0b-da0b3a2bba84-config-data podName:0ae87940-f07d-4213-bc0b-da0b3a2bba84 nodeName:}" failed. No retries permitted until 2025-10-03 13:17:33.965768179 +0000 UTC m=+1662.369666014 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/0ae87940-f07d-4213-bc0b-da0b3a2bba84-config-data") pod "barbican-worker-c97f5c65f-s279k" (UID: "0ae87940-f07d-4213-bc0b-da0b3a2bba84") : secret "barbican-config-data" not found Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.965818 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7cb4dab0-1ffc-49d4-a229-1862a33d4caa-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.968136 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-wvjpm"] Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.976898 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-wvjpm"] Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.983018 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.987887 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Oct 03 13:17:01 crc kubenswrapper[4962]: I1003 13:17:01.988453 4962 scope.go:117] "RemoveContainer" containerID="66c05cecc27c12885de706425bd5259b80d56485b4bcf61ebe00d509c6d9c1b6" Oct 03 13:17:02 crc kubenswrapper[4962]: I1003 13:17:02.014496 4962 scope.go:117] "RemoveContainer" containerID="923838580de863a9a099f0b6d817f70b752e252d832275cca5df8e77ca46e3ac" Oct 03 13:17:02 crc kubenswrapper[4962]: E1003 13:17:02.015139 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"923838580de863a9a099f0b6d817f70b752e252d832275cca5df8e77ca46e3ac\": container with ID starting with 923838580de863a9a099f0b6d817f70b752e252d832275cca5df8e77ca46e3ac not found: ID does not exist" containerID="923838580de863a9a099f0b6d817f70b752e252d832275cca5df8e77ca46e3ac" Oct 03 13:17:02 crc kubenswrapper[4962]: I1003 13:17:02.015172 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"923838580de863a9a099f0b6d817f70b752e252d832275cca5df8e77ca46e3ac"} err="failed to get container status \"923838580de863a9a099f0b6d817f70b752e252d832275cca5df8e77ca46e3ac\": rpc error: code = NotFound desc = could not find container \"923838580de863a9a099f0b6d817f70b752e252d832275cca5df8e77ca46e3ac\": container with ID starting with 923838580de863a9a099f0b6d817f70b752e252d832275cca5df8e77ca46e3ac not found: ID does not exist" Oct 03 13:17:02 crc kubenswrapper[4962]: I1003 13:17:02.015223 4962 scope.go:117] "RemoveContainer" containerID="34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9" Oct 03 13:17:02 crc kubenswrapper[4962]: E1003 13:17:02.015534 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9\": container with ID starting with 34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9 not found: ID does not exist" containerID="34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9" Oct 03 13:17:02 crc kubenswrapper[4962]: I1003 13:17:02.015554 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9"} err="failed to get container status \"34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9\": rpc error: 
code = NotFound desc = could not find container \"34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9\": container with ID starting with 34c598e3d7d3c7d900220799762655d4114aec572ba13e62a9afb7fc335009e9 not found: ID does not exist" Oct 03 13:17:02 crc kubenswrapper[4962]: I1003 13:17:02.015567 4962 scope.go:117] "RemoveContainer" containerID="66c05cecc27c12885de706425bd5259b80d56485b4bcf61ebe00d509c6d9c1b6" Oct 03 13:17:02 crc kubenswrapper[4962]: E1003 13:17:02.015810 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66c05cecc27c12885de706425bd5259b80d56485b4bcf61ebe00d509c6d9c1b6\": container with ID starting with 66c05cecc27c12885de706425bd5259b80d56485b4bcf61ebe00d509c6d9c1b6 not found: ID does not exist" containerID="66c05cecc27c12885de706425bd5259b80d56485b4bcf61ebe00d509c6d9c1b6" Oct 03 13:17:02 crc kubenswrapper[4962]: I1003 13:17:02.015836 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66c05cecc27c12885de706425bd5259b80d56485b4bcf61ebe00d509c6d9c1b6"} err="failed to get container status \"66c05cecc27c12885de706425bd5259b80d56485b4bcf61ebe00d509c6d9c1b6\": rpc error: code = NotFound desc = could not find container \"66c05cecc27c12885de706425bd5259b80d56485b4bcf61ebe00d509c6d9c1b6\": container with ID starting with 66c05cecc27c12885de706425bd5259b80d56485b4bcf61ebe00d509c6d9c1b6 not found: ID does not exist" Oct 03 13:17:02 crc kubenswrapper[4962]: I1003 13:17:02.015849 4962 scope.go:117] "RemoveContainer" containerID="972fd0b549604163530a4df17ba0265931587abd268311d752380aa374952bb0" Oct 03 13:17:02 crc kubenswrapper[4962]: I1003 13:17:02.038912 4962 scope.go:117] "RemoveContainer" containerID="06a77f3c0c79be2df9a379f7d27bcc5d75db28ce32e20e774dd964566de558be" Oct 03 13:17:02 crc kubenswrapper[4962]: I1003 13:17:02.058150 4962 scope.go:117] "RemoveContainer" containerID="255dd1c4cb38e6b82f47f9c570da57cc07f7f5e8c11c54bb9966d8c730771ef6" Oct 03 13:17:02 crc kubenswrapper[4962]: I1003 13:17:02.073883 4962 scope.go:117] "RemoveContainer" containerID="054512bf0c273e329a55f18a262ffbcb7dd5abaded475341723d2b4dc5e849fb" Oct 03 13:17:02 crc kubenswrapper[4962]: I1003 13:17:02.088007 4962 scope.go:117] "RemoveContainer" containerID="7e76ff2eb3cf5160a1fdce8ab7db2a70edda0c5fb436d79cb130e11be846580e" Oct 03 13:17:02 crc kubenswrapper[4962]: I1003 13:17:02.103821 4962 scope.go:117] "RemoveContainer" containerID="f0c27459819cd1d481d672fbdd91f735b40d36cc170361880628b5b806924c13" Oct 03 13:17:02 crc kubenswrapper[4962]: I1003 13:17:02.121427 4962 scope.go:117] "RemoveContainer" containerID="a8806f325247419ebf9ee453e77f3493ec2be61562010341eec779899b644330" Oct 03 13:17:02 crc kubenswrapper[4962]: I1003 13:17:02.137152 4962 scope.go:117] "RemoveContainer" containerID="bccc65019a49e86470400d5863a1f0f3a4c53c7dead1edd7e4226173d7443ed2" Oct 03 13:17:02 crc kubenswrapper[4962]: I1003 13:17:02.154207 4962 scope.go:117] "RemoveContainer" containerID="7ce66520fa57254d5157448844d739ec610586be59fb60789632b9b85bd02222" Oct 03 13:17:02 crc kubenswrapper[4962]: E1003 13:17:02.169447 4962 secret.go:188] Couldn't get secret openstack/barbican-keystone-listener-config-data: secret "barbican-keystone-listener-config-data" not found Oct 03 13:17:02 crc kubenswrapper[4962]: E1003 13:17:02.169527 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-config-data-custom podName:e6f0fc0a-ae8e-445e-ad05-591b7ab00886 
nodeName:}" failed. No retries permitted until 2025-10-03 13:17:34.16950682 +0000 UTC m=+1662.573404655 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-config-data-custom") pod "barbican-keystone-listener-59bf856dfd-t86xg" (UID: "e6f0fc0a-ae8e-445e-ad05-591b7ab00886") : secret "barbican-keystone-listener-config-data" not found Oct 03 13:17:02 crc kubenswrapper[4962]: E1003 13:17:02.169549 4962 secret.go:188] Couldn't get secret openstack/barbican-config-data: secret "barbican-config-data" not found Oct 03 13:17:02 crc kubenswrapper[4962]: E1003 13:17:02.169608 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-config-data podName:e6f0fc0a-ae8e-445e-ad05-591b7ab00886 nodeName:}" failed. No retries permitted until 2025-10-03 13:17:34.169591282 +0000 UTC m=+1662.573489137 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-config-data") pod "barbican-keystone-listener-59bf856dfd-t86xg" (UID: "e6f0fc0a-ae8e-445e-ad05-591b7ab00886") : secret "barbican-config-data" not found Oct 03 13:17:02 crc kubenswrapper[4962]: I1003 13:17:02.172108 4962 scope.go:117] "RemoveContainer" containerID="d07a3c21e9f7cc962ded5767c572003a88953fe191cf331895a5e9e48103288b" Oct 03 13:17:02 crc kubenswrapper[4962]: I1003 13:17:02.195002 4962 scope.go:117] "RemoveContainer" containerID="0027d40b3fd7f4cac601a15c7999e155ece2e4687617a83b85e72dd63015f85e" Oct 03 13:17:02 crc kubenswrapper[4962]: I1003 13:17:02.217293 4962 scope.go:117] "RemoveContainer" containerID="5674271fdefce19a13c9b336c52d013de2b33a9a9124fca33358f8e3a0cf5881" Oct 03 13:17:02 crc kubenswrapper[4962]: I1003 13:17:02.235007 4962 scope.go:117] "RemoveContainer" containerID="770e06f348aaf3989bf45ae8703e1cff216acdce48c3de88da4323e4ade168ff" Oct 03 13:17:02 crc kubenswrapper[4962]: I1003 13:17:02.236099 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cb4dab0-1ffc-49d4-a229-1862a33d4caa" path="/var/lib/kubelet/pods/7cb4dab0-1ffc-49d4-a229-1862a33d4caa/volumes" Oct 03 13:17:02 crc kubenswrapper[4962]: I1003 13:17:02.236905 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" path="/var/lib/kubelet/pods/b4b582ce-b618-4911-b554-f5cae9bcee91/volumes" Oct 03 13:17:02 crc kubenswrapper[4962]: I1003 13:17:02.255748 4962 scope.go:117] "RemoveContainer" containerID="33714c39a9c3c16769f323fb660866cd0b5a9c6bf72751670fa0465d513c70cb" Oct 03 13:17:02 crc kubenswrapper[4962]: I1003 13:17:02.272796 4962 scope.go:117] "RemoveContainer" containerID="1953908fb8f3a3d9cd983f3f51df79d091442b3b2351b35d0c858fe9e4a4b278" Oct 03 13:17:05 crc kubenswrapper[4962]: I1003 13:17:05.333052 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi6963-account-delete-mg78g" Oct 03 13:17:05 crc kubenswrapper[4962]: I1003 13:17:05.397603 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-c97f5c65f-s279k" Oct 03 13:17:05 crc kubenswrapper[4962]: I1003 13:17:05.457116 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-59bf856dfd-t86xg" Oct 03 13:17:05 crc kubenswrapper[4962]: I1003 13:17:05.524456 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9xsp\" (UniqueName: \"kubernetes.io/projected/0ae87940-f07d-4213-bc0b-da0b3a2bba84-kube-api-access-l9xsp\") pod \"0ae87940-f07d-4213-bc0b-da0b3a2bba84\" (UID: \"0ae87940-f07d-4213-bc0b-da0b3a2bba84\") " Oct 03 13:17:05 crc kubenswrapper[4962]: I1003 13:17:05.524582 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ae87940-f07d-4213-bc0b-da0b3a2bba84-config-data-custom\") pod \"0ae87940-f07d-4213-bc0b-da0b3a2bba84\" (UID: \"0ae87940-f07d-4213-bc0b-da0b3a2bba84\") " Oct 03 13:17:05 crc kubenswrapper[4962]: I1003 13:17:05.524621 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ae87940-f07d-4213-bc0b-da0b3a2bba84-combined-ca-bundle\") pod \"0ae87940-f07d-4213-bc0b-da0b3a2bba84\" (UID: \"0ae87940-f07d-4213-bc0b-da0b3a2bba84\") " Oct 03 13:17:05 crc kubenswrapper[4962]: I1003 13:17:05.524666 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ae87940-f07d-4213-bc0b-da0b3a2bba84-config-data\") pod \"0ae87940-f07d-4213-bc0b-da0b3a2bba84\" (UID: \"0ae87940-f07d-4213-bc0b-da0b3a2bba84\") " Oct 03 13:17:05 crc kubenswrapper[4962]: I1003 13:17:05.524826 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ae87940-f07d-4213-bc0b-da0b3a2bba84-logs\") pod \"0ae87940-f07d-4213-bc0b-da0b3a2bba84\" (UID: \"0ae87940-f07d-4213-bc0b-da0b3a2bba84\") " Oct 03 13:17:05 crc kubenswrapper[4962]: I1003 13:17:05.524851 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpv8s\" (UniqueName: \"kubernetes.io/projected/1b763061-bb23-4c23-a4ec-bebac231c603-kube-api-access-dpv8s\") pod \"1b763061-bb23-4c23-a4ec-bebac231c603\" (UID: \"1b763061-bb23-4c23-a4ec-bebac231c603\") " Oct 03 13:17:05 crc kubenswrapper[4962]: I1003 13:17:05.524872 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-logs\") pod \"e6f0fc0a-ae8e-445e-ad05-591b7ab00886\" (UID: \"e6f0fc0a-ae8e-445e-ad05-591b7ab00886\") " Oct 03 13:17:05 crc kubenswrapper[4962]: I1003 13:17:05.525142 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ae87940-f07d-4213-bc0b-da0b3a2bba84-logs" (OuterVolumeSpecName: "logs") pod "0ae87940-f07d-4213-bc0b-da0b3a2bba84" (UID: "0ae87940-f07d-4213-bc0b-da0b3a2bba84"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:17:05 crc kubenswrapper[4962]: I1003 13:17:05.525380 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-logs" (OuterVolumeSpecName: "logs") pod "e6f0fc0a-ae8e-445e-ad05-591b7ab00886" (UID: "e6f0fc0a-ae8e-445e-ad05-591b7ab00886"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:17:05 crc kubenswrapper[4962]: I1003 13:17:05.529431 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ae87940-f07d-4213-bc0b-da0b3a2bba84-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0ae87940-f07d-4213-bc0b-da0b3a2bba84" (UID: "0ae87940-f07d-4213-bc0b-da0b3a2bba84"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:17:05 crc kubenswrapper[4962]: I1003 13:17:05.529611 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ae87940-f07d-4213-bc0b-da0b3a2bba84-kube-api-access-l9xsp" (OuterVolumeSpecName: "kube-api-access-l9xsp") pod "0ae87940-f07d-4213-bc0b-da0b3a2bba84" (UID: "0ae87940-f07d-4213-bc0b-da0b3a2bba84"). InnerVolumeSpecName "kube-api-access-l9xsp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:17:05 crc kubenswrapper[4962]: I1003 13:17:05.529842 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b763061-bb23-4c23-a4ec-bebac231c603-kube-api-access-dpv8s" (OuterVolumeSpecName: "kube-api-access-dpv8s") pod "1b763061-bb23-4c23-a4ec-bebac231c603" (UID: "1b763061-bb23-4c23-a4ec-bebac231c603"). InnerVolumeSpecName "kube-api-access-dpv8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:17:05 crc kubenswrapper[4962]: I1003 13:17:05.545170 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ae87940-f07d-4213-bc0b-da0b3a2bba84-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0ae87940-f07d-4213-bc0b-da0b3a2bba84" (UID: "0ae87940-f07d-4213-bc0b-da0b3a2bba84"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:17:05 crc kubenswrapper[4962]: I1003 13:17:05.559013 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ae87940-f07d-4213-bc0b-da0b3a2bba84-config-data" (OuterVolumeSpecName: "config-data") pod "0ae87940-f07d-4213-bc0b-da0b3a2bba84" (UID: "0ae87940-f07d-4213-bc0b-da0b3a2bba84"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:17:05 crc kubenswrapper[4962]: I1003 13:17:05.585711 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-6456949cf6-r4n9q" podUID="1289d443-56d2-4f63-8802-66bcd0569b3b" containerName="placement-log" probeResult="failure" output="Get \"https://10.217.0.145:8778/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 03 13:17:05 crc kubenswrapper[4962]: I1003 13:17:05.586214 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-6456949cf6-r4n9q" podUID="1289d443-56d2-4f63-8802-66bcd0569b3b" containerName="placement-api" probeResult="failure" output="Get \"https://10.217.0.145:8778/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 03 13:17:05 crc kubenswrapper[4962]: I1003 13:17:05.626050 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-combined-ca-bundle\") pod \"e6f0fc0a-ae8e-445e-ad05-591b7ab00886\" (UID: \"e6f0fc0a-ae8e-445e-ad05-591b7ab00886\") " Oct 03 13:17:05 crc kubenswrapper[4962]: I1003 13:17:05.626384 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-config-data\") pod \"e6f0fc0a-ae8e-445e-ad05-591b7ab00886\" (UID: \"e6f0fc0a-ae8e-445e-ad05-591b7ab00886\") " Oct 03 13:17:05 crc kubenswrapper[4962]: I1003 13:17:05.626485 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-config-data-custom\") pod \"e6f0fc0a-ae8e-445e-ad05-591b7ab00886\" (UID: \"e6f0fc0a-ae8e-445e-ad05-591b7ab00886\") " Oct 03 13:17:05 crc kubenswrapper[4962]: I1003 13:17:05.626680 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cr49\" (UniqueName: \"kubernetes.io/projected/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-kube-api-access-7cr49\") pod \"e6f0fc0a-ae8e-445e-ad05-591b7ab00886\" (UID: \"e6f0fc0a-ae8e-445e-ad05-591b7ab00886\") " Oct 03 13:17:05 crc kubenswrapper[4962]: I1003 13:17:05.627093 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ae87940-f07d-4213-bc0b-da0b3a2bba84-logs\") on node \"crc\" DevicePath \"\"" Oct 03 13:17:05 crc kubenswrapper[4962]: I1003 13:17:05.627195 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpv8s\" (UniqueName: \"kubernetes.io/projected/1b763061-bb23-4c23-a4ec-bebac231c603-kube-api-access-dpv8s\") on node \"crc\" DevicePath \"\"" Oct 03 13:17:05 crc kubenswrapper[4962]: I1003 13:17:05.627304 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-logs\") on node \"crc\" DevicePath \"\"" Oct 03 13:17:05 crc kubenswrapper[4962]: I1003 13:17:05.627424 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9xsp\" (UniqueName: \"kubernetes.io/projected/0ae87940-f07d-4213-bc0b-da0b3a2bba84-kube-api-access-l9xsp\") on node \"crc\" DevicePath \"\"" Oct 03 13:17:05 crc kubenswrapper[4962]: I1003 13:17:05.627539 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/0ae87940-f07d-4213-bc0b-da0b3a2bba84-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 13:17:05 crc kubenswrapper[4962]: I1003 13:17:05.627692 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ae87940-f07d-4213-bc0b-da0b3a2bba84-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:17:05 crc kubenswrapper[4962]: I1003 13:17:05.627860 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ae87940-f07d-4213-bc0b-da0b3a2bba84-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:17:05 crc kubenswrapper[4962]: I1003 13:17:05.629768 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e6f0fc0a-ae8e-445e-ad05-591b7ab00886" (UID: "e6f0fc0a-ae8e-445e-ad05-591b7ab00886"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:17:05 crc kubenswrapper[4962]: I1003 13:17:05.630053 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-kube-api-access-7cr49" (OuterVolumeSpecName: "kube-api-access-7cr49") pod "e6f0fc0a-ae8e-445e-ad05-591b7ab00886" (UID: "e6f0fc0a-ae8e-445e-ad05-591b7ab00886"). InnerVolumeSpecName "kube-api-access-7cr49". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:17:05 crc kubenswrapper[4962]: I1003 13:17:05.641848 4962 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","poddab0e7ec-9c64-491d-a655-027098042378"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort poddab0e7ec-9c64-491d-a655-027098042378] : Timed out while waiting for systemd to remove kubepods-besteffort-poddab0e7ec_9c64_491d_a655_027098042378.slice" Oct 03 13:17:05 crc kubenswrapper[4962]: E1003 13:17:05.641895 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort poddab0e7ec-9c64-491d-a655-027098042378] : unable to destroy cgroup paths for cgroup [kubepods besteffort poddab0e7ec-9c64-491d-a655-027098042378] : Timed out while waiting for systemd to remove kubepods-besteffort-poddab0e7ec_9c64_491d_a655_027098042378.slice" pod="openstack/barbican-api-7c44799d88-mmmm6" podUID="dab0e7ec-9c64-491d-a655-027098042378" Oct 03 13:17:05 crc kubenswrapper[4962]: I1003 13:17:05.644481 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e6f0fc0a-ae8e-445e-ad05-591b7ab00886" (UID: "e6f0fc0a-ae8e-445e-ad05-591b7ab00886"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:17:05 crc kubenswrapper[4962]: I1003 13:17:05.658527 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-config-data" (OuterVolumeSpecName: "config-data") pod "e6f0fc0a-ae8e-445e-ad05-591b7ab00886" (UID: "e6f0fc0a-ae8e-445e-ad05-591b7ab00886"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:17:05 crc kubenswrapper[4962]: I1003 13:17:05.728703 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 13:17:05 crc kubenswrapper[4962]: I1003 13:17:05.728738 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:17:05 crc kubenswrapper[4962]: I1003 13:17:05.728751 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 13:17:05 crc kubenswrapper[4962]: I1003 13:17:05.728765 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cr49\" (UniqueName: \"kubernetes.io/projected/e6f0fc0a-ae8e-445e-ad05-591b7ab00886-kube-api-access-7cr49\") on node \"crc\" DevicePath \"\"" Oct 03 13:17:05 crc kubenswrapper[4962]: I1003 13:17:05.970756 4962 generic.go:334] "Generic (PLEG): container finished" podID="1b763061-bb23-4c23-a4ec-bebac231c603" containerID="a1e1574d72e2bf15c562df1b3f61c5008f261e0102ffde7db224300931897e5d" exitCode=137 Oct 03 13:17:05 crc kubenswrapper[4962]: I1003 13:17:05.970863 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi6963-account-delete-mg78g" Oct 03 13:17:05 crc kubenswrapper[4962]: I1003 13:17:05.970811 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi6963-account-delete-mg78g" event={"ID":"1b763061-bb23-4c23-a4ec-bebac231c603","Type":"ContainerDied","Data":"a1e1574d72e2bf15c562df1b3f61c5008f261e0102ffde7db224300931897e5d"} Oct 03 13:17:05 crc kubenswrapper[4962]: I1003 13:17:05.971169 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi6963-account-delete-mg78g" event={"ID":"1b763061-bb23-4c23-a4ec-bebac231c603","Type":"ContainerDied","Data":"0a87238a1f9f9b006b3fcededc25d64aae5534de53bb9a57f44dd86afd263a1c"} Oct 03 13:17:05 crc kubenswrapper[4962]: I1003 13:17:05.971234 4962 scope.go:117] "RemoveContainer" containerID="a1e1574d72e2bf15c562df1b3f61c5008f261e0102ffde7db224300931897e5d" Oct 03 13:17:06 crc kubenswrapper[4962]: I1003 13:17:06.020604 4962 generic.go:334] "Generic (PLEG): container finished" podID="0ae87940-f07d-4213-bc0b-da0b3a2bba84" containerID="442d38894b6e701167727dc5d7828dee1159daaa23c0b84571b21a9f1e1aac37" exitCode=137 Oct 03 13:17:06 crc kubenswrapper[4962]: I1003 13:17:06.020684 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-c97f5c65f-s279k" event={"ID":"0ae87940-f07d-4213-bc0b-da0b3a2bba84","Type":"ContainerDied","Data":"442d38894b6e701167727dc5d7828dee1159daaa23c0b84571b21a9f1e1aac37"} Oct 03 13:17:06 crc kubenswrapper[4962]: I1003 13:17:06.020715 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-c97f5c65f-s279k" event={"ID":"0ae87940-f07d-4213-bc0b-da0b3a2bba84","Type":"ContainerDied","Data":"8428c1bd276857bc0ab39f071664e6f6e2b7c216f27e36d161897444b38331fd"} Oct 03 13:17:06 crc kubenswrapper[4962]: I1003 13:17:06.020776 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-c97f5c65f-s279k" Oct 03 13:17:06 crc kubenswrapper[4962]: I1003 13:17:06.023094 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapi6963-account-delete-mg78g"] Oct 03 13:17:06 crc kubenswrapper[4962]: I1003 13:17:06.024687 4962 generic.go:334] "Generic (PLEG): container finished" podID="e6f0fc0a-ae8e-445e-ad05-591b7ab00886" containerID="7c748f0bb3a7a35ffcd899979db48222f05a9821400ac3e3dccbbd992dcd6a1a" exitCode=137 Oct 03 13:17:06 crc kubenswrapper[4962]: I1003 13:17:06.024754 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7c44799d88-mmmm6" Oct 03 13:17:06 crc kubenswrapper[4962]: I1003 13:17:06.025204 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-59bf856dfd-t86xg" Oct 03 13:17:06 crc kubenswrapper[4962]: I1003 13:17:06.026317 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-59bf856dfd-t86xg" event={"ID":"e6f0fc0a-ae8e-445e-ad05-591b7ab00886","Type":"ContainerDied","Data":"7c748f0bb3a7a35ffcd899979db48222f05a9821400ac3e3dccbbd992dcd6a1a"} Oct 03 13:17:06 crc kubenswrapper[4962]: I1003 13:17:06.026612 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-59bf856dfd-t86xg" event={"ID":"e6f0fc0a-ae8e-445e-ad05-591b7ab00886","Type":"ContainerDied","Data":"06506c18461a638a0a9db0ec818f7f15103331fa5960e8e6311c11eeb4b075e7"} Oct 03 13:17:06 crc kubenswrapper[4962]: I1003 13:17:06.033102 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novaapi6963-account-delete-mg78g"] Oct 03 13:17:06 crc kubenswrapper[4962]: I1003 13:17:06.045022 4962 scope.go:117] "RemoveContainer" containerID="a1e1574d72e2bf15c562df1b3f61c5008f261e0102ffde7db224300931897e5d" Oct 03 13:17:06 crc kubenswrapper[4962]: E1003 13:17:06.045461 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1e1574d72e2bf15c562df1b3f61c5008f261e0102ffde7db224300931897e5d\": container with ID starting with a1e1574d72e2bf15c562df1b3f61c5008f261e0102ffde7db224300931897e5d not found: ID does not exist" containerID="a1e1574d72e2bf15c562df1b3f61c5008f261e0102ffde7db224300931897e5d" Oct 03 13:17:06 crc kubenswrapper[4962]: I1003 13:17:06.045508 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1e1574d72e2bf15c562df1b3f61c5008f261e0102ffde7db224300931897e5d"} err="failed to get container status \"a1e1574d72e2bf15c562df1b3f61c5008f261e0102ffde7db224300931897e5d\": rpc error: code = NotFound desc = could not find container \"a1e1574d72e2bf15c562df1b3f61c5008f261e0102ffde7db224300931897e5d\": container with ID starting with a1e1574d72e2bf15c562df1b3f61c5008f261e0102ffde7db224300931897e5d not found: ID does not exist" Oct 03 13:17:06 crc kubenswrapper[4962]: I1003 13:17:06.045535 4962 scope.go:117] "RemoveContainer" containerID="442d38894b6e701167727dc5d7828dee1159daaa23c0b84571b21a9f1e1aac37" Oct 03 13:17:06 crc kubenswrapper[4962]: I1003 13:17:06.081927 4962 scope.go:117] "RemoveContainer" containerID="ebd303224ee1ba6219baa644b264ad83840c2dfc146c5958bf9efcdb7bba20b4" Oct 03 13:17:06 crc kubenswrapper[4962]: I1003 13:17:06.089752 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7c44799d88-mmmm6"] Oct 03 13:17:06 crc kubenswrapper[4962]: I1003 13:17:06.095341 4962 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7c44799d88-mmmm6"] Oct 03 13:17:06 crc kubenswrapper[4962]: I1003 13:17:06.103367 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-59bf856dfd-t86xg"] Oct 03 13:17:06 crc kubenswrapper[4962]: I1003 13:17:06.108902 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-59bf856dfd-t86xg"] Oct 03 13:17:06 crc kubenswrapper[4962]: I1003 13:17:06.113319 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-c97f5c65f-s279k"] Oct 03 13:17:06 crc kubenswrapper[4962]: I1003 13:17:06.113502 4962 scope.go:117] "RemoveContainer" containerID="442d38894b6e701167727dc5d7828dee1159daaa23c0b84571b21a9f1e1aac37" Oct 03 13:17:06 crc kubenswrapper[4962]: E1003 13:17:06.114026 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"442d38894b6e701167727dc5d7828dee1159daaa23c0b84571b21a9f1e1aac37\": container with ID starting with 442d38894b6e701167727dc5d7828dee1159daaa23c0b84571b21a9f1e1aac37 not found: ID does not exist" containerID="442d38894b6e701167727dc5d7828dee1159daaa23c0b84571b21a9f1e1aac37" Oct 03 13:17:06 crc kubenswrapper[4962]: I1003 13:17:06.114066 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"442d38894b6e701167727dc5d7828dee1159daaa23c0b84571b21a9f1e1aac37"} err="failed to get container status \"442d38894b6e701167727dc5d7828dee1159daaa23c0b84571b21a9f1e1aac37\": rpc error: code = NotFound desc = could not find container \"442d38894b6e701167727dc5d7828dee1159daaa23c0b84571b21a9f1e1aac37\": container with ID starting with 442d38894b6e701167727dc5d7828dee1159daaa23c0b84571b21a9f1e1aac37 not found: ID does not exist" Oct 03 13:17:06 crc kubenswrapper[4962]: I1003 13:17:06.114092 4962 scope.go:117] "RemoveContainer" containerID="ebd303224ee1ba6219baa644b264ad83840c2dfc146c5958bf9efcdb7bba20b4" Oct 03 13:17:06 crc kubenswrapper[4962]: E1003 13:17:06.114480 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebd303224ee1ba6219baa644b264ad83840c2dfc146c5958bf9efcdb7bba20b4\": container with ID starting with ebd303224ee1ba6219baa644b264ad83840c2dfc146c5958bf9efcdb7bba20b4 not found: ID does not exist" containerID="ebd303224ee1ba6219baa644b264ad83840c2dfc146c5958bf9efcdb7bba20b4" Oct 03 13:17:06 crc kubenswrapper[4962]: I1003 13:17:06.114520 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebd303224ee1ba6219baa644b264ad83840c2dfc146c5958bf9efcdb7bba20b4"} err="failed to get container status \"ebd303224ee1ba6219baa644b264ad83840c2dfc146c5958bf9efcdb7bba20b4\": rpc error: code = NotFound desc = could not find container \"ebd303224ee1ba6219baa644b264ad83840c2dfc146c5958bf9efcdb7bba20b4\": container with ID starting with ebd303224ee1ba6219baa644b264ad83840c2dfc146c5958bf9efcdb7bba20b4 not found: ID does not exist" Oct 03 13:17:06 crc kubenswrapper[4962]: I1003 13:17:06.114545 4962 scope.go:117] "RemoveContainer" containerID="7c748f0bb3a7a35ffcd899979db48222f05a9821400ac3e3dccbbd992dcd6a1a" Oct 03 13:17:06 crc kubenswrapper[4962]: I1003 13:17:06.117904 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-c97f5c65f-s279k"] Oct 03 13:17:06 crc kubenswrapper[4962]: I1003 13:17:06.131151 4962 scope.go:117] "RemoveContainer" 
containerID="e9d5e1c76351743f9337f95aaf1cd1c6eb52345b3f44a6356123c4643e580e73" Oct 03 13:17:06 crc kubenswrapper[4962]: I1003 13:17:06.133313 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bx2f\" (UniqueName: \"kubernetes.io/projected/dab0e7ec-9c64-491d-a655-027098042378-kube-api-access-9bx2f\") on node \"crc\" DevicePath \"\"" Oct 03 13:17:06 crc kubenswrapper[4962]: I1003 13:17:06.133338 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dab0e7ec-9c64-491d-a655-027098042378-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 13:17:06 crc kubenswrapper[4962]: I1003 13:17:06.154296 4962 scope.go:117] "RemoveContainer" containerID="7c748f0bb3a7a35ffcd899979db48222f05a9821400ac3e3dccbbd992dcd6a1a" Oct 03 13:17:06 crc kubenswrapper[4962]: E1003 13:17:06.154762 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c748f0bb3a7a35ffcd899979db48222f05a9821400ac3e3dccbbd992dcd6a1a\": container with ID starting with 7c748f0bb3a7a35ffcd899979db48222f05a9821400ac3e3dccbbd992dcd6a1a not found: ID does not exist" containerID="7c748f0bb3a7a35ffcd899979db48222f05a9821400ac3e3dccbbd992dcd6a1a" Oct 03 13:17:06 crc kubenswrapper[4962]: I1003 13:17:06.154798 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c748f0bb3a7a35ffcd899979db48222f05a9821400ac3e3dccbbd992dcd6a1a"} err="failed to get container status \"7c748f0bb3a7a35ffcd899979db48222f05a9821400ac3e3dccbbd992dcd6a1a\": rpc error: code = NotFound desc = could not find container \"7c748f0bb3a7a35ffcd899979db48222f05a9821400ac3e3dccbbd992dcd6a1a\": container with ID starting with 7c748f0bb3a7a35ffcd899979db48222f05a9821400ac3e3dccbbd992dcd6a1a not found: ID does not exist" Oct 03 13:17:06 crc kubenswrapper[4962]: I1003 13:17:06.154824 4962 scope.go:117] "RemoveContainer" containerID="e9d5e1c76351743f9337f95aaf1cd1c6eb52345b3f44a6356123c4643e580e73" Oct 03 13:17:06 crc kubenswrapper[4962]: E1003 13:17:06.155171 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9d5e1c76351743f9337f95aaf1cd1c6eb52345b3f44a6356123c4643e580e73\": container with ID starting with e9d5e1c76351743f9337f95aaf1cd1c6eb52345b3f44a6356123c4643e580e73 not found: ID does not exist" containerID="e9d5e1c76351743f9337f95aaf1cd1c6eb52345b3f44a6356123c4643e580e73" Oct 03 13:17:06 crc kubenswrapper[4962]: I1003 13:17:06.155201 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9d5e1c76351743f9337f95aaf1cd1c6eb52345b3f44a6356123c4643e580e73"} err="failed to get container status \"e9d5e1c76351743f9337f95aaf1cd1c6eb52345b3f44a6356123c4643e580e73\": rpc error: code = NotFound desc = could not find container \"e9d5e1c76351743f9337f95aaf1cd1c6eb52345b3f44a6356123c4643e580e73\": container with ID starting with e9d5e1c76351743f9337f95aaf1cd1c6eb52345b3f44a6356123c4643e580e73 not found: ID does not exist" Oct 03 13:17:06 crc kubenswrapper[4962]: I1003 13:17:06.172540 4962 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod438da193-7b02-4101-a45c-9e0f83c41051"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod438da193-7b02-4101-a45c-9e0f83c41051] : Timed out while waiting for systemd to remove kubepods-besteffort-pod438da193_7b02_4101_a45c_9e0f83c41051.slice" Oct 03 13:17:06 crc 
Oct 03 13:17:06 crc kubenswrapper[4962]: E1003 13:17:06.172578 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod438da193-7b02-4101-a45c-9e0f83c41051] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod438da193-7b02-4101-a45c-9e0f83c41051] : Timed out while waiting for systemd to remove kubepods-besteffort-pod438da193_7b02_4101_a45c_9e0f83c41051.slice" pod="openstack/openstack-cell1-galera-0" podUID="438da193-7b02-4101-a45c-9e0f83c41051" Oct 03 13:17:06 crc kubenswrapper[4962]: I1003 13:17:06.238229 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ae87940-f07d-4213-bc0b-da0b3a2bba84" path="/var/lib/kubelet/pods/0ae87940-f07d-4213-bc0b-da0b3a2bba84/volumes" Oct 03 13:17:06 crc kubenswrapper[4962]: I1003 13:17:06.238860 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b763061-bb23-4c23-a4ec-bebac231c603" path="/var/lib/kubelet/pods/1b763061-bb23-4c23-a4ec-bebac231c603/volumes" Oct 03 13:17:06 crc kubenswrapper[4962]: I1003 13:17:06.239367 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dab0e7ec-9c64-491d-a655-027098042378" path="/var/lib/kubelet/pods/dab0e7ec-9c64-491d-a655-027098042378/volumes" Oct 03 13:17:06 crc kubenswrapper[4962]: I1003 13:17:06.239785 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6f0fc0a-ae8e-445e-ad05-591b7ab00886" path="/var/lib/kubelet/pods/e6f0fc0a-ae8e-445e-ad05-591b7ab00886/volumes" Oct 03 13:17:07 crc kubenswrapper[4962]: I1003 13:17:07.039179 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 03 13:17:07 crc kubenswrapper[4962]: I1003 13:17:07.085598 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 03 13:17:07 crc kubenswrapper[4962]: I1003 13:17:07.091627 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 03 13:17:08 crc kubenswrapper[4962]: I1003 13:17:08.240435 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="438da193-7b02-4101-a45c-9e0f83c41051" path="/var/lib/kubelet/pods/438da193-7b02-4101-a45c-9e0f83c41051/volumes" Oct 03 13:17:10 crc kubenswrapper[4962]: I1003 13:17:10.226914 4962 scope.go:117] "RemoveContainer" containerID="a5bec38aec5fb9d5911cd2ecc57792eefe10d43cef3b74d0517cf307d832bc5c" Oct 03 13:17:10 crc kubenswrapper[4962]: E1003 13:17:10.227153 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:17:24 crc kubenswrapper[4962]: I1003 13:17:24.227468 4962 scope.go:117] "RemoveContainer" containerID="a5bec38aec5fb9d5911cd2ecc57792eefe10d43cef3b74d0517cf307d832bc5c" Oct 03 13:17:24 crc kubenswrapper[4962]: E1003 13:17:24.228925 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
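
The CrashLoopBackOff errors above (and the ones that follow) repeat every few tens of seconds because machine-config-daemon keeps failing and its restart back-off has plateaued at the cap. A minimal sketch of that escalation follows; only the 5m0s cap is visible in the log itself, while the 10s starting point and doubling factor are assumptions for illustration, not confirmed kubelet constants.

```go
// Sketch of a crash-loop restart back-off that doubles up to a 5m cap.
package main

import (
	"fmt"
	"time"
)

func main() {
	const (
		start    = 10 * time.Second // assumed initial restart delay
		maxDelay = 5 * time.Minute  // the "back-off 5m0s" plateau in the log
	)
	delay := start
	for restart := 1; restart <= 8; restart++ {
		fmt.Printf("restart %d: back-off %v restarting failed container\n", restart, delay)
		if delay *= 2; delay > maxDelay {
			delay = maxDelay // from here every sync reports the same 5m0s back-off
		}
	}
}
```

Once the cap is reached, each sync attempt just re-logs "back-off 5m0s restarting failed container", which is why the identical error recurs at 13:17:10, 13:17:24, 13:17:38, 13:17:49, and 13:18:02 below.
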
pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:17:38 crc kubenswrapper[4962]: I1003 13:17:38.226576 4962 scope.go:117] "RemoveContainer" containerID="a5bec38aec5fb9d5911cd2ecc57792eefe10d43cef3b74d0517cf307d832bc5c" Oct 03 13:17:38 crc kubenswrapper[4962]: E1003 13:17:38.227297 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:17:49 crc kubenswrapper[4962]: I1003 13:17:49.226827 4962 scope.go:117] "RemoveContainer" containerID="a5bec38aec5fb9d5911cd2ecc57792eefe10d43cef3b74d0517cf307d832bc5c" Oct 03 13:17:49 crc kubenswrapper[4962]: E1003 13:17:49.227572 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:17:56 crc kubenswrapper[4962]: I1003 13:17:56.704269 4962 scope.go:117] "RemoveContainer" containerID="706277ad3b96d3b0d8687160b11c07b80babd5d0827c39c2e0bc9cc9e42f7d03" Oct 03 13:17:56 crc kubenswrapper[4962]: I1003 13:17:56.729884 4962 scope.go:117] "RemoveContainer" containerID="016f210b8c91d8e9c5f93eeed437c69115693182129858f3582cdcd912c4dd79" Oct 03 13:17:56 crc kubenswrapper[4962]: I1003 13:17:56.747113 4962 scope.go:117] "RemoveContainer" containerID="cc095d6f6d8b5824a32ad688e66fd5a34700bd68a8048d5bb8c9727930860221" Oct 03 13:18:02 crc kubenswrapper[4962]: I1003 13:18:02.231032 4962 scope.go:117] "RemoveContainer" containerID="a5bec38aec5fb9d5911cd2ecc57792eefe10d43cef3b74d0517cf307d832bc5c" Oct 03 13:18:02 crc kubenswrapper[4962]: E1003 13:18:02.232121 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.634542 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cx8j9"] Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.635457 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6f0fc0a-ae8e-445e-ad05-591b7ab00886" containerName="barbican-keystone-listener-log" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.635487 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6f0fc0a-ae8e-445e-ad05-591b7ab00886" containerName="barbican-keystone-listener-log" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.635503 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c876ef6-c8ab-44d1-9ba4-07f0b5e07695" containerName="openstack-network-exporter" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.635510 4962 
state_mem.go:107] "Deleted CPUSet assignment" podUID="5c876ef6-c8ab-44d1-9ba4-07f0b5e07695" containerName="openstack-network-exporter" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.635530 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05f2e935-e9b5-49ab-8a2a-30b15840bae9" containerName="proxy-httpd" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.635538 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="05f2e935-e9b5-49ab-8a2a-30b15840bae9" containerName="proxy-httpd" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.635554 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cea3d32c-24c3-4a80-a1fb-ad65be7bbba6" containerName="glance-log" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.635561 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="cea3d32c-24c3-4a80-a1fb-ad65be7bbba6" containerName="glance-log" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.635572 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfeca1b1-fa87-4490-9e99-38e60d421138" containerName="mariadb-account-delete" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.635581 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfeca1b1-fa87-4490-9e99-38e60d421138" containerName="mariadb-account-delete" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.635595 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32e6592a-d206-4931-aa99-a84e041b05e4" containerName="memcached" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.635603 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="32e6592a-d206-4931-aa99-a84e041b05e4" containerName="memcached" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.635614 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3fb0456-394e-4041-829b-57c162966b2b" containerName="mysql-bootstrap" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.635622 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3fb0456-394e-4041-829b-57c162966b2b" containerName="mysql-bootstrap" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.635630 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2af174c7-cf23-452c-bc13-ecda2775d58d" containerName="ovsdbserver-nb" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.635655 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2af174c7-cf23-452c-bc13-ecda2775d58d" containerName="ovsdbserver-nb" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.635671 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15ad9c69-05d8-4b75-82cc-f23f6303d7d7" containerName="ceilometer-notification-agent" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.635681 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="15ad9c69-05d8-4b75-82cc-f23f6303d7d7" containerName="ceilometer-notification-agent" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.635704 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3fb0456-394e-4041-829b-57c162966b2b" containerName="galera" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.635712 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3fb0456-394e-4041-829b-57c162966b2b" containerName="galera" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.635725 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd269d6d-5aa2-43c0-a23b-e76b52699d59" containerName="nova-metadata-metadata" Oct 03 13:18:09 crc 
kubenswrapper[4962]: I1003 13:18:09.635746 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd269d6d-5aa2-43c0-a23b-e76b52699d59" containerName="nova-metadata-metadata" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.635758 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="221bdd26-0fec-49e5-86ec-c2aefe7a5902" containerName="rabbitmq" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.635766 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="221bdd26-0fec-49e5-86ec-c2aefe7a5902" containerName="rabbitmq" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.635779 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="account-reaper" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.635788 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="account-reaper" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.635802 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0da1427-1e89-42d6-beb2-55f292945177" containerName="glance-log" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.635810 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0da1427-1e89-42d6-beb2-55f292945177" containerName="glance-log" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.635817 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ae87940-f07d-4213-bc0b-da0b3a2bba84" containerName="barbican-worker-log" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.635825 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ae87940-f07d-4213-bc0b-da0b3a2bba84" containerName="barbican-worker-log" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.635837 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ae29e17-1d99-4401-a317-9c8b7be58a3c" containerName="cinder-api-log" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.635845 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ae29e17-1d99-4401-a317-9c8b7be58a3c" containerName="cinder-api-log" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.635857 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15ad9c69-05d8-4b75-82cc-f23f6303d7d7" containerName="ceilometer-central-agent" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.635865 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="15ad9c69-05d8-4b75-82cc-f23f6303d7d7" containerName="ceilometer-central-agent" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.635874 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1289d443-56d2-4f63-8802-66bcd0569b3b" containerName="placement-api" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.635881 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="1289d443-56d2-4f63-8802-66bcd0569b3b" containerName="placement-api" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.635896 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="438da193-7b02-4101-a45c-9e0f83c41051" containerName="mysql-bootstrap" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.635904 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="438da193-7b02-4101-a45c-9e0f83c41051" containerName="mysql-bootstrap" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.635917 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="221bdd26-0fec-49e5-86ec-c2aefe7a5902" containerName="setup-container" 
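
This long run of RemoveStaleState / "Deleted CPUSet assignment" pairs is the CPU manager reacting to the "SyncLoop ADD" above: before admitting redhat-operators-cx8j9 it scrubs per-container CPU assignments left behind by pods that no longer exist on the node. A simplified sketch of that cleanup follows, over a map keyed by podUID and container name; the names are invented and the real state is checkpointed, so this illustrates only the shape of the operation.

```go
// Sketch: drop CPU assignments for containers whose pods are no longer live.
package main

import "fmt"

// assignments: podUID -> containerName -> assigned CPU set (a string here).
type assignments map[string]map[string]string

func removeStaleState(st assignments, live map[string]bool) {
	for podUID, containers := range st {
		if live[podUID] {
			continue // pod still exists; keep its assignments
		}
		for name := range containers {
			fmt.Printf("RemoveStaleState: removing container podUID=%q containerName=%q\n",
				podUID, name)
			delete(containers, name)
			fmt.Printf("Deleted CPUSet assignment podUID=%q containerName=%q\n",
				podUID, name)
		}
		delete(st, podUID)
	}
}

func main() {
	st := assignments{
		"221bdd26-0fec-49e5-86ec-c2aefe7a5902": {"rabbitmq": "2-3", "setup-container": "1"},
	}
	removeStaleState(st, map[string]bool{}) // no live pods: everything is stale
}
```

One pair of log lines per stale container explains why dozens of these records appear back-to-back: every container of every recently deleted OpenStack pod gets its own RemoveStaleState / Deleted CPUSet assignment couplet.
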
Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.635925 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="221bdd26-0fec-49e5-86ec-c2aefe7a5902" containerName="setup-container" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.635936 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6313803e-1bf1-4a99-8af7-cb80c0e6321c" containerName="ovsdbserver-sb" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.635943 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="6313803e-1bf1-4a99-8af7-cb80c0e6321c" containerName="ovsdbserver-sb" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.635956 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cea3d32c-24c3-4a80-a1fb-ad65be7bbba6" containerName="glance-httpd" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.635965 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="cea3d32c-24c3-4a80-a1fb-ad65be7bbba6" containerName="glance-httpd" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.635973 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b763061-bb23-4c23-a4ec-bebac231c603" containerName="mariadb-account-delete" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.635980 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b763061-bb23-4c23-a4ec-bebac231c603" containerName="mariadb-account-delete" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.635994 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="object-auditor" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.636001 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="object-auditor" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.636012 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2af174c7-cf23-452c-bc13-ecda2775d58d" containerName="openstack-network-exporter" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.636019 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2af174c7-cf23-452c-bc13-ecda2775d58d" containerName="openstack-network-exporter" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.636028 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d36308a0-1b17-4986-adb2-2833b444a239" containerName="nova-scheduler-scheduler" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.636036 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="d36308a0-1b17-4986-adb2-2833b444a239" containerName="nova-scheduler-scheduler" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.636048 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ae87940-f07d-4213-bc0b-da0b3a2bba84" containerName="barbican-worker" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.636057 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ae87940-f07d-4213-bc0b-da0b3a2bba84" containerName="barbican-worker" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.636065 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56923e91-36c0-432d-8042-138d2e89eb3b" containerName="mariadb-account-delete" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.636072 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="56923e91-36c0-432d-8042-138d2e89eb3b" containerName="mariadb-account-delete" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.636081 4962 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6d6f62dd-0720-46b6-b0a8-497490f052a8" containerName="ovn-controller" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.636089 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d6f62dd-0720-46b6-b0a8-497490f052a8" containerName="ovn-controller" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.636102 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="account-server" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.636110 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="account-server" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.636122 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1289d443-56d2-4f63-8802-66bcd0569b3b" containerName="placement-log" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.636132 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="1289d443-56d2-4f63-8802-66bcd0569b3b" containerName="placement-log" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.636147 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72df0792-9904-4b64-9c70-37cb982fe24b" containerName="kube-state-metrics" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.636157 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="72df0792-9904-4b64-9c70-37cb982fe24b" containerName="kube-state-metrics" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.636173 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb190059-74a6-4ffe-88a4-5fcfd46812a0" containerName="dnsmasq-dns" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.636182 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb190059-74a6-4ffe-88a4-5fcfd46812a0" containerName="dnsmasq-dns" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.636195 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="account-auditor" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.636203 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="account-auditor" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.636214 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15ad9c69-05d8-4b75-82cc-f23f6303d7d7" containerName="sg-core" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.636227 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="15ad9c69-05d8-4b75-82cc-f23f6303d7d7" containerName="sg-core" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.636240 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="009b2959-1113-4574-a2ec-90bbe2d8f8ef" containerName="probe" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.636250 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="009b2959-1113-4574-a2ec-90bbe2d8f8ef" containerName="probe" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.636262 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c111271-43ed-48b3-b6ed-a6d02efb9113" containerName="barbican-keystone-listener-log" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.636270 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c111271-43ed-48b3-b6ed-a6d02efb9113" containerName="barbican-keystone-listener-log" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.636285 4962 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="3c111271-43ed-48b3-b6ed-a6d02efb9113" containerName="barbican-keystone-listener" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.636294 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c111271-43ed-48b3-b6ed-a6d02efb9113" containerName="barbican-keystone-listener" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.636310 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="object-replicator" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.636318 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="object-replicator" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.636331 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="object-updater" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.636341 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="object-updater" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.636354 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="account-replicator" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.636364 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="account-replicator" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.636383 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="container-updater" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.636397 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="container-updater" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.636416 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6313803e-1bf1-4a99-8af7-cb80c0e6321c" containerName="openstack-network-exporter" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.636424 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="6313803e-1bf1-4a99-8af7-cb80c0e6321c" containerName="openstack-network-exporter" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.636433 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c876ef6-c8ab-44d1-9ba4-07f0b5e07695" containerName="ovn-northd" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.636441 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c876ef6-c8ab-44d1-9ba4-07f0b5e07695" containerName="ovn-northd" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.636454 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15ad9c69-05d8-4b75-82cc-f23f6303d7d7" containerName="proxy-httpd" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.636462 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="15ad9c69-05d8-4b75-82cc-f23f6303d7d7" containerName="proxy-httpd" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.636473 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cb4dab0-1ffc-49d4-a229-1862a33d4caa" containerName="ovsdb-server" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.636480 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cb4dab0-1ffc-49d4-a229-1862a33d4caa" containerName="ovsdb-server" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.636493 
4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40dc7e17-4436-4452-a266-65d57a67779d" containerName="neutron-httpd" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.636501 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="40dc7e17-4436-4452-a266-65d57a67779d" containerName="neutron-httpd" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.636509 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="rsync" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.636516 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="rsync" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.636528 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e09f26ad-247c-477a-9d73-a2a0f8df91e8" containerName="mariadb-account-delete" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.636536 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="e09f26ad-247c-477a-9d73-a2a0f8df91e8" containerName="mariadb-account-delete" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.636549 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d329c4da-aa05-4c80-ab30-622eac56428a" containerName="nova-api-api" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.636558 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="d329c4da-aa05-4c80-ab30-622eac56428a" containerName="nova-api-api" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.636570 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="009b2959-1113-4574-a2ec-90bbe2d8f8ef" containerName="cinder-scheduler" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.636578 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="009b2959-1113-4574-a2ec-90bbe2d8f8ef" containerName="cinder-scheduler" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.636587 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d22955d6-a957-458f-8181-5fea18cedc90" containerName="nova-cell1-conductor-conductor" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.636595 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="d22955d6-a957-458f-8181-5fea18cedc90" containerName="nova-cell1-conductor-conductor" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.636604 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="container-server" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.636611 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="container-server" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.636621 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="container-replicator" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.636629 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="container-replicator" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.636664 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="862ad9df-af58-4304-9ad5-7faba334e2d9" containerName="rabbitmq" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.636675 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="862ad9df-af58-4304-9ad5-7faba334e2d9" containerName="rabbitmq" Oct 03 13:18:09 crc 
kubenswrapper[4962]: E1003 13:18:09.636689 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ecb3944-c441-4879-8220-aa32d7436c1f" containerName="barbican-worker-log" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.636699 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ecb3944-c441-4879-8220-aa32d7436c1f" containerName="barbican-worker-log" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.636709 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="438da193-7b02-4101-a45c-9e0f83c41051" containerName="galera" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.636718 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="438da193-7b02-4101-a45c-9e0f83c41051" containerName="galera" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.636732 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="container-auditor" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.636739 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="container-auditor" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.636753 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40dc7e17-4436-4452-a266-65d57a67779d" containerName="neutron-api" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.636760 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="40dc7e17-4436-4452-a266-65d57a67779d" containerName="neutron-api" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.636769 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c56a1d1d-7e30-4bb8-a5a7-068afc055cb8" containerName="barbican-api-log" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.636777 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="c56a1d1d-7e30-4bb8-a5a7-068afc055cb8" containerName="barbican-api-log" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.636787 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05f2e935-e9b5-49ab-8a2a-30b15840bae9" containerName="proxy-server" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.636795 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="05f2e935-e9b5-49ab-8a2a-30b15840bae9" containerName="proxy-server" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.636809 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e098e6f-ec3b-41e6-b179-6c196ad1fe49" containerName="mariadb-account-delete" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.636816 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e098e6f-ec3b-41e6-b179-6c196ad1fe49" containerName="mariadb-account-delete" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.636828 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="object-server" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.636836 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="object-server" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.636845 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="object-expirer" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.636852 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="object-expirer" Oct 03 
13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.636862 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c56a1d1d-7e30-4bb8-a5a7-068afc055cb8" containerName="barbican-api" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.636870 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="c56a1d1d-7e30-4bb8-a5a7-068afc055cb8" containerName="barbican-api" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.636879 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6f0fc0a-ae8e-445e-ad05-591b7ab00886" containerName="barbican-keystone-listener" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.636887 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6f0fc0a-ae8e-445e-ad05-591b7ab00886" containerName="barbican-keystone-listener" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.636899 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d329c4da-aa05-4c80-ab30-622eac56428a" containerName="nova-api-log" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.636906 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="d329c4da-aa05-4c80-ab30-622eac56428a" containerName="nova-api-log" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.636917 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b42a368b-6dd4-4bb0-83a8-d79138605ec9" containerName="mariadb-account-delete" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.636925 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b42a368b-6dd4-4bb0-83a8-d79138605ec9" containerName="mariadb-account-delete" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.636934 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85ea0653-966b-47ff-b8aa-b6ad2b5810ca" containerName="nova-cell0-conductor-conductor" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.636942 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="85ea0653-966b-47ff-b8aa-b6ad2b5810ca" containerName="nova-cell0-conductor-conductor" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.636959 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67b77bc9-27ae-4994-86c2-614e48ad33c6" containerName="openstack-network-exporter" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.636966 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="67b77bc9-27ae-4994-86c2-614e48ad33c6" containerName="openstack-network-exporter" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.636975 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0da1427-1e89-42d6-beb2-55f292945177" containerName="glance-httpd" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.636983 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0da1427-1e89-42d6-beb2-55f292945177" containerName="glance-httpd" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.636992 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cb4dab0-1ffc-49d4-a229-1862a33d4caa" containerName="ovsdb-server-init" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.636999 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cb4dab0-1ffc-49d4-a229-1862a33d4caa" containerName="ovsdb-server-init" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.637009 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ecb3944-c441-4879-8220-aa32d7436c1f" containerName="barbican-worker" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.637017 4962 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="2ecb3944-c441-4879-8220-aa32d7436c1f" containerName="barbican-worker" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.637029 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd269d6d-5aa2-43c0-a23b-e76b52699d59" containerName="nova-metadata-log" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.637036 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd269d6d-5aa2-43c0-a23b-e76b52699d59" containerName="nova-metadata-log" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.637046 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="862ad9df-af58-4304-9ad5-7faba334e2d9" containerName="setup-container" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.637053 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="862ad9df-af58-4304-9ad5-7faba334e2d9" containerName="setup-container" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.637061 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cb4dab0-1ffc-49d4-a229-1862a33d4caa" containerName="ovs-vswitchd" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.637068 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cb4dab0-1ffc-49d4-a229-1862a33d4caa" containerName="ovs-vswitchd" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.637079 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85710c21-98fe-4148-8ef1-ec9f4e9ef311" containerName="nova-cell1-novncproxy-novncproxy" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.637086 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="85710c21-98fe-4148-8ef1-ec9f4e9ef311" containerName="nova-cell1-novncproxy-novncproxy" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.637097 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ae29e17-1d99-4401-a317-9c8b7be58a3c" containerName="cinder-api" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.637104 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ae29e17-1d99-4401-a317-9c8b7be58a3c" containerName="cinder-api" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.637118 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6cba65d-0ae5-4a81-88c1-da4e07d7a803" containerName="keystone-api" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.637125 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6cba65d-0ae5-4a81-88c1-da4e07d7a803" containerName="keystone-api" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.637137 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="swift-recon-cron" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.637144 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="swift-recon-cron" Oct 03 13:18:09 crc kubenswrapper[4962]: E1003 13:18:09.637157 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb190059-74a6-4ffe-88a4-5fcfd46812a0" containerName="init" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.637164 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb190059-74a6-4ffe-88a4-5fcfd46812a0" containerName="init" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.637317 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="862ad9df-af58-4304-9ad5-7faba334e2d9" containerName="rabbitmq" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.637332 4962 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="swift-recon-cron" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.637348 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="05f2e935-e9b5-49ab-8a2a-30b15840bae9" containerName="proxy-httpd" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.637364 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="67b77bc9-27ae-4994-86c2-614e48ad33c6" containerName="openstack-network-exporter" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.637374 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="85ea0653-966b-47ff-b8aa-b6ad2b5810ca" containerName="nova-cell0-conductor-conductor" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.637385 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="object-updater" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.637397 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="1289d443-56d2-4f63-8802-66bcd0569b3b" containerName="placement-api" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.637410 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="009b2959-1113-4574-a2ec-90bbe2d8f8ef" containerName="probe" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.637419 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c111271-43ed-48b3-b6ed-a6d02efb9113" containerName="barbican-keystone-listener" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.637432 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="6313803e-1bf1-4a99-8af7-cb80c0e6321c" containerName="ovsdbserver-sb" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.637448 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cb4dab0-1ffc-49d4-a229-1862a33d4caa" containerName="ovsdb-server" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.637463 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="40dc7e17-4436-4452-a266-65d57a67779d" containerName="neutron-api" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.637475 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b763061-bb23-4c23-a4ec-bebac231c603" containerName="mariadb-account-delete" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.637495 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="438da193-7b02-4101-a45c-9e0f83c41051" containerName="galera" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.637512 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="1289d443-56d2-4f63-8802-66bcd0569b3b" containerName="placement-log" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.637529 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0da1427-1e89-42d6-beb2-55f292945177" containerName="glance-log" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.637540 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="cea3d32c-24c3-4a80-a1fb-ad65be7bbba6" containerName="glance-log" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.637553 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="32e6592a-d206-4931-aa99-a84e041b05e4" containerName="memcached" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.637564 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="object-server" 
Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.637579 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="d329c4da-aa05-4c80-ab30-622eac56428a" containerName="nova-api-api" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.637593 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd269d6d-5aa2-43c0-a23b-e76b52699d59" containerName="nova-metadata-metadata" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.637605 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cb4dab0-1ffc-49d4-a229-1862a33d4caa" containerName="ovs-vswitchd" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.637615 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c876ef6-c8ab-44d1-9ba4-07f0b5e07695" containerName="ovn-northd" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.637626 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="account-auditor" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.637814 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="account-reaper" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.637828 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0da1427-1e89-42d6-beb2-55f292945177" containerName="glance-httpd" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.637858 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="account-server" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.637869 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6f0fc0a-ae8e-445e-ad05-591b7ab00886" containerName="barbican-keystone-listener-log" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.637883 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="container-updater" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.637897 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b42a368b-6dd4-4bb0-83a8-d79138605ec9" containerName="mariadb-account-delete" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.637907 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="rsync" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.637918 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ae29e17-1d99-4401-a317-9c8b7be58a3c" containerName="cinder-api-log" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.637928 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="009b2959-1113-4574-a2ec-90bbe2d8f8ef" containerName="cinder-scheduler" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.637936 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ae87940-f07d-4213-bc0b-da0b3a2bba84" containerName="barbican-worker-log" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.637948 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="221bdd26-0fec-49e5-86ec-c2aefe7a5902" containerName="rabbitmq" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.637961 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="d329c4da-aa05-4c80-ab30-622eac56428a" containerName="nova-api-log" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.637975 4962 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="2ecb3944-c441-4879-8220-aa32d7436c1f" containerName="barbican-worker-log" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.637986 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="account-replicator" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.637999 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="40dc7e17-4436-4452-a266-65d57a67779d" containerName="neutron-httpd" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.638008 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd269d6d-5aa2-43c0-a23b-e76b52699d59" containerName="nova-metadata-log" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.638015 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6f0fc0a-ae8e-445e-ad05-591b7ab00886" containerName="barbican-keystone-listener" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.638028 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="85710c21-98fe-4148-8ef1-ec9f4e9ef311" containerName="nova-cell1-novncproxy-novncproxy" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.638037 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="05f2e935-e9b5-49ab-8a2a-30b15840bae9" containerName="proxy-server" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.638048 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="15ad9c69-05d8-4b75-82cc-f23f6303d7d7" containerName="ceilometer-notification-agent" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.638056 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d6f62dd-0720-46b6-b0a8-497490f052a8" containerName="ovn-controller" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.638066 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ae29e17-1d99-4401-a317-9c8b7be58a3c" containerName="cinder-api" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.638076 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="6313803e-1bf1-4a99-8af7-cb80c0e6321c" containerName="openstack-network-exporter" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.638088 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2af174c7-cf23-452c-bc13-ecda2775d58d" containerName="openstack-network-exporter" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.638097 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="cea3d32c-24c3-4a80-a1fb-ad65be7bbba6" containerName="glance-httpd" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.638106 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ae87940-f07d-4213-bc0b-da0b3a2bba84" containerName="barbican-worker" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.638113 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2af174c7-cf23-452c-bc13-ecda2775d58d" containerName="ovsdbserver-nb" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.638122 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3fb0456-394e-4041-829b-57c162966b2b" containerName="galera" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.638130 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c876ef6-c8ab-44d1-9ba4-07f0b5e07695" containerName="openstack-network-exporter" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.638141 4962 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d22955d6-a957-458f-8181-5fea18cedc90" containerName="nova-cell1-conductor-conductor" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.638151 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="object-auditor" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.638163 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="15ad9c69-05d8-4b75-82cc-f23f6303d7d7" containerName="proxy-httpd" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.638175 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb190059-74a6-4ffe-88a4-5fcfd46812a0" containerName="dnsmasq-dns" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.638183 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="15ad9c69-05d8-4b75-82cc-f23f6303d7d7" containerName="ceilometer-central-agent" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.638191 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="56923e91-36c0-432d-8042-138d2e89eb3b" containerName="mariadb-account-delete" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.638200 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="e09f26ad-247c-477a-9d73-a2a0f8df91e8" containerName="mariadb-account-delete" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.638213 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="c56a1d1d-7e30-4bb8-a5a7-068afc055cb8" containerName="barbican-api-log" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.638222 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6cba65d-0ae5-4a81-88c1-da4e07d7a803" containerName="keystone-api" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.638234 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="object-replicator" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.638247 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="d36308a0-1b17-4986-adb2-2833b444a239" containerName="nova-scheduler-scheduler" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.638257 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="72df0792-9904-4b64-9c70-37cb982fe24b" containerName="kube-state-metrics" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.638265 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfeca1b1-fa87-4490-9e99-38e60d421138" containerName="mariadb-account-delete" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.638275 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e098e6f-ec3b-41e6-b179-6c196ad1fe49" containerName="mariadb-account-delete" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.638308 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="c56a1d1d-7e30-4bb8-a5a7-068afc055cb8" containerName="barbican-api" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.638317 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ecb3944-c441-4879-8220-aa32d7436c1f" containerName="barbican-worker" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.638324 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="container-replicator" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.638333 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" 
containerName="object-expirer" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.638345 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="container-server" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.638354 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4b582ce-b618-4911-b554-f5cae9bcee91" containerName="container-auditor" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.638363 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="15ad9c69-05d8-4b75-82cc-f23f6303d7d7" containerName="sg-core" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.638371 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c111271-43ed-48b3-b6ed-a6d02efb9113" containerName="barbican-keystone-listener-log" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.639610 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cx8j9" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.648325 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cx8j9"] Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.708414 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckr2z\" (UniqueName: \"kubernetes.io/projected/2665ea2d-bc10-47be-b310-696c8e17c696-kube-api-access-ckr2z\") pod \"redhat-operators-cx8j9\" (UID: \"2665ea2d-bc10-47be-b310-696c8e17c696\") " pod="openshift-marketplace/redhat-operators-cx8j9" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.708489 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2665ea2d-bc10-47be-b310-696c8e17c696-catalog-content\") pod \"redhat-operators-cx8j9\" (UID: \"2665ea2d-bc10-47be-b310-696c8e17c696\") " pod="openshift-marketplace/redhat-operators-cx8j9" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.708725 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2665ea2d-bc10-47be-b310-696c8e17c696-utilities\") pod \"redhat-operators-cx8j9\" (UID: \"2665ea2d-bc10-47be-b310-696c8e17c696\") " pod="openshift-marketplace/redhat-operators-cx8j9" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.810434 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2665ea2d-bc10-47be-b310-696c8e17c696-catalog-content\") pod \"redhat-operators-cx8j9\" (UID: \"2665ea2d-bc10-47be-b310-696c8e17c696\") " pod="openshift-marketplace/redhat-operators-cx8j9" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.810498 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2665ea2d-bc10-47be-b310-696c8e17c696-utilities\") pod \"redhat-operators-cx8j9\" (UID: \"2665ea2d-bc10-47be-b310-696c8e17c696\") " pod="openshift-marketplace/redhat-operators-cx8j9" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.810665 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckr2z\" (UniqueName: \"kubernetes.io/projected/2665ea2d-bc10-47be-b310-696c8e17c696-kube-api-access-ckr2z\") pod \"redhat-operators-cx8j9\" (UID: 
\"2665ea2d-bc10-47be-b310-696c8e17c696\") " pod="openshift-marketplace/redhat-operators-cx8j9" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.811082 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2665ea2d-bc10-47be-b310-696c8e17c696-catalog-content\") pod \"redhat-operators-cx8j9\" (UID: \"2665ea2d-bc10-47be-b310-696c8e17c696\") " pod="openshift-marketplace/redhat-operators-cx8j9" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.811147 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2665ea2d-bc10-47be-b310-696c8e17c696-utilities\") pod \"redhat-operators-cx8j9\" (UID: \"2665ea2d-bc10-47be-b310-696c8e17c696\") " pod="openshift-marketplace/redhat-operators-cx8j9" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.830261 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckr2z\" (UniqueName: \"kubernetes.io/projected/2665ea2d-bc10-47be-b310-696c8e17c696-kube-api-access-ckr2z\") pod \"redhat-operators-cx8j9\" (UID: \"2665ea2d-bc10-47be-b310-696c8e17c696\") " pod="openshift-marketplace/redhat-operators-cx8j9" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.831791 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-m4g4s"] Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.833480 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m4g4s" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.846993 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m4g4s"] Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.911694 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45e53a8c-30db-4179-aedd-591b2cae9c7a-utilities\") pod \"certified-operators-m4g4s\" (UID: \"45e53a8c-30db-4179-aedd-591b2cae9c7a\") " pod="openshift-marketplace/certified-operators-m4g4s" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.912060 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkll8\" (UniqueName: \"kubernetes.io/projected/45e53a8c-30db-4179-aedd-591b2cae9c7a-kube-api-access-xkll8\") pod \"certified-operators-m4g4s\" (UID: \"45e53a8c-30db-4179-aedd-591b2cae9c7a\") " pod="openshift-marketplace/certified-operators-m4g4s" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.912126 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45e53a8c-30db-4179-aedd-591b2cae9c7a-catalog-content\") pod \"certified-operators-m4g4s\" (UID: \"45e53a8c-30db-4179-aedd-591b2cae9c7a\") " pod="openshift-marketplace/certified-operators-m4g4s" Oct 03 13:18:09 crc kubenswrapper[4962]: I1003 13:18:09.980518 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cx8j9" Oct 03 13:18:10 crc kubenswrapper[4962]: I1003 13:18:10.013889 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkll8\" (UniqueName: \"kubernetes.io/projected/45e53a8c-30db-4179-aedd-591b2cae9c7a-kube-api-access-xkll8\") pod \"certified-operators-m4g4s\" (UID: \"45e53a8c-30db-4179-aedd-591b2cae9c7a\") " pod="openshift-marketplace/certified-operators-m4g4s" Oct 03 13:18:10 crc kubenswrapper[4962]: I1003 13:18:10.014025 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45e53a8c-30db-4179-aedd-591b2cae9c7a-catalog-content\") pod \"certified-operators-m4g4s\" (UID: \"45e53a8c-30db-4179-aedd-591b2cae9c7a\") " pod="openshift-marketplace/certified-operators-m4g4s" Oct 03 13:18:10 crc kubenswrapper[4962]: I1003 13:18:10.015023 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45e53a8c-30db-4179-aedd-591b2cae9c7a-utilities\") pod \"certified-operators-m4g4s\" (UID: \"45e53a8c-30db-4179-aedd-591b2cae9c7a\") " pod="openshift-marketplace/certified-operators-m4g4s" Oct 03 13:18:10 crc kubenswrapper[4962]: I1003 13:18:10.014877 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45e53a8c-30db-4179-aedd-591b2cae9c7a-catalog-content\") pod \"certified-operators-m4g4s\" (UID: \"45e53a8c-30db-4179-aedd-591b2cae9c7a\") " pod="openshift-marketplace/certified-operators-m4g4s" Oct 03 13:18:10 crc kubenswrapper[4962]: I1003 13:18:10.015607 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45e53a8c-30db-4179-aedd-591b2cae9c7a-utilities\") pod \"certified-operators-m4g4s\" (UID: \"45e53a8c-30db-4179-aedd-591b2cae9c7a\") " pod="openshift-marketplace/certified-operators-m4g4s" Oct 03 13:18:10 crc kubenswrapper[4962]: I1003 13:18:10.038088 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkll8\" (UniqueName: \"kubernetes.io/projected/45e53a8c-30db-4179-aedd-591b2cae9c7a-kube-api-access-xkll8\") pod \"certified-operators-m4g4s\" (UID: \"45e53a8c-30db-4179-aedd-591b2cae9c7a\") " pod="openshift-marketplace/certified-operators-m4g4s" Oct 03 13:18:10 crc kubenswrapper[4962]: I1003 13:18:10.180792 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-m4g4s" Oct 03 13:18:10 crc kubenswrapper[4962]: I1003 13:18:10.441027 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m4g4s"] Oct 03 13:18:10 crc kubenswrapper[4962]: W1003 13:18:10.448613 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45e53a8c_30db_4179_aedd_591b2cae9c7a.slice/crio-fbe2f380ff36450a34ccb17d8b9ffcd944172ecde73e1cfa8db325b308b92d1e WatchSource:0}: Error finding container fbe2f380ff36450a34ccb17d8b9ffcd944172ecde73e1cfa8db325b308b92d1e: Status 404 returned error can't find the container with id fbe2f380ff36450a34ccb17d8b9ffcd944172ecde73e1cfa8db325b308b92d1e Oct 03 13:18:10 crc kubenswrapper[4962]: I1003 13:18:10.486674 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cx8j9"] Oct 03 13:18:10 crc kubenswrapper[4962]: I1003 13:18:10.645860 4962 generic.go:334] "Generic (PLEG): container finished" podID="45e53a8c-30db-4179-aedd-591b2cae9c7a" containerID="2c5ac56801c3b1e4145db34055be6aef791619eab206124f83ec5acd3d2c7436" exitCode=0 Oct 03 13:18:10 crc kubenswrapper[4962]: I1003 13:18:10.645928 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m4g4s" event={"ID":"45e53a8c-30db-4179-aedd-591b2cae9c7a","Type":"ContainerDied","Data":"2c5ac56801c3b1e4145db34055be6aef791619eab206124f83ec5acd3d2c7436"} Oct 03 13:18:10 crc kubenswrapper[4962]: I1003 13:18:10.645956 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m4g4s" event={"ID":"45e53a8c-30db-4179-aedd-591b2cae9c7a","Type":"ContainerStarted","Data":"fbe2f380ff36450a34ccb17d8b9ffcd944172ecde73e1cfa8db325b308b92d1e"} Oct 03 13:18:10 crc kubenswrapper[4962]: I1003 13:18:10.647917 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 13:18:10 crc kubenswrapper[4962]: I1003 13:18:10.648122 4962 generic.go:334] "Generic (PLEG): container finished" podID="2665ea2d-bc10-47be-b310-696c8e17c696" containerID="6609a416e24f6d47e33a6dec5fdbb449ee317c80771829ee0a05ca6efd2eb0e2" exitCode=0 Oct 03 13:18:10 crc kubenswrapper[4962]: I1003 13:18:10.648159 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cx8j9" event={"ID":"2665ea2d-bc10-47be-b310-696c8e17c696","Type":"ContainerDied","Data":"6609a416e24f6d47e33a6dec5fdbb449ee317c80771829ee0a05ca6efd2eb0e2"} Oct 03 13:18:10 crc kubenswrapper[4962]: I1003 13:18:10.648181 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cx8j9" event={"ID":"2665ea2d-bc10-47be-b310-696c8e17c696","Type":"ContainerStarted","Data":"164330af931c603b7587e61e24ad82e5cf1c879ef5080af0caf887f16b2e8b1d"} Oct 03 13:18:11 crc kubenswrapper[4962]: I1003 13:18:11.672531 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m4g4s" event={"ID":"45e53a8c-30db-4179-aedd-591b2cae9c7a","Type":"ContainerStarted","Data":"82953cdd623fd4e1c877fe3e48e3bee0983043679c1c8df8fe183b9537c4daa5"} Oct 03 13:18:12 crc kubenswrapper[4962]: I1003 13:18:12.685864 4962 generic.go:334] "Generic (PLEG): container finished" podID="45e53a8c-30db-4179-aedd-591b2cae9c7a" containerID="82953cdd623fd4e1c877fe3e48e3bee0983043679c1c8df8fe183b9537c4daa5" exitCode=0 Oct 03 13:18:12 crc kubenswrapper[4962]: I1003 
Oct 03 13:18:12 crc kubenswrapper[4962]: I1003 13:18:12.689526 4962 generic.go:334] "Generic (PLEG): container finished" podID="2665ea2d-bc10-47be-b310-696c8e17c696" containerID="2b8e04350c1cefe6576e912d9c6f81bec51c6d3c534fa12a7a9614e8519d54aa" exitCode=0
Oct 03 13:18:12 crc kubenswrapper[4962]: I1003 13:18:12.689584 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cx8j9" event={"ID":"2665ea2d-bc10-47be-b310-696c8e17c696","Type":"ContainerDied","Data":"2b8e04350c1cefe6576e912d9c6f81bec51c6d3c534fa12a7a9614e8519d54aa"}
Oct 03 13:18:13 crc kubenswrapper[4962]: I1003 13:18:13.702170 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m4g4s" event={"ID":"45e53a8c-30db-4179-aedd-591b2cae9c7a","Type":"ContainerStarted","Data":"d2477506a974b4aea6941b02fbc4b5e3bc42d88bf4289a1df7c05c21f7c44d67"}
Oct 03 13:18:13 crc kubenswrapper[4962]: I1003 13:18:13.705543 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cx8j9" event={"ID":"2665ea2d-bc10-47be-b310-696c8e17c696","Type":"ContainerStarted","Data":"6e1fb6cee660ca4f9f07206453f8126eafdcfd5181f036d208f6bad398f80443"}
Oct 03 13:18:13 crc kubenswrapper[4962]: I1003 13:18:13.724020 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-m4g4s" podStartSLOduration=2.034609145 podStartE2EDuration="4.72400176s" podCreationTimestamp="2025-10-03 13:18:09 +0000 UTC" firstStartedPulling="2025-10-03 13:18:10.647230965 +0000 UTC m=+1699.051128800" lastFinishedPulling="2025-10-03 13:18:13.33662354 +0000 UTC m=+1701.740521415" observedRunningTime="2025-10-03 13:18:13.721072801 +0000 UTC m=+1702.124970636" watchObservedRunningTime="2025-10-03 13:18:13.72400176 +0000 UTC m=+1702.127899595"
Oct 03 13:18:13 crc kubenswrapper[4962]: I1003 13:18:13.739132 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cx8j9" podStartSLOduration=2.310407914 podStartE2EDuration="4.739110556s" podCreationTimestamp="2025-10-03 13:18:09 +0000 UTC" firstStartedPulling="2025-10-03 13:18:10.650116513 +0000 UTC m=+1699.054014348" lastFinishedPulling="2025-10-03 13:18:13.078819155 +0000 UTC m=+1701.482716990" observedRunningTime="2025-10-03 13:18:13.737963846 +0000 UTC m=+1702.141861681" watchObservedRunningTime="2025-10-03 13:18:13.739110556 +0000 UTC m=+1702.143008391"
Oct 03 13:18:14 crc kubenswrapper[4962]: I1003 13:18:14.227009 4962 scope.go:117] "RemoveContainer" containerID="a5bec38aec5fb9d5911cd2ecc57792eefe10d43cef3b74d0517cf307d832bc5c"
Oct 03 13:18:14 crc kubenswrapper[4962]: E1003 13:18:14.227271 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
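[Editor's note] The pod_startup_latency_tracker entries above carry enough data to check the reported numbers: the E2E duration spans pod creation to observed running time, and the SLO duration appears to equal E2E minus the image-pull window. The sketch below redoes that subtraction with the monotonic m=+ offsets from the certified-operators-m4g4s entry; it is an observation about the logged values, not a claim about the tracker's implementation.

// startup_latency.go — recompute podStartSLOduration from the capture.
package main

import "fmt"

func main() {
	const (
		e2e              = 4.72400176     // podStartE2EDuration, seconds
		firstStartedPull = 1699.051128800 // firstStartedPulling m=+ offset
		lastFinishedPull = 1701.740521415 // lastFinishedPulling m=+ offset
	)
	pull := lastFinishedPull - firstStartedPull
	fmt.Printf("pull window: %.9fs\n", pull)     // 2.689392615s
	fmt.Printf("e2e - pull:  %.9fs\n", e2e-pull) // 2.034609145s == podStartSLOduration
}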
status="" pod="openshift-marketplace/redhat-operators-cx8j9" Oct 03 13:18:19 crc kubenswrapper[4962]: I1003 13:18:19.981620 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cx8j9" Oct 03 13:18:20 crc kubenswrapper[4962]: I1003 13:18:20.026361 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cx8j9" Oct 03 13:18:20 crc kubenswrapper[4962]: I1003 13:18:20.181215 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-m4g4s" Oct 03 13:18:20 crc kubenswrapper[4962]: I1003 13:18:20.181494 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-m4g4s" Oct 03 13:18:20 crc kubenswrapper[4962]: I1003 13:18:20.235980 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-m4g4s" Oct 03 13:18:20 crc kubenswrapper[4962]: I1003 13:18:20.812002 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-m4g4s" Oct 03 13:18:20 crc kubenswrapper[4962]: I1003 13:18:20.828208 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cx8j9" Oct 03 13:18:21 crc kubenswrapper[4962]: I1003 13:18:21.857238 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m4g4s"] Oct 03 13:18:22 crc kubenswrapper[4962]: I1003 13:18:22.860339 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cx8j9"] Oct 03 13:18:22 crc kubenswrapper[4962]: I1003 13:18:22.860697 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cx8j9" podUID="2665ea2d-bc10-47be-b310-696c8e17c696" containerName="registry-server" containerID="cri-o://6e1fb6cee660ca4f9f07206453f8126eafdcfd5181f036d208f6bad398f80443" gracePeriod=2 Oct 03 13:18:23 crc kubenswrapper[4962]: I1003 13:18:23.751139 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cx8j9" Oct 03 13:18:23 crc kubenswrapper[4962]: I1003 13:18:23.800394 4962 generic.go:334] "Generic (PLEG): container finished" podID="2665ea2d-bc10-47be-b310-696c8e17c696" containerID="6e1fb6cee660ca4f9f07206453f8126eafdcfd5181f036d208f6bad398f80443" exitCode=0 Oct 03 13:18:23 crc kubenswrapper[4962]: I1003 13:18:23.800455 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cx8j9" Oct 03 13:18:23 crc kubenswrapper[4962]: I1003 13:18:23.800459 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cx8j9" event={"ID":"2665ea2d-bc10-47be-b310-696c8e17c696","Type":"ContainerDied","Data":"6e1fb6cee660ca4f9f07206453f8126eafdcfd5181f036d208f6bad398f80443"} Oct 03 13:18:23 crc kubenswrapper[4962]: I1003 13:18:23.800502 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cx8j9" event={"ID":"2665ea2d-bc10-47be-b310-696c8e17c696","Type":"ContainerDied","Data":"164330af931c603b7587e61e24ad82e5cf1c879ef5080af0caf887f16b2e8b1d"} Oct 03 13:18:23 crc kubenswrapper[4962]: I1003 13:18:23.800523 4962 scope.go:117] "RemoveContainer" containerID="6e1fb6cee660ca4f9f07206453f8126eafdcfd5181f036d208f6bad398f80443" Oct 03 13:18:23 crc kubenswrapper[4962]: I1003 13:18:23.800597 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-m4g4s" podUID="45e53a8c-30db-4179-aedd-591b2cae9c7a" containerName="registry-server" containerID="cri-o://d2477506a974b4aea6941b02fbc4b5e3bc42d88bf4289a1df7c05c21f7c44d67" gracePeriod=2 Oct 03 13:18:23 crc kubenswrapper[4962]: I1003 13:18:23.821123 4962 scope.go:117] "RemoveContainer" containerID="2b8e04350c1cefe6576e912d9c6f81bec51c6d3c534fa12a7a9614e8519d54aa" Oct 03 13:18:23 crc kubenswrapper[4962]: I1003 13:18:23.829120 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2665ea2d-bc10-47be-b310-696c8e17c696-catalog-content\") pod \"2665ea2d-bc10-47be-b310-696c8e17c696\" (UID: \"2665ea2d-bc10-47be-b310-696c8e17c696\") " Oct 03 13:18:23 crc kubenswrapper[4962]: I1003 13:18:23.829155 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2665ea2d-bc10-47be-b310-696c8e17c696-utilities\") pod \"2665ea2d-bc10-47be-b310-696c8e17c696\" (UID: \"2665ea2d-bc10-47be-b310-696c8e17c696\") " Oct 03 13:18:23 crc kubenswrapper[4962]: I1003 13:18:23.829206 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckr2z\" (UniqueName: \"kubernetes.io/projected/2665ea2d-bc10-47be-b310-696c8e17c696-kube-api-access-ckr2z\") pod \"2665ea2d-bc10-47be-b310-696c8e17c696\" (UID: \"2665ea2d-bc10-47be-b310-696c8e17c696\") " Oct 03 13:18:23 crc kubenswrapper[4962]: I1003 13:18:23.847352 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2665ea2d-bc10-47be-b310-696c8e17c696-kube-api-access-ckr2z" (OuterVolumeSpecName: "kube-api-access-ckr2z") pod "2665ea2d-bc10-47be-b310-696c8e17c696" (UID: "2665ea2d-bc10-47be-b310-696c8e17c696"). InnerVolumeSpecName "kube-api-access-ckr2z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:18:23 crc kubenswrapper[4962]: I1003 13:18:23.848572 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2665ea2d-bc10-47be-b310-696c8e17c696-utilities" (OuterVolumeSpecName: "utilities") pod "2665ea2d-bc10-47be-b310-696c8e17c696" (UID: "2665ea2d-bc10-47be-b310-696c8e17c696"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:18:23 crc kubenswrapper[4962]: I1003 13:18:23.864274 4962 scope.go:117] "RemoveContainer" containerID="6609a416e24f6d47e33a6dec5fdbb449ee317c80771829ee0a05ca6efd2eb0e2" Oct 03 13:18:23 crc kubenswrapper[4962]: I1003 13:18:23.930713 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2665ea2d-bc10-47be-b310-696c8e17c696-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 13:18:23 crc kubenswrapper[4962]: I1003 13:18:23.930971 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckr2z\" (UniqueName: \"kubernetes.io/projected/2665ea2d-bc10-47be-b310-696c8e17c696-kube-api-access-ckr2z\") on node \"crc\" DevicePath \"\"" Oct 03 13:18:23 crc kubenswrapper[4962]: I1003 13:18:23.933639 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2665ea2d-bc10-47be-b310-696c8e17c696-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2665ea2d-bc10-47be-b310-696c8e17c696" (UID: "2665ea2d-bc10-47be-b310-696c8e17c696"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:18:23 crc kubenswrapper[4962]: I1003 13:18:23.938561 4962 scope.go:117] "RemoveContainer" containerID="6e1fb6cee660ca4f9f07206453f8126eafdcfd5181f036d208f6bad398f80443" Oct 03 13:18:23 crc kubenswrapper[4962]: E1003 13:18:23.939124 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e1fb6cee660ca4f9f07206453f8126eafdcfd5181f036d208f6bad398f80443\": container with ID starting with 6e1fb6cee660ca4f9f07206453f8126eafdcfd5181f036d208f6bad398f80443 not found: ID does not exist" containerID="6e1fb6cee660ca4f9f07206453f8126eafdcfd5181f036d208f6bad398f80443" Oct 03 13:18:23 crc kubenswrapper[4962]: I1003 13:18:23.939172 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e1fb6cee660ca4f9f07206453f8126eafdcfd5181f036d208f6bad398f80443"} err="failed to get container status \"6e1fb6cee660ca4f9f07206453f8126eafdcfd5181f036d208f6bad398f80443\": rpc error: code = NotFound desc = could not find container \"6e1fb6cee660ca4f9f07206453f8126eafdcfd5181f036d208f6bad398f80443\": container with ID starting with 6e1fb6cee660ca4f9f07206453f8126eafdcfd5181f036d208f6bad398f80443 not found: ID does not exist" Oct 03 13:18:23 crc kubenswrapper[4962]: I1003 13:18:23.939199 4962 scope.go:117] "RemoveContainer" containerID="2b8e04350c1cefe6576e912d9c6f81bec51c6d3c534fa12a7a9614e8519d54aa" Oct 03 13:18:23 crc kubenswrapper[4962]: E1003 13:18:23.939533 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b8e04350c1cefe6576e912d9c6f81bec51c6d3c534fa12a7a9614e8519d54aa\": container with ID starting with 2b8e04350c1cefe6576e912d9c6f81bec51c6d3c534fa12a7a9614e8519d54aa not found: ID does not exist" containerID="2b8e04350c1cefe6576e912d9c6f81bec51c6d3c534fa12a7a9614e8519d54aa" Oct 03 13:18:23 crc kubenswrapper[4962]: I1003 13:18:23.939557 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b8e04350c1cefe6576e912d9c6f81bec51c6d3c534fa12a7a9614e8519d54aa"} err="failed to get container status \"2b8e04350c1cefe6576e912d9c6f81bec51c6d3c534fa12a7a9614e8519d54aa\": rpc error: code = NotFound desc = could not find container 
\"2b8e04350c1cefe6576e912d9c6f81bec51c6d3c534fa12a7a9614e8519d54aa\": container with ID starting with 2b8e04350c1cefe6576e912d9c6f81bec51c6d3c534fa12a7a9614e8519d54aa not found: ID does not exist" Oct 03 13:18:23 crc kubenswrapper[4962]: I1003 13:18:23.939572 4962 scope.go:117] "RemoveContainer" containerID="6609a416e24f6d47e33a6dec5fdbb449ee317c80771829ee0a05ca6efd2eb0e2" Oct 03 13:18:23 crc kubenswrapper[4962]: E1003 13:18:23.940019 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6609a416e24f6d47e33a6dec5fdbb449ee317c80771829ee0a05ca6efd2eb0e2\": container with ID starting with 6609a416e24f6d47e33a6dec5fdbb449ee317c80771829ee0a05ca6efd2eb0e2 not found: ID does not exist" containerID="6609a416e24f6d47e33a6dec5fdbb449ee317c80771829ee0a05ca6efd2eb0e2" Oct 03 13:18:23 crc kubenswrapper[4962]: I1003 13:18:23.940048 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6609a416e24f6d47e33a6dec5fdbb449ee317c80771829ee0a05ca6efd2eb0e2"} err="failed to get container status \"6609a416e24f6d47e33a6dec5fdbb449ee317c80771829ee0a05ca6efd2eb0e2\": rpc error: code = NotFound desc = could not find container \"6609a416e24f6d47e33a6dec5fdbb449ee317c80771829ee0a05ca6efd2eb0e2\": container with ID starting with 6609a416e24f6d47e33a6dec5fdbb449ee317c80771829ee0a05ca6efd2eb0e2 not found: ID does not exist" Oct 03 13:18:24 crc kubenswrapper[4962]: I1003 13:18:24.032990 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2665ea2d-bc10-47be-b310-696c8e17c696-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 13:18:24 crc kubenswrapper[4962]: I1003 13:18:24.143833 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cx8j9"] Oct 03 13:18:24 crc kubenswrapper[4962]: I1003 13:18:24.149981 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cx8j9"] Oct 03 13:18:24 crc kubenswrapper[4962]: I1003 13:18:24.228615 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-m4g4s" Oct 03 13:18:24 crc kubenswrapper[4962]: I1003 13:18:24.237687 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2665ea2d-bc10-47be-b310-696c8e17c696" path="/var/lib/kubelet/pods/2665ea2d-bc10-47be-b310-696c8e17c696/volumes" Oct 03 13:18:24 crc kubenswrapper[4962]: I1003 13:18:24.343489 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45e53a8c-30db-4179-aedd-591b2cae9c7a-catalog-content\") pod \"45e53a8c-30db-4179-aedd-591b2cae9c7a\" (UID: \"45e53a8c-30db-4179-aedd-591b2cae9c7a\") " Oct 03 13:18:24 crc kubenswrapper[4962]: I1003 13:18:24.343674 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45e53a8c-30db-4179-aedd-591b2cae9c7a-utilities\") pod \"45e53a8c-30db-4179-aedd-591b2cae9c7a\" (UID: \"45e53a8c-30db-4179-aedd-591b2cae9c7a\") " Oct 03 13:18:24 crc kubenswrapper[4962]: I1003 13:18:24.343745 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkll8\" (UniqueName: \"kubernetes.io/projected/45e53a8c-30db-4179-aedd-591b2cae9c7a-kube-api-access-xkll8\") pod \"45e53a8c-30db-4179-aedd-591b2cae9c7a\" (UID: \"45e53a8c-30db-4179-aedd-591b2cae9c7a\") " Oct 03 13:18:24 crc kubenswrapper[4962]: I1003 13:18:24.344833 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45e53a8c-30db-4179-aedd-591b2cae9c7a-utilities" (OuterVolumeSpecName: "utilities") pod "45e53a8c-30db-4179-aedd-591b2cae9c7a" (UID: "45e53a8c-30db-4179-aedd-591b2cae9c7a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:18:24 crc kubenswrapper[4962]: I1003 13:18:24.346623 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45e53a8c-30db-4179-aedd-591b2cae9c7a-kube-api-access-xkll8" (OuterVolumeSpecName: "kube-api-access-xkll8") pod "45e53a8c-30db-4179-aedd-591b2cae9c7a" (UID: "45e53a8c-30db-4179-aedd-591b2cae9c7a"). InnerVolumeSpecName "kube-api-access-xkll8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:18:24 crc kubenswrapper[4962]: I1003 13:18:24.399781 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45e53a8c-30db-4179-aedd-591b2cae9c7a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "45e53a8c-30db-4179-aedd-591b2cae9c7a" (UID: "45e53a8c-30db-4179-aedd-591b2cae9c7a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:18:24 crc kubenswrapper[4962]: I1003 13:18:24.445015 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkll8\" (UniqueName: \"kubernetes.io/projected/45e53a8c-30db-4179-aedd-591b2cae9c7a-kube-api-access-xkll8\") on node \"crc\" DevicePath \"\"" Oct 03 13:18:24 crc kubenswrapper[4962]: I1003 13:18:24.445043 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45e53a8c-30db-4179-aedd-591b2cae9c7a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 13:18:24 crc kubenswrapper[4962]: I1003 13:18:24.445052 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45e53a8c-30db-4179-aedd-591b2cae9c7a-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 13:18:24 crc kubenswrapper[4962]: I1003 13:18:24.814848 4962 generic.go:334] "Generic (PLEG): container finished" podID="45e53a8c-30db-4179-aedd-591b2cae9c7a" containerID="d2477506a974b4aea6941b02fbc4b5e3bc42d88bf4289a1df7c05c21f7c44d67" exitCode=0 Oct 03 13:18:24 crc kubenswrapper[4962]: I1003 13:18:24.814923 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m4g4s" event={"ID":"45e53a8c-30db-4179-aedd-591b2cae9c7a","Type":"ContainerDied","Data":"d2477506a974b4aea6941b02fbc4b5e3bc42d88bf4289a1df7c05c21f7c44d67"} Oct 03 13:18:24 crc kubenswrapper[4962]: I1003 13:18:24.814933 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m4g4s" Oct 03 13:18:24 crc kubenswrapper[4962]: I1003 13:18:24.814955 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m4g4s" event={"ID":"45e53a8c-30db-4179-aedd-591b2cae9c7a","Type":"ContainerDied","Data":"fbe2f380ff36450a34ccb17d8b9ffcd944172ecde73e1cfa8db325b308b92d1e"} Oct 03 13:18:24 crc kubenswrapper[4962]: I1003 13:18:24.814974 4962 scope.go:117] "RemoveContainer" containerID="d2477506a974b4aea6941b02fbc4b5e3bc42d88bf4289a1df7c05c21f7c44d67" Oct 03 13:18:24 crc kubenswrapper[4962]: I1003 13:18:24.837988 4962 scope.go:117] "RemoveContainer" containerID="82953cdd623fd4e1c877fe3e48e3bee0983043679c1c8df8fe183b9537c4daa5" Oct 03 13:18:24 crc kubenswrapper[4962]: I1003 13:18:24.874729 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m4g4s"] Oct 03 13:18:24 crc kubenswrapper[4962]: I1003 13:18:24.876836 4962 scope.go:117] "RemoveContainer" containerID="2c5ac56801c3b1e4145db34055be6aef791619eab206124f83ec5acd3d2c7436" Oct 03 13:18:24 crc kubenswrapper[4962]: I1003 13:18:24.881203 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-m4g4s"] Oct 03 13:18:24 crc kubenswrapper[4962]: I1003 13:18:24.910297 4962 scope.go:117] "RemoveContainer" containerID="d2477506a974b4aea6941b02fbc4b5e3bc42d88bf4289a1df7c05c21f7c44d67" Oct 03 13:18:24 crc kubenswrapper[4962]: E1003 13:18:24.910790 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2477506a974b4aea6941b02fbc4b5e3bc42d88bf4289a1df7c05c21f7c44d67\": container with ID starting with d2477506a974b4aea6941b02fbc4b5e3bc42d88bf4289a1df7c05c21f7c44d67 not found: ID does not exist" containerID="d2477506a974b4aea6941b02fbc4b5e3bc42d88bf4289a1df7c05c21f7c44d67" Oct 03 13:18:24 crc kubenswrapper[4962]: I1003 13:18:24.910829 
4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2477506a974b4aea6941b02fbc4b5e3bc42d88bf4289a1df7c05c21f7c44d67"} err="failed to get container status \"d2477506a974b4aea6941b02fbc4b5e3bc42d88bf4289a1df7c05c21f7c44d67\": rpc error: code = NotFound desc = could not find container \"d2477506a974b4aea6941b02fbc4b5e3bc42d88bf4289a1df7c05c21f7c44d67\": container with ID starting with d2477506a974b4aea6941b02fbc4b5e3bc42d88bf4289a1df7c05c21f7c44d67 not found: ID does not exist" Oct 03 13:18:24 crc kubenswrapper[4962]: I1003 13:18:24.910850 4962 scope.go:117] "RemoveContainer" containerID="82953cdd623fd4e1c877fe3e48e3bee0983043679c1c8df8fe183b9537c4daa5" Oct 03 13:18:24 crc kubenswrapper[4962]: E1003 13:18:24.911247 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82953cdd623fd4e1c877fe3e48e3bee0983043679c1c8df8fe183b9537c4daa5\": container with ID starting with 82953cdd623fd4e1c877fe3e48e3bee0983043679c1c8df8fe183b9537c4daa5 not found: ID does not exist" containerID="82953cdd623fd4e1c877fe3e48e3bee0983043679c1c8df8fe183b9537c4daa5" Oct 03 13:18:24 crc kubenswrapper[4962]: I1003 13:18:24.911290 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82953cdd623fd4e1c877fe3e48e3bee0983043679c1c8df8fe183b9537c4daa5"} err="failed to get container status \"82953cdd623fd4e1c877fe3e48e3bee0983043679c1c8df8fe183b9537c4daa5\": rpc error: code = NotFound desc = could not find container \"82953cdd623fd4e1c877fe3e48e3bee0983043679c1c8df8fe183b9537c4daa5\": container with ID starting with 82953cdd623fd4e1c877fe3e48e3bee0983043679c1c8df8fe183b9537c4daa5 not found: ID does not exist" Oct 03 13:18:24 crc kubenswrapper[4962]: I1003 13:18:24.911319 4962 scope.go:117] "RemoveContainer" containerID="2c5ac56801c3b1e4145db34055be6aef791619eab206124f83ec5acd3d2c7436" Oct 03 13:18:24 crc kubenswrapper[4962]: E1003 13:18:24.911681 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c5ac56801c3b1e4145db34055be6aef791619eab206124f83ec5acd3d2c7436\": container with ID starting with 2c5ac56801c3b1e4145db34055be6aef791619eab206124f83ec5acd3d2c7436 not found: ID does not exist" containerID="2c5ac56801c3b1e4145db34055be6aef791619eab206124f83ec5acd3d2c7436" Oct 03 13:18:24 crc kubenswrapper[4962]: I1003 13:18:24.911720 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c5ac56801c3b1e4145db34055be6aef791619eab206124f83ec5acd3d2c7436"} err="failed to get container status \"2c5ac56801c3b1e4145db34055be6aef791619eab206124f83ec5acd3d2c7436\": rpc error: code = NotFound desc = could not find container \"2c5ac56801c3b1e4145db34055be6aef791619eab206124f83ec5acd3d2c7436\": container with ID starting with 2c5ac56801c3b1e4145db34055be6aef791619eab206124f83ec5acd3d2c7436 not found: ID does not exist" Oct 03 13:18:26 crc kubenswrapper[4962]: I1003 13:18:26.227284 4962 scope.go:117] "RemoveContainer" containerID="a5bec38aec5fb9d5911cd2ecc57792eefe10d43cef3b74d0517cf307d832bc5c" Oct 03 13:18:26 crc kubenswrapper[4962]: E1003 13:18:26.228054 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:18:26 crc kubenswrapper[4962]: I1003 13:18:26.238374 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45e53a8c-30db-4179-aedd-591b2cae9c7a" path="/var/lib/kubelet/pods/45e53a8c-30db-4179-aedd-591b2cae9c7a/volumes" Oct 03 13:18:38 crc kubenswrapper[4962]: I1003 13:18:38.227456 4962 scope.go:117] "RemoveContainer" containerID="a5bec38aec5fb9d5911cd2ecc57792eefe10d43cef3b74d0517cf307d832bc5c" Oct 03 13:18:38 crc kubenswrapper[4962]: E1003 13:18:38.228861 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:18:50 crc kubenswrapper[4962]: I1003 13:18:50.227889 4962 scope.go:117] "RemoveContainer" containerID="a5bec38aec5fb9d5911cd2ecc57792eefe10d43cef3b74d0517cf307d832bc5c" Oct 03 13:18:50 crc kubenswrapper[4962]: E1003 13:18:50.228929 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:18:56 crc kubenswrapper[4962]: I1003 13:18:56.911534 4962 scope.go:117] "RemoveContainer" containerID="0eaaa81aa9272a755f71e6e8345983499f134718f443247c4ab7da637b161dee" Oct 03 13:18:56 crc kubenswrapper[4962]: I1003 13:18:56.934699 4962 scope.go:117] "RemoveContainer" containerID="858a9acb02475f1fa475e73f4f54f3c31ea0b2dd734ab2184ad2c2ef50d094b1" Oct 03 13:18:56 crc kubenswrapper[4962]: I1003 13:18:56.960720 4962 scope.go:117] "RemoveContainer" containerID="19508d7ab80c2b7b7b1fa204b25ec94f087e00b2aef79337e48cc4e04794ce38" Oct 03 13:18:56 crc kubenswrapper[4962]: I1003 13:18:56.992856 4962 scope.go:117] "RemoveContainer" containerID="747670b96bfa0a7f1ae526363ef27c6298cb058f6c1b63ac1f4c8ed4ccec39e3" Oct 03 13:18:57 crc kubenswrapper[4962]: I1003 13:18:57.009073 4962 scope.go:117] "RemoveContainer" containerID="77e7361ae6d4f822868e68592560c81c211040a7da222b118b3504c71f5b93c1" Oct 03 13:18:57 crc kubenswrapper[4962]: I1003 13:18:57.032985 4962 scope.go:117] "RemoveContainer" containerID="4c564ec0aa0ac250e494fc64a11e1161b3ca57f40622bafc69fe02e54a49d547" Oct 03 13:18:57 crc kubenswrapper[4962]: I1003 13:18:57.055873 4962 scope.go:117] "RemoveContainer" containerID="5fc9ed21237586b0dba4b83e5d63c7c19a71fd2d4afd7ab0b4d8c03e1f54bb41" Oct 03 13:18:57 crc kubenswrapper[4962]: I1003 13:18:57.078688 4962 scope.go:117] "RemoveContainer" containerID="153e678f1fc5f369dcef052e9b01146d16065e31e4717a4ecff24fb735e27ee3" Oct 03 13:18:57 crc kubenswrapper[4962]: I1003 13:18:57.102975 4962 scope.go:117] "RemoveContainer" containerID="42eff25118b1d9333ccb5e1e1095736fe7ad68812468237167cf44bfffda55aa" Oct 03 13:18:57 crc kubenswrapper[4962]: I1003 13:18:57.120819 4962 scope.go:117] "RemoveContainer" 
containerID="91f467d8010c6b11f362e5c8799a00178a2a8b03f7e9bf6ffe2a6502f8bb3a5c" Oct 03 13:18:57 crc kubenswrapper[4962]: I1003 13:18:57.136590 4962 scope.go:117] "RemoveContainer" containerID="184bb0414d89e05cc91628a858197e7943eee905467d226d4d1a357a8b8a1746" Oct 03 13:18:57 crc kubenswrapper[4962]: I1003 13:18:57.152167 4962 scope.go:117] "RemoveContainer" containerID="dd4e722d7156a1a3702c8352cc07fe685ce94d9af20fd5e713aa19eeae622754" Oct 03 13:18:57 crc kubenswrapper[4962]: I1003 13:18:57.172868 4962 scope.go:117] "RemoveContainer" containerID="9128b3882b18799c52b8901cee877983abb64473aa57b7378c6cd26e45351340" Oct 03 13:18:57 crc kubenswrapper[4962]: I1003 13:18:57.188218 4962 scope.go:117] "RemoveContainer" containerID="8642f3d9a3347e1307aee57c47fa850da1480a1bb21bb726d495551c0297fb08" Oct 03 13:18:57 crc kubenswrapper[4962]: I1003 13:18:57.209687 4962 scope.go:117] "RemoveContainer" containerID="80923de5c19f7d65b29bc15584394759409c24eacccb07a53c15f115d22dfe78" Oct 03 13:18:57 crc kubenswrapper[4962]: I1003 13:18:57.224185 4962 scope.go:117] "RemoveContainer" containerID="a74470166961669f8dea3b77b512b7b3a308a86536fe0161bd350804b30907c1" Oct 03 13:18:57 crc kubenswrapper[4962]: I1003 13:18:57.237498 4962 scope.go:117] "RemoveContainer" containerID="dfa93acbbb91865d0cb4ff792e49c1fab6703cc34cf33248fb44323393847688" Oct 03 13:18:57 crc kubenswrapper[4962]: I1003 13:18:57.263578 4962 scope.go:117] "RemoveContainer" containerID="a12b255b4149f607b5d977552bdbebdcd66a9c203b6d443741a1d7ceceff6ad5" Oct 03 13:18:57 crc kubenswrapper[4962]: I1003 13:18:57.277314 4962 scope.go:117] "RemoveContainer" containerID="a7cacdfd728f00a14188df7d8385721ed6372f29aee704ba66954115ef4f7644" Oct 03 13:19:04 crc kubenswrapper[4962]: I1003 13:19:04.227902 4962 scope.go:117] "RemoveContainer" containerID="a5bec38aec5fb9d5911cd2ecc57792eefe10d43cef3b74d0517cf307d832bc5c" Oct 03 13:19:04 crc kubenswrapper[4962]: E1003 13:19:04.228617 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:19:16 crc kubenswrapper[4962]: I1003 13:19:16.227481 4962 scope.go:117] "RemoveContainer" containerID="a5bec38aec5fb9d5911cd2ecc57792eefe10d43cef3b74d0517cf307d832bc5c" Oct 03 13:19:16 crc kubenswrapper[4962]: E1003 13:19:16.228424 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:19:28 crc kubenswrapper[4962]: I1003 13:19:28.227189 4962 scope.go:117] "RemoveContainer" containerID="a5bec38aec5fb9d5911cd2ecc57792eefe10d43cef3b74d0517cf307d832bc5c" Oct 03 13:19:28 crc kubenswrapper[4962]: E1003 13:19:28.228937 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:19:41 crc kubenswrapper[4962]: I1003 13:19:41.227955 4962 scope.go:117] "RemoveContainer" containerID="a5bec38aec5fb9d5911cd2ecc57792eefe10d43cef3b74d0517cf307d832bc5c" Oct 03 13:19:41 crc kubenswrapper[4962]: E1003 13:19:41.229335 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:19:54 crc kubenswrapper[4962]: I1003 13:19:54.227278 4962 scope.go:117] "RemoveContainer" containerID="a5bec38aec5fb9d5911cd2ecc57792eefe10d43cef3b74d0517cf307d832bc5c" Oct 03 13:19:54 crc kubenswrapper[4962]: E1003 13:19:54.228366 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:19:57 crc kubenswrapper[4962]: I1003 13:19:57.565350 4962 scope.go:117] "RemoveContainer" containerID="5338fa6e89e4bd15f9e4db940f00a7a5b8fdf7c12ab81b1de2fe6747f81ea20d" Oct 03 13:19:57 crc kubenswrapper[4962]: I1003 13:19:57.592921 4962 scope.go:117] "RemoveContainer" containerID="46e6da82e57b34e2bc7a9d03f73558235f6511a392ce8f5cec53b37074e329af" Oct 03 13:19:57 crc kubenswrapper[4962]: I1003 13:19:57.633005 4962 scope.go:117] "RemoveContainer" containerID="31ec554c86926fa60a6d1b72601dfc4ca004a83f21bd45f79846d05688388ebf" Oct 03 13:19:57 crc kubenswrapper[4962]: I1003 13:19:57.651940 4962 scope.go:117] "RemoveContainer" containerID="0ef7a555af8ba3763ace041261009ed654126cd2f78762f2ab1c9bc4bf7072db" Oct 03 13:19:57 crc kubenswrapper[4962]: I1003 13:19:57.692281 4962 scope.go:117] "RemoveContainer" containerID="0f13089470a687f3441879430ff97fb0147186cfadd8f1e2e6b1b3fdfab1347e" Oct 03 13:19:57 crc kubenswrapper[4962]: I1003 13:19:57.708441 4962 scope.go:117] "RemoveContainer" containerID="1ffd9ed0756445b7f118da7e647ddacf67d26cbedb3b89a2b3074c6bedfe80b2" Oct 03 13:19:57 crc kubenswrapper[4962]: I1003 13:19:57.731057 4962 scope.go:117] "RemoveContainer" containerID="b92f6a632cc9c0dff9f450965eee31724d28f079d91c0b8852b080e2ed919e29" Oct 03 13:19:57 crc kubenswrapper[4962]: I1003 13:19:57.747926 4962 scope.go:117] "RemoveContainer" containerID="036194e68dbb515945afa3ad089ad8f4474610c770e29c2e3ac03647eae66d7d" Oct 03 13:19:57 crc kubenswrapper[4962]: I1003 13:19:57.761507 4962 scope.go:117] "RemoveContainer" containerID="2fe1fb6596fd4bab23e8f5b8fffa9f204b8288c3c24b70ea1583257b39048287" Oct 03 13:19:57 crc kubenswrapper[4962]: I1003 13:19:57.773819 4962 scope.go:117] "RemoveContainer" containerID="cd77f692ad16b6df80e87bb64d27bf54a83e55dc2e95cca7ac8bfb5c3f2c63a2" Oct 03 13:19:57 crc kubenswrapper[4962]: I1003 13:19:57.787599 4962 scope.go:117] "RemoveContainer" containerID="fde2842fbb361eeef96c1f37f5ca7accd441dd0d4530fb4fc8d6c4be4392db6f" Oct 03 13:19:57 crc 
kubenswrapper[4962]: I1003 13:19:57.810393 4962 scope.go:117] "RemoveContainer" containerID="86b2ecabe6ed78973d278885e95f62a78704e8d2b70f094254e52932b6e6c618" Oct 03 13:19:57 crc kubenswrapper[4962]: I1003 13:19:57.828000 4962 scope.go:117] "RemoveContainer" containerID="efdd11f8bd8386aa2fc051d59f9344ed094988bb97638532765b2b52ec56a7ba" Oct 03 13:19:57 crc kubenswrapper[4962]: I1003 13:19:57.843172 4962 scope.go:117] "RemoveContainer" containerID="cb23b1576afecd7bb943264b193f4118ca9e14e0fda87415eabb3843eaf944f9" Oct 03 13:19:57 crc kubenswrapper[4962]: I1003 13:19:57.858320 4962 scope.go:117] "RemoveContainer" containerID="bfa078c92cf50627cb21a7528cdd93759094f41cfe07bc834b04b7d668a8b374" Oct 03 13:19:57 crc kubenswrapper[4962]: I1003 13:19:57.891724 4962 scope.go:117] "RemoveContainer" containerID="d17ebef49d6db3405f46c76620af3c4ad52867905601777304861ab2ba50d3a2" Oct 03 13:19:57 crc kubenswrapper[4962]: I1003 13:19:57.913509 4962 scope.go:117] "RemoveContainer" containerID="e9b4cb84ef4c21a8595bf182936df461ef5cc7e4bb630f5cdc4490a12d404462" Oct 03 13:20:07 crc kubenswrapper[4962]: I1003 13:20:07.226851 4962 scope.go:117] "RemoveContainer" containerID="a5bec38aec5fb9d5911cd2ecc57792eefe10d43cef3b74d0517cf307d832bc5c" Oct 03 13:20:07 crc kubenswrapper[4962]: I1003 13:20:07.782527 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerStarted","Data":"202ec3a5bbee3d78171eec8c7e876813cc91dbe7975b20c5f5f7d2c2f4ad92e2"} Oct 03 13:20:58 crc kubenswrapper[4962]: I1003 13:20:58.066622 4962 scope.go:117] "RemoveContainer" containerID="0135cfd571ce2874c84e154f88d2566b922232ee0a4046bc559e2dcc8e3cb67b" Oct 03 13:20:58 crc kubenswrapper[4962]: I1003 13:20:58.112972 4962 scope.go:117] "RemoveContainer" containerID="7377486e2b45b77a0115044df6ada84243926afe00f2d9d4d7481b7f35b3b3cd" Oct 03 13:20:58 crc kubenswrapper[4962]: I1003 13:20:58.141086 4962 scope.go:117] "RemoveContainer" containerID="65b272d77ba32c51dd9d80e7600ed36904dbcfecda99b28940a7c118dbf10fee" Oct 03 13:20:58 crc kubenswrapper[4962]: I1003 13:20:58.175962 4962 scope.go:117] "RemoveContainer" containerID="ac73c42dc924c54bc8c349e88f72de0cd595955349b0de465efb0b5629a1c596" Oct 03 13:20:58 crc kubenswrapper[4962]: I1003 13:20:58.193967 4962 scope.go:117] "RemoveContainer" containerID="bba8dbd73d396cb05609d66b70ee7132d3c6fc727cd79d137e8dcac91a81f9d5" Oct 03 13:20:58 crc kubenswrapper[4962]: I1003 13:20:58.211726 4962 scope.go:117] "RemoveContainer" containerID="580d989f2c0e944c9c1d2f5e5caf3bb4c1a2c9b700143ae9cbb8802fec699c5c" Oct 03 13:20:58 crc kubenswrapper[4962]: I1003 13:20:58.248618 4962 scope.go:117] "RemoveContainer" containerID="c5177f0f305f7d8efd50064ca1ae9320ecca80819662d7f114a735ed39509584" Oct 03 13:20:58 crc kubenswrapper[4962]: I1003 13:20:58.272508 4962 scope.go:117] "RemoveContainer" containerID="26ea37844a3fdbef6bd439665981583a3e5b323f9b6bca65d529332f1bdd10fe" Oct 03 13:20:58 crc kubenswrapper[4962]: I1003 13:20:58.289704 4962 scope.go:117] "RemoveContainer" containerID="b27da2f01290ecf61072efad87218e000a6819ad0aac516d4d56189f22787d6c" Oct 03 13:20:58 crc kubenswrapper[4962]: I1003 13:20:58.327694 4962 scope.go:117] "RemoveContainer" containerID="289ae55c3810fc3fdbc0b8d2aaf50cec4df133730555ad763e84eaf124304306" Oct 03 13:20:58 crc kubenswrapper[4962]: I1003 13:20:58.367692 4962 scope.go:117] "RemoveContainer" containerID="c9c8ed01d13ca0f8ff902a9439408e51baa90fc268f710fa26c3c09fc4aeec3c" Oct 03 
13:21:58 crc kubenswrapper[4962]: I1003 13:21:58.487412 4962 scope.go:117] "RemoveContainer" containerID="3e39d9d9f9ef98752b8331bca21aeea43f0d1658741d17c5c7ad0d95aa684075" Oct 03 13:21:58 crc kubenswrapper[4962]: I1003 13:21:58.508262 4962 scope.go:117] "RemoveContainer" containerID="05b9f6f45511688aab50c27ba2558d43869979d7395ad2e599ba9c06301661e8" Oct 03 13:21:58 crc kubenswrapper[4962]: I1003 13:21:58.524179 4962 scope.go:117] "RemoveContainer" containerID="d52557ff30e196f1185add245797713d5c9f8bef9d3167e6d50a37017d0126f3" Oct 03 13:21:58 crc kubenswrapper[4962]: I1003 13:21:58.540498 4962 scope.go:117] "RemoveContainer" containerID="0ce8b4a4ac9d0b8cc21e8feaa51be48f8d9e21fc13fb4c98b22efe982cb9565b" Oct 03 13:21:58 crc kubenswrapper[4962]: I1003 13:21:58.559329 4962 scope.go:117] "RemoveContainer" containerID="78e474fd6d91ce286900fc2f5b10cfa7e118e740653453d45c91f5fde6337be2" Oct 03 13:21:58 crc kubenswrapper[4962]: I1003 13:21:58.599519 4962 scope.go:117] "RemoveContainer" containerID="ae65644f1732d90a709b491cf3a8a7c7ca3acd9268609750aa5824cea960f345" Oct 03 13:21:58 crc kubenswrapper[4962]: I1003 13:21:58.616950 4962 scope.go:117] "RemoveContainer" containerID="46f6912f8f25e900b01e11da327d2482ee9d18c20a5ec8e5af15a90619d45612" Oct 03 13:21:58 crc kubenswrapper[4962]: I1003 13:21:58.631713 4962 scope.go:117] "RemoveContainer" containerID="024390e769ef5aec351b0a25e277cb3becb27c08b4c0917b93bfe0e1e2975164" Oct 03 13:22:24 crc kubenswrapper[4962]: I1003 13:22:24.660512 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 13:22:24 crc kubenswrapper[4962]: I1003 13:22:24.661209 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 13:22:54 crc kubenswrapper[4962]: I1003 13:22:54.660447 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 13:22:54 crc kubenswrapper[4962]: I1003 13:22:54.660970 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 13:22:58 crc kubenswrapper[4962]: I1003 13:22:58.725816 4962 scope.go:117] "RemoveContainer" containerID="c5d4d124b101cf04b221f4e6263424d582db30137ac6b55d395398aca8b14c29" Oct 03 13:22:58 crc kubenswrapper[4962]: I1003 13:22:58.749159 4962 scope.go:117] "RemoveContainer" containerID="31aad6feef087c421df422ff28c19e306f421979c28e952513ba254522aac910" Oct 03 13:22:58 crc kubenswrapper[4962]: I1003 13:22:58.809068 4962 scope.go:117] "RemoveContainer" containerID="3c734b51211f1947d4f13bde1f8348ca138c6ca1d3cd8d9c8095f5f48ccbdbd0" Oct 03 13:22:58 crc kubenswrapper[4962]: I1003 13:22:58.851085 4962 scope.go:117] 
"RemoveContainer" containerID="2240cfada8ca479d27ece7a6b38dc1fd2113be8f62a10285d1e6b5342298ed72" Oct 03 13:22:58 crc kubenswrapper[4962]: I1003 13:22:58.873137 4962 scope.go:117] "RemoveContainer" containerID="cbf328789db561021439da2a08d5d4001980c89f6b0a094a5fc6f703782ccc9f" Oct 03 13:23:24 crc kubenswrapper[4962]: I1003 13:23:24.659966 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 13:23:24 crc kubenswrapper[4962]: I1003 13:23:24.661298 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 13:23:24 crc kubenswrapper[4962]: I1003 13:23:24.661391 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-46vck" Oct 03 13:23:24 crc kubenswrapper[4962]: I1003 13:23:24.662787 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"202ec3a5bbee3d78171eec8c7e876813cc91dbe7975b20c5f5f7d2c2f4ad92e2"} pod="openshift-machine-config-operator/machine-config-daemon-46vck" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 13:23:24 crc kubenswrapper[4962]: I1003 13:23:24.662895 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" containerID="cri-o://202ec3a5bbee3d78171eec8c7e876813cc91dbe7975b20c5f5f7d2c2f4ad92e2" gracePeriod=600 Oct 03 13:23:25 crc kubenswrapper[4962]: I1003 13:23:25.346553 4962 generic.go:334] "Generic (PLEG): container finished" podID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerID="202ec3a5bbee3d78171eec8c7e876813cc91dbe7975b20c5f5f7d2c2f4ad92e2" exitCode=0 Oct 03 13:23:25 crc kubenswrapper[4962]: I1003 13:23:25.346617 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerDied","Data":"202ec3a5bbee3d78171eec8c7e876813cc91dbe7975b20c5f5f7d2c2f4ad92e2"} Oct 03 13:23:25 crc kubenswrapper[4962]: I1003 13:23:25.347048 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerStarted","Data":"435ccd24f7e61eca9f3a50b20a45f04e1ed70ff0737ebdb93094248169075ae3"} Oct 03 13:23:25 crc kubenswrapper[4962]: I1003 13:23:25.347094 4962 scope.go:117] "RemoveContainer" containerID="a5bec38aec5fb9d5911cd2ecc57792eefe10d43cef3b74d0517cf307d832bc5c" Oct 03 13:23:54 crc kubenswrapper[4962]: I1003 13:23:54.792267 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-r2dkc"] Oct 03 13:23:54 crc kubenswrapper[4962]: E1003 13:23:54.793081 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45e53a8c-30db-4179-aedd-591b2cae9c7a" containerName="registry-server" Oct 03 13:23:54 
crc kubenswrapper[4962]: I1003 13:23:54.793093 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="45e53a8c-30db-4179-aedd-591b2cae9c7a" containerName="registry-server" Oct 03 13:23:54 crc kubenswrapper[4962]: E1003 13:23:54.793104 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45e53a8c-30db-4179-aedd-591b2cae9c7a" containerName="extract-utilities" Oct 03 13:23:54 crc kubenswrapper[4962]: I1003 13:23:54.793111 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="45e53a8c-30db-4179-aedd-591b2cae9c7a" containerName="extract-utilities" Oct 03 13:23:54 crc kubenswrapper[4962]: E1003 13:23:54.793127 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2665ea2d-bc10-47be-b310-696c8e17c696" containerName="registry-server" Oct 03 13:23:54 crc kubenswrapper[4962]: I1003 13:23:54.793135 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2665ea2d-bc10-47be-b310-696c8e17c696" containerName="registry-server" Oct 03 13:23:54 crc kubenswrapper[4962]: E1003 13:23:54.793145 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2665ea2d-bc10-47be-b310-696c8e17c696" containerName="extract-content" Oct 03 13:23:54 crc kubenswrapper[4962]: I1003 13:23:54.793150 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2665ea2d-bc10-47be-b310-696c8e17c696" containerName="extract-content" Oct 03 13:23:54 crc kubenswrapper[4962]: E1003 13:23:54.793163 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45e53a8c-30db-4179-aedd-591b2cae9c7a" containerName="extract-content" Oct 03 13:23:54 crc kubenswrapper[4962]: I1003 13:23:54.793168 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="45e53a8c-30db-4179-aedd-591b2cae9c7a" containerName="extract-content" Oct 03 13:23:54 crc kubenswrapper[4962]: E1003 13:23:54.793183 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2665ea2d-bc10-47be-b310-696c8e17c696" containerName="extract-utilities" Oct 03 13:23:54 crc kubenswrapper[4962]: I1003 13:23:54.793188 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2665ea2d-bc10-47be-b310-696c8e17c696" containerName="extract-utilities" Oct 03 13:23:54 crc kubenswrapper[4962]: I1003 13:23:54.793318 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2665ea2d-bc10-47be-b310-696c8e17c696" containerName="registry-server" Oct 03 13:23:54 crc kubenswrapper[4962]: I1003 13:23:54.793330 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="45e53a8c-30db-4179-aedd-591b2cae9c7a" containerName="registry-server" Oct 03 13:23:54 crc kubenswrapper[4962]: I1003 13:23:54.794233 4962 util.go:30] "No sandbox for pod can be found. 
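
The machine-config-daemon entries above repeat "back-off 5m0s restarting failed container" every 10 to 13 seconds: the sync loop keeps re-evaluating the pod while the restart backoff runs down, and the container only starts again at 13:20:07 once the window expires. Crash-loop backoff doubles per failed restart up to the 5-minute cap quoted in the message; the cap comes from the log, while the 10-second initial delay below is an assumption. A sketch of the schedule:

package main

import (
	"fmt"
	"time"
)

func main() {
	backoff := 10 * time.Second // assumed initial delay
	const maxBackoff = 5 * time.Minute // matches "back-off 5m0s" above
	for restart := 1; restart <= 8; restart++ {
		fmt.Printf("restart %d: wait %v\n", restart, backoff)
		backoff *= 2
		if backoff > maxBackoff {
			backoff = maxBackoff
		}
	}
	// Prints 10s, 20s, 40s, 1m20s, 2m40s, then 5m0s repeatedly.
}
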
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r2dkc" Oct 03 13:23:54 crc kubenswrapper[4962]: I1003 13:23:54.807009 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r2dkc"] Oct 03 13:23:54 crc kubenswrapper[4962]: I1003 13:23:54.991064 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zp22\" (UniqueName: \"kubernetes.io/projected/37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b-kube-api-access-9zp22\") pod \"redhat-marketplace-r2dkc\" (UID: \"37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b\") " pod="openshift-marketplace/redhat-marketplace-r2dkc" Oct 03 13:23:54 crc kubenswrapper[4962]: I1003 13:23:54.991127 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b-catalog-content\") pod \"redhat-marketplace-r2dkc\" (UID: \"37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b\") " pod="openshift-marketplace/redhat-marketplace-r2dkc" Oct 03 13:23:54 crc kubenswrapper[4962]: I1003 13:23:54.991189 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b-utilities\") pod \"redhat-marketplace-r2dkc\" (UID: \"37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b\") " pod="openshift-marketplace/redhat-marketplace-r2dkc" Oct 03 13:23:55 crc kubenswrapper[4962]: I1003 13:23:55.092817 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b-utilities\") pod \"redhat-marketplace-r2dkc\" (UID: \"37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b\") " pod="openshift-marketplace/redhat-marketplace-r2dkc" Oct 03 13:23:55 crc kubenswrapper[4962]: I1003 13:23:55.092912 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zp22\" (UniqueName: \"kubernetes.io/projected/37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b-kube-api-access-9zp22\") pod \"redhat-marketplace-r2dkc\" (UID: \"37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b\") " pod="openshift-marketplace/redhat-marketplace-r2dkc" Oct 03 13:23:55 crc kubenswrapper[4962]: I1003 13:23:55.092939 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b-catalog-content\") pod \"redhat-marketplace-r2dkc\" (UID: \"37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b\") " pod="openshift-marketplace/redhat-marketplace-r2dkc" Oct 03 13:23:55 crc kubenswrapper[4962]: I1003 13:23:55.093454 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b-utilities\") pod \"redhat-marketplace-r2dkc\" (UID: \"37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b\") " pod="openshift-marketplace/redhat-marketplace-r2dkc" Oct 03 13:23:55 crc kubenswrapper[4962]: I1003 13:23:55.093481 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b-catalog-content\") pod \"redhat-marketplace-r2dkc\" (UID: \"37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b\") " pod="openshift-marketplace/redhat-marketplace-r2dkc" Oct 03 13:23:55 crc kubenswrapper[4962]: I1003 13:23:55.120136 4962 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-9zp22\" (UniqueName: \"kubernetes.io/projected/37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b-kube-api-access-9zp22\") pod \"redhat-marketplace-r2dkc\" (UID: \"37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b\") " pod="openshift-marketplace/redhat-marketplace-r2dkc" Oct 03 13:23:55 crc kubenswrapper[4962]: I1003 13:23:55.169651 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r2dkc" Oct 03 13:23:55 crc kubenswrapper[4962]: I1003 13:23:55.391773 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r2dkc"] Oct 03 13:23:55 crc kubenswrapper[4962]: I1003 13:23:55.614539 4962 generic.go:334] "Generic (PLEG): container finished" podID="37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b" containerID="d903e7d242246e68f9d9b26b07a83bebf19d24ab00bab49a311027ea7176d7fe" exitCode=0 Oct 03 13:23:55 crc kubenswrapper[4962]: I1003 13:23:55.614590 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r2dkc" event={"ID":"37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b","Type":"ContainerDied","Data":"d903e7d242246e68f9d9b26b07a83bebf19d24ab00bab49a311027ea7176d7fe"} Oct 03 13:23:55 crc kubenswrapper[4962]: I1003 13:23:55.614618 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r2dkc" event={"ID":"37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b","Type":"ContainerStarted","Data":"fd8e5e2842557fa3d081c35a19588f0168b547cffeddf592b97333058b05e658"} Oct 03 13:23:56 crc kubenswrapper[4962]: I1003 13:23:56.622739 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 13:23:57 crc kubenswrapper[4962]: I1003 13:23:57.633049 4962 generic.go:334] "Generic (PLEG): container finished" podID="37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b" containerID="abb752e7d3c4e746c8add5cd65c3db3fb08c7a48ad8ef39851bbec02fbd9fb35" exitCode=0 Oct 03 13:23:57 crc kubenswrapper[4962]: I1003 13:23:57.633137 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r2dkc" event={"ID":"37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b","Type":"ContainerDied","Data":"abb752e7d3c4e746c8add5cd65c3db3fb08c7a48ad8ef39851bbec02fbd9fb35"} Oct 03 13:23:58 crc kubenswrapper[4962]: I1003 13:23:58.642353 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r2dkc" event={"ID":"37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b","Type":"ContainerStarted","Data":"3bfdd0c179c43f178764b0359208d21b150877e6098b3db5d7d0c69403f369e4"} Oct 03 13:23:58 crc kubenswrapper[4962]: I1003 13:23:58.671731 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-r2dkc" podStartSLOduration=3.2022778929999998 podStartE2EDuration="4.67171439s" podCreationTimestamp="2025-10-03 13:23:54 +0000 UTC" firstStartedPulling="2025-10-03 13:23:56.622494475 +0000 UTC m=+2045.026392310" lastFinishedPulling="2025-10-03 13:23:58.091930972 +0000 UTC m=+2046.495828807" observedRunningTime="2025-10-03 13:23:58.670836557 +0000 UTC m=+2047.074734382" watchObservedRunningTime="2025-10-03 13:23:58.67171439 +0000 UTC m=+2047.075612225" Oct 03 13:24:05 crc kubenswrapper[4962]: I1003 13:24:05.170689 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-r2dkc" Oct 03 13:24:05 crc kubenswrapper[4962]: I1003 13:24:05.171425 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-r2dkc" Oct 03 13:24:05 crc kubenswrapper[4962]: I1003 13:24:05.230501 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-r2dkc" Oct 03 13:24:05 crc kubenswrapper[4962]: I1003 13:24:05.780491 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-r2dkc" Oct 03 13:24:05 crc kubenswrapper[4962]: I1003 13:24:05.835078 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r2dkc"] Oct 03 13:24:07 crc kubenswrapper[4962]: I1003 13:24:07.729556 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-r2dkc" podUID="37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b" containerName="registry-server" containerID="cri-o://3bfdd0c179c43f178764b0359208d21b150877e6098b3db5d7d0c69403f369e4" gracePeriod=2 Oct 03 13:24:08 crc kubenswrapper[4962]: I1003 13:24:08.160222 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r2dkc" Oct 03 13:24:08 crc kubenswrapper[4962]: I1003 13:24:08.317497 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zp22\" (UniqueName: \"kubernetes.io/projected/37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b-kube-api-access-9zp22\") pod \"37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b\" (UID: \"37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b\") " Oct 03 13:24:08 crc kubenswrapper[4962]: I1003 13:24:08.317699 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b-catalog-content\") pod \"37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b\" (UID: \"37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b\") " Oct 03 13:24:08 crc kubenswrapper[4962]: I1003 13:24:08.317761 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b-utilities\") pod \"37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b\" (UID: \"37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b\") " Oct 03 13:24:08 crc kubenswrapper[4962]: I1003 13:24:08.320009 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b-utilities" (OuterVolumeSpecName: "utilities") pod "37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b" (UID: "37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:24:08 crc kubenswrapper[4962]: I1003 13:24:08.324834 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b-kube-api-access-9zp22" (OuterVolumeSpecName: "kube-api-access-9zp22") pod "37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b" (UID: "37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b"). InnerVolumeSpecName "kube-api-access-9zp22". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:24:08 crc kubenswrapper[4962]: I1003 13:24:08.333952 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b" (UID: "37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:24:08 crc kubenswrapper[4962]: I1003 13:24:08.420792 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zp22\" (UniqueName: \"kubernetes.io/projected/37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b-kube-api-access-9zp22\") on node \"crc\" DevicePath \"\"" Oct 03 13:24:08 crc kubenswrapper[4962]: I1003 13:24:08.420833 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 13:24:08 crc kubenswrapper[4962]: I1003 13:24:08.420843 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 13:24:08 crc kubenswrapper[4962]: I1003 13:24:08.740574 4962 generic.go:334] "Generic (PLEG): container finished" podID="37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b" containerID="3bfdd0c179c43f178764b0359208d21b150877e6098b3db5d7d0c69403f369e4" exitCode=0 Oct 03 13:24:08 crc kubenswrapper[4962]: I1003 13:24:08.740621 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r2dkc" event={"ID":"37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b","Type":"ContainerDied","Data":"3bfdd0c179c43f178764b0359208d21b150877e6098b3db5d7d0c69403f369e4"} Oct 03 13:24:08 crc kubenswrapper[4962]: I1003 13:24:08.740654 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r2dkc" Oct 03 13:24:08 crc kubenswrapper[4962]: I1003 13:24:08.740677 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r2dkc" event={"ID":"37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b","Type":"ContainerDied","Data":"fd8e5e2842557fa3d081c35a19588f0168b547cffeddf592b97333058b05e658"} Oct 03 13:24:08 crc kubenswrapper[4962]: I1003 13:24:08.740702 4962 scope.go:117] "RemoveContainer" containerID="3bfdd0c179c43f178764b0359208d21b150877e6098b3db5d7d0c69403f369e4" Oct 03 13:24:08 crc kubenswrapper[4962]: I1003 13:24:08.763571 4962 scope.go:117] "RemoveContainer" containerID="abb752e7d3c4e746c8add5cd65c3db3fb08c7a48ad8ef39851bbec02fbd9fb35" Oct 03 13:24:08 crc kubenswrapper[4962]: I1003 13:24:08.778392 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r2dkc"] Oct 03 13:24:08 crc kubenswrapper[4962]: I1003 13:24:08.786456 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-r2dkc"] Oct 03 13:24:08 crc kubenswrapper[4962]: I1003 13:24:08.802720 4962 scope.go:117] "RemoveContainer" containerID="d903e7d242246e68f9d9b26b07a83bebf19d24ab00bab49a311027ea7176d7fe" Oct 03 13:24:08 crc kubenswrapper[4962]: I1003 13:24:08.816682 4962 scope.go:117] "RemoveContainer" containerID="3bfdd0c179c43f178764b0359208d21b150877e6098b3db5d7d0c69403f369e4" Oct 03 13:24:08 crc kubenswrapper[4962]: E1003 13:24:08.817115 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bfdd0c179c43f178764b0359208d21b150877e6098b3db5d7d0c69403f369e4\": container with ID starting with 3bfdd0c179c43f178764b0359208d21b150877e6098b3db5d7d0c69403f369e4 not found: ID does not exist" containerID="3bfdd0c179c43f178764b0359208d21b150877e6098b3db5d7d0c69403f369e4" Oct 03 13:24:08 crc kubenswrapper[4962]: I1003 13:24:08.817166 4962 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bfdd0c179c43f178764b0359208d21b150877e6098b3db5d7d0c69403f369e4"} err="failed to get container status \"3bfdd0c179c43f178764b0359208d21b150877e6098b3db5d7d0c69403f369e4\": rpc error: code = NotFound desc = could not find container \"3bfdd0c179c43f178764b0359208d21b150877e6098b3db5d7d0c69403f369e4\": container with ID starting with 3bfdd0c179c43f178764b0359208d21b150877e6098b3db5d7d0c69403f369e4 not found: ID does not exist"
Oct 03 13:24:08 crc kubenswrapper[4962]: I1003 13:24:08.817200 4962 scope.go:117] "RemoveContainer" containerID="abb752e7d3c4e746c8add5cd65c3db3fb08c7a48ad8ef39851bbec02fbd9fb35"
Oct 03 13:24:08 crc kubenswrapper[4962]: E1003 13:24:08.817433 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abb752e7d3c4e746c8add5cd65c3db3fb08c7a48ad8ef39851bbec02fbd9fb35\": container with ID starting with abb752e7d3c4e746c8add5cd65c3db3fb08c7a48ad8ef39851bbec02fbd9fb35 not found: ID does not exist" containerID="abb752e7d3c4e746c8add5cd65c3db3fb08c7a48ad8ef39851bbec02fbd9fb35"
Oct 03 13:24:08 crc kubenswrapper[4962]: I1003 13:24:08.817471 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abb752e7d3c4e746c8add5cd65c3db3fb08c7a48ad8ef39851bbec02fbd9fb35"} err="failed to get container status \"abb752e7d3c4e746c8add5cd65c3db3fb08c7a48ad8ef39851bbec02fbd9fb35\": rpc error: code = NotFound desc = could not find container \"abb752e7d3c4e746c8add5cd65c3db3fb08c7a48ad8ef39851bbec02fbd9fb35\": container with ID starting with abb752e7d3c4e746c8add5cd65c3db3fb08c7a48ad8ef39851bbec02fbd9fb35 not found: ID does not exist"
Oct 03 13:24:08 crc kubenswrapper[4962]: I1003 13:24:08.817495 4962 scope.go:117] "RemoveContainer" containerID="d903e7d242246e68f9d9b26b07a83bebf19d24ab00bab49a311027ea7176d7fe"
Oct 03 13:24:08 crc kubenswrapper[4962]: E1003 13:24:08.818439 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d903e7d242246e68f9d9b26b07a83bebf19d24ab00bab49a311027ea7176d7fe\": container with ID starting with d903e7d242246e68f9d9b26b07a83bebf19d24ab00bab49a311027ea7176d7fe not found: ID does not exist" containerID="d903e7d242246e68f9d9b26b07a83bebf19d24ab00bab49a311027ea7176d7fe"
Oct 03 13:24:08 crc kubenswrapper[4962]: I1003 13:24:08.818485 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d903e7d242246e68f9d9b26b07a83bebf19d24ab00bab49a311027ea7176d7fe"} err="failed to get container status \"d903e7d242246e68f9d9b26b07a83bebf19d24ab00bab49a311027ea7176d7fe\": rpc error: code = NotFound desc = could not find container \"d903e7d242246e68f9d9b26b07a83bebf19d24ab00bab49a311027ea7176d7fe\": container with ID starting with d903e7d242246e68f9d9b26b07a83bebf19d24ab00bab49a311027ea7176d7fe not found: ID does not exist"
Oct 03 13:24:10 crc kubenswrapper[4962]: I1003 13:24:10.235302 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b" path="/var/lib/kubelet/pods/37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b/volumes"
Oct 03 13:25:19 crc kubenswrapper[4962]: I1003 13:25:19.520102 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5x4gw"]
Oct 03 13:25:19 crc kubenswrapper[4962]: E1003 13:25:19.521026 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b" containerName="registry-server"
Oct 03 13:25:19 crc kubenswrapper[4962]: I1003 13:25:19.521057 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b" containerName="registry-server"
Oct 03 13:25:19 crc kubenswrapper[4962]: E1003 13:25:19.521086 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b" containerName="extract-utilities"
Oct 03 13:25:19 crc kubenswrapper[4962]: I1003 13:25:19.521095 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b" containerName="extract-utilities"
Oct 03 13:25:19 crc kubenswrapper[4962]: E1003 13:25:19.521125 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b" containerName="extract-content"
Oct 03 13:25:19 crc kubenswrapper[4962]: I1003 13:25:19.521132 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b" containerName="extract-content"
Oct 03 13:25:19 crc kubenswrapper[4962]: I1003 13:25:19.521322 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="37e72cde-fdbf-4b41-94aa-f7fb4da6ac6b" containerName="registry-server"
Oct 03 13:25:19 crc kubenswrapper[4962]: I1003 13:25:19.522392 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5x4gw"
Oct 03 13:25:19 crc kubenswrapper[4962]: I1003 13:25:19.539390 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5x4gw"]
Oct 03 13:25:19 crc kubenswrapper[4962]: I1003 13:25:19.613623 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5c28357-a410-4b0b-97d0-c93c91605bbf-catalog-content\") pod \"community-operators-5x4gw\" (UID: \"a5c28357-a410-4b0b-97d0-c93c91605bbf\") " pod="openshift-marketplace/community-operators-5x4gw"
Oct 03 13:25:19 crc kubenswrapper[4962]: I1003 13:25:19.613693 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5c28357-a410-4b0b-97d0-c93c91605bbf-utilities\") pod \"community-operators-5x4gw\" (UID: \"a5c28357-a410-4b0b-97d0-c93c91605bbf\") " pod="openshift-marketplace/community-operators-5x4gw"
Oct 03 13:25:19 crc kubenswrapper[4962]: I1003 13:25:19.613955 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2dtb\" (UniqueName: \"kubernetes.io/projected/a5c28357-a410-4b0b-97d0-c93c91605bbf-kube-api-access-g2dtb\") pod \"community-operators-5x4gw\" (UID: \"a5c28357-a410-4b0b-97d0-c93c91605bbf\") " pod="openshift-marketplace/community-operators-5x4gw"
Oct 03 13:25:19 crc kubenswrapper[4962]: I1003 13:25:19.715879 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5c28357-a410-4b0b-97d0-c93c91605bbf-catalog-content\") pod \"community-operators-5x4gw\" (UID: \"a5c28357-a410-4b0b-97d0-c93c91605bbf\") " pod="openshift-marketplace/community-operators-5x4gw"
Oct 03 13:25:19 crc kubenswrapper[4962]: I1003 13:25:19.715942 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5c28357-a410-4b0b-97d0-c93c91605bbf-utilities\") pod \"community-operators-5x4gw\" (UID: \"a5c28357-a410-4b0b-97d0-c93c91605bbf\") " pod="openshift-marketplace/community-operators-5x4gw"
Oct 03 13:25:19 crc kubenswrapper[4962]: I1003 13:25:19.716013 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2dtb\" (UniqueName: \"kubernetes.io/projected/a5c28357-a410-4b0b-97d0-c93c91605bbf-kube-api-access-g2dtb\") pod \"community-operators-5x4gw\" (UID: \"a5c28357-a410-4b0b-97d0-c93c91605bbf\") " pod="openshift-marketplace/community-operators-5x4gw"
Oct 03 13:25:19 crc kubenswrapper[4962]: I1003 13:25:19.716502 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5c28357-a410-4b0b-97d0-c93c91605bbf-utilities\") pod \"community-operators-5x4gw\" (UID: \"a5c28357-a410-4b0b-97d0-c93c91605bbf\") " pod="openshift-marketplace/community-operators-5x4gw"
Oct 03 13:25:19 crc kubenswrapper[4962]: I1003 13:25:19.716514 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5c28357-a410-4b0b-97d0-c93c91605bbf-catalog-content\") pod \"community-operators-5x4gw\" (UID: \"a5c28357-a410-4b0b-97d0-c93c91605bbf\") " pod="openshift-marketplace/community-operators-5x4gw"
Oct 03 13:25:19 crc kubenswrapper[4962]: I1003 13:25:19.741132 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2dtb\" (UniqueName: \"kubernetes.io/projected/a5c28357-a410-4b0b-97d0-c93c91605bbf-kube-api-access-g2dtb\") pod \"community-operators-5x4gw\" (UID: \"a5c28357-a410-4b0b-97d0-c93c91605bbf\") " pod="openshift-marketplace/community-operators-5x4gw"
Oct 03 13:25:19 crc kubenswrapper[4962]: I1003 13:25:19.854627 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5x4gw"
Oct 03 13:25:20 crc kubenswrapper[4962]: I1003 13:25:20.371172 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5x4gw"]
Oct 03 13:25:21 crc kubenswrapper[4962]: I1003 13:25:21.261753 4962 generic.go:334] "Generic (PLEG): container finished" podID="a5c28357-a410-4b0b-97d0-c93c91605bbf" containerID="5defaf3f1094fc059ff340c199e3021058b624b4855ec1b532b1657ca60ea28a" exitCode=0
Oct 03 13:25:21 crc kubenswrapper[4962]: I1003 13:25:21.261851 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5x4gw" event={"ID":"a5c28357-a410-4b0b-97d0-c93c91605bbf","Type":"ContainerDied","Data":"5defaf3f1094fc059ff340c199e3021058b624b4855ec1b532b1657ca60ea28a"}
Oct 03 13:25:21 crc kubenswrapper[4962]: I1003 13:25:21.262114 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5x4gw" event={"ID":"a5c28357-a410-4b0b-97d0-c93c91605bbf","Type":"ContainerStarted","Data":"456cbc6bbb50ca670524025489d269824202b7cd13d767c524d4ef2a52d2134a"}
Oct 03 13:25:22 crc kubenswrapper[4962]: I1003 13:25:22.272242 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5x4gw" event={"ID":"a5c28357-a410-4b0b-97d0-c93c91605bbf","Type":"ContainerStarted","Data":"64ee9cfe62a27f42499f16eb2220f56ee158f0fdaaccf7edb78542dbbff76843"}
Oct 03 13:25:23 crc kubenswrapper[4962]: I1003 13:25:23.282703 4962 generic.go:334] "Generic (PLEG): container finished" podID="a5c28357-a410-4b0b-97d0-c93c91605bbf" containerID="64ee9cfe62a27f42499f16eb2220f56ee158f0fdaaccf7edb78542dbbff76843" exitCode=0
Oct 03 13:25:23 crc kubenswrapper[4962]: I1003 13:25:23.282776 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5x4gw" event={"ID":"a5c28357-a410-4b0b-97d0-c93c91605bbf","Type":"ContainerDied","Data":"64ee9cfe62a27f42499f16eb2220f56ee158f0fdaaccf7edb78542dbbff76843"}
Oct 03 13:25:24 crc kubenswrapper[4962]: I1003 13:25:24.291606 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5x4gw" event={"ID":"a5c28357-a410-4b0b-97d0-c93c91605bbf","Type":"ContainerStarted","Data":"f300d881e56fe58382dd259f46f5580db864b608b15deab74f6e7cff8b07784d"}
Oct 03 13:25:24 crc kubenswrapper[4962]: I1003 13:25:24.335533 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5x4gw" podStartSLOduration=2.808798671 podStartE2EDuration="5.335513704s" podCreationTimestamp="2025-10-03 13:25:19 +0000 UTC" firstStartedPulling="2025-10-03 13:25:21.263737547 +0000 UTC m=+2129.667635382" lastFinishedPulling="2025-10-03 13:25:23.79045255 +0000 UTC m=+2132.194350415" observedRunningTime="2025-10-03 13:25:24.330704975 +0000 UTC m=+2132.734602850" watchObservedRunningTime="2025-10-03 13:25:24.335513704 +0000 UTC m=+2132.739411549"
Oct 03 13:25:29 crc kubenswrapper[4962]: I1003 13:25:29.855874 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5x4gw"
Oct 03 13:25:29 crc kubenswrapper[4962]: I1003 13:25:29.856439 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5x4gw"
Oct 03 13:25:29 crc kubenswrapper[4962]: I1003 13:25:29.893890 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5x4gw"
Oct 03 13:25:30 crc kubenswrapper[4962]: I1003 13:25:30.433171 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5x4gw"
Oct 03 13:25:30 crc kubenswrapper[4962]: I1003 13:25:30.480412 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5x4gw"]
Oct 03 13:25:32 crc kubenswrapper[4962]: I1003 13:25:32.395065 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5x4gw" podUID="a5c28357-a410-4b0b-97d0-c93c91605bbf" containerName="registry-server" containerID="cri-o://f300d881e56fe58382dd259f46f5580db864b608b15deab74f6e7cff8b07784d" gracePeriod=2
Oct 03 13:25:33 crc kubenswrapper[4962]: I1003 13:25:33.407027 4962 generic.go:334] "Generic (PLEG): container finished" podID="a5c28357-a410-4b0b-97d0-c93c91605bbf" containerID="f300d881e56fe58382dd259f46f5580db864b608b15deab74f6e7cff8b07784d" exitCode=0
Oct 03 13:25:33 crc kubenswrapper[4962]: I1003 13:25:33.407082 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5x4gw" event={"ID":"a5c28357-a410-4b0b-97d0-c93c91605bbf","Type":"ContainerDied","Data":"f300d881e56fe58382dd259f46f5580db864b608b15deab74f6e7cff8b07784d"}
Oct 03 13:25:33 crc kubenswrapper[4962]: I1003 13:25:33.407722 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5x4gw" event={"ID":"a5c28357-a410-4b0b-97d0-c93c91605bbf","Type":"ContainerDied","Data":"456cbc6bbb50ca670524025489d269824202b7cd13d767c524d4ef2a52d2134a"}
Oct 03 13:25:33 crc kubenswrapper[4962]: I1003 13:25:33.407745 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="456cbc6bbb50ca670524025489d269824202b7cd13d767c524d4ef2a52d2134a"
Oct 03 13:25:33 crc kubenswrapper[4962]: I1003 13:25:33.441654 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5x4gw"
Oct 03 13:25:33 crc kubenswrapper[4962]: I1003 13:25:33.538342 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5c28357-a410-4b0b-97d0-c93c91605bbf-utilities\") pod \"a5c28357-a410-4b0b-97d0-c93c91605bbf\" (UID: \"a5c28357-a410-4b0b-97d0-c93c91605bbf\") "
Oct 03 13:25:33 crc kubenswrapper[4962]: I1003 13:25:33.538549 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2dtb\" (UniqueName: \"kubernetes.io/projected/a5c28357-a410-4b0b-97d0-c93c91605bbf-kube-api-access-g2dtb\") pod \"a5c28357-a410-4b0b-97d0-c93c91605bbf\" (UID: \"a5c28357-a410-4b0b-97d0-c93c91605bbf\") "
Oct 03 13:25:33 crc kubenswrapper[4962]: I1003 13:25:33.538579 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5c28357-a410-4b0b-97d0-c93c91605bbf-catalog-content\") pod \"a5c28357-a410-4b0b-97d0-c93c91605bbf\" (UID: \"a5c28357-a410-4b0b-97d0-c93c91605bbf\") "
Oct 03 13:25:33 crc kubenswrapper[4962]: I1003 13:25:33.539623 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5c28357-a410-4b0b-97d0-c93c91605bbf-utilities" (OuterVolumeSpecName: "utilities") pod "a5c28357-a410-4b0b-97d0-c93c91605bbf" (UID: "a5c28357-a410-4b0b-97d0-c93c91605bbf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 13:25:33 crc kubenswrapper[4962]: I1003 13:25:33.543854 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5c28357-a410-4b0b-97d0-c93c91605bbf-kube-api-access-g2dtb" (OuterVolumeSpecName: "kube-api-access-g2dtb") pod "a5c28357-a410-4b0b-97d0-c93c91605bbf" (UID: "a5c28357-a410-4b0b-97d0-c93c91605bbf"). InnerVolumeSpecName "kube-api-access-g2dtb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 13:25:33 crc kubenswrapper[4962]: I1003 13:25:33.585976 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5c28357-a410-4b0b-97d0-c93c91605bbf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a5c28357-a410-4b0b-97d0-c93c91605bbf" (UID: "a5c28357-a410-4b0b-97d0-c93c91605bbf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 13:25:33 crc kubenswrapper[4962]: I1003 13:25:33.640356 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2dtb\" (UniqueName: \"kubernetes.io/projected/a5c28357-a410-4b0b-97d0-c93c91605bbf-kube-api-access-g2dtb\") on node \"crc\" DevicePath \"\""
Oct 03 13:25:33 crc kubenswrapper[4962]: I1003 13:25:33.640393 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5c28357-a410-4b0b-97d0-c93c91605bbf-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 03 13:25:33 crc kubenswrapper[4962]: I1003 13:25:33.640405 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5c28357-a410-4b0b-97d0-c93c91605bbf-utilities\") on node \"crc\" DevicePath \"\""
Oct 03 13:25:34 crc kubenswrapper[4962]: I1003 13:25:34.414103 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5x4gw"
Oct 03 13:25:34 crc kubenswrapper[4962]: I1003 13:25:34.438406 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5x4gw"]
Oct 03 13:25:34 crc kubenswrapper[4962]: I1003 13:25:34.444839 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5x4gw"]
Oct 03 13:25:36 crc kubenswrapper[4962]: I1003 13:25:36.235253 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5c28357-a410-4b0b-97d0-c93c91605bbf" path="/var/lib/kubelet/pods/a5c28357-a410-4b0b-97d0-c93c91605bbf/volumes"
Oct 03 13:25:54 crc kubenswrapper[4962]: I1003 13:25:54.662988 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 13:25:54 crc kubenswrapper[4962]: I1003 13:25:54.663727 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 13:26:24 crc kubenswrapper[4962]: I1003 13:26:24.660304 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 13:26:24 crc kubenswrapper[4962]: I1003 13:26:24.660974 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 13:26:54 crc kubenswrapper[4962]: I1003 13:26:54.659915 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 13:26:54 crc kubenswrapper[4962]: I1003 13:26:54.660577 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 13:26:54 crc kubenswrapper[4962]: I1003 13:26:54.660662 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-46vck"
Oct 03 13:26:54 crc kubenswrapper[4962]: I1003 13:26:54.661467 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"435ccd24f7e61eca9f3a50b20a45f04e1ed70ff0737ebdb93094248169075ae3"} pod="openshift-machine-config-operator/machine-config-daemon-46vck" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 03 13:26:54 crc kubenswrapper[4962]: I1003 13:26:54.661552 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" containerID="cri-o://435ccd24f7e61eca9f3a50b20a45f04e1ed70ff0737ebdb93094248169075ae3" gracePeriod=600
Oct 03 13:26:54 crc kubenswrapper[4962]: E1003 13:26:54.795560 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 13:26:55 crc kubenswrapper[4962]: I1003 13:26:55.051627 4962 generic.go:334] "Generic (PLEG): container finished" podID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerID="435ccd24f7e61eca9f3a50b20a45f04e1ed70ff0737ebdb93094248169075ae3" exitCode=0
Oct 03 13:26:55 crc kubenswrapper[4962]: I1003 13:26:55.051702 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerDied","Data":"435ccd24f7e61eca9f3a50b20a45f04e1ed70ff0737ebdb93094248169075ae3"}
Oct 03 13:26:55 crc kubenswrapper[4962]: I1003 13:26:55.051742 4962 scope.go:117] "RemoveContainer" containerID="202ec3a5bbee3d78171eec8c7e876813cc91dbe7975b20c5f5f7d2c2f4ad92e2"
Oct 03 13:26:55 crc kubenswrapper[4962]: I1003 13:26:55.052291 4962 scope.go:117] "RemoveContainer" containerID="435ccd24f7e61eca9f3a50b20a45f04e1ed70ff0737ebdb93094248169075ae3"
Oct 03 13:26:55 crc kubenswrapper[4962]: E1003 13:26:55.052567 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 13:27:06 crc kubenswrapper[4962]: I1003 13:27:06.227125 4962 scope.go:117] "RemoveContainer" containerID="435ccd24f7e61eca9f3a50b20a45f04e1ed70ff0737ebdb93094248169075ae3"
Oct 03 13:27:06 crc kubenswrapper[4962]: E1003 13:27:06.227907 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 13:27:18 crc kubenswrapper[4962]: I1003 13:27:18.226947 4962 scope.go:117] "RemoveContainer" containerID="435ccd24f7e61eca9f3a50b20a45f04e1ed70ff0737ebdb93094248169075ae3"
Oct 03 13:27:18 crc kubenswrapper[4962]: E1003 13:27:18.228034 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 13:27:32 crc kubenswrapper[4962]: I1003 13:27:32.232661 4962 scope.go:117] "RemoveContainer" containerID="435ccd24f7e61eca9f3a50b20a45f04e1ed70ff0737ebdb93094248169075ae3"
Oct 03 13:27:32 crc kubenswrapper[4962]: E1003 13:27:32.233713 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 13:27:46 crc kubenswrapper[4962]: I1003 13:27:46.481994 4962 scope.go:117] "RemoveContainer" containerID="435ccd24f7e61eca9f3a50b20a45f04e1ed70ff0737ebdb93094248169075ae3"
Oct 03 13:27:46 crc kubenswrapper[4962]: E1003 13:27:46.488986 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 13:27:59 crc kubenswrapper[4962]: I1003 13:27:59.227451 4962 scope.go:117] "RemoveContainer" containerID="435ccd24f7e61eca9f3a50b20a45f04e1ed70ff0737ebdb93094248169075ae3"
Oct 03 13:27:59 crc kubenswrapper[4962]: E1003 13:27:59.228167 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 13:28:12 crc kubenswrapper[4962]: I1003 13:28:12.233146 4962 scope.go:117] "RemoveContainer" containerID="435ccd24f7e61eca9f3a50b20a45f04e1ed70ff0737ebdb93094248169075ae3"
Oct 03 13:28:12 crc kubenswrapper[4962]: E1003 13:28:12.234048 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 13:28:26 crc kubenswrapper[4962]: I1003 13:28:26.228865 4962 scope.go:117] "RemoveContainer" containerID="435ccd24f7e61eca9f3a50b20a45f04e1ed70ff0737ebdb93094248169075ae3"
Oct 03 13:28:26 crc kubenswrapper[4962]: E1003 13:28:26.229797 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 13:28:41 crc kubenswrapper[4962]: I1003 13:28:41.227612 4962 scope.go:117] "RemoveContainer" containerID="435ccd24f7e61eca9f3a50b20a45f04e1ed70ff0737ebdb93094248169075ae3"
Oct 03 13:28:41 crc kubenswrapper[4962]: E1003 13:28:41.228415 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 13:28:52 crc kubenswrapper[4962]: I1003 13:28:52.236847 4962 scope.go:117] "RemoveContainer" containerID="435ccd24f7e61eca9f3a50b20a45f04e1ed70ff0737ebdb93094248169075ae3"
Oct 03 13:28:52 crc kubenswrapper[4962]: E1003 13:28:52.238537 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 13:28:54 crc kubenswrapper[4962]: I1003 13:28:54.943081 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-g2wfj"]
Oct 03 13:28:54 crc kubenswrapper[4962]: E1003 13:28:54.943985 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5c28357-a410-4b0b-97d0-c93c91605bbf" containerName="extract-utilities"
Oct 03 13:28:54 crc kubenswrapper[4962]: I1003 13:28:54.944011 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5c28357-a410-4b0b-97d0-c93c91605bbf" containerName="extract-utilities"
Oct 03 13:28:54 crc kubenswrapper[4962]: E1003 13:28:54.944030 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5c28357-a410-4b0b-97d0-c93c91605bbf" containerName="extract-content"
Oct 03 13:28:54 crc kubenswrapper[4962]: I1003 13:28:54.944043 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5c28357-a410-4b0b-97d0-c93c91605bbf" containerName="extract-content"
Oct 03 13:28:54 crc kubenswrapper[4962]: E1003 13:28:54.944099 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5c28357-a410-4b0b-97d0-c93c91605bbf" containerName="registry-server"
Oct 03 13:28:54 crc kubenswrapper[4962]: I1003 13:28:54.944113 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5c28357-a410-4b0b-97d0-c93c91605bbf" containerName="registry-server"
Oct 03 13:28:54 crc kubenswrapper[4962]: I1003 13:28:54.944392 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5c28357-a410-4b0b-97d0-c93c91605bbf" containerName="registry-server"
Oct 03 13:28:54 crc kubenswrapper[4962]: I1003 13:28:54.946201 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g2wfj"
Oct 03 13:28:54 crc kubenswrapper[4962]: I1003 13:28:54.980484 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g2wfj"]
Oct 03 13:28:55 crc kubenswrapper[4962]: I1003 13:28:55.058886 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njbpr\" (UniqueName: \"kubernetes.io/projected/e225514b-6f35-4a1d-8ca5-4dc0f3d13aab-kube-api-access-njbpr\") pod \"redhat-operators-g2wfj\" (UID: \"e225514b-6f35-4a1d-8ca5-4dc0f3d13aab\") " pod="openshift-marketplace/redhat-operators-g2wfj"
Oct 03 13:28:55 crc kubenswrapper[4962]: I1003 13:28:55.058959 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e225514b-6f35-4a1d-8ca5-4dc0f3d13aab-catalog-content\") pod \"redhat-operators-g2wfj\" (UID: \"e225514b-6f35-4a1d-8ca5-4dc0f3d13aab\") " pod="openshift-marketplace/redhat-operators-g2wfj"
Oct 03 13:28:55 crc kubenswrapper[4962]: I1003 13:28:55.058995 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e225514b-6f35-4a1d-8ca5-4dc0f3d13aab-utilities\") pod \"redhat-operators-g2wfj\" (UID: \"e225514b-6f35-4a1d-8ca5-4dc0f3d13aab\") " pod="openshift-marketplace/redhat-operators-g2wfj"
Oct 03 13:28:55 crc kubenswrapper[4962]: I1003 13:28:55.161097 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njbpr\" (UniqueName: \"kubernetes.io/projected/e225514b-6f35-4a1d-8ca5-4dc0f3d13aab-kube-api-access-njbpr\") pod \"redhat-operators-g2wfj\" (UID: \"e225514b-6f35-4a1d-8ca5-4dc0f3d13aab\") " pod="openshift-marketplace/redhat-operators-g2wfj"
Oct 03 13:28:55 crc kubenswrapper[4962]: I1003 13:28:55.161206 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e225514b-6f35-4a1d-8ca5-4dc0f3d13aab-catalog-content\") pod \"redhat-operators-g2wfj\" (UID: \"e225514b-6f35-4a1d-8ca5-4dc0f3d13aab\") " pod="openshift-marketplace/redhat-operators-g2wfj"
Oct 03 13:28:55 crc kubenswrapper[4962]: I1003 13:28:55.161258 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e225514b-6f35-4a1d-8ca5-4dc0f3d13aab-utilities\") pod \"redhat-operators-g2wfj\" (UID: \"e225514b-6f35-4a1d-8ca5-4dc0f3d13aab\") " pod="openshift-marketplace/redhat-operators-g2wfj"
Oct 03 13:28:55 crc kubenswrapper[4962]: I1003 13:28:55.161765 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e225514b-6f35-4a1d-8ca5-4dc0f3d13aab-catalog-content\") pod \"redhat-operators-g2wfj\" (UID: \"e225514b-6f35-4a1d-8ca5-4dc0f3d13aab\") " pod="openshift-marketplace/redhat-operators-g2wfj"
Oct 03 13:28:55 crc kubenswrapper[4962]: I1003 13:28:55.162030 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e225514b-6f35-4a1d-8ca5-4dc0f3d13aab-utilities\") pod \"redhat-operators-g2wfj\" (UID: \"e225514b-6f35-4a1d-8ca5-4dc0f3d13aab\") " pod="openshift-marketplace/redhat-operators-g2wfj"
Oct 03 13:28:55 crc kubenswrapper[4962]: I1003 13:28:55.191325 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njbpr\" (UniqueName: \"kubernetes.io/projected/e225514b-6f35-4a1d-8ca5-4dc0f3d13aab-kube-api-access-njbpr\") pod \"redhat-operators-g2wfj\" (UID: \"e225514b-6f35-4a1d-8ca5-4dc0f3d13aab\") " pod="openshift-marketplace/redhat-operators-g2wfj"
Oct 03 13:28:55 crc kubenswrapper[4962]: I1003 13:28:55.282683 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g2wfj"
Oct 03 13:28:55 crc kubenswrapper[4962]: I1003 13:28:55.532769 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g2wfj"]
Oct 03 13:28:56 crc kubenswrapper[4962]: I1003 13:28:56.076291 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g2wfj" event={"ID":"e225514b-6f35-4a1d-8ca5-4dc0f3d13aab","Type":"ContainerStarted","Data":"12f0184a0ad8415649b525733911b2e88866fd1218ecfd5ab18d60ffd434c5be"}
Oct 03 13:28:57 crc kubenswrapper[4962]: I1003 13:28:57.086325 4962 generic.go:334] "Generic (PLEG): container finished" podID="e225514b-6f35-4a1d-8ca5-4dc0f3d13aab" containerID="30ba49df67a7f977a0e037071f9dff3f3918ae9bd45d3d3e0354a565e77b5d1a" exitCode=0
Oct 03 13:28:57 crc kubenswrapper[4962]: I1003 13:28:57.086591 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g2wfj" event={"ID":"e225514b-6f35-4a1d-8ca5-4dc0f3d13aab","Type":"ContainerDied","Data":"30ba49df67a7f977a0e037071f9dff3f3918ae9bd45d3d3e0354a565e77b5d1a"}
Oct 03 13:28:57 crc kubenswrapper[4962]: I1003 13:28:57.090077 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 03 13:29:01 crc kubenswrapper[4962]: I1003 13:29:01.125115 4962 generic.go:334] "Generic (PLEG): container finished" podID="e225514b-6f35-4a1d-8ca5-4dc0f3d13aab" containerID="4bd740c49af663115a75f489894b81c056f990e3fc68b55914cd7f59ffd5631c" exitCode=0
Oct 03 13:29:01 crc kubenswrapper[4962]: I1003 13:29:01.125222 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g2wfj" event={"ID":"e225514b-6f35-4a1d-8ca5-4dc0f3d13aab","Type":"ContainerDied","Data":"4bd740c49af663115a75f489894b81c056f990e3fc68b55914cd7f59ffd5631c"}
Oct 03 13:29:03 crc kubenswrapper[4962]: I1003 13:29:03.139782 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g2wfj" event={"ID":"e225514b-6f35-4a1d-8ca5-4dc0f3d13aab","Type":"ContainerStarted","Data":"de9314aa93d139f9a30ed44e77bca3a5904fb527ad576c52c43fa99a75e4434d"}
Oct 03 13:29:03 crc kubenswrapper[4962]: I1003 13:29:03.173072 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-g2wfj" podStartSLOduration=4.188921453 podStartE2EDuration="9.1730498s" podCreationTimestamp="2025-10-03 13:28:54 +0000 UTC" firstStartedPulling="2025-10-03 13:28:57.089790883 +0000 UTC m=+2345.493688728" lastFinishedPulling="2025-10-03 13:29:02.07391923 +0000 UTC m=+2350.477817075" observedRunningTime="2025-10-03 13:29:03.16784558 +0000 UTC m=+2351.571743435" watchObservedRunningTime="2025-10-03 13:29:03.1730498 +0000 UTC m=+2351.576947635"
Oct 03 13:29:03 crc kubenswrapper[4962]: I1003 13:29:03.226775 4962 scope.go:117] "RemoveContainer" containerID="435ccd24f7e61eca9f3a50b20a45f04e1ed70ff0737ebdb93094248169075ae3"
Oct 03 13:29:03 crc kubenswrapper[4962]: E1003 13:29:03.227338 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 13:29:05 crc kubenswrapper[4962]: I1003 13:29:05.282981 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-g2wfj"
Oct 03 13:29:05 crc kubenswrapper[4962]: I1003 13:29:05.283242 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-g2wfj"
Oct 03 13:29:06 crc kubenswrapper[4962]: I1003 13:29:06.327542 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-g2wfj" podUID="e225514b-6f35-4a1d-8ca5-4dc0f3d13aab" containerName="registry-server" probeResult="failure" output=<
Oct 03 13:29:06 crc kubenswrapper[4962]: timeout: failed to connect service ":50051" within 1s
Oct 03 13:29:06 crc kubenswrapper[4962]: >
Oct 03 13:29:14 crc kubenswrapper[4962]: I1003 13:29:14.227018 4962 scope.go:117] "RemoveContainer" containerID="435ccd24f7e61eca9f3a50b20a45f04e1ed70ff0737ebdb93094248169075ae3"
Oct 03 13:29:14 crc kubenswrapper[4962]: E1003 13:29:14.227917 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 13:29:15 crc kubenswrapper[4962]: I1003 13:29:15.331296 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-g2wfj"
Oct 03 13:29:15 crc kubenswrapper[4962]: I1003 13:29:15.384837 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-g2wfj"
Oct 03 13:29:15 crc kubenswrapper[4962]: I1003 13:29:15.565112 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g2wfj"]
Oct 03 13:29:17 crc kubenswrapper[4962]: I1003 13:29:17.246973 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-g2wfj" podUID="e225514b-6f35-4a1d-8ca5-4dc0f3d13aab" containerName="registry-server" containerID="cri-o://de9314aa93d139f9a30ed44e77bca3a5904fb527ad576c52c43fa99a75e4434d" gracePeriod=2
Oct 03 13:29:17 crc kubenswrapper[4962]: I1003 13:29:17.681854 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g2wfj"
Oct 03 13:29:17 crc kubenswrapper[4962]: I1003 13:29:17.818269 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e225514b-6f35-4a1d-8ca5-4dc0f3d13aab-utilities\") pod \"e225514b-6f35-4a1d-8ca5-4dc0f3d13aab\" (UID: \"e225514b-6f35-4a1d-8ca5-4dc0f3d13aab\") "
Oct 03 13:29:17 crc kubenswrapper[4962]: I1003 13:29:17.818324 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njbpr\" (UniqueName: \"kubernetes.io/projected/e225514b-6f35-4a1d-8ca5-4dc0f3d13aab-kube-api-access-njbpr\") pod \"e225514b-6f35-4a1d-8ca5-4dc0f3d13aab\" (UID: \"e225514b-6f35-4a1d-8ca5-4dc0f3d13aab\") "
Oct 03 13:29:17 crc kubenswrapper[4962]: I1003 13:29:17.818394 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e225514b-6f35-4a1d-8ca5-4dc0f3d13aab-catalog-content\") pod \"e225514b-6f35-4a1d-8ca5-4dc0f3d13aab\" (UID: \"e225514b-6f35-4a1d-8ca5-4dc0f3d13aab\") "
Oct 03 13:29:17 crc kubenswrapper[4962]: I1003 13:29:17.819340 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e225514b-6f35-4a1d-8ca5-4dc0f3d13aab-utilities" (OuterVolumeSpecName: "utilities") pod "e225514b-6f35-4a1d-8ca5-4dc0f3d13aab" (UID: "e225514b-6f35-4a1d-8ca5-4dc0f3d13aab"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 13:29:17 crc kubenswrapper[4962]: I1003 13:29:17.826230 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e225514b-6f35-4a1d-8ca5-4dc0f3d13aab-kube-api-access-njbpr" (OuterVolumeSpecName: "kube-api-access-njbpr") pod "e225514b-6f35-4a1d-8ca5-4dc0f3d13aab" (UID: "e225514b-6f35-4a1d-8ca5-4dc0f3d13aab"). InnerVolumeSpecName "kube-api-access-njbpr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 13:29:17 crc kubenswrapper[4962]: I1003 13:29:17.916218 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e225514b-6f35-4a1d-8ca5-4dc0f3d13aab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e225514b-6f35-4a1d-8ca5-4dc0f3d13aab" (UID: "e225514b-6f35-4a1d-8ca5-4dc0f3d13aab"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 13:29:17 crc kubenswrapper[4962]: I1003 13:29:17.920326 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njbpr\" (UniqueName: \"kubernetes.io/projected/e225514b-6f35-4a1d-8ca5-4dc0f3d13aab-kube-api-access-njbpr\") on node \"crc\" DevicePath \"\""
Oct 03 13:29:17 crc kubenswrapper[4962]: I1003 13:29:17.920366 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e225514b-6f35-4a1d-8ca5-4dc0f3d13aab-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 03 13:29:17 crc kubenswrapper[4962]: I1003 13:29:17.920378 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e225514b-6f35-4a1d-8ca5-4dc0f3d13aab-utilities\") on node \"crc\" DevicePath \"\""
Oct 03 13:29:18 crc kubenswrapper[4962]: I1003 13:29:18.258780 4962 generic.go:334] "Generic (PLEG): container finished" podID="e225514b-6f35-4a1d-8ca5-4dc0f3d13aab" containerID="de9314aa93d139f9a30ed44e77bca3a5904fb527ad576c52c43fa99a75e4434d" exitCode=0
Oct 03 13:29:18 crc kubenswrapper[4962]: I1003 13:29:18.258844 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g2wfj" event={"ID":"e225514b-6f35-4a1d-8ca5-4dc0f3d13aab","Type":"ContainerDied","Data":"de9314aa93d139f9a30ed44e77bca3a5904fb527ad576c52c43fa99a75e4434d"}
Oct 03 13:29:18 crc kubenswrapper[4962]: I1003 13:29:18.258874 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g2wfj" event={"ID":"e225514b-6f35-4a1d-8ca5-4dc0f3d13aab","Type":"ContainerDied","Data":"12f0184a0ad8415649b525733911b2e88866fd1218ecfd5ab18d60ffd434c5be"}
Oct 03 13:29:18 crc kubenswrapper[4962]: I1003 13:29:18.258892 4962 scope.go:117] "RemoveContainer" containerID="de9314aa93d139f9a30ed44e77bca3a5904fb527ad576c52c43fa99a75e4434d"
Oct 03 13:29:18 crc kubenswrapper[4962]: I1003 13:29:18.259056 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g2wfj"
Oct 03 13:29:18 crc kubenswrapper[4962]: I1003 13:29:18.286393 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g2wfj"]
Oct 03 13:29:18 crc kubenswrapper[4962]: I1003 13:29:18.294366 4962 scope.go:117] "RemoveContainer" containerID="4bd740c49af663115a75f489894b81c056f990e3fc68b55914cd7f59ffd5631c"
Oct 03 13:29:18 crc kubenswrapper[4962]: I1003 13:29:18.295094 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-g2wfj"]
Oct 03 13:29:18 crc kubenswrapper[4962]: I1003 13:29:18.319332 4962 scope.go:117] "RemoveContainer" containerID="30ba49df67a7f977a0e037071f9dff3f3918ae9bd45d3d3e0354a565e77b5d1a"
Oct 03 13:29:18 crc kubenswrapper[4962]: I1003 13:29:18.339143 4962 scope.go:117] "RemoveContainer" containerID="de9314aa93d139f9a30ed44e77bca3a5904fb527ad576c52c43fa99a75e4434d"
Oct 03 13:29:18 crc kubenswrapper[4962]: E1003 13:29:18.339614 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de9314aa93d139f9a30ed44e77bca3a5904fb527ad576c52c43fa99a75e4434d\": container with ID starting with de9314aa93d139f9a30ed44e77bca3a5904fb527ad576c52c43fa99a75e4434d not found: ID does not exist" containerID="de9314aa93d139f9a30ed44e77bca3a5904fb527ad576c52c43fa99a75e4434d"
Oct 03 13:29:18 crc kubenswrapper[4962]: I1003 13:29:18.339788 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de9314aa93d139f9a30ed44e77bca3a5904fb527ad576c52c43fa99a75e4434d"} err="failed to get container status \"de9314aa93d139f9a30ed44e77bca3a5904fb527ad576c52c43fa99a75e4434d\": rpc error: code = NotFound desc = could not find container \"de9314aa93d139f9a30ed44e77bca3a5904fb527ad576c52c43fa99a75e4434d\": container with ID starting with de9314aa93d139f9a30ed44e77bca3a5904fb527ad576c52c43fa99a75e4434d not found: ID does not exist"
Oct 03 13:29:18 crc kubenswrapper[4962]: I1003 13:29:18.339857 4962 scope.go:117] "RemoveContainer" containerID="4bd740c49af663115a75f489894b81c056f990e3fc68b55914cd7f59ffd5631c"
Oct 03 13:29:18 crc kubenswrapper[4962]: E1003 13:29:18.340284 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bd740c49af663115a75f489894b81c056f990e3fc68b55914cd7f59ffd5631c\": container with ID starting with 4bd740c49af663115a75f489894b81c056f990e3fc68b55914cd7f59ffd5631c not found: ID does not exist" containerID="4bd740c49af663115a75f489894b81c056f990e3fc68b55914cd7f59ffd5631c"
Oct 03 13:29:18 crc kubenswrapper[4962]: I1003 13:29:18.340320 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bd740c49af663115a75f489894b81c056f990e3fc68b55914cd7f59ffd5631c"} err="failed to get container status \"4bd740c49af663115a75f489894b81c056f990e3fc68b55914cd7f59ffd5631c\": rpc error: code = NotFound desc = could not find container \"4bd740c49af663115a75f489894b81c056f990e3fc68b55914cd7f59ffd5631c\": container with ID starting with 4bd740c49af663115a75f489894b81c056f990e3fc68b55914cd7f59ffd5631c not found: ID does not exist"
Oct 03 13:29:18 crc kubenswrapper[4962]: I1003 13:29:18.340343 4962 scope.go:117] "RemoveContainer" containerID="30ba49df67a7f977a0e037071f9dff3f3918ae9bd45d3d3e0354a565e77b5d1a"
Oct 03 13:29:18 crc kubenswrapper[4962]: E1003 13:29:18.340768 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30ba49df67a7f977a0e037071f9dff3f3918ae9bd45d3d3e0354a565e77b5d1a\": container with ID starting with 30ba49df67a7f977a0e037071f9dff3f3918ae9bd45d3d3e0354a565e77b5d1a not found: ID does not exist" containerID="30ba49df67a7f977a0e037071f9dff3f3918ae9bd45d3d3e0354a565e77b5d1a"
Oct 03 13:29:18 crc kubenswrapper[4962]: I1003 13:29:18.340790 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30ba49df67a7f977a0e037071f9dff3f3918ae9bd45d3d3e0354a565e77b5d1a"} err="failed to get container status \"30ba49df67a7f977a0e037071f9dff3f3918ae9bd45d3d3e0354a565e77b5d1a\": rpc error: code = NotFound desc = could not find container \"30ba49df67a7f977a0e037071f9dff3f3918ae9bd45d3d3e0354a565e77b5d1a\": container with ID starting with 30ba49df67a7f977a0e037071f9dff3f3918ae9bd45d3d3e0354a565e77b5d1a not found: ID does not exist"
Oct 03 13:29:20 crc kubenswrapper[4962]: I1003 13:29:20.236899 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e225514b-6f35-4a1d-8ca5-4dc0f3d13aab" path="/var/lib/kubelet/pods/e225514b-6f35-4a1d-8ca5-4dc0f3d13aab/volumes"
Oct 03 13:29:25 crc kubenswrapper[4962]: I1003 13:29:25.226734 4962 scope.go:117] "RemoveContainer" containerID="435ccd24f7e61eca9f3a50b20a45f04e1ed70ff0737ebdb93094248169075ae3"
Oct 03 13:29:25 crc kubenswrapper[4962]: E1003 13:29:25.227225 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 13:29:38 crc kubenswrapper[4962]: I1003 13:29:38.227193 4962 scope.go:117] "RemoveContainer" containerID="435ccd24f7e61eca9f3a50b20a45f04e1ed70ff0737ebdb93094248169075ae3"
Oct 03 13:29:38 crc kubenswrapper[4962]: E1003 13:29:38.229037 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 13:29:51 crc kubenswrapper[4962]: I1003 13:29:51.227738 4962 scope.go:117] "RemoveContainer" containerID="435ccd24f7e61eca9f3a50b20a45f04e1ed70ff0737ebdb93094248169075ae3"
Oct 03 13:29:51 crc kubenswrapper[4962]: E1003 13:29:51.228722 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 13:30:00 crc kubenswrapper[4962]: I1003 13:30:00.156171 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324970-lmj28"]
Oct 03 13:30:00 crc kubenswrapper[4962]: E1003 13:30:00.157058 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e225514b-6f35-4a1d-8ca5-4dc0f3d13aab" containerName="extract-content"
Oct 03 13:30:00 crc kubenswrapper[4962]: I1003 13:30:00.157073 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="e225514b-6f35-4a1d-8ca5-4dc0f3d13aab" containerName="extract-content"
Oct 03 13:30:00 crc kubenswrapper[4962]: E1003 13:30:00.157091 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e225514b-6f35-4a1d-8ca5-4dc0f3d13aab" containerName="extract-utilities"
Oct 03 13:30:00 crc kubenswrapper[4962]: I1003 13:30:00.157097 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="e225514b-6f35-4a1d-8ca5-4dc0f3d13aab" containerName="extract-utilities"
Oct 03 13:30:00 crc kubenswrapper[4962]: E1003 13:30:00.157115 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e225514b-6f35-4a1d-8ca5-4dc0f3d13aab" containerName="registry-server"
Oct 03 13:30:00 crc kubenswrapper[4962]: I1003 13:30:00.157124 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="e225514b-6f35-4a1d-8ca5-4dc0f3d13aab" containerName="registry-server"
Oct 03 13:30:00 crc kubenswrapper[4962]: I1003 13:30:00.157266 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="e225514b-6f35-4a1d-8ca5-4dc0f3d13aab" containerName="registry-server"
Oct 03 13:30:00 crc kubenswrapper[4962]: I1003 13:30:00.157745 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324970-lmj28"
Oct 03 13:30:00 crc kubenswrapper[4962]: I1003 13:30:00.160079 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 03 13:30:00 crc kubenswrapper[4962]: I1003 13:30:00.160700 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 03 13:30:00 crc kubenswrapper[4962]: I1003 13:30:00.171199 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v99h\" (UniqueName: \"kubernetes.io/projected/4203666e-afcb-4f11-95ea-10adb4fd4940-kube-api-access-4v99h\") pod \"collect-profiles-29324970-lmj28\" (UID: \"4203666e-afcb-4f11-95ea-10adb4fd4940\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324970-lmj28"
Oct 03 13:30:00 crc kubenswrapper[4962]: I1003 13:30:00.171343 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4203666e-afcb-4f11-95ea-10adb4fd4940-secret-volume\") pod \"collect-profiles-29324970-lmj28\" (UID: \"4203666e-afcb-4f11-95ea-10adb4fd4940\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324970-lmj28"
Oct 03 13:30:00 crc kubenswrapper[4962]: I1003 13:30:00.171404 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4203666e-afcb-4f11-95ea-10adb4fd4940-config-volume\") pod \"collect-profiles-29324970-lmj28\" (UID: \"4203666e-afcb-4f11-95ea-10adb4fd4940\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324970-lmj28"
Oct 03 13:30:00 crc kubenswrapper[4962]: I1003 13:30:00.172872 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324970-lmj28"]
Oct 03 13:30:00 crc kubenswrapper[4962]: I1003 13:30:00.272268 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4203666e-afcb-4f11-95ea-10adb4fd4940-secret-volume\") pod \"collect-profiles-29324970-lmj28\" (UID: \"4203666e-afcb-4f11-95ea-10adb4fd4940\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324970-lmj28"
Oct 03 13:30:00 crc kubenswrapper[4962]: I1003 13:30:00.272337 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4203666e-afcb-4f11-95ea-10adb4fd4940-config-volume\") pod \"collect-profiles-29324970-lmj28\" (UID: \"4203666e-afcb-4f11-95ea-10adb4fd4940\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324970-lmj28"
Oct 03 13:30:00 crc kubenswrapper[4962]: I1003 13:30:00.272434 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4v99h\" (UniqueName: \"kubernetes.io/projected/4203666e-afcb-4f11-95ea-10adb4fd4940-kube-api-access-4v99h\") pod \"collect-profiles-29324970-lmj28\" (UID: \"4203666e-afcb-4f11-95ea-10adb4fd4940\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324970-lmj28"
Oct 03 13:30:00 crc kubenswrapper[4962]: I1003 13:30:00.273282 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4203666e-afcb-4f11-95ea-10adb4fd4940-config-volume\") pod \"collect-profiles-29324970-lmj28\" (UID: \"4203666e-afcb-4f11-95ea-10adb4fd4940\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324970-lmj28"
Oct 03 13:30:00 crc kubenswrapper[4962]: I1003 13:30:00.278464 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4203666e-afcb-4f11-95ea-10adb4fd4940-secret-volume\") pod \"collect-profiles-29324970-lmj28\" (UID: \"4203666e-afcb-4f11-95ea-10adb4fd4940\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324970-lmj28"
Oct 03 13:30:00 crc kubenswrapper[4962]: I1003 13:30:00.289430 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v99h\" (UniqueName: \"kubernetes.io/projected/4203666e-afcb-4f11-95ea-10adb4fd4940-kube-api-access-4v99h\") pod \"collect-profiles-29324970-lmj28\" (UID: \"4203666e-afcb-4f11-95ea-10adb4fd4940\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324970-lmj28"
Oct 03 13:30:00 crc kubenswrapper[4962]: I1003 13:30:00.489856 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324970-lmj28"
Oct 03 13:30:00 crc kubenswrapper[4962]: I1003 13:30:00.971519 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324970-lmj28"]
Oct 03 13:30:01 crc kubenswrapper[4962]: I1003 13:30:01.614734 4962 generic.go:334] "Generic (PLEG): container finished" podID="4203666e-afcb-4f11-95ea-10adb4fd4940" containerID="a5a7730b2ce20725594b02065f1854b49e3adcc3f3a98cda84e108fc72e06a8b" exitCode=0
Oct 03 13:30:01 crc kubenswrapper[4962]: I1003 13:30:01.614818 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324970-lmj28" event={"ID":"4203666e-afcb-4f11-95ea-10adb4fd4940","Type":"ContainerDied","Data":"a5a7730b2ce20725594b02065f1854b49e3adcc3f3a98cda84e108fc72e06a8b"}
Oct 03 13:30:01 crc kubenswrapper[4962]: I1003 13:30:01.615116 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324970-lmj28" event={"ID":"4203666e-afcb-4f11-95ea-10adb4fd4940","Type":"ContainerStarted","Data":"858aed28cffee1746c83f8ba9aeff1cf61c1a57612d13c2c6de2b6228c1e3e6f"}
Oct 03 13:30:02 crc kubenswrapper[4962]: I1003 13:30:02.232095 4962 scope.go:117] "RemoveContainer" containerID="435ccd24f7e61eca9f3a50b20a45f04e1ed70ff0737ebdb93094248169075ae3"
Oct 03 13:30:02 crc kubenswrapper[4962]: E1003 13:30:02.232604 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 13:30:02 crc kubenswrapper[4962]: I1003 13:30:02.855920 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324970-lmj28"
Oct 03 13:30:03 crc kubenswrapper[4962]: I1003 13:30:03.016682 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4203666e-afcb-4f11-95ea-10adb4fd4940-secret-volume\") pod \"4203666e-afcb-4f11-95ea-10adb4fd4940\" (UID: \"4203666e-afcb-4f11-95ea-10adb4fd4940\") "
Oct 03 13:30:03 crc kubenswrapper[4962]: I1003 13:30:03.016740 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4203666e-afcb-4f11-95ea-10adb4fd4940-config-volume\") pod \"4203666e-afcb-4f11-95ea-10adb4fd4940\" (UID: \"4203666e-afcb-4f11-95ea-10adb4fd4940\") "
Oct 03 13:30:03 crc kubenswrapper[4962]: I1003 13:30:03.016802 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4v99h\" (UniqueName: \"kubernetes.io/projected/4203666e-afcb-4f11-95ea-10adb4fd4940-kube-api-access-4v99h\") pod \"4203666e-afcb-4f11-95ea-10adb4fd4940\" (UID: \"4203666e-afcb-4f11-95ea-10adb4fd4940\") "
Oct 03 13:30:03 crc kubenswrapper[4962]: I1003 13:30:03.017447 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4203666e-afcb-4f11-95ea-10adb4fd4940-config-volume" (OuterVolumeSpecName: "config-volume") pod "4203666e-afcb-4f11-95ea-10adb4fd4940" (UID: "4203666e-afcb-4f11-95ea-10adb4fd4940"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 13:30:03 crc kubenswrapper[4962]: I1003 13:30:03.022966 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4203666e-afcb-4f11-95ea-10adb4fd4940-kube-api-access-4v99h" (OuterVolumeSpecName: "kube-api-access-4v99h") pod "4203666e-afcb-4f11-95ea-10adb4fd4940" (UID: "4203666e-afcb-4f11-95ea-10adb4fd4940"). InnerVolumeSpecName "kube-api-access-4v99h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 13:30:03 crc kubenswrapper[4962]: I1003 13:30:03.023016 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4203666e-afcb-4f11-95ea-10adb4fd4940-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4203666e-afcb-4f11-95ea-10adb4fd4940" (UID: "4203666e-afcb-4f11-95ea-10adb4fd4940"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 13:30:03 crc kubenswrapper[4962]: I1003 13:30:03.118311 4962 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4203666e-afcb-4f11-95ea-10adb4fd4940-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 03 13:30:03 crc kubenswrapper[4962]: I1003 13:30:03.118380 4962 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4203666e-afcb-4f11-95ea-10adb4fd4940-config-volume\") on node \"crc\" DevicePath \"\""
Oct 03 13:30:03 crc kubenswrapper[4962]: I1003 13:30:03.118397 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4v99h\" (UniqueName: \"kubernetes.io/projected/4203666e-afcb-4f11-95ea-10adb4fd4940-kube-api-access-4v99h\") on node \"crc\" DevicePath \"\""
Oct 03 13:30:03 crc kubenswrapper[4962]: I1003 13:30:03.631570 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324970-lmj28" event={"ID":"4203666e-afcb-4f11-95ea-10adb4fd4940","Type":"ContainerDied","Data":"858aed28cffee1746c83f8ba9aeff1cf61c1a57612d13c2c6de2b6228c1e3e6f"}
Oct 03 13:30:03 crc kubenswrapper[4962]: I1003 13:30:03.631935 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="858aed28cffee1746c83f8ba9aeff1cf61c1a57612d13c2c6de2b6228c1e3e6f"
Oct 03 13:30:03 crc kubenswrapper[4962]: I1003 13:30:03.631660 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324970-lmj28"
Oct 03 13:30:03 crc kubenswrapper[4962]: I1003 13:30:03.917314 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324925-rjxv2"]
Oct 03 13:30:03 crc kubenswrapper[4962]: I1003 13:30:03.921768 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324925-rjxv2"]
Oct 03 13:30:04 crc kubenswrapper[4962]: I1003 13:30:04.242122 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a478f682-ff2f-4920-b535-24b1675ce2c7" path="/var/lib/kubelet/pods/a478f682-ff2f-4920-b535-24b1675ce2c7/volumes"
Oct 03 13:30:15 crc kubenswrapper[4962]: I1003 13:30:15.227206 4962 scope.go:117] "RemoveContainer" containerID="435ccd24f7e61eca9f3a50b20a45f04e1ed70ff0737ebdb93094248169075ae3"
Oct 03 13:30:15 crc kubenswrapper[4962]: E1003 13:30:15.227961 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 13:30:29 crc kubenswrapper[4962]: I1003 13:30:29.227024 4962 scope.go:117] "RemoveContainer" containerID="435ccd24f7e61eca9f3a50b20a45f04e1ed70ff0737ebdb93094248169075ae3"
Oct 03 13:30:29 crc kubenswrapper[4962]: E1003 13:30:29.227621 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 13:30:42 crc kubenswrapper[4962]: I1003 13:30:42.231179 4962 scope.go:117] "RemoveContainer" containerID="435ccd24f7e61eca9f3a50b20a45f04e1ed70ff0737ebdb93094248169075ae3"
Oct 03 13:30:42 crc kubenswrapper[4962]: E1003 13:30:42.231965 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 13:30:53 crc kubenswrapper[4962]: I1003 13:30:53.227442 4962 scope.go:117] "RemoveContainer" containerID="435ccd24f7e61eca9f3a50b20a45f04e1ed70ff0737ebdb93094248169075ae3"
Oct 03 13:30:53 crc kubenswrapper[4962]: E1003 13:30:53.228149 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 13:30:59 crc kubenswrapper[4962]: I1003 13:30:59.100408 4962 scope.go:117] "RemoveContainer" containerID="f9d7a5e0fa6177245020e0234f9dc8d850b511ad38518ff97edcd25405891500"
Oct 03 13:31:08 crc kubenswrapper[4962]: I1003 13:31:08.227389 4962 scope.go:117] "RemoveContainer" containerID="435ccd24f7e61eca9f3a50b20a45f04e1ed70ff0737ebdb93094248169075ae3"
Oct 03 13:31:08 crc kubenswrapper[4962]: E1003 13:31:08.230517 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 13:31:16 crc kubenswrapper[4962]: I1003 13:31:16.239567 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5qg67"]
Oct 03 13:31:16 crc kubenswrapper[4962]: E1003 13:31:16.240921 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4203666e-afcb-4f11-95ea-10adb4fd4940" containerName="collect-profiles"
Oct 03 13:31:16 crc kubenswrapper[4962]: I1003 13:31:16.240953 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="4203666e-afcb-4f11-95ea-10adb4fd4940" containerName="collect-profiles"
Oct 03 13:31:16 crc kubenswrapper[4962]: I1003 13:31:16.241373 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="4203666e-afcb-4f11-95ea-10adb4fd4940" containerName="collect-profiles"
Oct 03 13:31:16 crc kubenswrapper[4962]: I1003 13:31:16.244046 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5qg67"
Oct 03 13:31:16 crc kubenswrapper[4962]: I1003 13:31:16.257092 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5qg67"]
Oct 03 13:31:16 crc kubenswrapper[4962]: I1003 13:31:16.363250 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3841790d-8753-42d8-81d7-e5673fcb1afa-catalog-content\") pod \"certified-operators-5qg67\" (UID: \"3841790d-8753-42d8-81d7-e5673fcb1afa\") " pod="openshift-marketplace/certified-operators-5qg67"
Oct 03 13:31:16 crc kubenswrapper[4962]: I1003 13:31:16.363605 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxm8h\" (UniqueName: \"kubernetes.io/projected/3841790d-8753-42d8-81d7-e5673fcb1afa-kube-api-access-xxm8h\") pod \"certified-operators-5qg67\" (UID: \"3841790d-8753-42d8-81d7-e5673fcb1afa\") " pod="openshift-marketplace/certified-operators-5qg67"
Oct 03 13:31:16 crc kubenswrapper[4962]: I1003 13:31:16.363829 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3841790d-8753-42d8-81d7-e5673fcb1afa-utilities\") pod \"certified-operators-5qg67\" (UID: \"3841790d-8753-42d8-81d7-e5673fcb1afa\") " pod="openshift-marketplace/certified-operators-5qg67"
Oct 03 13:31:16 crc kubenswrapper[4962]: I1003 13:31:16.465442 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3841790d-8753-42d8-81d7-e5673fcb1afa-catalog-content\") pod \"certified-operators-5qg67\" (UID: \"3841790d-8753-42d8-81d7-e5673fcb1afa\") " pod="openshift-marketplace/certified-operators-5qg67"
Oct 03 13:31:16 crc
kubenswrapper[4962]: I1003 13:31:16.465548 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxm8h\" (UniqueName: \"kubernetes.io/projected/3841790d-8753-42d8-81d7-e5673fcb1afa-kube-api-access-xxm8h\") pod \"certified-operators-5qg67\" (UID: \"3841790d-8753-42d8-81d7-e5673fcb1afa\") " pod="openshift-marketplace/certified-operators-5qg67" Oct 03 13:31:16 crc kubenswrapper[4962]: I1003 13:31:16.465580 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3841790d-8753-42d8-81d7-e5673fcb1afa-utilities\") pod \"certified-operators-5qg67\" (UID: \"3841790d-8753-42d8-81d7-e5673fcb1afa\") " pod="openshift-marketplace/certified-operators-5qg67" Oct 03 13:31:16 crc kubenswrapper[4962]: I1003 13:31:16.466109 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3841790d-8753-42d8-81d7-e5673fcb1afa-catalog-content\") pod \"certified-operators-5qg67\" (UID: \"3841790d-8753-42d8-81d7-e5673fcb1afa\") " pod="openshift-marketplace/certified-operators-5qg67" Oct 03 13:31:16 crc kubenswrapper[4962]: I1003 13:31:16.466178 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3841790d-8753-42d8-81d7-e5673fcb1afa-utilities\") pod \"certified-operators-5qg67\" (UID: \"3841790d-8753-42d8-81d7-e5673fcb1afa\") " pod="openshift-marketplace/certified-operators-5qg67" Oct 03 13:31:16 crc kubenswrapper[4962]: I1003 13:31:16.490593 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxm8h\" (UniqueName: \"kubernetes.io/projected/3841790d-8753-42d8-81d7-e5673fcb1afa-kube-api-access-xxm8h\") pod \"certified-operators-5qg67\" (UID: \"3841790d-8753-42d8-81d7-e5673fcb1afa\") " pod="openshift-marketplace/certified-operators-5qg67" Oct 03 13:31:16 crc kubenswrapper[4962]: I1003 13:31:16.572243 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5qg67" Oct 03 13:31:17 crc kubenswrapper[4962]: I1003 13:31:17.064508 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5qg67"] Oct 03 13:31:17 crc kubenswrapper[4962]: I1003 13:31:17.185084 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5qg67" event={"ID":"3841790d-8753-42d8-81d7-e5673fcb1afa","Type":"ContainerStarted","Data":"2c8977c248e35e77ad8ebf6dffd6ea24db3367b27fc792bc7a08a11f58aac219"} Oct 03 13:31:18 crc kubenswrapper[4962]: I1003 13:31:18.195446 4962 generic.go:334] "Generic (PLEG): container finished" podID="3841790d-8753-42d8-81d7-e5673fcb1afa" containerID="257f95c71c743853163ad3657719e84241d4e6c517ffbf36017695f9d6292809" exitCode=0 Oct 03 13:31:18 crc kubenswrapper[4962]: I1003 13:31:18.195712 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5qg67" event={"ID":"3841790d-8753-42d8-81d7-e5673fcb1afa","Type":"ContainerDied","Data":"257f95c71c743853163ad3657719e84241d4e6c517ffbf36017695f9d6292809"} Oct 03 13:31:20 crc kubenswrapper[4962]: I1003 13:31:20.212862 4962 generic.go:334] "Generic (PLEG): container finished" podID="3841790d-8753-42d8-81d7-e5673fcb1afa" containerID="74c7d6a445c8513e25da9a775d5e705330397c30f4630ac68de9570dc58fcc06" exitCode=0 Oct 03 13:31:20 crc kubenswrapper[4962]: I1003 13:31:20.213151 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5qg67" event={"ID":"3841790d-8753-42d8-81d7-e5673fcb1afa","Type":"ContainerDied","Data":"74c7d6a445c8513e25da9a775d5e705330397c30f4630ac68de9570dc58fcc06"} Oct 03 13:31:21 crc kubenswrapper[4962]: I1003 13:31:21.227132 4962 scope.go:117] "RemoveContainer" containerID="435ccd24f7e61eca9f3a50b20a45f04e1ed70ff0737ebdb93094248169075ae3" Oct 03 13:31:21 crc kubenswrapper[4962]: E1003 13:31:21.227548 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:31:22 crc kubenswrapper[4962]: I1003 13:31:22.235324 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5qg67" event={"ID":"3841790d-8753-42d8-81d7-e5673fcb1afa","Type":"ContainerStarted","Data":"f37ac8ce3691ebefa3100826f23b260f00b3c89d74562d9cc151320559a71683"} Oct 03 13:31:23 crc kubenswrapper[4962]: I1003 13:31:23.260697 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5qg67" podStartSLOduration=3.754372585 podStartE2EDuration="7.260678315s" podCreationTimestamp="2025-10-03 13:31:16 +0000 UTC" firstStartedPulling="2025-10-03 13:31:18.197721346 +0000 UTC m=+2486.601619181" lastFinishedPulling="2025-10-03 13:31:21.704027076 +0000 UTC m=+2490.107924911" observedRunningTime="2025-10-03 13:31:23.258771355 +0000 UTC m=+2491.662669210" watchObservedRunningTime="2025-10-03 13:31:23.260678315 +0000 UTC m=+2491.664576150" Oct 03 13:31:26 crc kubenswrapper[4962]: I1003 13:31:26.572929 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5qg67" Oct 03 
13:31:26 crc kubenswrapper[4962]: I1003 13:31:26.573366 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5qg67" Oct 03 13:31:26 crc kubenswrapper[4962]: I1003 13:31:26.613359 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5qg67" Oct 03 13:31:27 crc kubenswrapper[4962]: I1003 13:31:27.318002 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5qg67" Oct 03 13:31:27 crc kubenswrapper[4962]: I1003 13:31:27.359370 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5qg67"] Oct 03 13:31:29 crc kubenswrapper[4962]: I1003 13:31:29.289163 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5qg67" podUID="3841790d-8753-42d8-81d7-e5673fcb1afa" containerName="registry-server" containerID="cri-o://f37ac8ce3691ebefa3100826f23b260f00b3c89d74562d9cc151320559a71683" gracePeriod=2 Oct 03 13:31:29 crc kubenswrapper[4962]: I1003 13:31:29.736230 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5qg67" Oct 03 13:31:29 crc kubenswrapper[4962]: I1003 13:31:29.845097 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3841790d-8753-42d8-81d7-e5673fcb1afa-utilities\") pod \"3841790d-8753-42d8-81d7-e5673fcb1afa\" (UID: \"3841790d-8753-42d8-81d7-e5673fcb1afa\") " Oct 03 13:31:29 crc kubenswrapper[4962]: I1003 13:31:29.845137 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxm8h\" (UniqueName: \"kubernetes.io/projected/3841790d-8753-42d8-81d7-e5673fcb1afa-kube-api-access-xxm8h\") pod \"3841790d-8753-42d8-81d7-e5673fcb1afa\" (UID: \"3841790d-8753-42d8-81d7-e5673fcb1afa\") " Oct 03 13:31:29 crc kubenswrapper[4962]: I1003 13:31:29.845207 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3841790d-8753-42d8-81d7-e5673fcb1afa-catalog-content\") pod \"3841790d-8753-42d8-81d7-e5673fcb1afa\" (UID: \"3841790d-8753-42d8-81d7-e5673fcb1afa\") " Oct 03 13:31:29 crc kubenswrapper[4962]: I1003 13:31:29.846604 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3841790d-8753-42d8-81d7-e5673fcb1afa-utilities" (OuterVolumeSpecName: "utilities") pod "3841790d-8753-42d8-81d7-e5673fcb1afa" (UID: "3841790d-8753-42d8-81d7-e5673fcb1afa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:31:29 crc kubenswrapper[4962]: I1003 13:31:29.853360 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3841790d-8753-42d8-81d7-e5673fcb1afa-kube-api-access-xxm8h" (OuterVolumeSpecName: "kube-api-access-xxm8h") pod "3841790d-8753-42d8-81d7-e5673fcb1afa" (UID: "3841790d-8753-42d8-81d7-e5673fcb1afa"). InnerVolumeSpecName "kube-api-access-xxm8h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:31:29 crc kubenswrapper[4962]: I1003 13:31:29.891488 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3841790d-8753-42d8-81d7-e5673fcb1afa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3841790d-8753-42d8-81d7-e5673fcb1afa" (UID: "3841790d-8753-42d8-81d7-e5673fcb1afa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:31:29 crc kubenswrapper[4962]: I1003 13:31:29.947188 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3841790d-8753-42d8-81d7-e5673fcb1afa-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 13:31:29 crc kubenswrapper[4962]: I1003 13:31:29.947229 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3841790d-8753-42d8-81d7-e5673fcb1afa-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 13:31:29 crc kubenswrapper[4962]: I1003 13:31:29.947243 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxm8h\" (UniqueName: \"kubernetes.io/projected/3841790d-8753-42d8-81d7-e5673fcb1afa-kube-api-access-xxm8h\") on node \"crc\" DevicePath \"\"" Oct 03 13:31:30 crc kubenswrapper[4962]: I1003 13:31:30.297838 4962 generic.go:334] "Generic (PLEG): container finished" podID="3841790d-8753-42d8-81d7-e5673fcb1afa" containerID="f37ac8ce3691ebefa3100826f23b260f00b3c89d74562d9cc151320559a71683" exitCode=0 Oct 03 13:31:30 crc kubenswrapper[4962]: I1003 13:31:30.297874 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5qg67" event={"ID":"3841790d-8753-42d8-81d7-e5673fcb1afa","Type":"ContainerDied","Data":"f37ac8ce3691ebefa3100826f23b260f00b3c89d74562d9cc151320559a71683"} Oct 03 13:31:30 crc kubenswrapper[4962]: I1003 13:31:30.297914 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5qg67" event={"ID":"3841790d-8753-42d8-81d7-e5673fcb1afa","Type":"ContainerDied","Data":"2c8977c248e35e77ad8ebf6dffd6ea24db3367b27fc792bc7a08a11f58aac219"} Oct 03 13:31:30 crc kubenswrapper[4962]: I1003 13:31:30.297923 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5qg67" Oct 03 13:31:30 crc kubenswrapper[4962]: I1003 13:31:30.297936 4962 scope.go:117] "RemoveContainer" containerID="f37ac8ce3691ebefa3100826f23b260f00b3c89d74562d9cc151320559a71683" Oct 03 13:31:30 crc kubenswrapper[4962]: I1003 13:31:30.320985 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5qg67"] Oct 03 13:31:30 crc kubenswrapper[4962]: I1003 13:31:30.326335 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5qg67"] Oct 03 13:31:30 crc kubenswrapper[4962]: I1003 13:31:30.328891 4962 scope.go:117] "RemoveContainer" containerID="74c7d6a445c8513e25da9a775d5e705330397c30f4630ac68de9570dc58fcc06" Oct 03 13:31:30 crc kubenswrapper[4962]: I1003 13:31:30.353113 4962 scope.go:117] "RemoveContainer" containerID="257f95c71c743853163ad3657719e84241d4e6c517ffbf36017695f9d6292809" Oct 03 13:31:30 crc kubenswrapper[4962]: I1003 13:31:30.374726 4962 scope.go:117] "RemoveContainer" containerID="f37ac8ce3691ebefa3100826f23b260f00b3c89d74562d9cc151320559a71683" Oct 03 13:31:30 crc kubenswrapper[4962]: E1003 13:31:30.375182 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f37ac8ce3691ebefa3100826f23b260f00b3c89d74562d9cc151320559a71683\": container with ID starting with f37ac8ce3691ebefa3100826f23b260f00b3c89d74562d9cc151320559a71683 not found: ID does not exist" containerID="f37ac8ce3691ebefa3100826f23b260f00b3c89d74562d9cc151320559a71683" Oct 03 13:31:30 crc kubenswrapper[4962]: I1003 13:31:30.375252 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f37ac8ce3691ebefa3100826f23b260f00b3c89d74562d9cc151320559a71683"} err="failed to get container status \"f37ac8ce3691ebefa3100826f23b260f00b3c89d74562d9cc151320559a71683\": rpc error: code = NotFound desc = could not find container \"f37ac8ce3691ebefa3100826f23b260f00b3c89d74562d9cc151320559a71683\": container with ID starting with f37ac8ce3691ebefa3100826f23b260f00b3c89d74562d9cc151320559a71683 not found: ID does not exist" Oct 03 13:31:30 crc kubenswrapper[4962]: I1003 13:31:30.375282 4962 scope.go:117] "RemoveContainer" containerID="74c7d6a445c8513e25da9a775d5e705330397c30f4630ac68de9570dc58fcc06" Oct 03 13:31:30 crc kubenswrapper[4962]: E1003 13:31:30.375831 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74c7d6a445c8513e25da9a775d5e705330397c30f4630ac68de9570dc58fcc06\": container with ID starting with 74c7d6a445c8513e25da9a775d5e705330397c30f4630ac68de9570dc58fcc06 not found: ID does not exist" containerID="74c7d6a445c8513e25da9a775d5e705330397c30f4630ac68de9570dc58fcc06" Oct 03 13:31:30 crc kubenswrapper[4962]: I1003 13:31:30.375865 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74c7d6a445c8513e25da9a775d5e705330397c30f4630ac68de9570dc58fcc06"} err="failed to get container status \"74c7d6a445c8513e25da9a775d5e705330397c30f4630ac68de9570dc58fcc06\": rpc error: code = NotFound desc = could not find container \"74c7d6a445c8513e25da9a775d5e705330397c30f4630ac68de9570dc58fcc06\": container with ID starting with 74c7d6a445c8513e25da9a775d5e705330397c30f4630ac68de9570dc58fcc06 not found: ID does not exist" Oct 03 13:31:30 crc kubenswrapper[4962]: I1003 13:31:30.375883 4962 scope.go:117] "RemoveContainer" 
containerID="257f95c71c743853163ad3657719e84241d4e6c517ffbf36017695f9d6292809" Oct 03 13:31:30 crc kubenswrapper[4962]: E1003 13:31:30.376713 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"257f95c71c743853163ad3657719e84241d4e6c517ffbf36017695f9d6292809\": container with ID starting with 257f95c71c743853163ad3657719e84241d4e6c517ffbf36017695f9d6292809 not found: ID does not exist" containerID="257f95c71c743853163ad3657719e84241d4e6c517ffbf36017695f9d6292809" Oct 03 13:31:30 crc kubenswrapper[4962]: I1003 13:31:30.376737 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"257f95c71c743853163ad3657719e84241d4e6c517ffbf36017695f9d6292809"} err="failed to get container status \"257f95c71c743853163ad3657719e84241d4e6c517ffbf36017695f9d6292809\": rpc error: code = NotFound desc = could not find container \"257f95c71c743853163ad3657719e84241d4e6c517ffbf36017695f9d6292809\": container with ID starting with 257f95c71c743853163ad3657719e84241d4e6c517ffbf36017695f9d6292809 not found: ID does not exist" Oct 03 13:31:32 crc kubenswrapper[4962]: I1003 13:31:32.234897 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3841790d-8753-42d8-81d7-e5673fcb1afa" path="/var/lib/kubelet/pods/3841790d-8753-42d8-81d7-e5673fcb1afa/volumes" Oct 03 13:31:33 crc kubenswrapper[4962]: I1003 13:31:33.227149 4962 scope.go:117] "RemoveContainer" containerID="435ccd24f7e61eca9f3a50b20a45f04e1ed70ff0737ebdb93094248169075ae3" Oct 03 13:31:33 crc kubenswrapper[4962]: E1003 13:31:33.227433 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:31:44 crc kubenswrapper[4962]: I1003 13:31:44.227934 4962 scope.go:117] "RemoveContainer" containerID="435ccd24f7e61eca9f3a50b20a45f04e1ed70ff0737ebdb93094248169075ae3" Oct 03 13:31:44 crc kubenswrapper[4962]: E1003 13:31:44.229073 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:31:55 crc kubenswrapper[4962]: I1003 13:31:55.226756 4962 scope.go:117] "RemoveContainer" containerID="435ccd24f7e61eca9f3a50b20a45f04e1ed70ff0737ebdb93094248169075ae3" Oct 03 13:31:56 crc kubenswrapper[4962]: I1003 13:31:56.529811 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerStarted","Data":"e5601307ec7160d1684f440881caedd8dbb40f586b37a314215dcb3945b363f2"} Oct 03 13:31:59 crc kubenswrapper[4962]: I1003 13:31:59.148721 4962 scope.go:117] "RemoveContainer" containerID="64ee9cfe62a27f42499f16eb2220f56ee158f0fdaaccf7edb78542dbbff76843" Oct 03 13:31:59 crc kubenswrapper[4962]: I1003 13:31:59.167145 4962 scope.go:117] "RemoveContainer" 
containerID="f300d881e56fe58382dd259f46f5580db864b608b15deab74f6e7cff8b07784d" Oct 03 13:31:59 crc kubenswrapper[4962]: I1003 13:31:59.193759 4962 scope.go:117] "RemoveContainer" containerID="5defaf3f1094fc059ff340c199e3021058b624b4855ec1b532b1657ca60ea28a" Oct 03 13:34:20 crc kubenswrapper[4962]: I1003 13:34:19.999406 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-p99zb"] Oct 03 13:34:20 crc kubenswrapper[4962]: E1003 13:34:20.000738 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3841790d-8753-42d8-81d7-e5673fcb1afa" containerName="extract-content" Oct 03 13:34:20 crc kubenswrapper[4962]: I1003 13:34:20.000758 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="3841790d-8753-42d8-81d7-e5673fcb1afa" containerName="extract-content" Oct 03 13:34:20 crc kubenswrapper[4962]: E1003 13:34:20.000774 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3841790d-8753-42d8-81d7-e5673fcb1afa" containerName="registry-server" Oct 03 13:34:20 crc kubenswrapper[4962]: I1003 13:34:20.000782 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="3841790d-8753-42d8-81d7-e5673fcb1afa" containerName="registry-server" Oct 03 13:34:20 crc kubenswrapper[4962]: E1003 13:34:20.000798 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3841790d-8753-42d8-81d7-e5673fcb1afa" containerName="extract-utilities" Oct 03 13:34:20 crc kubenswrapper[4962]: I1003 13:34:20.000806 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="3841790d-8753-42d8-81d7-e5673fcb1afa" containerName="extract-utilities" Oct 03 13:34:20 crc kubenswrapper[4962]: I1003 13:34:20.001116 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="3841790d-8753-42d8-81d7-e5673fcb1afa" containerName="registry-server" Oct 03 13:34:20 crc kubenswrapper[4962]: I1003 13:34:20.003491 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p99zb" Oct 03 13:34:20 crc kubenswrapper[4962]: I1003 13:34:20.024590 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p99zb"] Oct 03 13:34:20 crc kubenswrapper[4962]: I1003 13:34:20.167494 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2994\" (UniqueName: \"kubernetes.io/projected/2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772-kube-api-access-n2994\") pod \"redhat-marketplace-p99zb\" (UID: \"2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772\") " pod="openshift-marketplace/redhat-marketplace-p99zb" Oct 03 13:34:20 crc kubenswrapper[4962]: I1003 13:34:20.167568 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772-utilities\") pod \"redhat-marketplace-p99zb\" (UID: \"2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772\") " pod="openshift-marketplace/redhat-marketplace-p99zb" Oct 03 13:34:20 crc kubenswrapper[4962]: I1003 13:34:20.167706 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772-catalog-content\") pod \"redhat-marketplace-p99zb\" (UID: \"2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772\") " pod="openshift-marketplace/redhat-marketplace-p99zb" Oct 03 13:34:20 crc kubenswrapper[4962]: I1003 13:34:20.268900 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2994\" (UniqueName: \"kubernetes.io/projected/2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772-kube-api-access-n2994\") pod \"redhat-marketplace-p99zb\" (UID: \"2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772\") " pod="openshift-marketplace/redhat-marketplace-p99zb" Oct 03 13:34:20 crc kubenswrapper[4962]: I1003 13:34:20.268974 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772-utilities\") pod \"redhat-marketplace-p99zb\" (UID: \"2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772\") " pod="openshift-marketplace/redhat-marketplace-p99zb" Oct 03 13:34:20 crc kubenswrapper[4962]: I1003 13:34:20.269038 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772-catalog-content\") pod \"redhat-marketplace-p99zb\" (UID: \"2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772\") " pod="openshift-marketplace/redhat-marketplace-p99zb" Oct 03 13:34:20 crc kubenswrapper[4962]: I1003 13:34:20.269504 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772-catalog-content\") pod \"redhat-marketplace-p99zb\" (UID: \"2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772\") " pod="openshift-marketplace/redhat-marketplace-p99zb" Oct 03 13:34:20 crc kubenswrapper[4962]: I1003 13:34:20.269574 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772-utilities\") pod \"redhat-marketplace-p99zb\" (UID: \"2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772\") " pod="openshift-marketplace/redhat-marketplace-p99zb" Oct 03 13:34:20 crc kubenswrapper[4962]: I1003 13:34:20.290507 4962 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-n2994\" (UniqueName: \"kubernetes.io/projected/2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772-kube-api-access-n2994\") pod \"redhat-marketplace-p99zb\" (UID: \"2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772\") " pod="openshift-marketplace/redhat-marketplace-p99zb" Oct 03 13:34:20 crc kubenswrapper[4962]: I1003 13:34:20.332431 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p99zb" Oct 03 13:34:20 crc kubenswrapper[4962]: I1003 13:34:20.774187 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p99zb"] Oct 03 13:34:21 crc kubenswrapper[4962]: I1003 13:34:21.660111 4962 generic.go:334] "Generic (PLEG): container finished" podID="2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772" containerID="ac57e140e015666d4bb995eb6b47fffa1940fe3b84057274e74ff66415e4dc73" exitCode=0 Oct 03 13:34:21 crc kubenswrapper[4962]: I1003 13:34:21.660160 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p99zb" event={"ID":"2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772","Type":"ContainerDied","Data":"ac57e140e015666d4bb995eb6b47fffa1940fe3b84057274e74ff66415e4dc73"} Oct 03 13:34:21 crc kubenswrapper[4962]: I1003 13:34:21.660190 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p99zb" event={"ID":"2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772","Type":"ContainerStarted","Data":"8427dd391897506e926d3934a1513ca89c6be6324cbabdff3d16df3a39a3e4fb"} Oct 03 13:34:21 crc kubenswrapper[4962]: I1003 13:34:21.662783 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 13:34:22 crc kubenswrapper[4962]: I1003 13:34:22.671442 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p99zb" event={"ID":"2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772","Type":"ContainerStarted","Data":"cfc06832456b5262599996a4ad050e1aebbf8b738c4390dfebbc96968013d12f"} Oct 03 13:34:23 crc kubenswrapper[4962]: I1003 13:34:23.682153 4962 generic.go:334] "Generic (PLEG): container finished" podID="2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772" containerID="cfc06832456b5262599996a4ad050e1aebbf8b738c4390dfebbc96968013d12f" exitCode=0 Oct 03 13:34:23 crc kubenswrapper[4962]: I1003 13:34:23.682255 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p99zb" event={"ID":"2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772","Type":"ContainerDied","Data":"cfc06832456b5262599996a4ad050e1aebbf8b738c4390dfebbc96968013d12f"} Oct 03 13:34:24 crc kubenswrapper[4962]: I1003 13:34:24.659764 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 13:34:24 crc kubenswrapper[4962]: I1003 13:34:24.659854 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 13:34:24 crc kubenswrapper[4962]: I1003 13:34:24.695544 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p99zb" 
event={"ID":"2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772","Type":"ContainerStarted","Data":"4d08424544de9347c37ae2bdcadd1555f2a9160c682b7910a097d9891cb157be"} Oct 03 13:34:24 crc kubenswrapper[4962]: I1003 13:34:24.711948 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-p99zb" podStartSLOduration=3.285973088 podStartE2EDuration="5.711926456s" podCreationTimestamp="2025-10-03 13:34:19 +0000 UTC" firstStartedPulling="2025-10-03 13:34:21.662395281 +0000 UTC m=+2670.066293136" lastFinishedPulling="2025-10-03 13:34:24.088348669 +0000 UTC m=+2672.492246504" observedRunningTime="2025-10-03 13:34:24.710413737 +0000 UTC m=+2673.114311572" watchObservedRunningTime="2025-10-03 13:34:24.711926456 +0000 UTC m=+2673.115824291" Oct 03 13:34:30 crc kubenswrapper[4962]: I1003 13:34:30.333090 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-p99zb" Oct 03 13:34:30 crc kubenswrapper[4962]: I1003 13:34:30.333558 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-p99zb" Oct 03 13:34:30 crc kubenswrapper[4962]: I1003 13:34:30.384508 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-p99zb" Oct 03 13:34:30 crc kubenswrapper[4962]: I1003 13:34:30.789593 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-p99zb" Oct 03 13:34:30 crc kubenswrapper[4962]: I1003 13:34:30.846665 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p99zb"] Oct 03 13:34:32 crc kubenswrapper[4962]: I1003 13:34:32.759333 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-p99zb" podUID="2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772" containerName="registry-server" containerID="cri-o://4d08424544de9347c37ae2bdcadd1555f2a9160c682b7910a097d9891cb157be" gracePeriod=2 Oct 03 13:34:33 crc kubenswrapper[4962]: I1003 13:34:33.174429 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p99zb" Oct 03 13:34:33 crc kubenswrapper[4962]: I1003 13:34:33.264038 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772-utilities\") pod \"2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772\" (UID: \"2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772\") " Oct 03 13:34:33 crc kubenswrapper[4962]: I1003 13:34:33.264108 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2994\" (UniqueName: \"kubernetes.io/projected/2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772-kube-api-access-n2994\") pod \"2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772\" (UID: \"2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772\") " Oct 03 13:34:33 crc kubenswrapper[4962]: I1003 13:34:33.264181 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772-catalog-content\") pod \"2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772\" (UID: \"2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772\") " Oct 03 13:34:33 crc kubenswrapper[4962]: I1003 13:34:33.265215 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772-utilities" (OuterVolumeSpecName: "utilities") pod "2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772" (UID: "2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:34:33 crc kubenswrapper[4962]: I1003 13:34:33.274256 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772-kube-api-access-n2994" (OuterVolumeSpecName: "kube-api-access-n2994") pod "2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772" (UID: "2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772"). InnerVolumeSpecName "kube-api-access-n2994". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:34:33 crc kubenswrapper[4962]: I1003 13:34:33.278029 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772" (UID: "2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:34:33 crc kubenswrapper[4962]: I1003 13:34:33.366106 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 13:34:33 crc kubenswrapper[4962]: I1003 13:34:33.366146 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2994\" (UniqueName: \"kubernetes.io/projected/2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772-kube-api-access-n2994\") on node \"crc\" DevicePath \"\"" Oct 03 13:34:33 crc kubenswrapper[4962]: I1003 13:34:33.366155 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 13:34:33 crc kubenswrapper[4962]: I1003 13:34:33.769833 4962 generic.go:334] "Generic (PLEG): container finished" podID="2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772" containerID="4d08424544de9347c37ae2bdcadd1555f2a9160c682b7910a097d9891cb157be" exitCode=0 Oct 03 13:34:33 crc kubenswrapper[4962]: I1003 13:34:33.769896 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p99zb" event={"ID":"2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772","Type":"ContainerDied","Data":"4d08424544de9347c37ae2bdcadd1555f2a9160c682b7910a097d9891cb157be"} Oct 03 13:34:33 crc kubenswrapper[4962]: I1003 13:34:33.769949 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p99zb" event={"ID":"2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772","Type":"ContainerDied","Data":"8427dd391897506e926d3934a1513ca89c6be6324cbabdff3d16df3a39a3e4fb"} Oct 03 13:34:33 crc kubenswrapper[4962]: I1003 13:34:33.769981 4962 scope.go:117] "RemoveContainer" containerID="4d08424544de9347c37ae2bdcadd1555f2a9160c682b7910a097d9891cb157be" Oct 03 13:34:33 crc kubenswrapper[4962]: I1003 13:34:33.770007 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p99zb" Oct 03 13:34:33 crc kubenswrapper[4962]: I1003 13:34:33.813785 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p99zb"] Oct 03 13:34:33 crc kubenswrapper[4962]: I1003 13:34:33.814024 4962 scope.go:117] "RemoveContainer" containerID="cfc06832456b5262599996a4ad050e1aebbf8b738c4390dfebbc96968013d12f" Oct 03 13:34:33 crc kubenswrapper[4962]: I1003 13:34:33.821074 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-p99zb"] Oct 03 13:34:33 crc kubenswrapper[4962]: I1003 13:34:33.865021 4962 scope.go:117] "RemoveContainer" containerID="ac57e140e015666d4bb995eb6b47fffa1940fe3b84057274e74ff66415e4dc73" Oct 03 13:34:33 crc kubenswrapper[4962]: I1003 13:34:33.894538 4962 scope.go:117] "RemoveContainer" containerID="4d08424544de9347c37ae2bdcadd1555f2a9160c682b7910a097d9891cb157be" Oct 03 13:34:33 crc kubenswrapper[4962]: E1003 13:34:33.895119 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d08424544de9347c37ae2bdcadd1555f2a9160c682b7910a097d9891cb157be\": container with ID starting with 4d08424544de9347c37ae2bdcadd1555f2a9160c682b7910a097d9891cb157be not found: ID does not exist" containerID="4d08424544de9347c37ae2bdcadd1555f2a9160c682b7910a097d9891cb157be" Oct 03 13:34:33 crc kubenswrapper[4962]: I1003 13:34:33.895155 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d08424544de9347c37ae2bdcadd1555f2a9160c682b7910a097d9891cb157be"} err="failed to get container status \"4d08424544de9347c37ae2bdcadd1555f2a9160c682b7910a097d9891cb157be\": rpc error: code = NotFound desc = could not find container \"4d08424544de9347c37ae2bdcadd1555f2a9160c682b7910a097d9891cb157be\": container with ID starting with 4d08424544de9347c37ae2bdcadd1555f2a9160c682b7910a097d9891cb157be not found: ID does not exist" Oct 03 13:34:33 crc kubenswrapper[4962]: I1003 13:34:33.895178 4962 scope.go:117] "RemoveContainer" containerID="cfc06832456b5262599996a4ad050e1aebbf8b738c4390dfebbc96968013d12f" Oct 03 13:34:33 crc kubenswrapper[4962]: E1003 13:34:33.895623 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfc06832456b5262599996a4ad050e1aebbf8b738c4390dfebbc96968013d12f\": container with ID starting with cfc06832456b5262599996a4ad050e1aebbf8b738c4390dfebbc96968013d12f not found: ID does not exist" containerID="cfc06832456b5262599996a4ad050e1aebbf8b738c4390dfebbc96968013d12f" Oct 03 13:34:33 crc kubenswrapper[4962]: I1003 13:34:33.895719 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfc06832456b5262599996a4ad050e1aebbf8b738c4390dfebbc96968013d12f"} err="failed to get container status \"cfc06832456b5262599996a4ad050e1aebbf8b738c4390dfebbc96968013d12f\": rpc error: code = NotFound desc = could not find container \"cfc06832456b5262599996a4ad050e1aebbf8b738c4390dfebbc96968013d12f\": container with ID starting with cfc06832456b5262599996a4ad050e1aebbf8b738c4390dfebbc96968013d12f not found: ID does not exist" Oct 03 13:34:33 crc kubenswrapper[4962]: I1003 13:34:33.895778 4962 scope.go:117] "RemoveContainer" containerID="ac57e140e015666d4bb995eb6b47fffa1940fe3b84057274e74ff66415e4dc73" Oct 03 13:34:33 crc kubenswrapper[4962]: E1003 13:34:33.896162 4962 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"ac57e140e015666d4bb995eb6b47fffa1940fe3b84057274e74ff66415e4dc73\": container with ID starting with ac57e140e015666d4bb995eb6b47fffa1940fe3b84057274e74ff66415e4dc73 not found: ID does not exist" containerID="ac57e140e015666d4bb995eb6b47fffa1940fe3b84057274e74ff66415e4dc73" Oct 03 13:34:33 crc kubenswrapper[4962]: I1003 13:34:33.896206 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac57e140e015666d4bb995eb6b47fffa1940fe3b84057274e74ff66415e4dc73"} err="failed to get container status \"ac57e140e015666d4bb995eb6b47fffa1940fe3b84057274e74ff66415e4dc73\": rpc error: code = NotFound desc = could not find container \"ac57e140e015666d4bb995eb6b47fffa1940fe3b84057274e74ff66415e4dc73\": container with ID starting with ac57e140e015666d4bb995eb6b47fffa1940fe3b84057274e74ff66415e4dc73 not found: ID does not exist" Oct 03 13:34:34 crc kubenswrapper[4962]: I1003 13:34:34.239870 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772" path="/var/lib/kubelet/pods/2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772/volumes" Oct 03 13:34:54 crc kubenswrapper[4962]: I1003 13:34:54.661189 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 13:34:54 crc kubenswrapper[4962]: I1003 13:34:54.661873 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 13:35:24 crc kubenswrapper[4962]: I1003 13:35:24.660188 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 13:35:24 crc kubenswrapper[4962]: I1003 13:35:24.661161 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 13:35:24 crc kubenswrapper[4962]: I1003 13:35:24.661259 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-46vck" Oct 03 13:35:24 crc kubenswrapper[4962]: I1003 13:35:24.662720 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e5601307ec7160d1684f440881caedd8dbb40f586b37a314215dcb3945b363f2"} pod="openshift-machine-config-operator/machine-config-daemon-46vck" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 13:35:24 crc kubenswrapper[4962]: I1003 13:35:24.663780 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-46vck" 
podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" containerID="cri-o://e5601307ec7160d1684f440881caedd8dbb40f586b37a314215dcb3945b363f2" gracePeriod=600 Oct 03 13:35:25 crc kubenswrapper[4962]: I1003 13:35:25.205132 4962 generic.go:334] "Generic (PLEG): container finished" podID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerID="e5601307ec7160d1684f440881caedd8dbb40f586b37a314215dcb3945b363f2" exitCode=0 Oct 03 13:35:25 crc kubenswrapper[4962]: I1003 13:35:25.205235 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerDied","Data":"e5601307ec7160d1684f440881caedd8dbb40f586b37a314215dcb3945b363f2"} Oct 03 13:35:25 crc kubenswrapper[4962]: I1003 13:35:25.205533 4962 scope.go:117] "RemoveContainer" containerID="435ccd24f7e61eca9f3a50b20a45f04e1ed70ff0737ebdb93094248169075ae3" Oct 03 13:35:26 crc kubenswrapper[4962]: I1003 13:35:26.224777 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerStarted","Data":"5f1b71cb09022d1fd0c2465c42529dafb255f5a59b613a1eef7c69bcd687fd99"} Oct 03 13:37:54 crc kubenswrapper[4962]: I1003 13:37:54.664117 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 13:37:54 crc kubenswrapper[4962]: I1003 13:37:54.665606 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 13:38:23 crc kubenswrapper[4962]: I1003 13:38:23.414459 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6tl4m"] Oct 03 13:38:23 crc kubenswrapper[4962]: E1003 13:38:23.415353 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772" containerName="extract-content" Oct 03 13:38:23 crc kubenswrapper[4962]: I1003 13:38:23.415371 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772" containerName="extract-content" Oct 03 13:38:23 crc kubenswrapper[4962]: E1003 13:38:23.415395 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772" containerName="extract-utilities" Oct 03 13:38:23 crc kubenswrapper[4962]: I1003 13:38:23.415404 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772" containerName="extract-utilities" Oct 03 13:38:23 crc kubenswrapper[4962]: E1003 13:38:23.415428 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772" containerName="registry-server" Oct 03 13:38:23 crc kubenswrapper[4962]: I1003 13:38:23.415435 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772" containerName="registry-server" Oct 03 13:38:23 crc kubenswrapper[4962]: I1003 13:38:23.415612 4962 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="2a5366ae-b9ae-4dc7-aa9b-c8136a8d1772" containerName="registry-server" Oct 03 13:38:23 crc kubenswrapper[4962]: I1003 13:38:23.417307 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6tl4m" Oct 03 13:38:23 crc kubenswrapper[4962]: I1003 13:38:23.420516 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6tl4m"] Oct 03 13:38:23 crc kubenswrapper[4962]: I1003 13:38:23.592327 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d5b951e-acc6-407e-9992-58fd92bf1648-catalog-content\") pod \"community-operators-6tl4m\" (UID: \"5d5b951e-acc6-407e-9992-58fd92bf1648\") " pod="openshift-marketplace/community-operators-6tl4m" Oct 03 13:38:23 crc kubenswrapper[4962]: I1003 13:38:23.592828 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d5b951e-acc6-407e-9992-58fd92bf1648-utilities\") pod \"community-operators-6tl4m\" (UID: \"5d5b951e-acc6-407e-9992-58fd92bf1648\") " pod="openshift-marketplace/community-operators-6tl4m" Oct 03 13:38:23 crc kubenswrapper[4962]: I1003 13:38:23.593047 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwxz2\" (UniqueName: \"kubernetes.io/projected/5d5b951e-acc6-407e-9992-58fd92bf1648-kube-api-access-bwxz2\") pod \"community-operators-6tl4m\" (UID: \"5d5b951e-acc6-407e-9992-58fd92bf1648\") " pod="openshift-marketplace/community-operators-6tl4m" Oct 03 13:38:23 crc kubenswrapper[4962]: I1003 13:38:23.694164 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d5b951e-acc6-407e-9992-58fd92bf1648-catalog-content\") pod \"community-operators-6tl4m\" (UID: \"5d5b951e-acc6-407e-9992-58fd92bf1648\") " pod="openshift-marketplace/community-operators-6tl4m" Oct 03 13:38:23 crc kubenswrapper[4962]: I1003 13:38:23.694504 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d5b951e-acc6-407e-9992-58fd92bf1648-utilities\") pod \"community-operators-6tl4m\" (UID: \"5d5b951e-acc6-407e-9992-58fd92bf1648\") " pod="openshift-marketplace/community-operators-6tl4m" Oct 03 13:38:23 crc kubenswrapper[4962]: I1003 13:38:23.694663 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwxz2\" (UniqueName: \"kubernetes.io/projected/5d5b951e-acc6-407e-9992-58fd92bf1648-kube-api-access-bwxz2\") pod \"community-operators-6tl4m\" (UID: \"5d5b951e-acc6-407e-9992-58fd92bf1648\") " pod="openshift-marketplace/community-operators-6tl4m" Oct 03 13:38:23 crc kubenswrapper[4962]: I1003 13:38:23.694709 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d5b951e-acc6-407e-9992-58fd92bf1648-catalog-content\") pod \"community-operators-6tl4m\" (UID: \"5d5b951e-acc6-407e-9992-58fd92bf1648\") " pod="openshift-marketplace/community-operators-6tl4m" Oct 03 13:38:23 crc kubenswrapper[4962]: I1003 13:38:23.695020 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d5b951e-acc6-407e-9992-58fd92bf1648-utilities\") pod \"community-operators-6tl4m\" (UID: 
\"5d5b951e-acc6-407e-9992-58fd92bf1648\") " pod="openshift-marketplace/community-operators-6tl4m" Oct 03 13:38:23 crc kubenswrapper[4962]: I1003 13:38:23.713518 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwxz2\" (UniqueName: \"kubernetes.io/projected/5d5b951e-acc6-407e-9992-58fd92bf1648-kube-api-access-bwxz2\") pod \"community-operators-6tl4m\" (UID: \"5d5b951e-acc6-407e-9992-58fd92bf1648\") " pod="openshift-marketplace/community-operators-6tl4m" Oct 03 13:38:23 crc kubenswrapper[4962]: I1003 13:38:23.735542 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6tl4m" Oct 03 13:38:24 crc kubenswrapper[4962]: I1003 13:38:24.213322 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6tl4m"] Oct 03 13:38:24 crc kubenswrapper[4962]: I1003 13:38:24.660309 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 13:38:24 crc kubenswrapper[4962]: I1003 13:38:24.660883 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 13:38:24 crc kubenswrapper[4962]: I1003 13:38:24.735660 4962 generic.go:334] "Generic (PLEG): container finished" podID="5d5b951e-acc6-407e-9992-58fd92bf1648" containerID="b004d4535dd26e2b8b542d0c01d4673dd2b07c674907a409d9c41127c5c8117c" exitCode=0 Oct 03 13:38:24 crc kubenswrapper[4962]: I1003 13:38:24.735725 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6tl4m" event={"ID":"5d5b951e-acc6-407e-9992-58fd92bf1648","Type":"ContainerDied","Data":"b004d4535dd26e2b8b542d0c01d4673dd2b07c674907a409d9c41127c5c8117c"} Oct 03 13:38:24 crc kubenswrapper[4962]: I1003 13:38:24.735763 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6tl4m" event={"ID":"5d5b951e-acc6-407e-9992-58fd92bf1648","Type":"ContainerStarted","Data":"38f60fd4ea0832cb5a1ac786dd1ebb0a57b0e09daf5e61a3807966609955cc82"} Oct 03 13:38:25 crc kubenswrapper[4962]: I1003 13:38:25.745014 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6tl4m" event={"ID":"5d5b951e-acc6-407e-9992-58fd92bf1648","Type":"ContainerStarted","Data":"dc12ae6dc81c17a4c06eb51922af12c3d36b2561ad1011162e91b495703f9121"} Oct 03 13:38:26 crc kubenswrapper[4962]: I1003 13:38:26.757260 4962 generic.go:334] "Generic (PLEG): container finished" podID="5d5b951e-acc6-407e-9992-58fd92bf1648" containerID="dc12ae6dc81c17a4c06eb51922af12c3d36b2561ad1011162e91b495703f9121" exitCode=0 Oct 03 13:38:26 crc kubenswrapper[4962]: I1003 13:38:26.757320 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6tl4m" event={"ID":"5d5b951e-acc6-407e-9992-58fd92bf1648","Type":"ContainerDied","Data":"dc12ae6dc81c17a4c06eb51922af12c3d36b2561ad1011162e91b495703f9121"} Oct 03 13:38:27 crc kubenswrapper[4962]: I1003 13:38:27.772814 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-6tl4m" event={"ID":"5d5b951e-acc6-407e-9992-58fd92bf1648","Type":"ContainerStarted","Data":"24b34849c076c11411c3fd5c27ccadaf427f498e7647f30cfe73b993ef22d2b5"} Oct 03 13:38:27 crc kubenswrapper[4962]: I1003 13:38:27.804072 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6tl4m" podStartSLOduration=2.226210069 podStartE2EDuration="4.804048599s" podCreationTimestamp="2025-10-03 13:38:23 +0000 UTC" firstStartedPulling="2025-10-03 13:38:24.73873628 +0000 UTC m=+2913.142634135" lastFinishedPulling="2025-10-03 13:38:27.31657483 +0000 UTC m=+2915.720472665" observedRunningTime="2025-10-03 13:38:27.795406193 +0000 UTC m=+2916.199304038" watchObservedRunningTime="2025-10-03 13:38:27.804048599 +0000 UTC m=+2916.207946464" Oct 03 13:38:33 crc kubenswrapper[4962]: I1003 13:38:33.736519 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6tl4m" Oct 03 13:38:33 crc kubenswrapper[4962]: I1003 13:38:33.736978 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6tl4m" Oct 03 13:38:33 crc kubenswrapper[4962]: I1003 13:38:33.786793 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6tl4m" Oct 03 13:38:33 crc kubenswrapper[4962]: I1003 13:38:33.867335 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6tl4m" Oct 03 13:38:36 crc kubenswrapper[4962]: I1003 13:38:36.184251 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6tl4m"] Oct 03 13:38:36 crc kubenswrapper[4962]: I1003 13:38:36.186248 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6tl4m" podUID="5d5b951e-acc6-407e-9992-58fd92bf1648" containerName="registry-server" containerID="cri-o://24b34849c076c11411c3fd5c27ccadaf427f498e7647f30cfe73b993ef22d2b5" gracePeriod=2 Oct 03 13:38:36 crc kubenswrapper[4962]: I1003 13:38:36.846248 4962 generic.go:334] "Generic (PLEG): container finished" podID="5d5b951e-acc6-407e-9992-58fd92bf1648" containerID="24b34849c076c11411c3fd5c27ccadaf427f498e7647f30cfe73b993ef22d2b5" exitCode=0 Oct 03 13:38:36 crc kubenswrapper[4962]: I1003 13:38:36.846693 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6tl4m" event={"ID":"5d5b951e-acc6-407e-9992-58fd92bf1648","Type":"ContainerDied","Data":"24b34849c076c11411c3fd5c27ccadaf427f498e7647f30cfe73b993ef22d2b5"} Oct 03 13:38:36 crc kubenswrapper[4962]: I1003 13:38:36.846726 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6tl4m" event={"ID":"5d5b951e-acc6-407e-9992-58fd92bf1648","Type":"ContainerDied","Data":"38f60fd4ea0832cb5a1ac786dd1ebb0a57b0e09daf5e61a3807966609955cc82"} Oct 03 13:38:36 crc kubenswrapper[4962]: I1003 13:38:36.846739 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38f60fd4ea0832cb5a1ac786dd1ebb0a57b0e09daf5e61a3807966609955cc82" Oct 03 13:38:36 crc kubenswrapper[4962]: I1003 13:38:36.890383 4962 util.go:48] "No ready sandbox for pod can be found. 
Oct 03 13:38:36 crc kubenswrapper[4962]: I1003 13:38:36.890383 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6tl4m"
Oct 03 13:38:37 crc kubenswrapper[4962]: I1003 13:38:37.017733 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d5b951e-acc6-407e-9992-58fd92bf1648-utilities\") pod \"5d5b951e-acc6-407e-9992-58fd92bf1648\" (UID: \"5d5b951e-acc6-407e-9992-58fd92bf1648\") "
Oct 03 13:38:37 crc kubenswrapper[4962]: I1003 13:38:37.017921 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwxz2\" (UniqueName: \"kubernetes.io/projected/5d5b951e-acc6-407e-9992-58fd92bf1648-kube-api-access-bwxz2\") pod \"5d5b951e-acc6-407e-9992-58fd92bf1648\" (UID: \"5d5b951e-acc6-407e-9992-58fd92bf1648\") "
Oct 03 13:38:37 crc kubenswrapper[4962]: I1003 13:38:37.018186 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d5b951e-acc6-407e-9992-58fd92bf1648-catalog-content\") pod \"5d5b951e-acc6-407e-9992-58fd92bf1648\" (UID: \"5d5b951e-acc6-407e-9992-58fd92bf1648\") "
Oct 03 13:38:37 crc kubenswrapper[4962]: I1003 13:38:37.019258 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d5b951e-acc6-407e-9992-58fd92bf1648-utilities" (OuterVolumeSpecName: "utilities") pod "5d5b951e-acc6-407e-9992-58fd92bf1648" (UID: "5d5b951e-acc6-407e-9992-58fd92bf1648"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 13:38:37 crc kubenswrapper[4962]: I1003 13:38:37.026016 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d5b951e-acc6-407e-9992-58fd92bf1648-kube-api-access-bwxz2" (OuterVolumeSpecName: "kube-api-access-bwxz2") pod "5d5b951e-acc6-407e-9992-58fd92bf1648" (UID: "5d5b951e-acc6-407e-9992-58fd92bf1648"). InnerVolumeSpecName "kube-api-access-bwxz2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 13:38:37 crc kubenswrapper[4962]: I1003 13:38:37.094855 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d5b951e-acc6-407e-9992-58fd92bf1648-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5d5b951e-acc6-407e-9992-58fd92bf1648" (UID: "5d5b951e-acc6-407e-9992-58fd92bf1648"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 13:38:37 crc kubenswrapper[4962]: I1003 13:38:37.120291 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwxz2\" (UniqueName: \"kubernetes.io/projected/5d5b951e-acc6-407e-9992-58fd92bf1648-kube-api-access-bwxz2\") on node \"crc\" DevicePath \"\""
Oct 03 13:38:37 crc kubenswrapper[4962]: I1003 13:38:37.120357 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d5b951e-acc6-407e-9992-58fd92bf1648-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 03 13:38:37 crc kubenswrapper[4962]: I1003 13:38:37.120378 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d5b951e-acc6-407e-9992-58fd92bf1648-utilities\") on node \"crc\" DevicePath \"\""
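
Together with the mount records logged when this pod started, the unmount and detach records above complete a fixed lifecycle that every pod volume in this log walks through: VerifyControllerAttachedVolume, MountVolume started, MountVolume.SetUp succeeded, then on deletion UnmountVolume started, UnmountVolume.TearDown succeeded, Volume detached. A sketch of a small offline helper (hypothetical, not part of kubelet) that recovers that sequence for one pod UID from a journal dump on stdin:

package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
	"strings"
)

// Prints the volume lifecycle events for one pod UID found in a kubelet
// journal dump read from stdin. The UID below is community-operators-6tl4m;
// any pod UID from the log works.
func main() {
	const uid = "5d5b951e-acc6-407e-9992-58fd92bf1648"
	re := regexp.MustCompile(`(operationExecutor\.\w+ started|MountVolume\.SetUp succeeded|UnmountVolume\.TearDown succeeded|Volume detached) for volume \\?"([\w./-]+)\\?"`)
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1<<20), 1<<20) // journal lines here are very long
	for sc.Scan() {
		line := sc.Text()
		if !strings.Contains(line, uid) {
			continue
		}
		// A single physical line may hold several records, so match all.
		for _, m := range re.FindAllStringSubmatch(line, -1) {
			fmt.Printf("%-48s %s\n", m[1], m[2])
		}
	}
}

Fed something like the output of journalctl -u kubelet --no-pager, it lists the six stages per volume in order, for both the short volume names and the full kubernetes.io/empty-dir and kubernetes.io/projected spec names used in the TearDown records.
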
Oct 03 13:38:37 crc kubenswrapper[4962]: I1003 13:38:37.852519 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6tl4m"
Oct 03 13:38:37 crc kubenswrapper[4962]: I1003 13:38:37.886605 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6tl4m"]
Oct 03 13:38:37 crc kubenswrapper[4962]: I1003 13:38:37.891142 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6tl4m"]
Oct 03 13:38:38 crc kubenswrapper[4962]: I1003 13:38:38.235599 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d5b951e-acc6-407e-9992-58fd92bf1648" path="/var/lib/kubelet/pods/5d5b951e-acc6-407e-9992-58fd92bf1648/volumes"
Oct 03 13:38:54 crc kubenswrapper[4962]: I1003 13:38:54.660281 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 13:38:54 crc kubenswrapper[4962]: I1003 13:38:54.661058 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 13:38:54 crc kubenswrapper[4962]: I1003 13:38:54.661136 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-46vck"
Oct 03 13:38:54 crc kubenswrapper[4962]: I1003 13:38:54.662170 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5f1b71cb09022d1fd0c2465c42529dafb255f5a59b613a1eef7c69bcd687fd99"} pod="openshift-machine-config-operator/machine-config-daemon-46vck" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 03 13:38:54 crc kubenswrapper[4962]: I1003 13:38:54.662280 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" containerID="cri-o://5f1b71cb09022d1fd0c2465c42529dafb255f5a59b613a1eef7c69bcd687fd99" gracePeriod=600
Oct 03 13:38:54 crc kubenswrapper[4962]: E1003 13:38:54.790169 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 13:38:55 crc kubenswrapper[4962]: I1003 13:38:55.044179 4962 generic.go:334] "Generic (PLEG): container finished" podID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerID="5f1b71cb09022d1fd0c2465c42529dafb255f5a59b613a1eef7c69bcd687fd99" exitCode=0
Oct 03 13:38:55 crc kubenswrapper[4962]: I1003 13:38:55.044264 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerDied","Data":"5f1b71cb09022d1fd0c2465c42529dafb255f5a59b613a1eef7c69bcd687fd99"}
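
The liveness failure above is a plain HTTP GET that cannot even open a TCP connection to 127.0.0.1:8798. A minimal Go stand-in for what such a probe does (illustrative only, not kubelet's actual prober; 1s is the default probe timeout, and kubelet counts HTTP 200-399 as success):

package main

import (
	"fmt"
	"net/http"
	"time"
)

// probe issues one HTTP liveness-style check against url.
func probe(url string) error {
	client := &http.Client{Timeout: time.Second} // default timeoutSeconds: 1
	resp, err := client.Get(url)
	if err != nil {
		// A refused connection surfaces here, e.g.
		// Get "http://127.0.0.1:8798/health": dial tcp 127.0.0.1:8798: connect: connection refused
		return err
	}
	defer resp.Body.Close()
	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
		return fmt.Errorf("unhealthy: HTTP %d", resp.StatusCode)
	}
	return nil
}

func main() {
	if err := probe("http://127.0.0.1:8798/health"); err != nil {
		fmt.Println("probeResult=failure output:", err)
	}
}

The two failures logged for this container (13:38:24 and 13:38:54) are exactly 30s apart, consistent with periodSeconds=30; once enough consecutive failures accumulate (failureThreshold is 3 by default), the kubelet marks the container for restart, which is what the "failed liveness probe, will be restarted" message and the gracePeriod=600 kill record above show.
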
Oct 03 13:38:55 crc kubenswrapper[4962]: I1003 13:38:55.044358 4962 scope.go:117] "RemoveContainer" containerID="e5601307ec7160d1684f440881caedd8dbb40f586b37a314215dcb3945b363f2"
Oct 03 13:38:55 crc kubenswrapper[4962]: I1003 13:38:55.045155 4962 scope.go:117] "RemoveContainer" containerID="5f1b71cb09022d1fd0c2465c42529dafb255f5a59b613a1eef7c69bcd687fd99"
Oct 03 13:38:55 crc kubenswrapper[4962]: E1003 13:38:55.045586 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 13:39:07 crc kubenswrapper[4962]: I1003 13:39:07.226590 4962 scope.go:117] "RemoveContainer" containerID="5f1b71cb09022d1fd0c2465c42529dafb255f5a59b613a1eef7c69bcd687fd99"
Oct 03 13:39:07 crc kubenswrapper[4962]: E1003 13:39:07.227526 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 13:39:19 crc kubenswrapper[4962]: I1003 13:39:19.227415 4962 scope.go:117] "RemoveContainer" containerID="5f1b71cb09022d1fd0c2465c42529dafb255f5a59b613a1eef7c69bcd687fd99"
Oct 03 13:39:19 crc kubenswrapper[4962]: E1003 13:39:19.228927 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 13:39:33 crc kubenswrapper[4962]: I1003 13:39:33.227050 4962 scope.go:117] "RemoveContainer" containerID="5f1b71cb09022d1fd0c2465c42529dafb255f5a59b613a1eef7c69bcd687fd99"
Oct 03 13:39:33 crc kubenswrapper[4962]: E1003 13:39:33.227936 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 13:39:48 crc kubenswrapper[4962]: I1003 13:39:48.227129 4962 scope.go:117] "RemoveContainer" containerID="5f1b71cb09022d1fd0c2465c42529dafb255f5a59b613a1eef7c69bcd687fd99"
Oct 03 13:39:48 crc kubenswrapper[4962]: E1003 13:39:48.227912 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
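
The paired RemoveContainer / "Error syncing pod, skipping" entries above repeat every 12-15 seconds. Those are pod-worker sync attempts, each rejected immediately because the restart back-off for machine-config-daemon has already grown to its cap, hence the constant "back-off 5m0s". A sketch of the doubling schedule behind that message, assuming the documented kubelet defaults (10s initial delay, doubled after each crash, capped at 5m):

package main

import (
	"fmt"
	"time"
)

// Container restart back-off under the documented defaults.
func main() {
	delay, maxDelay := 10*time.Second, 5*time.Minute
	for restart := 1; restart <= 7; restart++ {
		fmt.Printf("after crash %d: wait %v before restarting\n", restart, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay // pinned at 5m0s from the sixth crash onward
		}
	}
}

That cap is also why the container only comes back at 13:44:00 further down in the log, roughly five minutes after the 13:38:55 exit recorded above.
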
Oct 03 13:39:53 crc kubenswrapper[4962]: I1003 13:39:53.997735 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-48d4q"]
Oct 03 13:39:54 crc kubenswrapper[4962]: E1003 13:39:54.000052 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d5b951e-acc6-407e-9992-58fd92bf1648" containerName="extract-content"
Oct 03 13:39:54 crc kubenswrapper[4962]: I1003 13:39:54.000169 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d5b951e-acc6-407e-9992-58fd92bf1648" containerName="extract-content"
Oct 03 13:39:54 crc kubenswrapper[4962]: E1003 13:39:54.000275 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d5b951e-acc6-407e-9992-58fd92bf1648" containerName="extract-utilities"
Oct 03 13:39:54 crc kubenswrapper[4962]: I1003 13:39:54.000363 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d5b951e-acc6-407e-9992-58fd92bf1648" containerName="extract-utilities"
Oct 03 13:39:54 crc kubenswrapper[4962]: E1003 13:39:54.000445 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d5b951e-acc6-407e-9992-58fd92bf1648" containerName="registry-server"
Oct 03 13:39:54 crc kubenswrapper[4962]: I1003 13:39:54.000525 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d5b951e-acc6-407e-9992-58fd92bf1648" containerName="registry-server"
Oct 03 13:39:54 crc kubenswrapper[4962]: I1003 13:39:54.000827 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d5b951e-acc6-407e-9992-58fd92bf1648" containerName="registry-server"
Oct 03 13:39:54 crc kubenswrapper[4962]: I1003 13:39:54.001841 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-48d4q"
Oct 03 13:39:54 crc kubenswrapper[4962]: I1003 13:39:54.004575 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-48d4q"]
Oct 03 13:39:54 crc kubenswrapper[4962]: I1003 13:39:54.170056 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4591872-3d00-4662-a7b2-622298b46b5f-catalog-content\") pod \"redhat-operators-48d4q\" (UID: \"a4591872-3d00-4662-a7b2-622298b46b5f\") " pod="openshift-marketplace/redhat-operators-48d4q"
Oct 03 13:39:54 crc kubenswrapper[4962]: I1003 13:39:54.170373 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck7cm\" (UniqueName: \"kubernetes.io/projected/a4591872-3d00-4662-a7b2-622298b46b5f-kube-api-access-ck7cm\") pod \"redhat-operators-48d4q\" (UID: \"a4591872-3d00-4662-a7b2-622298b46b5f\") " pod="openshift-marketplace/redhat-operators-48d4q"
Oct 03 13:39:54 crc kubenswrapper[4962]: I1003 13:39:54.170572 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4591872-3d00-4662-a7b2-622298b46b5f-utilities\") pod \"redhat-operators-48d4q\" (UID: \"a4591872-3d00-4662-a7b2-622298b46b5f\") " pod="openshift-marketplace/redhat-operators-48d4q"
Oct 03 13:39:54 crc kubenswrapper[4962]: I1003 13:39:54.271106 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4591872-3d00-4662-a7b2-622298b46b5f-utilities\") pod \"redhat-operators-48d4q\" (UID: \"a4591872-3d00-4662-a7b2-622298b46b5f\") " pod="openshift-marketplace/redhat-operators-48d4q"
Oct 03 13:39:54 crc kubenswrapper[4962]:
I1003 13:39:54.271166 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4591872-3d00-4662-a7b2-622298b46b5f-catalog-content\") pod \"redhat-operators-48d4q\" (UID: \"a4591872-3d00-4662-a7b2-622298b46b5f\") " pod="openshift-marketplace/redhat-operators-48d4q" Oct 03 13:39:54 crc kubenswrapper[4962]: I1003 13:39:54.271207 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ck7cm\" (UniqueName: \"kubernetes.io/projected/a4591872-3d00-4662-a7b2-622298b46b5f-kube-api-access-ck7cm\") pod \"redhat-operators-48d4q\" (UID: \"a4591872-3d00-4662-a7b2-622298b46b5f\") " pod="openshift-marketplace/redhat-operators-48d4q" Oct 03 13:39:54 crc kubenswrapper[4962]: I1003 13:39:54.271710 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4591872-3d00-4662-a7b2-622298b46b5f-utilities\") pod \"redhat-operators-48d4q\" (UID: \"a4591872-3d00-4662-a7b2-622298b46b5f\") " pod="openshift-marketplace/redhat-operators-48d4q" Oct 03 13:39:54 crc kubenswrapper[4962]: I1003 13:39:54.272074 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4591872-3d00-4662-a7b2-622298b46b5f-catalog-content\") pod \"redhat-operators-48d4q\" (UID: \"a4591872-3d00-4662-a7b2-622298b46b5f\") " pod="openshift-marketplace/redhat-operators-48d4q" Oct 03 13:39:54 crc kubenswrapper[4962]: I1003 13:39:54.293540 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ck7cm\" (UniqueName: \"kubernetes.io/projected/a4591872-3d00-4662-a7b2-622298b46b5f-kube-api-access-ck7cm\") pod \"redhat-operators-48d4q\" (UID: \"a4591872-3d00-4662-a7b2-622298b46b5f\") " pod="openshift-marketplace/redhat-operators-48d4q" Oct 03 13:39:54 crc kubenswrapper[4962]: I1003 13:39:54.321068 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-48d4q" Oct 03 13:39:54 crc kubenswrapper[4962]: I1003 13:39:54.755944 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-48d4q"] Oct 03 13:39:55 crc kubenswrapper[4962]: I1003 13:39:55.559287 4962 generic.go:334] "Generic (PLEG): container finished" podID="a4591872-3d00-4662-a7b2-622298b46b5f" containerID="d8b385133739a745888ec549caf4aacb70628157c12f863a2bfb512ad8df792a" exitCode=0 Oct 03 13:39:55 crc kubenswrapper[4962]: I1003 13:39:55.559522 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-48d4q" event={"ID":"a4591872-3d00-4662-a7b2-622298b46b5f","Type":"ContainerDied","Data":"d8b385133739a745888ec549caf4aacb70628157c12f863a2bfb512ad8df792a"} Oct 03 13:39:55 crc kubenswrapper[4962]: I1003 13:39:55.560675 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-48d4q" event={"ID":"a4591872-3d00-4662-a7b2-622298b46b5f","Type":"ContainerStarted","Data":"840631e2e4336c4aaa1e48540f1da09f62820c379991bdf65def134eeb918ae2"} Oct 03 13:39:55 crc kubenswrapper[4962]: I1003 13:39:55.561222 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 13:39:56 crc kubenswrapper[4962]: I1003 13:39:56.590497 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-48d4q" event={"ID":"a4591872-3d00-4662-a7b2-622298b46b5f","Type":"ContainerStarted","Data":"514e74ba83438dfe6c5cb2d28edfa2591f634e2fa759f5723bd5112cd424f64f"} Oct 03 13:39:57 crc kubenswrapper[4962]: I1003 13:39:57.599351 4962 generic.go:334] "Generic (PLEG): container finished" podID="a4591872-3d00-4662-a7b2-622298b46b5f" containerID="514e74ba83438dfe6c5cb2d28edfa2591f634e2fa759f5723bd5112cd424f64f" exitCode=0 Oct 03 13:39:57 crc kubenswrapper[4962]: I1003 13:39:57.599402 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-48d4q" event={"ID":"a4591872-3d00-4662-a7b2-622298b46b5f","Type":"ContainerDied","Data":"514e74ba83438dfe6c5cb2d28edfa2591f634e2fa759f5723bd5112cd424f64f"} Oct 03 13:39:58 crc kubenswrapper[4962]: I1003 13:39:58.610363 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-48d4q" event={"ID":"a4591872-3d00-4662-a7b2-622298b46b5f","Type":"ContainerStarted","Data":"d0b470f2f7d4bcafa6033adb7ebbed18d9ab28f7111fe6ec02de0c66a67ba2bd"} Oct 03 13:39:59 crc kubenswrapper[4962]: I1003 13:39:59.226757 4962 scope.go:117] "RemoveContainer" containerID="5f1b71cb09022d1fd0c2465c42529dafb255f5a59b613a1eef7c69bcd687fd99" Oct 03 13:39:59 crc kubenswrapper[4962]: E1003 13:39:59.226987 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:40:04 crc kubenswrapper[4962]: I1003 13:40:04.321577 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-48d4q" Oct 03 13:40:04 crc kubenswrapper[4962]: I1003 13:40:04.322181 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-48d4q" Oct 03 13:40:04 crc kubenswrapper[4962]: I1003 13:40:04.366148 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-48d4q" Oct 03 13:40:04 crc kubenswrapper[4962]: I1003 13:40:04.388039 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-48d4q" podStartSLOduration=8.914026014000001 podStartE2EDuration="11.388017166s" podCreationTimestamp="2025-10-03 13:39:53 +0000 UTC" firstStartedPulling="2025-10-03 13:39:55.560975853 +0000 UTC m=+3003.964873688" lastFinishedPulling="2025-10-03 13:39:58.034967005 +0000 UTC m=+3006.438864840" observedRunningTime="2025-10-03 13:39:58.641414144 +0000 UTC m=+3007.045312009" watchObservedRunningTime="2025-10-03 13:40:04.388017166 +0000 UTC m=+3012.791915001" Oct 03 13:40:04 crc kubenswrapper[4962]: I1003 13:40:04.687479 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-48d4q" Oct 03 13:40:04 crc kubenswrapper[4962]: I1003 13:40:04.731364 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-48d4q"] Oct 03 13:40:06 crc kubenswrapper[4962]: I1003 13:40:06.661490 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-48d4q" podUID="a4591872-3d00-4662-a7b2-622298b46b5f" containerName="registry-server" containerID="cri-o://d0b470f2f7d4bcafa6033adb7ebbed18d9ab28f7111fe6ec02de0c66a67ba2bd" gracePeriod=2 Oct 03 13:40:07 crc kubenswrapper[4962]: I1003 13:40:07.035528 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-48d4q" Oct 03 13:40:07 crc kubenswrapper[4962]: I1003 13:40:07.162091 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4591872-3d00-4662-a7b2-622298b46b5f-utilities\") pod \"a4591872-3d00-4662-a7b2-622298b46b5f\" (UID: \"a4591872-3d00-4662-a7b2-622298b46b5f\") " Oct 03 13:40:07 crc kubenswrapper[4962]: I1003 13:40:07.162449 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ck7cm\" (UniqueName: \"kubernetes.io/projected/a4591872-3d00-4662-a7b2-622298b46b5f-kube-api-access-ck7cm\") pod \"a4591872-3d00-4662-a7b2-622298b46b5f\" (UID: \"a4591872-3d00-4662-a7b2-622298b46b5f\") " Oct 03 13:40:07 crc kubenswrapper[4962]: I1003 13:40:07.162580 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4591872-3d00-4662-a7b2-622298b46b5f-catalog-content\") pod \"a4591872-3d00-4662-a7b2-622298b46b5f\" (UID: \"a4591872-3d00-4662-a7b2-622298b46b5f\") " Oct 03 13:40:07 crc kubenswrapper[4962]: I1003 13:40:07.163398 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4591872-3d00-4662-a7b2-622298b46b5f-utilities" (OuterVolumeSpecName: "utilities") pod "a4591872-3d00-4662-a7b2-622298b46b5f" (UID: "a4591872-3d00-4662-a7b2-622298b46b5f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:40:07 crc kubenswrapper[4962]: I1003 13:40:07.174102 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4591872-3d00-4662-a7b2-622298b46b5f-kube-api-access-ck7cm" (OuterVolumeSpecName: "kube-api-access-ck7cm") pod "a4591872-3d00-4662-a7b2-622298b46b5f" (UID: "a4591872-3d00-4662-a7b2-622298b46b5f"). InnerVolumeSpecName "kube-api-access-ck7cm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:40:07 crc kubenswrapper[4962]: I1003 13:40:07.257749 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4591872-3d00-4662-a7b2-622298b46b5f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a4591872-3d00-4662-a7b2-622298b46b5f" (UID: "a4591872-3d00-4662-a7b2-622298b46b5f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:40:07 crc kubenswrapper[4962]: I1003 13:40:07.264164 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4591872-3d00-4662-a7b2-622298b46b5f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 13:40:07 crc kubenswrapper[4962]: I1003 13:40:07.264218 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4591872-3d00-4662-a7b2-622298b46b5f-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 13:40:07 crc kubenswrapper[4962]: I1003 13:40:07.264235 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ck7cm\" (UniqueName: \"kubernetes.io/projected/a4591872-3d00-4662-a7b2-622298b46b5f-kube-api-access-ck7cm\") on node \"crc\" DevicePath \"\"" Oct 03 13:40:07 crc kubenswrapper[4962]: I1003 13:40:07.671590 4962 generic.go:334] "Generic (PLEG): container finished" podID="a4591872-3d00-4662-a7b2-622298b46b5f" containerID="d0b470f2f7d4bcafa6033adb7ebbed18d9ab28f7111fe6ec02de0c66a67ba2bd" exitCode=0 Oct 03 13:40:07 crc kubenswrapper[4962]: I1003 13:40:07.671777 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-48d4q" Oct 03 13:40:07 crc kubenswrapper[4962]: I1003 13:40:07.671834 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-48d4q" event={"ID":"a4591872-3d00-4662-a7b2-622298b46b5f","Type":"ContainerDied","Data":"d0b470f2f7d4bcafa6033adb7ebbed18d9ab28f7111fe6ec02de0c66a67ba2bd"} Oct 03 13:40:07 crc kubenswrapper[4962]: I1003 13:40:07.675540 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-48d4q" event={"ID":"a4591872-3d00-4662-a7b2-622298b46b5f","Type":"ContainerDied","Data":"840631e2e4336c4aaa1e48540f1da09f62820c379991bdf65def134eeb918ae2"} Oct 03 13:40:07 crc kubenswrapper[4962]: I1003 13:40:07.675577 4962 scope.go:117] "RemoveContainer" containerID="d0b470f2f7d4bcafa6033adb7ebbed18d9ab28f7111fe6ec02de0c66a67ba2bd" Oct 03 13:40:07 crc kubenswrapper[4962]: I1003 13:40:07.708254 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-48d4q"] Oct 03 13:40:07 crc kubenswrapper[4962]: I1003 13:40:07.712662 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-48d4q"] Oct 03 13:40:07 crc kubenswrapper[4962]: I1003 13:40:07.721679 4962 scope.go:117] "RemoveContainer" containerID="514e74ba83438dfe6c5cb2d28edfa2591f634e2fa759f5723bd5112cd424f64f" Oct 03 13:40:07 crc kubenswrapper[4962]: I1003 13:40:07.750056 4962 scope.go:117] "RemoveContainer" containerID="d8b385133739a745888ec549caf4aacb70628157c12f863a2bfb512ad8df792a" Oct 03 13:40:07 crc kubenswrapper[4962]: I1003 13:40:07.766891 4962 scope.go:117] "RemoveContainer" containerID="d0b470f2f7d4bcafa6033adb7ebbed18d9ab28f7111fe6ec02de0c66a67ba2bd" Oct 03 13:40:07 crc kubenswrapper[4962]: E1003 13:40:07.767482 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0b470f2f7d4bcafa6033adb7ebbed18d9ab28f7111fe6ec02de0c66a67ba2bd\": container with ID starting with d0b470f2f7d4bcafa6033adb7ebbed18d9ab28f7111fe6ec02de0c66a67ba2bd not found: ID does not exist" containerID="d0b470f2f7d4bcafa6033adb7ebbed18d9ab28f7111fe6ec02de0c66a67ba2bd" Oct 03 13:40:07 crc kubenswrapper[4962]: I1003 13:40:07.767532 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0b470f2f7d4bcafa6033adb7ebbed18d9ab28f7111fe6ec02de0c66a67ba2bd"} err="failed to get container status \"d0b470f2f7d4bcafa6033adb7ebbed18d9ab28f7111fe6ec02de0c66a67ba2bd\": rpc error: code = NotFound desc = could not find container \"d0b470f2f7d4bcafa6033adb7ebbed18d9ab28f7111fe6ec02de0c66a67ba2bd\": container with ID starting with d0b470f2f7d4bcafa6033adb7ebbed18d9ab28f7111fe6ec02de0c66a67ba2bd not found: ID does not exist" Oct 03 13:40:07 crc kubenswrapper[4962]: I1003 13:40:07.767568 4962 scope.go:117] "RemoveContainer" containerID="514e74ba83438dfe6c5cb2d28edfa2591f634e2fa759f5723bd5112cd424f64f" Oct 03 13:40:07 crc kubenswrapper[4962]: E1003 13:40:07.768166 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"514e74ba83438dfe6c5cb2d28edfa2591f634e2fa759f5723bd5112cd424f64f\": container with ID starting with 514e74ba83438dfe6c5cb2d28edfa2591f634e2fa759f5723bd5112cd424f64f not found: ID does not exist" containerID="514e74ba83438dfe6c5cb2d28edfa2591f634e2fa759f5723bd5112cd424f64f" Oct 03 13:40:07 crc kubenswrapper[4962]: I1003 13:40:07.768264 4962 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"514e74ba83438dfe6c5cb2d28edfa2591f634e2fa759f5723bd5112cd424f64f"} err="failed to get container status \"514e74ba83438dfe6c5cb2d28edfa2591f634e2fa759f5723bd5112cd424f64f\": rpc error: code = NotFound desc = could not find container \"514e74ba83438dfe6c5cb2d28edfa2591f634e2fa759f5723bd5112cd424f64f\": container with ID starting with 514e74ba83438dfe6c5cb2d28edfa2591f634e2fa759f5723bd5112cd424f64f not found: ID does not exist" Oct 03 13:40:07 crc kubenswrapper[4962]: I1003 13:40:07.768353 4962 scope.go:117] "RemoveContainer" containerID="d8b385133739a745888ec549caf4aacb70628157c12f863a2bfb512ad8df792a" Oct 03 13:40:07 crc kubenswrapper[4962]: E1003 13:40:07.768856 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8b385133739a745888ec549caf4aacb70628157c12f863a2bfb512ad8df792a\": container with ID starting with d8b385133739a745888ec549caf4aacb70628157c12f863a2bfb512ad8df792a not found: ID does not exist" containerID="d8b385133739a745888ec549caf4aacb70628157c12f863a2bfb512ad8df792a" Oct 03 13:40:07 crc kubenswrapper[4962]: I1003 13:40:07.768891 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8b385133739a745888ec549caf4aacb70628157c12f863a2bfb512ad8df792a"} err="failed to get container status \"d8b385133739a745888ec549caf4aacb70628157c12f863a2bfb512ad8df792a\": rpc error: code = NotFound desc = could not find container \"d8b385133739a745888ec549caf4aacb70628157c12f863a2bfb512ad8df792a\": container with ID starting with d8b385133739a745888ec549caf4aacb70628157c12f863a2bfb512ad8df792a not found: ID does not exist" Oct 03 13:40:08 crc kubenswrapper[4962]: I1003 13:40:08.235955 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4591872-3d00-4662-a7b2-622298b46b5f" path="/var/lib/kubelet/pods/a4591872-3d00-4662-a7b2-622298b46b5f/volumes" Oct 03 13:40:12 crc kubenswrapper[4962]: I1003 13:40:12.235577 4962 scope.go:117] "RemoveContainer" containerID="5f1b71cb09022d1fd0c2465c42529dafb255f5a59b613a1eef7c69bcd687fd99" Oct 03 13:40:12 crc kubenswrapper[4962]: E1003 13:40:12.236491 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:40:27 crc kubenswrapper[4962]: I1003 13:40:27.227474 4962 scope.go:117] "RemoveContainer" containerID="5f1b71cb09022d1fd0c2465c42529dafb255f5a59b613a1eef7c69bcd687fd99" Oct 03 13:40:27 crc kubenswrapper[4962]: E1003 13:40:27.228668 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:40:39 crc kubenswrapper[4962]: I1003 13:40:39.227284 4962 scope.go:117] "RemoveContainer" containerID="5f1b71cb09022d1fd0c2465c42529dafb255f5a59b613a1eef7c69bcd687fd99" Oct 03 13:40:39 crc 
kubenswrapper[4962]: E1003 13:40:39.227990 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:40:54 crc kubenswrapper[4962]: I1003 13:40:54.227523 4962 scope.go:117] "RemoveContainer" containerID="5f1b71cb09022d1fd0c2465c42529dafb255f5a59b613a1eef7c69bcd687fd99" Oct 03 13:40:54 crc kubenswrapper[4962]: E1003 13:40:54.228533 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:41:09 crc kubenswrapper[4962]: I1003 13:41:09.226713 4962 scope.go:117] "RemoveContainer" containerID="5f1b71cb09022d1fd0c2465c42529dafb255f5a59b613a1eef7c69bcd687fd99" Oct 03 13:41:09 crc kubenswrapper[4962]: E1003 13:41:09.227391 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:41:24 crc kubenswrapper[4962]: I1003 13:41:24.228244 4962 scope.go:117] "RemoveContainer" containerID="5f1b71cb09022d1fd0c2465c42529dafb255f5a59b613a1eef7c69bcd687fd99" Oct 03 13:41:24 crc kubenswrapper[4962]: E1003 13:41:24.229245 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:41:37 crc kubenswrapper[4962]: I1003 13:41:37.226967 4962 scope.go:117] "RemoveContainer" containerID="5f1b71cb09022d1fd0c2465c42529dafb255f5a59b613a1eef7c69bcd687fd99" Oct 03 13:41:37 crc kubenswrapper[4962]: E1003 13:41:37.227736 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:41:37 crc kubenswrapper[4962]: I1003 13:41:37.380843 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2ks6d"] Oct 03 13:41:37 crc kubenswrapper[4962]: E1003 13:41:37.381574 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4591872-3d00-4662-a7b2-622298b46b5f" containerName="registry-server" Oct 03 13:41:37 
crc kubenswrapper[4962]: I1003 13:41:37.381761 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4591872-3d00-4662-a7b2-622298b46b5f" containerName="registry-server" Oct 03 13:41:37 crc kubenswrapper[4962]: E1003 13:41:37.381964 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4591872-3d00-4662-a7b2-622298b46b5f" containerName="extract-utilities" Oct 03 13:41:37 crc kubenswrapper[4962]: I1003 13:41:37.382100 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4591872-3d00-4662-a7b2-622298b46b5f" containerName="extract-utilities" Oct 03 13:41:37 crc kubenswrapper[4962]: E1003 13:41:37.382238 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4591872-3d00-4662-a7b2-622298b46b5f" containerName="extract-content" Oct 03 13:41:37 crc kubenswrapper[4962]: I1003 13:41:37.382371 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4591872-3d00-4662-a7b2-622298b46b5f" containerName="extract-content" Oct 03 13:41:37 crc kubenswrapper[4962]: I1003 13:41:37.383380 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4591872-3d00-4662-a7b2-622298b46b5f" containerName="registry-server" Oct 03 13:41:37 crc kubenswrapper[4962]: I1003 13:41:37.384968 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2ks6d" Oct 03 13:41:37 crc kubenswrapper[4962]: I1003 13:41:37.397921 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2ks6d"] Oct 03 13:41:37 crc kubenswrapper[4962]: I1003 13:41:37.458711 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a027be5-3ed5-417a-a25f-0827296e23b1-catalog-content\") pod \"certified-operators-2ks6d\" (UID: \"7a027be5-3ed5-417a-a25f-0827296e23b1\") " pod="openshift-marketplace/certified-operators-2ks6d" Oct 03 13:41:37 crc kubenswrapper[4962]: I1003 13:41:37.458874 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdfzb\" (UniqueName: \"kubernetes.io/projected/7a027be5-3ed5-417a-a25f-0827296e23b1-kube-api-access-qdfzb\") pod \"certified-operators-2ks6d\" (UID: \"7a027be5-3ed5-417a-a25f-0827296e23b1\") " pod="openshift-marketplace/certified-operators-2ks6d" Oct 03 13:41:37 crc kubenswrapper[4962]: I1003 13:41:37.459011 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a027be5-3ed5-417a-a25f-0827296e23b1-utilities\") pod \"certified-operators-2ks6d\" (UID: \"7a027be5-3ed5-417a-a25f-0827296e23b1\") " pod="openshift-marketplace/certified-operators-2ks6d" Oct 03 13:41:37 crc kubenswrapper[4962]: I1003 13:41:37.560443 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdfzb\" (UniqueName: \"kubernetes.io/projected/7a027be5-3ed5-417a-a25f-0827296e23b1-kube-api-access-qdfzb\") pod \"certified-operators-2ks6d\" (UID: \"7a027be5-3ed5-417a-a25f-0827296e23b1\") " pod="openshift-marketplace/certified-operators-2ks6d" Oct 03 13:41:37 crc kubenswrapper[4962]: I1003 13:41:37.560529 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a027be5-3ed5-417a-a25f-0827296e23b1-utilities\") pod \"certified-operators-2ks6d\" (UID: \"7a027be5-3ed5-417a-a25f-0827296e23b1\") " 
pod="openshift-marketplace/certified-operators-2ks6d" Oct 03 13:41:37 crc kubenswrapper[4962]: I1003 13:41:37.560592 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a027be5-3ed5-417a-a25f-0827296e23b1-catalog-content\") pod \"certified-operators-2ks6d\" (UID: \"7a027be5-3ed5-417a-a25f-0827296e23b1\") " pod="openshift-marketplace/certified-operators-2ks6d" Oct 03 13:41:37 crc kubenswrapper[4962]: I1003 13:41:37.561042 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a027be5-3ed5-417a-a25f-0827296e23b1-catalog-content\") pod \"certified-operators-2ks6d\" (UID: \"7a027be5-3ed5-417a-a25f-0827296e23b1\") " pod="openshift-marketplace/certified-operators-2ks6d" Oct 03 13:41:37 crc kubenswrapper[4962]: I1003 13:41:37.561264 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a027be5-3ed5-417a-a25f-0827296e23b1-utilities\") pod \"certified-operators-2ks6d\" (UID: \"7a027be5-3ed5-417a-a25f-0827296e23b1\") " pod="openshift-marketplace/certified-operators-2ks6d" Oct 03 13:41:37 crc kubenswrapper[4962]: I1003 13:41:37.588669 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdfzb\" (UniqueName: \"kubernetes.io/projected/7a027be5-3ed5-417a-a25f-0827296e23b1-kube-api-access-qdfzb\") pod \"certified-operators-2ks6d\" (UID: \"7a027be5-3ed5-417a-a25f-0827296e23b1\") " pod="openshift-marketplace/certified-operators-2ks6d" Oct 03 13:41:37 crc kubenswrapper[4962]: I1003 13:41:37.718059 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2ks6d" Oct 03 13:41:38 crc kubenswrapper[4962]: I1003 13:41:38.247102 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2ks6d"] Oct 03 13:41:38 crc kubenswrapper[4962]: W1003 13:41:38.254964 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a027be5_3ed5_417a_a25f_0827296e23b1.slice/crio-adb605b61ecaf0666bbeb19bc2c95886eb4adaa64a79e2870506409a347d0bf4 WatchSource:0}: Error finding container adb605b61ecaf0666bbeb19bc2c95886eb4adaa64a79e2870506409a347d0bf4: Status 404 returned error can't find the container with id adb605b61ecaf0666bbeb19bc2c95886eb4adaa64a79e2870506409a347d0bf4 Oct 03 13:41:38 crc kubenswrapper[4962]: I1003 13:41:38.376622 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2ks6d" event={"ID":"7a027be5-3ed5-417a-a25f-0827296e23b1","Type":"ContainerStarted","Data":"adb605b61ecaf0666bbeb19bc2c95886eb4adaa64a79e2870506409a347d0bf4"} Oct 03 13:41:39 crc kubenswrapper[4962]: I1003 13:41:39.387507 4962 generic.go:334] "Generic (PLEG): container finished" podID="7a027be5-3ed5-417a-a25f-0827296e23b1" containerID="541a2a046343d48246a535189962d969e42b689240556c8086a1abaf8cd4905a" exitCode=0 Oct 03 13:41:39 crc kubenswrapper[4962]: I1003 13:41:39.387624 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2ks6d" event={"ID":"7a027be5-3ed5-417a-a25f-0827296e23b1","Type":"ContainerDied","Data":"541a2a046343d48246a535189962d969e42b689240556c8086a1abaf8cd4905a"} Oct 03 13:41:41 crc kubenswrapper[4962]: I1003 13:41:41.409298 4962 generic.go:334] "Generic (PLEG): container finished" 
podID="7a027be5-3ed5-417a-a25f-0827296e23b1" containerID="14876d3b983e641ce748721d3b6f6a9a904de0e8b8f9a72a2ba04d6c1f7ecbf2" exitCode=0 Oct 03 13:41:41 crc kubenswrapper[4962]: I1003 13:41:41.409383 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2ks6d" event={"ID":"7a027be5-3ed5-417a-a25f-0827296e23b1","Type":"ContainerDied","Data":"14876d3b983e641ce748721d3b6f6a9a904de0e8b8f9a72a2ba04d6c1f7ecbf2"} Oct 03 13:41:42 crc kubenswrapper[4962]: I1003 13:41:42.421095 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2ks6d" event={"ID":"7a027be5-3ed5-417a-a25f-0827296e23b1","Type":"ContainerStarted","Data":"db7d0c1e24a6712c22a6653f0f8b9721dce3aedcd18761c2262d9fa36c6d2c4a"} Oct 03 13:41:42 crc kubenswrapper[4962]: I1003 13:41:42.450455 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2ks6d" podStartSLOduration=3.040212987 podStartE2EDuration="5.450432065s" podCreationTimestamp="2025-10-03 13:41:37 +0000 UTC" firstStartedPulling="2025-10-03 13:41:39.389479467 +0000 UTC m=+3107.793377302" lastFinishedPulling="2025-10-03 13:41:41.799698545 +0000 UTC m=+3110.203596380" observedRunningTime="2025-10-03 13:41:42.444315071 +0000 UTC m=+3110.848212926" watchObservedRunningTime="2025-10-03 13:41:42.450432065 +0000 UTC m=+3110.854329910" Oct 03 13:41:47 crc kubenswrapper[4962]: I1003 13:41:47.719252 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2ks6d" Oct 03 13:41:47 crc kubenswrapper[4962]: I1003 13:41:47.719738 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2ks6d" Oct 03 13:41:47 crc kubenswrapper[4962]: I1003 13:41:47.794344 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2ks6d" Oct 03 13:41:48 crc kubenswrapper[4962]: I1003 13:41:48.526597 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2ks6d" Oct 03 13:41:48 crc kubenswrapper[4962]: I1003 13:41:48.580556 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2ks6d"] Oct 03 13:41:49 crc kubenswrapper[4962]: I1003 13:41:49.227140 4962 scope.go:117] "RemoveContainer" containerID="5f1b71cb09022d1fd0c2465c42529dafb255f5a59b613a1eef7c69bcd687fd99" Oct 03 13:41:49 crc kubenswrapper[4962]: E1003 13:41:49.227410 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:41:50 crc kubenswrapper[4962]: I1003 13:41:50.482561 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2ks6d" podUID="7a027be5-3ed5-417a-a25f-0827296e23b1" containerName="registry-server" containerID="cri-o://db7d0c1e24a6712c22a6653f0f8b9721dce3aedcd18761c2262d9fa36c6d2c4a" gracePeriod=2 Oct 03 13:41:50 crc kubenswrapper[4962]: I1003 13:41:50.874244 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2ks6d" Oct 03 13:41:50 crc kubenswrapper[4962]: I1003 13:41:50.963149 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdfzb\" (UniqueName: \"kubernetes.io/projected/7a027be5-3ed5-417a-a25f-0827296e23b1-kube-api-access-qdfzb\") pod \"7a027be5-3ed5-417a-a25f-0827296e23b1\" (UID: \"7a027be5-3ed5-417a-a25f-0827296e23b1\") " Oct 03 13:41:50 crc kubenswrapper[4962]: I1003 13:41:50.963252 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a027be5-3ed5-417a-a25f-0827296e23b1-utilities\") pod \"7a027be5-3ed5-417a-a25f-0827296e23b1\" (UID: \"7a027be5-3ed5-417a-a25f-0827296e23b1\") " Oct 03 13:41:50 crc kubenswrapper[4962]: I1003 13:41:50.963336 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a027be5-3ed5-417a-a25f-0827296e23b1-catalog-content\") pod \"7a027be5-3ed5-417a-a25f-0827296e23b1\" (UID: \"7a027be5-3ed5-417a-a25f-0827296e23b1\") " Oct 03 13:41:50 crc kubenswrapper[4962]: I1003 13:41:50.964294 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a027be5-3ed5-417a-a25f-0827296e23b1-utilities" (OuterVolumeSpecName: "utilities") pod "7a027be5-3ed5-417a-a25f-0827296e23b1" (UID: "7a027be5-3ed5-417a-a25f-0827296e23b1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:41:50 crc kubenswrapper[4962]: I1003 13:41:50.972233 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a027be5-3ed5-417a-a25f-0827296e23b1-kube-api-access-qdfzb" (OuterVolumeSpecName: "kube-api-access-qdfzb") pod "7a027be5-3ed5-417a-a25f-0827296e23b1" (UID: "7a027be5-3ed5-417a-a25f-0827296e23b1"). InnerVolumeSpecName "kube-api-access-qdfzb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:41:51 crc kubenswrapper[4962]: I1003 13:41:51.065439 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdfzb\" (UniqueName: \"kubernetes.io/projected/7a027be5-3ed5-417a-a25f-0827296e23b1-kube-api-access-qdfzb\") on node \"crc\" DevicePath \"\"" Oct 03 13:41:51 crc kubenswrapper[4962]: I1003 13:41:51.065484 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a027be5-3ed5-417a-a25f-0827296e23b1-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 13:41:51 crc kubenswrapper[4962]: I1003 13:41:51.434431 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a027be5-3ed5-417a-a25f-0827296e23b1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7a027be5-3ed5-417a-a25f-0827296e23b1" (UID: "7a027be5-3ed5-417a-a25f-0827296e23b1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:41:51 crc kubenswrapper[4962]: I1003 13:41:51.475137 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a027be5-3ed5-417a-a25f-0827296e23b1-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 13:41:51 crc kubenswrapper[4962]: I1003 13:41:51.495391 4962 generic.go:334] "Generic (PLEG): container finished" podID="7a027be5-3ed5-417a-a25f-0827296e23b1" containerID="db7d0c1e24a6712c22a6653f0f8b9721dce3aedcd18761c2262d9fa36c6d2c4a" exitCode=0 Oct 03 13:41:51 crc kubenswrapper[4962]: I1003 13:41:51.495446 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2ks6d" event={"ID":"7a027be5-3ed5-417a-a25f-0827296e23b1","Type":"ContainerDied","Data":"db7d0c1e24a6712c22a6653f0f8b9721dce3aedcd18761c2262d9fa36c6d2c4a"} Oct 03 13:41:51 crc kubenswrapper[4962]: I1003 13:41:51.495494 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2ks6d" Oct 03 13:41:51 crc kubenswrapper[4962]: I1003 13:41:51.495532 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2ks6d" event={"ID":"7a027be5-3ed5-417a-a25f-0827296e23b1","Type":"ContainerDied","Data":"adb605b61ecaf0666bbeb19bc2c95886eb4adaa64a79e2870506409a347d0bf4"} Oct 03 13:41:51 crc kubenswrapper[4962]: I1003 13:41:51.495559 4962 scope.go:117] "RemoveContainer" containerID="db7d0c1e24a6712c22a6653f0f8b9721dce3aedcd18761c2262d9fa36c6d2c4a" Oct 03 13:41:51 crc kubenswrapper[4962]: I1003 13:41:51.518996 4962 scope.go:117] "RemoveContainer" containerID="14876d3b983e641ce748721d3b6f6a9a904de0e8b8f9a72a2ba04d6c1f7ecbf2" Oct 03 13:41:51 crc kubenswrapper[4962]: I1003 13:41:51.549597 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2ks6d"] Oct 03 13:41:51 crc kubenswrapper[4962]: I1003 13:41:51.549910 4962 scope.go:117] "RemoveContainer" containerID="541a2a046343d48246a535189962d969e42b689240556c8086a1abaf8cd4905a" Oct 03 13:41:51 crc kubenswrapper[4962]: I1003 13:41:51.558622 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2ks6d"] Oct 03 13:41:51 crc kubenswrapper[4962]: I1003 13:41:51.585031 4962 scope.go:117] "RemoveContainer" containerID="db7d0c1e24a6712c22a6653f0f8b9721dce3aedcd18761c2262d9fa36c6d2c4a" Oct 03 13:41:51 crc kubenswrapper[4962]: E1003 13:41:51.585720 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db7d0c1e24a6712c22a6653f0f8b9721dce3aedcd18761c2262d9fa36c6d2c4a\": container with ID starting with db7d0c1e24a6712c22a6653f0f8b9721dce3aedcd18761c2262d9fa36c6d2c4a not found: ID does not exist" containerID="db7d0c1e24a6712c22a6653f0f8b9721dce3aedcd18761c2262d9fa36c6d2c4a" Oct 03 13:41:51 crc kubenswrapper[4962]: I1003 13:41:51.585862 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db7d0c1e24a6712c22a6653f0f8b9721dce3aedcd18761c2262d9fa36c6d2c4a"} err="failed to get container status \"db7d0c1e24a6712c22a6653f0f8b9721dce3aedcd18761c2262d9fa36c6d2c4a\": rpc error: code = NotFound desc = could not find container \"db7d0c1e24a6712c22a6653f0f8b9721dce3aedcd18761c2262d9fa36c6d2c4a\": container with ID starting with db7d0c1e24a6712c22a6653f0f8b9721dce3aedcd18761c2262d9fa36c6d2c4a not found: ID does not exist" Oct 03 
13:41:51 crc kubenswrapper[4962]: I1003 13:41:51.585908 4962 scope.go:117] "RemoveContainer" containerID="14876d3b983e641ce748721d3b6f6a9a904de0e8b8f9a72a2ba04d6c1f7ecbf2" Oct 03 13:41:51 crc kubenswrapper[4962]: E1003 13:41:51.586415 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14876d3b983e641ce748721d3b6f6a9a904de0e8b8f9a72a2ba04d6c1f7ecbf2\": container with ID starting with 14876d3b983e641ce748721d3b6f6a9a904de0e8b8f9a72a2ba04d6c1f7ecbf2 not found: ID does not exist" containerID="14876d3b983e641ce748721d3b6f6a9a904de0e8b8f9a72a2ba04d6c1f7ecbf2" Oct 03 13:41:51 crc kubenswrapper[4962]: I1003 13:41:51.586493 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14876d3b983e641ce748721d3b6f6a9a904de0e8b8f9a72a2ba04d6c1f7ecbf2"} err="failed to get container status \"14876d3b983e641ce748721d3b6f6a9a904de0e8b8f9a72a2ba04d6c1f7ecbf2\": rpc error: code = NotFound desc = could not find container \"14876d3b983e641ce748721d3b6f6a9a904de0e8b8f9a72a2ba04d6c1f7ecbf2\": container with ID starting with 14876d3b983e641ce748721d3b6f6a9a904de0e8b8f9a72a2ba04d6c1f7ecbf2 not found: ID does not exist" Oct 03 13:41:51 crc kubenswrapper[4962]: I1003 13:41:51.586530 4962 scope.go:117] "RemoveContainer" containerID="541a2a046343d48246a535189962d969e42b689240556c8086a1abaf8cd4905a" Oct 03 13:41:51 crc kubenswrapper[4962]: E1003 13:41:51.587308 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"541a2a046343d48246a535189962d969e42b689240556c8086a1abaf8cd4905a\": container with ID starting with 541a2a046343d48246a535189962d969e42b689240556c8086a1abaf8cd4905a not found: ID does not exist" containerID="541a2a046343d48246a535189962d969e42b689240556c8086a1abaf8cd4905a" Oct 03 13:41:51 crc kubenswrapper[4962]: I1003 13:41:51.587347 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"541a2a046343d48246a535189962d969e42b689240556c8086a1abaf8cd4905a"} err="failed to get container status \"541a2a046343d48246a535189962d969e42b689240556c8086a1abaf8cd4905a\": rpc error: code = NotFound desc = could not find container \"541a2a046343d48246a535189962d969e42b689240556c8086a1abaf8cd4905a\": container with ID starting with 541a2a046343d48246a535189962d969e42b689240556c8086a1abaf8cd4905a not found: ID does not exist" Oct 03 13:41:52 crc kubenswrapper[4962]: I1003 13:41:52.237322 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a027be5-3ed5-417a-a25f-0827296e23b1" path="/var/lib/kubelet/pods/7a027be5-3ed5-417a-a25f-0827296e23b1/volumes" Oct 03 13:42:02 crc kubenswrapper[4962]: I1003 13:42:02.230545 4962 scope.go:117] "RemoveContainer" containerID="5f1b71cb09022d1fd0c2465c42529dafb255f5a59b613a1eef7c69bcd687fd99" Oct 03 13:42:02 crc kubenswrapper[4962]: E1003 13:42:02.231390 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:42:13 crc kubenswrapper[4962]: I1003 13:42:13.227288 4962 scope.go:117] "RemoveContainer" 
containerID="5f1b71cb09022d1fd0c2465c42529dafb255f5a59b613a1eef7c69bcd687fd99" Oct 03 13:42:13 crc kubenswrapper[4962]: E1003 13:42:13.228259 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:42:25 crc kubenswrapper[4962]: I1003 13:42:25.227388 4962 scope.go:117] "RemoveContainer" containerID="5f1b71cb09022d1fd0c2465c42529dafb255f5a59b613a1eef7c69bcd687fd99" Oct 03 13:42:25 crc kubenswrapper[4962]: E1003 13:42:25.228903 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:42:38 crc kubenswrapper[4962]: I1003 13:42:38.227609 4962 scope.go:117] "RemoveContainer" containerID="5f1b71cb09022d1fd0c2465c42529dafb255f5a59b613a1eef7c69bcd687fd99" Oct 03 13:42:38 crc kubenswrapper[4962]: E1003 13:42:38.228726 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:42:52 crc kubenswrapper[4962]: I1003 13:42:52.233524 4962 scope.go:117] "RemoveContainer" containerID="5f1b71cb09022d1fd0c2465c42529dafb255f5a59b613a1eef7c69bcd687fd99" Oct 03 13:42:52 crc kubenswrapper[4962]: E1003 13:42:52.236665 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:43:06 crc kubenswrapper[4962]: I1003 13:43:06.227796 4962 scope.go:117] "RemoveContainer" containerID="5f1b71cb09022d1fd0c2465c42529dafb255f5a59b613a1eef7c69bcd687fd99" Oct 03 13:43:06 crc kubenswrapper[4962]: E1003 13:43:06.228581 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:43:19 crc kubenswrapper[4962]: I1003 13:43:19.227750 4962 scope.go:117] "RemoveContainer" containerID="5f1b71cb09022d1fd0c2465c42529dafb255f5a59b613a1eef7c69bcd687fd99" Oct 03 13:43:19 crc kubenswrapper[4962]: E1003 13:43:19.228767 4962 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:43:34 crc kubenswrapper[4962]: I1003 13:43:34.227066 4962 scope.go:117] "RemoveContainer" containerID="5f1b71cb09022d1fd0c2465c42529dafb255f5a59b613a1eef7c69bcd687fd99" Oct 03 13:43:34 crc kubenswrapper[4962]: E1003 13:43:34.227687 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:43:46 crc kubenswrapper[4962]: I1003 13:43:46.227713 4962 scope.go:117] "RemoveContainer" containerID="5f1b71cb09022d1fd0c2465c42529dafb255f5a59b613a1eef7c69bcd687fd99" Oct 03 13:43:46 crc kubenswrapper[4962]: E1003 13:43:46.228524 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:44:00 crc kubenswrapper[4962]: I1003 13:44:00.227745 4962 scope.go:117] "RemoveContainer" containerID="5f1b71cb09022d1fd0c2465c42529dafb255f5a59b613a1eef7c69bcd687fd99" Oct 03 13:44:00 crc kubenswrapper[4962]: I1003 13:44:00.664230 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerStarted","Data":"0d39d9f56ff63935482e31bd27280765398e7f4b806ae043c546d37ced6acc15"} Oct 03 13:44:22 crc kubenswrapper[4962]: I1003 13:44:22.835609 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bdk2t"] Oct 03 13:44:22 crc kubenswrapper[4962]: E1003 13:44:22.836537 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a027be5-3ed5-417a-a25f-0827296e23b1" containerName="registry-server" Oct 03 13:44:22 crc kubenswrapper[4962]: I1003 13:44:22.836552 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a027be5-3ed5-417a-a25f-0827296e23b1" containerName="registry-server" Oct 03 13:44:22 crc kubenswrapper[4962]: E1003 13:44:22.836579 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a027be5-3ed5-417a-a25f-0827296e23b1" containerName="extract-utilities" Oct 03 13:44:22 crc kubenswrapper[4962]: I1003 13:44:22.836586 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a027be5-3ed5-417a-a25f-0827296e23b1" containerName="extract-utilities" Oct 03 13:44:22 crc kubenswrapper[4962]: E1003 13:44:22.836602 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a027be5-3ed5-417a-a25f-0827296e23b1" containerName="extract-content" Oct 03 13:44:22 crc kubenswrapper[4962]: I1003 13:44:22.836609 4962 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7a027be5-3ed5-417a-a25f-0827296e23b1" containerName="extract-content" Oct 03 13:44:22 crc kubenswrapper[4962]: I1003 13:44:22.836986 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a027be5-3ed5-417a-a25f-0827296e23b1" containerName="registry-server" Oct 03 13:44:22 crc kubenswrapper[4962]: I1003 13:44:22.838185 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bdk2t" Oct 03 13:44:22 crc kubenswrapper[4962]: I1003 13:44:22.860388 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bdk2t"] Oct 03 13:44:22 crc kubenswrapper[4962]: I1003 13:44:22.978691 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx6vh\" (UniqueName: \"kubernetes.io/projected/deb92d8d-6259-4aa1-b575-ba1e2982be8b-kube-api-access-mx6vh\") pod \"redhat-marketplace-bdk2t\" (UID: \"deb92d8d-6259-4aa1-b575-ba1e2982be8b\") " pod="openshift-marketplace/redhat-marketplace-bdk2t" Oct 03 13:44:22 crc kubenswrapper[4962]: I1003 13:44:22.978780 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deb92d8d-6259-4aa1-b575-ba1e2982be8b-catalog-content\") pod \"redhat-marketplace-bdk2t\" (UID: \"deb92d8d-6259-4aa1-b575-ba1e2982be8b\") " pod="openshift-marketplace/redhat-marketplace-bdk2t" Oct 03 13:44:22 crc kubenswrapper[4962]: I1003 13:44:22.978903 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deb92d8d-6259-4aa1-b575-ba1e2982be8b-utilities\") pod \"redhat-marketplace-bdk2t\" (UID: \"deb92d8d-6259-4aa1-b575-ba1e2982be8b\") " pod="openshift-marketplace/redhat-marketplace-bdk2t" Oct 03 13:44:23 crc kubenswrapper[4962]: I1003 13:44:23.080223 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mx6vh\" (UniqueName: \"kubernetes.io/projected/deb92d8d-6259-4aa1-b575-ba1e2982be8b-kube-api-access-mx6vh\") pod \"redhat-marketplace-bdk2t\" (UID: \"deb92d8d-6259-4aa1-b575-ba1e2982be8b\") " pod="openshift-marketplace/redhat-marketplace-bdk2t" Oct 03 13:44:23 crc kubenswrapper[4962]: I1003 13:44:23.080341 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deb92d8d-6259-4aa1-b575-ba1e2982be8b-catalog-content\") pod \"redhat-marketplace-bdk2t\" (UID: \"deb92d8d-6259-4aa1-b575-ba1e2982be8b\") " pod="openshift-marketplace/redhat-marketplace-bdk2t" Oct 03 13:44:23 crc kubenswrapper[4962]: I1003 13:44:23.080391 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deb92d8d-6259-4aa1-b575-ba1e2982be8b-utilities\") pod \"redhat-marketplace-bdk2t\" (UID: \"deb92d8d-6259-4aa1-b575-ba1e2982be8b\") " pod="openshift-marketplace/redhat-marketplace-bdk2t" Oct 03 13:44:23 crc kubenswrapper[4962]: I1003 13:44:23.081227 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deb92d8d-6259-4aa1-b575-ba1e2982be8b-catalog-content\") pod \"redhat-marketplace-bdk2t\" (UID: \"deb92d8d-6259-4aa1-b575-ba1e2982be8b\") " pod="openshift-marketplace/redhat-marketplace-bdk2t" Oct 03 13:44:23 crc kubenswrapper[4962]: I1003 13:44:23.081233 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deb92d8d-6259-4aa1-b575-ba1e2982be8b-utilities\") pod \"redhat-marketplace-bdk2t\" (UID: \"deb92d8d-6259-4aa1-b575-ba1e2982be8b\") " pod="openshift-marketplace/redhat-marketplace-bdk2t" Oct 03 13:44:23 crc kubenswrapper[4962]: I1003 13:44:23.102164 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx6vh\" (UniqueName: \"kubernetes.io/projected/deb92d8d-6259-4aa1-b575-ba1e2982be8b-kube-api-access-mx6vh\") pod \"redhat-marketplace-bdk2t\" (UID: \"deb92d8d-6259-4aa1-b575-ba1e2982be8b\") " pod="openshift-marketplace/redhat-marketplace-bdk2t" Oct 03 13:44:23 crc kubenswrapper[4962]: I1003 13:44:23.165346 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bdk2t" Oct 03 13:44:23 crc kubenswrapper[4962]: I1003 13:44:23.586727 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bdk2t"] Oct 03 13:44:23 crc kubenswrapper[4962]: I1003 13:44:23.853307 4962 generic.go:334] "Generic (PLEG): container finished" podID="deb92d8d-6259-4aa1-b575-ba1e2982be8b" containerID="e2ce505818f7573631448346f83dd8d785113e436ae67d9b9f3a779166d42f1d" exitCode=0 Oct 03 13:44:23 crc kubenswrapper[4962]: I1003 13:44:23.853765 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bdk2t" event={"ID":"deb92d8d-6259-4aa1-b575-ba1e2982be8b","Type":"ContainerDied","Data":"e2ce505818f7573631448346f83dd8d785113e436ae67d9b9f3a779166d42f1d"} Oct 03 13:44:23 crc kubenswrapper[4962]: I1003 13:44:23.853800 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bdk2t" event={"ID":"deb92d8d-6259-4aa1-b575-ba1e2982be8b","Type":"ContainerStarted","Data":"a48efdb1225a515fd55c48a4a4c19c7ac45bd2483380ad00ccfd464527f3d7a5"} Oct 03 13:44:25 crc kubenswrapper[4962]: I1003 13:44:25.872361 4962 generic.go:334] "Generic (PLEG): container finished" podID="deb92d8d-6259-4aa1-b575-ba1e2982be8b" containerID="cd619c9c792f5669991c6b000126585699d5c8b1bf46f28cbdbbabe223be841f" exitCode=0 Oct 03 13:44:25 crc kubenswrapper[4962]: I1003 13:44:25.872482 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bdk2t" event={"ID":"deb92d8d-6259-4aa1-b575-ba1e2982be8b","Type":"ContainerDied","Data":"cd619c9c792f5669991c6b000126585699d5c8b1bf46f28cbdbbabe223be841f"} Oct 03 13:44:26 crc kubenswrapper[4962]: I1003 13:44:26.885731 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bdk2t" event={"ID":"deb92d8d-6259-4aa1-b575-ba1e2982be8b","Type":"ContainerStarted","Data":"f48876f6a68efb6f2953c04c66aa48bb17b893ee189abf80333a748ea5cb1de7"} Oct 03 13:44:26 crc kubenswrapper[4962]: I1003 13:44:26.909167 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bdk2t" podStartSLOduration=2.405624469 podStartE2EDuration="4.909145765s" podCreationTimestamp="2025-10-03 13:44:22 +0000 UTC" firstStartedPulling="2025-10-03 13:44:23.8560915 +0000 UTC m=+3272.259989335" lastFinishedPulling="2025-10-03 13:44:26.359612796 +0000 UTC m=+3274.763510631" observedRunningTime="2025-10-03 13:44:26.907948063 +0000 UTC m=+3275.311845918" watchObservedRunningTime="2025-10-03 13:44:26.909145765 +0000 UTC m=+3275.313043600" Oct 03 13:44:33 crc kubenswrapper[4962]: I1003 13:44:33.166547 4962 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bdk2t" Oct 03 13:44:33 crc kubenswrapper[4962]: I1003 13:44:33.166961 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bdk2t" Oct 03 13:44:33 crc kubenswrapper[4962]: I1003 13:44:33.231827 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bdk2t" Oct 03 13:44:34 crc kubenswrapper[4962]: I1003 13:44:34.004835 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bdk2t" Oct 03 13:44:34 crc kubenswrapper[4962]: I1003 13:44:34.082362 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bdk2t"] Oct 03 13:44:35 crc kubenswrapper[4962]: I1003 13:44:35.946506 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bdk2t" podUID="deb92d8d-6259-4aa1-b575-ba1e2982be8b" containerName="registry-server" containerID="cri-o://f48876f6a68efb6f2953c04c66aa48bb17b893ee189abf80333a748ea5cb1de7" gracePeriod=2 Oct 03 13:44:36 crc kubenswrapper[4962]: I1003 13:44:36.324504 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bdk2t" Oct 03 13:44:36 crc kubenswrapper[4962]: I1003 13:44:36.475718 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deb92d8d-6259-4aa1-b575-ba1e2982be8b-utilities\") pod \"deb92d8d-6259-4aa1-b575-ba1e2982be8b\" (UID: \"deb92d8d-6259-4aa1-b575-ba1e2982be8b\") " Oct 03 13:44:36 crc kubenswrapper[4962]: I1003 13:44:36.475938 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deb92d8d-6259-4aa1-b575-ba1e2982be8b-catalog-content\") pod \"deb92d8d-6259-4aa1-b575-ba1e2982be8b\" (UID: \"deb92d8d-6259-4aa1-b575-ba1e2982be8b\") " Oct 03 13:44:36 crc kubenswrapper[4962]: I1003 13:44:36.475965 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mx6vh\" (UniqueName: \"kubernetes.io/projected/deb92d8d-6259-4aa1-b575-ba1e2982be8b-kube-api-access-mx6vh\") pod \"deb92d8d-6259-4aa1-b575-ba1e2982be8b\" (UID: \"deb92d8d-6259-4aa1-b575-ba1e2982be8b\") " Oct 03 13:44:36 crc kubenswrapper[4962]: I1003 13:44:36.477818 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/deb92d8d-6259-4aa1-b575-ba1e2982be8b-utilities" (OuterVolumeSpecName: "utilities") pod "deb92d8d-6259-4aa1-b575-ba1e2982be8b" (UID: "deb92d8d-6259-4aa1-b575-ba1e2982be8b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:44:36 crc kubenswrapper[4962]: I1003 13:44:36.481309 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deb92d8d-6259-4aa1-b575-ba1e2982be8b-kube-api-access-mx6vh" (OuterVolumeSpecName: "kube-api-access-mx6vh") pod "deb92d8d-6259-4aa1-b575-ba1e2982be8b" (UID: "deb92d8d-6259-4aa1-b575-ba1e2982be8b"). InnerVolumeSpecName "kube-api-access-mx6vh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:44:36 crc kubenswrapper[4962]: I1003 13:44:36.491840 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/deb92d8d-6259-4aa1-b575-ba1e2982be8b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "deb92d8d-6259-4aa1-b575-ba1e2982be8b" (UID: "deb92d8d-6259-4aa1-b575-ba1e2982be8b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:44:36 crc kubenswrapper[4962]: I1003 13:44:36.577395 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deb92d8d-6259-4aa1-b575-ba1e2982be8b-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 13:44:36 crc kubenswrapper[4962]: I1003 13:44:36.577711 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deb92d8d-6259-4aa1-b575-ba1e2982be8b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 13:44:36 crc kubenswrapper[4962]: I1003 13:44:36.577728 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mx6vh\" (UniqueName: \"kubernetes.io/projected/deb92d8d-6259-4aa1-b575-ba1e2982be8b-kube-api-access-mx6vh\") on node \"crc\" DevicePath \"\"" Oct 03 13:44:36 crc kubenswrapper[4962]: I1003 13:44:36.955152 4962 generic.go:334] "Generic (PLEG): container finished" podID="deb92d8d-6259-4aa1-b575-ba1e2982be8b" containerID="f48876f6a68efb6f2953c04c66aa48bb17b893ee189abf80333a748ea5cb1de7" exitCode=0 Oct 03 13:44:36 crc kubenswrapper[4962]: I1003 13:44:36.955195 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bdk2t" event={"ID":"deb92d8d-6259-4aa1-b575-ba1e2982be8b","Type":"ContainerDied","Data":"f48876f6a68efb6f2953c04c66aa48bb17b893ee189abf80333a748ea5cb1de7"} Oct 03 13:44:36 crc kubenswrapper[4962]: I1003 13:44:36.955221 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bdk2t" event={"ID":"deb92d8d-6259-4aa1-b575-ba1e2982be8b","Type":"ContainerDied","Data":"a48efdb1225a515fd55c48a4a4c19c7ac45bd2483380ad00ccfd464527f3d7a5"} Oct 03 13:44:36 crc kubenswrapper[4962]: I1003 13:44:36.955238 4962 scope.go:117] "RemoveContainer" containerID="f48876f6a68efb6f2953c04c66aa48bb17b893ee189abf80333a748ea5cb1de7" Oct 03 13:44:36 crc kubenswrapper[4962]: I1003 13:44:36.955357 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bdk2t" Oct 03 13:44:36 crc kubenswrapper[4962]: I1003 13:44:36.977946 4962 scope.go:117] "RemoveContainer" containerID="cd619c9c792f5669991c6b000126585699d5c8b1bf46f28cbdbbabe223be841f" Oct 03 13:44:36 crc kubenswrapper[4962]: I1003 13:44:36.998935 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bdk2t"] Oct 03 13:44:37 crc kubenswrapper[4962]: I1003 13:44:37.006087 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bdk2t"] Oct 03 13:44:37 crc kubenswrapper[4962]: I1003 13:44:37.016427 4962 scope.go:117] "RemoveContainer" containerID="e2ce505818f7573631448346f83dd8d785113e436ae67d9b9f3a779166d42f1d" Oct 03 13:44:37 crc kubenswrapper[4962]: I1003 13:44:37.040889 4962 scope.go:117] "RemoveContainer" containerID="f48876f6a68efb6f2953c04c66aa48bb17b893ee189abf80333a748ea5cb1de7" Oct 03 13:44:37 crc kubenswrapper[4962]: E1003 13:44:37.041718 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f48876f6a68efb6f2953c04c66aa48bb17b893ee189abf80333a748ea5cb1de7\": container with ID starting with f48876f6a68efb6f2953c04c66aa48bb17b893ee189abf80333a748ea5cb1de7 not found: ID does not exist" containerID="f48876f6a68efb6f2953c04c66aa48bb17b893ee189abf80333a748ea5cb1de7" Oct 03 13:44:37 crc kubenswrapper[4962]: I1003 13:44:37.041749 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f48876f6a68efb6f2953c04c66aa48bb17b893ee189abf80333a748ea5cb1de7"} err="failed to get container status \"f48876f6a68efb6f2953c04c66aa48bb17b893ee189abf80333a748ea5cb1de7\": rpc error: code = NotFound desc = could not find container \"f48876f6a68efb6f2953c04c66aa48bb17b893ee189abf80333a748ea5cb1de7\": container with ID starting with f48876f6a68efb6f2953c04c66aa48bb17b893ee189abf80333a748ea5cb1de7 not found: ID does not exist" Oct 03 13:44:37 crc kubenswrapper[4962]: I1003 13:44:37.041807 4962 scope.go:117] "RemoveContainer" containerID="cd619c9c792f5669991c6b000126585699d5c8b1bf46f28cbdbbabe223be841f" Oct 03 13:44:37 crc kubenswrapper[4962]: E1003 13:44:37.042053 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd619c9c792f5669991c6b000126585699d5c8b1bf46f28cbdbbabe223be841f\": container with ID starting with cd619c9c792f5669991c6b000126585699d5c8b1bf46f28cbdbbabe223be841f not found: ID does not exist" containerID="cd619c9c792f5669991c6b000126585699d5c8b1bf46f28cbdbbabe223be841f" Oct 03 13:44:37 crc kubenswrapper[4962]: I1003 13:44:37.042071 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd619c9c792f5669991c6b000126585699d5c8b1bf46f28cbdbbabe223be841f"} err="failed to get container status \"cd619c9c792f5669991c6b000126585699d5c8b1bf46f28cbdbbabe223be841f\": rpc error: code = NotFound desc = could not find container \"cd619c9c792f5669991c6b000126585699d5c8b1bf46f28cbdbbabe223be841f\": container with ID starting with cd619c9c792f5669991c6b000126585699d5c8b1bf46f28cbdbbabe223be841f not found: ID does not exist" Oct 03 13:44:37 crc kubenswrapper[4962]: I1003 13:44:37.042086 4962 scope.go:117] "RemoveContainer" containerID="e2ce505818f7573631448346f83dd8d785113e436ae67d9b9f3a779166d42f1d" Oct 03 13:44:37 crc kubenswrapper[4962]: E1003 13:44:37.042366 4962 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"e2ce505818f7573631448346f83dd8d785113e436ae67d9b9f3a779166d42f1d\": container with ID starting with e2ce505818f7573631448346f83dd8d785113e436ae67d9b9f3a779166d42f1d not found: ID does not exist" containerID="e2ce505818f7573631448346f83dd8d785113e436ae67d9b9f3a779166d42f1d" Oct 03 13:44:37 crc kubenswrapper[4962]: I1003 13:44:37.042387 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2ce505818f7573631448346f83dd8d785113e436ae67d9b9f3a779166d42f1d"} err="failed to get container status \"e2ce505818f7573631448346f83dd8d785113e436ae67d9b9f3a779166d42f1d\": rpc error: code = NotFound desc = could not find container \"e2ce505818f7573631448346f83dd8d785113e436ae67d9b9f3a779166d42f1d\": container with ID starting with e2ce505818f7573631448346f83dd8d785113e436ae67d9b9f3a779166d42f1d not found: ID does not exist" Oct 03 13:44:38 crc kubenswrapper[4962]: I1003 13:44:38.236814 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="deb92d8d-6259-4aa1-b575-ba1e2982be8b" path="/var/lib/kubelet/pods/deb92d8d-6259-4aa1-b575-ba1e2982be8b/volumes" Oct 03 13:44:59 crc kubenswrapper[4962]: I1003 13:44:59.462887 4962 scope.go:117] "RemoveContainer" containerID="dc12ae6dc81c17a4c06eb51922af12c3d36b2561ad1011162e91b495703f9121" Oct 03 13:44:59 crc kubenswrapper[4962]: I1003 13:44:59.489909 4962 scope.go:117] "RemoveContainer" containerID="24b34849c076c11411c3fd5c27ccadaf427f498e7647f30cfe73b993ef22d2b5" Oct 03 13:44:59 crc kubenswrapper[4962]: I1003 13:44:59.512289 4962 scope.go:117] "RemoveContainer" containerID="b004d4535dd26e2b8b542d0c01d4673dd2b07c674907a409d9c41127c5c8117c" Oct 03 13:45:00 crc kubenswrapper[4962]: I1003 13:45:00.208501 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324985-hxxwm"] Oct 03 13:45:00 crc kubenswrapper[4962]: E1003 13:45:00.208995 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deb92d8d-6259-4aa1-b575-ba1e2982be8b" containerName="registry-server" Oct 03 13:45:00 crc kubenswrapper[4962]: I1003 13:45:00.209025 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="deb92d8d-6259-4aa1-b575-ba1e2982be8b" containerName="registry-server" Oct 03 13:45:00 crc kubenswrapper[4962]: E1003 13:45:00.209071 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deb92d8d-6259-4aa1-b575-ba1e2982be8b" containerName="extract-utilities" Oct 03 13:45:00 crc kubenswrapper[4962]: I1003 13:45:00.209086 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="deb92d8d-6259-4aa1-b575-ba1e2982be8b" containerName="extract-utilities" Oct 03 13:45:00 crc kubenswrapper[4962]: E1003 13:45:00.209108 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deb92d8d-6259-4aa1-b575-ba1e2982be8b" containerName="extract-content" Oct 03 13:45:00 crc kubenswrapper[4962]: I1003 13:45:00.209120 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="deb92d8d-6259-4aa1-b575-ba1e2982be8b" containerName="extract-content" Oct 03 13:45:00 crc kubenswrapper[4962]: I1003 13:45:00.209373 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="deb92d8d-6259-4aa1-b575-ba1e2982be8b" containerName="registry-server" Oct 03 13:45:00 crc kubenswrapper[4962]: I1003 13:45:00.210132 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324985-hxxwm" Oct 03 13:45:00 crc kubenswrapper[4962]: I1003 13:45:00.212537 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 13:45:00 crc kubenswrapper[4962]: I1003 13:45:00.217013 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 13:45:00 crc kubenswrapper[4962]: I1003 13:45:00.225383 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324985-hxxwm"] Oct 03 13:45:00 crc kubenswrapper[4962]: I1003 13:45:00.350092 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/39da5925-e1cc-43f5-b67f-5e14f3909a45-secret-volume\") pod \"collect-profiles-29324985-hxxwm\" (UID: \"39da5925-e1cc-43f5-b67f-5e14f3909a45\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324985-hxxwm" Oct 03 13:45:00 crc kubenswrapper[4962]: I1003 13:45:00.350245 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39da5925-e1cc-43f5-b67f-5e14f3909a45-config-volume\") pod \"collect-profiles-29324985-hxxwm\" (UID: \"39da5925-e1cc-43f5-b67f-5e14f3909a45\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324985-hxxwm" Oct 03 13:45:00 crc kubenswrapper[4962]: I1003 13:45:00.350267 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8lhv\" (UniqueName: \"kubernetes.io/projected/39da5925-e1cc-43f5-b67f-5e14f3909a45-kube-api-access-m8lhv\") pod \"collect-profiles-29324985-hxxwm\" (UID: \"39da5925-e1cc-43f5-b67f-5e14f3909a45\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324985-hxxwm" Oct 03 13:45:00 crc kubenswrapper[4962]: I1003 13:45:00.451794 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39da5925-e1cc-43f5-b67f-5e14f3909a45-config-volume\") pod \"collect-profiles-29324985-hxxwm\" (UID: \"39da5925-e1cc-43f5-b67f-5e14f3909a45\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324985-hxxwm" Oct 03 13:45:00 crc kubenswrapper[4962]: I1003 13:45:00.451858 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8lhv\" (UniqueName: \"kubernetes.io/projected/39da5925-e1cc-43f5-b67f-5e14f3909a45-kube-api-access-m8lhv\") pod \"collect-profiles-29324985-hxxwm\" (UID: \"39da5925-e1cc-43f5-b67f-5e14f3909a45\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324985-hxxwm" Oct 03 13:45:00 crc kubenswrapper[4962]: I1003 13:45:00.451902 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/39da5925-e1cc-43f5-b67f-5e14f3909a45-secret-volume\") pod \"collect-profiles-29324985-hxxwm\" (UID: \"39da5925-e1cc-43f5-b67f-5e14f3909a45\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324985-hxxwm" Oct 03 13:45:00 crc kubenswrapper[4962]: I1003 13:45:00.452783 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39da5925-e1cc-43f5-b67f-5e14f3909a45-config-volume\") pod 
\"collect-profiles-29324985-hxxwm\" (UID: \"39da5925-e1cc-43f5-b67f-5e14f3909a45\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324985-hxxwm" Oct 03 13:45:00 crc kubenswrapper[4962]: I1003 13:45:00.457981 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/39da5925-e1cc-43f5-b67f-5e14f3909a45-secret-volume\") pod \"collect-profiles-29324985-hxxwm\" (UID: \"39da5925-e1cc-43f5-b67f-5e14f3909a45\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324985-hxxwm" Oct 03 13:45:00 crc kubenswrapper[4962]: I1003 13:45:00.467331 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8lhv\" (UniqueName: \"kubernetes.io/projected/39da5925-e1cc-43f5-b67f-5e14f3909a45-kube-api-access-m8lhv\") pod \"collect-profiles-29324985-hxxwm\" (UID: \"39da5925-e1cc-43f5-b67f-5e14f3909a45\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324985-hxxwm" Oct 03 13:45:00 crc kubenswrapper[4962]: I1003 13:45:00.546354 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324985-hxxwm" Oct 03 13:45:00 crc kubenswrapper[4962]: I1003 13:45:00.755927 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324985-hxxwm"] Oct 03 13:45:01 crc kubenswrapper[4962]: I1003 13:45:01.157317 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324985-hxxwm" event={"ID":"39da5925-e1cc-43f5-b67f-5e14f3909a45","Type":"ContainerStarted","Data":"343b79e147ce221a8bf78386d77274b50b1a972c873bc997abacdb002ab5f9d1"} Oct 03 13:45:01 crc kubenswrapper[4962]: I1003 13:45:01.157672 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324985-hxxwm" event={"ID":"39da5925-e1cc-43f5-b67f-5e14f3909a45","Type":"ContainerStarted","Data":"e9caf3c544db817bd32edb7929d607c1f3d5d9b0eb8c823bbb3501cdff0dce8d"} Oct 03 13:45:02 crc kubenswrapper[4962]: I1003 13:45:02.166422 4962 generic.go:334] "Generic (PLEG): container finished" podID="39da5925-e1cc-43f5-b67f-5e14f3909a45" containerID="343b79e147ce221a8bf78386d77274b50b1a972c873bc997abacdb002ab5f9d1" exitCode=0 Oct 03 13:45:02 crc kubenswrapper[4962]: I1003 13:45:02.166491 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324985-hxxwm" event={"ID":"39da5925-e1cc-43f5-b67f-5e14f3909a45","Type":"ContainerDied","Data":"343b79e147ce221a8bf78386d77274b50b1a972c873bc997abacdb002ab5f9d1"} Oct 03 13:45:03 crc kubenswrapper[4962]: I1003 13:45:03.436442 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324985-hxxwm" Oct 03 13:45:03 crc kubenswrapper[4962]: I1003 13:45:03.597689 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8lhv\" (UniqueName: \"kubernetes.io/projected/39da5925-e1cc-43f5-b67f-5e14f3909a45-kube-api-access-m8lhv\") pod \"39da5925-e1cc-43f5-b67f-5e14f3909a45\" (UID: \"39da5925-e1cc-43f5-b67f-5e14f3909a45\") " Oct 03 13:45:03 crc kubenswrapper[4962]: I1003 13:45:03.597811 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/39da5925-e1cc-43f5-b67f-5e14f3909a45-secret-volume\") pod \"39da5925-e1cc-43f5-b67f-5e14f3909a45\" (UID: \"39da5925-e1cc-43f5-b67f-5e14f3909a45\") " Oct 03 13:45:03 crc kubenswrapper[4962]: I1003 13:45:03.597977 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39da5925-e1cc-43f5-b67f-5e14f3909a45-config-volume\") pod \"39da5925-e1cc-43f5-b67f-5e14f3909a45\" (UID: \"39da5925-e1cc-43f5-b67f-5e14f3909a45\") " Oct 03 13:45:03 crc kubenswrapper[4962]: I1003 13:45:03.598789 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39da5925-e1cc-43f5-b67f-5e14f3909a45-config-volume" (OuterVolumeSpecName: "config-volume") pod "39da5925-e1cc-43f5-b67f-5e14f3909a45" (UID: "39da5925-e1cc-43f5-b67f-5e14f3909a45"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 13:45:03 crc kubenswrapper[4962]: I1003 13:45:03.603148 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39da5925-e1cc-43f5-b67f-5e14f3909a45-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "39da5925-e1cc-43f5-b67f-5e14f3909a45" (UID: "39da5925-e1cc-43f5-b67f-5e14f3909a45"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 13:45:03 crc kubenswrapper[4962]: I1003 13:45:03.603900 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39da5925-e1cc-43f5-b67f-5e14f3909a45-kube-api-access-m8lhv" (OuterVolumeSpecName: "kube-api-access-m8lhv") pod "39da5925-e1cc-43f5-b67f-5e14f3909a45" (UID: "39da5925-e1cc-43f5-b67f-5e14f3909a45"). InnerVolumeSpecName "kube-api-access-m8lhv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:45:03 crc kubenswrapper[4962]: I1003 13:45:03.699657 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8lhv\" (UniqueName: \"kubernetes.io/projected/39da5925-e1cc-43f5-b67f-5e14f3909a45-kube-api-access-m8lhv\") on node \"crc\" DevicePath \"\"" Oct 03 13:45:03 crc kubenswrapper[4962]: I1003 13:45:03.699687 4962 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/39da5925-e1cc-43f5-b67f-5e14f3909a45-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 13:45:03 crc kubenswrapper[4962]: I1003 13:45:03.699697 4962 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39da5925-e1cc-43f5-b67f-5e14f3909a45-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 13:45:04 crc kubenswrapper[4962]: I1003 13:45:04.182677 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324985-hxxwm" event={"ID":"39da5925-e1cc-43f5-b67f-5e14f3909a45","Type":"ContainerDied","Data":"e9caf3c544db817bd32edb7929d607c1f3d5d9b0eb8c823bbb3501cdff0dce8d"} Oct 03 13:45:04 crc kubenswrapper[4962]: I1003 13:45:04.182726 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324985-hxxwm" Oct 03 13:45:04 crc kubenswrapper[4962]: I1003 13:45:04.182734 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9caf3c544db817bd32edb7929d607c1f3d5d9b0eb8c823bbb3501cdff0dce8d" Oct 03 13:45:04 crc kubenswrapper[4962]: I1003 13:45:04.523346 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324940-khbtr"] Oct 03 13:45:04 crc kubenswrapper[4962]: I1003 13:45:04.529685 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324940-khbtr"] Oct 03 13:45:06 crc kubenswrapper[4962]: I1003 13:45:06.243183 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a581eeb-fbed-4cb2-bd7c-514596ca72df" path="/var/lib/kubelet/pods/9a581eeb-fbed-4cb2-bd7c-514596ca72df/volumes" Oct 03 13:45:59 crc kubenswrapper[4962]: I1003 13:45:59.574325 4962 scope.go:117] "RemoveContainer" containerID="96d804cafd4e44896c85fefafec8011144b1e21e0c46c72f9957d9ef76fdf8f3" Oct 03 13:46:24 crc kubenswrapper[4962]: I1003 13:46:24.661562 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 13:46:24 crc kubenswrapper[4962]: I1003 13:46:24.662345 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 13:46:54 crc kubenswrapper[4962]: I1003 13:46:54.659895 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Oct 03 13:46:54 crc kubenswrapper[4962]: I1003 13:46:54.660391 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 13:47:24 crc kubenswrapper[4962]: I1003 13:47:24.660182 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 13:47:24 crc kubenswrapper[4962]: I1003 13:47:24.661101 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 13:47:24 crc kubenswrapper[4962]: I1003 13:47:24.661189 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-46vck" Oct 03 13:47:24 crc kubenswrapper[4962]: I1003 13:47:24.662318 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0d39d9f56ff63935482e31bd27280765398e7f4b806ae043c546d37ced6acc15"} pod="openshift-machine-config-operator/machine-config-daemon-46vck" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 13:47:24 crc kubenswrapper[4962]: I1003 13:47:24.662404 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" containerID="cri-o://0d39d9f56ff63935482e31bd27280765398e7f4b806ae043c546d37ced6acc15" gracePeriod=600 Oct 03 13:47:25 crc kubenswrapper[4962]: I1003 13:47:25.362919 4962 generic.go:334] "Generic (PLEG): container finished" podID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerID="0d39d9f56ff63935482e31bd27280765398e7f4b806ae043c546d37ced6acc15" exitCode=0 Oct 03 13:47:25 crc kubenswrapper[4962]: I1003 13:47:25.363007 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerDied","Data":"0d39d9f56ff63935482e31bd27280765398e7f4b806ae043c546d37ced6acc15"} Oct 03 13:47:25 crc kubenswrapper[4962]: I1003 13:47:25.363821 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerStarted","Data":"d8246a9247f30fd9cfdb2fa05e4bb1b8083e0fbdbebe283fd86aa9a02ba91160"} Oct 03 13:47:25 crc kubenswrapper[4962]: I1003 13:47:25.363859 4962 scope.go:117] "RemoveContainer" containerID="5f1b71cb09022d1fd0c2465c42529dafb255f5a59b613a1eef7c69bcd687fd99" Oct 03 13:49:54 crc kubenswrapper[4962]: I1003 13:49:54.660250 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 13:49:54 crc kubenswrapper[4962]: I1003 13:49:54.660821 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 13:49:55 crc kubenswrapper[4962]: I1003 13:49:55.557856 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-69g7l"] Oct 03 13:49:55 crc kubenswrapper[4962]: E1003 13:49:55.558248 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39da5925-e1cc-43f5-b67f-5e14f3909a45" containerName="collect-profiles" Oct 03 13:49:55 crc kubenswrapper[4962]: I1003 13:49:55.558262 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="39da5925-e1cc-43f5-b67f-5e14f3909a45" containerName="collect-profiles" Oct 03 13:49:55 crc kubenswrapper[4962]: I1003 13:49:55.558430 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="39da5925-e1cc-43f5-b67f-5e14f3909a45" containerName="collect-profiles" Oct 03 13:49:55 crc kubenswrapper[4962]: I1003 13:49:55.561462 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-69g7l" Oct 03 13:49:55 crc kubenswrapper[4962]: I1003 13:49:55.579137 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-69g7l"] Oct 03 13:49:55 crc kubenswrapper[4962]: I1003 13:49:55.680250 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8a3a9ed-8f34-41f7-a974-6ca9135fa96a-catalog-content\") pod \"redhat-operators-69g7l\" (UID: \"b8a3a9ed-8f34-41f7-a974-6ca9135fa96a\") " pod="openshift-marketplace/redhat-operators-69g7l" Oct 03 13:49:55 crc kubenswrapper[4962]: I1003 13:49:55.680308 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8a3a9ed-8f34-41f7-a974-6ca9135fa96a-utilities\") pod \"redhat-operators-69g7l\" (UID: \"b8a3a9ed-8f34-41f7-a974-6ca9135fa96a\") " pod="openshift-marketplace/redhat-operators-69g7l" Oct 03 13:49:55 crc kubenswrapper[4962]: I1003 13:49:55.680532 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btslv\" (UniqueName: \"kubernetes.io/projected/b8a3a9ed-8f34-41f7-a974-6ca9135fa96a-kube-api-access-btslv\") pod \"redhat-operators-69g7l\" (UID: \"b8a3a9ed-8f34-41f7-a974-6ca9135fa96a\") " pod="openshift-marketplace/redhat-operators-69g7l" Oct 03 13:49:55 crc kubenswrapper[4962]: I1003 13:49:55.781985 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8a3a9ed-8f34-41f7-a974-6ca9135fa96a-catalog-content\") pod \"redhat-operators-69g7l\" (UID: \"b8a3a9ed-8f34-41f7-a974-6ca9135fa96a\") " pod="openshift-marketplace/redhat-operators-69g7l" Oct 03 13:49:55 crc kubenswrapper[4962]: I1003 13:49:55.782049 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8a3a9ed-8f34-41f7-a974-6ca9135fa96a-utilities\") pod \"redhat-operators-69g7l\" (UID: 
\"b8a3a9ed-8f34-41f7-a974-6ca9135fa96a\") " pod="openshift-marketplace/redhat-operators-69g7l" Oct 03 13:49:55 crc kubenswrapper[4962]: I1003 13:49:55.782083 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btslv\" (UniqueName: \"kubernetes.io/projected/b8a3a9ed-8f34-41f7-a974-6ca9135fa96a-kube-api-access-btslv\") pod \"redhat-operators-69g7l\" (UID: \"b8a3a9ed-8f34-41f7-a974-6ca9135fa96a\") " pod="openshift-marketplace/redhat-operators-69g7l" Oct 03 13:49:55 crc kubenswrapper[4962]: I1003 13:49:55.782504 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8a3a9ed-8f34-41f7-a974-6ca9135fa96a-catalog-content\") pod \"redhat-operators-69g7l\" (UID: \"b8a3a9ed-8f34-41f7-a974-6ca9135fa96a\") " pod="openshift-marketplace/redhat-operators-69g7l" Oct 03 13:49:55 crc kubenswrapper[4962]: I1003 13:49:55.782573 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8a3a9ed-8f34-41f7-a974-6ca9135fa96a-utilities\") pod \"redhat-operators-69g7l\" (UID: \"b8a3a9ed-8f34-41f7-a974-6ca9135fa96a\") " pod="openshift-marketplace/redhat-operators-69g7l" Oct 03 13:49:55 crc kubenswrapper[4962]: I1003 13:49:55.806934 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btslv\" (UniqueName: \"kubernetes.io/projected/b8a3a9ed-8f34-41f7-a974-6ca9135fa96a-kube-api-access-btslv\") pod \"redhat-operators-69g7l\" (UID: \"b8a3a9ed-8f34-41f7-a974-6ca9135fa96a\") " pod="openshift-marketplace/redhat-operators-69g7l" Oct 03 13:49:55 crc kubenswrapper[4962]: I1003 13:49:55.881979 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-69g7l" Oct 03 13:49:56 crc kubenswrapper[4962]: I1003 13:49:56.319511 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-69g7l"] Oct 03 13:49:56 crc kubenswrapper[4962]: I1003 13:49:56.787080 4962 generic.go:334] "Generic (PLEG): container finished" podID="b8a3a9ed-8f34-41f7-a974-6ca9135fa96a" containerID="d5f7e2971b6c9d1edf12a025a5a6b00d0cf50abdd0878e199a8540a89fdf8dfd" exitCode=0 Oct 03 13:49:56 crc kubenswrapper[4962]: I1003 13:49:56.787422 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-69g7l" event={"ID":"b8a3a9ed-8f34-41f7-a974-6ca9135fa96a","Type":"ContainerDied","Data":"d5f7e2971b6c9d1edf12a025a5a6b00d0cf50abdd0878e199a8540a89fdf8dfd"} Oct 03 13:49:56 crc kubenswrapper[4962]: I1003 13:49:56.787457 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-69g7l" event={"ID":"b8a3a9ed-8f34-41f7-a974-6ca9135fa96a","Type":"ContainerStarted","Data":"c080c0fce75fcdb5d08f198683326d92d4458f7f11d9121d008a8391a1d05400"} Oct 03 13:49:56 crc kubenswrapper[4962]: I1003 13:49:56.789550 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 13:49:58 crc kubenswrapper[4962]: I1003 13:49:58.807000 4962 generic.go:334] "Generic (PLEG): container finished" podID="b8a3a9ed-8f34-41f7-a974-6ca9135fa96a" containerID="73f6b37dbff7b4cfccc6a90985ea58e4246d9cec5660a4d29e8e372f42b3157b" exitCode=0 Oct 03 13:49:58 crc kubenswrapper[4962]: I1003 13:49:58.807065 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-69g7l" 
event={"ID":"b8a3a9ed-8f34-41f7-a974-6ca9135fa96a","Type":"ContainerDied","Data":"73f6b37dbff7b4cfccc6a90985ea58e4246d9cec5660a4d29e8e372f42b3157b"} Oct 03 13:49:59 crc kubenswrapper[4962]: I1003 13:49:59.818470 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-69g7l" event={"ID":"b8a3a9ed-8f34-41f7-a974-6ca9135fa96a","Type":"ContainerStarted","Data":"aeecb90efd3a27ef52b2545ebb5cb0b45f8f8fe849cafe6f9526eb2cb6db5812"} Oct 03 13:49:59 crc kubenswrapper[4962]: I1003 13:49:59.842422 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-69g7l" podStartSLOduration=2.418159306 podStartE2EDuration="4.842402161s" podCreationTimestamp="2025-10-03 13:49:55 +0000 UTC" firstStartedPulling="2025-10-03 13:49:56.789300375 +0000 UTC m=+3605.193198220" lastFinishedPulling="2025-10-03 13:49:59.21354324 +0000 UTC m=+3607.617441075" observedRunningTime="2025-10-03 13:49:59.839978486 +0000 UTC m=+3608.243876351" watchObservedRunningTime="2025-10-03 13:49:59.842402161 +0000 UTC m=+3608.246299996" Oct 03 13:50:05 crc kubenswrapper[4962]: I1003 13:50:05.882320 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-69g7l" Oct 03 13:50:05 crc kubenswrapper[4962]: I1003 13:50:05.883717 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-69g7l" Oct 03 13:50:05 crc kubenswrapper[4962]: I1003 13:50:05.937541 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-69g7l" Oct 03 13:50:06 crc kubenswrapper[4962]: I1003 13:50:06.941247 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-69g7l" Oct 03 13:50:06 crc kubenswrapper[4962]: I1003 13:50:06.994350 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-69g7l"] Oct 03 13:50:08 crc kubenswrapper[4962]: I1003 13:50:08.885282 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-69g7l" podUID="b8a3a9ed-8f34-41f7-a974-6ca9135fa96a" containerName="registry-server" containerID="cri-o://aeecb90efd3a27ef52b2545ebb5cb0b45f8f8fe849cafe6f9526eb2cb6db5812" gracePeriod=2 Oct 03 13:50:09 crc kubenswrapper[4962]: I1003 13:50:09.898287 4962 generic.go:334] "Generic (PLEG): container finished" podID="b8a3a9ed-8f34-41f7-a974-6ca9135fa96a" containerID="aeecb90efd3a27ef52b2545ebb5cb0b45f8f8fe849cafe6f9526eb2cb6db5812" exitCode=0 Oct 03 13:50:09 crc kubenswrapper[4962]: I1003 13:50:09.898396 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-69g7l" event={"ID":"b8a3a9ed-8f34-41f7-a974-6ca9135fa96a","Type":"ContainerDied","Data":"aeecb90efd3a27ef52b2545ebb5cb0b45f8f8fe849cafe6f9526eb2cb6db5812"} Oct 03 13:50:11 crc kubenswrapper[4962]: I1003 13:50:11.066431 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-69g7l" Oct 03 13:50:11 crc kubenswrapper[4962]: I1003 13:50:11.211718 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8a3a9ed-8f34-41f7-a974-6ca9135fa96a-utilities\") pod \"b8a3a9ed-8f34-41f7-a974-6ca9135fa96a\" (UID: \"b8a3a9ed-8f34-41f7-a974-6ca9135fa96a\") " Oct 03 13:50:11 crc kubenswrapper[4962]: I1003 13:50:11.211758 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btslv\" (UniqueName: \"kubernetes.io/projected/b8a3a9ed-8f34-41f7-a974-6ca9135fa96a-kube-api-access-btslv\") pod \"b8a3a9ed-8f34-41f7-a974-6ca9135fa96a\" (UID: \"b8a3a9ed-8f34-41f7-a974-6ca9135fa96a\") " Oct 03 13:50:11 crc kubenswrapper[4962]: I1003 13:50:11.211814 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8a3a9ed-8f34-41f7-a974-6ca9135fa96a-catalog-content\") pod \"b8a3a9ed-8f34-41f7-a974-6ca9135fa96a\" (UID: \"b8a3a9ed-8f34-41f7-a974-6ca9135fa96a\") " Oct 03 13:50:11 crc kubenswrapper[4962]: I1003 13:50:11.212909 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8a3a9ed-8f34-41f7-a974-6ca9135fa96a-utilities" (OuterVolumeSpecName: "utilities") pod "b8a3a9ed-8f34-41f7-a974-6ca9135fa96a" (UID: "b8a3a9ed-8f34-41f7-a974-6ca9135fa96a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:50:11 crc kubenswrapper[4962]: I1003 13:50:11.219910 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8a3a9ed-8f34-41f7-a974-6ca9135fa96a-kube-api-access-btslv" (OuterVolumeSpecName: "kube-api-access-btslv") pod "b8a3a9ed-8f34-41f7-a974-6ca9135fa96a" (UID: "b8a3a9ed-8f34-41f7-a974-6ca9135fa96a"). InnerVolumeSpecName "kube-api-access-btslv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:50:11 crc kubenswrapper[4962]: I1003 13:50:11.313211 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8a3a9ed-8f34-41f7-a974-6ca9135fa96a-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 13:50:11 crc kubenswrapper[4962]: I1003 13:50:11.313245 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btslv\" (UniqueName: \"kubernetes.io/projected/b8a3a9ed-8f34-41f7-a974-6ca9135fa96a-kube-api-access-btslv\") on node \"crc\" DevicePath \"\"" Oct 03 13:50:11 crc kubenswrapper[4962]: I1003 13:50:11.440318 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8a3a9ed-8f34-41f7-a974-6ca9135fa96a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b8a3a9ed-8f34-41f7-a974-6ca9135fa96a" (UID: "b8a3a9ed-8f34-41f7-a974-6ca9135fa96a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:50:11 crc kubenswrapper[4962]: I1003 13:50:11.516331 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8a3a9ed-8f34-41f7-a974-6ca9135fa96a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 13:50:11 crc kubenswrapper[4962]: I1003 13:50:11.918805 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-69g7l" event={"ID":"b8a3a9ed-8f34-41f7-a974-6ca9135fa96a","Type":"ContainerDied","Data":"c080c0fce75fcdb5d08f198683326d92d4458f7f11d9121d008a8391a1d05400"} Oct 03 13:50:11 crc kubenswrapper[4962]: I1003 13:50:11.918903 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-69g7l" Oct 03 13:50:11 crc kubenswrapper[4962]: I1003 13:50:11.919088 4962 scope.go:117] "RemoveContainer" containerID="aeecb90efd3a27ef52b2545ebb5cb0b45f8f8fe849cafe6f9526eb2cb6db5812" Oct 03 13:50:11 crc kubenswrapper[4962]: I1003 13:50:11.948428 4962 scope.go:117] "RemoveContainer" containerID="73f6b37dbff7b4cfccc6a90985ea58e4246d9cec5660a4d29e8e372f42b3157b" Oct 03 13:50:11 crc kubenswrapper[4962]: I1003 13:50:11.959030 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-69g7l"] Oct 03 13:50:11 crc kubenswrapper[4962]: I1003 13:50:11.963526 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-69g7l"] Oct 03 13:50:11 crc kubenswrapper[4962]: I1003 13:50:11.971789 4962 scope.go:117] "RemoveContainer" containerID="d5f7e2971b6c9d1edf12a025a5a6b00d0cf50abdd0878e199a8540a89fdf8dfd" Oct 03 13:50:12 crc kubenswrapper[4962]: I1003 13:50:12.236433 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8a3a9ed-8f34-41f7-a974-6ca9135fa96a" path="/var/lib/kubelet/pods/b8a3a9ed-8f34-41f7-a974-6ca9135fa96a/volumes" Oct 03 13:50:24 crc kubenswrapper[4962]: I1003 13:50:24.660363 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 13:50:24 crc kubenswrapper[4962]: I1003 13:50:24.660979 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 13:50:54 crc kubenswrapper[4962]: I1003 13:50:54.660188 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 13:50:54 crc kubenswrapper[4962]: I1003 13:50:54.660724 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 13:50:54 crc kubenswrapper[4962]: I1003 13:50:54.660777 4962 kubelet.go:2542] 
"SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-46vck" Oct 03 13:50:54 crc kubenswrapper[4962]: I1003 13:50:54.661308 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d8246a9247f30fd9cfdb2fa05e4bb1b8083e0fbdbebe283fd86aa9a02ba91160"} pod="openshift-machine-config-operator/machine-config-daemon-46vck" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 13:50:54 crc kubenswrapper[4962]: I1003 13:50:54.661347 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" containerID="cri-o://d8246a9247f30fd9cfdb2fa05e4bb1b8083e0fbdbebe283fd86aa9a02ba91160" gracePeriod=600 Oct 03 13:50:54 crc kubenswrapper[4962]: E1003 13:50:54.801990 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:50:55 crc kubenswrapper[4962]: I1003 13:50:55.293179 4962 generic.go:334] "Generic (PLEG): container finished" podID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerID="d8246a9247f30fd9cfdb2fa05e4bb1b8083e0fbdbebe283fd86aa9a02ba91160" exitCode=0 Oct 03 13:50:55 crc kubenswrapper[4962]: I1003 13:50:55.293222 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerDied","Data":"d8246a9247f30fd9cfdb2fa05e4bb1b8083e0fbdbebe283fd86aa9a02ba91160"} Oct 03 13:50:55 crc kubenswrapper[4962]: I1003 13:50:55.293253 4962 scope.go:117] "RemoveContainer" containerID="0d39d9f56ff63935482e31bd27280765398e7f4b806ae043c546d37ced6acc15" Oct 03 13:50:55 crc kubenswrapper[4962]: I1003 13:50:55.294555 4962 scope.go:117] "RemoveContainer" containerID="d8246a9247f30fd9cfdb2fa05e4bb1b8083e0fbdbebe283fd86aa9a02ba91160" Oct 03 13:50:55 crc kubenswrapper[4962]: E1003 13:50:55.295222 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:51:06 crc kubenswrapper[4962]: I1003 13:51:06.227275 4962 scope.go:117] "RemoveContainer" containerID="d8246a9247f30fd9cfdb2fa05e4bb1b8083e0fbdbebe283fd86aa9a02ba91160" Oct 03 13:51:06 crc kubenswrapper[4962]: E1003 13:51:06.228137 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" 
podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:51:18 crc kubenswrapper[4962]: I1003 13:51:18.228497 4962 scope.go:117] "RemoveContainer" containerID="d8246a9247f30fd9cfdb2fa05e4bb1b8083e0fbdbebe283fd86aa9a02ba91160" Oct 03 13:51:18 crc kubenswrapper[4962]: E1003 13:51:18.229590 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:51:31 crc kubenswrapper[4962]: I1003 13:51:31.227186 4962 scope.go:117] "RemoveContainer" containerID="d8246a9247f30fd9cfdb2fa05e4bb1b8083e0fbdbebe283fd86aa9a02ba91160" Oct 03 13:51:31 crc kubenswrapper[4962]: E1003 13:51:31.229038 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:51:44 crc kubenswrapper[4962]: I1003 13:51:44.227772 4962 scope.go:117] "RemoveContainer" containerID="d8246a9247f30fd9cfdb2fa05e4bb1b8083e0fbdbebe283fd86aa9a02ba91160" Oct 03 13:51:44 crc kubenswrapper[4962]: E1003 13:51:44.228790 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:51:55 crc kubenswrapper[4962]: I1003 13:51:55.373997 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-259ch"] Oct 03 13:51:55 crc kubenswrapper[4962]: E1003 13:51:55.374848 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8a3a9ed-8f34-41f7-a974-6ca9135fa96a" containerName="registry-server" Oct 03 13:51:55 crc kubenswrapper[4962]: I1003 13:51:55.374865 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8a3a9ed-8f34-41f7-a974-6ca9135fa96a" containerName="registry-server" Oct 03 13:51:55 crc kubenswrapper[4962]: E1003 13:51:55.374901 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8a3a9ed-8f34-41f7-a974-6ca9135fa96a" containerName="extract-content" Oct 03 13:51:55 crc kubenswrapper[4962]: I1003 13:51:55.374909 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8a3a9ed-8f34-41f7-a974-6ca9135fa96a" containerName="extract-content" Oct 03 13:51:55 crc kubenswrapper[4962]: E1003 13:51:55.374932 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8a3a9ed-8f34-41f7-a974-6ca9135fa96a" containerName="extract-utilities" Oct 03 13:51:55 crc kubenswrapper[4962]: I1003 13:51:55.374939 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8a3a9ed-8f34-41f7-a974-6ca9135fa96a" containerName="extract-utilities" Oct 03 13:51:55 crc kubenswrapper[4962]: I1003 13:51:55.375128 4962 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b8a3a9ed-8f34-41f7-a974-6ca9135fa96a" containerName="registry-server" Oct 03 13:51:55 crc kubenswrapper[4962]: I1003 13:51:55.376430 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-259ch" Oct 03 13:51:55 crc kubenswrapper[4962]: I1003 13:51:55.384252 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-259ch"] Oct 03 13:51:55 crc kubenswrapper[4962]: I1003 13:51:55.473356 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba361dee-a2c2-440c-937f-1592438951af-catalog-content\") pod \"certified-operators-259ch\" (UID: \"ba361dee-a2c2-440c-937f-1592438951af\") " pod="openshift-marketplace/certified-operators-259ch" Oct 03 13:51:55 crc kubenswrapper[4962]: I1003 13:51:55.473393 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxn9z\" (UniqueName: \"kubernetes.io/projected/ba361dee-a2c2-440c-937f-1592438951af-kube-api-access-nxn9z\") pod \"certified-operators-259ch\" (UID: \"ba361dee-a2c2-440c-937f-1592438951af\") " pod="openshift-marketplace/certified-operators-259ch" Oct 03 13:51:55 crc kubenswrapper[4962]: I1003 13:51:55.473446 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba361dee-a2c2-440c-937f-1592438951af-utilities\") pod \"certified-operators-259ch\" (UID: \"ba361dee-a2c2-440c-937f-1592438951af\") " pod="openshift-marketplace/certified-operators-259ch" Oct 03 13:51:55 crc kubenswrapper[4962]: I1003 13:51:55.574015 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba361dee-a2c2-440c-937f-1592438951af-catalog-content\") pod \"certified-operators-259ch\" (UID: \"ba361dee-a2c2-440c-937f-1592438951af\") " pod="openshift-marketplace/certified-operators-259ch" Oct 03 13:51:55 crc kubenswrapper[4962]: I1003 13:51:55.574061 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxn9z\" (UniqueName: \"kubernetes.io/projected/ba361dee-a2c2-440c-937f-1592438951af-kube-api-access-nxn9z\") pod \"certified-operators-259ch\" (UID: \"ba361dee-a2c2-440c-937f-1592438951af\") " pod="openshift-marketplace/certified-operators-259ch" Oct 03 13:51:55 crc kubenswrapper[4962]: I1003 13:51:55.574112 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba361dee-a2c2-440c-937f-1592438951af-utilities\") pod \"certified-operators-259ch\" (UID: \"ba361dee-a2c2-440c-937f-1592438951af\") " pod="openshift-marketplace/certified-operators-259ch" Oct 03 13:51:55 crc kubenswrapper[4962]: I1003 13:51:55.574584 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba361dee-a2c2-440c-937f-1592438951af-utilities\") pod \"certified-operators-259ch\" (UID: \"ba361dee-a2c2-440c-937f-1592438951af\") " pod="openshift-marketplace/certified-operators-259ch" Oct 03 13:51:55 crc kubenswrapper[4962]: I1003 13:51:55.574606 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba361dee-a2c2-440c-937f-1592438951af-catalog-content\") pod \"certified-operators-259ch\" (UID: 
\"ba361dee-a2c2-440c-937f-1592438951af\") " pod="openshift-marketplace/certified-operators-259ch" Oct 03 13:51:55 crc kubenswrapper[4962]: I1003 13:51:55.597593 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxn9z\" (UniqueName: \"kubernetes.io/projected/ba361dee-a2c2-440c-937f-1592438951af-kube-api-access-nxn9z\") pod \"certified-operators-259ch\" (UID: \"ba361dee-a2c2-440c-937f-1592438951af\") " pod="openshift-marketplace/certified-operators-259ch" Oct 03 13:51:55 crc kubenswrapper[4962]: I1003 13:51:55.693405 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-259ch" Oct 03 13:51:56 crc kubenswrapper[4962]: I1003 13:51:56.134839 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-259ch"] Oct 03 13:51:56 crc kubenswrapper[4962]: I1003 13:51:56.872851 4962 generic.go:334] "Generic (PLEG): container finished" podID="ba361dee-a2c2-440c-937f-1592438951af" containerID="fac88afc8659ea4539109a730a8607888fae041270fa2b5fbf512f52ebb611c3" exitCode=0 Oct 03 13:51:56 crc kubenswrapper[4962]: I1003 13:51:56.872931 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-259ch" event={"ID":"ba361dee-a2c2-440c-937f-1592438951af","Type":"ContainerDied","Data":"fac88afc8659ea4539109a730a8607888fae041270fa2b5fbf512f52ebb611c3"} Oct 03 13:51:56 crc kubenswrapper[4962]: I1003 13:51:56.874548 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-259ch" event={"ID":"ba361dee-a2c2-440c-937f-1592438951af","Type":"ContainerStarted","Data":"7a4cad7707d3f262783da9604d429a59f77ee641dd1de388ce18d683b43459f5"} Oct 03 13:51:57 crc kubenswrapper[4962]: I1003 13:51:57.883019 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-259ch" event={"ID":"ba361dee-a2c2-440c-937f-1592438951af","Type":"ContainerStarted","Data":"01fc445c1da27e381774f4dd01d32a3d5198fd8882d8e2d4f8b29fcd82b3ec27"} Oct 03 13:51:58 crc kubenswrapper[4962]: I1003 13:51:58.227472 4962 scope.go:117] "RemoveContainer" containerID="d8246a9247f30fd9cfdb2fa05e4bb1b8083e0fbdbebe283fd86aa9a02ba91160" Oct 03 13:51:58 crc kubenswrapper[4962]: E1003 13:51:58.227942 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:51:58 crc kubenswrapper[4962]: I1003 13:51:58.890897 4962 generic.go:334] "Generic (PLEG): container finished" podID="ba361dee-a2c2-440c-937f-1592438951af" containerID="01fc445c1da27e381774f4dd01d32a3d5198fd8882d8e2d4f8b29fcd82b3ec27" exitCode=0 Oct 03 13:51:58 crc kubenswrapper[4962]: I1003 13:51:58.890940 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-259ch" event={"ID":"ba361dee-a2c2-440c-937f-1592438951af","Type":"ContainerDied","Data":"01fc445c1da27e381774f4dd01d32a3d5198fd8882d8e2d4f8b29fcd82b3ec27"} Oct 03 13:51:59 crc kubenswrapper[4962]: I1003 13:51:59.900210 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-259ch" 
event={"ID":"ba361dee-a2c2-440c-937f-1592438951af","Type":"ContainerStarted","Data":"fde9938d53f5dfe50127216e4def41fa249ceb736ae5b81ffd6ee39c611fb983"} Oct 03 13:51:59 crc kubenswrapper[4962]: I1003 13:51:59.916748 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-259ch" podStartSLOduration=2.417922901 podStartE2EDuration="4.91672372s" podCreationTimestamp="2025-10-03 13:51:55 +0000 UTC" firstStartedPulling="2025-10-03 13:51:56.87423424 +0000 UTC m=+3725.278132075" lastFinishedPulling="2025-10-03 13:51:59.373035059 +0000 UTC m=+3727.776932894" observedRunningTime="2025-10-03 13:51:59.915824356 +0000 UTC m=+3728.319722201" watchObservedRunningTime="2025-10-03 13:51:59.91672372 +0000 UTC m=+3728.320621555" Oct 03 13:52:05 crc kubenswrapper[4962]: I1003 13:52:05.694061 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-259ch" Oct 03 13:52:05 crc kubenswrapper[4962]: I1003 13:52:05.695198 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-259ch" Oct 03 13:52:05 crc kubenswrapper[4962]: I1003 13:52:05.770498 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-259ch" Oct 03 13:52:06 crc kubenswrapper[4962]: I1003 13:52:06.028854 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-259ch" Oct 03 13:52:06 crc kubenswrapper[4962]: I1003 13:52:06.109226 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-259ch"] Oct 03 13:52:07 crc kubenswrapper[4962]: I1003 13:52:07.970243 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-259ch" podUID="ba361dee-a2c2-440c-937f-1592438951af" containerName="registry-server" containerID="cri-o://fde9938d53f5dfe50127216e4def41fa249ceb736ae5b81ffd6ee39c611fb983" gracePeriod=2 Oct 03 13:52:08 crc kubenswrapper[4962]: I1003 13:52:08.989208 4962 generic.go:334] "Generic (PLEG): container finished" podID="ba361dee-a2c2-440c-937f-1592438951af" containerID="fde9938d53f5dfe50127216e4def41fa249ceb736ae5b81ffd6ee39c611fb983" exitCode=0 Oct 03 13:52:08 crc kubenswrapper[4962]: I1003 13:52:08.989428 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-259ch" event={"ID":"ba361dee-a2c2-440c-937f-1592438951af","Type":"ContainerDied","Data":"fde9938d53f5dfe50127216e4def41fa249ceb736ae5b81ffd6ee39c611fb983"} Oct 03 13:52:09 crc kubenswrapper[4962]: I1003 13:52:09.436728 4962 util.go:48] "No ready sandbox for pod can be found. 
Oct 03 13:52:05 crc kubenswrapper[4962]: I1003 13:52:05.694061 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-259ch" Oct 03 13:52:05 crc kubenswrapper[4962]: I1003 13:52:05.695198 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-259ch" Oct 03 13:52:05 crc kubenswrapper[4962]: I1003 13:52:05.770498 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-259ch" Oct 03 13:52:06 crc kubenswrapper[4962]: I1003 13:52:06.028854 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-259ch" Oct 03 13:52:06 crc kubenswrapper[4962]: I1003 13:52:06.109226 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-259ch"] Oct 03 13:52:07 crc kubenswrapper[4962]: I1003 13:52:07.970243 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-259ch" podUID="ba361dee-a2c2-440c-937f-1592438951af" containerName="registry-server" containerID="cri-o://fde9938d53f5dfe50127216e4def41fa249ceb736ae5b81ffd6ee39c611fb983" gracePeriod=2 Oct 03 13:52:08 crc kubenswrapper[4962]: I1003 13:52:08.989208 4962 generic.go:334] "Generic (PLEG): container finished" podID="ba361dee-a2c2-440c-937f-1592438951af" containerID="fde9938d53f5dfe50127216e4def41fa249ceb736ae5b81ffd6ee39c611fb983" exitCode=0 Oct 03 13:52:08 crc kubenswrapper[4962]: I1003 13:52:08.989428 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-259ch" event={"ID":"ba361dee-a2c2-440c-937f-1592438951af","Type":"ContainerDied","Data":"fde9938d53f5dfe50127216e4def41fa249ceb736ae5b81ffd6ee39c611fb983"} Oct 03 13:52:09 crc kubenswrapper[4962]: I1003 13:52:09.436728 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-259ch" Oct 03 13:52:09 crc kubenswrapper[4962]: I1003 13:52:09.615994 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba361dee-a2c2-440c-937f-1592438951af-utilities\") pod \"ba361dee-a2c2-440c-937f-1592438951af\" (UID: \"ba361dee-a2c2-440c-937f-1592438951af\") " Oct 03 13:52:09 crc kubenswrapper[4962]: I1003 13:52:09.616101 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba361dee-a2c2-440c-937f-1592438951af-catalog-content\") pod \"ba361dee-a2c2-440c-937f-1592438951af\" (UID: \"ba361dee-a2c2-440c-937f-1592438951af\") " Oct 03 13:52:09 crc kubenswrapper[4962]: I1003 13:52:09.616127 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxn9z\" (UniqueName: \"kubernetes.io/projected/ba361dee-a2c2-440c-937f-1592438951af-kube-api-access-nxn9z\") pod \"ba361dee-a2c2-440c-937f-1592438951af\" (UID: \"ba361dee-a2c2-440c-937f-1592438951af\") " Oct 03 13:52:09 crc kubenswrapper[4962]: I1003 13:52:09.618284 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba361dee-a2c2-440c-937f-1592438951af-utilities" (OuterVolumeSpecName: "utilities") pod "ba361dee-a2c2-440c-937f-1592438951af" (UID: "ba361dee-a2c2-440c-937f-1592438951af"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:52:09 crc kubenswrapper[4962]: I1003 13:52:09.625206 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba361dee-a2c2-440c-937f-1592438951af-kube-api-access-nxn9z" (OuterVolumeSpecName: "kube-api-access-nxn9z") pod "ba361dee-a2c2-440c-937f-1592438951af" (UID: "ba361dee-a2c2-440c-937f-1592438951af"). InnerVolumeSpecName "kube-api-access-nxn9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:52:09 crc kubenswrapper[4962]: I1003 13:52:09.670821 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba361dee-a2c2-440c-937f-1592438951af-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ba361dee-a2c2-440c-937f-1592438951af" (UID: "ba361dee-a2c2-440c-937f-1592438951af"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:52:09 crc kubenswrapper[4962]: I1003 13:52:09.718230 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxn9z\" (UniqueName: \"kubernetes.io/projected/ba361dee-a2c2-440c-937f-1592438951af-kube-api-access-nxn9z\") on node \"crc\" DevicePath \"\"" Oct 03 13:52:09 crc kubenswrapper[4962]: I1003 13:52:09.718264 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba361dee-a2c2-440c-937f-1592438951af-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 13:52:09 crc kubenswrapper[4962]: I1003 13:52:09.718275 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba361dee-a2c2-440c-937f-1592438951af-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 13:52:10 crc kubenswrapper[4962]: I1003 13:52:10.000905 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-259ch" event={"ID":"ba361dee-a2c2-440c-937f-1592438951af","Type":"ContainerDied","Data":"7a4cad7707d3f262783da9604d429a59f77ee641dd1de388ce18d683b43459f5"} Oct 03 13:52:10 crc kubenswrapper[4962]: I1003 13:52:10.000955 4962 scope.go:117] "RemoveContainer" containerID="fde9938d53f5dfe50127216e4def41fa249ceb736ae5b81ffd6ee39c611fb983" Oct 03 13:52:10 crc kubenswrapper[4962]: I1003 13:52:10.000988 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-259ch" Oct 03 13:52:10 crc kubenswrapper[4962]: I1003 13:52:10.038115 4962 scope.go:117] "RemoveContainer" containerID="01fc445c1da27e381774f4dd01d32a3d5198fd8882d8e2d4f8b29fcd82b3ec27" Oct 03 13:52:10 crc kubenswrapper[4962]: I1003 13:52:10.040578 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-259ch"] Oct 03 13:52:10 crc kubenswrapper[4962]: I1003 13:52:10.046401 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-259ch"] Oct 03 13:52:10 crc kubenswrapper[4962]: I1003 13:52:10.075325 4962 scope.go:117] "RemoveContainer" containerID="fac88afc8659ea4539109a730a8607888fae041270fa2b5fbf512f52ebb611c3" Oct 03 13:52:10 crc kubenswrapper[4962]: I1003 13:52:10.243795 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba361dee-a2c2-440c-937f-1592438951af" path="/var/lib/kubelet/pods/ba361dee-a2c2-440c-937f-1592438951af/volumes" Oct 03 13:52:12 crc kubenswrapper[4962]: I1003 13:52:12.239111 4962 scope.go:117] "RemoveContainer" containerID="d8246a9247f30fd9cfdb2fa05e4bb1b8083e0fbdbebe283fd86aa9a02ba91160" Oct 03 13:52:12 crc kubenswrapper[4962]: E1003 13:52:12.239749 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:52:23 crc kubenswrapper[4962]: I1003 13:52:23.227671 4962 scope.go:117] "RemoveContainer" containerID="d8246a9247f30fd9cfdb2fa05e4bb1b8083e0fbdbebe283fd86aa9a02ba91160" Oct 03 13:52:23 crc kubenswrapper[4962]: E1003 13:52:23.228587 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:52:36 crc kubenswrapper[4962]: I1003 13:52:36.227061 4962 scope.go:117] "RemoveContainer" containerID="d8246a9247f30fd9cfdb2fa05e4bb1b8083e0fbdbebe283fd86aa9a02ba91160" Oct 03 13:52:36 crc kubenswrapper[4962]: E1003 13:52:36.228142 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:52:50 crc kubenswrapper[4962]: I1003 13:52:50.226991 4962 scope.go:117] "RemoveContainer" containerID="d8246a9247f30fd9cfdb2fa05e4bb1b8083e0fbdbebe283fd86aa9a02ba91160" Oct 03 13:52:50 crc kubenswrapper[4962]: E1003 13:52:50.227811 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:53:05 crc kubenswrapper[4962]: I1003 13:53:05.227913 4962 scope.go:117] "RemoveContainer" containerID="d8246a9247f30fd9cfdb2fa05e4bb1b8083e0fbdbebe283fd86aa9a02ba91160" Oct 03 13:53:05 crc kubenswrapper[4962]: E1003 13:53:05.228695 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:53:17 crc kubenswrapper[4962]: I1003 13:53:17.226752 4962 scope.go:117] "RemoveContainer" containerID="d8246a9247f30fd9cfdb2fa05e4bb1b8083e0fbdbebe283fd86aa9a02ba91160" Oct 03 13:53:17 crc kubenswrapper[4962]: E1003 13:53:17.227408 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:53:32 crc kubenswrapper[4962]: I1003 13:53:32.236162 4962 scope.go:117] "RemoveContainer" containerID="d8246a9247f30fd9cfdb2fa05e4bb1b8083e0fbdbebe283fd86aa9a02ba91160" Oct 03 13:53:32 crc kubenswrapper[4962]: E1003 13:53:32.237441 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:53:44 crc kubenswrapper[4962]: I1003 13:53:44.227258 4962 scope.go:117] "RemoveContainer" containerID="d8246a9247f30fd9cfdb2fa05e4bb1b8083e0fbdbebe283fd86aa9a02ba91160" Oct 03 13:53:44 crc kubenswrapper[4962]: E1003 13:53:44.228039 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:53:52 crc kubenswrapper[4962]: I1003 13:53:52.379046 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hbckh"] Oct 03 13:53:52 crc kubenswrapper[4962]: E1003 13:53:52.379989 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba361dee-a2c2-440c-937f-1592438951af" containerName="registry-server" Oct 03 13:53:52 crc kubenswrapper[4962]: I1003 13:53:52.380005 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba361dee-a2c2-440c-937f-1592438951af" containerName="registry-server" Oct 03 13:53:52 crc kubenswrapper[4962]: E1003 13:53:52.380029 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba361dee-a2c2-440c-937f-1592438951af" containerName="extract-utilities" Oct 03 13:53:52 crc kubenswrapper[4962]: I1003 13:53:52.380037 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba361dee-a2c2-440c-937f-1592438951af" containerName="extract-utilities" Oct 03 13:53:52 crc kubenswrapper[4962]: E1003 13:53:52.380055 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba361dee-a2c2-440c-937f-1592438951af" containerName="extract-content" Oct 03 13:53:52 crc kubenswrapper[4962]: I1003 13:53:52.380063 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba361dee-a2c2-440c-937f-1592438951af" containerName="extract-content" Oct 03 13:53:52 crc kubenswrapper[4962]: I1003 13:53:52.380242 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba361dee-a2c2-440c-937f-1592438951af" containerName="registry-server" Oct 03 13:53:52 crc kubenswrapper[4962]: I1003 13:53:52.381459 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hbckh" Oct 03 13:53:52 crc kubenswrapper[4962]: I1003 13:53:52.407215 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hbckh"] Oct 03 13:53:52 crc kubenswrapper[4962]: I1003 13:53:52.434727 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhrv7\" (UniqueName: \"kubernetes.io/projected/9b7d2dca-825f-478f-aacc-1bf26c9d0626-kube-api-access-nhrv7\") pod \"community-operators-hbckh\" (UID: \"9b7d2dca-825f-478f-aacc-1bf26c9d0626\") " pod="openshift-marketplace/community-operators-hbckh" Oct 03 13:53:52 crc kubenswrapper[4962]: I1003 13:53:52.434999 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b7d2dca-825f-478f-aacc-1bf26c9d0626-utilities\") pod \"community-operators-hbckh\" (UID: \"9b7d2dca-825f-478f-aacc-1bf26c9d0626\") " pod="openshift-marketplace/community-operators-hbckh" Oct 03 13:53:52 crc kubenswrapper[4962]: I1003 13:53:52.435081 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b7d2dca-825f-478f-aacc-1bf26c9d0626-catalog-content\") pod \"community-operators-hbckh\" (UID: \"9b7d2dca-825f-478f-aacc-1bf26c9d0626\") " pod="openshift-marketplace/community-operators-hbckh" Oct 03 13:53:52 crc kubenswrapper[4962]: I1003 13:53:52.536493 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhrv7\" (UniqueName: \"kubernetes.io/projected/9b7d2dca-825f-478f-aacc-1bf26c9d0626-kube-api-access-nhrv7\") pod \"community-operators-hbckh\" (UID: \"9b7d2dca-825f-478f-aacc-1bf26c9d0626\") " pod="openshift-marketplace/community-operators-hbckh" Oct 03 13:53:52 crc kubenswrapper[4962]: I1003 13:53:52.536552 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b7d2dca-825f-478f-aacc-1bf26c9d0626-utilities\") pod \"community-operators-hbckh\" (UID: \"9b7d2dca-825f-478f-aacc-1bf26c9d0626\") " pod="openshift-marketplace/community-operators-hbckh" Oct 03 13:53:52 crc kubenswrapper[4962]: I1003 13:53:52.536624 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b7d2dca-825f-478f-aacc-1bf26c9d0626-catalog-content\") pod \"community-operators-hbckh\" (UID: \"9b7d2dca-825f-478f-aacc-1bf26c9d0626\") " pod="openshift-marketplace/community-operators-hbckh" Oct 03 13:53:52 crc kubenswrapper[4962]: I1003 13:53:52.537176 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b7d2dca-825f-478f-aacc-1bf26c9d0626-catalog-content\") pod \"community-operators-hbckh\" (UID: \"9b7d2dca-825f-478f-aacc-1bf26c9d0626\") " pod="openshift-marketplace/community-operators-hbckh" Oct 03 13:53:52 crc kubenswrapper[4962]: I1003 13:53:52.537461 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b7d2dca-825f-478f-aacc-1bf26c9d0626-utilities\") pod \"community-operators-hbckh\" (UID: \"9b7d2dca-825f-478f-aacc-1bf26c9d0626\") " pod="openshift-marketplace/community-operators-hbckh" Oct 03 13:53:52 crc kubenswrapper[4962]: I1003 13:53:52.562131 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-nhrv7\" (UniqueName: \"kubernetes.io/projected/9b7d2dca-825f-478f-aacc-1bf26c9d0626-kube-api-access-nhrv7\") pod \"community-operators-hbckh\" (UID: \"9b7d2dca-825f-478f-aacc-1bf26c9d0626\") " pod="openshift-marketplace/community-operators-hbckh" Oct 03 13:53:52 crc kubenswrapper[4962]: I1003 13:53:52.746815 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hbckh" Oct 03 13:53:53 crc kubenswrapper[4962]: I1003 13:53:53.178663 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hbckh"] Oct 03 13:53:53 crc kubenswrapper[4962]: I1003 13:53:53.897370 4962 generic.go:334] "Generic (PLEG): container finished" podID="9b7d2dca-825f-478f-aacc-1bf26c9d0626" containerID="e2c59f9aa34ea896d450b94140792844b2ad93e579fd4deacee365bdcf702b13" exitCode=0 Oct 03 13:53:53 crc kubenswrapper[4962]: I1003 13:53:53.897611 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hbckh" event={"ID":"9b7d2dca-825f-478f-aacc-1bf26c9d0626","Type":"ContainerDied","Data":"e2c59f9aa34ea896d450b94140792844b2ad93e579fd4deacee365bdcf702b13"} Oct 03 13:53:53 crc kubenswrapper[4962]: I1003 13:53:53.897990 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hbckh" event={"ID":"9b7d2dca-825f-478f-aacc-1bf26c9d0626","Type":"ContainerStarted","Data":"8807d8d677ed9ff0cfdfd7927b2ec7f8add026a7ec30dfcbd6f7d84442be0bd4"} Oct 03 13:53:55 crc kubenswrapper[4962]: I1003 13:53:55.226784 4962 scope.go:117] "RemoveContainer" containerID="d8246a9247f30fd9cfdb2fa05e4bb1b8083e0fbdbebe283fd86aa9a02ba91160" Oct 03 13:53:55 crc kubenswrapper[4962]: E1003 13:53:55.227277 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:53:55 crc kubenswrapper[4962]: I1003 13:53:55.917895 4962 generic.go:334] "Generic (PLEG): container finished" podID="9b7d2dca-825f-478f-aacc-1bf26c9d0626" containerID="d8e778a772301c5a008fb30c5bdc8d8624225ce0b664001b17c6a14ff84731b4" exitCode=0 Oct 03 13:53:55 crc kubenswrapper[4962]: I1003 13:53:55.917957 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hbckh" event={"ID":"9b7d2dca-825f-478f-aacc-1bf26c9d0626","Type":"ContainerDied","Data":"d8e778a772301c5a008fb30c5bdc8d8624225ce0b664001b17c6a14ff84731b4"} Oct 03 13:53:56 crc kubenswrapper[4962]: I1003 13:53:56.927304 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hbckh" event={"ID":"9b7d2dca-825f-478f-aacc-1bf26c9d0626","Type":"ContainerStarted","Data":"4285965e1e46009249df89a73ec05e55c06900ec3826ca419e30a5f95c753a91"} Oct 03 13:53:56 crc kubenswrapper[4962]: I1003 13:53:56.943173 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hbckh" podStartSLOduration=2.484802017 podStartE2EDuration="4.94315536s" podCreationTimestamp="2025-10-03 13:53:52 +0000 UTC" firstStartedPulling="2025-10-03 13:53:53.901546681 +0000 UTC m=+3842.305444516" 
lastFinishedPulling="2025-10-03 13:53:56.359900014 +0000 UTC m=+3844.763797859" observedRunningTime="2025-10-03 13:53:56.941547786 +0000 UTC m=+3845.345445631" watchObservedRunningTime="2025-10-03 13:53:56.94315536 +0000 UTC m=+3845.347053205" Oct 03 13:54:02 crc kubenswrapper[4962]: I1003 13:54:02.747341 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hbckh" Oct 03 13:54:02 crc kubenswrapper[4962]: I1003 13:54:02.748983 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hbckh" Oct 03 13:54:02 crc kubenswrapper[4962]: I1003 13:54:02.786224 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hbckh" Oct 03 13:54:03 crc kubenswrapper[4962]: I1003 13:54:03.031001 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hbckh" Oct 03 13:54:03 crc kubenswrapper[4962]: I1003 13:54:03.774081 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hbckh"] Oct 03 13:54:05 crc kubenswrapper[4962]: I1003 13:54:05.003745 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hbckh" podUID="9b7d2dca-825f-478f-aacc-1bf26c9d0626" containerName="registry-server" containerID="cri-o://4285965e1e46009249df89a73ec05e55c06900ec3826ca419e30a5f95c753a91" gracePeriod=2 Oct 03 13:54:05 crc kubenswrapper[4962]: I1003 13:54:05.388730 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hbckh" Oct 03 13:54:05 crc kubenswrapper[4962]: I1003 13:54:05.531827 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b7d2dca-825f-478f-aacc-1bf26c9d0626-utilities\") pod \"9b7d2dca-825f-478f-aacc-1bf26c9d0626\" (UID: \"9b7d2dca-825f-478f-aacc-1bf26c9d0626\") " Oct 03 13:54:05 crc kubenswrapper[4962]: I1003 13:54:05.531963 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhrv7\" (UniqueName: \"kubernetes.io/projected/9b7d2dca-825f-478f-aacc-1bf26c9d0626-kube-api-access-nhrv7\") pod \"9b7d2dca-825f-478f-aacc-1bf26c9d0626\" (UID: \"9b7d2dca-825f-478f-aacc-1bf26c9d0626\") " Oct 03 13:54:05 crc kubenswrapper[4962]: I1003 13:54:05.532105 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b7d2dca-825f-478f-aacc-1bf26c9d0626-catalog-content\") pod \"9b7d2dca-825f-478f-aacc-1bf26c9d0626\" (UID: \"9b7d2dca-825f-478f-aacc-1bf26c9d0626\") " Oct 03 13:54:05 crc kubenswrapper[4962]: I1003 13:54:05.532778 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b7d2dca-825f-478f-aacc-1bf26c9d0626-utilities" (OuterVolumeSpecName: "utilities") pod "9b7d2dca-825f-478f-aacc-1bf26c9d0626" (UID: "9b7d2dca-825f-478f-aacc-1bf26c9d0626"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:54:05 crc kubenswrapper[4962]: I1003 13:54:05.537716 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b7d2dca-825f-478f-aacc-1bf26c9d0626-kube-api-access-nhrv7" (OuterVolumeSpecName: "kube-api-access-nhrv7") pod "9b7d2dca-825f-478f-aacc-1bf26c9d0626" (UID: "9b7d2dca-825f-478f-aacc-1bf26c9d0626"). InnerVolumeSpecName "kube-api-access-nhrv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:54:05 crc kubenswrapper[4962]: I1003 13:54:05.634036 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b7d2dca-825f-478f-aacc-1bf26c9d0626-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 13:54:05 crc kubenswrapper[4962]: I1003 13:54:05.634095 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhrv7\" (UniqueName: \"kubernetes.io/projected/9b7d2dca-825f-478f-aacc-1bf26c9d0626-kube-api-access-nhrv7\") on node \"crc\" DevicePath \"\"" Oct 03 13:54:06 crc kubenswrapper[4962]: I1003 13:54:06.022197 4962 generic.go:334] "Generic (PLEG): container finished" podID="9b7d2dca-825f-478f-aacc-1bf26c9d0626" containerID="4285965e1e46009249df89a73ec05e55c06900ec3826ca419e30a5f95c753a91" exitCode=0 Oct 03 13:54:06 crc kubenswrapper[4962]: I1003 13:54:06.022276 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hbckh" event={"ID":"9b7d2dca-825f-478f-aacc-1bf26c9d0626","Type":"ContainerDied","Data":"4285965e1e46009249df89a73ec05e55c06900ec3826ca419e30a5f95c753a91"} Oct 03 13:54:06 crc kubenswrapper[4962]: I1003 13:54:06.022331 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hbckh" event={"ID":"9b7d2dca-825f-478f-aacc-1bf26c9d0626","Type":"ContainerDied","Data":"8807d8d677ed9ff0cfdfd7927b2ec7f8add026a7ec30dfcbd6f7d84442be0bd4"} Oct 03 13:54:06 crc kubenswrapper[4962]: I1003 13:54:06.022364 4962 scope.go:117] "RemoveContainer" containerID="4285965e1e46009249df89a73ec05e55c06900ec3826ca419e30a5f95c753a91" Oct 03 13:54:06 crc kubenswrapper[4962]: I1003 13:54:06.022392 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hbckh" Oct 03 13:54:06 crc kubenswrapper[4962]: I1003 13:54:06.055756 4962 scope.go:117] "RemoveContainer" containerID="d8e778a772301c5a008fb30c5bdc8d8624225ce0b664001b17c6a14ff84731b4" Oct 03 13:54:06 crc kubenswrapper[4962]: I1003 13:54:06.086122 4962 scope.go:117] "RemoveContainer" containerID="e2c59f9aa34ea896d450b94140792844b2ad93e579fd4deacee365bdcf702b13" Oct 03 13:54:06 crc kubenswrapper[4962]: I1003 13:54:06.125016 4962 scope.go:117] "RemoveContainer" containerID="4285965e1e46009249df89a73ec05e55c06900ec3826ca419e30a5f95c753a91" Oct 03 13:54:06 crc kubenswrapper[4962]: E1003 13:54:06.125700 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4285965e1e46009249df89a73ec05e55c06900ec3826ca419e30a5f95c753a91\": container with ID starting with 4285965e1e46009249df89a73ec05e55c06900ec3826ca419e30a5f95c753a91 not found: ID does not exist" containerID="4285965e1e46009249df89a73ec05e55c06900ec3826ca419e30a5f95c753a91" Oct 03 13:54:06 crc kubenswrapper[4962]: I1003 13:54:06.125737 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4285965e1e46009249df89a73ec05e55c06900ec3826ca419e30a5f95c753a91"} err="failed to get container status \"4285965e1e46009249df89a73ec05e55c06900ec3826ca419e30a5f95c753a91\": rpc error: code = NotFound desc = could not find container \"4285965e1e46009249df89a73ec05e55c06900ec3826ca419e30a5f95c753a91\": container with ID starting with 4285965e1e46009249df89a73ec05e55c06900ec3826ca419e30a5f95c753a91 not found: ID does not exist" Oct 03 13:54:06 crc kubenswrapper[4962]: I1003 13:54:06.125761 4962 scope.go:117] "RemoveContainer" containerID="d8e778a772301c5a008fb30c5bdc8d8624225ce0b664001b17c6a14ff84731b4" Oct 03 13:54:06 crc kubenswrapper[4962]: E1003 13:54:06.126224 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8e778a772301c5a008fb30c5bdc8d8624225ce0b664001b17c6a14ff84731b4\": container with ID starting with d8e778a772301c5a008fb30c5bdc8d8624225ce0b664001b17c6a14ff84731b4 not found: ID does not exist" containerID="d8e778a772301c5a008fb30c5bdc8d8624225ce0b664001b17c6a14ff84731b4" Oct 03 13:54:06 crc kubenswrapper[4962]: I1003 13:54:06.126249 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8e778a772301c5a008fb30c5bdc8d8624225ce0b664001b17c6a14ff84731b4"} err="failed to get container status \"d8e778a772301c5a008fb30c5bdc8d8624225ce0b664001b17c6a14ff84731b4\": rpc error: code = NotFound desc = could not find container \"d8e778a772301c5a008fb30c5bdc8d8624225ce0b664001b17c6a14ff84731b4\": container with ID starting with d8e778a772301c5a008fb30c5bdc8d8624225ce0b664001b17c6a14ff84731b4 not found: ID does not exist" Oct 03 13:54:06 crc kubenswrapper[4962]: I1003 13:54:06.126265 4962 scope.go:117] "RemoveContainer" containerID="e2c59f9aa34ea896d450b94140792844b2ad93e579fd4deacee365bdcf702b13" Oct 03 13:54:06 crc kubenswrapper[4962]: E1003 13:54:06.126819 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2c59f9aa34ea896d450b94140792844b2ad93e579fd4deacee365bdcf702b13\": container with ID starting with e2c59f9aa34ea896d450b94140792844b2ad93e579fd4deacee365bdcf702b13 not found: ID does not exist" containerID="e2c59f9aa34ea896d450b94140792844b2ad93e579fd4deacee365bdcf702b13" 
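The "ContainerStatus from runtime service failed ... NotFound" errors here, together with the "DeleteContainer returned error" lines around them, look alarming but appear to be a benign cleanup race: the kubelet has already removed these community-operators-hbckh containers, then re-queries CRI-O for their status while pruning its bookkeeping and gets gRPC NotFound because they are gone. A minimal sketch of treating NotFound as "already deleted" (an illustrative pattern, not kubelet source):

```go
package main

import (
	"errors"
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// ignoreNotFound converts a gRPC NotFound from the container runtime into
// success: the container we wanted to remove no longer exists anyway.
func ignoreNotFound(err error) error {
	if s, ok := status.FromError(err); ok && s.Code() == codes.NotFound {
		return nil
	}
	return err
}

func main() {
	fmt.Println(ignoreNotFound(status.Error(codes.NotFound, "could not find container"))) // <nil>
	fmt.Println(ignoreNotFound(errors.New("runtime unavailable")))                        // surfaced to caller
}
```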
Oct 03 13:54:06 crc kubenswrapper[4962]: I1003 13:54:06.126840 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2c59f9aa34ea896d450b94140792844b2ad93e579fd4deacee365bdcf702b13"} err="failed to get container status \"e2c59f9aa34ea896d450b94140792844b2ad93e579fd4deacee365bdcf702b13\": rpc error: code = NotFound desc = could not find container \"e2c59f9aa34ea896d450b94140792844b2ad93e579fd4deacee365bdcf702b13\": container with ID starting with e2c59f9aa34ea896d450b94140792844b2ad93e579fd4deacee365bdcf702b13 not found: ID does not exist" Oct 03 13:54:06 crc kubenswrapper[4962]: I1003 13:54:06.173065 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b7d2dca-825f-478f-aacc-1bf26c9d0626-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9b7d2dca-825f-478f-aacc-1bf26c9d0626" (UID: "9b7d2dca-825f-478f-aacc-1bf26c9d0626"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:54:06 crc kubenswrapper[4962]: I1003 13:54:06.244661 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b7d2dca-825f-478f-aacc-1bf26c9d0626-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 13:54:06 crc kubenswrapper[4962]: I1003 13:54:06.348966 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hbckh"] Oct 03 13:54:06 crc kubenswrapper[4962]: I1003 13:54:06.354903 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hbckh"] Oct 03 13:54:08 crc kubenswrapper[4962]: I1003 13:54:08.227877 4962 scope.go:117] "RemoveContainer" containerID="d8246a9247f30fd9cfdb2fa05e4bb1b8083e0fbdbebe283fd86aa9a02ba91160" Oct 03 13:54:08 crc kubenswrapper[4962]: E1003 13:54:08.228587 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:54:08 crc kubenswrapper[4962]: I1003 13:54:08.237108 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b7d2dca-825f-478f-aacc-1bf26c9d0626" path="/var/lib/kubelet/pods/9b7d2dca-825f-478f-aacc-1bf26c9d0626/volumes" Oct 03 13:54:21 crc kubenswrapper[4962]: I1003 13:54:21.228869 4962 scope.go:117] "RemoveContainer" containerID="d8246a9247f30fd9cfdb2fa05e4bb1b8083e0fbdbebe283fd86aa9a02ba91160" Oct 03 13:54:21 crc kubenswrapper[4962]: E1003 13:54:21.232118 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:54:36 crc kubenswrapper[4962]: I1003 13:54:36.227008 4962 scope.go:117] "RemoveContainer" containerID="d8246a9247f30fd9cfdb2fa05e4bb1b8083e0fbdbebe283fd86aa9a02ba91160" Oct 03 13:54:36 crc kubenswrapper[4962]: E1003 13:54:36.227740 4962 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:54:51 crc kubenswrapper[4962]: I1003 13:54:51.227262 4962 scope.go:117] "RemoveContainer" containerID="d8246a9247f30fd9cfdb2fa05e4bb1b8083e0fbdbebe283fd86aa9a02ba91160" Oct 03 13:54:51 crc kubenswrapper[4962]: E1003 13:54:51.228173 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:55:02 crc kubenswrapper[4962]: I1003 13:55:02.230997 4962 scope.go:117] "RemoveContainer" containerID="d8246a9247f30fd9cfdb2fa05e4bb1b8083e0fbdbebe283fd86aa9a02ba91160" Oct 03 13:55:02 crc kubenswrapper[4962]: E1003 13:55:02.231799 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:55:13 crc kubenswrapper[4962]: I1003 13:55:13.226818 4962 scope.go:117] "RemoveContainer" containerID="d8246a9247f30fd9cfdb2fa05e4bb1b8083e0fbdbebe283fd86aa9a02ba91160" Oct 03 13:55:13 crc kubenswrapper[4962]: E1003 13:55:13.227501 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:55:24 crc kubenswrapper[4962]: I1003 13:55:24.228029 4962 scope.go:117] "RemoveContainer" containerID="d8246a9247f30fd9cfdb2fa05e4bb1b8083e0fbdbebe283fd86aa9a02ba91160" Oct 03 13:55:24 crc kubenswrapper[4962]: E1003 13:55:24.228924 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:55:29 crc kubenswrapper[4962]: I1003 13:55:29.718985 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jvvws"] Oct 03 13:55:29 crc kubenswrapper[4962]: E1003 13:55:29.719702 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b7d2dca-825f-478f-aacc-1bf26c9d0626" containerName="registry-server" Oct 03 13:55:29 crc kubenswrapper[4962]: I1003 13:55:29.719718 4962 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9b7d2dca-825f-478f-aacc-1bf26c9d0626" containerName="registry-server" Oct 03 13:55:29 crc kubenswrapper[4962]: E1003 13:55:29.719740 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b7d2dca-825f-478f-aacc-1bf26c9d0626" containerName="extract-content" Oct 03 13:55:29 crc kubenswrapper[4962]: I1003 13:55:29.719750 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b7d2dca-825f-478f-aacc-1bf26c9d0626" containerName="extract-content" Oct 03 13:55:29 crc kubenswrapper[4962]: E1003 13:55:29.719778 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b7d2dca-825f-478f-aacc-1bf26c9d0626" containerName="extract-utilities" Oct 03 13:55:29 crc kubenswrapper[4962]: I1003 13:55:29.719788 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b7d2dca-825f-478f-aacc-1bf26c9d0626" containerName="extract-utilities" Oct 03 13:55:29 crc kubenswrapper[4962]: I1003 13:55:29.719997 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b7d2dca-825f-478f-aacc-1bf26c9d0626" containerName="registry-server" Oct 03 13:55:29 crc kubenswrapper[4962]: I1003 13:55:29.722355 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jvvws" Oct 03 13:55:29 crc kubenswrapper[4962]: I1003 13:55:29.727019 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jvvws"] Oct 03 13:55:29 crc kubenswrapper[4962]: I1003 13:55:29.915890 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r8bp\" (UniqueName: \"kubernetes.io/projected/8e272fbf-6c23-45ca-a367-e30d58a2237a-kube-api-access-8r8bp\") pod \"redhat-marketplace-jvvws\" (UID: \"8e272fbf-6c23-45ca-a367-e30d58a2237a\") " pod="openshift-marketplace/redhat-marketplace-jvvws" Oct 03 13:55:29 crc kubenswrapper[4962]: I1003 13:55:29.916271 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e272fbf-6c23-45ca-a367-e30d58a2237a-utilities\") pod \"redhat-marketplace-jvvws\" (UID: \"8e272fbf-6c23-45ca-a367-e30d58a2237a\") " pod="openshift-marketplace/redhat-marketplace-jvvws" Oct 03 13:55:29 crc kubenswrapper[4962]: I1003 13:55:29.916353 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e272fbf-6c23-45ca-a367-e30d58a2237a-catalog-content\") pod \"redhat-marketplace-jvvws\" (UID: \"8e272fbf-6c23-45ca-a367-e30d58a2237a\") " pod="openshift-marketplace/redhat-marketplace-jvvws" Oct 03 13:55:30 crc kubenswrapper[4962]: I1003 13:55:30.017342 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e272fbf-6c23-45ca-a367-e30d58a2237a-catalog-content\") pod \"redhat-marketplace-jvvws\" (UID: \"8e272fbf-6c23-45ca-a367-e30d58a2237a\") " pod="openshift-marketplace/redhat-marketplace-jvvws" Oct 03 13:55:30 crc kubenswrapper[4962]: I1003 13:55:30.017423 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8r8bp\" (UniqueName: \"kubernetes.io/projected/8e272fbf-6c23-45ca-a367-e30d58a2237a-kube-api-access-8r8bp\") pod \"redhat-marketplace-jvvws\" (UID: \"8e272fbf-6c23-45ca-a367-e30d58a2237a\") " pod="openshift-marketplace/redhat-marketplace-jvvws" Oct 03 13:55:30 crc kubenswrapper[4962]: I1003 13:55:30.017472 4962 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e272fbf-6c23-45ca-a367-e30d58a2237a-utilities\") pod \"redhat-marketplace-jvvws\" (UID: \"8e272fbf-6c23-45ca-a367-e30d58a2237a\") " pod="openshift-marketplace/redhat-marketplace-jvvws" Oct 03 13:55:30 crc kubenswrapper[4962]: I1003 13:55:30.017810 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e272fbf-6c23-45ca-a367-e30d58a2237a-catalog-content\") pod \"redhat-marketplace-jvvws\" (UID: \"8e272fbf-6c23-45ca-a367-e30d58a2237a\") " pod="openshift-marketplace/redhat-marketplace-jvvws" Oct 03 13:55:30 crc kubenswrapper[4962]: I1003 13:55:30.017861 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e272fbf-6c23-45ca-a367-e30d58a2237a-utilities\") pod \"redhat-marketplace-jvvws\" (UID: \"8e272fbf-6c23-45ca-a367-e30d58a2237a\") " pod="openshift-marketplace/redhat-marketplace-jvvws" Oct 03 13:55:30 crc kubenswrapper[4962]: I1003 13:55:30.039504 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r8bp\" (UniqueName: \"kubernetes.io/projected/8e272fbf-6c23-45ca-a367-e30d58a2237a-kube-api-access-8r8bp\") pod \"redhat-marketplace-jvvws\" (UID: \"8e272fbf-6c23-45ca-a367-e30d58a2237a\") " pod="openshift-marketplace/redhat-marketplace-jvvws" Oct 03 13:55:30 crc kubenswrapper[4962]: I1003 13:55:30.057742 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jvvws" Oct 03 13:55:30 crc kubenswrapper[4962]: I1003 13:55:30.272833 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jvvws"] Oct 03 13:55:30 crc kubenswrapper[4962]: I1003 13:55:30.717165 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jvvws" event={"ID":"8e272fbf-6c23-45ca-a367-e30d58a2237a","Type":"ContainerStarted","Data":"5107df46240e55627327ec1c010ea14c759ec2aafc866dbd66d0af29183caad8"} Oct 03 13:55:31 crc kubenswrapper[4962]: I1003 13:55:31.726948 4962 generic.go:334] "Generic (PLEG): container finished" podID="8e272fbf-6c23-45ca-a367-e30d58a2237a" containerID="2bab363b0c291b4b6a614a7a50408691c2ce3774767102e1979775f309639cb7" exitCode=0 Oct 03 13:55:31 crc kubenswrapper[4962]: I1003 13:55:31.726996 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jvvws" event={"ID":"8e272fbf-6c23-45ca-a367-e30d58a2237a","Type":"ContainerDied","Data":"2bab363b0c291b4b6a614a7a50408691c2ce3774767102e1979775f309639cb7"} Oct 03 13:55:31 crc kubenswrapper[4962]: I1003 13:55:31.729969 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 13:55:33 crc kubenswrapper[4962]: I1003 13:55:33.756341 4962 generic.go:334] "Generic (PLEG): container finished" podID="8e272fbf-6c23-45ca-a367-e30d58a2237a" containerID="9f59d41a616a7743a3102fb589d017cfee1147fccba7c1164c0212f5c93dbebf" exitCode=0 Oct 03 13:55:33 crc kubenswrapper[4962]: I1003 13:55:33.756454 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jvvws" event={"ID":"8e272fbf-6c23-45ca-a367-e30d58a2237a","Type":"ContainerDied","Data":"9f59d41a616a7743a3102fb589d017cfee1147fccba7c1164c0212f5c93dbebf"} Oct 03 13:55:36 crc kubenswrapper[4962]: I1003 13:55:36.796418 4962 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jvvws" event={"ID":"8e272fbf-6c23-45ca-a367-e30d58a2237a","Type":"ContainerStarted","Data":"9280789ea66f18eb565f83a42335793f4fe2a6a29a8d009ec8c4667fcec1403d"} Oct 03 13:55:36 crc kubenswrapper[4962]: I1003 13:55:36.824272 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jvvws" podStartSLOduration=3.7056941119999998 podStartE2EDuration="7.824243201s" podCreationTimestamp="2025-10-03 13:55:29 +0000 UTC" firstStartedPulling="2025-10-03 13:55:31.729748616 +0000 UTC m=+3940.133646451" lastFinishedPulling="2025-10-03 13:55:35.848297705 +0000 UTC m=+3944.252195540" observedRunningTime="2025-10-03 13:55:36.823024058 +0000 UTC m=+3945.226921893" watchObservedRunningTime="2025-10-03 13:55:36.824243201 +0000 UTC m=+3945.228141066" Oct 03 13:55:37 crc kubenswrapper[4962]: I1003 13:55:37.227672 4962 scope.go:117] "RemoveContainer" containerID="d8246a9247f30fd9cfdb2fa05e4bb1b8083e0fbdbebe283fd86aa9a02ba91160" Oct 03 13:55:37 crc kubenswrapper[4962]: E1003 13:55:37.227861 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:55:40 crc kubenswrapper[4962]: I1003 13:55:40.057873 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jvvws" Oct 03 13:55:40 crc kubenswrapper[4962]: I1003 13:55:40.058178 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jvvws" Oct 03 13:55:40 crc kubenswrapper[4962]: I1003 13:55:40.095244 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jvvws" Oct 03 13:55:50 crc kubenswrapper[4962]: I1003 13:55:50.100288 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jvvws" Oct 03 13:55:50 crc kubenswrapper[4962]: I1003 13:55:50.150837 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jvvws"] Oct 03 13:55:50 crc kubenswrapper[4962]: I1003 13:55:50.227821 4962 scope.go:117] "RemoveContainer" containerID="d8246a9247f30fd9cfdb2fa05e4bb1b8083e0fbdbebe283fd86aa9a02ba91160" Oct 03 13:55:50 crc kubenswrapper[4962]: E1003 13:55:50.228107 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 13:55:50 crc kubenswrapper[4962]: I1003 13:55:50.915786 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jvvws" podUID="8e272fbf-6c23-45ca-a367-e30d58a2237a" containerName="registry-server" containerID="cri-o://9280789ea66f18eb565f83a42335793f4fe2a6a29a8d009ec8c4667fcec1403d" gracePeriod=2 Oct 03 13:55:51 
crc kubenswrapper[4962]: I1003 13:55:51.417085 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jvvws" Oct 03 13:55:51 crc kubenswrapper[4962]: I1003 13:55:51.424006 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e272fbf-6c23-45ca-a367-e30d58a2237a-utilities\") pod \"8e272fbf-6c23-45ca-a367-e30d58a2237a\" (UID: \"8e272fbf-6c23-45ca-a367-e30d58a2237a\") " Oct 03 13:55:51 crc kubenswrapper[4962]: I1003 13:55:51.424063 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e272fbf-6c23-45ca-a367-e30d58a2237a-catalog-content\") pod \"8e272fbf-6c23-45ca-a367-e30d58a2237a\" (UID: \"8e272fbf-6c23-45ca-a367-e30d58a2237a\") " Oct 03 13:55:51 crc kubenswrapper[4962]: I1003 13:55:51.424112 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8r8bp\" (UniqueName: \"kubernetes.io/projected/8e272fbf-6c23-45ca-a367-e30d58a2237a-kube-api-access-8r8bp\") pod \"8e272fbf-6c23-45ca-a367-e30d58a2237a\" (UID: \"8e272fbf-6c23-45ca-a367-e30d58a2237a\") " Oct 03 13:55:51 crc kubenswrapper[4962]: I1003 13:55:51.425845 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e272fbf-6c23-45ca-a367-e30d58a2237a-utilities" (OuterVolumeSpecName: "utilities") pod "8e272fbf-6c23-45ca-a367-e30d58a2237a" (UID: "8e272fbf-6c23-45ca-a367-e30d58a2237a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:55:51 crc kubenswrapper[4962]: I1003 13:55:51.432813 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e272fbf-6c23-45ca-a367-e30d58a2237a-kube-api-access-8r8bp" (OuterVolumeSpecName: "kube-api-access-8r8bp") pod "8e272fbf-6c23-45ca-a367-e30d58a2237a" (UID: "8e272fbf-6c23-45ca-a367-e30d58a2237a"). InnerVolumeSpecName "kube-api-access-8r8bp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 13:55:51 crc kubenswrapper[4962]: I1003 13:55:51.444416 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e272fbf-6c23-45ca-a367-e30d58a2237a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8e272fbf-6c23-45ca-a367-e30d58a2237a" (UID: "8e272fbf-6c23-45ca-a367-e30d58a2237a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 13:55:51 crc kubenswrapper[4962]: I1003 13:55:51.525838 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e272fbf-6c23-45ca-a367-e30d58a2237a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 13:55:51 crc kubenswrapper[4962]: I1003 13:55:51.525884 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8r8bp\" (UniqueName: \"kubernetes.io/projected/8e272fbf-6c23-45ca-a367-e30d58a2237a-kube-api-access-8r8bp\") on node \"crc\" DevicePath \"\"" Oct 03 13:55:51 crc kubenswrapper[4962]: I1003 13:55:51.525900 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e272fbf-6c23-45ca-a367-e30d58a2237a-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 13:55:51 crc kubenswrapper[4962]: I1003 13:55:51.927682 4962 generic.go:334] "Generic (PLEG): container finished" podID="8e272fbf-6c23-45ca-a367-e30d58a2237a" containerID="9280789ea66f18eb565f83a42335793f4fe2a6a29a8d009ec8c4667fcec1403d" exitCode=0 Oct 03 13:55:51 crc kubenswrapper[4962]: I1003 13:55:51.927812 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jvvws" Oct 03 13:55:51 crc kubenswrapper[4962]: I1003 13:55:51.928070 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jvvws" event={"ID":"8e272fbf-6c23-45ca-a367-e30d58a2237a","Type":"ContainerDied","Data":"9280789ea66f18eb565f83a42335793f4fe2a6a29a8d009ec8c4667fcec1403d"} Oct 03 13:55:51 crc kubenswrapper[4962]: I1003 13:55:51.928193 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jvvws" event={"ID":"8e272fbf-6c23-45ca-a367-e30d58a2237a","Type":"ContainerDied","Data":"5107df46240e55627327ec1c010ea14c759ec2aafc866dbd66d0af29183caad8"} Oct 03 13:55:51 crc kubenswrapper[4962]: I1003 13:55:51.928281 4962 scope.go:117] "RemoveContainer" containerID="9280789ea66f18eb565f83a42335793f4fe2a6a29a8d009ec8c4667fcec1403d" Oct 03 13:55:51 crc kubenswrapper[4962]: I1003 13:55:51.952140 4962 scope.go:117] "RemoveContainer" containerID="9f59d41a616a7743a3102fb589d017cfee1147fccba7c1164c0212f5c93dbebf" Oct 03 13:55:51 crc kubenswrapper[4962]: I1003 13:55:51.970071 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jvvws"] Oct 03 13:55:51 crc kubenswrapper[4962]: I1003 13:55:51.976298 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jvvws"] Oct 03 13:55:52 crc kubenswrapper[4962]: I1003 13:55:52.078358 4962 scope.go:117] "RemoveContainer" containerID="2bab363b0c291b4b6a614a7a50408691c2ce3774767102e1979775f309639cb7" Oct 03 13:55:52 crc kubenswrapper[4962]: I1003 13:55:52.102255 4962 scope.go:117] "RemoveContainer" containerID="9280789ea66f18eb565f83a42335793f4fe2a6a29a8d009ec8c4667fcec1403d" Oct 03 13:55:52 crc kubenswrapper[4962]: E1003 13:55:52.102605 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9280789ea66f18eb565f83a42335793f4fe2a6a29a8d009ec8c4667fcec1403d\": container with ID starting with 9280789ea66f18eb565f83a42335793f4fe2a6a29a8d009ec8c4667fcec1403d not found: ID does not exist" containerID="9280789ea66f18eb565f83a42335793f4fe2a6a29a8d009ec8c4667fcec1403d" Oct 03 13:55:52 crc kubenswrapper[4962]: I1003 13:55:52.102649 4962 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9280789ea66f18eb565f83a42335793f4fe2a6a29a8d009ec8c4667fcec1403d"} err="failed to get container status \"9280789ea66f18eb565f83a42335793f4fe2a6a29a8d009ec8c4667fcec1403d\": rpc error: code = NotFound desc = could not find container \"9280789ea66f18eb565f83a42335793f4fe2a6a29a8d009ec8c4667fcec1403d\": container with ID starting with 9280789ea66f18eb565f83a42335793f4fe2a6a29a8d009ec8c4667fcec1403d not found: ID does not exist" Oct 03 13:55:52 crc kubenswrapper[4962]: I1003 13:55:52.102672 4962 scope.go:117] "RemoveContainer" containerID="9f59d41a616a7743a3102fb589d017cfee1147fccba7c1164c0212f5c93dbebf" Oct 03 13:55:52 crc kubenswrapper[4962]: E1003 13:55:52.102937 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f59d41a616a7743a3102fb589d017cfee1147fccba7c1164c0212f5c93dbebf\": container with ID starting with 9f59d41a616a7743a3102fb589d017cfee1147fccba7c1164c0212f5c93dbebf not found: ID does not exist" containerID="9f59d41a616a7743a3102fb589d017cfee1147fccba7c1164c0212f5c93dbebf" Oct 03 13:55:52 crc kubenswrapper[4962]: I1003 13:55:52.102970 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f59d41a616a7743a3102fb589d017cfee1147fccba7c1164c0212f5c93dbebf"} err="failed to get container status \"9f59d41a616a7743a3102fb589d017cfee1147fccba7c1164c0212f5c93dbebf\": rpc error: code = NotFound desc = could not find container \"9f59d41a616a7743a3102fb589d017cfee1147fccba7c1164c0212f5c93dbebf\": container with ID starting with 9f59d41a616a7743a3102fb589d017cfee1147fccba7c1164c0212f5c93dbebf not found: ID does not exist" Oct 03 13:55:52 crc kubenswrapper[4962]: I1003 13:55:52.102991 4962 scope.go:117] "RemoveContainer" containerID="2bab363b0c291b4b6a614a7a50408691c2ce3774767102e1979775f309639cb7" Oct 03 13:55:52 crc kubenswrapper[4962]: E1003 13:55:52.103224 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bab363b0c291b4b6a614a7a50408691c2ce3774767102e1979775f309639cb7\": container with ID starting with 2bab363b0c291b4b6a614a7a50408691c2ce3774767102e1979775f309639cb7 not found: ID does not exist" containerID="2bab363b0c291b4b6a614a7a50408691c2ce3774767102e1979775f309639cb7" Oct 03 13:55:52 crc kubenswrapper[4962]: I1003 13:55:52.103248 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bab363b0c291b4b6a614a7a50408691c2ce3774767102e1979775f309639cb7"} err="failed to get container status \"2bab363b0c291b4b6a614a7a50408691c2ce3774767102e1979775f309639cb7\": rpc error: code = NotFound desc = could not find container \"2bab363b0c291b4b6a614a7a50408691c2ce3774767102e1979775f309639cb7\": container with ID starting with 2bab363b0c291b4b6a614a7a50408691c2ce3774767102e1979775f309639cb7 not found: ID does not exist" Oct 03 13:55:52 crc kubenswrapper[4962]: I1003 13:55:52.235354 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e272fbf-6c23-45ca-a367-e30d58a2237a" path="/var/lib/kubelet/pods/8e272fbf-6c23-45ca-a367-e30d58a2237a/volumes" Oct 03 13:56:02 crc kubenswrapper[4962]: I1003 13:56:02.230536 4962 scope.go:117] "RemoveContainer" containerID="d8246a9247f30fd9cfdb2fa05e4bb1b8083e0fbdbebe283fd86aa9a02ba91160" Oct 03 13:56:03 crc kubenswrapper[4962]: I1003 13:56:03.012118 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerStarted","Data":"8918e54f81faa5c146ad0159b1744e6c4fb2cdd86bd95a7701b5e6aa4606ec21"} Oct 03 13:58:24 crc kubenswrapper[4962]: I1003 13:58:24.659712 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 13:58:24 crc kubenswrapper[4962]: I1003 13:58:24.660702 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 13:58:54 crc kubenswrapper[4962]: I1003 13:58:54.660043 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 13:58:54 crc kubenswrapper[4962]: I1003 13:58:54.660865 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 13:59:24 crc kubenswrapper[4962]: I1003 13:59:24.664227 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 13:59:24 crc kubenswrapper[4962]: I1003 13:59:24.665377 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 13:59:24 crc kubenswrapper[4962]: I1003 13:59:24.665554 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-46vck" Oct 03 13:59:24 crc kubenswrapper[4962]: I1003 13:59:24.667539 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8918e54f81faa5c146ad0159b1744e6c4fb2cdd86bd95a7701b5e6aa4606ec21"} pod="openshift-machine-config-operator/machine-config-daemon-46vck" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 13:59:24 crc kubenswrapper[4962]: I1003 13:59:24.667922 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" containerID="cri-o://8918e54f81faa5c146ad0159b1744e6c4fb2cdd86bd95a7701b5e6aa4606ec21" gracePeriod=600 Oct 03 13:59:24 crc kubenswrapper[4962]: I1003 
13:59:24.837346 4962 generic.go:334] "Generic (PLEG): container finished" podID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerID="8918e54f81faa5c146ad0159b1744e6c4fb2cdd86bd95a7701b5e6aa4606ec21" exitCode=0 Oct 03 13:59:24 crc kubenswrapper[4962]: I1003 13:59:24.837475 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerDied","Data":"8918e54f81faa5c146ad0159b1744e6c4fb2cdd86bd95a7701b5e6aa4606ec21"} Oct 03 13:59:24 crc kubenswrapper[4962]: I1003 13:59:24.837779 4962 scope.go:117] "RemoveContainer" containerID="d8246a9247f30fd9cfdb2fa05e4bb1b8083e0fbdbebe283fd86aa9a02ba91160" Oct 03 13:59:25 crc kubenswrapper[4962]: I1003 13:59:25.848188 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerStarted","Data":"78fd53b856a2c25b3aeed835db87e455f1051e67aa0b9805eefac674488eb7e4"} Oct 03 14:00:00 crc kubenswrapper[4962]: I1003 14:00:00.142470 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325000-dxmn6"] Oct 03 14:00:00 crc kubenswrapper[4962]: E1003 14:00:00.143572 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e272fbf-6c23-45ca-a367-e30d58a2237a" containerName="registry-server" Oct 03 14:00:00 crc kubenswrapper[4962]: I1003 14:00:00.143589 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e272fbf-6c23-45ca-a367-e30d58a2237a" containerName="registry-server" Oct 03 14:00:00 crc kubenswrapper[4962]: E1003 14:00:00.143613 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e272fbf-6c23-45ca-a367-e30d58a2237a" containerName="extract-content" Oct 03 14:00:00 crc kubenswrapper[4962]: I1003 14:00:00.143623 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e272fbf-6c23-45ca-a367-e30d58a2237a" containerName="extract-content" Oct 03 14:00:00 crc kubenswrapper[4962]: E1003 14:00:00.143709 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e272fbf-6c23-45ca-a367-e30d58a2237a" containerName="extract-utilities" Oct 03 14:00:00 crc kubenswrapper[4962]: I1003 14:00:00.143719 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e272fbf-6c23-45ca-a367-e30d58a2237a" containerName="extract-utilities" Oct 03 14:00:00 crc kubenswrapper[4962]: I1003 14:00:00.143898 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e272fbf-6c23-45ca-a367-e30d58a2237a" containerName="registry-server" Oct 03 14:00:00 crc kubenswrapper[4962]: I1003 14:00:00.147552 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325000-dxmn6" Oct 03 14:00:00 crc kubenswrapper[4962]: I1003 14:00:00.150454 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 14:00:00 crc kubenswrapper[4962]: I1003 14:00:00.150465 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 14:00:00 crc kubenswrapper[4962]: I1003 14:00:00.156309 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325000-dxmn6"] Oct 03 14:00:00 crc kubenswrapper[4962]: I1003 14:00:00.217433 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh5d9\" (UniqueName: \"kubernetes.io/projected/fb16333f-5f61-4323-bc2f-4a394e9ad6bb-kube-api-access-vh5d9\") pod \"collect-profiles-29325000-dxmn6\" (UID: \"fb16333f-5f61-4323-bc2f-4a394e9ad6bb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325000-dxmn6" Oct 03 14:00:00 crc kubenswrapper[4962]: I1003 14:00:00.217489 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb16333f-5f61-4323-bc2f-4a394e9ad6bb-config-volume\") pod \"collect-profiles-29325000-dxmn6\" (UID: \"fb16333f-5f61-4323-bc2f-4a394e9ad6bb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325000-dxmn6" Oct 03 14:00:00 crc kubenswrapper[4962]: I1003 14:00:00.217529 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fb16333f-5f61-4323-bc2f-4a394e9ad6bb-secret-volume\") pod \"collect-profiles-29325000-dxmn6\" (UID: \"fb16333f-5f61-4323-bc2f-4a394e9ad6bb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325000-dxmn6" Oct 03 14:00:00 crc kubenswrapper[4962]: I1003 14:00:00.318622 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh5d9\" (UniqueName: \"kubernetes.io/projected/fb16333f-5f61-4323-bc2f-4a394e9ad6bb-kube-api-access-vh5d9\") pod \"collect-profiles-29325000-dxmn6\" (UID: \"fb16333f-5f61-4323-bc2f-4a394e9ad6bb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325000-dxmn6" Oct 03 14:00:00 crc kubenswrapper[4962]: I1003 14:00:00.318693 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb16333f-5f61-4323-bc2f-4a394e9ad6bb-config-volume\") pod \"collect-profiles-29325000-dxmn6\" (UID: \"fb16333f-5f61-4323-bc2f-4a394e9ad6bb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325000-dxmn6" Oct 03 14:00:00 crc kubenswrapper[4962]: I1003 14:00:00.318728 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fb16333f-5f61-4323-bc2f-4a394e9ad6bb-secret-volume\") pod \"collect-profiles-29325000-dxmn6\" (UID: \"fb16333f-5f61-4323-bc2f-4a394e9ad6bb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325000-dxmn6" Oct 03 14:00:00 crc kubenswrapper[4962]: I1003 14:00:00.320071 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb16333f-5f61-4323-bc2f-4a394e9ad6bb-config-volume\") pod 
\"collect-profiles-29325000-dxmn6\" (UID: \"fb16333f-5f61-4323-bc2f-4a394e9ad6bb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325000-dxmn6" Oct 03 14:00:00 crc kubenswrapper[4962]: I1003 14:00:00.332711 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fb16333f-5f61-4323-bc2f-4a394e9ad6bb-secret-volume\") pod \"collect-profiles-29325000-dxmn6\" (UID: \"fb16333f-5f61-4323-bc2f-4a394e9ad6bb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325000-dxmn6" Oct 03 14:00:00 crc kubenswrapper[4962]: I1003 14:00:00.339318 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh5d9\" (UniqueName: \"kubernetes.io/projected/fb16333f-5f61-4323-bc2f-4a394e9ad6bb-kube-api-access-vh5d9\") pod \"collect-profiles-29325000-dxmn6\" (UID: \"fb16333f-5f61-4323-bc2f-4a394e9ad6bb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325000-dxmn6" Oct 03 14:00:00 crc kubenswrapper[4962]: I1003 14:00:00.470680 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325000-dxmn6" Oct 03 14:00:01 crc kubenswrapper[4962]: I1003 14:00:01.166795 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325000-dxmn6"] Oct 03 14:00:01 crc kubenswrapper[4962]: W1003 14:00:01.172076 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb16333f_5f61_4323_bc2f_4a394e9ad6bb.slice/crio-cff92c62c4879a277d2493b5e57623900c39c835fa3c5788f199a50871e26617 WatchSource:0}: Error finding container cff92c62c4879a277d2493b5e57623900c39c835fa3c5788f199a50871e26617: Status 404 returned error can't find the container with id cff92c62c4879a277d2493b5e57623900c39c835fa3c5788f199a50871e26617 Oct 03 14:00:02 crc kubenswrapper[4962]: I1003 14:00:02.135687 4962 generic.go:334] "Generic (PLEG): container finished" podID="fb16333f-5f61-4323-bc2f-4a394e9ad6bb" containerID="7264fb0675d236a052f6285b069ba0c322c8da031143de6d25e453a179ec09ae" exitCode=0 Oct 03 14:00:02 crc kubenswrapper[4962]: I1003 14:00:02.135737 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325000-dxmn6" event={"ID":"fb16333f-5f61-4323-bc2f-4a394e9ad6bb","Type":"ContainerDied","Data":"7264fb0675d236a052f6285b069ba0c322c8da031143de6d25e453a179ec09ae"} Oct 03 14:00:02 crc kubenswrapper[4962]: I1003 14:00:02.136016 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325000-dxmn6" event={"ID":"fb16333f-5f61-4323-bc2f-4a394e9ad6bb","Type":"ContainerStarted","Data":"cff92c62c4879a277d2493b5e57623900c39c835fa3c5788f199a50871e26617"} Oct 03 14:00:03 crc kubenswrapper[4962]: I1003 14:00:03.463480 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325000-dxmn6" Oct 03 14:00:03 crc kubenswrapper[4962]: I1003 14:00:03.562544 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vh5d9\" (UniqueName: \"kubernetes.io/projected/fb16333f-5f61-4323-bc2f-4a394e9ad6bb-kube-api-access-vh5d9\") pod \"fb16333f-5f61-4323-bc2f-4a394e9ad6bb\" (UID: \"fb16333f-5f61-4323-bc2f-4a394e9ad6bb\") " Oct 03 14:00:03 crc kubenswrapper[4962]: I1003 14:00:03.562649 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb16333f-5f61-4323-bc2f-4a394e9ad6bb-config-volume\") pod \"fb16333f-5f61-4323-bc2f-4a394e9ad6bb\" (UID: \"fb16333f-5f61-4323-bc2f-4a394e9ad6bb\") " Oct 03 14:00:03 crc kubenswrapper[4962]: I1003 14:00:03.562794 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fb16333f-5f61-4323-bc2f-4a394e9ad6bb-secret-volume\") pod \"fb16333f-5f61-4323-bc2f-4a394e9ad6bb\" (UID: \"fb16333f-5f61-4323-bc2f-4a394e9ad6bb\") " Oct 03 14:00:03 crc kubenswrapper[4962]: I1003 14:00:03.563761 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb16333f-5f61-4323-bc2f-4a394e9ad6bb-config-volume" (OuterVolumeSpecName: "config-volume") pod "fb16333f-5f61-4323-bc2f-4a394e9ad6bb" (UID: "fb16333f-5f61-4323-bc2f-4a394e9ad6bb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:00:03 crc kubenswrapper[4962]: I1003 14:00:03.570023 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb16333f-5f61-4323-bc2f-4a394e9ad6bb-kube-api-access-vh5d9" (OuterVolumeSpecName: "kube-api-access-vh5d9") pod "fb16333f-5f61-4323-bc2f-4a394e9ad6bb" (UID: "fb16333f-5f61-4323-bc2f-4a394e9ad6bb"). InnerVolumeSpecName "kube-api-access-vh5d9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:00:03 crc kubenswrapper[4962]: I1003 14:00:03.570358 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb16333f-5f61-4323-bc2f-4a394e9ad6bb-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "fb16333f-5f61-4323-bc2f-4a394e9ad6bb" (UID: "fb16333f-5f61-4323-bc2f-4a394e9ad6bb"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:00:03 crc kubenswrapper[4962]: I1003 14:00:03.665013 4962 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fb16333f-5f61-4323-bc2f-4a394e9ad6bb-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 14:00:03 crc kubenswrapper[4962]: I1003 14:00:03.665075 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vh5d9\" (UniqueName: \"kubernetes.io/projected/fb16333f-5f61-4323-bc2f-4a394e9ad6bb-kube-api-access-vh5d9\") on node \"crc\" DevicePath \"\"" Oct 03 14:00:03 crc kubenswrapper[4962]: I1003 14:00:03.665090 4962 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb16333f-5f61-4323-bc2f-4a394e9ad6bb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 14:00:04 crc kubenswrapper[4962]: I1003 14:00:04.154741 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325000-dxmn6" event={"ID":"fb16333f-5f61-4323-bc2f-4a394e9ad6bb","Type":"ContainerDied","Data":"cff92c62c4879a277d2493b5e57623900c39c835fa3c5788f199a50871e26617"} Oct 03 14:00:04 crc kubenswrapper[4962]: I1003 14:00:04.154802 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cff92c62c4879a277d2493b5e57623900c39c835fa3c5788f199a50871e26617" Oct 03 14:00:04 crc kubenswrapper[4962]: I1003 14:00:04.154891 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325000-dxmn6" Oct 03 14:00:04 crc kubenswrapper[4962]: I1003 14:00:04.529361 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324955-9lcjk"] Oct 03 14:00:04 crc kubenswrapper[4962]: I1003 14:00:04.535705 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324955-9lcjk"] Oct 03 14:00:06 crc kubenswrapper[4962]: I1003 14:00:06.240622 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a089db49-8a99-46c5-99ca-732f3aa1168d" path="/var/lib/kubelet/pods/a089db49-8a99-46c5-99ca-732f3aa1168d/volumes" Oct 03 14:00:26 crc kubenswrapper[4962]: E1003 14:00:26.030336 4962 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="5.804s" Oct 03 14:00:59 crc kubenswrapper[4962]: I1003 14:00:59.880609 4962 scope.go:117] "RemoveContainer" containerID="910b846a7bb4ec67328350d581d76033e7a5b5f7605498fd753370a867f94313" Oct 03 14:01:17 crc kubenswrapper[4962]: I1003 14:01:17.990260 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ss6tq"] Oct 03 14:01:17 crc kubenswrapper[4962]: E1003 14:01:17.991238 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb16333f-5f61-4323-bc2f-4a394e9ad6bb" containerName="collect-profiles" Oct 03 14:01:17 crc kubenswrapper[4962]: I1003 14:01:17.991250 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb16333f-5f61-4323-bc2f-4a394e9ad6bb" containerName="collect-profiles" Oct 03 14:01:17 crc kubenswrapper[4962]: I1003 14:01:17.991400 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb16333f-5f61-4323-bc2f-4a394e9ad6bb" containerName="collect-profiles" Oct 03 14:01:17 crc kubenswrapper[4962]: I1003 14:01:17.992343 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ss6tq" Oct 03 14:01:18 crc kubenswrapper[4962]: I1003 14:01:18.001808 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ss6tq"] Oct 03 14:01:18 crc kubenswrapper[4962]: I1003 14:01:18.089961 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlgmw\" (UniqueName: \"kubernetes.io/projected/2077bc20-6d94-46cf-bbd3-b16bb96d21a5-kube-api-access-tlgmw\") pod \"redhat-operators-ss6tq\" (UID: \"2077bc20-6d94-46cf-bbd3-b16bb96d21a5\") " pod="openshift-marketplace/redhat-operators-ss6tq" Oct 03 14:01:18 crc kubenswrapper[4962]: I1003 14:01:18.090014 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2077bc20-6d94-46cf-bbd3-b16bb96d21a5-catalog-content\") pod \"redhat-operators-ss6tq\" (UID: \"2077bc20-6d94-46cf-bbd3-b16bb96d21a5\") " pod="openshift-marketplace/redhat-operators-ss6tq" Oct 03 14:01:18 crc kubenswrapper[4962]: I1003 14:01:18.090041 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2077bc20-6d94-46cf-bbd3-b16bb96d21a5-utilities\") pod \"redhat-operators-ss6tq\" (UID: \"2077bc20-6d94-46cf-bbd3-b16bb96d21a5\") " pod="openshift-marketplace/redhat-operators-ss6tq" Oct 03 14:01:18 crc kubenswrapper[4962]: I1003 14:01:18.191198 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlgmw\" (UniqueName: \"kubernetes.io/projected/2077bc20-6d94-46cf-bbd3-b16bb96d21a5-kube-api-access-tlgmw\") pod \"redhat-operators-ss6tq\" (UID: \"2077bc20-6d94-46cf-bbd3-b16bb96d21a5\") " pod="openshift-marketplace/redhat-operators-ss6tq" Oct 03 14:01:18 crc kubenswrapper[4962]: I1003 14:01:18.191277 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2077bc20-6d94-46cf-bbd3-b16bb96d21a5-catalog-content\") pod \"redhat-operators-ss6tq\" (UID: \"2077bc20-6d94-46cf-bbd3-b16bb96d21a5\") " pod="openshift-marketplace/redhat-operators-ss6tq" Oct 03 14:01:18 crc kubenswrapper[4962]: I1003 14:01:18.191337 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2077bc20-6d94-46cf-bbd3-b16bb96d21a5-utilities\") pod \"redhat-operators-ss6tq\" (UID: \"2077bc20-6d94-46cf-bbd3-b16bb96d21a5\") " pod="openshift-marketplace/redhat-operators-ss6tq" Oct 03 14:01:18 crc kubenswrapper[4962]: I1003 14:01:18.191977 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2077bc20-6d94-46cf-bbd3-b16bb96d21a5-utilities\") pod \"redhat-operators-ss6tq\" (UID: \"2077bc20-6d94-46cf-bbd3-b16bb96d21a5\") " pod="openshift-marketplace/redhat-operators-ss6tq" Oct 03 14:01:18 crc kubenswrapper[4962]: I1003 14:01:18.191991 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2077bc20-6d94-46cf-bbd3-b16bb96d21a5-catalog-content\") pod \"redhat-operators-ss6tq\" (UID: \"2077bc20-6d94-46cf-bbd3-b16bb96d21a5\") " pod="openshift-marketplace/redhat-operators-ss6tq" Oct 03 14:01:18 crc kubenswrapper[4962]: I1003 14:01:18.219531 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-tlgmw\" (UniqueName: \"kubernetes.io/projected/2077bc20-6d94-46cf-bbd3-b16bb96d21a5-kube-api-access-tlgmw\") pod \"redhat-operators-ss6tq\" (UID: \"2077bc20-6d94-46cf-bbd3-b16bb96d21a5\") " pod="openshift-marketplace/redhat-operators-ss6tq" Oct 03 14:01:18 crc kubenswrapper[4962]: I1003 14:01:18.311418 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ss6tq" Oct 03 14:01:18 crc kubenswrapper[4962]: I1003 14:01:18.744728 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ss6tq"] Oct 03 14:01:19 crc kubenswrapper[4962]: I1003 14:01:19.737960 4962 generic.go:334] "Generic (PLEG): container finished" podID="2077bc20-6d94-46cf-bbd3-b16bb96d21a5" containerID="d37df5ed21c599fafd18dd27ecd82ead404c3f789034d3aac68f2ea15303b17c" exitCode=0 Oct 03 14:01:19 crc kubenswrapper[4962]: I1003 14:01:19.738065 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ss6tq" event={"ID":"2077bc20-6d94-46cf-bbd3-b16bb96d21a5","Type":"ContainerDied","Data":"d37df5ed21c599fafd18dd27ecd82ead404c3f789034d3aac68f2ea15303b17c"} Oct 03 14:01:19 crc kubenswrapper[4962]: I1003 14:01:19.738796 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ss6tq" event={"ID":"2077bc20-6d94-46cf-bbd3-b16bb96d21a5","Type":"ContainerStarted","Data":"087e9cb93006e0a11c29270f20e9af7f61bcce6ec7bba4474ebb26f30ad9adca"} Oct 03 14:01:19 crc kubenswrapper[4962]: I1003 14:01:19.741717 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 14:01:21 crc kubenswrapper[4962]: I1003 14:01:21.754774 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ss6tq" event={"ID":"2077bc20-6d94-46cf-bbd3-b16bb96d21a5","Type":"ContainerStarted","Data":"03756888f4be9b3ac4c26dbeb26499d1d812bf32f539a1ece9fbed36db025924"} Oct 03 14:01:22 crc kubenswrapper[4962]: I1003 14:01:22.763763 4962 generic.go:334] "Generic (PLEG): container finished" podID="2077bc20-6d94-46cf-bbd3-b16bb96d21a5" containerID="03756888f4be9b3ac4c26dbeb26499d1d812bf32f539a1ece9fbed36db025924" exitCode=0 Oct 03 14:01:22 crc kubenswrapper[4962]: I1003 14:01:22.763800 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ss6tq" event={"ID":"2077bc20-6d94-46cf-bbd3-b16bb96d21a5","Type":"ContainerDied","Data":"03756888f4be9b3ac4c26dbeb26499d1d812bf32f539a1ece9fbed36db025924"} Oct 03 14:01:23 crc kubenswrapper[4962]: I1003 14:01:23.773842 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ss6tq" event={"ID":"2077bc20-6d94-46cf-bbd3-b16bb96d21a5","Type":"ContainerStarted","Data":"66677ec22e74a12f964fd082569aa5d348a4bef4b581c1d253e48370fdcf5ca6"} Oct 03 14:01:23 crc kubenswrapper[4962]: I1003 14:01:23.796122 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ss6tq" podStartSLOduration=3.068006729 podStartE2EDuration="6.796101268s" podCreationTimestamp="2025-10-03 14:01:17 +0000 UTC" firstStartedPulling="2025-10-03 14:01:19.741456657 +0000 UTC m=+4288.145354492" lastFinishedPulling="2025-10-03 14:01:23.469551196 +0000 UTC m=+4291.873449031" observedRunningTime="2025-10-03 14:01:23.78986349 +0000 UTC m=+4292.193761355" watchObservedRunningTime="2025-10-03 14:01:23.796101268 +0000 UTC m=+4292.199999103" Oct 03 14:01:28 crc 
kubenswrapper[4962]: I1003 14:01:28.311614 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ss6tq" Oct 03 14:01:28 crc kubenswrapper[4962]: I1003 14:01:28.311765 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ss6tq" Oct 03 14:01:29 crc kubenswrapper[4962]: I1003 14:01:29.350501 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ss6tq" podUID="2077bc20-6d94-46cf-bbd3-b16bb96d21a5" containerName="registry-server" probeResult="failure" output=< Oct 03 14:01:29 crc kubenswrapper[4962]: timeout: failed to connect service ":50051" within 1s Oct 03 14:01:29 crc kubenswrapper[4962]: > Oct 03 14:01:38 crc kubenswrapper[4962]: I1003 14:01:38.352329 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ss6tq" Oct 03 14:01:38 crc kubenswrapper[4962]: I1003 14:01:38.398833 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ss6tq" Oct 03 14:01:38 crc kubenswrapper[4962]: I1003 14:01:38.589480 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ss6tq"] Oct 03 14:01:39 crc kubenswrapper[4962]: I1003 14:01:39.925618 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ss6tq" podUID="2077bc20-6d94-46cf-bbd3-b16bb96d21a5" containerName="registry-server" containerID="cri-o://66677ec22e74a12f964fd082569aa5d348a4bef4b581c1d253e48370fdcf5ca6" gracePeriod=2 Oct 03 14:01:40 crc kubenswrapper[4962]: I1003 14:01:40.337623 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ss6tq" Oct 03 14:01:40 crc kubenswrapper[4962]: I1003 14:01:40.527265 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlgmw\" (UniqueName: \"kubernetes.io/projected/2077bc20-6d94-46cf-bbd3-b16bb96d21a5-kube-api-access-tlgmw\") pod \"2077bc20-6d94-46cf-bbd3-b16bb96d21a5\" (UID: \"2077bc20-6d94-46cf-bbd3-b16bb96d21a5\") " Oct 03 14:01:40 crc kubenswrapper[4962]: I1003 14:01:40.527317 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2077bc20-6d94-46cf-bbd3-b16bb96d21a5-catalog-content\") pod \"2077bc20-6d94-46cf-bbd3-b16bb96d21a5\" (UID: \"2077bc20-6d94-46cf-bbd3-b16bb96d21a5\") " Oct 03 14:01:40 crc kubenswrapper[4962]: I1003 14:01:40.527373 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2077bc20-6d94-46cf-bbd3-b16bb96d21a5-utilities\") pod \"2077bc20-6d94-46cf-bbd3-b16bb96d21a5\" (UID: \"2077bc20-6d94-46cf-bbd3-b16bb96d21a5\") " Oct 03 14:01:40 crc kubenswrapper[4962]: I1003 14:01:40.529024 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2077bc20-6d94-46cf-bbd3-b16bb96d21a5-utilities" (OuterVolumeSpecName: "utilities") pod "2077bc20-6d94-46cf-bbd3-b16bb96d21a5" (UID: "2077bc20-6d94-46cf-bbd3-b16bb96d21a5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:01:40 crc kubenswrapper[4962]: I1003 14:01:40.533305 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2077bc20-6d94-46cf-bbd3-b16bb96d21a5-kube-api-access-tlgmw" (OuterVolumeSpecName: "kube-api-access-tlgmw") pod "2077bc20-6d94-46cf-bbd3-b16bb96d21a5" (UID: "2077bc20-6d94-46cf-bbd3-b16bb96d21a5"). InnerVolumeSpecName "kube-api-access-tlgmw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:01:40 crc kubenswrapper[4962]: I1003 14:01:40.629618 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlgmw\" (UniqueName: \"kubernetes.io/projected/2077bc20-6d94-46cf-bbd3-b16bb96d21a5-kube-api-access-tlgmw\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:40 crc kubenswrapper[4962]: I1003 14:01:40.629968 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2077bc20-6d94-46cf-bbd3-b16bb96d21a5-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:40 crc kubenswrapper[4962]: I1003 14:01:40.635849 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2077bc20-6d94-46cf-bbd3-b16bb96d21a5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2077bc20-6d94-46cf-bbd3-b16bb96d21a5" (UID: "2077bc20-6d94-46cf-bbd3-b16bb96d21a5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:01:40 crc kubenswrapper[4962]: I1003 14:01:40.731124 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2077bc20-6d94-46cf-bbd3-b16bb96d21a5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:40 crc kubenswrapper[4962]: I1003 14:01:40.940158 4962 generic.go:334] "Generic (PLEG): container finished" podID="2077bc20-6d94-46cf-bbd3-b16bb96d21a5" containerID="66677ec22e74a12f964fd082569aa5d348a4bef4b581c1d253e48370fdcf5ca6" exitCode=0 Oct 03 14:01:40 crc kubenswrapper[4962]: I1003 14:01:40.940206 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ss6tq" event={"ID":"2077bc20-6d94-46cf-bbd3-b16bb96d21a5","Type":"ContainerDied","Data":"66677ec22e74a12f964fd082569aa5d348a4bef4b581c1d253e48370fdcf5ca6"} Oct 03 14:01:40 crc kubenswrapper[4962]: I1003 14:01:40.940211 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ss6tq" Oct 03 14:01:40 crc kubenswrapper[4962]: I1003 14:01:40.940237 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ss6tq" event={"ID":"2077bc20-6d94-46cf-bbd3-b16bb96d21a5","Type":"ContainerDied","Data":"087e9cb93006e0a11c29270f20e9af7f61bcce6ec7bba4474ebb26f30ad9adca"} Oct 03 14:01:40 crc kubenswrapper[4962]: I1003 14:01:40.940257 4962 scope.go:117] "RemoveContainer" containerID="66677ec22e74a12f964fd082569aa5d348a4bef4b581c1d253e48370fdcf5ca6" Oct 03 14:01:40 crc kubenswrapper[4962]: I1003 14:01:40.976953 4962 scope.go:117] "RemoveContainer" containerID="03756888f4be9b3ac4c26dbeb26499d1d812bf32f539a1ece9fbed36db025924" Oct 03 14:01:40 crc kubenswrapper[4962]: I1003 14:01:40.978316 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ss6tq"] Oct 03 14:01:40 crc kubenswrapper[4962]: I1003 14:01:40.985797 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ss6tq"] Oct 03 14:01:41 crc kubenswrapper[4962]: I1003 14:01:41.018957 4962 scope.go:117] "RemoveContainer" containerID="d37df5ed21c599fafd18dd27ecd82ead404c3f789034d3aac68f2ea15303b17c" Oct 03 14:01:41 crc kubenswrapper[4962]: I1003 14:01:41.037750 4962 scope.go:117] "RemoveContainer" containerID="66677ec22e74a12f964fd082569aa5d348a4bef4b581c1d253e48370fdcf5ca6" Oct 03 14:01:41 crc kubenswrapper[4962]: E1003 14:01:41.042196 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66677ec22e74a12f964fd082569aa5d348a4bef4b581c1d253e48370fdcf5ca6\": container with ID starting with 66677ec22e74a12f964fd082569aa5d348a4bef4b581c1d253e48370fdcf5ca6 not found: ID does not exist" containerID="66677ec22e74a12f964fd082569aa5d348a4bef4b581c1d253e48370fdcf5ca6" Oct 03 14:01:41 crc kubenswrapper[4962]: I1003 14:01:41.042241 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66677ec22e74a12f964fd082569aa5d348a4bef4b581c1d253e48370fdcf5ca6"} err="failed to get container status \"66677ec22e74a12f964fd082569aa5d348a4bef4b581c1d253e48370fdcf5ca6\": rpc error: code = NotFound desc = could not find container \"66677ec22e74a12f964fd082569aa5d348a4bef4b581c1d253e48370fdcf5ca6\": container with ID starting with 66677ec22e74a12f964fd082569aa5d348a4bef4b581c1d253e48370fdcf5ca6 not found: ID does not exist" Oct 03 14:01:41 crc kubenswrapper[4962]: I1003 14:01:41.042286 4962 scope.go:117] "RemoveContainer" containerID="03756888f4be9b3ac4c26dbeb26499d1d812bf32f539a1ece9fbed36db025924" Oct 03 14:01:41 crc kubenswrapper[4962]: E1003 14:01:41.042727 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03756888f4be9b3ac4c26dbeb26499d1d812bf32f539a1ece9fbed36db025924\": container with ID starting with 03756888f4be9b3ac4c26dbeb26499d1d812bf32f539a1ece9fbed36db025924 not found: ID does not exist" containerID="03756888f4be9b3ac4c26dbeb26499d1d812bf32f539a1ece9fbed36db025924" Oct 03 14:01:41 crc kubenswrapper[4962]: I1003 14:01:41.042788 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03756888f4be9b3ac4c26dbeb26499d1d812bf32f539a1ece9fbed36db025924"} err="failed to get container status \"03756888f4be9b3ac4c26dbeb26499d1d812bf32f539a1ece9fbed36db025924\": rpc error: code = NotFound desc = could not find container 
\"03756888f4be9b3ac4c26dbeb26499d1d812bf32f539a1ece9fbed36db025924\": container with ID starting with 03756888f4be9b3ac4c26dbeb26499d1d812bf32f539a1ece9fbed36db025924 not found: ID does not exist" Oct 03 14:01:41 crc kubenswrapper[4962]: I1003 14:01:41.042817 4962 scope.go:117] "RemoveContainer" containerID="d37df5ed21c599fafd18dd27ecd82ead404c3f789034d3aac68f2ea15303b17c" Oct 03 14:01:41 crc kubenswrapper[4962]: E1003 14:01:41.043109 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d37df5ed21c599fafd18dd27ecd82ead404c3f789034d3aac68f2ea15303b17c\": container with ID starting with d37df5ed21c599fafd18dd27ecd82ead404c3f789034d3aac68f2ea15303b17c not found: ID does not exist" containerID="d37df5ed21c599fafd18dd27ecd82ead404c3f789034d3aac68f2ea15303b17c" Oct 03 14:01:41 crc kubenswrapper[4962]: I1003 14:01:41.043144 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d37df5ed21c599fafd18dd27ecd82ead404c3f789034d3aac68f2ea15303b17c"} err="failed to get container status \"d37df5ed21c599fafd18dd27ecd82ead404c3f789034d3aac68f2ea15303b17c\": rpc error: code = NotFound desc = could not find container \"d37df5ed21c599fafd18dd27ecd82ead404c3f789034d3aac68f2ea15303b17c\": container with ID starting with d37df5ed21c599fafd18dd27ecd82ead404c3f789034d3aac68f2ea15303b17c not found: ID does not exist" Oct 03 14:01:42 crc kubenswrapper[4962]: I1003 14:01:42.235860 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2077bc20-6d94-46cf-bbd3-b16bb96d21a5" path="/var/lib/kubelet/pods/2077bc20-6d94-46cf-bbd3-b16bb96d21a5/volumes" Oct 03 14:01:54 crc kubenswrapper[4962]: I1003 14:01:54.660447 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:01:54 crc kubenswrapper[4962]: I1003 14:01:54.661230 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:02:24 crc kubenswrapper[4962]: I1003 14:02:24.659746 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:02:24 crc kubenswrapper[4962]: I1003 14:02:24.660504 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:02:54 crc kubenswrapper[4962]: I1003 14:02:54.661988 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:02:54 crc 
kubenswrapper[4962]: I1003 14:02:54.662764 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:02:54 crc kubenswrapper[4962]: I1003 14:02:54.662846 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-46vck" Oct 03 14:02:54 crc kubenswrapper[4962]: I1003 14:02:54.664165 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"78fd53b856a2c25b3aeed835db87e455f1051e67aa0b9805eefac674488eb7e4"} pod="openshift-machine-config-operator/machine-config-daemon-46vck" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 14:02:54 crc kubenswrapper[4962]: I1003 14:02:54.664272 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" containerID="cri-o://78fd53b856a2c25b3aeed835db87e455f1051e67aa0b9805eefac674488eb7e4" gracePeriod=600 Oct 03 14:02:55 crc kubenswrapper[4962]: E1003 14:02:55.116396 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:02:55 crc kubenswrapper[4962]: I1003 14:02:55.512489 4962 generic.go:334] "Generic (PLEG): container finished" podID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerID="78fd53b856a2c25b3aeed835db87e455f1051e67aa0b9805eefac674488eb7e4" exitCode=0 Oct 03 14:02:55 crc kubenswrapper[4962]: I1003 14:02:55.512543 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerDied","Data":"78fd53b856a2c25b3aeed835db87e455f1051e67aa0b9805eefac674488eb7e4"} Oct 03 14:02:55 crc kubenswrapper[4962]: I1003 14:02:55.512578 4962 scope.go:117] "RemoveContainer" containerID="8918e54f81faa5c146ad0159b1744e6c4fb2cdd86bd95a7701b5e6aa4606ec21" Oct 03 14:02:55 crc kubenswrapper[4962]: I1003 14:02:55.513171 4962 scope.go:117] "RemoveContainer" containerID="78fd53b856a2c25b3aeed835db87e455f1051e67aa0b9805eefac674488eb7e4" Oct 03 14:02:55 crc kubenswrapper[4962]: E1003 14:02:55.513466 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:03:07 crc kubenswrapper[4962]: I1003 14:03:07.227448 4962 scope.go:117] "RemoveContainer" containerID="78fd53b856a2c25b3aeed835db87e455f1051e67aa0b9805eefac674488eb7e4" Oct 03 14:03:07 crc 
kubenswrapper[4962]: E1003 14:03:07.229076 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
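Editor's notes on this excerpt follow; everything below until the journal resumes is annotation, and all code is an illustrative Go sketch rather than anything taken from the kubelet itself. First, the record format: after journald's own "Oct 03 14:03:07 crc kubenswrapper[4962]:" prefix, every record carries the standard klog header, i.e. a severity letter (I, W, or E here) fused to the MMDD date, a wall-clock timestamp, the process ID (4962 throughout), and the emitting source file:line, followed by the structured message. A minimal sketch that splits that header out, with a regexp and field names of my own choosing:

package main

import (
    "fmt"
    "regexp"
)

// Matches klog headers such as:
//   I1003 13:55:29.719750 4962 state_mem.go:107] "Deleted CPUSet assignment" ...
// Groups: severity, MMDD date, time, pid, source file:line, message body.
var klogLine = regexp.MustCompile(
    `^([IWEF])(\d{4}) (\d{2}:\d{2}:\d{2}\.\d+)\s+(\d+)\s+([\w.]+:\d+)\] (.*)$`)

func main() {
    sample := `E1003 13:55:29.719740 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b7d2dca-825f-478f-aacc-1bf26c9d0626" containerName="extract-content"`
    m := klogLine.FindStringSubmatch(sample)
    if m == nil {
        fmt.Println("no match")
        return
    }
    fmt.Printf("severity=%s date=%s time=%s pid=%s source=%s\nmessage=%s\n",
        m[1], m[2], m[3], m[4], m[5], m[6])
}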
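The reconciler_common.go and operation_generator.go records trace each pod volume through the same lifecycle: operationExecutor.VerifyControllerAttachedVolume started, then operationExecutor.MountVolume started, then MountVolume.SetUp succeeded while the pod comes up; and operationExecutor.UnmountVolume started, UnmountVolume.TearDown succeeded, and Volume detached when it is torn down (compare the redhat-marketplace-jvvws, collect-profiles-29325000-dxmn6, and redhat-operators-ss6tq pods earlier in the excerpt). A hypothetical scanner that groups those phase strings by volume name:

package main

import (
    "bufio"
    "fmt"
    "regexp"
    "strings"
)

// Lifecycle phases in the order they appear for one volume in the journal.
var phases = []string{
    "operationExecutor.VerifyControllerAttachedVolume started",
    "operationExecutor.MountVolume started",
    "MountVolume.SetUp succeeded",
    "operationExecutor.UnmountVolume started",
    "UnmountVolume.TearDown succeeded",
    "Volume detached",
}

// Pulls the volume name out of `volume \"utilities\"`; the journal keeps the
// kubelet's escaped quotes in some records, so the backslashes are optional.
var volName = regexp.MustCompile(`volume \\?"([^"\\]+)\\?"`)

func main() {
    // Two abbreviated records stand in for a real journal stream here.
    journal := `I1003 14:03:15.285356 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" ..."
I1003 14:03:15.386866 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" ..."`
    seen := map[string][]string{} // volume name -> phases in order of appearance
    sc := bufio.NewScanner(strings.NewReader(journal))
    for sc.Scan() {
        line := sc.Text()
        for _, p := range phases {
            if strings.Contains(line, p) {
                if m := volName.FindStringSubmatch(line); m != nil {
                    seen[m[1]] = append(seen[m[1]], p)
                }
                break
            }
        }
    }
    for v, ps := range seen {
        fmt.Printf("%s: %s\n", v, strings.Join(ps, " -> "))
    }
}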
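The pod_startup_latency_tracker record for redhat-marketplace-jvvws is internally consistent: podStartE2EDuration (7.824243201s) equals watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration (3.7056941119999998s, i.e. 3.705694112s up to float64 rounding) equals that E2E figure minus the image-pull window, lastFinishedPulling minus firstStartedPulling = 4.118549089s. The redhat-operators-ss6tq record at 14:01:23 decomposes the same way (6.796101268s minus a 3.728094539s pull window gives 3.068006729s). Reproducing the arithmetic in Go:

package main

import (
    "fmt"
    "time"
)

func mustParse(s string) time.Time {
    t, err := time.Parse(time.RFC3339Nano, s)
    if err != nil {
        panic(err)
    }
    return t
}

func main() {
    // Timestamps copied from the redhat-marketplace-jvvws record.
    created := mustParse("2025-10-03T13:55:29Z")
    firstPull := mustParse("2025-10-03T13:55:31.729748616Z")
    lastPull := mustParse("2025-10-03T13:55:35.848297705Z")
    observedRunning := mustParse("2025-10-03T13:55:36.824243201Z")

    e2e := observedRunning.Sub(created) // 7.824243201s
    pull := lastPull.Sub(firstPull)     // 4.118549089s
    slo := e2e - pull                   // 3.705694112s: E2E with pull time excluded
    fmt.Println(e2e, pull, slo)
}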
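machine-config-daemon-46vck's liveness probe is an HTTP GET against http://127.0.0.1:8798/health. The failures land on a 30s cadence (13:58:24, 13:58:54, 13:59:24), and the third consecutive failure is what flips the probe to unhealthy and kills the container with gracePeriod=600. That is consistent with periodSeconds=30, failureThreshold=3, and terminationGracePeriodSeconds=600, but the pod spec itself is not part of this log, so the numbers below are inference. A reconstruction using the upstream API types (assumes the k8s.io/api and k8s.io/apimachinery modules):

package main

import (
    "fmt"

    corev1 "k8s.io/api/core/v1"
    "k8s.io/apimachinery/pkg/util/intstr"
)

func main() {
    // Period and threshold are inferred from the 30s failure cadence and the
    // restart after three consecutive failures, not read from the real spec.
    probe := &corev1.Probe{
        ProbeHandler: corev1.ProbeHandler{
            HTTPGet: &corev1.HTTPGetAction{
                Host: "127.0.0.1",
                Path: "/health",
                Port: intstr.FromInt(8798),
            },
        },
        PeriodSeconds:    30,
        FailureThreshold: 3,
    }
    fmt.Printf("%+v\n", probe)
}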
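The recurring back-off 5m0s restarting failed container errors, including the record just above, are the kubelet's capped exponential restart backoff (CrashLoopBackOff): each restart of machine-config-daemon doubles the wait until it saturates at the 5m ceiling quoted in the message. The 10s base below is the upstream kubelet default as I recall it, an assumption rather than something visible in this log:

package main

import (
    "fmt"
    "time"
)

func main() {
    const (
        initial  = 10 * time.Second // assumed upstream kubelet base delay
        maxDelay = 5 * time.Minute  // the cap shown in "back-off 5m0s"
    )
    d := initial
    for i := 1; i <= 7; i++ {
        if d > maxDelay {
            d = maxDelay
        }
        fmt.Printf("restart %d: wait %v\n", i, d)
        d *= 2 // doubles: 10s, 20s, 40s, 1m20s, 2m40s, then pinned at 5m
    }
}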
Oct 03 14:03:15 crc kubenswrapper[4962]: I1003 14:03:15.138502 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-229r7"
Oct 03 14:03:15 crc kubenswrapper[4962]: I1003 14:03:15.152358 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-229r7"]
Oct 03 14:03:15 crc kubenswrapper[4962]: I1003 14:03:15.285299 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d2sn\" (UniqueName: \"kubernetes.io/projected/d6142bd1-bb76-4541-ac25-265c11fa6ea7-kube-api-access-9d2sn\") pod \"certified-operators-229r7\" (UID: \"d6142bd1-bb76-4541-ac25-265c11fa6ea7\") " pod="openshift-marketplace/certified-operators-229r7"
Oct 03 14:03:15 crc kubenswrapper[4962]: I1003 14:03:15.285339 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6142bd1-bb76-4541-ac25-265c11fa6ea7-catalog-content\") pod \"certified-operators-229r7\" (UID: \"d6142bd1-bb76-4541-ac25-265c11fa6ea7\") " pod="openshift-marketplace/certified-operators-229r7"
Oct 03 14:03:15 crc kubenswrapper[4962]: I1003 14:03:15.285356 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6142bd1-bb76-4541-ac25-265c11fa6ea7-utilities\") pod \"certified-operators-229r7\" (UID: \"d6142bd1-bb76-4541-ac25-265c11fa6ea7\") " pod="openshift-marketplace/certified-operators-229r7"
Oct 03 14:03:15 crc kubenswrapper[4962]: I1003 14:03:15.386325 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9d2sn\" (UniqueName: \"kubernetes.io/projected/d6142bd1-bb76-4541-ac25-265c11fa6ea7-kube-api-access-9d2sn\") pod \"certified-operators-229r7\" (UID: \"d6142bd1-bb76-4541-ac25-265c11fa6ea7\") " pod="openshift-marketplace/certified-operators-229r7"
Oct 03 14:03:15 crc kubenswrapper[4962]: I1003 14:03:15.386378 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6142bd1-bb76-4541-ac25-265c11fa6ea7-catalog-content\") pod \"certified-operators-229r7\" (UID: \"d6142bd1-bb76-4541-ac25-265c11fa6ea7\") " pod="openshift-marketplace/certified-operators-229r7"
Oct 03 14:03:15 crc kubenswrapper[4962]: I1003 14:03:15.386399 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6142bd1-bb76-4541-ac25-265c11fa6ea7-utilities\") pod \"certified-operators-229r7\" (UID: \"d6142bd1-bb76-4541-ac25-265c11fa6ea7\") " pod="openshift-marketplace/certified-operators-229r7"
Oct 03 14:03:15 crc kubenswrapper[4962]: I1003 14:03:15.386838 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6142bd1-bb76-4541-ac25-265c11fa6ea7-catalog-content\") pod \"certified-operators-229r7\" (UID: \"d6142bd1-bb76-4541-ac25-265c11fa6ea7\") " pod="openshift-marketplace/certified-operators-229r7"
Oct 03 14:03:15 crc kubenswrapper[4962]: I1003 14:03:15.386866 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6142bd1-bb76-4541-ac25-265c11fa6ea7-utilities\") pod \"certified-operators-229r7\" (UID: \"d6142bd1-bb76-4541-ac25-265c11fa6ea7\") " pod="openshift-marketplace/certified-operators-229r7"
Oct 03 14:03:15 crc kubenswrapper[4962]: I1003 14:03:15.420864 4962 operation_generator.go:637]
"MountVolume.SetUp succeeded for volume \"kube-api-access-9d2sn\" (UniqueName: \"kubernetes.io/projected/d6142bd1-bb76-4541-ac25-265c11fa6ea7-kube-api-access-9d2sn\") pod \"certified-operators-229r7\" (UID: \"d6142bd1-bb76-4541-ac25-265c11fa6ea7\") " pod="openshift-marketplace/certified-operators-229r7" Oct 03 14:03:15 crc kubenswrapper[4962]: I1003 14:03:15.468296 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-229r7" Oct 03 14:03:15 crc kubenswrapper[4962]: I1003 14:03:15.952546 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-229r7"] Oct 03 14:03:15 crc kubenswrapper[4962]: W1003 14:03:15.961079 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6142bd1_bb76_4541_ac25_265c11fa6ea7.slice/crio-a5cb06766ce172e57beaee000e263579ca8924f6c5bf0b20256eeb3b60a873c1 WatchSource:0}: Error finding container a5cb06766ce172e57beaee000e263579ca8924f6c5bf0b20256eeb3b60a873c1: Status 404 returned error can't find the container with id a5cb06766ce172e57beaee000e263579ca8924f6c5bf0b20256eeb3b60a873c1 Oct 03 14:03:16 crc kubenswrapper[4962]: I1003 14:03:16.706648 4962 generic.go:334] "Generic (PLEG): container finished" podID="d6142bd1-bb76-4541-ac25-265c11fa6ea7" containerID="be390d79e7c68df87733126350d29aedb48efefa149de29af777af64e932fc05" exitCode=0 Oct 03 14:03:16 crc kubenswrapper[4962]: I1003 14:03:16.706714 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-229r7" event={"ID":"d6142bd1-bb76-4541-ac25-265c11fa6ea7","Type":"ContainerDied","Data":"be390d79e7c68df87733126350d29aedb48efefa149de29af777af64e932fc05"} Oct 03 14:03:16 crc kubenswrapper[4962]: I1003 14:03:16.706754 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-229r7" event={"ID":"d6142bd1-bb76-4541-ac25-265c11fa6ea7","Type":"ContainerStarted","Data":"a5cb06766ce172e57beaee000e263579ca8924f6c5bf0b20256eeb3b60a873c1"} Oct 03 14:03:19 crc kubenswrapper[4962]: I1003 14:03:19.736269 4962 generic.go:334] "Generic (PLEG): container finished" podID="d6142bd1-bb76-4541-ac25-265c11fa6ea7" containerID="1537d8145d45721f10a3d6e9ce3bb410f12d94969418adeb62bd394f5849fd33" exitCode=0 Oct 03 14:03:19 crc kubenswrapper[4962]: I1003 14:03:19.736362 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-229r7" event={"ID":"d6142bd1-bb76-4541-ac25-265c11fa6ea7","Type":"ContainerDied","Data":"1537d8145d45721f10a3d6e9ce3bb410f12d94969418adeb62bd394f5849fd33"} Oct 03 14:03:22 crc kubenswrapper[4962]: I1003 14:03:22.231283 4962 scope.go:117] "RemoveContainer" containerID="78fd53b856a2c25b3aeed835db87e455f1051e67aa0b9805eefac674488eb7e4" Oct 03 14:03:22 crc kubenswrapper[4962]: E1003 14:03:22.232144 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:03:23 crc kubenswrapper[4962]: I1003 14:03:23.769993 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-229r7" 
event={"ID":"d6142bd1-bb76-4541-ac25-265c11fa6ea7","Type":"ContainerStarted","Data":"243722273baeb3fe77abd87cf1c90318ca10b9bd2375e3e9514b69d6de64fe0f"} Oct 03 14:03:23 crc kubenswrapper[4962]: I1003 14:03:23.794444 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-229r7" podStartSLOduration=2.461738783 podStartE2EDuration="8.794428957s" podCreationTimestamp="2025-10-03 14:03:15 +0000 UTC" firstStartedPulling="2025-10-03 14:03:16.712955355 +0000 UTC m=+4405.116853190" lastFinishedPulling="2025-10-03 14:03:23.045645529 +0000 UTC m=+4411.449543364" observedRunningTime="2025-10-03 14:03:23.791545419 +0000 UTC m=+4412.195443254" watchObservedRunningTime="2025-10-03 14:03:23.794428957 +0000 UTC m=+4412.198326792" Oct 03 14:03:25 crc kubenswrapper[4962]: I1003 14:03:25.468878 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-229r7" Oct 03 14:03:25 crc kubenswrapper[4962]: I1003 14:03:25.469732 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-229r7" Oct 03 14:03:25 crc kubenswrapper[4962]: I1003 14:03:25.533317 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-229r7" Oct 03 14:03:33 crc kubenswrapper[4962]: I1003 14:03:33.227861 4962 scope.go:117] "RemoveContainer" containerID="78fd53b856a2c25b3aeed835db87e455f1051e67aa0b9805eefac674488eb7e4" Oct 03 14:03:33 crc kubenswrapper[4962]: E1003 14:03:33.229324 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:03:35 crc kubenswrapper[4962]: I1003 14:03:35.520690 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-229r7" Oct 03 14:03:35 crc kubenswrapper[4962]: I1003 14:03:35.561101 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-229r7"] Oct 03 14:03:35 crc kubenswrapper[4962]: I1003 14:03:35.863566 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-229r7" podUID="d6142bd1-bb76-4541-ac25-265c11fa6ea7" containerName="registry-server" containerID="cri-o://243722273baeb3fe77abd87cf1c90318ca10b9bd2375e3e9514b69d6de64fe0f" gracePeriod=2 Oct 03 14:03:36 crc kubenswrapper[4962]: I1003 14:03:36.302944 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-229r7" Oct 03 14:03:36 crc kubenswrapper[4962]: I1003 14:03:36.424735 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9d2sn\" (UniqueName: \"kubernetes.io/projected/d6142bd1-bb76-4541-ac25-265c11fa6ea7-kube-api-access-9d2sn\") pod \"d6142bd1-bb76-4541-ac25-265c11fa6ea7\" (UID: \"d6142bd1-bb76-4541-ac25-265c11fa6ea7\") " Oct 03 14:03:36 crc kubenswrapper[4962]: I1003 14:03:36.424918 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6142bd1-bb76-4541-ac25-265c11fa6ea7-catalog-content\") pod \"d6142bd1-bb76-4541-ac25-265c11fa6ea7\" (UID: \"d6142bd1-bb76-4541-ac25-265c11fa6ea7\") " Oct 03 14:03:36 crc kubenswrapper[4962]: I1003 14:03:36.424949 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6142bd1-bb76-4541-ac25-265c11fa6ea7-utilities\") pod \"d6142bd1-bb76-4541-ac25-265c11fa6ea7\" (UID: \"d6142bd1-bb76-4541-ac25-265c11fa6ea7\") " Oct 03 14:03:36 crc kubenswrapper[4962]: I1003 14:03:36.425973 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6142bd1-bb76-4541-ac25-265c11fa6ea7-utilities" (OuterVolumeSpecName: "utilities") pod "d6142bd1-bb76-4541-ac25-265c11fa6ea7" (UID: "d6142bd1-bb76-4541-ac25-265c11fa6ea7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:03:36 crc kubenswrapper[4962]: I1003 14:03:36.431628 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6142bd1-bb76-4541-ac25-265c11fa6ea7-kube-api-access-9d2sn" (OuterVolumeSpecName: "kube-api-access-9d2sn") pod "d6142bd1-bb76-4541-ac25-265c11fa6ea7" (UID: "d6142bd1-bb76-4541-ac25-265c11fa6ea7"). InnerVolumeSpecName "kube-api-access-9d2sn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:03:36 crc kubenswrapper[4962]: I1003 14:03:36.470250 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6142bd1-bb76-4541-ac25-265c11fa6ea7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d6142bd1-bb76-4541-ac25-265c11fa6ea7" (UID: "d6142bd1-bb76-4541-ac25-265c11fa6ea7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:03:36 crc kubenswrapper[4962]: I1003 14:03:36.526838 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6142bd1-bb76-4541-ac25-265c11fa6ea7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 14:03:36 crc kubenswrapper[4962]: I1003 14:03:36.526879 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6142bd1-bb76-4541-ac25-265c11fa6ea7-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 14:03:36 crc kubenswrapper[4962]: I1003 14:03:36.526895 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9d2sn\" (UniqueName: \"kubernetes.io/projected/d6142bd1-bb76-4541-ac25-265c11fa6ea7-kube-api-access-9d2sn\") on node \"crc\" DevicePath \"\"" Oct 03 14:03:36 crc kubenswrapper[4962]: I1003 14:03:36.872464 4962 generic.go:334] "Generic (PLEG): container finished" podID="d6142bd1-bb76-4541-ac25-265c11fa6ea7" containerID="243722273baeb3fe77abd87cf1c90318ca10b9bd2375e3e9514b69d6de64fe0f" exitCode=0 Oct 03 14:03:36 crc kubenswrapper[4962]: I1003 14:03:36.872510 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-229r7" Oct 03 14:03:36 crc kubenswrapper[4962]: I1003 14:03:36.872513 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-229r7" event={"ID":"d6142bd1-bb76-4541-ac25-265c11fa6ea7","Type":"ContainerDied","Data":"243722273baeb3fe77abd87cf1c90318ca10b9bd2375e3e9514b69d6de64fe0f"} Oct 03 14:03:36 crc kubenswrapper[4962]: I1003 14:03:36.872632 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-229r7" event={"ID":"d6142bd1-bb76-4541-ac25-265c11fa6ea7","Type":"ContainerDied","Data":"a5cb06766ce172e57beaee000e263579ca8924f6c5bf0b20256eeb3b60a873c1"} Oct 03 14:03:36 crc kubenswrapper[4962]: I1003 14:03:36.872706 4962 scope.go:117] "RemoveContainer" containerID="243722273baeb3fe77abd87cf1c90318ca10b9bd2375e3e9514b69d6de64fe0f" Oct 03 14:03:36 crc kubenswrapper[4962]: I1003 14:03:36.893332 4962 scope.go:117] "RemoveContainer" containerID="1537d8145d45721f10a3d6e9ce3bb410f12d94969418adeb62bd394f5849fd33" Oct 03 14:03:36 crc kubenswrapper[4962]: I1003 14:03:36.905750 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-229r7"] Oct 03 14:03:36 crc kubenswrapper[4962]: I1003 14:03:36.910328 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-229r7"] Oct 03 14:03:36 crc kubenswrapper[4962]: I1003 14:03:36.924191 4962 scope.go:117] "RemoveContainer" containerID="be390d79e7c68df87733126350d29aedb48efefa149de29af777af64e932fc05" Oct 03 14:03:36 crc kubenswrapper[4962]: I1003 14:03:36.943246 4962 scope.go:117] "RemoveContainer" containerID="243722273baeb3fe77abd87cf1c90318ca10b9bd2375e3e9514b69d6de64fe0f" Oct 03 14:03:36 crc kubenswrapper[4962]: E1003 14:03:36.943699 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"243722273baeb3fe77abd87cf1c90318ca10b9bd2375e3e9514b69d6de64fe0f\": container with ID starting with 243722273baeb3fe77abd87cf1c90318ca10b9bd2375e3e9514b69d6de64fe0f not found: ID does not exist" containerID="243722273baeb3fe77abd87cf1c90318ca10b9bd2375e3e9514b69d6de64fe0f" Oct 03 14:03:36 crc kubenswrapper[4962]: I1003 14:03:36.943731 
4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"243722273baeb3fe77abd87cf1c90318ca10b9bd2375e3e9514b69d6de64fe0f"} err="failed to get container status \"243722273baeb3fe77abd87cf1c90318ca10b9bd2375e3e9514b69d6de64fe0f\": rpc error: code = NotFound desc = could not find container \"243722273baeb3fe77abd87cf1c90318ca10b9bd2375e3e9514b69d6de64fe0f\": container with ID starting with 243722273baeb3fe77abd87cf1c90318ca10b9bd2375e3e9514b69d6de64fe0f not found: ID does not exist"
Oct 03 14:03:36 crc kubenswrapper[4962]: I1003 14:03:36.943751 4962 scope.go:117] "RemoveContainer" containerID="1537d8145d45721f10a3d6e9ce3bb410f12d94969418adeb62bd394f5849fd33"
Oct 03 14:03:36 crc kubenswrapper[4962]: E1003 14:03:36.943956 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1537d8145d45721f10a3d6e9ce3bb410f12d94969418adeb62bd394f5849fd33\": container with ID starting with 1537d8145d45721f10a3d6e9ce3bb410f12d94969418adeb62bd394f5849fd33 not found: ID does not exist" containerID="1537d8145d45721f10a3d6e9ce3bb410f12d94969418adeb62bd394f5849fd33"
Oct 03 14:03:36 crc kubenswrapper[4962]: I1003 14:03:36.943977 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1537d8145d45721f10a3d6e9ce3bb410f12d94969418adeb62bd394f5849fd33"} err="failed to get container status \"1537d8145d45721f10a3d6e9ce3bb410f12d94969418adeb62bd394f5849fd33\": rpc error: code = NotFound desc = could not find container \"1537d8145d45721f10a3d6e9ce3bb410f12d94969418adeb62bd394f5849fd33\": container with ID starting with 1537d8145d45721f10a3d6e9ce3bb410f12d94969418adeb62bd394f5849fd33 not found: ID does not exist"
Oct 03 14:03:36 crc kubenswrapper[4962]: I1003 14:03:36.943989 4962 scope.go:117] "RemoveContainer" containerID="be390d79e7c68df87733126350d29aedb48efefa149de29af777af64e932fc05"
Oct 03 14:03:36 crc kubenswrapper[4962]: E1003 14:03:36.944160 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be390d79e7c68df87733126350d29aedb48efefa149de29af777af64e932fc05\": container with ID starting with be390d79e7c68df87733126350d29aedb48efefa149de29af777af64e932fc05 not found: ID does not exist" containerID="be390d79e7c68df87733126350d29aedb48efefa149de29af777af64e932fc05"
Oct 03 14:03:36 crc kubenswrapper[4962]: I1003 14:03:36.944174 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be390d79e7c68df87733126350d29aedb48efefa149de29af777af64e932fc05"} err="failed to get container status \"be390d79e7c68df87733126350d29aedb48efefa149de29af777af64e932fc05\": rpc error: code = NotFound desc = could not find container \"be390d79e7c68df87733126350d29aedb48efefa149de29af777af64e932fc05\": container with ID starting with be390d79e7c68df87733126350d29aedb48efefa149de29af777af64e932fc05 not found: ID does not exist"
Oct 03 14:03:38 crc kubenswrapper[4962]: I1003 14:03:38.255040 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6142bd1-bb76-4541-ac25-265c11fa6ea7" path="/var/lib/kubelet/pods/d6142bd1-bb76-4541-ac25-265c11fa6ea7/volumes"
Oct 03 14:03:44 crc kubenswrapper[4962]: I1003 14:03:44.227614 4962 scope.go:117] "RemoveContainer" containerID="78fd53b856a2c25b3aeed835db87e455f1051e67aa0b9805eefac674488eb7e4"
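The NotFound failures above are the benign side of container cleanup: by the time the second RemoveContainer pass runs, CRI-O has already deleted the containers, so the status lookup fails and the kubelet just logs the error and moves on. A sketch of that idempotent-delete pattern; Runtime, RemoveContainer, and ErrNotFound are illustrative names for this sketch, not the real CRI client API:

package main

import (
	"errors"
	"fmt"
)

// ErrNotFound stands in for a runtime's "container does not exist" error.
var ErrNotFound = errors.New("container not found")

// Runtime is an assumed, minimal stand-in for a container runtime client.
type Runtime interface {
	RemoveContainer(id string) error
}

// removeIdempotent treats "already gone" as success, so a second
// cleanup pass over the same container IDs cannot fail.
func removeIdempotent(rt Runtime, id string) error {
	if err := rt.RemoveContainer(id); err != nil && !errors.Is(err, ErrNotFound) {
		return fmt.Errorf("remove %s: %w", id, err)
	}
	return nil
}

// gone simulates a runtime whose containers have already been deleted.
type gone struct{}

func (gone) RemoveContainer(string) error { return ErrNotFound }

func main() {
	// Second deletion pass: the container is gone, and that is fine.
	fmt.Println(removeIdempotent(gone{}, "243722273bae")) // <nil>
}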
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:03:55 crc kubenswrapper[4962]: I1003 14:03:55.226992 4962 scope.go:117] "RemoveContainer" containerID="78fd53b856a2c25b3aeed835db87e455f1051e67aa0b9805eefac674488eb7e4" Oct 03 14:03:55 crc kubenswrapper[4962]: E1003 14:03:55.228301 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:04:10 crc kubenswrapper[4962]: I1003 14:04:10.226887 4962 scope.go:117] "RemoveContainer" containerID="78fd53b856a2c25b3aeed835db87e455f1051e67aa0b9805eefac674488eb7e4" Oct 03 14:04:10 crc kubenswrapper[4962]: E1003 14:04:10.227962 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:04:24 crc kubenswrapper[4962]: I1003 14:04:24.229412 4962 scope.go:117] "RemoveContainer" containerID="78fd53b856a2c25b3aeed835db87e455f1051e67aa0b9805eefac674488eb7e4" Oct 03 14:04:24 crc kubenswrapper[4962]: E1003 14:04:24.231068 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:04:35 crc kubenswrapper[4962]: I1003 14:04:35.226996 4962 scope.go:117] "RemoveContainer" containerID="78fd53b856a2c25b3aeed835db87e455f1051e67aa0b9805eefac674488eb7e4" Oct 03 14:04:35 crc kubenswrapper[4962]: E1003 14:04:35.227778 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:04:37 crc kubenswrapper[4962]: I1003 14:04:37.520033 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2tbbw"] Oct 03 14:04:37 crc kubenswrapper[4962]: E1003 14:04:37.520655 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6142bd1-bb76-4541-ac25-265c11fa6ea7" containerName="extract-utilities" Oct 03 14:04:37 crc kubenswrapper[4962]: I1003 14:04:37.520668 4962 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d6142bd1-bb76-4541-ac25-265c11fa6ea7" containerName="extract-utilities" Oct 03 14:04:37 crc kubenswrapper[4962]: E1003 14:04:37.520682 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6142bd1-bb76-4541-ac25-265c11fa6ea7" containerName="extract-content" Oct 03 14:04:37 crc kubenswrapper[4962]: I1003 14:04:37.520688 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6142bd1-bb76-4541-ac25-265c11fa6ea7" containerName="extract-content" Oct 03 14:04:37 crc kubenswrapper[4962]: E1003 14:04:37.520698 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6142bd1-bb76-4541-ac25-265c11fa6ea7" containerName="registry-server" Oct 03 14:04:37 crc kubenswrapper[4962]: I1003 14:04:37.520704 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6142bd1-bb76-4541-ac25-265c11fa6ea7" containerName="registry-server" Oct 03 14:04:37 crc kubenswrapper[4962]: I1003 14:04:37.520840 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6142bd1-bb76-4541-ac25-265c11fa6ea7" containerName="registry-server" Oct 03 14:04:37 crc kubenswrapper[4962]: I1003 14:04:37.521811 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2tbbw" Oct 03 14:04:37 crc kubenswrapper[4962]: I1003 14:04:37.536855 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2tbbw"] Oct 03 14:04:37 crc kubenswrapper[4962]: I1003 14:04:37.601061 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11330131-729d-4863-9a58-a0f28cb509e8-utilities\") pod \"community-operators-2tbbw\" (UID: \"11330131-729d-4863-9a58-a0f28cb509e8\") " pod="openshift-marketplace/community-operators-2tbbw" Oct 03 14:04:37 crc kubenswrapper[4962]: I1003 14:04:37.601223 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7bnc\" (UniqueName: \"kubernetes.io/projected/11330131-729d-4863-9a58-a0f28cb509e8-kube-api-access-h7bnc\") pod \"community-operators-2tbbw\" (UID: \"11330131-729d-4863-9a58-a0f28cb509e8\") " pod="openshift-marketplace/community-operators-2tbbw" Oct 03 14:04:37 crc kubenswrapper[4962]: I1003 14:04:37.601316 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11330131-729d-4863-9a58-a0f28cb509e8-catalog-content\") pod \"community-operators-2tbbw\" (UID: \"11330131-729d-4863-9a58-a0f28cb509e8\") " pod="openshift-marketplace/community-operators-2tbbw" Oct 03 14:04:37 crc kubenswrapper[4962]: I1003 14:04:37.702898 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11330131-729d-4863-9a58-a0f28cb509e8-utilities\") pod \"community-operators-2tbbw\" (UID: \"11330131-729d-4863-9a58-a0f28cb509e8\") " pod="openshift-marketplace/community-operators-2tbbw" Oct 03 14:04:37 crc kubenswrapper[4962]: I1003 14:04:37.702997 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7bnc\" (UniqueName: \"kubernetes.io/projected/11330131-729d-4863-9a58-a0f28cb509e8-kube-api-access-h7bnc\") pod \"community-operators-2tbbw\" (UID: \"11330131-729d-4863-9a58-a0f28cb509e8\") " pod="openshift-marketplace/community-operators-2tbbw" Oct 03 14:04:37 crc kubenswrapper[4962]: I1003 14:04:37.703058 4962 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11330131-729d-4863-9a58-a0f28cb509e8-catalog-content\") pod \"community-operators-2tbbw\" (UID: \"11330131-729d-4863-9a58-a0f28cb509e8\") " pod="openshift-marketplace/community-operators-2tbbw" Oct 03 14:04:37 crc kubenswrapper[4962]: I1003 14:04:37.703663 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11330131-729d-4863-9a58-a0f28cb509e8-catalog-content\") pod \"community-operators-2tbbw\" (UID: \"11330131-729d-4863-9a58-a0f28cb509e8\") " pod="openshift-marketplace/community-operators-2tbbw" Oct 03 14:04:37 crc kubenswrapper[4962]: I1003 14:04:37.703978 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11330131-729d-4863-9a58-a0f28cb509e8-utilities\") pod \"community-operators-2tbbw\" (UID: \"11330131-729d-4863-9a58-a0f28cb509e8\") " pod="openshift-marketplace/community-operators-2tbbw" Oct 03 14:04:37 crc kubenswrapper[4962]: I1003 14:04:37.730939 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7bnc\" (UniqueName: \"kubernetes.io/projected/11330131-729d-4863-9a58-a0f28cb509e8-kube-api-access-h7bnc\") pod \"community-operators-2tbbw\" (UID: \"11330131-729d-4863-9a58-a0f28cb509e8\") " pod="openshift-marketplace/community-operators-2tbbw" Oct 03 14:04:37 crc kubenswrapper[4962]: I1003 14:04:37.844381 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2tbbw" Oct 03 14:04:38 crc kubenswrapper[4962]: I1003 14:04:38.288878 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2tbbw"] Oct 03 14:04:38 crc kubenswrapper[4962]: I1003 14:04:38.344505 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2tbbw" event={"ID":"11330131-729d-4863-9a58-a0f28cb509e8","Type":"ContainerStarted","Data":"589c020915ef3f85012bd7730e21eec54ac9e807cc42f5f0793e48f76e78863e"} Oct 03 14:04:39 crc kubenswrapper[4962]: I1003 14:04:39.352832 4962 generic.go:334] "Generic (PLEG): container finished" podID="11330131-729d-4863-9a58-a0f28cb509e8" containerID="466d8db9494d0a4c6418b4044b8318af2aa0afbd7b5f19c05f1e6f1776f2f5a7" exitCode=0 Oct 03 14:04:39 crc kubenswrapper[4962]: I1003 14:04:39.353225 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2tbbw" event={"ID":"11330131-729d-4863-9a58-a0f28cb509e8","Type":"ContainerDied","Data":"466d8db9494d0a4c6418b4044b8318af2aa0afbd7b5f19c05f1e6f1776f2f5a7"} Oct 03 14:04:43 crc kubenswrapper[4962]: I1003 14:04:43.390130 4962 generic.go:334] "Generic (PLEG): container finished" podID="11330131-729d-4863-9a58-a0f28cb509e8" containerID="b416b704f10447b4bca52d0236913e655a8f9e26244fb7aaf9b50aa6f193b201" exitCode=0 Oct 03 14:04:43 crc kubenswrapper[4962]: I1003 14:04:43.391074 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2tbbw" event={"ID":"11330131-729d-4863-9a58-a0f28cb509e8","Type":"ContainerDied","Data":"b416b704f10447b4bca52d0236913e655a8f9e26244fb7aaf9b50aa6f193b201"} Oct 03 14:04:45 crc kubenswrapper[4962]: I1003 14:04:45.417199 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2tbbw" 
event={"ID":"11330131-729d-4863-9a58-a0f28cb509e8","Type":"ContainerStarted","Data":"db5207af24b6034978efee5ba8af7593c075d75fd63d655265e08506d1e8ddfd"} Oct 03 14:04:45 crc kubenswrapper[4962]: I1003 14:04:45.447998 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2tbbw" podStartSLOduration=3.562551648 podStartE2EDuration="8.447981254s" podCreationTimestamp="2025-10-03 14:04:37 +0000 UTC" firstStartedPulling="2025-10-03 14:04:39.354943278 +0000 UTC m=+4487.758841113" lastFinishedPulling="2025-10-03 14:04:44.240372884 +0000 UTC m=+4492.644270719" observedRunningTime="2025-10-03 14:04:45.442370394 +0000 UTC m=+4493.846268259" watchObservedRunningTime="2025-10-03 14:04:45.447981254 +0000 UTC m=+4493.851879079" Oct 03 14:04:47 crc kubenswrapper[4962]: I1003 14:04:47.227336 4962 scope.go:117] "RemoveContainer" containerID="78fd53b856a2c25b3aeed835db87e455f1051e67aa0b9805eefac674488eb7e4" Oct 03 14:04:47 crc kubenswrapper[4962]: E1003 14:04:47.227594 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:04:47 crc kubenswrapper[4962]: I1003 14:04:47.845373 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2tbbw" Oct 03 14:04:47 crc kubenswrapper[4962]: I1003 14:04:47.845432 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2tbbw" Oct 03 14:04:47 crc kubenswrapper[4962]: I1003 14:04:47.909060 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2tbbw" Oct 03 14:04:57 crc kubenswrapper[4962]: I1003 14:04:57.891499 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2tbbw" Oct 03 14:04:57 crc kubenswrapper[4962]: I1003 14:04:57.943277 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2tbbw"] Oct 03 14:04:58 crc kubenswrapper[4962]: I1003 14:04:58.518215 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2tbbw" podUID="11330131-729d-4863-9a58-a0f28cb509e8" containerName="registry-server" containerID="cri-o://db5207af24b6034978efee5ba8af7593c075d75fd63d655265e08506d1e8ddfd" gracePeriod=2 Oct 03 14:04:59 crc kubenswrapper[4962]: I1003 14:04:59.529298 4962 generic.go:334] "Generic (PLEG): container finished" podID="11330131-729d-4863-9a58-a0f28cb509e8" containerID="db5207af24b6034978efee5ba8af7593c075d75fd63d655265e08506d1e8ddfd" exitCode=0 Oct 03 14:04:59 crc kubenswrapper[4962]: I1003 14:04:59.529376 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2tbbw" event={"ID":"11330131-729d-4863-9a58-a0f28cb509e8","Type":"ContainerDied","Data":"db5207af24b6034978efee5ba8af7593c075d75fd63d655265e08506d1e8ddfd"} Oct 03 14:04:59 crc kubenswrapper[4962]: I1003 14:04:59.631718 4962 util.go:48] "No ready sandbox for pod can be found. 
Oct 03 14:04:59 crc kubenswrapper[4962]: I1003 14:04:59.631718 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2tbbw"
Oct 03 14:04:59 crc kubenswrapper[4962]: I1003 14:04:59.647098 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11330131-729d-4863-9a58-a0f28cb509e8-catalog-content\") pod \"11330131-729d-4863-9a58-a0f28cb509e8\" (UID: \"11330131-729d-4863-9a58-a0f28cb509e8\") "
Oct 03 14:04:59 crc kubenswrapper[4962]: I1003 14:04:59.647154 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11330131-729d-4863-9a58-a0f28cb509e8-utilities\") pod \"11330131-729d-4863-9a58-a0f28cb509e8\" (UID: \"11330131-729d-4863-9a58-a0f28cb509e8\") "
Oct 03 14:04:59 crc kubenswrapper[4962]: I1003 14:04:59.647201 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7bnc\" (UniqueName: \"kubernetes.io/projected/11330131-729d-4863-9a58-a0f28cb509e8-kube-api-access-h7bnc\") pod \"11330131-729d-4863-9a58-a0f28cb509e8\" (UID: \"11330131-729d-4863-9a58-a0f28cb509e8\") "
Oct 03 14:04:59 crc kubenswrapper[4962]: I1003 14:04:59.648837 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11330131-729d-4863-9a58-a0f28cb509e8-utilities" (OuterVolumeSpecName: "utilities") pod "11330131-729d-4863-9a58-a0f28cb509e8" (UID: "11330131-729d-4863-9a58-a0f28cb509e8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:04:59 crc kubenswrapper[4962]: I1003 14:04:59.663124 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11330131-729d-4863-9a58-a0f28cb509e8-kube-api-access-h7bnc" (OuterVolumeSpecName: "kube-api-access-h7bnc") pod "11330131-729d-4863-9a58-a0f28cb509e8" (UID: "11330131-729d-4863-9a58-a0f28cb509e8"). InnerVolumeSpecName "kube-api-access-h7bnc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:04:59 crc kubenswrapper[4962]: I1003 14:04:59.706555 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11330131-729d-4863-9a58-a0f28cb509e8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "11330131-729d-4863-9a58-a0f28cb509e8" (UID: "11330131-729d-4863-9a58-a0f28cb509e8"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:04:59 crc kubenswrapper[4962]: I1003 14:04:59.748404 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11330131-729d-4863-9a58-a0f28cb509e8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 14:04:59 crc kubenswrapper[4962]: I1003 14:04:59.748440 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11330131-729d-4863-9a58-a0f28cb509e8-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 14:04:59 crc kubenswrapper[4962]: I1003 14:04:59.748452 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7bnc\" (UniqueName: \"kubernetes.io/projected/11330131-729d-4863-9a58-a0f28cb509e8-kube-api-access-h7bnc\") on node \"crc\" DevicePath \"\"" Oct 03 14:05:00 crc kubenswrapper[4962]: I1003 14:05:00.541679 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2tbbw" event={"ID":"11330131-729d-4863-9a58-a0f28cb509e8","Type":"ContainerDied","Data":"589c020915ef3f85012bd7730e21eec54ac9e807cc42f5f0793e48f76e78863e"} Oct 03 14:05:00 crc kubenswrapper[4962]: I1003 14:05:00.541784 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2tbbw" Oct 03 14:05:00 crc kubenswrapper[4962]: I1003 14:05:00.542078 4962 scope.go:117] "RemoveContainer" containerID="db5207af24b6034978efee5ba8af7593c075d75fd63d655265e08506d1e8ddfd" Oct 03 14:05:00 crc kubenswrapper[4962]: I1003 14:05:00.568677 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2tbbw"] Oct 03 14:05:00 crc kubenswrapper[4962]: I1003 14:05:00.573471 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2tbbw"] Oct 03 14:05:00 crc kubenswrapper[4962]: I1003 14:05:00.577434 4962 scope.go:117] "RemoveContainer" containerID="b416b704f10447b4bca52d0236913e655a8f9e26244fb7aaf9b50aa6f193b201" Oct 03 14:05:00 crc kubenswrapper[4962]: I1003 14:05:00.610998 4962 scope.go:117] "RemoveContainer" containerID="466d8db9494d0a4c6418b4044b8318af2aa0afbd7b5f19c05f1e6f1776f2f5a7" Oct 03 14:05:01 crc kubenswrapper[4962]: I1003 14:05:01.227858 4962 scope.go:117] "RemoveContainer" containerID="78fd53b856a2c25b3aeed835db87e455f1051e67aa0b9805eefac674488eb7e4" Oct 03 14:05:01 crc kubenswrapper[4962]: E1003 14:05:01.228212 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:05:02 crc kubenswrapper[4962]: I1003 14:05:02.241004 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11330131-729d-4863-9a58-a0f28cb509e8" path="/var/lib/kubelet/pods/11330131-729d-4863-9a58-a0f28cb509e8/volumes" Oct 03 14:05:14 crc kubenswrapper[4962]: I1003 14:05:14.227016 4962 scope.go:117] "RemoveContainer" containerID="78fd53b856a2c25b3aeed835db87e455f1051e67aa0b9805eefac674488eb7e4" Oct 03 14:05:14 crc kubenswrapper[4962]: E1003 14:05:14.228758 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:05:25 crc kubenswrapper[4962]: I1003 14:05:25.227172 4962 scope.go:117] "RemoveContainer" containerID="78fd53b856a2c25b3aeed835db87e455f1051e67aa0b9805eefac674488eb7e4" Oct 03 14:05:25 crc kubenswrapper[4962]: E1003 14:05:25.227789 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:05:39 crc kubenswrapper[4962]: I1003 14:05:39.227495 4962 scope.go:117] "RemoveContainer" containerID="78fd53b856a2c25b3aeed835db87e455f1051e67aa0b9805eefac674488eb7e4" Oct 03 14:05:39 crc kubenswrapper[4962]: E1003 14:05:39.228577 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:05:51 crc kubenswrapper[4962]: I1003 14:05:51.227299 4962 scope.go:117] "RemoveContainer" containerID="78fd53b856a2c25b3aeed835db87e455f1051e67aa0b9805eefac674488eb7e4" Oct 03 14:05:51 crc kubenswrapper[4962]: E1003 14:05:51.228713 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:06:04 crc kubenswrapper[4962]: I1003 14:06:04.227239 4962 scope.go:117] "RemoveContainer" containerID="78fd53b856a2c25b3aeed835db87e455f1051e67aa0b9805eefac674488eb7e4" Oct 03 14:06:04 crc kubenswrapper[4962]: E1003 14:06:04.227953 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:06:17 crc kubenswrapper[4962]: I1003 14:06:17.228180 4962 scope.go:117] "RemoveContainer" containerID="78fd53b856a2c25b3aeed835db87e455f1051e67aa0b9805eefac674488eb7e4" Oct 03 14:06:17 crc kubenswrapper[4962]: E1003 14:06:17.229892 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:06:32 crc kubenswrapper[4962]: I1003 14:06:32.230265 4962 scope.go:117] "RemoveContainer" containerID="78fd53b856a2c25b3aeed835db87e455f1051e67aa0b9805eefac674488eb7e4" Oct 03 14:06:32 crc kubenswrapper[4962]: E1003 14:06:32.231192 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:06:44 crc kubenswrapper[4962]: I1003 14:06:44.227761 4962 scope.go:117] "RemoveContainer" containerID="78fd53b856a2c25b3aeed835db87e455f1051e67aa0b9805eefac674488eb7e4" Oct 03 14:06:44 crc kubenswrapper[4962]: E1003 14:06:44.230913 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:06:58 crc kubenswrapper[4962]: I1003 14:06:58.227450 4962 scope.go:117] "RemoveContainer" containerID="78fd53b856a2c25b3aeed835db87e455f1051e67aa0b9805eefac674488eb7e4" Oct 03 14:06:58 crc kubenswrapper[4962]: E1003 14:06:58.228388 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:07:13 crc kubenswrapper[4962]: I1003 14:07:13.227981 4962 scope.go:117] "RemoveContainer" containerID="78fd53b856a2c25b3aeed835db87e455f1051e67aa0b9805eefac674488eb7e4" Oct 03 14:07:13 crc kubenswrapper[4962]: E1003 14:07:13.228946 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:07:24 crc kubenswrapper[4962]: I1003 14:07:24.228630 4962 scope.go:117] "RemoveContainer" containerID="78fd53b856a2c25b3aeed835db87e455f1051e67aa0b9805eefac674488eb7e4" Oct 03 14:07:24 crc kubenswrapper[4962]: E1003 14:07:24.229914 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" 
podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:07:37 crc kubenswrapper[4962]: I1003 14:07:37.229165 4962 scope.go:117] "RemoveContainer" containerID="78fd53b856a2c25b3aeed835db87e455f1051e67aa0b9805eefac674488eb7e4" Oct 03 14:07:37 crc kubenswrapper[4962]: E1003 14:07:37.230234 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:07:50 crc kubenswrapper[4962]: I1003 14:07:50.228988 4962 scope.go:117] "RemoveContainer" containerID="78fd53b856a2c25b3aeed835db87e455f1051e67aa0b9805eefac674488eb7e4" Oct 03 14:07:50 crc kubenswrapper[4962]: E1003 14:07:50.230154 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:08:01 crc kubenswrapper[4962]: I1003 14:08:01.228055 4962 scope.go:117] "RemoveContainer" containerID="78fd53b856a2c25b3aeed835db87e455f1051e67aa0b9805eefac674488eb7e4" Oct 03 14:08:02 crc kubenswrapper[4962]: I1003 14:08:02.069468 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerStarted","Data":"cbb6e91cb75c7a810a75bc675a8082724b8bbf2a8a48579c538b63a351ed5feb"} Oct 03 14:08:36 crc kubenswrapper[4962]: I1003 14:08:36.043778 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-5j546"] Oct 03 14:08:36 crc kubenswrapper[4962]: I1003 14:08:36.049353 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-5j546"] Oct 03 14:08:36 crc kubenswrapper[4962]: I1003 14:08:36.218956 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-qhpqx"] Oct 03 14:08:36 crc kubenswrapper[4962]: E1003 14:08:36.219286 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11330131-729d-4863-9a58-a0f28cb509e8" containerName="extract-utilities" Oct 03 14:08:36 crc kubenswrapper[4962]: I1003 14:08:36.219313 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="11330131-729d-4863-9a58-a0f28cb509e8" containerName="extract-utilities" Oct 03 14:08:36 crc kubenswrapper[4962]: E1003 14:08:36.219329 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11330131-729d-4863-9a58-a0f28cb509e8" containerName="registry-server" Oct 03 14:08:36 crc kubenswrapper[4962]: I1003 14:08:36.219338 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="11330131-729d-4863-9a58-a0f28cb509e8" containerName="registry-server" Oct 03 14:08:36 crc kubenswrapper[4962]: E1003 14:08:36.219367 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11330131-729d-4863-9a58-a0f28cb509e8" containerName="extract-content" Oct 03 14:08:36 crc kubenswrapper[4962]: I1003 14:08:36.219377 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="11330131-729d-4863-9a58-a0f28cb509e8" 
containerName="extract-content" Oct 03 14:08:36 crc kubenswrapper[4962]: I1003 14:08:36.219622 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="11330131-729d-4863-9a58-a0f28cb509e8" containerName="registry-server" Oct 03 14:08:36 crc kubenswrapper[4962]: I1003 14:08:36.220231 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-qhpqx" Oct 03 14:08:36 crc kubenswrapper[4962]: I1003 14:08:36.226007 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Oct 03 14:08:36 crc kubenswrapper[4962]: I1003 14:08:36.226273 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Oct 03 14:08:36 crc kubenswrapper[4962]: I1003 14:08:36.226523 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Oct 03 14:08:36 crc kubenswrapper[4962]: I1003 14:08:36.226838 4962 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-v8h8n" Oct 03 14:08:36 crc kubenswrapper[4962]: I1003 14:08:36.241727 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddc5c3ec-d428-4310-8e25-cc8e5416fb08" path="/var/lib/kubelet/pods/ddc5c3ec-d428-4310-8e25-cc8e5416fb08/volumes" Oct 03 14:08:36 crc kubenswrapper[4962]: I1003 14:08:36.243043 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-qhpqx"] Oct 03 14:08:36 crc kubenswrapper[4962]: I1003 14:08:36.254876 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fd18cf05-a17f-4beb-9ffa-deab9b6b3e66-node-mnt\") pod \"crc-storage-crc-qhpqx\" (UID: \"fd18cf05-a17f-4beb-9ffa-deab9b6b3e66\") " pod="crc-storage/crc-storage-crc-qhpqx" Oct 03 14:08:36 crc kubenswrapper[4962]: I1003 14:08:36.254945 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fd18cf05-a17f-4beb-9ffa-deab9b6b3e66-crc-storage\") pod \"crc-storage-crc-qhpqx\" (UID: \"fd18cf05-a17f-4beb-9ffa-deab9b6b3e66\") " pod="crc-storage/crc-storage-crc-qhpqx" Oct 03 14:08:36 crc kubenswrapper[4962]: I1003 14:08:36.254975 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzzj8\" (UniqueName: \"kubernetes.io/projected/fd18cf05-a17f-4beb-9ffa-deab9b6b3e66-kube-api-access-bzzj8\") pod \"crc-storage-crc-qhpqx\" (UID: \"fd18cf05-a17f-4beb-9ffa-deab9b6b3e66\") " pod="crc-storage/crc-storage-crc-qhpqx" Oct 03 14:08:36 crc kubenswrapper[4962]: I1003 14:08:36.356058 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fd18cf05-a17f-4beb-9ffa-deab9b6b3e66-node-mnt\") pod \"crc-storage-crc-qhpqx\" (UID: \"fd18cf05-a17f-4beb-9ffa-deab9b6b3e66\") " pod="crc-storage/crc-storage-crc-qhpqx" Oct 03 14:08:36 crc kubenswrapper[4962]: I1003 14:08:36.356125 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fd18cf05-a17f-4beb-9ffa-deab9b6b3e66-crc-storage\") pod \"crc-storage-crc-qhpqx\" (UID: \"fd18cf05-a17f-4beb-9ffa-deab9b6b3e66\") " pod="crc-storage/crc-storage-crc-qhpqx" Oct 03 14:08:36 crc kubenswrapper[4962]: I1003 14:08:36.356153 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-bzzj8\" (UniqueName: \"kubernetes.io/projected/fd18cf05-a17f-4beb-9ffa-deab9b6b3e66-kube-api-access-bzzj8\") pod \"crc-storage-crc-qhpqx\" (UID: \"fd18cf05-a17f-4beb-9ffa-deab9b6b3e66\") " pod="crc-storage/crc-storage-crc-qhpqx" Oct 03 14:08:36 crc kubenswrapper[4962]: I1003 14:08:36.356808 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fd18cf05-a17f-4beb-9ffa-deab9b6b3e66-node-mnt\") pod \"crc-storage-crc-qhpqx\" (UID: \"fd18cf05-a17f-4beb-9ffa-deab9b6b3e66\") " pod="crc-storage/crc-storage-crc-qhpqx" Oct 03 14:08:36 crc kubenswrapper[4962]: I1003 14:08:36.357869 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fd18cf05-a17f-4beb-9ffa-deab9b6b3e66-crc-storage\") pod \"crc-storage-crc-qhpqx\" (UID: \"fd18cf05-a17f-4beb-9ffa-deab9b6b3e66\") " pod="crc-storage/crc-storage-crc-qhpqx" Oct 03 14:08:36 crc kubenswrapper[4962]: I1003 14:08:36.392310 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzzj8\" (UniqueName: \"kubernetes.io/projected/fd18cf05-a17f-4beb-9ffa-deab9b6b3e66-kube-api-access-bzzj8\") pod \"crc-storage-crc-qhpqx\" (UID: \"fd18cf05-a17f-4beb-9ffa-deab9b6b3e66\") " pod="crc-storage/crc-storage-crc-qhpqx" Oct 03 14:08:36 crc kubenswrapper[4962]: I1003 14:08:36.540569 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-qhpqx" Oct 03 14:08:36 crc kubenswrapper[4962]: I1003 14:08:36.947743 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-qhpqx"] Oct 03 14:08:36 crc kubenswrapper[4962]: I1003 14:08:36.956678 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 14:08:37 crc kubenswrapper[4962]: I1003 14:08:37.379570 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-qhpqx" event={"ID":"fd18cf05-a17f-4beb-9ffa-deab9b6b3e66","Type":"ContainerStarted","Data":"18d0e67ae12c5bef77918af229e09a13e28346e9cd2ddbc6358db7c86c3b5d92"} Oct 03 14:08:38 crc kubenswrapper[4962]: I1003 14:08:38.406906 4962 generic.go:334] "Generic (PLEG): container finished" podID="fd18cf05-a17f-4beb-9ffa-deab9b6b3e66" containerID="1cb4130e4e19a9663f3fd738929f6a3f9dc1c390dbb07524ecc3218c81df8ed3" exitCode=0 Oct 03 14:08:38 crc kubenswrapper[4962]: I1003 14:08:38.407006 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-qhpqx" event={"ID":"fd18cf05-a17f-4beb-9ffa-deab9b6b3e66","Type":"ContainerDied","Data":"1cb4130e4e19a9663f3fd738929f6a3f9dc1c390dbb07524ecc3218c81df8ed3"} Oct 03 14:08:39 crc kubenswrapper[4962]: I1003 14:08:39.711668 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-qhpqx" Oct 03 14:08:39 crc kubenswrapper[4962]: I1003 14:08:39.811678 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fd18cf05-a17f-4beb-9ffa-deab9b6b3e66-node-mnt\") pod \"fd18cf05-a17f-4beb-9ffa-deab9b6b3e66\" (UID: \"fd18cf05-a17f-4beb-9ffa-deab9b6b3e66\") " Oct 03 14:08:39 crc kubenswrapper[4962]: I1003 14:08:39.811721 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fd18cf05-a17f-4beb-9ffa-deab9b6b3e66-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "fd18cf05-a17f-4beb-9ffa-deab9b6b3e66" (UID: "fd18cf05-a17f-4beb-9ffa-deab9b6b3e66"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 14:08:39 crc kubenswrapper[4962]: I1003 14:08:39.812148 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzzj8\" (UniqueName: \"kubernetes.io/projected/fd18cf05-a17f-4beb-9ffa-deab9b6b3e66-kube-api-access-bzzj8\") pod \"fd18cf05-a17f-4beb-9ffa-deab9b6b3e66\" (UID: \"fd18cf05-a17f-4beb-9ffa-deab9b6b3e66\") " Oct 03 14:08:39 crc kubenswrapper[4962]: I1003 14:08:39.812273 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fd18cf05-a17f-4beb-9ffa-deab9b6b3e66-crc-storage\") pod \"fd18cf05-a17f-4beb-9ffa-deab9b6b3e66\" (UID: \"fd18cf05-a17f-4beb-9ffa-deab9b6b3e66\") " Oct 03 14:08:39 crc kubenswrapper[4962]: I1003 14:08:39.812560 4962 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fd18cf05-a17f-4beb-9ffa-deab9b6b3e66-node-mnt\") on node \"crc\" DevicePath \"\"" Oct 03 14:08:39 crc kubenswrapper[4962]: I1003 14:08:39.818053 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd18cf05-a17f-4beb-9ffa-deab9b6b3e66-kube-api-access-bzzj8" (OuterVolumeSpecName: "kube-api-access-bzzj8") pod "fd18cf05-a17f-4beb-9ffa-deab9b6b3e66" (UID: "fd18cf05-a17f-4beb-9ffa-deab9b6b3e66"). InnerVolumeSpecName "kube-api-access-bzzj8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:08:39 crc kubenswrapper[4962]: I1003 14:08:39.830232 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd18cf05-a17f-4beb-9ffa-deab9b6b3e66-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "fd18cf05-a17f-4beb-9ffa-deab9b6b3e66" (UID: "fd18cf05-a17f-4beb-9ffa-deab9b6b3e66"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:08:39 crc kubenswrapper[4962]: I1003 14:08:39.914825 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzzj8\" (UniqueName: \"kubernetes.io/projected/fd18cf05-a17f-4beb-9ffa-deab9b6b3e66-kube-api-access-bzzj8\") on node \"crc\" DevicePath \"\"" Oct 03 14:08:39 crc kubenswrapper[4962]: I1003 14:08:39.914909 4962 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fd18cf05-a17f-4beb-9ffa-deab9b6b3e66-crc-storage\") on node \"crc\" DevicePath \"\"" Oct 03 14:08:40 crc kubenswrapper[4962]: I1003 14:08:40.424297 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-qhpqx" event={"ID":"fd18cf05-a17f-4beb-9ffa-deab9b6b3e66","Type":"ContainerDied","Data":"18d0e67ae12c5bef77918af229e09a13e28346e9cd2ddbc6358db7c86c3b5d92"} Oct 03 14:08:40 crc kubenswrapper[4962]: I1003 14:08:40.424554 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18d0e67ae12c5bef77918af229e09a13e28346e9cd2ddbc6358db7c86c3b5d92" Oct 03 14:08:40 crc kubenswrapper[4962]: I1003 14:08:40.424355 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-qhpqx" Oct 03 14:08:41 crc kubenswrapper[4962]: I1003 14:08:41.916483 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-qhpqx"] Oct 03 14:08:41 crc kubenswrapper[4962]: I1003 14:08:41.926763 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-qhpqx"] Oct 03 14:08:42 crc kubenswrapper[4962]: I1003 14:08:42.029032 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-pgxks"] Oct 03 14:08:42 crc kubenswrapper[4962]: E1003 14:08:42.029412 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd18cf05-a17f-4beb-9ffa-deab9b6b3e66" containerName="storage" Oct 03 14:08:42 crc kubenswrapper[4962]: I1003 14:08:42.029434 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd18cf05-a17f-4beb-9ffa-deab9b6b3e66" containerName="storage" Oct 03 14:08:42 crc kubenswrapper[4962]: I1003 14:08:42.029590 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd18cf05-a17f-4beb-9ffa-deab9b6b3e66" containerName="storage" Oct 03 14:08:42 crc kubenswrapper[4962]: I1003 14:08:42.030272 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-pgxks" Oct 03 14:08:42 crc kubenswrapper[4962]: I1003 14:08:42.032805 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Oct 03 14:08:42 crc kubenswrapper[4962]: I1003 14:08:42.032846 4962 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-v8h8n" Oct 03 14:08:42 crc kubenswrapper[4962]: I1003 14:08:42.032828 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Oct 03 14:08:42 crc kubenswrapper[4962]: I1003 14:08:42.033079 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Oct 03 14:08:42 crc kubenswrapper[4962]: I1003 14:08:42.043587 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-pgxks"] Oct 03 14:08:42 crc kubenswrapper[4962]: I1003 14:08:42.044827 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/db4ef18b-0548-463a-97d2-f344e26159d0-crc-storage\") pod \"crc-storage-crc-pgxks\" (UID: \"db4ef18b-0548-463a-97d2-f344e26159d0\") " pod="crc-storage/crc-storage-crc-pgxks" Oct 03 14:08:42 crc kubenswrapper[4962]: I1003 14:08:42.044990 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnwpg\" (UniqueName: \"kubernetes.io/projected/db4ef18b-0548-463a-97d2-f344e26159d0-kube-api-access-mnwpg\") pod \"crc-storage-crc-pgxks\" (UID: \"db4ef18b-0548-463a-97d2-f344e26159d0\") " pod="crc-storage/crc-storage-crc-pgxks" Oct 03 14:08:42 crc kubenswrapper[4962]: I1003 14:08:42.045187 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/db4ef18b-0548-463a-97d2-f344e26159d0-node-mnt\") pod \"crc-storage-crc-pgxks\" (UID: \"db4ef18b-0548-463a-97d2-f344e26159d0\") " pod="crc-storage/crc-storage-crc-pgxks" Oct 03 14:08:42 crc kubenswrapper[4962]: I1003 14:08:42.146336 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnwpg\" (UniqueName: \"kubernetes.io/projected/db4ef18b-0548-463a-97d2-f344e26159d0-kube-api-access-mnwpg\") pod \"crc-storage-crc-pgxks\" (UID: \"db4ef18b-0548-463a-97d2-f344e26159d0\") " pod="crc-storage/crc-storage-crc-pgxks" Oct 03 14:08:42 crc kubenswrapper[4962]: I1003 14:08:42.146922 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/db4ef18b-0548-463a-97d2-f344e26159d0-node-mnt\") pod \"crc-storage-crc-pgxks\" (UID: \"db4ef18b-0548-463a-97d2-f344e26159d0\") " pod="crc-storage/crc-storage-crc-pgxks" Oct 03 14:08:42 crc kubenswrapper[4962]: I1003 14:08:42.147163 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/db4ef18b-0548-463a-97d2-f344e26159d0-crc-storage\") pod \"crc-storage-crc-pgxks\" (UID: \"db4ef18b-0548-463a-97d2-f344e26159d0\") " pod="crc-storage/crc-storage-crc-pgxks" Oct 03 14:08:42 crc kubenswrapper[4962]: I1003 14:08:42.147204 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/db4ef18b-0548-463a-97d2-f344e26159d0-node-mnt\") pod \"crc-storage-crc-pgxks\" (UID: \"db4ef18b-0548-463a-97d2-f344e26159d0\") " 
pod="crc-storage/crc-storage-crc-pgxks" Oct 03 14:08:42 crc kubenswrapper[4962]: I1003 14:08:42.147828 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/db4ef18b-0548-463a-97d2-f344e26159d0-crc-storage\") pod \"crc-storage-crc-pgxks\" (UID: \"db4ef18b-0548-463a-97d2-f344e26159d0\") " pod="crc-storage/crc-storage-crc-pgxks" Oct 03 14:08:42 crc kubenswrapper[4962]: I1003 14:08:42.185039 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnwpg\" (UniqueName: \"kubernetes.io/projected/db4ef18b-0548-463a-97d2-f344e26159d0-kube-api-access-mnwpg\") pod \"crc-storage-crc-pgxks\" (UID: \"db4ef18b-0548-463a-97d2-f344e26159d0\") " pod="crc-storage/crc-storage-crc-pgxks" Oct 03 14:08:42 crc kubenswrapper[4962]: I1003 14:08:42.234858 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd18cf05-a17f-4beb-9ffa-deab9b6b3e66" path="/var/lib/kubelet/pods/fd18cf05-a17f-4beb-9ffa-deab9b6b3e66/volumes" Oct 03 14:08:42 crc kubenswrapper[4962]: I1003 14:08:42.350915 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-pgxks" Oct 03 14:08:42 crc kubenswrapper[4962]: I1003 14:08:42.823312 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-pgxks"] Oct 03 14:08:43 crc kubenswrapper[4962]: I1003 14:08:43.465366 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-pgxks" event={"ID":"db4ef18b-0548-463a-97d2-f344e26159d0","Type":"ContainerStarted","Data":"6fcc13beaecebd6a5793a93412e05426269048edcbba9c7ccffdebfce5ed1f10"} Oct 03 14:08:44 crc kubenswrapper[4962]: I1003 14:08:44.474220 4962 generic.go:334] "Generic (PLEG): container finished" podID="db4ef18b-0548-463a-97d2-f344e26159d0" containerID="241928001ffd500a49cb6a1d7b1d3aef0cb53d64dedfee05ce5b0356c5466cec" exitCode=0 Oct 03 14:08:44 crc kubenswrapper[4962]: I1003 14:08:44.474314 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-pgxks" event={"ID":"db4ef18b-0548-463a-97d2-f344e26159d0","Type":"ContainerDied","Data":"241928001ffd500a49cb6a1d7b1d3aef0cb53d64dedfee05ce5b0356c5466cec"} Oct 03 14:08:45 crc kubenswrapper[4962]: I1003 14:08:45.778667 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-pgxks" Oct 03 14:08:45 crc kubenswrapper[4962]: I1003 14:08:45.801398 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/db4ef18b-0548-463a-97d2-f344e26159d0-crc-storage\") pod \"db4ef18b-0548-463a-97d2-f344e26159d0\" (UID: \"db4ef18b-0548-463a-97d2-f344e26159d0\") " Oct 03 14:08:45 crc kubenswrapper[4962]: I1003 14:08:45.801484 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnwpg\" (UniqueName: \"kubernetes.io/projected/db4ef18b-0548-463a-97d2-f344e26159d0-kube-api-access-mnwpg\") pod \"db4ef18b-0548-463a-97d2-f344e26159d0\" (UID: \"db4ef18b-0548-463a-97d2-f344e26159d0\") " Oct 03 14:08:45 crc kubenswrapper[4962]: I1003 14:08:45.801568 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/db4ef18b-0548-463a-97d2-f344e26159d0-node-mnt\") pod \"db4ef18b-0548-463a-97d2-f344e26159d0\" (UID: \"db4ef18b-0548-463a-97d2-f344e26159d0\") " Oct 03 14:08:45 crc kubenswrapper[4962]: I1003 14:08:45.802137 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db4ef18b-0548-463a-97d2-f344e26159d0-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "db4ef18b-0548-463a-97d2-f344e26159d0" (UID: "db4ef18b-0548-463a-97d2-f344e26159d0"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 14:08:45 crc kubenswrapper[4962]: I1003 14:08:45.827497 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db4ef18b-0548-463a-97d2-f344e26159d0-kube-api-access-mnwpg" (OuterVolumeSpecName: "kube-api-access-mnwpg") pod "db4ef18b-0548-463a-97d2-f344e26159d0" (UID: "db4ef18b-0548-463a-97d2-f344e26159d0"). InnerVolumeSpecName "kube-api-access-mnwpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:08:45 crc kubenswrapper[4962]: I1003 14:08:45.853161 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db4ef18b-0548-463a-97d2-f344e26159d0-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "db4ef18b-0548-463a-97d2-f344e26159d0" (UID: "db4ef18b-0548-463a-97d2-f344e26159d0"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:08:45 crc kubenswrapper[4962]: I1003 14:08:45.902930 4962 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/db4ef18b-0548-463a-97d2-f344e26159d0-crc-storage\") on node \"crc\" DevicePath \"\"" Oct 03 14:08:45 crc kubenswrapper[4962]: I1003 14:08:45.902959 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnwpg\" (UniqueName: \"kubernetes.io/projected/db4ef18b-0548-463a-97d2-f344e26159d0-kube-api-access-mnwpg\") on node \"crc\" DevicePath \"\"" Oct 03 14:08:45 crc kubenswrapper[4962]: I1003 14:08:45.902969 4962 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/db4ef18b-0548-463a-97d2-f344e26159d0-node-mnt\") on node \"crc\" DevicePath \"\"" Oct 03 14:08:46 crc kubenswrapper[4962]: I1003 14:08:46.493569 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-pgxks" event={"ID":"db4ef18b-0548-463a-97d2-f344e26159d0","Type":"ContainerDied","Data":"6fcc13beaecebd6a5793a93412e05426269048edcbba9c7ccffdebfce5ed1f10"} Oct 03 14:08:46 crc kubenswrapper[4962]: I1003 14:08:46.493906 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fcc13beaecebd6a5793a93412e05426269048edcbba9c7ccffdebfce5ed1f10" Oct 03 14:08:46 crc kubenswrapper[4962]: I1003 14:08:46.493632 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-pgxks" Oct 03 14:09:00 crc kubenswrapper[4962]: I1003 14:09:00.069282 4962 scope.go:117] "RemoveContainer" containerID="8eafc8a9744a26bbc6dc88f8a3287d7e2a69ed60f17c4123a7b6f28fbceed0b3" Oct 03 14:09:26 crc kubenswrapper[4962]: I1003 14:09:26.622373 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sng6g"] Oct 03 14:09:26 crc kubenswrapper[4962]: E1003 14:09:26.623317 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db4ef18b-0548-463a-97d2-f344e26159d0" containerName="storage" Oct 03 14:09:26 crc kubenswrapper[4962]: I1003 14:09:26.623336 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="db4ef18b-0548-463a-97d2-f344e26159d0" containerName="storage" Oct 03 14:09:26 crc kubenswrapper[4962]: I1003 14:09:26.623546 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="db4ef18b-0548-463a-97d2-f344e26159d0" containerName="storage" Oct 03 14:09:26 crc kubenswrapper[4962]: I1003 14:09:26.624608 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sng6g" Oct 03 14:09:26 crc kubenswrapper[4962]: I1003 14:09:26.631403 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sng6g"] Oct 03 14:09:26 crc kubenswrapper[4962]: I1003 14:09:26.724076 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceafd120-4066-48d1-85b5-e671972bb6f3-catalog-content\") pod \"redhat-marketplace-sng6g\" (UID: \"ceafd120-4066-48d1-85b5-e671972bb6f3\") " pod="openshift-marketplace/redhat-marketplace-sng6g" Oct 03 14:09:26 crc kubenswrapper[4962]: I1003 14:09:26.724373 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceafd120-4066-48d1-85b5-e671972bb6f3-utilities\") pod \"redhat-marketplace-sng6g\" (UID: \"ceafd120-4066-48d1-85b5-e671972bb6f3\") " pod="openshift-marketplace/redhat-marketplace-sng6g" Oct 03 14:09:26 crc kubenswrapper[4962]: I1003 14:09:26.724532 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqzbh\" (UniqueName: \"kubernetes.io/projected/ceafd120-4066-48d1-85b5-e671972bb6f3-kube-api-access-zqzbh\") pod \"redhat-marketplace-sng6g\" (UID: \"ceafd120-4066-48d1-85b5-e671972bb6f3\") " pod="openshift-marketplace/redhat-marketplace-sng6g" Oct 03 14:09:26 crc kubenswrapper[4962]: I1003 14:09:26.826100 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceafd120-4066-48d1-85b5-e671972bb6f3-catalog-content\") pod \"redhat-marketplace-sng6g\" (UID: \"ceafd120-4066-48d1-85b5-e671972bb6f3\") " pod="openshift-marketplace/redhat-marketplace-sng6g" Oct 03 14:09:26 crc kubenswrapper[4962]: I1003 14:09:26.826184 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceafd120-4066-48d1-85b5-e671972bb6f3-utilities\") pod \"redhat-marketplace-sng6g\" (UID: \"ceafd120-4066-48d1-85b5-e671972bb6f3\") " pod="openshift-marketplace/redhat-marketplace-sng6g" Oct 03 14:09:26 crc kubenswrapper[4962]: I1003 14:09:26.826241 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqzbh\" (UniqueName: \"kubernetes.io/projected/ceafd120-4066-48d1-85b5-e671972bb6f3-kube-api-access-zqzbh\") pod \"redhat-marketplace-sng6g\" (UID: \"ceafd120-4066-48d1-85b5-e671972bb6f3\") " pod="openshift-marketplace/redhat-marketplace-sng6g" Oct 03 14:09:26 crc kubenswrapper[4962]: I1003 14:09:26.826698 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceafd120-4066-48d1-85b5-e671972bb6f3-catalog-content\") pod \"redhat-marketplace-sng6g\" (UID: \"ceafd120-4066-48d1-85b5-e671972bb6f3\") " pod="openshift-marketplace/redhat-marketplace-sng6g" Oct 03 14:09:26 crc kubenswrapper[4962]: I1003 14:09:26.826744 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceafd120-4066-48d1-85b5-e671972bb6f3-utilities\") pod \"redhat-marketplace-sng6g\" (UID: \"ceafd120-4066-48d1-85b5-e671972bb6f3\") " pod="openshift-marketplace/redhat-marketplace-sng6g" Oct 03 14:09:26 crc kubenswrapper[4962]: I1003 14:09:26.847905 4962 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-zqzbh\" (UniqueName: \"kubernetes.io/projected/ceafd120-4066-48d1-85b5-e671972bb6f3-kube-api-access-zqzbh\") pod \"redhat-marketplace-sng6g\" (UID: \"ceafd120-4066-48d1-85b5-e671972bb6f3\") " pod="openshift-marketplace/redhat-marketplace-sng6g" Oct 03 14:09:26 crc kubenswrapper[4962]: I1003 14:09:26.948620 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sng6g" Oct 03 14:09:27 crc kubenswrapper[4962]: I1003 14:09:27.348788 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sng6g"] Oct 03 14:09:27 crc kubenswrapper[4962]: I1003 14:09:27.861352 4962 generic.go:334] "Generic (PLEG): container finished" podID="ceafd120-4066-48d1-85b5-e671972bb6f3" containerID="0213945b0af0c9eddd8eb4284e5eeb89ddfcff43070a231ffe3f5be54758b386" exitCode=0 Oct 03 14:09:27 crc kubenswrapper[4962]: I1003 14:09:27.861410 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sng6g" event={"ID":"ceafd120-4066-48d1-85b5-e671972bb6f3","Type":"ContainerDied","Data":"0213945b0af0c9eddd8eb4284e5eeb89ddfcff43070a231ffe3f5be54758b386"} Oct 03 14:09:27 crc kubenswrapper[4962]: I1003 14:09:27.861686 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sng6g" event={"ID":"ceafd120-4066-48d1-85b5-e671972bb6f3","Type":"ContainerStarted","Data":"e8688c4e5c6d09f5c5b9733ce8e8cae982b311aab5db5676859fba3cf08b8e0d"} Oct 03 14:09:29 crc kubenswrapper[4962]: I1003 14:09:29.877003 4962 generic.go:334] "Generic (PLEG): container finished" podID="ceafd120-4066-48d1-85b5-e671972bb6f3" containerID="1ec87a3efe0157442d78681082c91e68db4b0ae91f98f6952f200d8e1cdc66d9" exitCode=0 Oct 03 14:09:29 crc kubenswrapper[4962]: I1003 14:09:29.877213 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sng6g" event={"ID":"ceafd120-4066-48d1-85b5-e671972bb6f3","Type":"ContainerDied","Data":"1ec87a3efe0157442d78681082c91e68db4b0ae91f98f6952f200d8e1cdc66d9"} Oct 03 14:09:31 crc kubenswrapper[4962]: I1003 14:09:31.901678 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sng6g" event={"ID":"ceafd120-4066-48d1-85b5-e671972bb6f3","Type":"ContainerStarted","Data":"ed7625ee804c1e1df5d94b24c494326df59c4f3f1e1403954377590deda38949"} Oct 03 14:09:31 crc kubenswrapper[4962]: I1003 14:09:31.924755 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sng6g" podStartSLOduration=2.919400265 podStartE2EDuration="5.92473121s" podCreationTimestamp="2025-10-03 14:09:26 +0000 UTC" firstStartedPulling="2025-10-03 14:09:27.863202633 +0000 UTC m=+4776.267100458" lastFinishedPulling="2025-10-03 14:09:30.868533568 +0000 UTC m=+4779.272431403" observedRunningTime="2025-10-03 14:09:31.923050336 +0000 UTC m=+4780.326948221" watchObservedRunningTime="2025-10-03 14:09:31.92473121 +0000 UTC m=+4780.328629045" Oct 03 14:09:36 crc kubenswrapper[4962]: I1003 14:09:36.948805 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sng6g" Oct 03 14:09:36 crc kubenswrapper[4962]: I1003 14:09:36.949536 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sng6g" Oct 03 14:09:37 crc kubenswrapper[4962]: I1003 14:09:37.013967 4962 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sng6g" Oct 03 14:09:38 crc kubenswrapper[4962]: I1003 14:09:38.025224 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sng6g" Oct 03 14:09:38 crc kubenswrapper[4962]: I1003 14:09:38.078070 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sng6g"] Oct 03 14:09:39 crc kubenswrapper[4962]: I1003 14:09:39.985590 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sng6g" podUID="ceafd120-4066-48d1-85b5-e671972bb6f3" containerName="registry-server" containerID="cri-o://ed7625ee804c1e1df5d94b24c494326df59c4f3f1e1403954377590deda38949" gracePeriod=2 Oct 03 14:09:40 crc kubenswrapper[4962]: I1003 14:09:40.434970 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sng6g" Oct 03 14:09:40 crc kubenswrapper[4962]: I1003 14:09:40.627744 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceafd120-4066-48d1-85b5-e671972bb6f3-utilities\") pod \"ceafd120-4066-48d1-85b5-e671972bb6f3\" (UID: \"ceafd120-4066-48d1-85b5-e671972bb6f3\") " Oct 03 14:09:40 crc kubenswrapper[4962]: I1003 14:09:40.627828 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceafd120-4066-48d1-85b5-e671972bb6f3-catalog-content\") pod \"ceafd120-4066-48d1-85b5-e671972bb6f3\" (UID: \"ceafd120-4066-48d1-85b5-e671972bb6f3\") " Oct 03 14:09:40 crc kubenswrapper[4962]: I1003 14:09:40.627850 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqzbh\" (UniqueName: \"kubernetes.io/projected/ceafd120-4066-48d1-85b5-e671972bb6f3-kube-api-access-zqzbh\") pod \"ceafd120-4066-48d1-85b5-e671972bb6f3\" (UID: \"ceafd120-4066-48d1-85b5-e671972bb6f3\") " Oct 03 14:09:40 crc kubenswrapper[4962]: I1003 14:09:40.629271 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ceafd120-4066-48d1-85b5-e671972bb6f3-utilities" (OuterVolumeSpecName: "utilities") pod "ceafd120-4066-48d1-85b5-e671972bb6f3" (UID: "ceafd120-4066-48d1-85b5-e671972bb6f3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:09:40 crc kubenswrapper[4962]: I1003 14:09:40.632718 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ceafd120-4066-48d1-85b5-e671972bb6f3-kube-api-access-zqzbh" (OuterVolumeSpecName: "kube-api-access-zqzbh") pod "ceafd120-4066-48d1-85b5-e671972bb6f3" (UID: "ceafd120-4066-48d1-85b5-e671972bb6f3"). InnerVolumeSpecName "kube-api-access-zqzbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:09:40 crc kubenswrapper[4962]: I1003 14:09:40.640542 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ceafd120-4066-48d1-85b5-e671972bb6f3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ceafd120-4066-48d1-85b5-e671972bb6f3" (UID: "ceafd120-4066-48d1-85b5-e671972bb6f3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:09:40 crc kubenswrapper[4962]: I1003 14:09:40.729746 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceafd120-4066-48d1-85b5-e671972bb6f3-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 14:09:40 crc kubenswrapper[4962]: I1003 14:09:40.729785 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqzbh\" (UniqueName: \"kubernetes.io/projected/ceafd120-4066-48d1-85b5-e671972bb6f3-kube-api-access-zqzbh\") on node \"crc\" DevicePath \"\"" Oct 03 14:09:40 crc kubenswrapper[4962]: I1003 14:09:40.729797 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceafd120-4066-48d1-85b5-e671972bb6f3-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 14:09:40 crc kubenswrapper[4962]: I1003 14:09:40.994516 4962 generic.go:334] "Generic (PLEG): container finished" podID="ceafd120-4066-48d1-85b5-e671972bb6f3" containerID="ed7625ee804c1e1df5d94b24c494326df59c4f3f1e1403954377590deda38949" exitCode=0 Oct 03 14:09:40 crc kubenswrapper[4962]: I1003 14:09:40.994568 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sng6g" event={"ID":"ceafd120-4066-48d1-85b5-e671972bb6f3","Type":"ContainerDied","Data":"ed7625ee804c1e1df5d94b24c494326df59c4f3f1e1403954377590deda38949"} Oct 03 14:09:40 crc kubenswrapper[4962]: I1003 14:09:40.994598 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sng6g" event={"ID":"ceafd120-4066-48d1-85b5-e671972bb6f3","Type":"ContainerDied","Data":"e8688c4e5c6d09f5c5b9733ce8e8cae982b311aab5db5676859fba3cf08b8e0d"} Oct 03 14:09:40 crc kubenswrapper[4962]: I1003 14:09:40.994620 4962 scope.go:117] "RemoveContainer" containerID="ed7625ee804c1e1df5d94b24c494326df59c4f3f1e1403954377590deda38949" Oct 03 14:09:40 crc kubenswrapper[4962]: I1003 14:09:40.994631 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sng6g" Oct 03 14:09:41 crc kubenswrapper[4962]: I1003 14:09:41.022556 4962 scope.go:117] "RemoveContainer" containerID="1ec87a3efe0157442d78681082c91e68db4b0ae91f98f6952f200d8e1cdc66d9" Oct 03 14:09:41 crc kubenswrapper[4962]: I1003 14:09:41.034294 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sng6g"] Oct 03 14:09:41 crc kubenswrapper[4962]: I1003 14:09:41.039009 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sng6g"] Oct 03 14:09:41 crc kubenswrapper[4962]: I1003 14:09:41.062381 4962 scope.go:117] "RemoveContainer" containerID="0213945b0af0c9eddd8eb4284e5eeb89ddfcff43070a231ffe3f5be54758b386" Oct 03 14:09:41 crc kubenswrapper[4962]: I1003 14:09:41.082093 4962 scope.go:117] "RemoveContainer" containerID="ed7625ee804c1e1df5d94b24c494326df59c4f3f1e1403954377590deda38949" Oct 03 14:09:41 crc kubenswrapper[4962]: E1003 14:09:41.082700 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed7625ee804c1e1df5d94b24c494326df59c4f3f1e1403954377590deda38949\": container with ID starting with ed7625ee804c1e1df5d94b24c494326df59c4f3f1e1403954377590deda38949 not found: ID does not exist" containerID="ed7625ee804c1e1df5d94b24c494326df59c4f3f1e1403954377590deda38949" Oct 03 14:09:41 crc kubenswrapper[4962]: I1003 14:09:41.082742 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed7625ee804c1e1df5d94b24c494326df59c4f3f1e1403954377590deda38949"} err="failed to get container status \"ed7625ee804c1e1df5d94b24c494326df59c4f3f1e1403954377590deda38949\": rpc error: code = NotFound desc = could not find container \"ed7625ee804c1e1df5d94b24c494326df59c4f3f1e1403954377590deda38949\": container with ID starting with ed7625ee804c1e1df5d94b24c494326df59c4f3f1e1403954377590deda38949 not found: ID does not exist" Oct 03 14:09:41 crc kubenswrapper[4962]: I1003 14:09:41.082764 4962 scope.go:117] "RemoveContainer" containerID="1ec87a3efe0157442d78681082c91e68db4b0ae91f98f6952f200d8e1cdc66d9" Oct 03 14:09:41 crc kubenswrapper[4962]: E1003 14:09:41.083172 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ec87a3efe0157442d78681082c91e68db4b0ae91f98f6952f200d8e1cdc66d9\": container with ID starting with 1ec87a3efe0157442d78681082c91e68db4b0ae91f98f6952f200d8e1cdc66d9 not found: ID does not exist" containerID="1ec87a3efe0157442d78681082c91e68db4b0ae91f98f6952f200d8e1cdc66d9" Oct 03 14:09:41 crc kubenswrapper[4962]: I1003 14:09:41.083296 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ec87a3efe0157442d78681082c91e68db4b0ae91f98f6952f200d8e1cdc66d9"} err="failed to get container status \"1ec87a3efe0157442d78681082c91e68db4b0ae91f98f6952f200d8e1cdc66d9\": rpc error: code = NotFound desc = could not find container \"1ec87a3efe0157442d78681082c91e68db4b0ae91f98f6952f200d8e1cdc66d9\": container with ID starting with 1ec87a3efe0157442d78681082c91e68db4b0ae91f98f6952f200d8e1cdc66d9 not found: ID does not exist" Oct 03 14:09:41 crc kubenswrapper[4962]: I1003 14:09:41.083397 4962 scope.go:117] "RemoveContainer" containerID="0213945b0af0c9eddd8eb4284e5eeb89ddfcff43070a231ffe3f5be54758b386" Oct 03 14:09:41 crc kubenswrapper[4962]: E1003 14:09:41.083822 4962 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"0213945b0af0c9eddd8eb4284e5eeb89ddfcff43070a231ffe3f5be54758b386\": container with ID starting with 0213945b0af0c9eddd8eb4284e5eeb89ddfcff43070a231ffe3f5be54758b386 not found: ID does not exist" containerID="0213945b0af0c9eddd8eb4284e5eeb89ddfcff43070a231ffe3f5be54758b386" Oct 03 14:09:41 crc kubenswrapper[4962]: I1003 14:09:41.083861 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0213945b0af0c9eddd8eb4284e5eeb89ddfcff43070a231ffe3f5be54758b386"} err="failed to get container status \"0213945b0af0c9eddd8eb4284e5eeb89ddfcff43070a231ffe3f5be54758b386\": rpc error: code = NotFound desc = could not find container \"0213945b0af0c9eddd8eb4284e5eeb89ddfcff43070a231ffe3f5be54758b386\": container with ID starting with 0213945b0af0c9eddd8eb4284e5eeb89ddfcff43070a231ffe3f5be54758b386 not found: ID does not exist" Oct 03 14:09:42 crc kubenswrapper[4962]: I1003 14:09:42.247247 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ceafd120-4066-48d1-85b5-e671972bb6f3" path="/var/lib/kubelet/pods/ceafd120-4066-48d1-85b5-e671972bb6f3/volumes" Oct 03 14:10:24 crc kubenswrapper[4962]: I1003 14:10:24.663103 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:10:24 crc kubenswrapper[4962]: I1003 14:10:24.664115 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:10:54 crc kubenswrapper[4962]: I1003 14:10:54.660099 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:10:54 crc kubenswrapper[4962]: I1003 14:10:54.660700 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:11:24 crc kubenswrapper[4962]: I1003 14:11:24.662490 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:11:24 crc kubenswrapper[4962]: I1003 14:11:24.663271 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:11:24 crc kubenswrapper[4962]: I1003 14:11:24.663351 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-46vck" Oct 03 14:11:24 crc kubenswrapper[4962]: I1003 14:11:24.664196 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cbb6e91cb75c7a810a75bc675a8082724b8bbf2a8a48579c538b63a351ed5feb"} pod="openshift-machine-config-operator/machine-config-daemon-46vck" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 14:11:24 crc kubenswrapper[4962]: I1003 14:11:24.664284 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" containerID="cri-o://cbb6e91cb75c7a810a75bc675a8082724b8bbf2a8a48579c538b63a351ed5feb" gracePeriod=600 Oct 03 14:11:24 crc kubenswrapper[4962]: I1003 14:11:24.861112 4962 generic.go:334] "Generic (PLEG): container finished" podID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerID="cbb6e91cb75c7a810a75bc675a8082724b8bbf2a8a48579c538b63a351ed5feb" exitCode=0 Oct 03 14:11:24 crc kubenswrapper[4962]: I1003 14:11:24.861224 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerDied","Data":"cbb6e91cb75c7a810a75bc675a8082724b8bbf2a8a48579c538b63a351ed5feb"} Oct 03 14:11:24 crc kubenswrapper[4962]: I1003 14:11:24.861367 4962 scope.go:117] "RemoveContainer" containerID="78fd53b856a2c25b3aeed835db87e455f1051e67aa0b9805eefac674488eb7e4" Oct 03 14:11:25 crc kubenswrapper[4962]: I1003 14:11:25.873428 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerStarted","Data":"a52df5deff87c27e42ab4abdf36cdf07c37d1bccfe3bbd74005bdf3b8177d5cc"} Oct 03 14:11:51 crc kubenswrapper[4962]: I1003 14:11:51.785429 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sfvkx"] Oct 03 14:11:51 crc kubenswrapper[4962]: E1003 14:11:51.787386 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceafd120-4066-48d1-85b5-e671972bb6f3" containerName="extract-utilities" Oct 03 14:11:51 crc kubenswrapper[4962]: I1003 14:11:51.787424 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceafd120-4066-48d1-85b5-e671972bb6f3" containerName="extract-utilities" Oct 03 14:11:51 crc kubenswrapper[4962]: E1003 14:11:51.787482 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceafd120-4066-48d1-85b5-e671972bb6f3" containerName="registry-server" Oct 03 14:11:51 crc kubenswrapper[4962]: I1003 14:11:51.787496 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceafd120-4066-48d1-85b5-e671972bb6f3" containerName="registry-server" Oct 03 14:11:51 crc kubenswrapper[4962]: E1003 14:11:51.787528 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceafd120-4066-48d1-85b5-e671972bb6f3" containerName="extract-content" Oct 03 14:11:51 crc kubenswrapper[4962]: I1003 14:11:51.787544 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceafd120-4066-48d1-85b5-e671972bb6f3" containerName="extract-content" Oct 03 14:11:51 crc kubenswrapper[4962]: I1003 14:11:51.787930 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="ceafd120-4066-48d1-85b5-e671972bb6f3" 
containerName="registry-server" Oct 03 14:11:51 crc kubenswrapper[4962]: I1003 14:11:51.790857 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sfvkx" Oct 03 14:11:51 crc kubenswrapper[4962]: I1003 14:11:51.806682 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sfvkx"] Oct 03 14:11:51 crc kubenswrapper[4962]: I1003 14:11:51.871586 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69d69\" (UniqueName: \"kubernetes.io/projected/c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60-kube-api-access-69d69\") pod \"redhat-operators-sfvkx\" (UID: \"c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60\") " pod="openshift-marketplace/redhat-operators-sfvkx" Oct 03 14:11:51 crc kubenswrapper[4962]: I1003 14:11:51.872057 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60-utilities\") pod \"redhat-operators-sfvkx\" (UID: \"c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60\") " pod="openshift-marketplace/redhat-operators-sfvkx" Oct 03 14:11:51 crc kubenswrapper[4962]: I1003 14:11:51.872095 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60-catalog-content\") pod \"redhat-operators-sfvkx\" (UID: \"c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60\") " pod="openshift-marketplace/redhat-operators-sfvkx" Oct 03 14:11:51 crc kubenswrapper[4962]: I1003 14:11:51.973812 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60-utilities\") pod \"redhat-operators-sfvkx\" (UID: \"c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60\") " pod="openshift-marketplace/redhat-operators-sfvkx" Oct 03 14:11:51 crc kubenswrapper[4962]: I1003 14:11:51.973896 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60-catalog-content\") pod \"redhat-operators-sfvkx\" (UID: \"c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60\") " pod="openshift-marketplace/redhat-operators-sfvkx" Oct 03 14:11:51 crc kubenswrapper[4962]: I1003 14:11:51.974054 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69d69\" (UniqueName: \"kubernetes.io/projected/c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60-kube-api-access-69d69\") pod \"redhat-operators-sfvkx\" (UID: \"c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60\") " pod="openshift-marketplace/redhat-operators-sfvkx" Oct 03 14:11:51 crc kubenswrapper[4962]: I1003 14:11:51.974413 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60-utilities\") pod \"redhat-operators-sfvkx\" (UID: \"c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60\") " pod="openshift-marketplace/redhat-operators-sfvkx" Oct 03 14:11:51 crc kubenswrapper[4962]: I1003 14:11:51.974839 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60-catalog-content\") pod \"redhat-operators-sfvkx\" (UID: \"c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60\") " pod="openshift-marketplace/redhat-operators-sfvkx" Oct 
03 14:11:51 crc kubenswrapper[4962]: I1003 14:11:51.997735 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69d69\" (UniqueName: \"kubernetes.io/projected/c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60-kube-api-access-69d69\") pod \"redhat-operators-sfvkx\" (UID: \"c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60\") " pod="openshift-marketplace/redhat-operators-sfvkx" Oct 03 14:11:52 crc kubenswrapper[4962]: I1003 14:11:52.117119 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sfvkx" Oct 03 14:11:52 crc kubenswrapper[4962]: I1003 14:11:52.627461 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sfvkx"] Oct 03 14:11:53 crc kubenswrapper[4962]: I1003 14:11:53.116338 4962 generic.go:334] "Generic (PLEG): container finished" podID="c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60" containerID="b5d9060deffa1fc43b18fb4596d4ec8c92332da2a798a10a7fdd9b26a2e5cc47" exitCode=0 Oct 03 14:11:53 crc kubenswrapper[4962]: I1003 14:11:53.116417 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sfvkx" event={"ID":"c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60","Type":"ContainerDied","Data":"b5d9060deffa1fc43b18fb4596d4ec8c92332da2a798a10a7fdd9b26a2e5cc47"} Oct 03 14:11:53 crc kubenswrapper[4962]: I1003 14:11:53.116908 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sfvkx" event={"ID":"c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60","Type":"ContainerStarted","Data":"44ee85c61cb5fd5b5bec989ec4d15639926bdf3e3f2d43077e6b045d383efe09"} Oct 03 14:11:55 crc kubenswrapper[4962]: I1003 14:11:55.135174 4962 generic.go:334] "Generic (PLEG): container finished" podID="c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60" containerID="6e6eb12231b7ddc49cc50220e8c17f616b62f0c3720ce67cf7491996050707f4" exitCode=0 Oct 03 14:11:55 crc kubenswrapper[4962]: I1003 14:11:55.135243 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sfvkx" event={"ID":"c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60","Type":"ContainerDied","Data":"6e6eb12231b7ddc49cc50220e8c17f616b62f0c3720ce67cf7491996050707f4"} Oct 03 14:11:57 crc kubenswrapper[4962]: I1003 14:11:57.156177 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sfvkx" event={"ID":"c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60","Type":"ContainerStarted","Data":"be5d192f6fabf08f3a99d9c37582771cc1ff56f28a42198164aa8b1dcd122307"} Oct 03 14:11:57 crc kubenswrapper[4962]: I1003 14:11:57.181030 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sfvkx" podStartSLOduration=2.592424382 podStartE2EDuration="6.181002736s" podCreationTimestamp="2025-10-03 14:11:51 +0000 UTC" firstStartedPulling="2025-10-03 14:11:53.118291826 +0000 UTC m=+4921.522189661" lastFinishedPulling="2025-10-03 14:11:56.70687017 +0000 UTC m=+4925.110768015" observedRunningTime="2025-10-03 14:11:57.175471578 +0000 UTC m=+4925.579369433" watchObservedRunningTime="2025-10-03 14:11:57.181002736 +0000 UTC m=+4925.584900611" Oct 03 14:11:59 crc kubenswrapper[4962]: I1003 14:11:59.776355 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-578wq"] Oct 03 14:11:59 crc kubenswrapper[4962]: I1003 14:11:59.779147 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-578wq" Oct 03 14:11:59 crc kubenswrapper[4962]: I1003 14:11:59.781052 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 03 14:11:59 crc kubenswrapper[4962]: I1003 14:11:59.781329 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 03 14:11:59 crc kubenswrapper[4962]: I1003 14:11:59.781875 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 03 14:11:59 crc kubenswrapper[4962]: I1003 14:11:59.786473 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 03 14:11:59 crc kubenswrapper[4962]: I1003 14:11:59.786504 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-cbplh" Oct 03 14:11:59 crc kubenswrapper[4962]: I1003 14:11:59.793544 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-578wq"] Oct 03 14:11:59 crc kubenswrapper[4962]: I1003 14:11:59.914897 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da4134cd-3c68-4bd0-ac1f-a557903d42eb-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-578wq\" (UID: \"da4134cd-3c68-4bd0-ac1f-a557903d42eb\") " pod="openstack/dnsmasq-dns-5d7b5456f5-578wq" Oct 03 14:11:59 crc kubenswrapper[4962]: I1003 14:11:59.914957 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da4134cd-3c68-4bd0-ac1f-a557903d42eb-config\") pod \"dnsmasq-dns-5d7b5456f5-578wq\" (UID: \"da4134cd-3c68-4bd0-ac1f-a557903d42eb\") " pod="openstack/dnsmasq-dns-5d7b5456f5-578wq" Oct 03 14:11:59 crc kubenswrapper[4962]: I1003 14:11:59.914983 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p55b\" (UniqueName: \"kubernetes.io/projected/da4134cd-3c68-4bd0-ac1f-a557903d42eb-kube-api-access-4p55b\") pod \"dnsmasq-dns-5d7b5456f5-578wq\" (UID: \"da4134cd-3c68-4bd0-ac1f-a557903d42eb\") " pod="openstack/dnsmasq-dns-5d7b5456f5-578wq" Oct 03 14:12:00 crc kubenswrapper[4962]: I1003 14:12:00.016133 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da4134cd-3c68-4bd0-ac1f-a557903d42eb-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-578wq\" (UID: \"da4134cd-3c68-4bd0-ac1f-a557903d42eb\") " pod="openstack/dnsmasq-dns-5d7b5456f5-578wq" Oct 03 14:12:00 crc kubenswrapper[4962]: I1003 14:12:00.016216 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da4134cd-3c68-4bd0-ac1f-a557903d42eb-config\") pod \"dnsmasq-dns-5d7b5456f5-578wq\" (UID: \"da4134cd-3c68-4bd0-ac1f-a557903d42eb\") " pod="openstack/dnsmasq-dns-5d7b5456f5-578wq" Oct 03 14:12:00 crc kubenswrapper[4962]: I1003 14:12:00.016249 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p55b\" (UniqueName: \"kubernetes.io/projected/da4134cd-3c68-4bd0-ac1f-a557903d42eb-kube-api-access-4p55b\") pod \"dnsmasq-dns-5d7b5456f5-578wq\" (UID: \"da4134cd-3c68-4bd0-ac1f-a557903d42eb\") " pod="openstack/dnsmasq-dns-5d7b5456f5-578wq" Oct 03 14:12:00 crc kubenswrapper[4962]: I1003 14:12:00.017260 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/da4134cd-3c68-4bd0-ac1f-a557903d42eb-config\") pod \"dnsmasq-dns-5d7b5456f5-578wq\" (UID: \"da4134cd-3c68-4bd0-ac1f-a557903d42eb\") " pod="openstack/dnsmasq-dns-5d7b5456f5-578wq" Oct 03 14:12:00 crc kubenswrapper[4962]: I1003 14:12:00.017272 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da4134cd-3c68-4bd0-ac1f-a557903d42eb-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-578wq\" (UID: \"da4134cd-3c68-4bd0-ac1f-a557903d42eb\") " pod="openstack/dnsmasq-dns-5d7b5456f5-578wq" Oct 03 14:12:00 crc kubenswrapper[4962]: I1003 14:12:00.089223 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p55b\" (UniqueName: \"kubernetes.io/projected/da4134cd-3c68-4bd0-ac1f-a557903d42eb-kube-api-access-4p55b\") pod \"dnsmasq-dns-5d7b5456f5-578wq\" (UID: \"da4134cd-3c68-4bd0-ac1f-a557903d42eb\") " pod="openstack/dnsmasq-dns-5d7b5456f5-578wq" Oct 03 14:12:00 crc kubenswrapper[4962]: I1003 14:12:00.099065 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-578wq" Oct 03 14:12:00 crc kubenswrapper[4962]: I1003 14:12:00.120049 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-8wg2h"] Oct 03 14:12:00 crc kubenswrapper[4962]: I1003 14:12:00.126931 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-8wg2h" Oct 03 14:12:00 crc kubenswrapper[4962]: I1003 14:12:00.180598 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-8wg2h"] Oct 03 14:12:00 crc kubenswrapper[4962]: I1003 14:12:00.220257 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84-config\") pod \"dnsmasq-dns-98ddfc8f-8wg2h\" (UID: \"8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84\") " pod="openstack/dnsmasq-dns-98ddfc8f-8wg2h" Oct 03 14:12:00 crc kubenswrapper[4962]: I1003 14:12:00.220329 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-8wg2h\" (UID: \"8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84\") " pod="openstack/dnsmasq-dns-98ddfc8f-8wg2h" Oct 03 14:12:00 crc kubenswrapper[4962]: I1003 14:12:00.220364 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcdhc\" (UniqueName: \"kubernetes.io/projected/8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84-kube-api-access-hcdhc\") pod \"dnsmasq-dns-98ddfc8f-8wg2h\" (UID: \"8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84\") " pod="openstack/dnsmasq-dns-98ddfc8f-8wg2h" Oct 03 14:12:00 crc kubenswrapper[4962]: I1003 14:12:00.321847 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84-config\") pod \"dnsmasq-dns-98ddfc8f-8wg2h\" (UID: \"8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84\") " pod="openstack/dnsmasq-dns-98ddfc8f-8wg2h" Oct 03 14:12:00 crc kubenswrapper[4962]: I1003 14:12:00.321901 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-8wg2h\" (UID: 
\"8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84\") " pod="openstack/dnsmasq-dns-98ddfc8f-8wg2h" Oct 03 14:12:00 crc kubenswrapper[4962]: I1003 14:12:00.321932 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcdhc\" (UniqueName: \"kubernetes.io/projected/8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84-kube-api-access-hcdhc\") pod \"dnsmasq-dns-98ddfc8f-8wg2h\" (UID: \"8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84\") " pod="openstack/dnsmasq-dns-98ddfc8f-8wg2h" Oct 03 14:12:00 crc kubenswrapper[4962]: I1003 14:12:00.323605 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84-config\") pod \"dnsmasq-dns-98ddfc8f-8wg2h\" (UID: \"8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84\") " pod="openstack/dnsmasq-dns-98ddfc8f-8wg2h" Oct 03 14:12:00 crc kubenswrapper[4962]: I1003 14:12:00.323749 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-8wg2h\" (UID: \"8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84\") " pod="openstack/dnsmasq-dns-98ddfc8f-8wg2h" Oct 03 14:12:00 crc kubenswrapper[4962]: I1003 14:12:00.363594 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcdhc\" (UniqueName: \"kubernetes.io/projected/8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84-kube-api-access-hcdhc\") pod \"dnsmasq-dns-98ddfc8f-8wg2h\" (UID: \"8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84\") " pod="openstack/dnsmasq-dns-98ddfc8f-8wg2h" Oct 03 14:12:00 crc kubenswrapper[4962]: I1003 14:12:00.501332 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-8wg2h" Oct 03 14:12:00 crc kubenswrapper[4962]: I1003 14:12:00.613921 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-578wq"] Oct 03 14:12:00 crc kubenswrapper[4962]: I1003 14:12:00.950516 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 14:12:00 crc kubenswrapper[4962]: I1003 14:12:00.951801 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 14:12:00 crc kubenswrapper[4962]: I1003 14:12:00.953239 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 03 14:12:00 crc kubenswrapper[4962]: I1003 14:12:00.953366 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 03 14:12:00 crc kubenswrapper[4962]: I1003 14:12:00.953597 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 03 14:12:00 crc kubenswrapper[4962]: I1003 14:12:00.953763 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 03 14:12:00 crc kubenswrapper[4962]: I1003 14:12:00.953987 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-lv5fj" Oct 03 14:12:00 crc kubenswrapper[4962]: I1003 14:12:00.962137 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.030680 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8341a841-5d4e-4824-9b74-255b401ab6e7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8341a841-5d4e-4824-9b74-255b401ab6e7\") " pod="openstack/rabbitmq-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.030751 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8341a841-5d4e-4824-9b74-255b401ab6e7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8341a841-5d4e-4824-9b74-255b401ab6e7\") " pod="openstack/rabbitmq-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.030915 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffj4l\" (UniqueName: \"kubernetes.io/projected/8341a841-5d4e-4824-9b74-255b401ab6e7-kube-api-access-ffj4l\") pod \"rabbitmq-server-0\" (UID: \"8341a841-5d4e-4824-9b74-255b401ab6e7\") " pod="openstack/rabbitmq-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.030972 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8341a841-5d4e-4824-9b74-255b401ab6e7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8341a841-5d4e-4824-9b74-255b401ab6e7\") " pod="openstack/rabbitmq-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.031053 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8341a841-5d4e-4824-9b74-255b401ab6e7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8341a841-5d4e-4824-9b74-255b401ab6e7\") " pod="openstack/rabbitmq-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.031201 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8341a841-5d4e-4824-9b74-255b401ab6e7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8341a841-5d4e-4824-9b74-255b401ab6e7\") " pod="openstack/rabbitmq-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.031231 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-3f9ffb45-d2a6-4019-8194-99d39124f45a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f9ffb45-d2a6-4019-8194-99d39124f45a\") pod \"rabbitmq-server-0\" (UID: \"8341a841-5d4e-4824-9b74-255b401ab6e7\") " pod="openstack/rabbitmq-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.031261 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8341a841-5d4e-4824-9b74-255b401ab6e7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8341a841-5d4e-4824-9b74-255b401ab6e7\") " pod="openstack/rabbitmq-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.031285 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8341a841-5d4e-4824-9b74-255b401ab6e7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8341a841-5d4e-4824-9b74-255b401ab6e7\") " pod="openstack/rabbitmq-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.115079 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-8wg2h"] Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.132367 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8341a841-5d4e-4824-9b74-255b401ab6e7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8341a841-5d4e-4824-9b74-255b401ab6e7\") " pod="openstack/rabbitmq-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.132418 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3f9ffb45-d2a6-4019-8194-99d39124f45a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f9ffb45-d2a6-4019-8194-99d39124f45a\") pod \"rabbitmq-server-0\" (UID: \"8341a841-5d4e-4824-9b74-255b401ab6e7\") " pod="openstack/rabbitmq-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.132460 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8341a841-5d4e-4824-9b74-255b401ab6e7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8341a841-5d4e-4824-9b74-255b401ab6e7\") " pod="openstack/rabbitmq-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.132497 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8341a841-5d4e-4824-9b74-255b401ab6e7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8341a841-5d4e-4824-9b74-255b401ab6e7\") " pod="openstack/rabbitmq-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.132531 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8341a841-5d4e-4824-9b74-255b401ab6e7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8341a841-5d4e-4824-9b74-255b401ab6e7\") " pod="openstack/rabbitmq-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.132556 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8341a841-5d4e-4824-9b74-255b401ab6e7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8341a841-5d4e-4824-9b74-255b401ab6e7\") " pod="openstack/rabbitmq-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.132603 4962 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffj4l\" (UniqueName: \"kubernetes.io/projected/8341a841-5d4e-4824-9b74-255b401ab6e7-kube-api-access-ffj4l\") pod \"rabbitmq-server-0\" (UID: \"8341a841-5d4e-4824-9b74-255b401ab6e7\") " pod="openstack/rabbitmq-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.132628 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8341a841-5d4e-4824-9b74-255b401ab6e7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8341a841-5d4e-4824-9b74-255b401ab6e7\") " pod="openstack/rabbitmq-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.132688 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8341a841-5d4e-4824-9b74-255b401ab6e7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8341a841-5d4e-4824-9b74-255b401ab6e7\") " pod="openstack/rabbitmq-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.134886 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8341a841-5d4e-4824-9b74-255b401ab6e7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8341a841-5d4e-4824-9b74-255b401ab6e7\") " pod="openstack/rabbitmq-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.136849 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8341a841-5d4e-4824-9b74-255b401ab6e7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8341a841-5d4e-4824-9b74-255b401ab6e7\") " pod="openstack/rabbitmq-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.137131 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8341a841-5d4e-4824-9b74-255b401ab6e7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8341a841-5d4e-4824-9b74-255b401ab6e7\") " pod="openstack/rabbitmq-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.137718 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8341a841-5d4e-4824-9b74-255b401ab6e7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8341a841-5d4e-4824-9b74-255b401ab6e7\") " pod="openstack/rabbitmq-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.138824 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8341a841-5d4e-4824-9b74-255b401ab6e7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8341a841-5d4e-4824-9b74-255b401ab6e7\") " pod="openstack/rabbitmq-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.153369 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8341a841-5d4e-4824-9b74-255b401ab6e7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8341a841-5d4e-4824-9b74-255b401ab6e7\") " pod="openstack/rabbitmq-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.153661 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8341a841-5d4e-4824-9b74-255b401ab6e7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8341a841-5d4e-4824-9b74-255b401ab6e7\") " 
pod="openstack/rabbitmq-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.156170 4962 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.156205 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3f9ffb45-d2a6-4019-8194-99d39124f45a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f9ffb45-d2a6-4019-8194-99d39124f45a\") pod \"rabbitmq-server-0\" (UID: \"8341a841-5d4e-4824-9b74-255b401ab6e7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4fc431a82d9d501e555eb4dc8b5d716e40442cc41995001ff7c8bed8b2246801/globalmount\"" pod="openstack/rabbitmq-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.157486 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffj4l\" (UniqueName: \"kubernetes.io/projected/8341a841-5d4e-4824-9b74-255b401ab6e7-kube-api-access-ffj4l\") pod \"rabbitmq-server-0\" (UID: \"8341a841-5d4e-4824-9b74-255b401ab6e7\") " pod="openstack/rabbitmq-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.186372 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3f9ffb45-d2a6-4019-8194-99d39124f45a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f9ffb45-d2a6-4019-8194-99d39124f45a\") pod \"rabbitmq-server-0\" (UID: \"8341a841-5d4e-4824-9b74-255b401ab6e7\") " pod="openstack/rabbitmq-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.198745 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-8wg2h" event={"ID":"8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84","Type":"ContainerStarted","Data":"2d40726bd920209642b192cf6875be020f55785705779816a3d14e4444a90c14"} Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.199591 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-578wq" event={"ID":"da4134cd-3c68-4bd0-ac1f-a557903d42eb","Type":"ContainerStarted","Data":"3aa17afa719d76aa4a066fa0b48b2641d1eeba4ffa1b7ca1353586706bad8e31"} Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.274239 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.277483 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.279019 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.280701 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.283893 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.283993 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.284194 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.284280 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-tdtkt" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.298719 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.335683 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a6da80ce-3c4d-428f-a4df-de88071a59c9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6da80ce-3c4d-428f-a4df-de88071a59c9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.335801 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a6da80ce-3c4d-428f-a4df-de88071a59c9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6da80ce-3c4d-428f-a4df-de88071a59c9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.335840 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a6da80ce-3c4d-428f-a4df-de88071a59c9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6da80ce-3c4d-428f-a4df-de88071a59c9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.335868 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a6da80ce-3c4d-428f-a4df-de88071a59c9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6da80ce-3c4d-428f-a4df-de88071a59c9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.335895 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a6da80ce-3c4d-428f-a4df-de88071a59c9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6da80ce-3c4d-428f-a4df-de88071a59c9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.335918 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a6da80ce-3c4d-428f-a4df-de88071a59c9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6da80ce-3c4d-428f-a4df-de88071a59c9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 
14:12:01.335977 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f1882988-f9b6-4eda-96c1-fccccad7a728\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f1882988-f9b6-4eda-96c1-fccccad7a728\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6da80ce-3c4d-428f-a4df-de88071a59c9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.336008 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a6da80ce-3c4d-428f-a4df-de88071a59c9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6da80ce-3c4d-428f-a4df-de88071a59c9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.336034 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp6l2\" (UniqueName: \"kubernetes.io/projected/a6da80ce-3c4d-428f-a4df-de88071a59c9-kube-api-access-fp6l2\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6da80ce-3c4d-428f-a4df-de88071a59c9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.440517 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a6da80ce-3c4d-428f-a4df-de88071a59c9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6da80ce-3c4d-428f-a4df-de88071a59c9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.440879 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a6da80ce-3c4d-428f-a4df-de88071a59c9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6da80ce-3c4d-428f-a4df-de88071a59c9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.440907 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a6da80ce-3c4d-428f-a4df-de88071a59c9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6da80ce-3c4d-428f-a4df-de88071a59c9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.440931 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a6da80ce-3c4d-428f-a4df-de88071a59c9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6da80ce-3c4d-428f-a4df-de88071a59c9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.440945 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a6da80ce-3c4d-428f-a4df-de88071a59c9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6da80ce-3c4d-428f-a4df-de88071a59c9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.440996 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f1882988-f9b6-4eda-96c1-fccccad7a728\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f1882988-f9b6-4eda-96c1-fccccad7a728\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6da80ce-3c4d-428f-a4df-de88071a59c9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:12:01 
crc kubenswrapper[4962]: I1003 14:12:01.441020 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a6da80ce-3c4d-428f-a4df-de88071a59c9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6da80ce-3c4d-428f-a4df-de88071a59c9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.441040 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp6l2\" (UniqueName: \"kubernetes.io/projected/a6da80ce-3c4d-428f-a4df-de88071a59c9-kube-api-access-fp6l2\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6da80ce-3c4d-428f-a4df-de88071a59c9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.441260 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a6da80ce-3c4d-428f-a4df-de88071a59c9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6da80ce-3c4d-428f-a4df-de88071a59c9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.441537 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a6da80ce-3c4d-428f-a4df-de88071a59c9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6da80ce-3c4d-428f-a4df-de88071a59c9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.441589 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a6da80ce-3c4d-428f-a4df-de88071a59c9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6da80ce-3c4d-428f-a4df-de88071a59c9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.441969 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a6da80ce-3c4d-428f-a4df-de88071a59c9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6da80ce-3c4d-428f-a4df-de88071a59c9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.442242 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a6da80ce-3c4d-428f-a4df-de88071a59c9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6da80ce-3c4d-428f-a4df-de88071a59c9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.447306 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a6da80ce-3c4d-428f-a4df-de88071a59c9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6da80ce-3c4d-428f-a4df-de88071a59c9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.447313 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a6da80ce-3c4d-428f-a4df-de88071a59c9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6da80ce-3c4d-428f-a4df-de88071a59c9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.447383 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/a6da80ce-3c4d-428f-a4df-de88071a59c9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6da80ce-3c4d-428f-a4df-de88071a59c9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.448129 4962 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.448172 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f1882988-f9b6-4eda-96c1-fccccad7a728\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f1882988-f9b6-4eda-96c1-fccccad7a728\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6da80ce-3c4d-428f-a4df-de88071a59c9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/fb3c2bf5c2662e5352ab1cd205a1cf9337f3211f558b7a9f8e9b13225859cd04/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.460812 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp6l2\" (UniqueName: \"kubernetes.io/projected/a6da80ce-3c4d-428f-a4df-de88071a59c9-kube-api-access-fp6l2\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6da80ce-3c4d-428f-a4df-de88071a59c9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.481567 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f1882988-f9b6-4eda-96c1-fccccad7a728\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f1882988-f9b6-4eda-96c1-fccccad7a728\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6da80ce-3c4d-428f-a4df-de88071a59c9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.649143 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.650306 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.653468 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.654229 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.654503 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.654675 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.654692 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.655767 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-xsctj" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.658860 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.660575 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.737531 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.750314 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-97bf9196-d192-41b1-a4d3-af2e57f4521b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-97bf9196-d192-41b1-a4d3-af2e57f4521b\") pod \"openstack-galera-0\" (UID: \"f3e7cb42-0fc6-4aac-aada-41b2f760b5e8\") " pod="openstack/openstack-galera-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.750380 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f3e7cb42-0fc6-4aac-aada-41b2f760b5e8-kolla-config\") pod \"openstack-galera-0\" (UID: \"f3e7cb42-0fc6-4aac-aada-41b2f760b5e8\") " pod="openstack/openstack-galera-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.750400 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/f3e7cb42-0fc6-4aac-aada-41b2f760b5e8-secrets\") pod \"openstack-galera-0\" (UID: \"f3e7cb42-0fc6-4aac-aada-41b2f760b5e8\") " pod="openstack/openstack-galera-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.750435 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f3e7cb42-0fc6-4aac-aada-41b2f760b5e8-config-data-generated\") pod \"openstack-galera-0\" (UID: \"f3e7cb42-0fc6-4aac-aada-41b2f760b5e8\") " pod="openstack/openstack-galera-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.750509 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3e7cb42-0fc6-4aac-aada-41b2f760b5e8-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"f3e7cb42-0fc6-4aac-aada-41b2f760b5e8\") " pod="openstack/openstack-galera-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.750533 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phchv\" (UniqueName: \"kubernetes.io/projected/f3e7cb42-0fc6-4aac-aada-41b2f760b5e8-kube-api-access-phchv\") pod \"openstack-galera-0\" (UID: \"f3e7cb42-0fc6-4aac-aada-41b2f760b5e8\") " pod="openstack/openstack-galera-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.750559 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f3e7cb42-0fc6-4aac-aada-41b2f760b5e8-config-data-default\") pod \"openstack-galera-0\" (UID: \"f3e7cb42-0fc6-4aac-aada-41b2f760b5e8\") " pod="openstack/openstack-galera-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.750584 
4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3e7cb42-0fc6-4aac-aada-41b2f760b5e8-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f3e7cb42-0fc6-4aac-aada-41b2f760b5e8\") " pod="openstack/openstack-galera-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.750603 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3e7cb42-0fc6-4aac-aada-41b2f760b5e8-operator-scripts\") pod \"openstack-galera-0\" (UID: \"f3e7cb42-0fc6-4aac-aada-41b2f760b5e8\") " pod="openstack/openstack-galera-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.847935 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.849624 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.851449 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3e7cb42-0fc6-4aac-aada-41b2f760b5e8-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f3e7cb42-0fc6-4aac-aada-41b2f760b5e8\") " pod="openstack/openstack-galera-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.851505 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3e7cb42-0fc6-4aac-aada-41b2f760b5e8-operator-scripts\") pod \"openstack-galera-0\" (UID: \"f3e7cb42-0fc6-4aac-aada-41b2f760b5e8\") " pod="openstack/openstack-galera-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.851575 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-97bf9196-d192-41b1-a4d3-af2e57f4521b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-97bf9196-d192-41b1-a4d3-af2e57f4521b\") pod \"openstack-galera-0\" (UID: \"f3e7cb42-0fc6-4aac-aada-41b2f760b5e8\") " pod="openstack/openstack-galera-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.851618 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f3e7cb42-0fc6-4aac-aada-41b2f760b5e8-kolla-config\") pod \"openstack-galera-0\" (UID: \"f3e7cb42-0fc6-4aac-aada-41b2f760b5e8\") " pod="openstack/openstack-galera-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.851684 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/f3e7cb42-0fc6-4aac-aada-41b2f760b5e8-secrets\") pod \"openstack-galera-0\" (UID: \"f3e7cb42-0fc6-4aac-aada-41b2f760b5e8\") " pod="openstack/openstack-galera-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.851722 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f3e7cb42-0fc6-4aac-aada-41b2f760b5e8-config-data-generated\") pod \"openstack-galera-0\" (UID: \"f3e7cb42-0fc6-4aac-aada-41b2f760b5e8\") " pod="openstack/openstack-galera-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.851778 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3e7cb42-0fc6-4aac-aada-41b2f760b5e8-combined-ca-bundle\") pod 
\"openstack-galera-0\" (UID: \"f3e7cb42-0fc6-4aac-aada-41b2f760b5e8\") " pod="openstack/openstack-galera-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.851805 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phchv\" (UniqueName: \"kubernetes.io/projected/f3e7cb42-0fc6-4aac-aada-41b2f760b5e8-kube-api-access-phchv\") pod \"openstack-galera-0\" (UID: \"f3e7cb42-0fc6-4aac-aada-41b2f760b5e8\") " pod="openstack/openstack-galera-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.851830 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f3e7cb42-0fc6-4aac-aada-41b2f760b5e8-config-data-default\") pod \"openstack-galera-0\" (UID: \"f3e7cb42-0fc6-4aac-aada-41b2f760b5e8\") " pod="openstack/openstack-galera-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.852961 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f3e7cb42-0fc6-4aac-aada-41b2f760b5e8-config-data-default\") pod \"openstack-galera-0\" (UID: \"f3e7cb42-0fc6-4aac-aada-41b2f760b5e8\") " pod="openstack/openstack-galera-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.858167 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3e7cb42-0fc6-4aac-aada-41b2f760b5e8-operator-scripts\") pod \"openstack-galera-0\" (UID: \"f3e7cb42-0fc6-4aac-aada-41b2f760b5e8\") " pod="openstack/openstack-galera-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.858304 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.858958 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/f3e7cb42-0fc6-4aac-aada-41b2f760b5e8-secrets\") pod \"openstack-galera-0\" (UID: \"f3e7cb42-0fc6-4aac-aada-41b2f760b5e8\") " pod="openstack/openstack-galera-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.859688 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3e7cb42-0fc6-4aac-aada-41b2f760b5e8-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f3e7cb42-0fc6-4aac-aada-41b2f760b5e8\") " pod="openstack/openstack-galera-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.860042 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f3e7cb42-0fc6-4aac-aada-41b2f760b5e8-config-data-generated\") pod \"openstack-galera-0\" (UID: \"f3e7cb42-0fc6-4aac-aada-41b2f760b5e8\") " pod="openstack/openstack-galera-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.860305 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f3e7cb42-0fc6-4aac-aada-41b2f760b5e8-kolla-config\") pod \"openstack-galera-0\" (UID: \"f3e7cb42-0fc6-4aac-aada-41b2f760b5e8\") " pod="openstack/openstack-galera-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.863857 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-9k972" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.865342 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/f3e7cb42-0fc6-4aac-aada-41b2f760b5e8-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"f3e7cb42-0fc6-4aac-aada-41b2f760b5e8\") " pod="openstack/openstack-galera-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.865839 4962 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.866163 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-97bf9196-d192-41b1-a4d3-af2e57f4521b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-97bf9196-d192-41b1-a4d3-af2e57f4521b\") pod \"openstack-galera-0\" (UID: \"f3e7cb42-0fc6-4aac-aada-41b2f760b5e8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4d96f5f0fa3bd901bbfb4b1f63a159bccac99e27ba6a65ea7da43f2c5bcfa583/globalmount\"" pod="openstack/openstack-galera-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.866314 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.889039 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phchv\" (UniqueName: \"kubernetes.io/projected/f3e7cb42-0fc6-4aac-aada-41b2f760b5e8-kube-api-access-phchv\") pod \"openstack-galera-0\" (UID: \"f3e7cb42-0fc6-4aac-aada-41b2f760b5e8\") " pod="openstack/openstack-galera-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.925653 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-97bf9196-d192-41b1-a4d3-af2e57f4521b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-97bf9196-d192-41b1-a4d3-af2e57f4521b\") pod \"openstack-galera-0\" (UID: \"f3e7cb42-0fc6-4aac-aada-41b2f760b5e8\") " pod="openstack/openstack-galera-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.953086 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/971e90e7-4cf4-4259-96f9-4ddd7aaec693-config-data\") pod \"memcached-0\" (UID: \"971e90e7-4cf4-4259-96f9-4ddd7aaec693\") " pod="openstack/memcached-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.953146 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz9sn\" (UniqueName: \"kubernetes.io/projected/971e90e7-4cf4-4259-96f9-4ddd7aaec693-kube-api-access-lz9sn\") pod \"memcached-0\" (UID: \"971e90e7-4cf4-4259-96f9-4ddd7aaec693\") " pod="openstack/memcached-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.953211 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/971e90e7-4cf4-4259-96f9-4ddd7aaec693-kolla-config\") pod \"memcached-0\" (UID: \"971e90e7-4cf4-4259-96f9-4ddd7aaec693\") " pod="openstack/memcached-0" Oct 03 14:12:01 crc kubenswrapper[4962]: I1003 14:12:01.967309 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 03 14:12:02 crc kubenswrapper[4962]: I1003 14:12:02.054806 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/971e90e7-4cf4-4259-96f9-4ddd7aaec693-config-data\") pod \"memcached-0\" (UID: \"971e90e7-4cf4-4259-96f9-4ddd7aaec693\") " pod="openstack/memcached-0" Oct 03 14:12:02 crc kubenswrapper[4962]: I1003 14:12:02.054862 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lz9sn\" (UniqueName: \"kubernetes.io/projected/971e90e7-4cf4-4259-96f9-4ddd7aaec693-kube-api-access-lz9sn\") pod \"memcached-0\" (UID: \"971e90e7-4cf4-4259-96f9-4ddd7aaec693\") " pod="openstack/memcached-0" Oct 03 14:12:02 crc kubenswrapper[4962]: I1003 14:12:02.054899 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/971e90e7-4cf4-4259-96f9-4ddd7aaec693-kolla-config\") pod \"memcached-0\" (UID: \"971e90e7-4cf4-4259-96f9-4ddd7aaec693\") " pod="openstack/memcached-0" Oct 03 14:12:02 crc kubenswrapper[4962]: I1003 14:12:02.055920 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/971e90e7-4cf4-4259-96f9-4ddd7aaec693-kolla-config\") pod \"memcached-0\" (UID: \"971e90e7-4cf4-4259-96f9-4ddd7aaec693\") " pod="openstack/memcached-0" Oct 03 14:12:02 crc kubenswrapper[4962]: I1003 14:12:02.056376 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/971e90e7-4cf4-4259-96f9-4ddd7aaec693-config-data\") pod \"memcached-0\" (UID: \"971e90e7-4cf4-4259-96f9-4ddd7aaec693\") " pod="openstack/memcached-0" Oct 03 14:12:02 crc kubenswrapper[4962]: I1003 14:12:02.077440 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz9sn\" (UniqueName: \"kubernetes.io/projected/971e90e7-4cf4-4259-96f9-4ddd7aaec693-kube-api-access-lz9sn\") pod \"memcached-0\" (UID: \"971e90e7-4cf4-4259-96f9-4ddd7aaec693\") " pod="openstack/memcached-0" Oct 03 14:12:02 crc kubenswrapper[4962]: I1003 14:12:02.118296 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sfvkx" Oct 03 14:12:02 crc kubenswrapper[4962]: I1003 14:12:02.118696 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sfvkx" Oct 03 14:12:02 crc kubenswrapper[4962]: I1003 14:12:02.133270 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 14:12:02 crc kubenswrapper[4962]: I1003 14:12:02.177441 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sfvkx" Oct 03 14:12:02 crc kubenswrapper[4962]: I1003 14:12:02.215365 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Oct 03 14:12:02 crc kubenswrapper[4962]: I1003 14:12:02.242797 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8341a841-5d4e-4824-9b74-255b401ab6e7","Type":"ContainerStarted","Data":"c3bbd038228836ed7da141247521304e6c5cdfd07a772c2684c3ba3782867fd8"} Oct 03 14:12:02 crc kubenswrapper[4962]: I1003 14:12:02.243109 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a6da80ce-3c4d-428f-a4df-de88071a59c9","Type":"ContainerStarted","Data":"f9b188f6af0809abecf439937d21624b787c3903f2a832362286de7adae97c7e"} Oct 03 14:12:02 crc kubenswrapper[4962]: I1003 14:12:02.251895 4962 generic.go:334] "Generic (PLEG): container finished" podID="da4134cd-3c68-4bd0-ac1f-a557903d42eb" containerID="27658884c2ef6ed1f599e3ee8890187903f3cccfe8be86f25330f24e02687770" exitCode=0 Oct 03 14:12:02 crc kubenswrapper[4962]: I1003 14:12:02.251980 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-578wq" event={"ID":"da4134cd-3c68-4bd0-ac1f-a557903d42eb","Type":"ContainerDied","Data":"27658884c2ef6ed1f599e3ee8890187903f3cccfe8be86f25330f24e02687770"} Oct 03 14:12:02 crc kubenswrapper[4962]: I1003 14:12:02.256576 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-8wg2h" event={"ID":"8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84","Type":"ContainerStarted","Data":"26a6e21e82f698a78f8c46ef81e4dc617213fbff4c43481906fe3b35c4725dc2"} Oct 03 14:12:02 crc kubenswrapper[4962]: I1003 14:12:02.324529 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sfvkx" Oct 03 14:12:02 crc kubenswrapper[4962]: I1003 14:12:02.421495 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sfvkx"] Oct 03 14:12:02 crc kubenswrapper[4962]: I1003 14:12:02.445745 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 03 14:12:02 crc kubenswrapper[4962]: I1003 14:12:02.682732 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 03 14:12:02 crc kubenswrapper[4962]: W1003 14:12:02.696856 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod971e90e7_4cf4_4259_96f9_4ddd7aaec693.slice/crio-3711afd1a6c4b3336c5802e739a139700de053e4587db02c92e7c628ff02baca WatchSource:0}: Error finding container 3711afd1a6c4b3336c5802e739a139700de053e4587db02c92e7c628ff02baca: Status 404 returned error can't find the container with id 3711afd1a6c4b3336c5802e739a139700de053e4587db02c92e7c628ff02baca Oct 03 14:12:02 crc kubenswrapper[4962]: I1003 14:12:02.915685 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 03 14:12:02 crc kubenswrapper[4962]: I1003 14:12:02.923255 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 03 14:12:02 crc kubenswrapper[4962]: I1003 14:12:02.923905 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 03 14:12:02 crc kubenswrapper[4962]: I1003 14:12:02.927325 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 03 14:12:02 crc kubenswrapper[4962]: I1003 14:12:02.927470 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-6mtjz" Oct 03 14:12:02 crc kubenswrapper[4962]: I1003 14:12:02.927593 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 03 14:12:02 crc kubenswrapper[4962]: I1003 14:12:02.927721 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 03 14:12:03 crc kubenswrapper[4962]: I1003 14:12:03.068936 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql8mc\" (UniqueName: \"kubernetes.io/projected/19d8d7bb-0299-4cc9-95d6-956eec32d04a-kube-api-access-ql8mc\") pod \"openstack-cell1-galera-0\" (UID: \"19d8d7bb-0299-4cc9-95d6-956eec32d04a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:12:03 crc kubenswrapper[4962]: I1003 14:12:03.068999 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/19d8d7bb-0299-4cc9-95d6-956eec32d04a-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"19d8d7bb-0299-4cc9-95d6-956eec32d04a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:12:03 crc kubenswrapper[4962]: I1003 14:12:03.069025 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/19d8d7bb-0299-4cc9-95d6-956eec32d04a-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"19d8d7bb-0299-4cc9-95d6-956eec32d04a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:12:03 crc kubenswrapper[4962]: I1003 14:12:03.069191 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/19d8d7bb-0299-4cc9-95d6-956eec32d04a-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"19d8d7bb-0299-4cc9-95d6-956eec32d04a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:12:03 crc kubenswrapper[4962]: I1003 14:12:03.069372 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19d8d7bb-0299-4cc9-95d6-956eec32d04a-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"19d8d7bb-0299-4cc9-95d6-956eec32d04a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:12:03 crc kubenswrapper[4962]: I1003 14:12:03.069407 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/19d8d7bb-0299-4cc9-95d6-956eec32d04a-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"19d8d7bb-0299-4cc9-95d6-956eec32d04a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:12:03 crc kubenswrapper[4962]: I1003 14:12:03.069509 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" 
(UniqueName: \"kubernetes.io/secret/19d8d7bb-0299-4cc9-95d6-956eec32d04a-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"19d8d7bb-0299-4cc9-95d6-956eec32d04a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:12:03 crc kubenswrapper[4962]: I1003 14:12:03.069611 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19d8d7bb-0299-4cc9-95d6-956eec32d04a-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"19d8d7bb-0299-4cc9-95d6-956eec32d04a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:12:03 crc kubenswrapper[4962]: I1003 14:12:03.069654 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-403aabcc-a78a-47ff-aeeb-1c65ecfbee25\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-403aabcc-a78a-47ff-aeeb-1c65ecfbee25\") pod \"openstack-cell1-galera-0\" (UID: \"19d8d7bb-0299-4cc9-95d6-956eec32d04a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:12:03 crc kubenswrapper[4962]: I1003 14:12:03.171621 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/19d8d7bb-0299-4cc9-95d6-956eec32d04a-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"19d8d7bb-0299-4cc9-95d6-956eec32d04a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:12:03 crc kubenswrapper[4962]: I1003 14:12:03.171779 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19d8d7bb-0299-4cc9-95d6-956eec32d04a-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"19d8d7bb-0299-4cc9-95d6-956eec32d04a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:12:03 crc kubenswrapper[4962]: I1003 14:12:03.171821 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-403aabcc-a78a-47ff-aeeb-1c65ecfbee25\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-403aabcc-a78a-47ff-aeeb-1c65ecfbee25\") pod \"openstack-cell1-galera-0\" (UID: \"19d8d7bb-0299-4cc9-95d6-956eec32d04a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:12:03 crc kubenswrapper[4962]: I1003 14:12:03.171886 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql8mc\" (UniqueName: \"kubernetes.io/projected/19d8d7bb-0299-4cc9-95d6-956eec32d04a-kube-api-access-ql8mc\") pod \"openstack-cell1-galera-0\" (UID: \"19d8d7bb-0299-4cc9-95d6-956eec32d04a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:12:03 crc kubenswrapper[4962]: I1003 14:12:03.171937 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/19d8d7bb-0299-4cc9-95d6-956eec32d04a-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"19d8d7bb-0299-4cc9-95d6-956eec32d04a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:12:03 crc kubenswrapper[4962]: I1003 14:12:03.171971 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/19d8d7bb-0299-4cc9-95d6-956eec32d04a-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"19d8d7bb-0299-4cc9-95d6-956eec32d04a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:12:03 crc kubenswrapper[4962]: I1003 14:12:03.172020 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-default\" (UniqueName: \"kubernetes.io/configmap/19d8d7bb-0299-4cc9-95d6-956eec32d04a-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"19d8d7bb-0299-4cc9-95d6-956eec32d04a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:12:03 crc kubenswrapper[4962]: I1003 14:12:03.172095 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19d8d7bb-0299-4cc9-95d6-956eec32d04a-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"19d8d7bb-0299-4cc9-95d6-956eec32d04a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:12:03 crc kubenswrapper[4962]: I1003 14:12:03.172140 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/19d8d7bb-0299-4cc9-95d6-956eec32d04a-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"19d8d7bb-0299-4cc9-95d6-956eec32d04a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:12:03 crc kubenswrapper[4962]: I1003 14:12:03.174539 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/19d8d7bb-0299-4cc9-95d6-956eec32d04a-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"19d8d7bb-0299-4cc9-95d6-956eec32d04a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:12:03 crc kubenswrapper[4962]: I1003 14:12:03.174892 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/19d8d7bb-0299-4cc9-95d6-956eec32d04a-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"19d8d7bb-0299-4cc9-95d6-956eec32d04a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:12:03 crc kubenswrapper[4962]: I1003 14:12:03.175507 4962 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 03 14:12:03 crc kubenswrapper[4962]: I1003 14:12:03.175537 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-403aabcc-a78a-47ff-aeeb-1c65ecfbee25\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-403aabcc-a78a-47ff-aeeb-1c65ecfbee25\") pod \"openstack-cell1-galera-0\" (UID: \"19d8d7bb-0299-4cc9-95d6-956eec32d04a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1c8bacee68458ceaaa8282b31c499697084aefedf2353784dc2e38d72f1554a2/globalmount\"" pod="openstack/openstack-cell1-galera-0" Oct 03 14:12:03 crc kubenswrapper[4962]: I1003 14:12:03.175865 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/19d8d7bb-0299-4cc9-95d6-956eec32d04a-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"19d8d7bb-0299-4cc9-95d6-956eec32d04a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:12:03 crc kubenswrapper[4962]: I1003 14:12:03.176366 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19d8d7bb-0299-4cc9-95d6-956eec32d04a-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"19d8d7bb-0299-4cc9-95d6-956eec32d04a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:12:03 crc kubenswrapper[4962]: I1003 14:12:03.179099 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/19d8d7bb-0299-4cc9-95d6-956eec32d04a-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"19d8d7bb-0299-4cc9-95d6-956eec32d04a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:12:03 crc kubenswrapper[4962]: I1003 14:12:03.181128 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/19d8d7bb-0299-4cc9-95d6-956eec32d04a-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"19d8d7bb-0299-4cc9-95d6-956eec32d04a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:12:03 crc kubenswrapper[4962]: I1003 14:12:03.181286 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19d8d7bb-0299-4cc9-95d6-956eec32d04a-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"19d8d7bb-0299-4cc9-95d6-956eec32d04a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:12:03 crc kubenswrapper[4962]: I1003 14:12:03.212204 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql8mc\" (UniqueName: \"kubernetes.io/projected/19d8d7bb-0299-4cc9-95d6-956eec32d04a-kube-api-access-ql8mc\") pod \"openstack-cell1-galera-0\" (UID: \"19d8d7bb-0299-4cc9-95d6-956eec32d04a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:12:03 crc kubenswrapper[4962]: I1003 14:12:03.221229 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-403aabcc-a78a-47ff-aeeb-1c65ecfbee25\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-403aabcc-a78a-47ff-aeeb-1c65ecfbee25\") pod \"openstack-cell1-galera-0\" (UID: \"19d8d7bb-0299-4cc9-95d6-956eec32d04a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 14:12:03 crc kubenswrapper[4962]: I1003 14:12:03.257489 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 03 14:12:03 crc kubenswrapper[4962]: I1003 14:12:03.265948 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f3e7cb42-0fc6-4aac-aada-41b2f760b5e8","Type":"ContainerStarted","Data":"6e04a9badba1d5d8e5d9a5e6653037265deea12b8eb6b238ea04313d55a1b381"} Oct 03 14:12:03 crc kubenswrapper[4962]: I1003 14:12:03.267765 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"971e90e7-4cf4-4259-96f9-4ddd7aaec693","Type":"ContainerStarted","Data":"3711afd1a6c4b3336c5802e739a139700de053e4587db02c92e7c628ff02baca"} Oct 03 14:12:03 crc kubenswrapper[4962]: I1003 14:12:03.697268 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 03 14:12:04 crc kubenswrapper[4962]: I1003 14:12:04.275937 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"19d8d7bb-0299-4cc9-95d6-956eec32d04a","Type":"ContainerStarted","Data":"041bfc139b9fe951d09f29be4001bf8d2a2e673d6827bd85b16d484338081113"} Oct 03 14:12:04 crc kubenswrapper[4962]: I1003 14:12:04.276119 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-sfvkx" podUID="c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60" containerName="registry-server" containerID="cri-o://be5d192f6fabf08f3a99d9c37582771cc1ff56f28a42198164aa8b1dcd122307" gracePeriod=2 Oct 03 14:12:14 crc kubenswrapper[4962]: I1003 14:12:12.041743 4962 generic.go:334] "Generic (PLEG): container finished" podID="c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60" containerID="be5d192f6fabf08f3a99d9c37582771cc1ff56f28a42198164aa8b1dcd122307" exitCode=0 Oct 03 14:12:14 crc kubenswrapper[4962]: I1003 14:12:12.041813 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sfvkx" event={"ID":"c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60","Type":"ContainerDied","Data":"be5d192f6fabf08f3a99d9c37582771cc1ff56f28a42198164aa8b1dcd122307"} Oct 03 14:12:14 crc kubenswrapper[4962]: I1003 14:12:12.069702 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sfvkx" Oct 03 14:12:14 crc kubenswrapper[4962]: I1003 14:12:12.129178 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69d69\" (UniqueName: \"kubernetes.io/projected/c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60-kube-api-access-69d69\") pod \"c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60\" (UID: \"c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60\") " Oct 03 14:12:14 crc kubenswrapper[4962]: I1003 14:12:12.129271 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60-catalog-content\") pod \"c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60\" (UID: \"c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60\") " Oct 03 14:12:14 crc kubenswrapper[4962]: I1003 14:12:12.129450 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60-utilities\") pod \"c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60\" (UID: \"c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60\") " Oct 03 14:12:14 crc kubenswrapper[4962]: I1003 14:12:12.130595 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60-utilities" (OuterVolumeSpecName: "utilities") pod "c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60" (UID: "c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:12:14 crc kubenswrapper[4962]: I1003 14:12:12.137920 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60-kube-api-access-69d69" (OuterVolumeSpecName: "kube-api-access-69d69") pod "c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60" (UID: "c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60"). InnerVolumeSpecName "kube-api-access-69d69". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:12:14 crc kubenswrapper[4962]: I1003 14:12:12.231241 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 14:12:14 crc kubenswrapper[4962]: I1003 14:12:12.231269 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69d69\" (UniqueName: \"kubernetes.io/projected/c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60-kube-api-access-69d69\") on node \"crc\" DevicePath \"\"" Oct 03 14:12:14 crc kubenswrapper[4962]: I1003 14:12:13.053282 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"971e90e7-4cf4-4259-96f9-4ddd7aaec693","Type":"ContainerStarted","Data":"b7eb6e2a12d70e5af8b80b62f65c901655145e8df30ced858e34ca6a515da0e9"} Oct 03 14:12:14 crc kubenswrapper[4962]: I1003 14:12:13.054324 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 03 14:12:14 crc kubenswrapper[4962]: I1003 14:12:13.055009 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"19d8d7bb-0299-4cc9-95d6-956eec32d04a","Type":"ContainerStarted","Data":"c77c7dd6a080e78acd9740bf3801587d5e988ff5abb2b360179107d4e179852e"} Oct 03 14:12:14 crc kubenswrapper[4962]: I1003 14:12:13.057231 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sfvkx" event={"ID":"c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60","Type":"ContainerDied","Data":"44ee85c61cb5fd5b5bec989ec4d15639926bdf3e3f2d43077e6b045d383efe09"} Oct 03 14:12:14 crc kubenswrapper[4962]: I1003 14:12:13.057285 4962 scope.go:117] "RemoveContainer" containerID="be5d192f6fabf08f3a99d9c37582771cc1ff56f28a42198164aa8b1dcd122307" Oct 03 14:12:14 crc kubenswrapper[4962]: I1003 14:12:13.057416 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sfvkx" Oct 03 14:12:14 crc kubenswrapper[4962]: I1003 14:12:13.058744 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f3e7cb42-0fc6-4aac-aada-41b2f760b5e8","Type":"ContainerStarted","Data":"8d2f3c5ac2591a764cbd2e11b2be6a384e9cbab99631559f6eff74e7b31051b1"} Oct 03 14:12:14 crc kubenswrapper[4962]: I1003 14:12:13.061550 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-578wq" event={"ID":"da4134cd-3c68-4bd0-ac1f-a557903d42eb","Type":"ContainerStarted","Data":"9f033bed9a1fd4ddb18e3cd76e05198795a520ebfe9e86967bf100eea1c69b4a"} Oct 03 14:12:14 crc kubenswrapper[4962]: I1003 14:12:13.061788 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d7b5456f5-578wq" Oct 03 14:12:14 crc kubenswrapper[4962]: I1003 14:12:13.064920 4962 generic.go:334] "Generic (PLEG): container finished" podID="8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84" containerID="26a6e21e82f698a78f8c46ef81e4dc617213fbff4c43481906fe3b35c4725dc2" exitCode=0 Oct 03 14:12:14 crc kubenswrapper[4962]: I1003 14:12:13.065013 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-8wg2h" event={"ID":"8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84","Type":"ContainerDied","Data":"26a6e21e82f698a78f8c46ef81e4dc617213fbff4c43481906fe3b35c4725dc2"} Oct 03 14:12:14 crc kubenswrapper[4962]: I1003 14:12:13.072355 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8341a841-5d4e-4824-9b74-255b401ab6e7","Type":"ContainerStarted","Data":"1f4303a01b3531e3b467aa2a7310efed4c6d8e9f7fe94bb80bf996f70bc2356f"} Oct 03 14:12:14 crc kubenswrapper[4962]: I1003 14:12:13.075932 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a6da80ce-3c4d-428f-a4df-de88071a59c9","Type":"ContainerStarted","Data":"27fd5f7561a149fa78c429ce0f79d7c33fe73a8a665d87bd6bcb28a783b4b2c3"} Oct 03 14:12:14 crc kubenswrapper[4962]: I1003 14:12:13.080200 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=12.080179598 podStartE2EDuration="12.080179598s" podCreationTimestamp="2025-10-03 14:12:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:12:13.068862535 +0000 UTC m=+4941.472760390" watchObservedRunningTime="2025-10-03 14:12:13.080179598 +0000 UTC m=+4941.484077433" Oct 03 14:12:14 crc kubenswrapper[4962]: I1003 14:12:13.107134 4962 scope.go:117] "RemoveContainer" containerID="6e6eb12231b7ddc49cc50220e8c17f616b62f0c3720ce67cf7491996050707f4" Oct 03 14:12:14 crc kubenswrapper[4962]: I1003 14:12:13.117226 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d7b5456f5-578wq" podStartSLOduration=14.11720689 podStartE2EDuration="14.11720689s" podCreationTimestamp="2025-10-03 14:11:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:12:13.116416779 +0000 UTC m=+4941.520314614" watchObservedRunningTime="2025-10-03 14:12:13.11720689 +0000 UTC m=+4941.521104715" Oct 03 14:12:14 crc kubenswrapper[4962]: I1003 14:12:13.178796 4962 scope.go:117] "RemoveContainer" containerID="b5d9060deffa1fc43b18fb4596d4ec8c92332da2a798a10a7fdd9b26a2e5cc47" Oct 03 14:12:14 crc 
Oct 03 14:12:14 crc kubenswrapper[4962]: I1003 14:12:13.631194 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60" (UID: "c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:12:14 crc kubenswrapper[4962]: I1003 14:12:13.655974 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 03 14:12:14 crc kubenswrapper[4962]: I1003 14:12:13.690268 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sfvkx"]
Oct 03 14:12:14 crc kubenswrapper[4962]: I1003 14:12:13.696796 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sfvkx"]
Oct 03 14:12:14 crc kubenswrapper[4962]: I1003 14:12:14.087753 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-8wg2h" event={"ID":"8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84","Type":"ContainerStarted","Data":"f4c9068742b32f2bf27d65e6210806f54d38663209e7fb73338fb527643c207a"}
Oct 03 14:12:14 crc kubenswrapper[4962]: I1003 14:12:14.150839 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-98ddfc8f-8wg2h" podStartSLOduration=14.150822799 podStartE2EDuration="14.150822799s" podCreationTimestamp="2025-10-03 14:12:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:12:14.144482959 +0000 UTC m=+4942.548380814" watchObservedRunningTime="2025-10-03 14:12:14.150822799 +0000 UTC m=+4942.554720634"
Oct 03 14:12:14 crc kubenswrapper[4962]: I1003 14:12:14.234840 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60" path="/var/lib/kubelet/pods/c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60/volumes"
Oct 03 14:12:15 crc kubenswrapper[4962]: I1003 14:12:15.099755 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-98ddfc8f-8wg2h"
Oct 03 14:12:15 crc kubenswrapper[4962]: I1003 14:12:15.104861 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d7b5456f5-578wq"
Oct 03 14:12:17 crc kubenswrapper[4962]: I1003 14:12:17.218817 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Oct 03 14:12:20 crc kubenswrapper[4962]: I1003 14:12:20.503622 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-98ddfc8f-8wg2h"
Oct 03 14:12:20 crc kubenswrapper[4962]: I1003 14:12:20.558481 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-578wq"]
Oct 03 14:12:20 crc kubenswrapper[4962]: I1003 14:12:20.558774 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d7b5456f5-578wq" podUID="da4134cd-3c68-4bd0-ac1f-a557903d42eb" containerName="dnsmasq-dns" containerID="cri-o://9f033bed9a1fd4ddb18e3cd76e05198795a520ebfe9e86967bf100eea1c69b4a" gracePeriod=10
Oct 03 14:12:21 crc kubenswrapper[4962]: I1003 14:12:21.008240 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-578wq"
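
The SyncLoop DELETE above arrives from the API server; the kubelet then kills the dnsmasq-dns container with the pod's effective grace period (10 s here, whether from the pod spec's terminationGracePeriodSeconds or a DeleteOptions override). For reference, a sketch of issuing such a delete with client-go; the kubeconfig path is illustrative:

    package main

    import (
        "context"
        "log"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig") // illustrative
        if err != nil {
            log.Fatal(err)
        }
        cs, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            log.Fatal(err)
        }
        grace := int64(10) // matches gracePeriod=10 in the kill message above
        if err := cs.CoreV1().Pods("openstack").Delete(context.Background(),
            "dnsmasq-dns-5d7b5456f5-578wq",
            metav1.DeleteOptions{GracePeriodSeconds: &grace}); err != nil {
            log.Fatal(err)
        }
    }
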
Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-578wq" Oct 03 14:12:21 crc kubenswrapper[4962]: I1003 14:12:21.066018 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da4134cd-3c68-4bd0-ac1f-a557903d42eb-config\") pod \"da4134cd-3c68-4bd0-ac1f-a557903d42eb\" (UID: \"da4134cd-3c68-4bd0-ac1f-a557903d42eb\") " Oct 03 14:12:21 crc kubenswrapper[4962]: I1003 14:12:21.066074 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4p55b\" (UniqueName: \"kubernetes.io/projected/da4134cd-3c68-4bd0-ac1f-a557903d42eb-kube-api-access-4p55b\") pod \"da4134cd-3c68-4bd0-ac1f-a557903d42eb\" (UID: \"da4134cd-3c68-4bd0-ac1f-a557903d42eb\") " Oct 03 14:12:21 crc kubenswrapper[4962]: I1003 14:12:21.066119 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da4134cd-3c68-4bd0-ac1f-a557903d42eb-dns-svc\") pod \"da4134cd-3c68-4bd0-ac1f-a557903d42eb\" (UID: \"da4134cd-3c68-4bd0-ac1f-a557903d42eb\") " Oct 03 14:12:21 crc kubenswrapper[4962]: I1003 14:12:21.071409 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da4134cd-3c68-4bd0-ac1f-a557903d42eb-kube-api-access-4p55b" (OuterVolumeSpecName: "kube-api-access-4p55b") pod "da4134cd-3c68-4bd0-ac1f-a557903d42eb" (UID: "da4134cd-3c68-4bd0-ac1f-a557903d42eb"). InnerVolumeSpecName "kube-api-access-4p55b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:12:21 crc kubenswrapper[4962]: I1003 14:12:21.107955 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da4134cd-3c68-4bd0-ac1f-a557903d42eb-config" (OuterVolumeSpecName: "config") pod "da4134cd-3c68-4bd0-ac1f-a557903d42eb" (UID: "da4134cd-3c68-4bd0-ac1f-a557903d42eb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:12:21 crc kubenswrapper[4962]: I1003 14:12:21.109977 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da4134cd-3c68-4bd0-ac1f-a557903d42eb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "da4134cd-3c68-4bd0-ac1f-a557903d42eb" (UID: "da4134cd-3c68-4bd0-ac1f-a557903d42eb"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:12:21 crc kubenswrapper[4962]: I1003 14:12:21.148685 4962 generic.go:334] "Generic (PLEG): container finished" podID="f3e7cb42-0fc6-4aac-aada-41b2f760b5e8" containerID="8d2f3c5ac2591a764cbd2e11b2be6a384e9cbab99631559f6eff74e7b31051b1" exitCode=0 Oct 03 14:12:21 crc kubenswrapper[4962]: I1003 14:12:21.148770 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f3e7cb42-0fc6-4aac-aada-41b2f760b5e8","Type":"ContainerDied","Data":"8d2f3c5ac2591a764cbd2e11b2be6a384e9cbab99631559f6eff74e7b31051b1"} Oct 03 14:12:21 crc kubenswrapper[4962]: I1003 14:12:21.151340 4962 generic.go:334] "Generic (PLEG): container finished" podID="da4134cd-3c68-4bd0-ac1f-a557903d42eb" containerID="9f033bed9a1fd4ddb18e3cd76e05198795a520ebfe9e86967bf100eea1c69b4a" exitCode=0 Oct 03 14:12:21 crc kubenswrapper[4962]: I1003 14:12:21.151407 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-578wq" event={"ID":"da4134cd-3c68-4bd0-ac1f-a557903d42eb","Type":"ContainerDied","Data":"9f033bed9a1fd4ddb18e3cd76e05198795a520ebfe9e86967bf100eea1c69b4a"} Oct 03 14:12:21 crc kubenswrapper[4962]: I1003 14:12:21.151437 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-578wq" event={"ID":"da4134cd-3c68-4bd0-ac1f-a557903d42eb","Type":"ContainerDied","Data":"3aa17afa719d76aa4a066fa0b48b2641d1eeba4ffa1b7ca1353586706bad8e31"} Oct 03 14:12:21 crc kubenswrapper[4962]: I1003 14:12:21.151455 4962 scope.go:117] "RemoveContainer" containerID="9f033bed9a1fd4ddb18e3cd76e05198795a520ebfe9e86967bf100eea1c69b4a" Oct 03 14:12:21 crc kubenswrapper[4962]: I1003 14:12:21.151545 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-578wq" Oct 03 14:12:21 crc kubenswrapper[4962]: I1003 14:12:21.156177 4962 generic.go:334] "Generic (PLEG): container finished" podID="19d8d7bb-0299-4cc9-95d6-956eec32d04a" containerID="c77c7dd6a080e78acd9740bf3801587d5e988ff5abb2b360179107d4e179852e" exitCode=0 Oct 03 14:12:21 crc kubenswrapper[4962]: I1003 14:12:21.156321 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"19d8d7bb-0299-4cc9-95d6-956eec32d04a","Type":"ContainerDied","Data":"c77c7dd6a080e78acd9740bf3801587d5e988ff5abb2b360179107d4e179852e"} Oct 03 14:12:21 crc kubenswrapper[4962]: I1003 14:12:21.167529 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da4134cd-3c68-4bd0-ac1f-a557903d42eb-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:12:21 crc kubenswrapper[4962]: I1003 14:12:21.167563 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4p55b\" (UniqueName: \"kubernetes.io/projected/da4134cd-3c68-4bd0-ac1f-a557903d42eb-kube-api-access-4p55b\") on node \"crc\" DevicePath \"\"" Oct 03 14:12:21 crc kubenswrapper[4962]: I1003 14:12:21.167576 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da4134cd-3c68-4bd0-ac1f-a557903d42eb-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 14:12:21 crc kubenswrapper[4962]: I1003 14:12:21.180536 4962 scope.go:117] "RemoveContainer" containerID="27658884c2ef6ed1f599e3ee8890187903f3cccfe8be86f25330f24e02687770" Oct 03 14:12:21 crc kubenswrapper[4962]: I1003 14:12:21.211581 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-578wq"] Oct 03 14:12:21 crc kubenswrapper[4962]: I1003 14:12:21.215034 4962 scope.go:117] "RemoveContainer" containerID="9f033bed9a1fd4ddb18e3cd76e05198795a520ebfe9e86967bf100eea1c69b4a" Oct 03 14:12:21 crc kubenswrapper[4962]: E1003 14:12:21.215394 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f033bed9a1fd4ddb18e3cd76e05198795a520ebfe9e86967bf100eea1c69b4a\": container with ID starting with 9f033bed9a1fd4ddb18e3cd76e05198795a520ebfe9e86967bf100eea1c69b4a not found: ID does not exist" containerID="9f033bed9a1fd4ddb18e3cd76e05198795a520ebfe9e86967bf100eea1c69b4a" Oct 03 14:12:21 crc kubenswrapper[4962]: I1003 14:12:21.215445 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f033bed9a1fd4ddb18e3cd76e05198795a520ebfe9e86967bf100eea1c69b4a"} err="failed to get container status \"9f033bed9a1fd4ddb18e3cd76e05198795a520ebfe9e86967bf100eea1c69b4a\": rpc error: code = NotFound desc = could not find container \"9f033bed9a1fd4ddb18e3cd76e05198795a520ebfe9e86967bf100eea1c69b4a\": container with ID starting with 9f033bed9a1fd4ddb18e3cd76e05198795a520ebfe9e86967bf100eea1c69b4a not found: ID does not exist" Oct 03 14:12:21 crc kubenswrapper[4962]: I1003 14:12:21.215473 4962 scope.go:117] "RemoveContainer" containerID="27658884c2ef6ed1f599e3ee8890187903f3cccfe8be86f25330f24e02687770" Oct 03 14:12:21 crc kubenswrapper[4962]: E1003 14:12:21.216374 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27658884c2ef6ed1f599e3ee8890187903f3cccfe8be86f25330f24e02687770\": container with ID starting with 27658884c2ef6ed1f599e3ee8890187903f3cccfe8be86f25330f24e02687770 not found: ID 
does not exist" containerID="27658884c2ef6ed1f599e3ee8890187903f3cccfe8be86f25330f24e02687770" Oct 03 14:12:21 crc kubenswrapper[4962]: I1003 14:12:21.216407 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27658884c2ef6ed1f599e3ee8890187903f3cccfe8be86f25330f24e02687770"} err="failed to get container status \"27658884c2ef6ed1f599e3ee8890187903f3cccfe8be86f25330f24e02687770\": rpc error: code = NotFound desc = could not find container \"27658884c2ef6ed1f599e3ee8890187903f3cccfe8be86f25330f24e02687770\": container with ID starting with 27658884c2ef6ed1f599e3ee8890187903f3cccfe8be86f25330f24e02687770 not found: ID does not exist" Oct 03 14:12:21 crc kubenswrapper[4962]: I1003 14:12:21.220933 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-578wq"] Oct 03 14:12:22 crc kubenswrapper[4962]: I1003 14:12:22.165080 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f3e7cb42-0fc6-4aac-aada-41b2f760b5e8","Type":"ContainerStarted","Data":"8c98d9e7caac6276bb0a38b10a6b5f595fcadc600db9e94a99c46577156bc5bf"} Oct 03 14:12:22 crc kubenswrapper[4962]: I1003 14:12:22.170864 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"19d8d7bb-0299-4cc9-95d6-956eec32d04a","Type":"ContainerStarted","Data":"055f9a8bbd765ec320336f21f2a1902bb09e450f23bbd739de72ce80f3557847"} Oct 03 14:12:22 crc kubenswrapper[4962]: I1003 14:12:22.198253 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=22.198232428 podStartE2EDuration="22.198232428s" podCreationTimestamp="2025-10-03 14:12:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:12:22.195997938 +0000 UTC m=+4950.599895773" watchObservedRunningTime="2025-10-03 14:12:22.198232428 +0000 UTC m=+4950.602130263" Oct 03 14:12:22 crc kubenswrapper[4962]: I1003 14:12:22.218863 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=21.21884897 podStartE2EDuration="21.21884897s" podCreationTimestamp="2025-10-03 14:12:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:12:22.215620914 +0000 UTC m=+4950.619518749" watchObservedRunningTime="2025-10-03 14:12:22.21884897 +0000 UTC m=+4950.622746805" Oct 03 14:12:22 crc kubenswrapper[4962]: I1003 14:12:22.242430 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da4134cd-3c68-4bd0-ac1f-a557903d42eb" path="/var/lib/kubelet/pods/da4134cd-3c68-4bd0-ac1f-a557903d42eb/volumes" Oct 03 14:12:23 crc kubenswrapper[4962]: I1003 14:12:23.258460 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 03 14:12:23 crc kubenswrapper[4962]: I1003 14:12:23.258533 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 03 14:12:27 crc kubenswrapper[4962]: I1003 14:12:27.314219 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 03 14:12:27 crc kubenswrapper[4962]: I1003 14:12:27.360973 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" 
Oct 03 14:12:31 crc kubenswrapper[4962]: I1003 14:12:31.967883 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Oct 03 14:12:31 crc kubenswrapper[4962]: I1003 14:12:31.968266 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Oct 03 14:12:32 crc kubenswrapper[4962]: I1003 14:12:32.009577 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Oct 03 14:12:32 crc kubenswrapper[4962]: I1003 14:12:32.281449 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Oct 03 14:12:44 crc kubenswrapper[4962]: I1003 14:12:44.345176 4962 generic.go:334] "Generic (PLEG): container finished" podID="8341a841-5d4e-4824-9b74-255b401ab6e7" containerID="1f4303a01b3531e3b467aa2a7310efed4c6d8e9f7fe94bb80bf996f70bc2356f" exitCode=0
Oct 03 14:12:44 crc kubenswrapper[4962]: I1003 14:12:44.345239 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8341a841-5d4e-4824-9b74-255b401ab6e7","Type":"ContainerDied","Data":"1f4303a01b3531e3b467aa2a7310efed4c6d8e9f7fe94bb80bf996f70bc2356f"}
Oct 03 14:12:44 crc kubenswrapper[4962]: I1003 14:12:44.349580 4962 generic.go:334] "Generic (PLEG): container finished" podID="a6da80ce-3c4d-428f-a4df-de88071a59c9" containerID="27fd5f7561a149fa78c429ce0f79d7c33fe73a8a665d87bd6bcb28a783b4b2c3" exitCode=0
Oct 03 14:12:44 crc kubenswrapper[4962]: I1003 14:12:44.349668 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a6da80ce-3c4d-428f-a4df-de88071a59c9","Type":"ContainerDied","Data":"27fd5f7561a149fa78c429ce0f79d7c33fe73a8a665d87bd6bcb28a783b4b2c3"}
Oct 03 14:12:45 crc kubenswrapper[4962]: I1003 14:12:45.361836 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8341a841-5d4e-4824-9b74-255b401ab6e7","Type":"ContainerStarted","Data":"fcc74e465c3aa725d832b935f763cce6998552df68b0bcaa0801a8cd0443449b"}
Oct 03 14:12:45 crc kubenswrapper[4962]: I1003 14:12:45.362502 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Oct 03 14:12:45 crc kubenswrapper[4962]: I1003 14:12:45.364405 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a6da80ce-3c4d-428f-a4df-de88071a59c9","Type":"ContainerStarted","Data":"0d38ef1daa401c868b48b22295f680464a7faaf0b3bca55ce458f6241147344f"}
Oct 03 14:12:45 crc kubenswrapper[4962]: I1003 14:12:45.364731 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:12:45 crc kubenswrapper[4962]: I1003 14:12:45.393125 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=46.393094978 podStartE2EDuration="46.393094978s" podCreationTimestamp="2025-10-03 14:11:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:12:45.383286885 +0000 UTC m=+4973.787184720" watchObservedRunningTime="2025-10-03 14:12:45.393094978 +0000 UTC m=+4973.796992813"
Oct 03 14:12:45 crc kubenswrapper[4962]: I1003 14:12:45.415429 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=45.415408126 podStartE2EDuration="45.415408126s" podCreationTimestamp="2025-10-03 14:12:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:12:45.410345751 +0000 UTC m=+4973.814243586" watchObservedRunningTime="2025-10-03 14:12:45.415408126 +0000 UTC m=+4973.819305961"
Oct 03 14:13:01 crc kubenswrapper[4962]: I1003 14:13:01.279878 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Oct 03 14:13:01 crc kubenswrapper[4962]: I1003 14:13:01.657748 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:13:05 crc kubenswrapper[4962]: I1003 14:13:05.185709 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-kphgc"]
Oct 03 14:13:05 crc kubenswrapper[4962]: E1003 14:13:05.186350 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60" containerName="extract-content"
Oct 03 14:13:05 crc kubenswrapper[4962]: I1003 14:13:05.186365 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60" containerName="extract-content"
Oct 03 14:13:05 crc kubenswrapper[4962]: E1003 14:13:05.186383 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60" containerName="registry-server"
Oct 03 14:13:05 crc kubenswrapper[4962]: I1003 14:13:05.186389 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60" containerName="registry-server"
Oct 03 14:13:05 crc kubenswrapper[4962]: E1003 14:13:05.186400 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60" containerName="extract-utilities"
Oct 03 14:13:05 crc kubenswrapper[4962]: I1003 14:13:05.186408 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60" containerName="extract-utilities"
Oct 03 14:13:05 crc kubenswrapper[4962]: E1003 14:13:05.186424 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da4134cd-3c68-4bd0-ac1f-a557903d42eb" containerName="init"
Oct 03 14:13:05 crc kubenswrapper[4962]: I1003 14:13:05.186429 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="da4134cd-3c68-4bd0-ac1f-a557903d42eb" containerName="init"
Oct 03 14:13:05 crc kubenswrapper[4962]: E1003 14:13:05.186439 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da4134cd-3c68-4bd0-ac1f-a557903d42eb" containerName="dnsmasq-dns"
Oct 03 14:13:05 crc kubenswrapper[4962]: I1003 14:13:05.186444 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="da4134cd-3c68-4bd0-ac1f-a557903d42eb" containerName="dnsmasq-dns"
Oct 03 14:13:05 crc kubenswrapper[4962]: I1003 14:13:05.186614 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9c2efc5-edaf-4ebc-bcdc-14692e2f6f60" containerName="registry-server"
Oct 03 14:13:05 crc kubenswrapper[4962]: I1003 14:13:05.186635 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="da4134cd-3c68-4bd0-ac1f-a557903d42eb" containerName="dnsmasq-dns"
Oct 03 14:13:05 crc kubenswrapper[4962]: I1003 14:13:05.187453 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-kphgc"
Oct 03 14:13:05 crc kubenswrapper[4962]: I1003 14:13:05.200692 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-kphgc"]
Oct 03 14:13:05 crc kubenswrapper[4962]: I1003 14:13:05.348277 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa9e70ee-665d-4c2c-839d-9ca4de39ad16-config\") pod \"dnsmasq-dns-5b7946d7b9-kphgc\" (UID: \"fa9e70ee-665d-4c2c-839d-9ca4de39ad16\") " pod="openstack/dnsmasq-dns-5b7946d7b9-kphgc"
Oct 03 14:13:05 crc kubenswrapper[4962]: I1003 14:13:05.348341 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa9e70ee-665d-4c2c-839d-9ca4de39ad16-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-kphgc\" (UID: \"fa9e70ee-665d-4c2c-839d-9ca4de39ad16\") " pod="openstack/dnsmasq-dns-5b7946d7b9-kphgc"
Oct 03 14:13:05 crc kubenswrapper[4962]: I1003 14:13:05.348401 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgmvw\" (UniqueName: \"kubernetes.io/projected/fa9e70ee-665d-4c2c-839d-9ca4de39ad16-kube-api-access-zgmvw\") pod \"dnsmasq-dns-5b7946d7b9-kphgc\" (UID: \"fa9e70ee-665d-4c2c-839d-9ca4de39ad16\") " pod="openstack/dnsmasq-dns-5b7946d7b9-kphgc"
Oct 03 14:13:05 crc kubenswrapper[4962]: I1003 14:13:05.450298 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa9e70ee-665d-4c2c-839d-9ca4de39ad16-config\") pod \"dnsmasq-dns-5b7946d7b9-kphgc\" (UID: \"fa9e70ee-665d-4c2c-839d-9ca4de39ad16\") " pod="openstack/dnsmasq-dns-5b7946d7b9-kphgc"
Oct 03 14:13:05 crc kubenswrapper[4962]: I1003 14:13:05.450340 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa9e70ee-665d-4c2c-839d-9ca4de39ad16-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-kphgc\" (UID: \"fa9e70ee-665d-4c2c-839d-9ca4de39ad16\") " pod="openstack/dnsmasq-dns-5b7946d7b9-kphgc"
Oct 03 14:13:05 crc kubenswrapper[4962]: I1003 14:13:05.450430 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgmvw\" (UniqueName: \"kubernetes.io/projected/fa9e70ee-665d-4c2c-839d-9ca4de39ad16-kube-api-access-zgmvw\") pod \"dnsmasq-dns-5b7946d7b9-kphgc\" (UID: \"fa9e70ee-665d-4c2c-839d-9ca4de39ad16\") " pod="openstack/dnsmasq-dns-5b7946d7b9-kphgc"
Oct 03 14:13:05 crc kubenswrapper[4962]: I1003 14:13:05.451203 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa9e70ee-665d-4c2c-839d-9ca4de39ad16-config\") pod \"dnsmasq-dns-5b7946d7b9-kphgc\" (UID: \"fa9e70ee-665d-4c2c-839d-9ca4de39ad16\") " pod="openstack/dnsmasq-dns-5b7946d7b9-kphgc"
Oct 03 14:13:05 crc kubenswrapper[4962]: I1003 14:13:05.451305 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa9e70ee-665d-4c2c-839d-9ca4de39ad16-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-kphgc\" (UID: \"fa9e70ee-665d-4c2c-839d-9ca4de39ad16\") " pod="openstack/dnsmasq-dns-5b7946d7b9-kphgc"
Oct 03 14:13:05 crc kubenswrapper[4962]: I1003 14:13:05.482745 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgmvw\" (UniqueName: \"kubernetes.io/projected/fa9e70ee-665d-4c2c-839d-9ca4de39ad16-kube-api-access-zgmvw\") pod \"dnsmasq-dns-5b7946d7b9-kphgc\" (UID: \"fa9e70ee-665d-4c2c-839d-9ca4de39ad16\") " pod="openstack/dnsmasq-dns-5b7946d7b9-kphgc"
Oct 03 14:13:05 crc kubenswrapper[4962]: I1003 14:13:05.504961 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-kphgc"
Oct 03 14:13:05 crc kubenswrapper[4962]: I1003 14:13:05.878313 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 03 14:13:05 crc kubenswrapper[4962]: I1003 14:13:05.946592 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-kphgc"]
Oct 03 14:13:05 crc kubenswrapper[4962]: W1003 14:13:05.952629 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa9e70ee_665d_4c2c_839d_9ca4de39ad16.slice/crio-6b70ae36d4d29f562bb0c7b7b94e1446a9de784db46dc2a5ee13f45bff45a63d WatchSource:0}: Error finding container 6b70ae36d4d29f562bb0c7b7b94e1446a9de784db46dc2a5ee13f45bff45a63d: Status 404 returned error can't find the container with id 6b70ae36d4d29f562bb0c7b7b94e1446a9de784db46dc2a5ee13f45bff45a63d
Oct 03 14:13:06 crc kubenswrapper[4962]: I1003 14:13:06.565950 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 03 14:13:06 crc kubenswrapper[4962]: I1003 14:13:06.576475 4962 generic.go:334] "Generic (PLEG): container finished" podID="fa9e70ee-665d-4c2c-839d-9ca4de39ad16" containerID="6f26dd114c67bef234c8de8d46eb96c4eadf2eb30286b1c83308105c1f124f9e" exitCode=0
Oct 03 14:13:06 crc kubenswrapper[4962]: I1003 14:13:06.576532 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-kphgc" event={"ID":"fa9e70ee-665d-4c2c-839d-9ca4de39ad16","Type":"ContainerDied","Data":"6f26dd114c67bef234c8de8d46eb96c4eadf2eb30286b1c83308105c1f124f9e"}
Oct 03 14:13:06 crc kubenswrapper[4962]: I1003 14:13:06.576563 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-kphgc" event={"ID":"fa9e70ee-665d-4c2c-839d-9ca4de39ad16","Type":"ContainerStarted","Data":"6b70ae36d4d29f562bb0c7b7b94e1446a9de784db46dc2a5ee13f45bff45a63d"}
Oct 03 14:13:07 crc kubenswrapper[4962]: I1003 14:13:07.587376 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-kphgc" event={"ID":"fa9e70ee-665d-4c2c-839d-9ca4de39ad16","Type":"ContainerStarted","Data":"80bcd65ea6c45d1def3c96932b1e02514f17c074a35bf9e6c7ae7c785308d78f"}
Oct 03 14:13:07 crc kubenswrapper[4962]: I1003 14:13:07.587925 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b7946d7b9-kphgc"
Oct 03 14:13:07 crc kubenswrapper[4962]: I1003 14:13:07.608809 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b7946d7b9-kphgc" podStartSLOduration=2.608778477 podStartE2EDuration="2.608778477s" podCreationTimestamp="2025-10-03 14:13:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:13:07.60319085 +0000 UTC m=+4996.007088685" watchObservedRunningTime="2025-10-03 14:13:07.608778477 +0000 UTC m=+4996.012676352"
Oct 03 14:13:07 crc kubenswrapper[4962]: I1003 14:13:07.727999 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="8341a841-5d4e-4824-9b74-255b401ab6e7" containerName="rabbitmq" containerID="cri-o://fcc74e465c3aa725d832b935f763cce6998552df68b0bcaa0801a8cd0443449b" gracePeriod=604799
Oct 03 14:13:08 crc kubenswrapper[4962]: I1003 14:13:08.390847 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="a6da80ce-3c4d-428f-a4df-de88071a59c9" containerName="rabbitmq" containerID="cri-o://0d38ef1daa401c868b48b22295f680464a7faaf0b3bca55ce458f6241147344f" gracePeriod=604799
Oct 03 14:13:11 crc kubenswrapper[4962]: I1003 14:13:11.275363 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="8341a841-5d4e-4824-9b74-255b401ab6e7" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.238:5672: connect: connection refused"
Oct 03 14:13:11 crc kubenswrapper[4962]: I1003 14:13:11.655759 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="a6da80ce-3c4d-428f-a4df-de88071a59c9" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.239:5672: connect: connection refused"
Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.171760 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.283762 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8341a841-5d4e-4824-9b74-255b401ab6e7-plugins-conf\") pod \"8341a841-5d4e-4824-9b74-255b401ab6e7\" (UID: \"8341a841-5d4e-4824-9b74-255b401ab6e7\") "
Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.284089 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8341a841-5d4e-4824-9b74-255b401ab6e7-server-conf\") pod \"8341a841-5d4e-4824-9b74-255b401ab6e7\" (UID: \"8341a841-5d4e-4824-9b74-255b401ab6e7\") "
Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.284141 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8341a841-5d4e-4824-9b74-255b401ab6e7-pod-info\") pod \"8341a841-5d4e-4824-9b74-255b401ab6e7\" (UID: \"8341a841-5d4e-4824-9b74-255b401ab6e7\") "
Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.284181 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8341a841-5d4e-4824-9b74-255b401ab6e7-rabbitmq-erlang-cookie\") pod \"8341a841-5d4e-4824-9b74-255b401ab6e7\" (UID: \"8341a841-5d4e-4824-9b74-255b401ab6e7\") "
Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.284221 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8341a841-5d4e-4824-9b74-255b401ab6e7-erlang-cookie-secret\") pod \"8341a841-5d4e-4824-9b74-255b401ab6e7\" (UID: \"8341a841-5d4e-4824-9b74-255b401ab6e7\") "
Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.284256 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffj4l\" (UniqueName: \"kubernetes.io/projected/8341a841-5d4e-4824-9b74-255b401ab6e7-kube-api-access-ffj4l\") pod \"8341a841-5d4e-4824-9b74-255b401ab6e7\" (UID: \"8341a841-5d4e-4824-9b74-255b401ab6e7\") "
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8341a841-5d4e-4824-9b74-255b401ab6e7-rabbitmq-confd\") pod \"8341a841-5d4e-4824-9b74-255b401ab6e7\" (UID: \"8341a841-5d4e-4824-9b74-255b401ab6e7\") " Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.284321 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8341a841-5d4e-4824-9b74-255b401ab6e7-rabbitmq-plugins\") pod \"8341a841-5d4e-4824-9b74-255b401ab6e7\" (UID: \"8341a841-5d4e-4824-9b74-255b401ab6e7\") " Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.284424 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f9ffb45-d2a6-4019-8194-99d39124f45a\") pod \"8341a841-5d4e-4824-9b74-255b401ab6e7\" (UID: \"8341a841-5d4e-4824-9b74-255b401ab6e7\") " Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.284967 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8341a841-5d4e-4824-9b74-255b401ab6e7-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "8341a841-5d4e-4824-9b74-255b401ab6e7" (UID: "8341a841-5d4e-4824-9b74-255b401ab6e7"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.285140 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8341a841-5d4e-4824-9b74-255b401ab6e7-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "8341a841-5d4e-4824-9b74-255b401ab6e7" (UID: "8341a841-5d4e-4824-9b74-255b401ab6e7"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.286973 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8341a841-5d4e-4824-9b74-255b401ab6e7-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "8341a841-5d4e-4824-9b74-255b401ab6e7" (UID: "8341a841-5d4e-4824-9b74-255b401ab6e7"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.290434 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8341a841-5d4e-4824-9b74-255b401ab6e7-kube-api-access-ffj4l" (OuterVolumeSpecName: "kube-api-access-ffj4l") pod "8341a841-5d4e-4824-9b74-255b401ab6e7" (UID: "8341a841-5d4e-4824-9b74-255b401ab6e7"). InnerVolumeSpecName "kube-api-access-ffj4l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.290806 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/8341a841-5d4e-4824-9b74-255b401ab6e7-pod-info" (OuterVolumeSpecName: "pod-info") pod "8341a841-5d4e-4824-9b74-255b401ab6e7" (UID: "8341a841-5d4e-4824-9b74-255b401ab6e7"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.295890 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8341a841-5d4e-4824-9b74-255b401ab6e7-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "8341a841-5d4e-4824-9b74-255b401ab6e7" (UID: "8341a841-5d4e-4824-9b74-255b401ab6e7"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.304150 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f9ffb45-d2a6-4019-8194-99d39124f45a" (OuterVolumeSpecName: "persistence") pod "8341a841-5d4e-4824-9b74-255b401ab6e7" (UID: "8341a841-5d4e-4824-9b74-255b401ab6e7"). InnerVolumeSpecName "pvc-3f9ffb45-d2a6-4019-8194-99d39124f45a". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.306624 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8341a841-5d4e-4824-9b74-255b401ab6e7-server-conf" (OuterVolumeSpecName: "server-conf") pod "8341a841-5d4e-4824-9b74-255b401ab6e7" (UID: "8341a841-5d4e-4824-9b74-255b401ab6e7"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.366376 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8341a841-5d4e-4824-9b74-255b401ab6e7-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "8341a841-5d4e-4824-9b74-255b401ab6e7" (UID: "8341a841-5d4e-4824-9b74-255b401ab6e7"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.385575 4962 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8341a841-5d4e-4824-9b74-255b401ab6e7-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.385613 4962 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8341a841-5d4e-4824-9b74-255b401ab6e7-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.385676 4962 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-3f9ffb45-d2a6-4019-8194-99d39124f45a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f9ffb45-d2a6-4019-8194-99d39124f45a\") on node \"crc\" " Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.385695 4962 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8341a841-5d4e-4824-9b74-255b401ab6e7-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.385707 4962 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8341a841-5d4e-4824-9b74-255b401ab6e7-server-conf\") on node \"crc\" DevicePath \"\"" Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.385717 4962 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8341a841-5d4e-4824-9b74-255b401ab6e7-pod-info\") on node \"crc\" DevicePath \"\"" Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.385728 4962 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8341a841-5d4e-4824-9b74-255b401ab6e7-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.385740 4962 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8341a841-5d4e-4824-9b74-255b401ab6e7-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.385751 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffj4l\" (UniqueName: \"kubernetes.io/projected/8341a841-5d4e-4824-9b74-255b401ab6e7-kube-api-access-ffj4l\") on node \"crc\" DevicePath \"\"" Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.402630 4962 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.402818 4962 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-3f9ffb45-d2a6-4019-8194-99d39124f45a" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f9ffb45-d2a6-4019-8194-99d39124f45a") on node "crc" Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.486684 4962 reconciler_common.go:293] "Volume detached for volume \"pvc-3f9ffb45-d2a6-4019-8194-99d39124f45a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f9ffb45-d2a6-4019-8194-99d39124f45a\") on node \"crc\" DevicePath \"\"" Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.644494 4962 generic.go:334] "Generic (PLEG): container finished" podID="8341a841-5d4e-4824-9b74-255b401ab6e7" containerID="fcc74e465c3aa725d832b935f763cce6998552df68b0bcaa0801a8cd0443449b" exitCode=0 Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.644578 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8341a841-5d4e-4824-9b74-255b401ab6e7","Type":"ContainerDied","Data":"fcc74e465c3aa725d832b935f763cce6998552df68b0bcaa0801a8cd0443449b"} Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.644581 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.644609 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8341a841-5d4e-4824-9b74-255b401ab6e7","Type":"ContainerDied","Data":"c3bbd038228836ed7da141247521304e6c5cdfd07a772c2684c3ba3782867fd8"} Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.644629 4962 scope.go:117] "RemoveContainer" containerID="fcc74e465c3aa725d832b935f763cce6998552df68b0bcaa0801a8cd0443449b" Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.647563 4962 generic.go:334] "Generic (PLEG): container finished" podID="a6da80ce-3c4d-428f-a4df-de88071a59c9" containerID="0d38ef1daa401c868b48b22295f680464a7faaf0b3bca55ce458f6241147344f" exitCode=0 Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.647609 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a6da80ce-3c4d-428f-a4df-de88071a59c9","Type":"ContainerDied","Data":"0d38ef1daa401c868b48b22295f680464a7faaf0b3bca55ce458f6241147344f"} Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.696109 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.700989 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.707562 4962 scope.go:117] "RemoveContainer" containerID="1f4303a01b3531e3b467aa2a7310efed4c6d8e9f7fe94bb80bf996f70bc2356f" Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.724226 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 14:13:14 crc kubenswrapper[4962]: E1003 14:13:14.724690 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8341a841-5d4e-4824-9b74-255b401ab6e7" containerName="rabbitmq" Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.724713 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8341a841-5d4e-4824-9b74-255b401ab6e7" containerName="rabbitmq" Oct 03 14:13:14 crc kubenswrapper[4962]: E1003 14:13:14.724749 4962 cpu_manager.go:410] "RemoveStaleState: removing container" 
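
attacher.UnmountDevice above (and attacher.MountDevice further down) skips the device staging step because this CSI driver, kubevirt.io.hostpath-provisioner, does not advertise the STAGE_UNSTAGE_VOLUME node capability. A sketch of how that capability is discovered over a driver's node socket, using the CSI spec's Go bindings; the socket path is illustrative:

    package main

    import (
        "context"
        "fmt"
        "log"

        "github.com/container-storage-interface/spec/lib/go/csi"
        "google.golang.org/grpc"
        "google.golang.org/grpc/credentials/insecure"
    )

    func main() {
        // Illustrative socket path; drivers register under /var/lib/kubelet/plugins/<driver>/.
        conn, err := grpc.Dial("unix:///var/lib/kubelet/plugins/csi-hostpath/csi.sock",
            grpc.WithTransportCredentials(insecure.NewCredentials()))
        if err != nil {
            log.Fatal(err)
        }
        defer conn.Close()

        resp, err := csi.NewNodeClient(conn).NodeGetCapabilities(
            context.Background(), &csi.NodeGetCapabilitiesRequest{})
        if err != nil {
            log.Fatal(err)
        }
        stage := false
        for _, c := range resp.GetCapabilities() {
            if c.GetRpc().GetType() == csi.NodeServiceCapability_RPC_STAGE_UNSTAGE_VOLUME {
                stage = true
            }
        }
        // When false, the kubelet logs "STAGE_UNSTAGE_VOLUME capability not set.
        // Skipping MountDevice/UnmountDevice..." as seen in this journal.
        fmt.Println("STAGE_UNSTAGE_VOLUME:", stage)
    }
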
podUID="8341a841-5d4e-4824-9b74-255b401ab6e7" containerName="setup-container" Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.724759 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8341a841-5d4e-4824-9b74-255b401ab6e7" containerName="setup-container" Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.724947 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="8341a841-5d4e-4824-9b74-255b401ab6e7" containerName="rabbitmq" Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.725851 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.727917 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.729315 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.729451 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-lv5fj" Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.729512 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.729462 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.733061 4962 scope.go:117] "RemoveContainer" containerID="fcc74e465c3aa725d832b935f763cce6998552df68b0bcaa0801a8cd0443449b" Oct 03 14:13:14 crc kubenswrapper[4962]: E1003 14:13:14.733523 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcc74e465c3aa725d832b935f763cce6998552df68b0bcaa0801a8cd0443449b\": container with ID starting with fcc74e465c3aa725d832b935f763cce6998552df68b0bcaa0801a8cd0443449b not found: ID does not exist" containerID="fcc74e465c3aa725d832b935f763cce6998552df68b0bcaa0801a8cd0443449b" Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.733561 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcc74e465c3aa725d832b935f763cce6998552df68b0bcaa0801a8cd0443449b"} err="failed to get container status \"fcc74e465c3aa725d832b935f763cce6998552df68b0bcaa0801a8cd0443449b\": rpc error: code = NotFound desc = could not find container \"fcc74e465c3aa725d832b935f763cce6998552df68b0bcaa0801a8cd0443449b\": container with ID starting with fcc74e465c3aa725d832b935f763cce6998552df68b0bcaa0801a8cd0443449b not found: ID does not exist" Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.733588 4962 scope.go:117] "RemoveContainer" containerID="1f4303a01b3531e3b467aa2a7310efed4c6d8e9f7fe94bb80bf996f70bc2356f" Oct 03 14:13:14 crc kubenswrapper[4962]: E1003 14:13:14.734978 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f4303a01b3531e3b467aa2a7310efed4c6d8e9f7fe94bb80bf996f70bc2356f\": container with ID starting with 1f4303a01b3531e3b467aa2a7310efed4c6d8e9f7fe94bb80bf996f70bc2356f not found: ID does not exist" containerID="1f4303a01b3531e3b467aa2a7310efed4c6d8e9f7fe94bb80bf996f70bc2356f" Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.735022 4962 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1f4303a01b3531e3b467aa2a7310efed4c6d8e9f7fe94bb80bf996f70bc2356f"} err="failed to get container status \"1f4303a01b3531e3b467aa2a7310efed4c6d8e9f7fe94bb80bf996f70bc2356f\": rpc error: code = NotFound desc = could not find container \"1f4303a01b3531e3b467aa2a7310efed4c6d8e9f7fe94bb80bf996f70bc2356f\": container with ID starting with 1f4303a01b3531e3b467aa2a7310efed4c6d8e9f7fe94bb80bf996f70bc2356f not found: ID does not exist" Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.761064 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.905270 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9b27973f-0013-497d-8e30-0f94ffb4e651-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9b27973f-0013-497d-8e30-0f94ffb4e651\") " pod="openstack/rabbitmq-server-0" Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.905322 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3f9ffb45-d2a6-4019-8194-99d39124f45a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f9ffb45-d2a6-4019-8194-99d39124f45a\") pod \"rabbitmq-server-0\" (UID: \"9b27973f-0013-497d-8e30-0f94ffb4e651\") " pod="openstack/rabbitmq-server-0" Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.905361 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9b27973f-0013-497d-8e30-0f94ffb4e651-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9b27973f-0013-497d-8e30-0f94ffb4e651\") " pod="openstack/rabbitmq-server-0" Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.905387 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qljz2\" (UniqueName: \"kubernetes.io/projected/9b27973f-0013-497d-8e30-0f94ffb4e651-kube-api-access-qljz2\") pod \"rabbitmq-server-0\" (UID: \"9b27973f-0013-497d-8e30-0f94ffb4e651\") " pod="openstack/rabbitmq-server-0" Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.905428 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9b27973f-0013-497d-8e30-0f94ffb4e651-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9b27973f-0013-497d-8e30-0f94ffb4e651\") " pod="openstack/rabbitmq-server-0" Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.905455 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9b27973f-0013-497d-8e30-0f94ffb4e651-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9b27973f-0013-497d-8e30-0f94ffb4e651\") " pod="openstack/rabbitmq-server-0" Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.907432 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9b27973f-0013-497d-8e30-0f94ffb4e651-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9b27973f-0013-497d-8e30-0f94ffb4e651\") " pod="openstack/rabbitmq-server-0" Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.907513 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9b27973f-0013-497d-8e30-0f94ffb4e651-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9b27973f-0013-497d-8e30-0f94ffb4e651\") " pod="openstack/rabbitmq-server-0" Oct 03 14:13:14 crc kubenswrapper[4962]: I1003 14:13:14.907596 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9b27973f-0013-497d-8e30-0f94ffb4e651-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9b27973f-0013-497d-8e30-0f94ffb4e651\") " pod="openstack/rabbitmq-server-0" Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.009778 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9b27973f-0013-497d-8e30-0f94ffb4e651-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9b27973f-0013-497d-8e30-0f94ffb4e651\") " pod="openstack/rabbitmq-server-0" Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.010405 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9b27973f-0013-497d-8e30-0f94ffb4e651-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9b27973f-0013-497d-8e30-0f94ffb4e651\") " pod="openstack/rabbitmq-server-0" Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.010314 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9b27973f-0013-497d-8e30-0f94ffb4e651-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9b27973f-0013-497d-8e30-0f94ffb4e651\") " pod="openstack/rabbitmq-server-0" Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.010504 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9b27973f-0013-497d-8e30-0f94ffb4e651-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9b27973f-0013-497d-8e30-0f94ffb4e651\") " pod="openstack/rabbitmq-server-0" Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.010591 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9b27973f-0013-497d-8e30-0f94ffb4e651-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9b27973f-0013-497d-8e30-0f94ffb4e651\") " pod="openstack/rabbitmq-server-0" Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.010657 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3f9ffb45-d2a6-4019-8194-99d39124f45a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f9ffb45-d2a6-4019-8194-99d39124f45a\") pod \"rabbitmq-server-0\" (UID: \"9b27973f-0013-497d-8e30-0f94ffb4e651\") " pod="openstack/rabbitmq-server-0" Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.010739 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9b27973f-0013-497d-8e30-0f94ffb4e651-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9b27973f-0013-497d-8e30-0f94ffb4e651\") " pod="openstack/rabbitmq-server-0" Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.010810 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qljz2\" (UniqueName: \"kubernetes.io/projected/9b27973f-0013-497d-8e30-0f94ffb4e651-kube-api-access-qljz2\") pod \"rabbitmq-server-0\" (UID: \"9b27973f-0013-497d-8e30-0f94ffb4e651\") 
" pod="openstack/rabbitmq-server-0" Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.010909 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9b27973f-0013-497d-8e30-0f94ffb4e651-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9b27973f-0013-497d-8e30-0f94ffb4e651\") " pod="openstack/rabbitmq-server-0" Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.010969 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9b27973f-0013-497d-8e30-0f94ffb4e651-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9b27973f-0013-497d-8e30-0f94ffb4e651\") " pod="openstack/rabbitmq-server-0" Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.011509 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9b27973f-0013-497d-8e30-0f94ffb4e651-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9b27973f-0013-497d-8e30-0f94ffb4e651\") " pod="openstack/rabbitmq-server-0" Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.011681 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9b27973f-0013-497d-8e30-0f94ffb4e651-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9b27973f-0013-497d-8e30-0f94ffb4e651\") " pod="openstack/rabbitmq-server-0" Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.012310 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9b27973f-0013-497d-8e30-0f94ffb4e651-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9b27973f-0013-497d-8e30-0f94ffb4e651\") " pod="openstack/rabbitmq-server-0" Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.016810 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9b27973f-0013-497d-8e30-0f94ffb4e651-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9b27973f-0013-497d-8e30-0f94ffb4e651\") " pod="openstack/rabbitmq-server-0" Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.016935 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9b27973f-0013-497d-8e30-0f94ffb4e651-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9b27973f-0013-497d-8e30-0f94ffb4e651\") " pod="openstack/rabbitmq-server-0" Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.017593 4962 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.017733 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3f9ffb45-d2a6-4019-8194-99d39124f45a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f9ffb45-d2a6-4019-8194-99d39124f45a\") pod \"rabbitmq-server-0\" (UID: \"9b27973f-0013-497d-8e30-0f94ffb4e651\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4fc431a82d9d501e555eb4dc8b5d716e40442cc41995001ff7c8bed8b2246801/globalmount\"" pod="openstack/rabbitmq-server-0"
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.018088 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9b27973f-0013-497d-8e30-0f94ffb4e651-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9b27973f-0013-497d-8e30-0f94ffb4e651\") " pod="openstack/rabbitmq-server-0"
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.021221 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.029247 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qljz2\" (UniqueName: \"kubernetes.io/projected/9b27973f-0013-497d-8e30-0f94ffb4e651-kube-api-access-qljz2\") pod \"rabbitmq-server-0\" (UID: \"9b27973f-0013-497d-8e30-0f94ffb4e651\") " pod="openstack/rabbitmq-server-0"
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.064592 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3f9ffb45-d2a6-4019-8194-99d39124f45a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f9ffb45-d2a6-4019-8194-99d39124f45a\") pod \"rabbitmq-server-0\" (UID: \"9b27973f-0013-497d-8e30-0f94ffb4e651\") " pod="openstack/rabbitmq-server-0"
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.212615 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a6da80ce-3c4d-428f-a4df-de88071a59c9-plugins-conf\") pod \"a6da80ce-3c4d-428f-a4df-de88071a59c9\" (UID: \"a6da80ce-3c4d-428f-a4df-de88071a59c9\") "
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.212698 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a6da80ce-3c4d-428f-a4df-de88071a59c9-pod-info\") pod \"a6da80ce-3c4d-428f-a4df-de88071a59c9\" (UID: \"a6da80ce-3c4d-428f-a4df-de88071a59c9\") "
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.212755 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a6da80ce-3c4d-428f-a4df-de88071a59c9-server-conf\") pod \"a6da80ce-3c4d-428f-a4df-de88071a59c9\" (UID: \"a6da80ce-3c4d-428f-a4df-de88071a59c9\") "
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.212798 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a6da80ce-3c4d-428f-a4df-de88071a59c9-erlang-cookie-secret\") pod \"a6da80ce-3c4d-428f-a4df-de88071a59c9\" (UID: \"a6da80ce-3c4d-428f-a4df-de88071a59c9\") "
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.212823 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a6da80ce-3c4d-428f-a4df-de88071a59c9-rabbitmq-erlang-cookie\") pod \"a6da80ce-3c4d-428f-a4df-de88071a59c9\" (UID: \"a6da80ce-3c4d-428f-a4df-de88071a59c9\") "
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.212878 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fp6l2\" (UniqueName: \"kubernetes.io/projected/a6da80ce-3c4d-428f-a4df-de88071a59c9-kube-api-access-fp6l2\") pod \"a6da80ce-3c4d-428f-a4df-de88071a59c9\" (UID: \"a6da80ce-3c4d-428f-a4df-de88071a59c9\") "
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.212902 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a6da80ce-3c4d-428f-a4df-de88071a59c9-rabbitmq-confd\") pod \"a6da80ce-3c4d-428f-a4df-de88071a59c9\" (UID: \"a6da80ce-3c4d-428f-a4df-de88071a59c9\") "
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.212930 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a6da80ce-3c4d-428f-a4df-de88071a59c9-rabbitmq-plugins\") pod \"a6da80ce-3c4d-428f-a4df-de88071a59c9\" (UID: \"a6da80ce-3c4d-428f-a4df-de88071a59c9\") "
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.213070 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f1882988-f9b6-4eda-96c1-fccccad7a728\") pod \"a6da80ce-3c4d-428f-a4df-de88071a59c9\" (UID: \"a6da80ce-3c4d-428f-a4df-de88071a59c9\") "
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.213385 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6da80ce-3c4d-428f-a4df-de88071a59c9-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "a6da80ce-3c4d-428f-a4df-de88071a59c9" (UID: "a6da80ce-3c4d-428f-a4df-de88071a59c9"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.213585 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6da80ce-3c4d-428f-a4df-de88071a59c9-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "a6da80ce-3c4d-428f-a4df-de88071a59c9" (UID: "a6da80ce-3c4d-428f-a4df-de88071a59c9"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.214033 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6da80ce-3c4d-428f-a4df-de88071a59c9-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "a6da80ce-3c4d-428f-a4df-de88071a59c9" (UID: "a6da80ce-3c4d-428f-a4df-de88071a59c9"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.216859 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6da80ce-3c4d-428f-a4df-de88071a59c9-kube-api-access-fp6l2" (OuterVolumeSpecName: "kube-api-access-fp6l2") pod "a6da80ce-3c4d-428f-a4df-de88071a59c9" (UID: "a6da80ce-3c4d-428f-a4df-de88071a59c9"). InnerVolumeSpecName "kube-api-access-fp6l2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.216899 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/a6da80ce-3c4d-428f-a4df-de88071a59c9-pod-info" (OuterVolumeSpecName: "pod-info") pod "a6da80ce-3c4d-428f-a4df-de88071a59c9" (UID: "a6da80ce-3c4d-428f-a4df-de88071a59c9"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.217382 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6da80ce-3c4d-428f-a4df-de88071a59c9-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "a6da80ce-3c4d-428f-a4df-de88071a59c9" (UID: "a6da80ce-3c4d-428f-a4df-de88071a59c9"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.221806 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f1882988-f9b6-4eda-96c1-fccccad7a728" (OuterVolumeSpecName: "persistence") pod "a6da80ce-3c4d-428f-a4df-de88071a59c9" (UID: "a6da80ce-3c4d-428f-a4df-de88071a59c9"). InnerVolumeSpecName "pvc-f1882988-f9b6-4eda-96c1-fccccad7a728". PluginName "kubernetes.io/csi", VolumeGidValue ""
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.235695 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6da80ce-3c4d-428f-a4df-de88071a59c9-server-conf" (OuterVolumeSpecName: "server-conf") pod "a6da80ce-3c4d-428f-a4df-de88071a59c9" (UID: "a6da80ce-3c4d-428f-a4df-de88071a59c9"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.290295 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6da80ce-3c4d-428f-a4df-de88071a59c9-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "a6da80ce-3c4d-428f-a4df-de88071a59c9" (UID: "a6da80ce-3c4d-428f-a4df-de88071a59c9"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.314805 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fp6l2\" (UniqueName: \"kubernetes.io/projected/a6da80ce-3c4d-428f-a4df-de88071a59c9-kube-api-access-fp6l2\") on node \"crc\" DevicePath \"\""
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.314839 4962 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a6da80ce-3c4d-428f-a4df-de88071a59c9-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.314848 4962 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a6da80ce-3c4d-428f-a4df-de88071a59c9-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.314878 4962 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-f1882988-f9b6-4eda-96c1-fccccad7a728\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f1882988-f9b6-4eda-96c1-fccccad7a728\") on node \"crc\" "
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.314890 4962 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a6da80ce-3c4d-428f-a4df-de88071a59c9-plugins-conf\") on node \"crc\" DevicePath \"\""
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.314900 4962 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a6da80ce-3c4d-428f-a4df-de88071a59c9-pod-info\") on node \"crc\" DevicePath \"\""
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.314909 4962 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a6da80ce-3c4d-428f-a4df-de88071a59c9-server-conf\") on node \"crc\" DevicePath \"\""
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.314918 4962 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a6da80ce-3c4d-428f-a4df-de88071a59c9-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.314929 4962 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a6da80ce-3c4d-428f-a4df-de88071a59c9-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.330169 4962 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.330328 4962 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-f1882988-f9b6-4eda-96c1-fccccad7a728" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f1882988-f9b6-4eda-96c1-fccccad7a728") on node "crc"
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.360615 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.416475 4962 reconciler_common.go:293] "Volume detached for volume \"pvc-f1882988-f9b6-4eda-96c1-fccccad7a728\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f1882988-f9b6-4eda-96c1-fccccad7a728\") on node \"crc\" DevicePath \"\""
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.507073 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b7946d7b9-kphgc"
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.570894 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-8wg2h"]
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.571129 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-98ddfc8f-8wg2h" podUID="8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84" containerName="dnsmasq-dns" containerID="cri-o://f4c9068742b32f2bf27d65e6210806f54d38663209e7fb73338fb527643c207a" gracePeriod=10
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.657502 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a6da80ce-3c4d-428f-a4df-de88071a59c9","Type":"ContainerDied","Data":"f9b188f6af0809abecf439937d21624b787c3903f2a832362286de7adae97c7e"}
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.657555 4962 scope.go:117] "RemoveContainer" containerID="0d38ef1daa401c868b48b22295f680464a7faaf0b3bca55ce458f6241147344f"
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.657659 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.682060 4962 scope.go:117] "RemoveContainer" containerID="27fd5f7561a149fa78c429ce0f79d7c33fe73a8a665d87bd6bcb28a783b4b2c3"
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.701214 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.722178 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.728321 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 03 14:13:15 crc kubenswrapper[4962]: E1003 14:13:15.731257 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6da80ce-3c4d-428f-a4df-de88071a59c9" containerName="rabbitmq"
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.731362 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6da80ce-3c4d-428f-a4df-de88071a59c9" containerName="rabbitmq"
Oct 03 14:13:15 crc kubenswrapper[4962]: E1003 14:13:15.731466 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6da80ce-3c4d-428f-a4df-de88071a59c9" containerName="setup-container"
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.731533 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6da80ce-3c4d-428f-a4df-de88071a59c9" containerName="setup-container"
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.731751 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6da80ce-3c4d-428f-a4df-de88071a59c9" containerName="rabbitmq"
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.735433 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.741403 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.769362 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.769541 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.769698 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.769802 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.769902 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-tdtkt"
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.843249 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 03 14:13:15 crc kubenswrapper[4962]: W1003 14:13:15.849622 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b27973f_0013_497d_8e30_0f94ffb4e651.slice/crio-c770638ea1c47048e4f70c631d9630912287c4a235442203e6514dada50ddcdf WatchSource:0}: Error finding container c770638ea1c47048e4f70c631d9630912287c4a235442203e6514dada50ddcdf: Status 404 returned error can't find the container with id c770638ea1c47048e4f70c631d9630912287c4a235442203e6514dada50ddcdf
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.926632 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/748862b5-56ab-4601-bdd3-2c7d825f6c96-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"748862b5-56ab-4601-bdd3-2c7d825f6c96\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.927035 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f1882988-f9b6-4eda-96c1-fccccad7a728\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f1882988-f9b6-4eda-96c1-fccccad7a728\") pod \"rabbitmq-cell1-server-0\" (UID: \"748862b5-56ab-4601-bdd3-2c7d825f6c96\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.927056 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tgrq\" (UniqueName: \"kubernetes.io/projected/748862b5-56ab-4601-bdd3-2c7d825f6c96-kube-api-access-2tgrq\") pod \"rabbitmq-cell1-server-0\" (UID: \"748862b5-56ab-4601-bdd3-2c7d825f6c96\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.927084 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/748862b5-56ab-4601-bdd3-2c7d825f6c96-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"748862b5-56ab-4601-bdd3-2c7d825f6c96\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.927103 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/748862b5-56ab-4601-bdd3-2c7d825f6c96-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"748862b5-56ab-4601-bdd3-2c7d825f6c96\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.927140 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/748862b5-56ab-4601-bdd3-2c7d825f6c96-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"748862b5-56ab-4601-bdd3-2c7d825f6c96\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.927163 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/748862b5-56ab-4601-bdd3-2c7d825f6c96-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"748862b5-56ab-4601-bdd3-2c7d825f6c96\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.927191 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/748862b5-56ab-4601-bdd3-2c7d825f6c96-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"748862b5-56ab-4601-bdd3-2c7d825f6c96\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:13:15 crc kubenswrapper[4962]: I1003 14:13:15.927235 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/748862b5-56ab-4601-bdd3-2c7d825f6c96-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"748862b5-56ab-4601-bdd3-2c7d825f6c96\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:13:16 crc kubenswrapper[4962]: I1003 14:13:16.028702 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/748862b5-56ab-4601-bdd3-2c7d825f6c96-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"748862b5-56ab-4601-bdd3-2c7d825f6c96\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:13:16 crc kubenswrapper[4962]: I1003 14:13:16.028762 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/748862b5-56ab-4601-bdd3-2c7d825f6c96-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"748862b5-56ab-4601-bdd3-2c7d825f6c96\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:13:16 crc kubenswrapper[4962]: I1003 14:13:16.028791 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/748862b5-56ab-4601-bdd3-2c7d825f6c96-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"748862b5-56ab-4601-bdd3-2c7d825f6c96\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:13:16 crc kubenswrapper[4962]: I1003 14:13:16.028843 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/748862b5-56ab-4601-bdd3-2c7d825f6c96-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"748862b5-56ab-4601-bdd3-2c7d825f6c96\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:13:16 crc kubenswrapper[4962]: I1003 14:13:16.028865 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/748862b5-56ab-4601-bdd3-2c7d825f6c96-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"748862b5-56ab-4601-bdd3-2c7d825f6c96\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:13:16 crc kubenswrapper[4962]: I1003 14:13:16.028897 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f1882988-f9b6-4eda-96c1-fccccad7a728\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f1882988-f9b6-4eda-96c1-fccccad7a728\") pod \"rabbitmq-cell1-server-0\" (UID: \"748862b5-56ab-4601-bdd3-2c7d825f6c96\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:13:16 crc kubenswrapper[4962]: I1003 14:13:16.028917 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tgrq\" (UniqueName: \"kubernetes.io/projected/748862b5-56ab-4601-bdd3-2c7d825f6c96-kube-api-access-2tgrq\") pod \"rabbitmq-cell1-server-0\" (UID: \"748862b5-56ab-4601-bdd3-2c7d825f6c96\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:13:16 crc kubenswrapper[4962]: I1003 14:13:16.028940 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/748862b5-56ab-4601-bdd3-2c7d825f6c96-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"748862b5-56ab-4601-bdd3-2c7d825f6c96\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:13:16 crc kubenswrapper[4962]: I1003 14:13:16.028959 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/748862b5-56ab-4601-bdd3-2c7d825f6c96-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"748862b5-56ab-4601-bdd3-2c7d825f6c96\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:13:16 crc kubenswrapper[4962]: I1003 14:13:16.029816 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/748862b5-56ab-4601-bdd3-2c7d825f6c96-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"748862b5-56ab-4601-bdd3-2c7d825f6c96\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:13:16 crc kubenswrapper[4962]: I1003 14:13:16.030194 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/748862b5-56ab-4601-bdd3-2c7d825f6c96-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"748862b5-56ab-4601-bdd3-2c7d825f6c96\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:13:16 crc kubenswrapper[4962]: I1003 14:13:16.030238 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/748862b5-56ab-4601-bdd3-2c7d825f6c96-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"748862b5-56ab-4601-bdd3-2c7d825f6c96\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:13:16 crc kubenswrapper[4962]: I1003 14:13:16.030602 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/748862b5-56ab-4601-bdd3-2c7d825f6c96-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"748862b5-56ab-4601-bdd3-2c7d825f6c96\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:13:16 crc kubenswrapper[4962]: I1003 14:13:16.033491 4962 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Oct 03 14:13:16 crc kubenswrapper[4962]: I1003 14:13:16.033535 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f1882988-f9b6-4eda-96c1-fccccad7a728\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f1882988-f9b6-4eda-96c1-fccccad7a728\") pod \"rabbitmq-cell1-server-0\" (UID: \"748862b5-56ab-4601-bdd3-2c7d825f6c96\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/fb3c2bf5c2662e5352ab1cd205a1cf9337f3211f558b7a9f8e9b13225859cd04/globalmount\"" pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:13:16 crc kubenswrapper[4962]: I1003 14:13:16.034148 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/748862b5-56ab-4601-bdd3-2c7d825f6c96-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"748862b5-56ab-4601-bdd3-2c7d825f6c96\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:13:16 crc kubenswrapper[4962]: I1003 14:13:16.036921 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/748862b5-56ab-4601-bdd3-2c7d825f6c96-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"748862b5-56ab-4601-bdd3-2c7d825f6c96\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:13:16 crc kubenswrapper[4962]: I1003 14:13:16.037087 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/748862b5-56ab-4601-bdd3-2c7d825f6c96-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"748862b5-56ab-4601-bdd3-2c7d825f6c96\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:13:16 crc kubenswrapper[4962]: I1003 14:13:16.049732 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tgrq\" (UniqueName: \"kubernetes.io/projected/748862b5-56ab-4601-bdd3-2c7d825f6c96-kube-api-access-2tgrq\") pod \"rabbitmq-cell1-server-0\" (UID: \"748862b5-56ab-4601-bdd3-2c7d825f6c96\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:13:16 crc kubenswrapper[4962]: I1003 14:13:16.073931 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f1882988-f9b6-4eda-96c1-fccccad7a728\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f1882988-f9b6-4eda-96c1-fccccad7a728\") pod \"rabbitmq-cell1-server-0\" (UID: \"748862b5-56ab-4601-bdd3-2c7d825f6c96\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:13:16 crc kubenswrapper[4962]: I1003 14:13:16.089531 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-8wg2h"
Oct 03 14:13:16 crc kubenswrapper[4962]: I1003 14:13:16.091970 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:13:16 crc kubenswrapper[4962]: I1003 14:13:16.231135 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84-dns-svc\") pod \"8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84\" (UID: \"8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84\") "
Oct 03 14:13:16 crc kubenswrapper[4962]: I1003 14:13:16.231217 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcdhc\" (UniqueName: \"kubernetes.io/projected/8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84-kube-api-access-hcdhc\") pod \"8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84\" (UID: \"8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84\") "
Oct 03 14:13:16 crc kubenswrapper[4962]: I1003 14:13:16.231511 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84-config\") pod \"8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84\" (UID: \"8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84\") "
Oct 03 14:13:16 crc kubenswrapper[4962]: I1003 14:13:16.234943 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84-kube-api-access-hcdhc" (OuterVolumeSpecName: "kube-api-access-hcdhc") pod "8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84" (UID: "8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84"). InnerVolumeSpecName "kube-api-access-hcdhc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:13:16 crc kubenswrapper[4962]: I1003 14:13:16.238512 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8341a841-5d4e-4824-9b74-255b401ab6e7" path="/var/lib/kubelet/pods/8341a841-5d4e-4824-9b74-255b401ab6e7/volumes"
Oct 03 14:13:16 crc kubenswrapper[4962]: I1003 14:13:16.239887 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6da80ce-3c4d-428f-a4df-de88071a59c9" path="/var/lib/kubelet/pods/a6da80ce-3c4d-428f-a4df-de88071a59c9/volumes"
Oct 03 14:13:16 crc kubenswrapper[4962]: I1003 14:13:16.266712 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84-config" (OuterVolumeSpecName: "config") pod "8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84" (UID: "8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:13:16 crc kubenswrapper[4962]: I1003 14:13:16.267490 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84" (UID: "8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:13:16 crc kubenswrapper[4962]: I1003 14:13:16.333887 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84-config\") on node \"crc\" DevicePath \"\""
Oct 03 14:13:16 crc kubenswrapper[4962]: I1003 14:13:16.333931 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 03 14:13:16 crc kubenswrapper[4962]: I1003 14:13:16.333947 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcdhc\" (UniqueName: \"kubernetes.io/projected/8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84-kube-api-access-hcdhc\") on node \"crc\" DevicePath \"\""
Oct 03 14:13:16 crc kubenswrapper[4962]: I1003 14:13:16.489910 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 03 14:13:16 crc kubenswrapper[4962]: W1003 14:13:16.569728 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod748862b5_56ab_4601_bdd3_2c7d825f6c96.slice/crio-a87ee6488d35dca65e60bcba13f46b34a41c47a054882453a64c360cc13b20ea WatchSource:0}: Error finding container a87ee6488d35dca65e60bcba13f46b34a41c47a054882453a64c360cc13b20ea: Status 404 returned error can't find the container with id a87ee6488d35dca65e60bcba13f46b34a41c47a054882453a64c360cc13b20ea
Oct 03 14:13:16 crc kubenswrapper[4962]: I1003 14:13:16.669962 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"748862b5-56ab-4601-bdd3-2c7d825f6c96","Type":"ContainerStarted","Data":"a87ee6488d35dca65e60bcba13f46b34a41c47a054882453a64c360cc13b20ea"}
Oct 03 14:13:16 crc kubenswrapper[4962]: I1003 14:13:16.671736 4962 generic.go:334] "Generic (PLEG): container finished" podID="8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84" containerID="f4c9068742b32f2bf27d65e6210806f54d38663209e7fb73338fb527643c207a" exitCode=0
Oct 03 14:13:16 crc kubenswrapper[4962]: I1003 14:13:16.671804 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-8wg2h" event={"ID":"8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84","Type":"ContainerDied","Data":"f4c9068742b32f2bf27d65e6210806f54d38663209e7fb73338fb527643c207a"}
Oct 03 14:13:16 crc kubenswrapper[4962]: I1003 14:13:16.671828 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-8wg2h"
Oct 03 14:13:16 crc kubenswrapper[4962]: I1003 14:13:16.671861 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-8wg2h" event={"ID":"8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84","Type":"ContainerDied","Data":"2d40726bd920209642b192cf6875be020f55785705779816a3d14e4444a90c14"}
Oct 03 14:13:16 crc kubenswrapper[4962]: I1003 14:13:16.671884 4962 scope.go:117] "RemoveContainer" containerID="f4c9068742b32f2bf27d65e6210806f54d38663209e7fb73338fb527643c207a"
Oct 03 14:13:16 crc kubenswrapper[4962]: I1003 14:13:16.677290 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9b27973f-0013-497d-8e30-0f94ffb4e651","Type":"ContainerStarted","Data":"c770638ea1c47048e4f70c631d9630912287c4a235442203e6514dada50ddcdf"}
Oct 03 14:13:16 crc kubenswrapper[4962]: I1003 14:13:16.697687 4962 scope.go:117] "RemoveContainer" containerID="26a6e21e82f698a78f8c46ef81e4dc617213fbff4c43481906fe3b35c4725dc2"
Oct 03 14:13:16 crc kubenswrapper[4962]: I1003 14:13:16.712758 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-8wg2h"]
Oct 03 14:13:16 crc kubenswrapper[4962]: I1003 14:13:16.718502 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-8wg2h"]
Oct 03 14:13:16 crc kubenswrapper[4962]: I1003 14:13:16.723795 4962 scope.go:117] "RemoveContainer" containerID="f4c9068742b32f2bf27d65e6210806f54d38663209e7fb73338fb527643c207a"
Oct 03 14:13:16 crc kubenswrapper[4962]: E1003 14:13:16.724242 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4c9068742b32f2bf27d65e6210806f54d38663209e7fb73338fb527643c207a\": container with ID starting with f4c9068742b32f2bf27d65e6210806f54d38663209e7fb73338fb527643c207a not found: ID does not exist" containerID="f4c9068742b32f2bf27d65e6210806f54d38663209e7fb73338fb527643c207a"
Oct 03 14:13:16 crc kubenswrapper[4962]: I1003 14:13:16.724365 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4c9068742b32f2bf27d65e6210806f54d38663209e7fb73338fb527643c207a"} err="failed to get container status \"f4c9068742b32f2bf27d65e6210806f54d38663209e7fb73338fb527643c207a\": rpc error: code = NotFound desc = could not find container \"f4c9068742b32f2bf27d65e6210806f54d38663209e7fb73338fb527643c207a\": container with ID starting with f4c9068742b32f2bf27d65e6210806f54d38663209e7fb73338fb527643c207a not found: ID does not exist"
Oct 03 14:13:16 crc kubenswrapper[4962]: I1003 14:13:16.724462 4962 scope.go:117] "RemoveContainer" containerID="26a6e21e82f698a78f8c46ef81e4dc617213fbff4c43481906fe3b35c4725dc2"
Oct 03 14:13:16 crc kubenswrapper[4962]: E1003 14:13:16.724792 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26a6e21e82f698a78f8c46ef81e4dc617213fbff4c43481906fe3b35c4725dc2\": container with ID starting with 26a6e21e82f698a78f8c46ef81e4dc617213fbff4c43481906fe3b35c4725dc2 not found: ID does not exist" containerID="26a6e21e82f698a78f8c46ef81e4dc617213fbff4c43481906fe3b35c4725dc2"
Oct 03 14:13:16 crc kubenswrapper[4962]: I1003 14:13:16.724909 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26a6e21e82f698a78f8c46ef81e4dc617213fbff4c43481906fe3b35c4725dc2"} err="failed to get container status \"26a6e21e82f698a78f8c46ef81e4dc617213fbff4c43481906fe3b35c4725dc2\": rpc error: code = NotFound desc = could not find container \"26a6e21e82f698a78f8c46ef81e4dc617213fbff4c43481906fe3b35c4725dc2\": container with ID starting with 26a6e21e82f698a78f8c46ef81e4dc617213fbff4c43481906fe3b35c4725dc2 not found: ID does not exist"
Oct 03 14:13:17 crc kubenswrapper[4962]: I1003 14:13:17.685996 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9b27973f-0013-497d-8e30-0f94ffb4e651","Type":"ContainerStarted","Data":"0bb114ee18b17a46d6e16bcf28f3055d9b74d5c8be9e3702869203d5ef5cf758"}
Oct 03 14:13:17 crc kubenswrapper[4962]: I1003 14:13:17.687296 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"748862b5-56ab-4601-bdd3-2c7d825f6c96","Type":"ContainerStarted","Data":"d031c8f3cf62301ec557f8ff38d793de25eeb7e66071f0c2272f928cb1886860"}
Oct 03 14:13:18 crc kubenswrapper[4962]: I1003 14:13:18.234655 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84" path="/var/lib/kubelet/pods/8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84/volumes"
Oct 03 14:13:24 crc kubenswrapper[4962]: I1003 14:13:24.659413 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 14:13:24 crc kubenswrapper[4962]: I1003 14:13:24.660473 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 14:13:49 crc kubenswrapper[4962]: I1003 14:13:49.939423 4962 generic.go:334] "Generic (PLEG): container finished" podID="748862b5-56ab-4601-bdd3-2c7d825f6c96" containerID="d031c8f3cf62301ec557f8ff38d793de25eeb7e66071f0c2272f928cb1886860" exitCode=0
Oct 03 14:13:49 crc kubenswrapper[4962]: I1003 14:13:49.939577 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"748862b5-56ab-4601-bdd3-2c7d825f6c96","Type":"ContainerDied","Data":"d031c8f3cf62301ec557f8ff38d793de25eeb7e66071f0c2272f928cb1886860"}
Oct 03 14:13:49 crc kubenswrapper[4962]: I1003 14:13:49.944573 4962 generic.go:334] "Generic (PLEG): container finished" podID="9b27973f-0013-497d-8e30-0f94ffb4e651" containerID="0bb114ee18b17a46d6e16bcf28f3055d9b74d5c8be9e3702869203d5ef5cf758" exitCode=0
Oct 03 14:13:49 crc kubenswrapper[4962]: I1003 14:13:49.944694 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9b27973f-0013-497d-8e30-0f94ffb4e651","Type":"ContainerDied","Data":"0bb114ee18b17a46d6e16bcf28f3055d9b74d5c8be9e3702869203d5ef5cf758"}
Oct 03 14:13:50 crc kubenswrapper[4962]: I1003 14:13:50.954029 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"748862b5-56ab-4601-bdd3-2c7d825f6c96","Type":"ContainerStarted","Data":"521cc74adc3a1074bbdc208e3a00bf3a6037c889a0c284b22cb238df1427c09c"}
Oct 03 14:13:50 crc kubenswrapper[4962]: I1003 14:13:50.955524 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:13:50 crc kubenswrapper[4962]: I1003 14:13:50.957485 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9b27973f-0013-497d-8e30-0f94ffb4e651","Type":"ContainerStarted","Data":"a28402ce8eed9c35edc33b33f574b80bfbc2ff0f98dee8bb53f1d65da2db731a"}
Oct 03 14:13:50 crc kubenswrapper[4962]: I1003 14:13:50.957655 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Oct 03 14:13:51 crc kubenswrapper[4962]: I1003 14:13:51.006250 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.006201778 podStartE2EDuration="36.006201778s" podCreationTimestamp="2025-10-03 14:13:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:13:50.982239861 +0000 UTC m=+5039.386137696" watchObservedRunningTime="2025-10-03 14:13:51.006201778 +0000 UTC m=+5039.410099613"
Oct 03 14:13:51 crc kubenswrapper[4962]: I1003 14:13:51.007428 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.007421189 podStartE2EDuration="37.007421189s" podCreationTimestamp="2025-10-03 14:13:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:13:51.002766258 +0000 UTC m=+5039.406664113" watchObservedRunningTime="2025-10-03 14:13:51.007421189 +0000 UTC m=+5039.411319024"
Oct 03 14:13:54 crc kubenswrapper[4962]: I1003 14:13:54.660403 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 14:13:54 crc kubenswrapper[4962]: I1003 14:13:54.661503 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 14:14:05 crc kubenswrapper[4962]: I1003 14:14:05.364551 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Oct 03 14:14:06 crc kubenswrapper[4962]: I1003 14:14:06.099523 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:14:12 crc kubenswrapper[4962]: I1003 14:14:12.697527 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1-default"]
Oct 03 14:14:12 crc kubenswrapper[4962]: E1003 14:14:12.698588 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84" containerName="init"
Oct 03 14:14:12 crc kubenswrapper[4962]: I1003 14:14:12.698605 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84" containerName="init"
Oct 03 14:14:12 crc kubenswrapper[4962]: E1003 14:14:12.698654 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84" containerName="dnsmasq-dns"
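Editor's note: the pod_startup_latency_tracker records above carry ordinary Go duration strings (podStartE2EDuration="36.006201778s") plus "m=+…" monotonic-clock offsets, which count seconds since the kubelet process started. A minimal stdlib sketch showing how these values round-trip through time.ParseDuration, using the numbers copied from the rabbitmq-cell1-server-0 record:

```go
// Sketch: parse the duration and monotonic offset printed by kubelet's
// pod startup latency tracker. Pure stdlib; values are taken verbatim
// from the log record above.
package main

import (
	"fmt"
	"log"
	"time"
)

func main() {
	e2e, err := time.ParseDuration("36.006201778s")
	if err != nil {
		log.Fatal(err)
	}
	// "m=+5039.386137696" is Go's monotonic clock reading: seconds since
	// the kubelet process started, so ~84 minutes of uptime here.
	uptime := time.Duration(5039.386137696 * float64(time.Second))
	fmt.Printf("pod start E2E duration: %v (kubelet up %v at observation)\n",
		e2e.Round(time.Millisecond), uptime.Round(time.Second))
}
```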
podUID="8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84" containerName="dnsmasq-dns" Oct 03 14:14:12 crc kubenswrapper[4962]: I1003 14:14:12.698844 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dbd5b71-5a25-4b9b-9d70-21fd1be2bd84" containerName="dnsmasq-dns" Oct 03 14:14:12 crc kubenswrapper[4962]: I1003 14:14:12.699423 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Oct 03 14:14:12 crc kubenswrapper[4962]: I1003 14:14:12.701990 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-jldbx" Oct 03 14:14:12 crc kubenswrapper[4962]: I1003 14:14:12.706937 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Oct 03 14:14:12 crc kubenswrapper[4962]: I1003 14:14:12.895987 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7tnl\" (UniqueName: \"kubernetes.io/projected/1de7604b-340a-4153-9b3e-61235d1fe085-kube-api-access-d7tnl\") pod \"mariadb-client-1-default\" (UID: \"1de7604b-340a-4153-9b3e-61235d1fe085\") " pod="openstack/mariadb-client-1-default" Oct 03 14:14:12 crc kubenswrapper[4962]: I1003 14:14:12.997443 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7tnl\" (UniqueName: \"kubernetes.io/projected/1de7604b-340a-4153-9b3e-61235d1fe085-kube-api-access-d7tnl\") pod \"mariadb-client-1-default\" (UID: \"1de7604b-340a-4153-9b3e-61235d1fe085\") " pod="openstack/mariadb-client-1-default" Oct 03 14:14:13 crc kubenswrapper[4962]: I1003 14:14:13.035131 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7tnl\" (UniqueName: \"kubernetes.io/projected/1de7604b-340a-4153-9b3e-61235d1fe085-kube-api-access-d7tnl\") pod \"mariadb-client-1-default\" (UID: \"1de7604b-340a-4153-9b3e-61235d1fe085\") " pod="openstack/mariadb-client-1-default" Oct 03 14:14:13 crc kubenswrapper[4962]: I1003 14:14:13.322974 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Oct 03 14:14:13 crc kubenswrapper[4962]: I1003 14:14:13.902244 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Oct 03 14:14:14 crc kubenswrapper[4962]: I1003 14:14:14.143895 4962 generic.go:334] "Generic (PLEG): container finished" podID="1de7604b-340a-4153-9b3e-61235d1fe085" containerID="df926cb0d567cee684d7129fdbd9a8927c475803ed36a5e239cf7dca6366f36a" exitCode=0 Oct 03 14:14:14 crc kubenswrapper[4962]: I1003 14:14:14.143965 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"1de7604b-340a-4153-9b3e-61235d1fe085","Type":"ContainerDied","Data":"df926cb0d567cee684d7129fdbd9a8927c475803ed36a5e239cf7dca6366f36a"} Oct 03 14:14:14 crc kubenswrapper[4962]: I1003 14:14:14.143995 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"1de7604b-340a-4153-9b3e-61235d1fe085","Type":"ContainerStarted","Data":"6bca3e1ee98fc812e3f091622531bf1f55708d0beccbd1dc1f2bccc7a88c7c0e"} Oct 03 14:14:16 crc kubenswrapper[4962]: I1003 14:14:16.809577 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1-default" Oct 03 14:14:16 crc kubenswrapper[4962]: I1003 14:14:16.838569 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1-default_1de7604b-340a-4153-9b3e-61235d1fe085/mariadb-client-1-default/0.log" Oct 03 14:14:16 crc kubenswrapper[4962]: I1003 14:14:16.874002 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1-default"] Oct 03 14:14:16 crc kubenswrapper[4962]: I1003 14:14:16.880072 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1-default"] Oct 03 14:14:16 crc kubenswrapper[4962]: I1003 14:14:16.970371 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7tnl\" (UniqueName: \"kubernetes.io/projected/1de7604b-340a-4153-9b3e-61235d1fe085-kube-api-access-d7tnl\") pod \"1de7604b-340a-4153-9b3e-61235d1fe085\" (UID: \"1de7604b-340a-4153-9b3e-61235d1fe085\") " Oct 03 14:14:16 crc kubenswrapper[4962]: I1003 14:14:16.977951 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1de7604b-340a-4153-9b3e-61235d1fe085-kube-api-access-d7tnl" (OuterVolumeSpecName: "kube-api-access-d7tnl") pod "1de7604b-340a-4153-9b3e-61235d1fe085" (UID: "1de7604b-340a-4153-9b3e-61235d1fe085"). InnerVolumeSpecName "kube-api-access-d7tnl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:14:17 crc kubenswrapper[4962]: I1003 14:14:17.072496 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7tnl\" (UniqueName: \"kubernetes.io/projected/1de7604b-340a-4153-9b3e-61235d1fe085-kube-api-access-d7tnl\") on node \"crc\" DevicePath \"\"" Oct 03 14:14:17 crc kubenswrapper[4962]: I1003 14:14:17.173587 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6bca3e1ee98fc812e3f091622531bf1f55708d0beccbd1dc1f2bccc7a88c7c0e" Oct 03 14:14:17 crc kubenswrapper[4962]: I1003 14:14:17.173772 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Oct 03 14:14:17 crc kubenswrapper[4962]: I1003 14:14:17.376979 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2-default"] Oct 03 14:14:17 crc kubenswrapper[4962]: E1003 14:14:17.377532 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1de7604b-340a-4153-9b3e-61235d1fe085" containerName="mariadb-client-1-default" Oct 03 14:14:17 crc kubenswrapper[4962]: I1003 14:14:17.377558 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="1de7604b-340a-4153-9b3e-61235d1fe085" containerName="mariadb-client-1-default" Oct 03 14:14:17 crc kubenswrapper[4962]: I1003 14:14:17.377738 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="1de7604b-340a-4153-9b3e-61235d1fe085" containerName="mariadb-client-1-default" Oct 03 14:14:17 crc kubenswrapper[4962]: I1003 14:14:17.378276 4962 util.go:30] "No sandbox for pod can be found. 
Oct 03 14:14:17 crc kubenswrapper[4962]: I1003 14:14:17.378276 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default"
Oct 03 14:14:17 crc kubenswrapper[4962]: I1003 14:14:17.382252 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-jldbx"
Oct 03 14:14:17 crc kubenswrapper[4962]: I1003 14:14:17.386171 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"]
Oct 03 14:14:17 crc kubenswrapper[4962]: I1003 14:14:17.481684 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4664\" (UniqueName: \"kubernetes.io/projected/5bae3e8d-f9e6-4f5b-83fa-ec09b66613a2-kube-api-access-c4664\") pod \"mariadb-client-2-default\" (UID: \"5bae3e8d-f9e6-4f5b-83fa-ec09b66613a2\") " pod="openstack/mariadb-client-2-default"
Oct 03 14:14:17 crc kubenswrapper[4962]: I1003 14:14:17.583817 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4664\" (UniqueName: \"kubernetes.io/projected/5bae3e8d-f9e6-4f5b-83fa-ec09b66613a2-kube-api-access-c4664\") pod \"mariadb-client-2-default\" (UID: \"5bae3e8d-f9e6-4f5b-83fa-ec09b66613a2\") " pod="openstack/mariadb-client-2-default"
Oct 03 14:14:17 crc kubenswrapper[4962]: I1003 14:14:17.607775 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4664\" (UniqueName: \"kubernetes.io/projected/5bae3e8d-f9e6-4f5b-83fa-ec09b66613a2-kube-api-access-c4664\") pod \"mariadb-client-2-default\" (UID: \"5bae3e8d-f9e6-4f5b-83fa-ec09b66613a2\") " pod="openstack/mariadb-client-2-default"
Oct 03 14:14:17 crc kubenswrapper[4962]: I1003 14:14:17.724264 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default"
Oct 03 14:14:18 crc kubenswrapper[4962]: I1003 14:14:18.244767 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1de7604b-340a-4153-9b3e-61235d1fe085" path="/var/lib/kubelet/pods/1de7604b-340a-4153-9b3e-61235d1fe085/volumes"
Oct 03 14:14:18 crc kubenswrapper[4962]: I1003 14:14:18.245975 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"]
Oct 03 14:14:19 crc kubenswrapper[4962]: I1003 14:14:19.189563 4962 generic.go:334] "Generic (PLEG): container finished" podID="5bae3e8d-f9e6-4f5b-83fa-ec09b66613a2" containerID="b6787005a97e750f7a69212a26ea8171fd15215b2955a70357c0652f5a897ca5" exitCode=0
Oct 03 14:14:19 crc kubenswrapper[4962]: I1003 14:14:19.189620 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"5bae3e8d-f9e6-4f5b-83fa-ec09b66613a2","Type":"ContainerDied","Data":"b6787005a97e750f7a69212a26ea8171fd15215b2955a70357c0652f5a897ca5"}
Oct 03 14:14:19 crc kubenswrapper[4962]: I1003 14:14:19.189706 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"5bae3e8d-f9e6-4f5b-83fa-ec09b66613a2","Type":"ContainerStarted","Data":"4625b69e88033d39649adf99a0c2d3f69f84001d27a59e7bc26ba2be667386c8"}
Oct 03 14:14:20 crc kubenswrapper[4962]: I1003 14:14:20.564695 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default"
Oct 03 14:14:20 crc kubenswrapper[4962]: I1003 14:14:20.602014 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-2-default_5bae3e8d-f9e6-4f5b-83fa-ec09b66613a2/mariadb-client-2-default/0.log"
Oct 03 14:14:20 crc kubenswrapper[4962]: I1003 14:14:20.628151 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2-default"]
Oct 03 14:14:20 crc kubenswrapper[4962]: I1003 14:14:20.631627 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4664\" (UniqueName: \"kubernetes.io/projected/5bae3e8d-f9e6-4f5b-83fa-ec09b66613a2-kube-api-access-c4664\") pod \"5bae3e8d-f9e6-4f5b-83fa-ec09b66613a2\" (UID: \"5bae3e8d-f9e6-4f5b-83fa-ec09b66613a2\") "
Oct 03 14:14:20 crc kubenswrapper[4962]: I1003 14:14:20.632140 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2-default"]
Oct 03 14:14:20 crc kubenswrapper[4962]: I1003 14:14:20.637295 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bae3e8d-f9e6-4f5b-83fa-ec09b66613a2-kube-api-access-c4664" (OuterVolumeSpecName: "kube-api-access-c4664") pod "5bae3e8d-f9e6-4f5b-83fa-ec09b66613a2" (UID: "5bae3e8d-f9e6-4f5b-83fa-ec09b66613a2"). InnerVolumeSpecName "kube-api-access-c4664". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:14:20 crc kubenswrapper[4962]: I1003 14:14:20.733254 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4664\" (UniqueName: \"kubernetes.io/projected/5bae3e8d-f9e6-4f5b-83fa-ec09b66613a2-kube-api-access-c4664\") on node \"crc\" DevicePath \"\""
Oct 03 14:14:21 crc kubenswrapper[4962]: I1003 14:14:21.107114 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1"]
Oct 03 14:14:21 crc kubenswrapper[4962]: E1003 14:14:21.107404 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bae3e8d-f9e6-4f5b-83fa-ec09b66613a2" containerName="mariadb-client-2-default"
Oct 03 14:14:21 crc kubenswrapper[4962]: I1003 14:14:21.107418 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bae3e8d-f9e6-4f5b-83fa-ec09b66613a2" containerName="mariadb-client-2-default"
Oct 03 14:14:21 crc kubenswrapper[4962]: I1003 14:14:21.107563 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bae3e8d-f9e6-4f5b-83fa-ec09b66613a2" containerName="mariadb-client-2-default"
Oct 03 14:14:21 crc kubenswrapper[4962]: I1003 14:14:21.108049 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1"
Oct 03 14:14:21 crc kubenswrapper[4962]: I1003 14:14:21.118420 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"]
Oct 03 14:14:21 crc kubenswrapper[4962]: I1003 14:14:21.139765 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwrfp\" (UniqueName: \"kubernetes.io/projected/2d3f5976-1c2b-48fc-8a80-84c652f2e536-kube-api-access-xwrfp\") pod \"mariadb-client-1\" (UID: \"2d3f5976-1c2b-48fc-8a80-84c652f2e536\") " pod="openstack/mariadb-client-1"
Oct 03 14:14:21 crc kubenswrapper[4962]: I1003 14:14:21.203983 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4625b69e88033d39649adf99a0c2d3f69f84001d27a59e7bc26ba2be667386c8"
Oct 03 14:14:21 crc kubenswrapper[4962]: I1003 14:14:21.204059 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default"
Oct 03 14:14:21 crc kubenswrapper[4962]: I1003 14:14:21.241700 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwrfp\" (UniqueName: \"kubernetes.io/projected/2d3f5976-1c2b-48fc-8a80-84c652f2e536-kube-api-access-xwrfp\") pod \"mariadb-client-1\" (UID: \"2d3f5976-1c2b-48fc-8a80-84c652f2e536\") " pod="openstack/mariadb-client-1"
Oct 03 14:14:21 crc kubenswrapper[4962]: I1003 14:14:21.260830 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwrfp\" (UniqueName: \"kubernetes.io/projected/2d3f5976-1c2b-48fc-8a80-84c652f2e536-kube-api-access-xwrfp\") pod \"mariadb-client-1\" (UID: \"2d3f5976-1c2b-48fc-8a80-84c652f2e536\") " pod="openstack/mariadb-client-1"
Oct 03 14:14:21 crc kubenswrapper[4962]: I1003 14:14:21.426543 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1"
Oct 03 14:14:21 crc kubenswrapper[4962]: I1003 14:14:21.928023 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"]
Oct 03 14:14:22 crc kubenswrapper[4962]: I1003 14:14:22.211441 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"2d3f5976-1c2b-48fc-8a80-84c652f2e536","Type":"ContainerStarted","Data":"ac22f2278ad6869cbcddd0d9ecf9295e3809db7744a4b74c40e27e170103d6a9"}
Oct 03 14:14:22 crc kubenswrapper[4962]: I1003 14:14:22.236800 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bae3e8d-f9e6-4f5b-83fa-ec09b66613a2" path="/var/lib/kubelet/pods/5bae3e8d-f9e6-4f5b-83fa-ec09b66613a2/volumes"
Oct 03 14:14:23 crc kubenswrapper[4962]: I1003 14:14:23.221712 4962 generic.go:334] "Generic (PLEG): container finished" podID="2d3f5976-1c2b-48fc-8a80-84c652f2e536" containerID="b08a8f93526e17aab47e01efa43d0c3e6556618901e5d77bd7871b90c85a0c76" exitCode=0
Oct 03 14:14:23 crc kubenswrapper[4962]: I1003 14:14:23.221751 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"2d3f5976-1c2b-48fc-8a80-84c652f2e536","Type":"ContainerDied","Data":"b08a8f93526e17aab47e01efa43d0c3e6556618901e5d77bd7871b90c85a0c76"}
Oct 03 14:14:24 crc kubenswrapper[4962]: I1003 14:14:24.534284 4962 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/mariadb-client-1" Oct 03 14:14:24 crc kubenswrapper[4962]: I1003 14:14:24.550880 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1_2d3f5976-1c2b-48fc-8a80-84c652f2e536/mariadb-client-1/0.log" Oct 03 14:14:24 crc kubenswrapper[4962]: I1003 14:14:24.577028 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1"] Oct 03 14:14:24 crc kubenswrapper[4962]: I1003 14:14:24.581652 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1"] Oct 03 14:14:24 crc kubenswrapper[4962]: I1003 14:14:24.660194 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:14:24 crc kubenswrapper[4962]: I1003 14:14:24.660249 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:14:24 crc kubenswrapper[4962]: I1003 14:14:24.660321 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-46vck" Oct 03 14:14:24 crc kubenswrapper[4962]: I1003 14:14:24.660951 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a52df5deff87c27e42ab4abdf36cdf07c37d1bccfe3bbd74005bdf3b8177d5cc"} pod="openshift-machine-config-operator/machine-config-daemon-46vck" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 14:14:24 crc kubenswrapper[4962]: I1003 14:14:24.661009 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" containerID="cri-o://a52df5deff87c27e42ab4abdf36cdf07c37d1bccfe3bbd74005bdf3b8177d5cc" gracePeriod=600 Oct 03 14:14:24 crc kubenswrapper[4962]: I1003 14:14:24.693206 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwrfp\" (UniqueName: \"kubernetes.io/projected/2d3f5976-1c2b-48fc-8a80-84c652f2e536-kube-api-access-xwrfp\") pod \"2d3f5976-1c2b-48fc-8a80-84c652f2e536\" (UID: \"2d3f5976-1c2b-48fc-8a80-84c652f2e536\") " Oct 03 14:14:24 crc kubenswrapper[4962]: I1003 14:14:24.698925 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d3f5976-1c2b-48fc-8a80-84c652f2e536-kube-api-access-xwrfp" (OuterVolumeSpecName: "kube-api-access-xwrfp") pod "2d3f5976-1c2b-48fc-8a80-84c652f2e536" (UID: "2d3f5976-1c2b-48fc-8a80-84c652f2e536"). InnerVolumeSpecName "kube-api-access-xwrfp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:14:24 crc kubenswrapper[4962]: E1003 14:14:24.783279 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:14:24 crc kubenswrapper[4962]: I1003 14:14:24.794806 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwrfp\" (UniqueName: \"kubernetes.io/projected/2d3f5976-1c2b-48fc-8a80-84c652f2e536-kube-api-access-xwrfp\") on node \"crc\" DevicePath \"\"" Oct 03 14:14:24 crc kubenswrapper[4962]: I1003 14:14:24.988597 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-4-default"] Oct 03 14:14:24 crc kubenswrapper[4962]: E1003 14:14:24.989322 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d3f5976-1c2b-48fc-8a80-84c652f2e536" containerName="mariadb-client-1" Oct 03 14:14:24 crc kubenswrapper[4962]: I1003 14:14:24.989362 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d3f5976-1c2b-48fc-8a80-84c652f2e536" containerName="mariadb-client-1" Oct 03 14:14:24 crc kubenswrapper[4962]: I1003 14:14:24.989817 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d3f5976-1c2b-48fc-8a80-84c652f2e536" containerName="mariadb-client-1" Oct 03 14:14:24 crc kubenswrapper[4962]: I1003 14:14:24.990889 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Oct 03 14:14:24 crc kubenswrapper[4962]: I1003 14:14:24.994557 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Oct 03 14:14:25 crc kubenswrapper[4962]: I1003 14:14:25.123560 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6cmz\" (UniqueName: \"kubernetes.io/projected/50a99653-ef3e-4772-9764-0da71a6cf797-kube-api-access-d6cmz\") pod \"mariadb-client-4-default\" (UID: \"50a99653-ef3e-4772-9764-0da71a6cf797\") " pod="openstack/mariadb-client-4-default" Oct 03 14:14:25 crc kubenswrapper[4962]: I1003 14:14:25.224999 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6cmz\" (UniqueName: \"kubernetes.io/projected/50a99653-ef3e-4772-9764-0da71a6cf797-kube-api-access-d6cmz\") pod \"mariadb-client-4-default\" (UID: \"50a99653-ef3e-4772-9764-0da71a6cf797\") " pod="openstack/mariadb-client-4-default" Oct 03 14:14:25 crc kubenswrapper[4962]: I1003 14:14:25.236408 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac22f2278ad6869cbcddd0d9ecf9295e3809db7744a4b74c40e27e170103d6a9" Oct 03 14:14:25 crc kubenswrapper[4962]: I1003 14:14:25.236424 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Oct 03 14:14:25 crc kubenswrapper[4962]: I1003 14:14:25.240173 4962 generic.go:334] "Generic (PLEG): container finished" podID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerID="a52df5deff87c27e42ab4abdf36cdf07c37d1bccfe3bbd74005bdf3b8177d5cc" exitCode=0 Oct 03 14:14:25 crc kubenswrapper[4962]: I1003 14:14:25.240215 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerDied","Data":"a52df5deff87c27e42ab4abdf36cdf07c37d1bccfe3bbd74005bdf3b8177d5cc"} Oct 03 14:14:25 crc kubenswrapper[4962]: I1003 14:14:25.240250 4962 scope.go:117] "RemoveContainer" containerID="cbb6e91cb75c7a810a75bc675a8082724b8bbf2a8a48579c538b63a351ed5feb" Oct 03 14:14:25 crc kubenswrapper[4962]: I1003 14:14:25.241389 4962 scope.go:117] "RemoveContainer" containerID="a52df5deff87c27e42ab4abdf36cdf07c37d1bccfe3bbd74005bdf3b8177d5cc" Oct 03 14:14:25 crc kubenswrapper[4962]: E1003 14:14:25.241891 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:14:25 crc kubenswrapper[4962]: I1003 14:14:25.244429 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6cmz\" (UniqueName: \"kubernetes.io/projected/50a99653-ef3e-4772-9764-0da71a6cf797-kube-api-access-d6cmz\") pod \"mariadb-client-4-default\" (UID: \"50a99653-ef3e-4772-9764-0da71a6cf797\") " pod="openstack/mariadb-client-4-default" Oct 03 14:14:25 crc kubenswrapper[4962]: I1003 14:14:25.335266 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Oct 03 14:14:25 crc kubenswrapper[4962]: I1003 14:14:25.585447 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Oct 03 14:14:25 crc kubenswrapper[4962]: W1003 14:14:25.592473 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50a99653_ef3e_4772_9764_0da71a6cf797.slice/crio-ea3ae0c1797c41aa64f0b9d3858967087af2d22e024374e602c5446b41f4fdf7 WatchSource:0}: Error finding container ea3ae0c1797c41aa64f0b9d3858967087af2d22e024374e602c5446b41f4fdf7: Status 404 returned error can't find the container with id ea3ae0c1797c41aa64f0b9d3858967087af2d22e024374e602c5446b41f4fdf7 Oct 03 14:14:26 crc kubenswrapper[4962]: I1003 14:14:26.239512 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d3f5976-1c2b-48fc-8a80-84c652f2e536" path="/var/lib/kubelet/pods/2d3f5976-1c2b-48fc-8a80-84c652f2e536/volumes" Oct 03 14:14:26 crc kubenswrapper[4962]: I1003 14:14:26.250428 4962 generic.go:334] "Generic (PLEG): container finished" podID="50a99653-ef3e-4772-9764-0da71a6cf797" containerID="71bbe27f8b7b8d5459e8bcb705a03cce7d63eb958d4bf59a9f445fde4327c04c" exitCode=0 Oct 03 14:14:26 crc kubenswrapper[4962]: I1003 14:14:26.250540 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"50a99653-ef3e-4772-9764-0da71a6cf797","Type":"ContainerDied","Data":"71bbe27f8b7b8d5459e8bcb705a03cce7d63eb958d4bf59a9f445fde4327c04c"} Oct 03 14:14:26 crc kubenswrapper[4962]: I1003 14:14:26.250809 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"50a99653-ef3e-4772-9764-0da71a6cf797","Type":"ContainerStarted","Data":"ea3ae0c1797c41aa64f0b9d3858967087af2d22e024374e602c5446b41f4fdf7"} Oct 03 14:14:27 crc kubenswrapper[4962]: I1003 14:14:27.636881 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Oct 03 14:14:27 crc kubenswrapper[4962]: I1003 14:14:27.660766 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-4-default_50a99653-ef3e-4772-9764-0da71a6cf797/mariadb-client-4-default/0.log" Oct 03 14:14:27 crc kubenswrapper[4962]: I1003 14:14:27.689215 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-4-default"] Oct 03 14:14:27 crc kubenswrapper[4962]: I1003 14:14:27.694837 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-4-default"] Oct 03 14:14:27 crc kubenswrapper[4962]: I1003 14:14:27.764976 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6cmz\" (UniqueName: \"kubernetes.io/projected/50a99653-ef3e-4772-9764-0da71a6cf797-kube-api-access-d6cmz\") pod \"50a99653-ef3e-4772-9764-0da71a6cf797\" (UID: \"50a99653-ef3e-4772-9764-0da71a6cf797\") " Oct 03 14:14:27 crc kubenswrapper[4962]: I1003 14:14:27.770606 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50a99653-ef3e-4772-9764-0da71a6cf797-kube-api-access-d6cmz" (OuterVolumeSpecName: "kube-api-access-d6cmz") pod "50a99653-ef3e-4772-9764-0da71a6cf797" (UID: "50a99653-ef3e-4772-9764-0da71a6cf797"). InnerVolumeSpecName "kube-api-access-d6cmz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:14:27 crc kubenswrapper[4962]: I1003 14:14:27.866958 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6cmz\" (UniqueName: \"kubernetes.io/projected/50a99653-ef3e-4772-9764-0da71a6cf797-kube-api-access-d6cmz\") on node \"crc\" DevicePath \"\"" Oct 03 14:14:28 crc kubenswrapper[4962]: I1003 14:14:28.236180 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50a99653-ef3e-4772-9764-0da71a6cf797" path="/var/lib/kubelet/pods/50a99653-ef3e-4772-9764-0da71a6cf797/volumes" Oct 03 14:14:28 crc kubenswrapper[4962]: I1003 14:14:28.271180 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Oct 03 14:14:28 crc kubenswrapper[4962]: I1003 14:14:28.271429 4962 scope.go:117] "RemoveContainer" containerID="71bbe27f8b7b8d5459e8bcb705a03cce7d63eb958d4bf59a9f445fde4327c04c" Oct 03 14:14:32 crc kubenswrapper[4962]: I1003 14:14:32.733506 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-5-default"] Oct 03 14:14:32 crc kubenswrapper[4962]: E1003 14:14:32.734184 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50a99653-ef3e-4772-9764-0da71a6cf797" containerName="mariadb-client-4-default" Oct 03 14:14:32 crc kubenswrapper[4962]: I1003 14:14:32.734197 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="50a99653-ef3e-4772-9764-0da71a6cf797" containerName="mariadb-client-4-default" Oct 03 14:14:32 crc kubenswrapper[4962]: I1003 14:14:32.734365 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="50a99653-ef3e-4772-9764-0da71a6cf797" containerName="mariadb-client-4-default" Oct 03 14:14:32 crc kubenswrapper[4962]: I1003 14:14:32.734845 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default" Oct 03 14:14:32 crc kubenswrapper[4962]: I1003 14:14:32.739896 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-jldbx" Oct 03 14:14:32 crc kubenswrapper[4962]: I1003 14:14:32.745041 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Oct 03 14:14:32 crc kubenswrapper[4962]: I1003 14:14:32.836470 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkdqr\" (UniqueName: \"kubernetes.io/projected/343ade40-57a4-4e6f-a727-d1149c7fe222-kube-api-access-tkdqr\") pod \"mariadb-client-5-default\" (UID: \"343ade40-57a4-4e6f-a727-d1149c7fe222\") " pod="openstack/mariadb-client-5-default" Oct 03 14:14:32 crc kubenswrapper[4962]: I1003 14:14:32.938225 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkdqr\" (UniqueName: \"kubernetes.io/projected/343ade40-57a4-4e6f-a727-d1149c7fe222-kube-api-access-tkdqr\") pod \"mariadb-client-5-default\" (UID: \"343ade40-57a4-4e6f-a727-d1149c7fe222\") " pod="openstack/mariadb-client-5-default" Oct 03 14:14:32 crc kubenswrapper[4962]: I1003 14:14:32.967408 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkdqr\" (UniqueName: \"kubernetes.io/projected/343ade40-57a4-4e6f-a727-d1149c7fe222-kube-api-access-tkdqr\") pod \"mariadb-client-5-default\" (UID: \"343ade40-57a4-4e6f-a727-d1149c7fe222\") " pod="openstack/mariadb-client-5-default" Oct 03 14:14:33 crc kubenswrapper[4962]: I1003 14:14:33.057816 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Oct 03 14:14:33 crc kubenswrapper[4962]: I1003 14:14:33.556697 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Oct 03 14:14:34 crc kubenswrapper[4962]: I1003 14:14:34.320570 4962 generic.go:334] "Generic (PLEG): container finished" podID="343ade40-57a4-4e6f-a727-d1149c7fe222" containerID="ed5ce7c1f15939a24c92be67a279dc36845781a637786c77e379e53069a7b2ae" exitCode=0 Oct 03 14:14:34 crc kubenswrapper[4962]: I1003 14:14:34.320623 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"343ade40-57a4-4e6f-a727-d1149c7fe222","Type":"ContainerDied","Data":"ed5ce7c1f15939a24c92be67a279dc36845781a637786c77e379e53069a7b2ae"} Oct 03 14:14:34 crc kubenswrapper[4962]: I1003 14:14:34.320667 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"343ade40-57a4-4e6f-a727-d1149c7fe222","Type":"ContainerStarted","Data":"ba3f7d7b6aec5d4539d2f483716c26a24934aa597f784e30445751ac6f1f6144"} Oct 03 14:14:35 crc kubenswrapper[4962]: I1003 14:14:35.715823 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default" Oct 03 14:14:35 crc kubenswrapper[4962]: I1003 14:14:35.736113 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-5-default_343ade40-57a4-4e6f-a727-d1149c7fe222/mariadb-client-5-default/0.log" Oct 03 14:14:35 crc kubenswrapper[4962]: I1003 14:14:35.765807 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-5-default"] Oct 03 14:14:35 crc kubenswrapper[4962]: I1003 14:14:35.769998 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-5-default"] Oct 03 14:14:35 crc kubenswrapper[4962]: I1003 14:14:35.884391 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkdqr\" (UniqueName: \"kubernetes.io/projected/343ade40-57a4-4e6f-a727-d1149c7fe222-kube-api-access-tkdqr\") pod \"343ade40-57a4-4e6f-a727-d1149c7fe222\" (UID: \"343ade40-57a4-4e6f-a727-d1149c7fe222\") " Oct 03 14:14:35 crc kubenswrapper[4962]: I1003 14:14:35.890092 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/343ade40-57a4-4e6f-a727-d1149c7fe222-kube-api-access-tkdqr" (OuterVolumeSpecName: "kube-api-access-tkdqr") pod "343ade40-57a4-4e6f-a727-d1149c7fe222" (UID: "343ade40-57a4-4e6f-a727-d1149c7fe222"). InnerVolumeSpecName "kube-api-access-tkdqr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:14:35 crc kubenswrapper[4962]: I1003 14:14:35.945503 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-6-default"] Oct 03 14:14:35 crc kubenswrapper[4962]: E1003 14:14:35.945865 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="343ade40-57a4-4e6f-a727-d1149c7fe222" containerName="mariadb-client-5-default" Oct 03 14:14:35 crc kubenswrapper[4962]: I1003 14:14:35.945889 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="343ade40-57a4-4e6f-a727-d1149c7fe222" containerName="mariadb-client-5-default" Oct 03 14:14:35 crc kubenswrapper[4962]: I1003 14:14:35.946062 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="343ade40-57a4-4e6f-a727-d1149c7fe222" containerName="mariadb-client-5-default" Oct 03 14:14:35 crc kubenswrapper[4962]: I1003 14:14:35.946593 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Oct 03 14:14:35 crc kubenswrapper[4962]: I1003 14:14:35.953433 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Oct 03 14:14:35 crc kubenswrapper[4962]: I1003 14:14:35.985987 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkdqr\" (UniqueName: \"kubernetes.io/projected/343ade40-57a4-4e6f-a727-d1149c7fe222-kube-api-access-tkdqr\") on node \"crc\" DevicePath \"\"" Oct 03 14:14:36 crc kubenswrapper[4962]: I1003 14:14:36.087690 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqgwp\" (UniqueName: \"kubernetes.io/projected/49d0a73c-b8dd-4c9f-8b6e-52c3e5de6d9a-kube-api-access-dqgwp\") pod \"mariadb-client-6-default\" (UID: \"49d0a73c-b8dd-4c9f-8b6e-52c3e5de6d9a\") " pod="openstack/mariadb-client-6-default" Oct 03 14:14:36 crc kubenswrapper[4962]: I1003 14:14:36.188913 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqgwp\" (UniqueName: \"kubernetes.io/projected/49d0a73c-b8dd-4c9f-8b6e-52c3e5de6d9a-kube-api-access-dqgwp\") pod \"mariadb-client-6-default\" (UID: \"49d0a73c-b8dd-4c9f-8b6e-52c3e5de6d9a\") " pod="openstack/mariadb-client-6-default" Oct 03 14:14:36 crc kubenswrapper[4962]: I1003 14:14:36.209577 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqgwp\" (UniqueName: \"kubernetes.io/projected/49d0a73c-b8dd-4c9f-8b6e-52c3e5de6d9a-kube-api-access-dqgwp\") pod \"mariadb-client-6-default\" (UID: \"49d0a73c-b8dd-4c9f-8b6e-52c3e5de6d9a\") " pod="openstack/mariadb-client-6-default" Oct 03 14:14:36 crc kubenswrapper[4962]: I1003 14:14:36.238257 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="343ade40-57a4-4e6f-a727-d1149c7fe222" path="/var/lib/kubelet/pods/343ade40-57a4-4e6f-a727-d1149c7fe222/volumes" Oct 03 14:14:36 crc kubenswrapper[4962]: I1003 14:14:36.274492 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Oct 03 14:14:36 crc kubenswrapper[4962]: I1003 14:14:36.341182 4962 scope.go:117] "RemoveContainer" containerID="ed5ce7c1f15939a24c92be67a279dc36845781a637786c77e379e53069a7b2ae" Oct 03 14:14:36 crc kubenswrapper[4962]: I1003 14:14:36.341282 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Oct 03 14:14:36 crc kubenswrapper[4962]: I1003 14:14:36.547024 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Oct 03 14:14:36 crc kubenswrapper[4962]: W1003 14:14:36.552821 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49d0a73c_b8dd_4c9f_8b6e_52c3e5de6d9a.slice/crio-f5f4db3bc1ee24631d7e4e0b735606be513c2b95c5c05ecfd0a62ea47f0a78a0 WatchSource:0}: Error finding container f5f4db3bc1ee24631d7e4e0b735606be513c2b95c5c05ecfd0a62ea47f0a78a0: Status 404 returned error can't find the container with id f5f4db3bc1ee24631d7e4e0b735606be513c2b95c5c05ecfd0a62ea47f0a78a0 Oct 03 14:14:37 crc kubenswrapper[4962]: I1003 14:14:37.352736 4962 generic.go:334] "Generic (PLEG): container finished" podID="49d0a73c-b8dd-4c9f-8b6e-52c3e5de6d9a" containerID="3761e0efd9c845d897100c81fefe7c049bf5a5027429c42e04585b63b5a307c9" exitCode=0 Oct 03 14:14:37 crc kubenswrapper[4962]: I1003 14:14:37.352931 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"49d0a73c-b8dd-4c9f-8b6e-52c3e5de6d9a","Type":"ContainerDied","Data":"3761e0efd9c845d897100c81fefe7c049bf5a5027429c42e04585b63b5a307c9"} Oct 03 14:14:37 crc kubenswrapper[4962]: I1003 14:14:37.354756 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"49d0a73c-b8dd-4c9f-8b6e-52c3e5de6d9a","Type":"ContainerStarted","Data":"f5f4db3bc1ee24631d7e4e0b735606be513c2b95c5c05ecfd0a62ea47f0a78a0"} Oct 03 14:14:38 crc kubenswrapper[4962]: I1003 14:14:38.730512 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Oct 03 14:14:38 crc kubenswrapper[4962]: I1003 14:14:38.776519 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-6-default_49d0a73c-b8dd-4c9f-8b6e-52c3e5de6d9a/mariadb-client-6-default/0.log" Oct 03 14:14:38 crc kubenswrapper[4962]: I1003 14:14:38.806477 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-6-default"] Oct 03 14:14:38 crc kubenswrapper[4962]: I1003 14:14:38.811656 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-6-default"] Oct 03 14:14:38 crc kubenswrapper[4962]: I1003 14:14:38.830625 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqgwp\" (UniqueName: \"kubernetes.io/projected/49d0a73c-b8dd-4c9f-8b6e-52c3e5de6d9a-kube-api-access-dqgwp\") pod \"49d0a73c-b8dd-4c9f-8b6e-52c3e5de6d9a\" (UID: \"49d0a73c-b8dd-4c9f-8b6e-52c3e5de6d9a\") " Oct 03 14:14:38 crc kubenswrapper[4962]: I1003 14:14:38.835880 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49d0a73c-b8dd-4c9f-8b6e-52c3e5de6d9a-kube-api-access-dqgwp" (OuterVolumeSpecName: "kube-api-access-dqgwp") pod "49d0a73c-b8dd-4c9f-8b6e-52c3e5de6d9a" (UID: "49d0a73c-b8dd-4c9f-8b6e-52c3e5de6d9a"). InnerVolumeSpecName "kube-api-access-dqgwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:14:38 crc kubenswrapper[4962]: I1003 14:14:38.932912 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqgwp\" (UniqueName: \"kubernetes.io/projected/49d0a73c-b8dd-4c9f-8b6e-52c3e5de6d9a-kube-api-access-dqgwp\") on node \"crc\" DevicePath \"\"" Oct 03 14:14:39 crc kubenswrapper[4962]: I1003 14:14:38.986659 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-7-default"] Oct 03 14:14:39 crc kubenswrapper[4962]: E1003 14:14:38.987407 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49d0a73c-b8dd-4c9f-8b6e-52c3e5de6d9a" containerName="mariadb-client-6-default" Oct 03 14:14:39 crc kubenswrapper[4962]: I1003 14:14:38.987423 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="49d0a73c-b8dd-4c9f-8b6e-52c3e5de6d9a" containerName="mariadb-client-6-default" Oct 03 14:14:39 crc kubenswrapper[4962]: I1003 14:14:38.987594 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="49d0a73c-b8dd-4c9f-8b6e-52c3e5de6d9a" containerName="mariadb-client-6-default" Oct 03 14:14:39 crc kubenswrapper[4962]: I1003 14:14:38.988099 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default" Oct 03 14:14:39 crc kubenswrapper[4962]: I1003 14:14:39.031084 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Oct 03 14:14:39 crc kubenswrapper[4962]: I1003 14:14:39.134972 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll85j\" (UniqueName: \"kubernetes.io/projected/b40f19ff-c8c5-4f6c-9c4e-970d349eaf1b-kube-api-access-ll85j\") pod \"mariadb-client-7-default\" (UID: \"b40f19ff-c8c5-4f6c-9c4e-970d349eaf1b\") " pod="openstack/mariadb-client-7-default" Oct 03 14:14:39 crc kubenswrapper[4962]: I1003 14:14:39.242022 4962 scope.go:117] "RemoveContainer" containerID="a52df5deff87c27e42ab4abdf36cdf07c37d1bccfe3bbd74005bdf3b8177d5cc" Oct 03 14:14:39 crc kubenswrapper[4962]: E1003 14:14:39.243334 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:14:39 crc kubenswrapper[4962]: I1003 14:14:39.243400 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll85j\" (UniqueName: \"kubernetes.io/projected/b40f19ff-c8c5-4f6c-9c4e-970d349eaf1b-kube-api-access-ll85j\") pod \"mariadb-client-7-default\" (UID: \"b40f19ff-c8c5-4f6c-9c4e-970d349eaf1b\") " pod="openstack/mariadb-client-7-default" Oct 03 14:14:39 crc kubenswrapper[4962]: I1003 14:14:39.265809 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll85j\" (UniqueName: \"kubernetes.io/projected/b40f19ff-c8c5-4f6c-9c4e-970d349eaf1b-kube-api-access-ll85j\") pod \"mariadb-client-7-default\" (UID: \"b40f19ff-c8c5-4f6c-9c4e-970d349eaf1b\") " pod="openstack/mariadb-client-7-default" Oct 03 14:14:39 crc kubenswrapper[4962]: I1003 14:14:39.340664 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Oct 03 14:14:39 crc kubenswrapper[4962]: I1003 14:14:39.390122 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5f4db3bc1ee24631d7e4e0b735606be513c2b95c5c05ecfd0a62ea47f0a78a0" Oct 03 14:14:39 crc kubenswrapper[4962]: I1003 14:14:39.390214 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Oct 03 14:14:39 crc kubenswrapper[4962]: I1003 14:14:39.648445 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Oct 03 14:14:39 crc kubenswrapper[4962]: W1003 14:14:39.652801 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb40f19ff_c8c5_4f6c_9c4e_970d349eaf1b.slice/crio-1668a248a77a6c358d4c05f6c748647450d7fbb311a16c64be81d6a1a9d59959 WatchSource:0}: Error finding container 1668a248a77a6c358d4c05f6c748647450d7fbb311a16c64be81d6a1a9d59959: Status 404 returned error can't find the container with id 1668a248a77a6c358d4c05f6c748647450d7fbb311a16c64be81d6a1a9d59959 Oct 03 14:14:40 crc kubenswrapper[4962]: I1003 14:14:40.239334 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49d0a73c-b8dd-4c9f-8b6e-52c3e5de6d9a" path="/var/lib/kubelet/pods/49d0a73c-b8dd-4c9f-8b6e-52c3e5de6d9a/volumes" Oct 03 14:14:40 crc kubenswrapper[4962]: I1003 14:14:40.398543 4962 generic.go:334] "Generic (PLEG): container finished" podID="b40f19ff-c8c5-4f6c-9c4e-970d349eaf1b" containerID="004582271d3ae61986423d64e963fa463291230dad5784459505f5a6260401b2" exitCode=0 Oct 03 14:14:40 crc kubenswrapper[4962]: I1003 14:14:40.398629 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"b40f19ff-c8c5-4f6c-9c4e-970d349eaf1b","Type":"ContainerDied","Data":"004582271d3ae61986423d64e963fa463291230dad5784459505f5a6260401b2"} Oct 03 14:14:40 crc kubenswrapper[4962]: I1003 14:14:40.398675 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"b40f19ff-c8c5-4f6c-9c4e-970d349eaf1b","Type":"ContainerStarted","Data":"1668a248a77a6c358d4c05f6c748647450d7fbb311a16c64be81d6a1a9d59959"} Oct 03 14:14:41 crc kubenswrapper[4962]: I1003 14:14:41.728763 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Oct 03 14:14:41 crc kubenswrapper[4962]: I1003 14:14:41.748326 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-7-default_b40f19ff-c8c5-4f6c-9c4e-970d349eaf1b/mariadb-client-7-default/0.log" Oct 03 14:14:41 crc kubenswrapper[4962]: I1003 14:14:41.773942 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-7-default"] Oct 03 14:14:41 crc kubenswrapper[4962]: I1003 14:14:41.779178 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-7-default"] Oct 03 14:14:41 crc kubenswrapper[4962]: I1003 14:14:41.889109 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ll85j\" (UniqueName: \"kubernetes.io/projected/b40f19ff-c8c5-4f6c-9c4e-970d349eaf1b-kube-api-access-ll85j\") pod \"b40f19ff-c8c5-4f6c-9c4e-970d349eaf1b\" (UID: \"b40f19ff-c8c5-4f6c-9c4e-970d349eaf1b\") " Oct 03 14:14:41 crc kubenswrapper[4962]: I1003 14:14:41.894886 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b40f19ff-c8c5-4f6c-9c4e-970d349eaf1b-kube-api-access-ll85j" (OuterVolumeSpecName: "kube-api-access-ll85j") pod "b40f19ff-c8c5-4f6c-9c4e-970d349eaf1b" (UID: "b40f19ff-c8c5-4f6c-9c4e-970d349eaf1b"). InnerVolumeSpecName "kube-api-access-ll85j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:14:41 crc kubenswrapper[4962]: I1003 14:14:41.960137 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2"] Oct 03 14:14:41 crc kubenswrapper[4962]: E1003 14:14:41.960496 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b40f19ff-c8c5-4f6c-9c4e-970d349eaf1b" containerName="mariadb-client-7-default" Oct 03 14:14:41 crc kubenswrapper[4962]: I1003 14:14:41.960512 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b40f19ff-c8c5-4f6c-9c4e-970d349eaf1b" containerName="mariadb-client-7-default" Oct 03 14:14:41 crc kubenswrapper[4962]: I1003 14:14:41.960728 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b40f19ff-c8c5-4f6c-9c4e-970d349eaf1b" containerName="mariadb-client-7-default" Oct 03 14:14:41 crc kubenswrapper[4962]: I1003 14:14:41.961414 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2" Oct 03 14:14:41 crc kubenswrapper[4962]: I1003 14:14:41.967923 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Oct 03 14:14:41 crc kubenswrapper[4962]: I1003 14:14:41.990783 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ll85j\" (UniqueName: \"kubernetes.io/projected/b40f19ff-c8c5-4f6c-9c4e-970d349eaf1b-kube-api-access-ll85j\") on node \"crc\" DevicePath \"\"" Oct 03 14:14:42 crc kubenswrapper[4962]: I1003 14:14:42.092146 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvsb7\" (UniqueName: \"kubernetes.io/projected/543ffb88-996f-4b26-8a73-53a81ada6746-kube-api-access-lvsb7\") pod \"mariadb-client-2\" (UID: \"543ffb88-996f-4b26-8a73-53a81ada6746\") " pod="openstack/mariadb-client-2" Oct 03 14:14:42 crc kubenswrapper[4962]: I1003 14:14:42.193881 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvsb7\" (UniqueName: \"kubernetes.io/projected/543ffb88-996f-4b26-8a73-53a81ada6746-kube-api-access-lvsb7\") pod \"mariadb-client-2\" (UID: \"543ffb88-996f-4b26-8a73-53a81ada6746\") " pod="openstack/mariadb-client-2" Oct 03 14:14:42 crc kubenswrapper[4962]: I1003 14:14:42.210573 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvsb7\" (UniqueName: \"kubernetes.io/projected/543ffb88-996f-4b26-8a73-53a81ada6746-kube-api-access-lvsb7\") pod \"mariadb-client-2\" (UID: \"543ffb88-996f-4b26-8a73-53a81ada6746\") " pod="openstack/mariadb-client-2" Oct 03 14:14:42 crc kubenswrapper[4962]: I1003 14:14:42.237385 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b40f19ff-c8c5-4f6c-9c4e-970d349eaf1b" path="/var/lib/kubelet/pods/b40f19ff-c8c5-4f6c-9c4e-970d349eaf1b/volumes" Oct 03 14:14:42 crc kubenswrapper[4962]: I1003 14:14:42.286803 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Oct 03 14:14:42 crc kubenswrapper[4962]: I1003 14:14:42.434757 4962 scope.go:117] "RemoveContainer" containerID="004582271d3ae61986423d64e963fa463291230dad5784459505f5a6260401b2" Oct 03 14:14:42 crc kubenswrapper[4962]: I1003 14:14:42.434814 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Oct 03 14:14:42 crc kubenswrapper[4962]: I1003 14:14:42.833847 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Oct 03 14:14:43 crc kubenswrapper[4962]: I1003 14:14:43.442995 4962 generic.go:334] "Generic (PLEG): container finished" podID="543ffb88-996f-4b26-8a73-53a81ada6746" containerID="34d9749e8d5764e8be5c836d09d86b0f52d5e5ef5899e8f79cdcf248d3592323" exitCode=0 Oct 03 14:14:43 crc kubenswrapper[4962]: I1003 14:14:43.443068 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"543ffb88-996f-4b26-8a73-53a81ada6746","Type":"ContainerDied","Data":"34d9749e8d5764e8be5c836d09d86b0f52d5e5ef5899e8f79cdcf248d3592323"} Oct 03 14:14:43 crc kubenswrapper[4962]: I1003 14:14:43.443353 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"543ffb88-996f-4b26-8a73-53a81ada6746","Type":"ContainerStarted","Data":"307b4616602c94d44eb5f2b597aa3628067a2829d16019edc3fd9907e0cef14f"} Oct 03 14:14:44 crc kubenswrapper[4962]: I1003 14:14:44.863277 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Oct 03 14:14:44 crc kubenswrapper[4962]: I1003 14:14:44.881459 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-2_543ffb88-996f-4b26-8a73-53a81ada6746/mariadb-client-2/0.log" Oct 03 14:14:44 crc kubenswrapper[4962]: I1003 14:14:44.914052 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2"] Oct 03 14:14:44 crc kubenswrapper[4962]: I1003 14:14:44.918680 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2"] Oct 03 14:14:45 crc kubenswrapper[4962]: I1003 14:14:45.033134 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvsb7\" (UniqueName: \"kubernetes.io/projected/543ffb88-996f-4b26-8a73-53a81ada6746-kube-api-access-lvsb7\") pod \"543ffb88-996f-4b26-8a73-53a81ada6746\" (UID: \"543ffb88-996f-4b26-8a73-53a81ada6746\") " Oct 03 14:14:45 crc kubenswrapper[4962]: I1003 14:14:45.041260 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/543ffb88-996f-4b26-8a73-53a81ada6746-kube-api-access-lvsb7" (OuterVolumeSpecName: "kube-api-access-lvsb7") pod "543ffb88-996f-4b26-8a73-53a81ada6746" (UID: "543ffb88-996f-4b26-8a73-53a81ada6746"). InnerVolumeSpecName "kube-api-access-lvsb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:14:45 crc kubenswrapper[4962]: I1003 14:14:45.134441 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvsb7\" (UniqueName: \"kubernetes.io/projected/543ffb88-996f-4b26-8a73-53a81ada6746-kube-api-access-lvsb7\") on node \"crc\" DevicePath \"\"" Oct 03 14:14:45 crc kubenswrapper[4962]: I1003 14:14:45.459832 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="307b4616602c94d44eb5f2b597aa3628067a2829d16019edc3fd9907e0cef14f" Oct 03 14:14:45 crc kubenswrapper[4962]: I1003 14:14:45.459916 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2" Oct 03 14:14:46 crc kubenswrapper[4962]: I1003 14:14:46.243989 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="543ffb88-996f-4b26-8a73-53a81ada6746" path="/var/lib/kubelet/pods/543ffb88-996f-4b26-8a73-53a81ada6746/volumes" Oct 03 14:14:50 crc kubenswrapper[4962]: I1003 14:14:50.228339 4962 scope.go:117] "RemoveContainer" containerID="a52df5deff87c27e42ab4abdf36cdf07c37d1bccfe3bbd74005bdf3b8177d5cc" Oct 03 14:14:50 crc kubenswrapper[4962]: E1003 14:14:50.229159 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:14:53 crc kubenswrapper[4962]: I1003 14:14:53.133211 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wgtd8"] Oct 03 14:14:53 crc kubenswrapper[4962]: E1003 14:14:53.135108 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="543ffb88-996f-4b26-8a73-53a81ada6746" containerName="mariadb-client-2" Oct 03 14:14:53 crc kubenswrapper[4962]: I1003 14:14:53.135178 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="543ffb88-996f-4b26-8a73-53a81ada6746" containerName="mariadb-client-2" Oct 03 14:14:53 crc kubenswrapper[4962]: I1003 14:14:53.135376 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="543ffb88-996f-4b26-8a73-53a81ada6746" containerName="mariadb-client-2" Oct 03 14:14:53 crc kubenswrapper[4962]: I1003 14:14:53.136543 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wgtd8" Oct 03 14:14:53 crc kubenswrapper[4962]: I1003 14:14:53.160187 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wgtd8"] Oct 03 14:14:53 crc kubenswrapper[4962]: I1003 14:14:53.268935 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7qt7\" (UniqueName: \"kubernetes.io/projected/0873f3b1-04c3-4bb9-9634-6b9b69f5af36-kube-api-access-l7qt7\") pod \"community-operators-wgtd8\" (UID: \"0873f3b1-04c3-4bb9-9634-6b9b69f5af36\") " pod="openshift-marketplace/community-operators-wgtd8" Oct 03 14:14:53 crc kubenswrapper[4962]: I1003 14:14:53.269239 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0873f3b1-04c3-4bb9-9634-6b9b69f5af36-catalog-content\") pod \"community-operators-wgtd8\" (UID: \"0873f3b1-04c3-4bb9-9634-6b9b69f5af36\") " pod="openshift-marketplace/community-operators-wgtd8" Oct 03 14:14:53 crc kubenswrapper[4962]: I1003 14:14:53.269267 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0873f3b1-04c3-4bb9-9634-6b9b69f5af36-utilities\") pod \"community-operators-wgtd8\" (UID: \"0873f3b1-04c3-4bb9-9634-6b9b69f5af36\") " pod="openshift-marketplace/community-operators-wgtd8" Oct 03 14:14:53 crc kubenswrapper[4962]: I1003 14:14:53.370897 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0873f3b1-04c3-4bb9-9634-6b9b69f5af36-utilities\") pod \"community-operators-wgtd8\" (UID: \"0873f3b1-04c3-4bb9-9634-6b9b69f5af36\") " pod="openshift-marketplace/community-operators-wgtd8" Oct 03 14:14:53 crc kubenswrapper[4962]: I1003 14:14:53.371070 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7qt7\" (UniqueName: \"kubernetes.io/projected/0873f3b1-04c3-4bb9-9634-6b9b69f5af36-kube-api-access-l7qt7\") pod \"community-operators-wgtd8\" (UID: \"0873f3b1-04c3-4bb9-9634-6b9b69f5af36\") " pod="openshift-marketplace/community-operators-wgtd8" Oct 03 14:14:53 crc kubenswrapper[4962]: I1003 14:14:53.371116 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0873f3b1-04c3-4bb9-9634-6b9b69f5af36-catalog-content\") pod \"community-operators-wgtd8\" (UID: \"0873f3b1-04c3-4bb9-9634-6b9b69f5af36\") " pod="openshift-marketplace/community-operators-wgtd8" Oct 03 14:14:53 crc kubenswrapper[4962]: I1003 14:14:53.371535 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0873f3b1-04c3-4bb9-9634-6b9b69f5af36-utilities\") pod \"community-operators-wgtd8\" (UID: \"0873f3b1-04c3-4bb9-9634-6b9b69f5af36\") " pod="openshift-marketplace/community-operators-wgtd8" Oct 03 14:14:53 crc kubenswrapper[4962]: I1003 14:14:53.371560 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0873f3b1-04c3-4bb9-9634-6b9b69f5af36-catalog-content\") pod \"community-operators-wgtd8\" (UID: \"0873f3b1-04c3-4bb9-9634-6b9b69f5af36\") " pod="openshift-marketplace/community-operators-wgtd8" Oct 03 14:14:53 crc kubenswrapper[4962]: I1003 14:14:53.394584 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-l7qt7\" (UniqueName: \"kubernetes.io/projected/0873f3b1-04c3-4bb9-9634-6b9b69f5af36-kube-api-access-l7qt7\") pod \"community-operators-wgtd8\" (UID: \"0873f3b1-04c3-4bb9-9634-6b9b69f5af36\") " pod="openshift-marketplace/community-operators-wgtd8" Oct 03 14:14:53 crc kubenswrapper[4962]: I1003 14:14:53.463605 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wgtd8" Oct 03 14:14:53 crc kubenswrapper[4962]: I1003 14:14:53.934360 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wgtd8"] Oct 03 14:14:53 crc kubenswrapper[4962]: W1003 14:14:53.940917 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0873f3b1_04c3_4bb9_9634_6b9b69f5af36.slice/crio-a7175844baa866a78ac9b3434d9eaa70ada413077c201df2576402cf0d937767 WatchSource:0}: Error finding container a7175844baa866a78ac9b3434d9eaa70ada413077c201df2576402cf0d937767: Status 404 returned error can't find the container with id a7175844baa866a78ac9b3434d9eaa70ada413077c201df2576402cf0d937767 Oct 03 14:14:54 crc kubenswrapper[4962]: I1003 14:14:54.530114 4962 generic.go:334] "Generic (PLEG): container finished" podID="0873f3b1-04c3-4bb9-9634-6b9b69f5af36" containerID="c18e1db3a1048a7978348ecc8ec883e641017c97ef4a2c90d174c71f3a5847cf" exitCode=0 Oct 03 14:14:54 crc kubenswrapper[4962]: I1003 14:14:54.530183 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wgtd8" event={"ID":"0873f3b1-04c3-4bb9-9634-6b9b69f5af36","Type":"ContainerDied","Data":"c18e1db3a1048a7978348ecc8ec883e641017c97ef4a2c90d174c71f3a5847cf"} Oct 03 14:14:54 crc kubenswrapper[4962]: I1003 14:14:54.530473 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wgtd8" event={"ID":"0873f3b1-04c3-4bb9-9634-6b9b69f5af36","Type":"ContainerStarted","Data":"a7175844baa866a78ac9b3434d9eaa70ada413077c201df2576402cf0d937767"} Oct 03 14:14:54 crc kubenswrapper[4962]: I1003 14:14:54.532539 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 14:14:55 crc kubenswrapper[4962]: I1003 14:14:55.540141 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wgtd8" event={"ID":"0873f3b1-04c3-4bb9-9634-6b9b69f5af36","Type":"ContainerStarted","Data":"001781bdc5d40f09d7a32c4cc05fa631c48634c9cba6cb50c4eefe052f551afa"} Oct 03 14:14:55 crc kubenswrapper[4962]: E1003 14:14:55.856934 4962 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0873f3b1_04c3_4bb9_9634_6b9b69f5af36.slice/crio-001781bdc5d40f09d7a32c4cc05fa631c48634c9cba6cb50c4eefe052f551afa.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0873f3b1_04c3_4bb9_9634_6b9b69f5af36.slice/crio-conmon-001781bdc5d40f09d7a32c4cc05fa631c48634c9cba6cb50c4eefe052f551afa.scope\": RecentStats: unable to find data in memory cache]" Oct 03 14:14:56 crc kubenswrapper[4962]: I1003 14:14:56.553552 4962 generic.go:334] "Generic (PLEG): container finished" podID="0873f3b1-04c3-4bb9-9634-6b9b69f5af36" containerID="001781bdc5d40f09d7a32c4cc05fa631c48634c9cba6cb50c4eefe052f551afa" exitCode=0 Oct 03 14:14:56 crc kubenswrapper[4962]: 
I1003 14:14:56.553614 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wgtd8" event={"ID":"0873f3b1-04c3-4bb9-9634-6b9b69f5af36","Type":"ContainerDied","Data":"001781bdc5d40f09d7a32c4cc05fa631c48634c9cba6cb50c4eefe052f551afa"}
Oct 03 14:14:57 crc kubenswrapper[4962]: I1003 14:14:57.573783 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wgtd8" event={"ID":"0873f3b1-04c3-4bb9-9634-6b9b69f5af36","Type":"ContainerStarted","Data":"33ba5c486db4f86867ae04803abd68fdfdfef17fb14b45d2606bc5006021a244"}
Oct 03 14:14:57 crc kubenswrapper[4962]: I1003 14:14:57.605372 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wgtd8" podStartSLOduration=2.158990835 podStartE2EDuration="4.605348621s" podCreationTimestamp="2025-10-03 14:14:53 +0000 UTC" firstStartedPulling="2025-10-03 14:14:54.532341881 +0000 UTC m=+5102.936239716" lastFinishedPulling="2025-10-03 14:14:56.978699677 +0000 UTC m=+5105.382597502" observedRunningTime="2025-10-03 14:14:57.604061938 +0000 UTC m=+5106.007959803" watchObservedRunningTime="2025-10-03 14:14:57.605348621 +0000 UTC m=+5106.009246466"
Oct 03 14:15:00 crc kubenswrapper[4962]: I1003 14:15:00.151030 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325015-phqfq"]
Oct 03 14:15:00 crc kubenswrapper[4962]: I1003 14:15:00.158172 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325015-phqfq"
Oct 03 14:15:00 crc kubenswrapper[4962]: I1003 14:15:00.160664 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 03 14:15:00 crc kubenswrapper[4962]: I1003 14:15:00.161074 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 03 14:15:00 crc kubenswrapper[4962]: I1003 14:15:00.164013 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325015-phqfq"]
Oct 03 14:15:00 crc kubenswrapper[4962]: I1003 14:15:00.175242 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/808fc01e-573f-4a85-bae4-c1864fb86c1f-config-volume\") pod \"collect-profiles-29325015-phqfq\" (UID: \"808fc01e-573f-4a85-bae4-c1864fb86c1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325015-phqfq"
Oct 03 14:15:00 crc kubenswrapper[4962]: I1003 14:15:00.175383 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/808fc01e-573f-4a85-bae4-c1864fb86c1f-secret-volume\") pod \"collect-profiles-29325015-phqfq\" (UID: \"808fc01e-573f-4a85-bae4-c1864fb86c1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325015-phqfq"
Oct 03 14:15:00 crc kubenswrapper[4962]: I1003 14:15:00.175529 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtxp2\" (UniqueName: \"kubernetes.io/projected/808fc01e-573f-4a85-bae4-c1864fb86c1f-kube-api-access-jtxp2\") pod \"collect-profiles-29325015-phqfq\" (UID: \"808fc01e-573f-4a85-bae4-c1864fb86c1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325015-phqfq"
Oct 03 14:15:00 crc kubenswrapper[4962]: I1003 14:15:00.277430 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtxp2\" (UniqueName: \"kubernetes.io/projected/808fc01e-573f-4a85-bae4-c1864fb86c1f-kube-api-access-jtxp2\") pod \"collect-profiles-29325015-phqfq\" (UID: \"808fc01e-573f-4a85-bae4-c1864fb86c1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325015-phqfq"
Oct 03 14:15:00 crc kubenswrapper[4962]: I1003 14:15:00.277510 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/808fc01e-573f-4a85-bae4-c1864fb86c1f-config-volume\") pod \"collect-profiles-29325015-phqfq\" (UID: \"808fc01e-573f-4a85-bae4-c1864fb86c1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325015-phqfq"
Oct 03 14:15:00 crc kubenswrapper[4962]: I1003 14:15:00.277561 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/808fc01e-573f-4a85-bae4-c1864fb86c1f-secret-volume\") pod \"collect-profiles-29325015-phqfq\" (UID: \"808fc01e-573f-4a85-bae4-c1864fb86c1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325015-phqfq"
Oct 03 14:15:00 crc kubenswrapper[4962]: I1003 14:15:00.278751 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/808fc01e-573f-4a85-bae4-c1864fb86c1f-config-volume\") pod \"collect-profiles-29325015-phqfq\" (UID: \"808fc01e-573f-4a85-bae4-c1864fb86c1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325015-phqfq"
Oct 03 14:15:00 crc kubenswrapper[4962]: I1003 14:15:00.290837 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/808fc01e-573f-4a85-bae4-c1864fb86c1f-secret-volume\") pod \"collect-profiles-29325015-phqfq\" (UID: \"808fc01e-573f-4a85-bae4-c1864fb86c1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325015-phqfq"
Oct 03 14:15:00 crc kubenswrapper[4962]: I1003 14:15:00.296942 4962 scope.go:117] "RemoveContainer" containerID="1cb4130e4e19a9663f3fd738929f6a3f9dc1c390dbb07524ecc3218c81df8ed3"
Oct 03 14:15:00 crc kubenswrapper[4962]: I1003 14:15:00.301556 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtxp2\" (UniqueName: \"kubernetes.io/projected/808fc01e-573f-4a85-bae4-c1864fb86c1f-kube-api-access-jtxp2\") pod \"collect-profiles-29325015-phqfq\" (UID: \"808fc01e-573f-4a85-bae4-c1864fb86c1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325015-phqfq"
Oct 03 14:15:00 crc kubenswrapper[4962]: I1003 14:15:00.481579 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325015-phqfq"
Oct 03 14:15:00 crc kubenswrapper[4962]: I1003 14:15:00.720420 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325015-phqfq"]
Oct 03 14:15:00 crc kubenswrapper[4962]: W1003 14:15:00.723491 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod808fc01e_573f_4a85_bae4_c1864fb86c1f.slice/crio-2b6bd700685a895054bf1430ec9c1ca0217db5517be3524aeb0c1ea02fa1bbdd WatchSource:0}: Error finding container 2b6bd700685a895054bf1430ec9c1ca0217db5517be3524aeb0c1ea02fa1bbdd: Status 404 returned error can't find the container with id 2b6bd700685a895054bf1430ec9c1ca0217db5517be3524aeb0c1ea02fa1bbdd
Oct 03 14:15:01 crc kubenswrapper[4962]: I1003 14:15:01.610161 4962 generic.go:334] "Generic (PLEG): container finished" podID="808fc01e-573f-4a85-bae4-c1864fb86c1f" containerID="d7d37d335d55989528a55713d86452cb79ca318050db3a8beb9285cf671ab367" exitCode=0
Oct 03 14:15:01 crc kubenswrapper[4962]: I1003 14:15:01.610235 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325015-phqfq" event={"ID":"808fc01e-573f-4a85-bae4-c1864fb86c1f","Type":"ContainerDied","Data":"d7d37d335d55989528a55713d86452cb79ca318050db3a8beb9285cf671ab367"}
Oct 03 14:15:01 crc kubenswrapper[4962]: I1003 14:15:01.610309 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325015-phqfq" event={"ID":"808fc01e-573f-4a85-bae4-c1864fb86c1f","Type":"ContainerStarted","Data":"2b6bd700685a895054bf1430ec9c1ca0217db5517be3524aeb0c1ea02fa1bbdd"}
Oct 03 14:15:02 crc kubenswrapper[4962]: I1003 14:15:02.878193 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325015-phqfq"
Oct 03 14:15:03 crc kubenswrapper[4962]: I1003 14:15:03.022667 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/808fc01e-573f-4a85-bae4-c1864fb86c1f-secret-volume\") pod \"808fc01e-573f-4a85-bae4-c1864fb86c1f\" (UID: \"808fc01e-573f-4a85-bae4-c1864fb86c1f\") "
Oct 03 14:15:03 crc kubenswrapper[4962]: I1003 14:15:03.022841 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/808fc01e-573f-4a85-bae4-c1864fb86c1f-config-volume\") pod \"808fc01e-573f-4a85-bae4-c1864fb86c1f\" (UID: \"808fc01e-573f-4a85-bae4-c1864fb86c1f\") "
Oct 03 14:15:03 crc kubenswrapper[4962]: I1003 14:15:03.022882 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtxp2\" (UniqueName: \"kubernetes.io/projected/808fc01e-573f-4a85-bae4-c1864fb86c1f-kube-api-access-jtxp2\") pod \"808fc01e-573f-4a85-bae4-c1864fb86c1f\" (UID: \"808fc01e-573f-4a85-bae4-c1864fb86c1f\") "
Oct 03 14:15:03 crc kubenswrapper[4962]: I1003 14:15:03.024722 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/808fc01e-573f-4a85-bae4-c1864fb86c1f-config-volume" (OuterVolumeSpecName: "config-volume") pod "808fc01e-573f-4a85-bae4-c1864fb86c1f" (UID: "808fc01e-573f-4a85-bae4-c1864fb86c1f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:15:03 crc kubenswrapper[4962]: I1003 14:15:03.031449 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/808fc01e-573f-4a85-bae4-c1864fb86c1f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "808fc01e-573f-4a85-bae4-c1864fb86c1f" (UID: "808fc01e-573f-4a85-bae4-c1864fb86c1f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:15:03 crc kubenswrapper[4962]: I1003 14:15:03.031516 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/808fc01e-573f-4a85-bae4-c1864fb86c1f-kube-api-access-jtxp2" (OuterVolumeSpecName: "kube-api-access-jtxp2") pod "808fc01e-573f-4a85-bae4-c1864fb86c1f" (UID: "808fc01e-573f-4a85-bae4-c1864fb86c1f"). InnerVolumeSpecName "kube-api-access-jtxp2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:15:03 crc kubenswrapper[4962]: I1003 14:15:03.125037 4962 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/808fc01e-573f-4a85-bae4-c1864fb86c1f-config-volume\") on node \"crc\" DevicePath \"\""
Oct 03 14:15:03 crc kubenswrapper[4962]: I1003 14:15:03.125087 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtxp2\" (UniqueName: \"kubernetes.io/projected/808fc01e-573f-4a85-bae4-c1864fb86c1f-kube-api-access-jtxp2\") on node \"crc\" DevicePath \"\""
Oct 03 14:15:03 crc kubenswrapper[4962]: I1003 14:15:03.125105 4962 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/808fc01e-573f-4a85-bae4-c1864fb86c1f-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 03 14:15:03 crc kubenswrapper[4962]: I1003 14:15:03.463826 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wgtd8"
Oct 03 14:15:03 crc kubenswrapper[4962]: I1003 14:15:03.465607 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wgtd8"
Oct 03 14:15:03 crc kubenswrapper[4962]: I1003 14:15:03.548041 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wgtd8"
Oct 03 14:15:03 crc kubenswrapper[4962]: I1003 14:15:03.628590 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325015-phqfq" event={"ID":"808fc01e-573f-4a85-bae4-c1864fb86c1f","Type":"ContainerDied","Data":"2b6bd700685a895054bf1430ec9c1ca0217db5517be3524aeb0c1ea02fa1bbdd"}
Oct 03 14:15:03 crc kubenswrapper[4962]: I1003 14:15:03.628619 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325015-phqfq"
Oct 03 14:15:03 crc kubenswrapper[4962]: I1003 14:15:03.628677 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b6bd700685a895054bf1430ec9c1ca0217db5517be3524aeb0c1ea02fa1bbdd"
Oct 03 14:15:03 crc kubenswrapper[4962]: I1003 14:15:03.706010 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wgtd8"
Oct 03 14:15:03 crc kubenswrapper[4962]: I1003 14:15:03.794044 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wgtd8"]
Oct 03 14:15:03 crc kubenswrapper[4962]: I1003 14:15:03.967949 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324970-lmj28"]
Oct 03 14:15:03 crc kubenswrapper[4962]: I1003 14:15:03.976467 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324970-lmj28"]
Oct 03 14:15:04 crc kubenswrapper[4962]: I1003 14:15:04.227052 4962 scope.go:117] "RemoveContainer" containerID="a52df5deff87c27e42ab4abdf36cdf07c37d1bccfe3bbd74005bdf3b8177d5cc"
Oct 03 14:15:04 crc kubenswrapper[4962]: E1003 14:15:04.227268 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 14:15:04 crc kubenswrapper[4962]: I1003 14:15:04.236610 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4203666e-afcb-4f11-95ea-10adb4fd4940" path="/var/lib/kubelet/pods/4203666e-afcb-4f11-95ea-10adb4fd4940/volumes"
Oct 03 14:15:05 crc kubenswrapper[4962]: I1003 14:15:05.645731 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wgtd8" podUID="0873f3b1-04c3-4bb9-9634-6b9b69f5af36" containerName="registry-server" containerID="cri-o://33ba5c486db4f86867ae04803abd68fdfdfef17fb14b45d2606bc5006021a244" gracePeriod=2
Oct 03 14:15:06 crc kubenswrapper[4962]: I1003 14:15:06.084618 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wgtd8"
Oct 03 14:15:06 crc kubenswrapper[4962]: I1003 14:15:06.270810 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0873f3b1-04c3-4bb9-9634-6b9b69f5af36-utilities\") pod \"0873f3b1-04c3-4bb9-9634-6b9b69f5af36\" (UID: \"0873f3b1-04c3-4bb9-9634-6b9b69f5af36\") "
Oct 03 14:15:06 crc kubenswrapper[4962]: I1003 14:15:06.270938 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7qt7\" (UniqueName: \"kubernetes.io/projected/0873f3b1-04c3-4bb9-9634-6b9b69f5af36-kube-api-access-l7qt7\") pod \"0873f3b1-04c3-4bb9-9634-6b9b69f5af36\" (UID: \"0873f3b1-04c3-4bb9-9634-6b9b69f5af36\") "
Oct 03 14:15:06 crc kubenswrapper[4962]: I1003 14:15:06.270978 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0873f3b1-04c3-4bb9-9634-6b9b69f5af36-catalog-content\") pod \"0873f3b1-04c3-4bb9-9634-6b9b69f5af36\" (UID: \"0873f3b1-04c3-4bb9-9634-6b9b69f5af36\") "
Oct 03 14:15:06 crc kubenswrapper[4962]: I1003 14:15:06.273090 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0873f3b1-04c3-4bb9-9634-6b9b69f5af36-utilities" (OuterVolumeSpecName: "utilities") pod "0873f3b1-04c3-4bb9-9634-6b9b69f5af36" (UID: "0873f3b1-04c3-4bb9-9634-6b9b69f5af36"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:15:06 crc kubenswrapper[4962]: I1003 14:15:06.276365 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0873f3b1-04c3-4bb9-9634-6b9b69f5af36-kube-api-access-l7qt7" (OuterVolumeSpecName: "kube-api-access-l7qt7") pod "0873f3b1-04c3-4bb9-9634-6b9b69f5af36" (UID: "0873f3b1-04c3-4bb9-9634-6b9b69f5af36"). InnerVolumeSpecName "kube-api-access-l7qt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:15:06 crc kubenswrapper[4962]: I1003 14:15:06.372663 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0873f3b1-04c3-4bb9-9634-6b9b69f5af36-utilities\") on node \"crc\" DevicePath \"\""
Oct 03 14:15:06 crc kubenswrapper[4962]: I1003 14:15:06.372708 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7qt7\" (UniqueName: \"kubernetes.io/projected/0873f3b1-04c3-4bb9-9634-6b9b69f5af36-kube-api-access-l7qt7\") on node \"crc\" DevicePath \"\""
Oct 03 14:15:06 crc kubenswrapper[4962]: I1003 14:15:06.578999 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0873f3b1-04c3-4bb9-9634-6b9b69f5af36-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0873f3b1-04c3-4bb9-9634-6b9b69f5af36" (UID: "0873f3b1-04c3-4bb9-9634-6b9b69f5af36"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:15:06 crc kubenswrapper[4962]: I1003 14:15:06.656115 4962 generic.go:334] "Generic (PLEG): container finished" podID="0873f3b1-04c3-4bb9-9634-6b9b69f5af36" containerID="33ba5c486db4f86867ae04803abd68fdfdfef17fb14b45d2606bc5006021a244" exitCode=0
Oct 03 14:15:06 crc kubenswrapper[4962]: I1003 14:15:06.656269 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wgtd8" event={"ID":"0873f3b1-04c3-4bb9-9634-6b9b69f5af36","Type":"ContainerDied","Data":"33ba5c486db4f86867ae04803abd68fdfdfef17fb14b45d2606bc5006021a244"}
Oct 03 14:15:06 crc kubenswrapper[4962]: I1003 14:15:06.656320 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wgtd8" event={"ID":"0873f3b1-04c3-4bb9-9634-6b9b69f5af36","Type":"ContainerDied","Data":"a7175844baa866a78ac9b3434d9eaa70ada413077c201df2576402cf0d937767"}
Oct 03 14:15:06 crc kubenswrapper[4962]: I1003 14:15:06.656342 4962 scope.go:117] "RemoveContainer" containerID="33ba5c486db4f86867ae04803abd68fdfdfef17fb14b45d2606bc5006021a244"
Oct 03 14:15:06 crc kubenswrapper[4962]: I1003 14:15:06.656426 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wgtd8"
Oct 03 14:15:06 crc kubenswrapper[4962]: I1003 14:15:06.679900 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0873f3b1-04c3-4bb9-9634-6b9b69f5af36-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 03 14:15:06 crc kubenswrapper[4962]: I1003 14:15:06.691700 4962 scope.go:117] "RemoveContainer" containerID="001781bdc5d40f09d7a32c4cc05fa631c48634c9cba6cb50c4eefe052f551afa"
Oct 03 14:15:06 crc kubenswrapper[4962]: I1003 14:15:06.694748 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wgtd8"]
Oct 03 14:15:06 crc kubenswrapper[4962]: I1003 14:15:06.702092 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wgtd8"]
Oct 03 14:15:06 crc kubenswrapper[4962]: I1003 14:15:06.713904 4962 scope.go:117] "RemoveContainer" containerID="c18e1db3a1048a7978348ecc8ec883e641017c97ef4a2c90d174c71f3a5847cf"
Oct 03 14:15:06 crc kubenswrapper[4962]: I1003 14:15:06.755745 4962 scope.go:117] "RemoveContainer" containerID="33ba5c486db4f86867ae04803abd68fdfdfef17fb14b45d2606bc5006021a244"
Oct 03 14:15:06 crc kubenswrapper[4962]: E1003 14:15:06.756262 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33ba5c486db4f86867ae04803abd68fdfdfef17fb14b45d2606bc5006021a244\": container with ID starting with 33ba5c486db4f86867ae04803abd68fdfdfef17fb14b45d2606bc5006021a244 not found: ID does not exist" containerID="33ba5c486db4f86867ae04803abd68fdfdfef17fb14b45d2606bc5006021a244"
Oct 03 14:15:06 crc kubenswrapper[4962]: I1003 14:15:06.756318 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33ba5c486db4f86867ae04803abd68fdfdfef17fb14b45d2606bc5006021a244"} err="failed to get container status \"33ba5c486db4f86867ae04803abd68fdfdfef17fb14b45d2606bc5006021a244\": rpc error: code = NotFound desc = could not find container \"33ba5c486db4f86867ae04803abd68fdfdfef17fb14b45d2606bc5006021a244\": container with ID starting with 33ba5c486db4f86867ae04803abd68fdfdfef17fb14b45d2606bc5006021a244 not found: ID does not exist"
Oct 03 14:15:06 crc kubenswrapper[4962]: I1003 14:15:06.756366 4962 scope.go:117] "RemoveContainer" containerID="001781bdc5d40f09d7a32c4cc05fa631c48634c9cba6cb50c4eefe052f551afa"
Oct 03 14:15:06 crc kubenswrapper[4962]: E1003 14:15:06.756913 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"001781bdc5d40f09d7a32c4cc05fa631c48634c9cba6cb50c4eefe052f551afa\": container with ID starting with 001781bdc5d40f09d7a32c4cc05fa631c48634c9cba6cb50c4eefe052f551afa not found: ID does not exist" containerID="001781bdc5d40f09d7a32c4cc05fa631c48634c9cba6cb50c4eefe052f551afa"
Oct 03 14:15:06 crc kubenswrapper[4962]: I1003 14:15:06.756948 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"001781bdc5d40f09d7a32c4cc05fa631c48634c9cba6cb50c4eefe052f551afa"} err="failed to get container status \"001781bdc5d40f09d7a32c4cc05fa631c48634c9cba6cb50c4eefe052f551afa\": rpc error: code = NotFound desc = could not find container \"001781bdc5d40f09d7a32c4cc05fa631c48634c9cba6cb50c4eefe052f551afa\": container with ID starting with 001781bdc5d40f09d7a32c4cc05fa631c48634c9cba6cb50c4eefe052f551afa not found: ID does not exist"
Oct 03 14:15:06 crc kubenswrapper[4962]: I1003 14:15:06.756969 4962 scope.go:117] "RemoveContainer" containerID="c18e1db3a1048a7978348ecc8ec883e641017c97ef4a2c90d174c71f3a5847cf"
Oct 03 14:15:06 crc kubenswrapper[4962]: E1003 14:15:06.757260 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c18e1db3a1048a7978348ecc8ec883e641017c97ef4a2c90d174c71f3a5847cf\": container with ID starting with c18e1db3a1048a7978348ecc8ec883e641017c97ef4a2c90d174c71f3a5847cf not found: ID does not exist" containerID="c18e1db3a1048a7978348ecc8ec883e641017c97ef4a2c90d174c71f3a5847cf"
Oct 03 14:15:06 crc kubenswrapper[4962]: I1003 14:15:06.757310 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c18e1db3a1048a7978348ecc8ec883e641017c97ef4a2c90d174c71f3a5847cf"} err="failed to get container status \"c18e1db3a1048a7978348ecc8ec883e641017c97ef4a2c90d174c71f3a5847cf\": rpc error: code = NotFound desc = could not find container \"c18e1db3a1048a7978348ecc8ec883e641017c97ef4a2c90d174c71f3a5847cf\": container with ID starting with c18e1db3a1048a7978348ecc8ec883e641017c97ef4a2c90d174c71f3a5847cf not found: ID does not exist"
Oct 03 14:15:08 crc kubenswrapper[4962]: I1003 14:15:08.234992 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0873f3b1-04c3-4bb9-9634-6b9b69f5af36" path="/var/lib/kubelet/pods/0873f3b1-04c3-4bb9-9634-6b9b69f5af36/volumes"
Oct 03 14:15:19 crc kubenswrapper[4962]: I1003 14:15:19.227930 4962 scope.go:117] "RemoveContainer" containerID="a52df5deff87c27e42ab4abdf36cdf07c37d1bccfe3bbd74005bdf3b8177d5cc"
Oct 03 14:15:19 crc kubenswrapper[4962]: E1003 14:15:19.229153 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 14:15:30 crc kubenswrapper[4962]: I1003 14:15:30.227173 4962 scope.go:117] "RemoveContainer" containerID="a52df5deff87c27e42ab4abdf36cdf07c37d1bccfe3bbd74005bdf3b8177d5cc"
Oct 03 14:15:30 crc kubenswrapper[4962]: E1003 14:15:30.227798 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 14:15:41 crc kubenswrapper[4962]: I1003 14:15:41.227423 4962 scope.go:117] "RemoveContainer" containerID="a52df5deff87c27e42ab4abdf36cdf07c37d1bccfe3bbd74005bdf3b8177d5cc"
Oct 03 14:15:41 crc kubenswrapper[4962]: E1003 14:15:41.228503 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 14:15:55 crc kubenswrapper[4962]: I1003 14:15:55.227530 4962 scope.go:117] "RemoveContainer" containerID="a52df5deff87c27e42ab4abdf36cdf07c37d1bccfe3bbd74005bdf3b8177d5cc"
Oct 03 14:15:55 crc kubenswrapper[4962]: E1003 14:15:55.228275 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 14:16:00 crc kubenswrapper[4962]: I1003 14:16:00.435353 4962 scope.go:117] "RemoveContainer" containerID="a5a7730b2ce20725594b02065f1854b49e3adcc3f3a98cda84e108fc72e06a8b"
Oct 03 14:16:07 crc kubenswrapper[4962]: I1003 14:16:07.227919 4962 scope.go:117] "RemoveContainer" containerID="a52df5deff87c27e42ab4abdf36cdf07c37d1bccfe3bbd74005bdf3b8177d5cc"
Oct 03 14:16:07 crc kubenswrapper[4962]: E1003 14:16:07.228709 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 14:16:19 crc kubenswrapper[4962]: I1003 14:16:19.227260 4962 scope.go:117] "RemoveContainer" containerID="a52df5deff87c27e42ab4abdf36cdf07c37d1bccfe3bbd74005bdf3b8177d5cc"
Oct 03 14:16:19 crc kubenswrapper[4962]: E1003 14:16:19.229327 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 14:16:34 crc kubenswrapper[4962]: I1003 14:16:34.227625 4962 scope.go:117] "RemoveContainer" containerID="a52df5deff87c27e42ab4abdf36cdf07c37d1bccfe3bbd74005bdf3b8177d5cc"
Oct 03 14:16:34 crc kubenswrapper[4962]: E1003 14:16:34.228736 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 14:16:45 crc kubenswrapper[4962]: I1003 14:16:45.227529 4962 scope.go:117] "RemoveContainer" containerID="a52df5deff87c27e42ab4abdf36cdf07c37d1bccfe3bbd74005bdf3b8177d5cc"
Oct 03 14:16:45 crc kubenswrapper[4962]: E1003 14:16:45.228409 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 14:17:00 crc kubenswrapper[4962]: I1003 14:17:00.227802 4962 scope.go:117] "RemoveContainer" containerID="a52df5deff87c27e42ab4abdf36cdf07c37d1bccfe3bbd74005bdf3b8177d5cc"
Oct 03 14:17:00 crc kubenswrapper[4962]: E1003 14:17:00.228771 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 14:17:15 crc kubenswrapper[4962]: I1003 14:17:15.228015 4962 scope.go:117] "RemoveContainer" containerID="a52df5deff87c27e42ab4abdf36cdf07c37d1bccfe3bbd74005bdf3b8177d5cc"
Oct 03 14:17:15 crc kubenswrapper[4962]: E1003 14:17:15.229043 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 14:17:28 crc kubenswrapper[4962]: I1003 14:17:28.228999 4962 scope.go:117] "RemoveContainer" containerID="a52df5deff87c27e42ab4abdf36cdf07c37d1bccfe3bbd74005bdf3b8177d5cc"
Oct 03 14:17:28 crc kubenswrapper[4962]: E1003 14:17:28.230259 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 14:17:43 crc kubenswrapper[4962]: I1003 14:17:43.227191 4962 scope.go:117] "RemoveContainer" containerID="a52df5deff87c27e42ab4abdf36cdf07c37d1bccfe3bbd74005bdf3b8177d5cc"
Oct 03 14:17:43 crc kubenswrapper[4962]: E1003 14:17:43.228211 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 14:17:56 crc kubenswrapper[4962]: I1003 14:17:56.227508 4962 scope.go:117] "RemoveContainer" containerID="a52df5deff87c27e42ab4abdf36cdf07c37d1bccfe3bbd74005bdf3b8177d5cc"
Oct 03 14:17:56 crc kubenswrapper[4962]: E1003 14:17:56.228543 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 14:18:07 crc kubenswrapper[4962]: I1003 14:18:07.227881 4962 scope.go:117] "RemoveContainer" containerID="a52df5deff87c27e42ab4abdf36cdf07c37d1bccfe3bbd74005bdf3b8177d5cc"
Oct 03 14:18:07 crc kubenswrapper[4962]: E1003 14:18:07.229127 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 14:18:21 crc kubenswrapper[4962]: I1003 14:18:21.228813 4962 scope.go:117] "RemoveContainer" containerID="a52df5deff87c27e42ab4abdf36cdf07c37d1bccfe3bbd74005bdf3b8177d5cc"
Oct 03 14:18:21 crc kubenswrapper[4962]: E1003 14:18:21.229807 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 14:18:33 crc kubenswrapper[4962]: I1003 14:18:33.226558 4962 scope.go:117] "RemoveContainer" containerID="a52df5deff87c27e42ab4abdf36cdf07c37d1bccfe3bbd74005bdf3b8177d5cc"
Oct 03 14:18:33 crc kubenswrapper[4962]: E1003 14:18:33.227414 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 14:18:46 crc kubenswrapper[4962]: I1003 14:18:46.227372 4962 scope.go:117] "RemoveContainer" containerID="a52df5deff87c27e42ab4abdf36cdf07c37d1bccfe3bbd74005bdf3b8177d5cc"
Oct 03 14:18:46 crc kubenswrapper[4962]: E1003 14:18:46.228176 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 14:18:47 crc kubenswrapper[4962]: I1003 14:18:47.471848 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"]
Oct 03 14:18:47 crc kubenswrapper[4962]: E1003 14:18:47.472142 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0873f3b1-04c3-4bb9-9634-6b9b69f5af36" containerName="extract-content"
Oct 03 14:18:47 crc kubenswrapper[4962]: I1003 14:18:47.472156 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="0873f3b1-04c3-4bb9-9634-6b9b69f5af36" containerName="extract-content"
Oct 03 14:18:47 crc kubenswrapper[4962]: E1003 14:18:47.472169 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="808fc01e-573f-4a85-bae4-c1864fb86c1f" containerName="collect-profiles"
Oct 03 14:18:47 crc kubenswrapper[4962]: I1003 14:18:47.472175 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="808fc01e-573f-4a85-bae4-c1864fb86c1f" containerName="collect-profiles"
Oct 03 14:18:47 crc kubenswrapper[4962]: E1003 14:18:47.472181 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0873f3b1-04c3-4bb9-9634-6b9b69f5af36" containerName="extract-utilities"
Oct 03 14:18:47 crc kubenswrapper[4962]: I1003 14:18:47.472188 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="0873f3b1-04c3-4bb9-9634-6b9b69f5af36" containerName="extract-utilities"
Oct 03 14:18:47 crc kubenswrapper[4962]: E1003 14:18:47.472201 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0873f3b1-04c3-4bb9-9634-6b9b69f5af36" containerName="registry-server"
Oct 03 14:18:47 crc kubenswrapper[4962]: I1003 14:18:47.472206 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="0873f3b1-04c3-4bb9-9634-6b9b69f5af36" containerName="registry-server"
Oct 03 14:18:47 crc kubenswrapper[4962]: I1003 14:18:47.472354 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="0873f3b1-04c3-4bb9-9634-6b9b69f5af36" containerName="registry-server"
Oct 03 14:18:47 crc kubenswrapper[4962]: I1003 14:18:47.472363 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="808fc01e-573f-4a85-bae4-c1864fb86c1f" containerName="collect-profiles"
Oct 03 14:18:47 crc kubenswrapper[4962]: I1003 14:18:47.472877 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data"
Oct 03 14:18:47 crc kubenswrapper[4962]: I1003 14:18:47.474923 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-jldbx"
Oct 03 14:18:47 crc kubenswrapper[4962]: I1003 14:18:47.486427 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"]
Oct 03 14:18:47 crc kubenswrapper[4962]: I1003 14:18:47.539430 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kln7v\" (UniqueName: \"kubernetes.io/projected/0454ec44-3669-4a32-b308-104a30554a7d-kube-api-access-kln7v\") pod \"mariadb-copy-data\" (UID: \"0454ec44-3669-4a32-b308-104a30554a7d\") " pod="openstack/mariadb-copy-data"
Oct 03 14:18:47 crc kubenswrapper[4962]: I1003 14:18:47.539522 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a6d5d633-b306-441b-907c-f25241036ffc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6d5d633-b306-441b-907c-f25241036ffc\") pod \"mariadb-copy-data\" (UID: \"0454ec44-3669-4a32-b308-104a30554a7d\") " pod="openstack/mariadb-copy-data"
Oct 03 14:18:47 crc kubenswrapper[4962]: I1003 14:18:47.641235 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kln7v\" (UniqueName: \"kubernetes.io/projected/0454ec44-3669-4a32-b308-104a30554a7d-kube-api-access-kln7v\") pod \"mariadb-copy-data\" (UID: \"0454ec44-3669-4a32-b308-104a30554a7d\") " pod="openstack/mariadb-copy-data"
Oct 03 14:18:47 crc kubenswrapper[4962]: I1003 14:18:47.641323 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a6d5d633-b306-441b-907c-f25241036ffc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6d5d633-b306-441b-907c-f25241036ffc\") pod \"mariadb-copy-data\" (UID: \"0454ec44-3669-4a32-b308-104a30554a7d\") " pod="openstack/mariadb-copy-data"
Oct 03 14:18:47 crc kubenswrapper[4962]: I1003 14:18:47.648060 4962 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Oct 03 14:18:47 crc kubenswrapper[4962]: I1003 14:18:47.648131 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a6d5d633-b306-441b-907c-f25241036ffc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6d5d633-b306-441b-907c-f25241036ffc\") pod \"mariadb-copy-data\" (UID: \"0454ec44-3669-4a32-b308-104a30554a7d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7935c80d6e5b6c2ccac6e1f7dd6264dcb98490737a4dcbc1b737c1a77cb0296d/globalmount\"" pod="openstack/mariadb-copy-data"
Oct 03 14:18:47 crc kubenswrapper[4962]: I1003 14:18:47.664340 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kln7v\" (UniqueName: \"kubernetes.io/projected/0454ec44-3669-4a32-b308-104a30554a7d-kube-api-access-kln7v\") pod \"mariadb-copy-data\" (UID: \"0454ec44-3669-4a32-b308-104a30554a7d\") " pod="openstack/mariadb-copy-data"
Oct 03 14:18:47 crc kubenswrapper[4962]: I1003 14:18:47.687548 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a6d5d633-b306-441b-907c-f25241036ffc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6d5d633-b306-441b-907c-f25241036ffc\") pod \"mariadb-copy-data\" (UID: \"0454ec44-3669-4a32-b308-104a30554a7d\") " pod="openstack/mariadb-copy-data"
Oct 03 14:18:47 crc kubenswrapper[4962]: I1003 14:18:47.845136 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data"
Oct 03 14:18:48 crc kubenswrapper[4962]: I1003 14:18:48.357857 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"]
Oct 03 14:18:48 crc kubenswrapper[4962]: W1003 14:18:48.368718 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0454ec44_3669_4a32_b308_104a30554a7d.slice/crio-55791541aa60fc64af950168669ec56c40ec72566eb38f927a5eb2f3607bf915 WatchSource:0}: Error finding container 55791541aa60fc64af950168669ec56c40ec72566eb38f927a5eb2f3607bf915: Status 404 returned error can't find the container with id 55791541aa60fc64af950168669ec56c40ec72566eb38f927a5eb2f3607bf915
Oct 03 14:18:48 crc kubenswrapper[4962]: I1003 14:18:48.583049 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"0454ec44-3669-4a32-b308-104a30554a7d","Type":"ContainerStarted","Data":"deb4c593ffae36b49c46a4fd890ef93463d4535fb5dc7831e387f61057fb267c"}
Oct 03 14:18:48 crc kubenswrapper[4962]: I1003 14:18:48.583094 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"0454ec44-3669-4a32-b308-104a30554a7d","Type":"ContainerStarted","Data":"55791541aa60fc64af950168669ec56c40ec72566eb38f927a5eb2f3607bf915"}
Oct 03 14:18:48 crc kubenswrapper[4962]: I1003 14:18:48.605277 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=2.605257993 podStartE2EDuration="2.605257993s" podCreationTimestamp="2025-10-03 14:18:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:18:48.600036906 +0000 UTC m=+5337.003934741" watchObservedRunningTime="2025-10-03 14:18:48.605257993 +0000 UTC m=+5337.009155828"
Oct 03 14:18:50 crc kubenswrapper[4962]: I1003 14:18:50.289968 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"]
Oct 03 14:18:50 crc kubenswrapper[4962]: I1003 14:18:50.291675 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Oct 03 14:18:50 crc kubenswrapper[4962]: I1003 14:18:50.299155 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Oct 03 14:18:50 crc kubenswrapper[4962]: I1003 14:18:50.385946 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bb6jh\" (UniqueName: \"kubernetes.io/projected/43c7dc04-ae39-46e5-83e7-46291e8e6db7-kube-api-access-bb6jh\") pod \"mariadb-client\" (UID: \"43c7dc04-ae39-46e5-83e7-46291e8e6db7\") " pod="openstack/mariadb-client"
Oct 03 14:18:50 crc kubenswrapper[4962]: I1003 14:18:50.487172 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bb6jh\" (UniqueName: \"kubernetes.io/projected/43c7dc04-ae39-46e5-83e7-46291e8e6db7-kube-api-access-bb6jh\") pod \"mariadb-client\" (UID: \"43c7dc04-ae39-46e5-83e7-46291e8e6db7\") " pod="openstack/mariadb-client"
Oct 03 14:18:50 crc kubenswrapper[4962]: I1003 14:18:50.516808 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bb6jh\" (UniqueName: \"kubernetes.io/projected/43c7dc04-ae39-46e5-83e7-46291e8e6db7-kube-api-access-bb6jh\") pod \"mariadb-client\" (UID: \"43c7dc04-ae39-46e5-83e7-46291e8e6db7\") " pod="openstack/mariadb-client"
Oct 03 14:18:50 crc kubenswrapper[4962]: I1003 14:18:50.618493 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Oct 03 14:18:51 crc kubenswrapper[4962]: I1003 14:18:51.042664 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Oct 03 14:18:51 crc kubenswrapper[4962]: W1003 14:18:51.051446 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43c7dc04_ae39_46e5_83e7_46291e8e6db7.slice/crio-31d2c2694198bbbe985c22683a6feaba269f93a8a3e573f0b2d4fb00dc66975d WatchSource:0}: Error finding container 31d2c2694198bbbe985c22683a6feaba269f93a8a3e573f0b2d4fb00dc66975d: Status 404 returned error can't find the container with id 31d2c2694198bbbe985c22683a6feaba269f93a8a3e573f0b2d4fb00dc66975d
Oct 03 14:18:51 crc kubenswrapper[4962]: I1003 14:18:51.608381 4962 generic.go:334] "Generic (PLEG): container finished" podID="43c7dc04-ae39-46e5-83e7-46291e8e6db7" containerID="aa293120959a35c6da56ae0dd6f9cf379fde8723b4c11f771d10e57cb7ef494f" exitCode=0
Oct 03 14:18:51 crc kubenswrapper[4962]: I1003 14:18:51.608433 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"43c7dc04-ae39-46e5-83e7-46291e8e6db7","Type":"ContainerDied","Data":"aa293120959a35c6da56ae0dd6f9cf379fde8723b4c11f771d10e57cb7ef494f"}
Oct 03 14:18:51 crc kubenswrapper[4962]: I1003 14:18:51.608460 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"43c7dc04-ae39-46e5-83e7-46291e8e6db7","Type":"ContainerStarted","Data":"31d2c2694198bbbe985c22683a6feaba269f93a8a3e573f0b2d4fb00dc66975d"}
Oct 03 14:18:52 crc kubenswrapper[4962]: I1003 14:18:52.914451 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Oct 03 14:18:52 crc kubenswrapper[4962]: I1003 14:18:52.944089 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_43c7dc04-ae39-46e5-83e7-46291e8e6db7/mariadb-client/0.log"
Oct 03 14:18:52 crc kubenswrapper[4962]: I1003 14:18:52.974966 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"]
Oct 03 14:18:52 crc kubenswrapper[4962]: I1003 14:18:52.981487 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"]
Oct 03 14:18:53 crc kubenswrapper[4962]: I1003 14:18:53.031300 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bb6jh\" (UniqueName: \"kubernetes.io/projected/43c7dc04-ae39-46e5-83e7-46291e8e6db7-kube-api-access-bb6jh\") pod \"43c7dc04-ae39-46e5-83e7-46291e8e6db7\" (UID: \"43c7dc04-ae39-46e5-83e7-46291e8e6db7\") "
Oct 03 14:18:53 crc kubenswrapper[4962]: I1003 14:18:53.045023 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43c7dc04-ae39-46e5-83e7-46291e8e6db7-kube-api-access-bb6jh" (OuterVolumeSpecName: "kube-api-access-bb6jh") pod "43c7dc04-ae39-46e5-83e7-46291e8e6db7" (UID: "43c7dc04-ae39-46e5-83e7-46291e8e6db7"). InnerVolumeSpecName "kube-api-access-bb6jh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:18:53 crc kubenswrapper[4962]: I1003 14:18:53.098707 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"]
Oct 03 14:18:53 crc kubenswrapper[4962]: E1003 14:18:53.099508 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43c7dc04-ae39-46e5-83e7-46291e8e6db7" containerName="mariadb-client"
Oct 03 14:18:53 crc kubenswrapper[4962]: I1003 14:18:53.099530 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="43c7dc04-ae39-46e5-83e7-46291e8e6db7" containerName="mariadb-client"
Oct 03 14:18:53 crc kubenswrapper[4962]: I1003 14:18:53.099710 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="43c7dc04-ae39-46e5-83e7-46291e8e6db7" containerName="mariadb-client"
Oct 03 14:18:53 crc kubenswrapper[4962]: I1003 14:18:53.100234 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Oct 03 14:18:53 crc kubenswrapper[4962]: I1003 14:18:53.105023 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Oct 03 14:18:53 crc kubenswrapper[4962]: I1003 14:18:53.134798 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bb6jh\" (UniqueName: \"kubernetes.io/projected/43c7dc04-ae39-46e5-83e7-46291e8e6db7-kube-api-access-bb6jh\") on node \"crc\" DevicePath \"\""
Oct 03 14:18:53 crc kubenswrapper[4962]: I1003 14:18:53.236695 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kww9\" (UniqueName: \"kubernetes.io/projected/2cf891e9-b54e-4129-a6aa-a48147387b10-kube-api-access-4kww9\") pod \"mariadb-client\" (UID: \"2cf891e9-b54e-4129-a6aa-a48147387b10\") " pod="openstack/mariadb-client"
Oct 03 14:18:53 crc kubenswrapper[4962]: I1003 14:18:53.338331 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kww9\" (UniqueName: \"kubernetes.io/projected/2cf891e9-b54e-4129-a6aa-a48147387b10-kube-api-access-4kww9\") pod \"mariadb-client\" (UID: \"2cf891e9-b54e-4129-a6aa-a48147387b10\") " pod="openstack/mariadb-client"
Oct 03 14:18:53 crc kubenswrapper[4962]: I1003 14:18:53.354991 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kww9\" (UniqueName: \"kubernetes.io/projected/2cf891e9-b54e-4129-a6aa-a48147387b10-kube-api-access-4kww9\") pod \"mariadb-client\" (UID: \"2cf891e9-b54e-4129-a6aa-a48147387b10\") " pod="openstack/mariadb-client"
Oct 03 14:18:53 crc kubenswrapper[4962]: I1003 14:18:53.430414 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Oct 03 14:18:53 crc kubenswrapper[4962]: I1003 14:18:53.640303 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31d2c2694198bbbe985c22683a6feaba269f93a8a3e573f0b2d4fb00dc66975d"
Oct 03 14:18:53 crc kubenswrapper[4962]: I1003 14:18:53.640600 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Oct 03 14:18:53 crc kubenswrapper[4962]: I1003 14:18:53.661545 4962 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/mariadb-client" oldPodUID="43c7dc04-ae39-46e5-83e7-46291e8e6db7" podUID="2cf891e9-b54e-4129-a6aa-a48147387b10"
Oct 03 14:18:53 crc kubenswrapper[4962]: I1003 14:18:53.858942 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Oct 03 14:18:53 crc kubenswrapper[4962]: W1003 14:18:53.862059 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2cf891e9_b54e_4129_a6aa_a48147387b10.slice/crio-ad1be90e2bbe2d46b32ce0095c571e0a0a9b73e26e3b2a44e9e220c654ce51ad WatchSource:0}: Error finding container ad1be90e2bbe2d46b32ce0095c571e0a0a9b73e26e3b2a44e9e220c654ce51ad: Status 404 returned error can't find the container with id ad1be90e2bbe2d46b32ce0095c571e0a0a9b73e26e3b2a44e9e220c654ce51ad
Oct 03 14:18:54 crc kubenswrapper[4962]: I1003 14:18:54.238563 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43c7dc04-ae39-46e5-83e7-46291e8e6db7" path="/var/lib/kubelet/pods/43c7dc04-ae39-46e5-83e7-46291e8e6db7/volumes"
Oct 03 14:18:54 crc kubenswrapper[4962]: I1003 14:18:54.651189 4962 generic.go:334] "Generic (PLEG): container finished" podID="2cf891e9-b54e-4129-a6aa-a48147387b10" containerID="8e7b70955f7aeadae2c6ccf70f0750afdddd692335c7ee3cc99f72984105a28c" exitCode=0
Oct 03 14:18:54 crc kubenswrapper[4962]: I1003 14:18:54.651228 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"2cf891e9-b54e-4129-a6aa-a48147387b10","Type":"ContainerDied","Data":"8e7b70955f7aeadae2c6ccf70f0750afdddd692335c7ee3cc99f72984105a28c"}
Oct 03 14:18:54 crc kubenswrapper[4962]: I1003 14:18:54.651255 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"2cf891e9-b54e-4129-a6aa-a48147387b10","Type":"ContainerStarted","Data":"ad1be90e2bbe2d46b32ce0095c571e0a0a9b73e26e3b2a44e9e220c654ce51ad"}
Oct 03 14:18:55 crc kubenswrapper[4962]: I1003 14:18:55.944914 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Oct 03 14:18:55 crc kubenswrapper[4962]: I1003 14:18:55.960522 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_2cf891e9-b54e-4129-a6aa-a48147387b10/mariadb-client/0.log"
Oct 03 14:18:55 crc kubenswrapper[4962]: I1003 14:18:55.982380 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"]
Oct 03 14:18:55 crc kubenswrapper[4962]: I1003 14:18:55.986304 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"]
Oct 03 14:18:56 crc kubenswrapper[4962]: I1003 14:18:56.076218 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kww9\" (UniqueName: \"kubernetes.io/projected/2cf891e9-b54e-4129-a6aa-a48147387b10-kube-api-access-4kww9\") pod \"2cf891e9-b54e-4129-a6aa-a48147387b10\" (UID: \"2cf891e9-b54e-4129-a6aa-a48147387b10\") "
Oct 03 14:18:56 crc kubenswrapper[4962]: I1003 14:18:56.081622 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cf891e9-b54e-4129-a6aa-a48147387b10-kube-api-access-4kww9" (OuterVolumeSpecName: "kube-api-access-4kww9") pod "2cf891e9-b54e-4129-a6aa-a48147387b10" (UID: "2cf891e9-b54e-4129-a6aa-a48147387b10"). InnerVolumeSpecName "kube-api-access-4kww9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:18:56 crc kubenswrapper[4962]: I1003 14:18:56.178117 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kww9\" (UniqueName: \"kubernetes.io/projected/2cf891e9-b54e-4129-a6aa-a48147387b10-kube-api-access-4kww9\") on node \"crc\" DevicePath \"\""
Oct 03 14:18:56 crc kubenswrapper[4962]: I1003 14:18:56.235766 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cf891e9-b54e-4129-a6aa-a48147387b10" path="/var/lib/kubelet/pods/2cf891e9-b54e-4129-a6aa-a48147387b10/volumes"
Oct 03 14:18:56 crc kubenswrapper[4962]: I1003 14:18:56.673691 4962 scope.go:117] "RemoveContainer" containerID="8e7b70955f7aeadae2c6ccf70f0750afdddd692335c7ee3cc99f72984105a28c"
Oct 03 14:18:56 crc kubenswrapper[4962]: I1003 14:18:56.673739 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Oct 03 14:19:01 crc kubenswrapper[4962]: I1003 14:19:01.227537 4962 scope.go:117] "RemoveContainer" containerID="a52df5deff87c27e42ab4abdf36cdf07c37d1bccfe3bbd74005bdf3b8177d5cc"
Oct 03 14:19:01 crc kubenswrapper[4962]: E1003 14:19:01.228350 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 14:19:14 crc kubenswrapper[4962]: I1003 14:19:14.227312 4962 scope.go:117] "RemoveContainer" containerID="a52df5deff87c27e42ab4abdf36cdf07c37d1bccfe3bbd74005bdf3b8177d5cc"
Oct 03 14:19:14 crc kubenswrapper[4962]: E1003 14:19:14.228388 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 14:19:25 crc kubenswrapper[4962]: I1003 14:19:25.227923 4962 scope.go:117] "RemoveContainer" containerID="a52df5deff87c27e42ab4abdf36cdf07c37d1bccfe3bbd74005bdf3b8177d5cc"
Oct 03 14:19:25 crc kubenswrapper[4962]: I1003 14:19:25.895409 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerStarted","Data":"66e15d378b389fff3661c27f240d5e7dda70a518dfc3f92072db88df889fc781"}
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.629696 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Oct 03 14:19:28 crc kubenswrapper[4962]: E1003 14:19:28.630919 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cf891e9-b54e-4129-a6aa-a48147387b10" containerName="mariadb-client"
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.630949 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cf891e9-b54e-4129-a6aa-a48147387b10" containerName="mariadb-client"
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.631225 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cf891e9-b54e-4129-a6aa-a48147387b10" containerName="mariadb-client"
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.632599 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.634562 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-kz4cb"
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.635106 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config"
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.636673 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts"
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.649130 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"]
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.650442 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2"
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.685867 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"]
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.687824 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1"
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.704703 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.719616 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"]
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.724871 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"]
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.795268 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-dd05672c-8f16-46aa-bdf3-7294765df421\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dd05672c-8f16-46aa-bdf3-7294765df421\") pod \"ovsdbserver-nb-1\" (UID: \"00d0685e-721c-4362-8758-bb6f4d558db1\") " pod="openstack/ovsdbserver-nb-1"
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.795334 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l559m\" (UniqueName: \"kubernetes.io/projected/00d0685e-721c-4362-8758-bb6f4d558db1-kube-api-access-l559m\") pod \"ovsdbserver-nb-1\" (UID: \"00d0685e-721c-4362-8758-bb6f4d558db1\") " pod="openstack/ovsdbserver-nb-1"
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.795358 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c1870ea-e7e7-4eb0-9ce9-d6ec1b9be43c-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"8c1870ea-e7e7-4eb0-9ce9-d6ec1b9be43c\") " pod="openstack/ovsdbserver-nb-0"
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.795418 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbmdh\" (UniqueName: \"kubernetes.io/projected/8c1870ea-e7e7-4eb0-9ce9-d6ec1b9be43c-kube-api-access-bbmdh\") pod \"ovsdbserver-nb-0\" (UID: \"8c1870ea-e7e7-4eb0-9ce9-d6ec1b9be43c\") " pod="openstack/ovsdbserver-nb-0"
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.795436 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5d3bc37e-a440-40ed-9f6d-c022ace80e8e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d3bc37e-a440-40ed-9f6d-c022ace80e8e\") pod \"ovsdbserver-nb-0\" (UID: \"8c1870ea-e7e7-4eb0-9ce9-d6ec1b9be43c\") " pod="openstack/ovsdbserver-nb-0"
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.796723 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnrm4\" (UniqueName: \"kubernetes.io/projected/e678ef37-68d9-467f-81ec-bcd62272c6b2-kube-api-access-bnrm4\") pod \"ovsdbserver-nb-2\" (UID: \"e678ef37-68d9-467f-81ec-bcd62272c6b2\") " pod="openstack/ovsdbserver-nb-2"
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.796879 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e678ef37-68d9-467f-81ec-bcd62272c6b2-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"e678ef37-68d9-467f-81ec-bcd62272c6b2\") " pod="openstack/ovsdbserver-nb-2"
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.796913 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e678ef37-68d9-467f-81ec-bcd62272c6b2-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"e678ef37-68d9-467f-81ec-bcd62272c6b2\") " pod="openstack/ovsdbserver-nb-2"
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.796934 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c1870ea-e7e7-4eb0-9ce9-d6ec1b9be43c-config\") pod \"ovsdbserver-nb-0\" (UID: \"8c1870ea-e7e7-4eb0-9ce9-d6ec1b9be43c\") " pod="openstack/ovsdbserver-nb-0"
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.797035 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a08b200d-7c65-4eae-a812-c0e0d46ea9f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a08b200d-7c65-4eae-a812-c0e0d46ea9f3\") pod \"ovsdbserver-nb-2\" (UID: \"e678ef37-68d9-467f-81ec-bcd62272c6b2\") " pod="openstack/ovsdbserver-nb-2"
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.797239 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8c1870ea-e7e7-4eb0-9ce9-d6ec1b9be43c-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"8c1870ea-e7e7-4eb0-9ce9-d6ec1b9be43c\") " pod="openstack/ovsdbserver-nb-0"
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.797278 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00d0685e-721c-4362-8758-bb6f4d558db1-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"00d0685e-721c-4362-8758-bb6f4d558db1\") " pod="openstack/ovsdbserver-nb-1"
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.797309 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/00d0685e-721c-4362-8758-bb6f4d558db1-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"00d0685e-721c-4362-8758-bb6f4d558db1\") " pod="openstack/ovsdbserver-nb-1"
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.797591 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00d0685e-721c-4362-8758-bb6f4d558db1-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"00d0685e-721c-4362-8758-bb6f4d558db1\") " pod="openstack/ovsdbserver-nb-1"
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.797673 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00d0685e-721c-4362-8758-bb6f4d558db1-config\") pod \"ovsdbserver-nb-1\" (UID: \"00d0685e-721c-4362-8758-bb6f4d558db1\") " pod="openstack/ovsdbserver-nb-1"
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.797719 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e678ef37-68d9-467f-81ec-bcd62272c6b2-config\") pod \"ovsdbserver-nb-2\" (UID: \"e678ef37-68d9-467f-81ec-bcd62272c6b2\") " pod="openstack/ovsdbserver-nb-2"
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.797754 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c1870ea-e7e7-4eb0-9ce9-d6ec1b9be43c-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"8c1870ea-e7e7-4eb0-9ce9-d6ec1b9be43c\") " pod="openstack/ovsdbserver-nb-0"
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.797809 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e678ef37-68d9-467f-81ec-bcd62272c6b2-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"e678ef37-68d9-467f-81ec-bcd62272c6b2\") " pod="openstack/ovsdbserver-nb-2"
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.817591 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"]
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.818911 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.821790 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-kb96m"
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.823202 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts"
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.823357 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config"
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.840142 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.849959 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"]
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.851982 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2"
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.863437 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"]
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.865083 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1"
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.873835 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"]
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.899495 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnrm4\" (UniqueName: \"kubernetes.io/projected/e678ef37-68d9-467f-81ec-bcd62272c6b2-kube-api-access-bnrm4\") pod \"ovsdbserver-nb-2\" (UID: \"e678ef37-68d9-467f-81ec-bcd62272c6b2\") " pod="openstack/ovsdbserver-nb-2"
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.899768 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e678ef37-68d9-467f-81ec-bcd62272c6b2-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"e678ef37-68d9-467f-81ec-bcd62272c6b2\") " pod="openstack/ovsdbserver-nb-2"
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.899904 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e678ef37-68d9-467f-81ec-bcd62272c6b2-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"e678ef37-68d9-467f-81ec-bcd62272c6b2\") " pod="openstack/ovsdbserver-nb-2"
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.899995 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c1870ea-e7e7-4eb0-9ce9-d6ec1b9be43c-config\") pod \"ovsdbserver-nb-0\" (UID: \"8c1870ea-e7e7-4eb0-9ce9-d6ec1b9be43c\") " pod="openstack/ovsdbserver-nb-0"
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.900104 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a08b200d-7c65-4eae-a812-c0e0d46ea9f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a08b200d-7c65-4eae-a812-c0e0d46ea9f3\") pod \"ovsdbserver-nb-2\" (UID: \"e678ef37-68d9-467f-81ec-bcd62272c6b2\") " pod="openstack/ovsdbserver-nb-2"
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.900198 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8c1870ea-e7e7-4eb0-9ce9-d6ec1b9be43c-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"8c1870ea-e7e7-4eb0-9ce9-d6ec1b9be43c\") " pod="openstack/ovsdbserver-nb-0"
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.900314 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00d0685e-721c-4362-8758-bb6f4d558db1-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"00d0685e-721c-4362-8758-bb6f4d558db1\") " pod="openstack/ovsdbserver-nb-1"
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.900411 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/00d0685e-721c-4362-8758-bb6f4d558db1-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"00d0685e-721c-4362-8758-bb6f4d558db1\") " pod="openstack/ovsdbserver-nb-1"
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.900549 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00d0685e-721c-4362-8758-bb6f4d558db1-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"00d0685e-721c-4362-8758-bb6f4d558db1\") " pod="openstack/ovsdbserver-nb-1"
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003
14:19:28.900662 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00d0685e-721c-4362-8758-bb6f4d558db1-config\") pod \"ovsdbserver-nb-1\" (UID: \"00d0685e-721c-4362-8758-bb6f4d558db1\") " pod="openstack/ovsdbserver-nb-1" Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.900757 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e678ef37-68d9-467f-81ec-bcd62272c6b2-config\") pod \"ovsdbserver-nb-2\" (UID: \"e678ef37-68d9-467f-81ec-bcd62272c6b2\") " pod="openstack/ovsdbserver-nb-2" Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.900847 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c1870ea-e7e7-4eb0-9ce9-d6ec1b9be43c-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"8c1870ea-e7e7-4eb0-9ce9-d6ec1b9be43c\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.900938 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e678ef37-68d9-467f-81ec-bcd62272c6b2-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"e678ef37-68d9-467f-81ec-bcd62272c6b2\") " pod="openstack/ovsdbserver-nb-2" Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.901053 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-dd05672c-8f16-46aa-bdf3-7294765df421\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dd05672c-8f16-46aa-bdf3-7294765df421\") pod \"ovsdbserver-nb-1\" (UID: \"00d0685e-721c-4362-8758-bb6f4d558db1\") " pod="openstack/ovsdbserver-nb-1" Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.901154 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l559m\" (UniqueName: \"kubernetes.io/projected/00d0685e-721c-4362-8758-bb6f4d558db1-kube-api-access-l559m\") pod \"ovsdbserver-nb-1\" (UID: \"00d0685e-721c-4362-8758-bb6f4d558db1\") " pod="openstack/ovsdbserver-nb-1" Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.901208 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c1870ea-e7e7-4eb0-9ce9-d6ec1b9be43c-config\") pod \"ovsdbserver-nb-0\" (UID: \"8c1870ea-e7e7-4eb0-9ce9-d6ec1b9be43c\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.901226 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c1870ea-e7e7-4eb0-9ce9-d6ec1b9be43c-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"8c1870ea-e7e7-4eb0-9ce9-d6ec1b9be43c\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.901277 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.901328 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbmdh\" (UniqueName: \"kubernetes.io/projected/8c1870ea-e7e7-4eb0-9ce9-d6ec1b9be43c-kube-api-access-bbmdh\") pod \"ovsdbserver-nb-0\" (UID: \"8c1870ea-e7e7-4eb0-9ce9-d6ec1b9be43c\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.901365 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-5d3bc37e-a440-40ed-9f6d-c022ace80e8e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d3bc37e-a440-40ed-9f6d-c022ace80e8e\") pod \"ovsdbserver-nb-0\" (UID: \"8c1870ea-e7e7-4eb0-9ce9-d6ec1b9be43c\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.901555 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e678ef37-68d9-467f-81ec-bcd62272c6b2-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"e678ef37-68d9-467f-81ec-bcd62272c6b2\") " pod="openstack/ovsdbserver-nb-2" Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.902144 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00d0685e-721c-4362-8758-bb6f4d558db1-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"00d0685e-721c-4362-8758-bb6f4d558db1\") " pod="openstack/ovsdbserver-nb-1" Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.900804 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8c1870ea-e7e7-4eb0-9ce9-d6ec1b9be43c-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"8c1870ea-e7e7-4eb0-9ce9-d6ec1b9be43c\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.902629 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e678ef37-68d9-467f-81ec-bcd62272c6b2-config\") pod \"ovsdbserver-nb-2\" (UID: \"e678ef37-68d9-467f-81ec-bcd62272c6b2\") " pod="openstack/ovsdbserver-nb-2" Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.902926 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/00d0685e-721c-4362-8758-bb6f4d558db1-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"00d0685e-721c-4362-8758-bb6f4d558db1\") " pod="openstack/ovsdbserver-nb-1" Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.903342 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e678ef37-68d9-467f-81ec-bcd62272c6b2-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"e678ef37-68d9-467f-81ec-bcd62272c6b2\") " pod="openstack/ovsdbserver-nb-2" Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.903746 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00d0685e-721c-4362-8758-bb6f4d558db1-config\") pod \"ovsdbserver-nb-1\" (UID: \"00d0685e-721c-4362-8758-bb6f4d558db1\") " pod="openstack/ovsdbserver-nb-1" Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.903937 4962 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
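[annotation] The csi_attacher.go:380 lines here explain why each MountDevice is effectively a no-op: the kubevirt.io.hostpath-provisioner node plugin does not advertise the CSI STAGE_UNSTAGE_VOLUME capability, so the kubelet skips NodeStageVolume and the volume is published directly in the per-pod SetUp step. A sketch of that capability probe against a node plugin's gRPC endpoint; the socket path is an assumption for illustration:

    // csicaps.go - query a CSI node plugin for STAGE_UNSTAGE_VOLUME, the check
    // behind "attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set".
    package main

    import (
        "context"
        "fmt"

        csi "github.com/container-storage-interface/spec/lib/go/csi"
        "google.golang.org/grpc"
        "google.golang.org/grpc/credentials/insecure"
    )

    func main() {
        // Assumed socket path; real drivers register under /var/lib/kubelet/plugins/.
        conn, err := grpc.Dial("unix:///var/lib/kubelet/plugins/csi-hostpath/csi.sock",
            grpc.WithTransportCredentials(insecure.NewCredentials()))
        if err != nil {
            panic(err)
        }
        defer conn.Close()

        resp, err := csi.NewNodeClient(conn).NodeGetCapabilities(context.Background(),
            &csi.NodeGetCapabilitiesRequest{})
        if err != nil {
            panic(err)
        }
        for _, c := range resp.GetCapabilities() {
            if c.GetRpc().GetType() == csi.NodeServiceCapability_RPC_STAGE_UNSTAGE_VOLUME {
                fmt.Println("driver stages volumes: NodeStageVolume runs before publish")
                return
            }
        }
        // The branch taken in the journal above: no staging step at all.
        fmt.Println("STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...")
    }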
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.903977 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a08b200d-7c65-4eae-a812-c0e0d46ea9f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a08b200d-7c65-4eae-a812-c0e0d46ea9f3\") pod \"ovsdbserver-nb-2\" (UID: \"e678ef37-68d9-467f-81ec-bcd62272c6b2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/80a0531754c28060f459bb7ea0311854bf10701a8d8f2647bb6dbbfdae784c01/globalmount\"" pod="openstack/ovsdbserver-nb-2" Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.903939 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c1870ea-e7e7-4eb0-9ce9-d6ec1b9be43c-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"8c1870ea-e7e7-4eb0-9ce9-d6ec1b9be43c\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.904386 4962 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.904414 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5d3bc37e-a440-40ed-9f6d-c022ace80e8e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d3bc37e-a440-40ed-9f6d-c022ace80e8e\") pod \"ovsdbserver-nb-0\" (UID: \"8c1870ea-e7e7-4eb0-9ce9-d6ec1b9be43c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b2826f398c3e3e5aaddbb4303fd6fd0ae49346bb9a21fa0af6261ebbc5348961/globalmount\"" pod="openstack/ovsdbserver-nb-0" Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.908963 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00d0685e-721c-4362-8758-bb6f4d558db1-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"00d0685e-721c-4362-8758-bb6f4d558db1\") " pod="openstack/ovsdbserver-nb-1" Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.909529 4962 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
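[annotation] The "device mount path" values recorded in the MountDevice entries are per-volume global mount points of the form /var/lib/kubelet/plugins/kubernetes.io/csi/<driver>/<64-hex-digest>/globalmount; with staging skipped, the path is only registered, and each pod then sees the volume under its own /var/lib/kubelet/pods/<podUID>/volumes/kubernetes.io~csi/.../mount (the same kubernetes.io~<plugin> layout visible elsewhere in this journal). The 64-hex component looks like a SHA-256 digest computed inside the kubelet; the hash input below is an assumption, unverified against the digests above, so treat this as illustrative path assembly only:

    // csipaths.go - assemble the two mount locations seen in the log. The sha256
    // input (the CSI volume handle) is an assumption; pod UID and handle are
    // copied from the ovsdbserver-nb-2 entries above.
    package main

    import (
        "crypto/sha256"
        "encoding/hex"
        "fmt"
        "path/filepath"
    )

    func main() {
        const (
            driver = "kubevirt.io.hostpath-provisioner"
            handle = "pvc-a08b200d-7c65-4eae-a812-c0e0d46ea9f3"
            podUID = "e678ef37-68d9-467f-81ec-bcd62272c6b2"
        )

        sum := sha256.Sum256([]byte(handle)) // assumed digest input
        global := filepath.Join("/var/lib/kubelet/plugins/kubernetes.io/csi",
            driver, hex.EncodeToString(sum[:]), "globalmount")

        // Per-pod publish target populated by MountVolume.SetUp.
        perPod := filepath.Join("/var/lib/kubelet/pods", podUID,
            "volumes/kubernetes.io~csi", handle, "mount")

        fmt.Println(global)
        fmt.Println(perPod)
    }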
Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.909656 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-dd05672c-8f16-46aa-bdf3-7294765df421\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dd05672c-8f16-46aa-bdf3-7294765df421\") pod \"ovsdbserver-nb-1\" (UID: \"00d0685e-721c-4362-8758-bb6f4d558db1\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/351bcef3df9b1a30ff7df39c5224890938f014192aa7ee251ab3d5e99badfdc4/globalmount\"" pod="openstack/ovsdbserver-nb-1" Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.923325 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e678ef37-68d9-467f-81ec-bcd62272c6b2-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"e678ef37-68d9-467f-81ec-bcd62272c6b2\") " pod="openstack/ovsdbserver-nb-2" Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.924286 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c1870ea-e7e7-4eb0-9ce9-d6ec1b9be43c-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"8c1870ea-e7e7-4eb0-9ce9-d6ec1b9be43c\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.928329 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbmdh\" (UniqueName: \"kubernetes.io/projected/8c1870ea-e7e7-4eb0-9ce9-d6ec1b9be43c-kube-api-access-bbmdh\") pod \"ovsdbserver-nb-0\" (UID: \"8c1870ea-e7e7-4eb0-9ce9-d6ec1b9be43c\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.929427 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l559m\" (UniqueName: \"kubernetes.io/projected/00d0685e-721c-4362-8758-bb6f4d558db1-kube-api-access-l559m\") pod \"ovsdbserver-nb-1\" (UID: \"00d0685e-721c-4362-8758-bb6f4d558db1\") " pod="openstack/ovsdbserver-nb-1" Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.938684 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnrm4\" (UniqueName: \"kubernetes.io/projected/e678ef37-68d9-467f-81ec-bcd62272c6b2-kube-api-access-bnrm4\") pod \"ovsdbserver-nb-2\" (UID: \"e678ef37-68d9-467f-81ec-bcd62272c6b2\") " pod="openstack/ovsdbserver-nb-2" Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.944526 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5d3bc37e-a440-40ed-9f6d-c022ace80e8e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d3bc37e-a440-40ed-9f6d-c022ace80e8e\") pod \"ovsdbserver-nb-0\" (UID: \"8c1870ea-e7e7-4eb0-9ce9-d6ec1b9be43c\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.950155 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a08b200d-7c65-4eae-a812-c0e0d46ea9f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a08b200d-7c65-4eae-a812-c0e0d46ea9f3\") pod \"ovsdbserver-nb-2\" (UID: \"e678ef37-68d9-467f-81ec-bcd62272c6b2\") " pod="openstack/ovsdbserver-nb-2" Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.951584 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 03 14:19:28 crc kubenswrapper[4962]: I1003 14:19:28.954376 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-dd05672c-8f16-46aa-bdf3-7294765df421\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dd05672c-8f16-46aa-bdf3-7294765df421\") pod \"ovsdbserver-nb-1\" (UID: \"00d0685e-721c-4362-8758-bb6f4d558db1\") " pod="openstack/ovsdbserver-nb-1" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.003085 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b64491d9-7298-4635-883b-0e20686dd5a4-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"b64491d9-7298-4635-883b-0e20686dd5a4\") " pod="openstack/ovsdbserver-sb-2" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.003133 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b64491d9-7298-4635-883b-0e20686dd5a4-config\") pod \"ovsdbserver-sb-2\" (UID: \"b64491d9-7298-4635-883b-0e20686dd5a4\") " pod="openstack/ovsdbserver-sb-2" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.003156 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87hgq\" (UniqueName: \"kubernetes.io/projected/e2639e05-a11a-4b8b-8042-462df3d59df7-kube-api-access-87hgq\") pod \"ovsdbserver-sb-0\" (UID: \"e2639e05-a11a-4b8b-8042-462df3d59df7\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.003177 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d7219fc1-9612-42ac-9c90-68854d135f42\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7219fc1-9612-42ac-9c90-68854d135f42\") pod \"ovsdbserver-sb-1\" (UID: \"5cef6c4f-828f-4438-82a5-c1d42a7624a8\") " pod="openstack/ovsdbserver-sb-1" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.003325 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4a71e365-40f0-4c8f-80ed-9261412fe2a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4a71e365-40f0-4c8f-80ed-9261412fe2a4\") pod \"ovsdbserver-sb-2\" (UID: \"b64491d9-7298-4635-883b-0e20686dd5a4\") " pod="openstack/ovsdbserver-sb-2" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.003407 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2639e05-a11a-4b8b-8042-462df3d59df7-config\") pod \"ovsdbserver-sb-0\" (UID: \"e2639e05-a11a-4b8b-8042-462df3d59df7\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.003431 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cef6c4f-828f-4438-82a5-c1d42a7624a8-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"5cef6c4f-828f-4438-82a5-c1d42a7624a8\") " pod="openstack/ovsdbserver-sb-1" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.003555 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cl6s\" (UniqueName: \"kubernetes.io/projected/5cef6c4f-828f-4438-82a5-c1d42a7624a8-kube-api-access-5cl6s\") pod \"ovsdbserver-sb-1\" (UID: 
\"5cef6c4f-828f-4438-82a5-c1d42a7624a8\") " pod="openstack/ovsdbserver-sb-1" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.003615 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2639e05-a11a-4b8b-8042-462df3d59df7-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"e2639e05-a11a-4b8b-8042-462df3d59df7\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.003652 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b64491d9-7298-4635-883b-0e20686dd5a4-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"b64491d9-7298-4635-883b-0e20686dd5a4\") " pod="openstack/ovsdbserver-sb-2" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.003680 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e2639e05-a11a-4b8b-8042-462df3d59df7-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e2639e05-a11a-4b8b-8042-462df3d59df7\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.003698 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5cef6c4f-828f-4438-82a5-c1d42a7624a8-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"5cef6c4f-828f-4438-82a5-c1d42a7624a8\") " pod="openstack/ovsdbserver-sb-1" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.003780 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e2639e05-a11a-4b8b-8042-462df3d59df7-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e2639e05-a11a-4b8b-8042-462df3d59df7\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.003803 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5cef6c4f-828f-4438-82a5-c1d42a7624a8-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"5cef6c4f-828f-4438-82a5-c1d42a7624a8\") " pod="openstack/ovsdbserver-sb-1" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.003844 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cef6c4f-828f-4438-82a5-c1d42a7624a8-config\") pod \"ovsdbserver-sb-1\" (UID: \"5cef6c4f-828f-4438-82a5-c1d42a7624a8\") " pod="openstack/ovsdbserver-sb-1" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.003886 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w62tn\" (UniqueName: \"kubernetes.io/projected/b64491d9-7298-4635-883b-0e20686dd5a4-kube-api-access-w62tn\") pod \"ovsdbserver-sb-2\" (UID: \"b64491d9-7298-4635-883b-0e20686dd5a4\") " pod="openstack/ovsdbserver-sb-2" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.003933 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ee943d0a-51d8-4f1e-958d-d6b69ec89f45\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ee943d0a-51d8-4f1e-958d-d6b69ec89f45\") pod \"ovsdbserver-sb-0\" (UID: \"e2639e05-a11a-4b8b-8042-462df3d59df7\") " pod="openstack/ovsdbserver-sb-0" Oct 03 
14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.003969 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b64491d9-7298-4635-883b-0e20686dd5a4-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"b64491d9-7298-4635-883b-0e20686dd5a4\") " pod="openstack/ovsdbserver-sb-2" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.007307 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.010977 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.105579 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2639e05-a11a-4b8b-8042-462df3d59df7-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"e2639e05-a11a-4b8b-8042-462df3d59df7\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.105956 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b64491d9-7298-4635-883b-0e20686dd5a4-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"b64491d9-7298-4635-883b-0e20686dd5a4\") " pod="openstack/ovsdbserver-sb-2" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.106001 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e2639e05-a11a-4b8b-8042-462df3d59df7-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e2639e05-a11a-4b8b-8042-462df3d59df7\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.106023 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5cef6c4f-828f-4438-82a5-c1d42a7624a8-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"5cef6c4f-828f-4438-82a5-c1d42a7624a8\") " pod="openstack/ovsdbserver-sb-1" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.106047 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e2639e05-a11a-4b8b-8042-462df3d59df7-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e2639e05-a11a-4b8b-8042-462df3d59df7\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.106063 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5cef6c4f-828f-4438-82a5-c1d42a7624a8-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"5cef6c4f-828f-4438-82a5-c1d42a7624a8\") " pod="openstack/ovsdbserver-sb-1" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.106084 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cef6c4f-828f-4438-82a5-c1d42a7624a8-config\") pod \"ovsdbserver-sb-1\" (UID: \"5cef6c4f-828f-4438-82a5-c1d42a7624a8\") " pod="openstack/ovsdbserver-sb-1" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.106104 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w62tn\" (UniqueName: \"kubernetes.io/projected/b64491d9-7298-4635-883b-0e20686dd5a4-kube-api-access-w62tn\") pod \"ovsdbserver-sb-2\" (UID: 
\"b64491d9-7298-4635-883b-0e20686dd5a4\") " pod="openstack/ovsdbserver-sb-2" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.106131 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ee943d0a-51d8-4f1e-958d-d6b69ec89f45\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ee943d0a-51d8-4f1e-958d-d6b69ec89f45\") pod \"ovsdbserver-sb-0\" (UID: \"e2639e05-a11a-4b8b-8042-462df3d59df7\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.106151 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b64491d9-7298-4635-883b-0e20686dd5a4-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"b64491d9-7298-4635-883b-0e20686dd5a4\") " pod="openstack/ovsdbserver-sb-2" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.106193 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b64491d9-7298-4635-883b-0e20686dd5a4-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"b64491d9-7298-4635-883b-0e20686dd5a4\") " pod="openstack/ovsdbserver-sb-2" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.106215 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b64491d9-7298-4635-883b-0e20686dd5a4-config\") pod \"ovsdbserver-sb-2\" (UID: \"b64491d9-7298-4635-883b-0e20686dd5a4\") " pod="openstack/ovsdbserver-sb-2" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.106254 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d7219fc1-9612-42ac-9c90-68854d135f42\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7219fc1-9612-42ac-9c90-68854d135f42\") pod \"ovsdbserver-sb-1\" (UID: \"5cef6c4f-828f-4438-82a5-c1d42a7624a8\") " pod="openstack/ovsdbserver-sb-1" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.106279 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87hgq\" (UniqueName: \"kubernetes.io/projected/e2639e05-a11a-4b8b-8042-462df3d59df7-kube-api-access-87hgq\") pod \"ovsdbserver-sb-0\" (UID: \"e2639e05-a11a-4b8b-8042-462df3d59df7\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.106317 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4a71e365-40f0-4c8f-80ed-9261412fe2a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4a71e365-40f0-4c8f-80ed-9261412fe2a4\") pod \"ovsdbserver-sb-2\" (UID: \"b64491d9-7298-4635-883b-0e20686dd5a4\") " pod="openstack/ovsdbserver-sb-2" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.106349 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2639e05-a11a-4b8b-8042-462df3d59df7-config\") pod \"ovsdbserver-sb-0\" (UID: \"e2639e05-a11a-4b8b-8042-462df3d59df7\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.106374 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cef6c4f-828f-4438-82a5-c1d42a7624a8-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"5cef6c4f-828f-4438-82a5-c1d42a7624a8\") " pod="openstack/ovsdbserver-sb-1" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.106414 4962 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5cl6s\" (UniqueName: \"kubernetes.io/projected/5cef6c4f-828f-4438-82a5-c1d42a7624a8-kube-api-access-5cl6s\") pod \"ovsdbserver-sb-1\" (UID: \"5cef6c4f-828f-4438-82a5-c1d42a7624a8\") " pod="openstack/ovsdbserver-sb-1" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.107927 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cef6c4f-828f-4438-82a5-c1d42a7624a8-config\") pod \"ovsdbserver-sb-1\" (UID: \"5cef6c4f-828f-4438-82a5-c1d42a7624a8\") " pod="openstack/ovsdbserver-sb-1" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.108203 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e2639e05-a11a-4b8b-8042-462df3d59df7-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e2639e05-a11a-4b8b-8042-462df3d59df7\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.108806 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5cef6c4f-828f-4438-82a5-c1d42a7624a8-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"5cef6c4f-828f-4438-82a5-c1d42a7624a8\") " pod="openstack/ovsdbserver-sb-1" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.109233 4962 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.109254 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ee943d0a-51d8-4f1e-958d-d6b69ec89f45\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ee943d0a-51d8-4f1e-958d-d6b69ec89f45\") pod \"ovsdbserver-sb-0\" (UID: \"e2639e05-a11a-4b8b-8042-462df3d59df7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/63d617a9ce8b9eed59e9ab3ecade09d8f2c6b4e8ef23325d3fcbc8d04e8c396f/globalmount\"" pod="openstack/ovsdbserver-sb-0" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.109604 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b64491d9-7298-4635-883b-0e20686dd5a4-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"b64491d9-7298-4635-883b-0e20686dd5a4\") " pod="openstack/ovsdbserver-sb-2" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.109764 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b64491d9-7298-4635-883b-0e20686dd5a4-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"b64491d9-7298-4635-883b-0e20686dd5a4\") " pod="openstack/ovsdbserver-sb-2" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.110052 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2639e05-a11a-4b8b-8042-462df3d59df7-config\") pod \"ovsdbserver-sb-0\" (UID: \"e2639e05-a11a-4b8b-8042-462df3d59df7\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.110607 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b64491d9-7298-4635-883b-0e20686dd5a4-config\") pod \"ovsdbserver-sb-2\" (UID: \"b64491d9-7298-4635-883b-0e20686dd5a4\") " pod="openstack/ovsdbserver-sb-2" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.110817 4962 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e2639e05-a11a-4b8b-8042-462df3d59df7-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e2639e05-a11a-4b8b-8042-462df3d59df7\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.111868 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b64491d9-7298-4635-883b-0e20686dd5a4-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"b64491d9-7298-4635-883b-0e20686dd5a4\") " pod="openstack/ovsdbserver-sb-2" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.112774 4962 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.112804 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4a71e365-40f0-4c8f-80ed-9261412fe2a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4a71e365-40f0-4c8f-80ed-9261412fe2a4\") pod \"ovsdbserver-sb-2\" (UID: \"b64491d9-7298-4635-883b-0e20686dd5a4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e411a392173d41e382a8aebb22ea943178571531714bf80e57937734c35c32bd/globalmount\"" pod="openstack/ovsdbserver-sb-2" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.112887 4962 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.112967 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d7219fc1-9612-42ac-9c90-68854d135f42\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7219fc1-9612-42ac-9c90-68854d135f42\") pod \"ovsdbserver-sb-1\" (UID: \"5cef6c4f-828f-4438-82a5-c1d42a7624a8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d9c8fbe489f2bb37507dcacbc8b311d015ee1edb29e2f5e4a71cc18a29cafb3a/globalmount\"" pod="openstack/ovsdbserver-sb-1" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.116800 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5cef6c4f-828f-4438-82a5-c1d42a7624a8-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"5cef6c4f-828f-4438-82a5-c1d42a7624a8\") " pod="openstack/ovsdbserver-sb-1" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.119146 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2639e05-a11a-4b8b-8042-462df3d59df7-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"e2639e05-a11a-4b8b-8042-462df3d59df7\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.120229 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cef6c4f-828f-4438-82a5-c1d42a7624a8-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"5cef6c4f-828f-4438-82a5-c1d42a7624a8\") " pod="openstack/ovsdbserver-sb-1" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.147398 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cl6s\" (UniqueName: \"kubernetes.io/projected/5cef6c4f-828f-4438-82a5-c1d42a7624a8-kube-api-access-5cl6s\") pod 
\"ovsdbserver-sb-1\" (UID: \"5cef6c4f-828f-4438-82a5-c1d42a7624a8\") " pod="openstack/ovsdbserver-sb-1" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.148864 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w62tn\" (UniqueName: \"kubernetes.io/projected/b64491d9-7298-4635-883b-0e20686dd5a4-kube-api-access-w62tn\") pod \"ovsdbserver-sb-2\" (UID: \"b64491d9-7298-4635-883b-0e20686dd5a4\") " pod="openstack/ovsdbserver-sb-2" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.150471 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87hgq\" (UniqueName: \"kubernetes.io/projected/e2639e05-a11a-4b8b-8042-462df3d59df7-kube-api-access-87hgq\") pod \"ovsdbserver-sb-0\" (UID: \"e2639e05-a11a-4b8b-8042-462df3d59df7\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.165037 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d7219fc1-9612-42ac-9c90-68854d135f42\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7219fc1-9612-42ac-9c90-68854d135f42\") pod \"ovsdbserver-sb-1\" (UID: \"5cef6c4f-828f-4438-82a5-c1d42a7624a8\") " pod="openstack/ovsdbserver-sb-1" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.166617 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ee943d0a-51d8-4f1e-958d-d6b69ec89f45\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ee943d0a-51d8-4f1e-958d-d6b69ec89f45\") pod \"ovsdbserver-sb-0\" (UID: \"e2639e05-a11a-4b8b-8042-462df3d59df7\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.173702 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4a71e365-40f0-4c8f-80ed-9261412fe2a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4a71e365-40f0-4c8f-80ed-9261412fe2a4\") pod \"ovsdbserver-sb-2\" (UID: \"b64491d9-7298-4635-883b-0e20686dd5a4\") " pod="openstack/ovsdbserver-sb-2" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.182571 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.191080 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.442827 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.477803 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 03 14:19:29 crc kubenswrapper[4962]: W1003 14:19:29.480498 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c1870ea_e7e7_4eb0_9ce9_d6ec1b9be43c.slice/crio-e01cc0fff9d5f62ad89343b0d5aa8bf36dc1111198f72ada9c0150cdafb684dc WatchSource:0}: Error finding container e01cc0fff9d5f62ad89343b0d5aa8bf36dc1111198f72ada9c0150cdafb684dc: Status 404 returned error can't find the container with id e01cc0fff9d5f62ad89343b0d5aa8bf36dc1111198f72ada9c0150cdafb684dc Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.627782 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Oct 03 14:19:29 crc kubenswrapper[4962]: W1003 14:19:29.628776 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00d0685e_721c_4362_8758_bb6f4d558db1.slice/crio-3982026d89fb74117858c1da63928dc324cd1ce98b760d089c726f1875324e73 WatchSource:0}: Error finding container 3982026d89fb74117858c1da63928dc324cd1ce98b760d089c726f1875324e73: Status 404 returned error can't find the container with id 3982026d89fb74117858c1da63928dc324cd1ce98b760d089c726f1875324e73 Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.742532 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Oct 03 14:19:29 crc kubenswrapper[4962]: W1003 14:19:29.745521 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb64491d9_7298_4635_883b_0e20686dd5a4.slice/crio-469c08738f54bfd1222bfb47533ca33cfec23e650376ae6d644ce0d40eb63c58 WatchSource:0}: Error finding container 469c08738f54bfd1222bfb47533ca33cfec23e650376ae6d644ce0d40eb63c58: Status 404 returned error can't find the container with id 469c08738f54bfd1222bfb47533ca33cfec23e650376ae6d644ce0d40eb63c58 Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.934087 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"b64491d9-7298-4635-883b-0e20686dd5a4","Type":"ContainerStarted","Data":"b332ee100d6f660a44a9eb7ed89acebf34808778fdb0ee5968d2c7bb4f61bd87"} Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.934796 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"b64491d9-7298-4635-883b-0e20686dd5a4","Type":"ContainerStarted","Data":"469c08738f54bfd1222bfb47533ca33cfec23e650376ae6d644ce0d40eb63c58"} Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.938886 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"00d0685e-721c-4362-8758-bb6f4d558db1","Type":"ContainerStarted","Data":"4b131a97988f691d5c9682fa2abfe0a6eafed027bbeeab7237b266d4bc777471"} Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.939074 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"00d0685e-721c-4362-8758-bb6f4d558db1","Type":"ContainerStarted","Data":"3982026d89fb74117858c1da63928dc324cd1ce98b760d089c726f1875324e73"} Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.942279 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"8c1870ea-e7e7-4eb0-9ce9-d6ec1b9be43c","Type":"ContainerStarted","Data":"7939c01fe8707733c4ec228b464da3f93ed063a08a2686bc5340f1231a87c087"} Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.942324 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"8c1870ea-e7e7-4eb0-9ce9-d6ec1b9be43c","Type":"ContainerStarted","Data":"250ee057a88f2db343c4da059b03aa47a94fd70152c195663e79c905d4821677"} Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.942335 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"8c1870ea-e7e7-4eb0-9ce9-d6ec1b9be43c","Type":"ContainerStarted","Data":"e01cc0fff9d5f62ad89343b0d5aa8bf36dc1111198f72ada9c0150cdafb684dc"} Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.970613 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=2.970595415 podStartE2EDuration="2.970595415s" podCreationTimestamp="2025-10-03 14:19:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:19:29.961270871 +0000 UTC m=+5378.365168706" watchObservedRunningTime="2025-10-03 14:19:29.970595415 +0000 UTC m=+5378.374493250" Oct 03 14:19:29 crc kubenswrapper[4962]: I1003 14:19:29.970815 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 03 14:19:30 crc kubenswrapper[4962]: W1003 14:19:30.382189 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5cef6c4f_828f_4438_82a5_c1d42a7624a8.slice/crio-5a286dcf92d5844935ef1c0372eaeefd49ae8cc40bac13bb4d2866850b822b44 WatchSource:0}: Error finding container 5a286dcf92d5844935ef1c0372eaeefd49ae8cc40bac13bb4d2866850b822b44: Status 404 returned error can't find the container with id 5a286dcf92d5844935ef1c0372eaeefd49ae8cc40bac13bb4d2866850b822b44 Oct 03 14:19:30 crc kubenswrapper[4962]: I1003 14:19:30.382249 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Oct 03 14:19:30 crc kubenswrapper[4962]: I1003 14:19:30.723798 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Oct 03 14:19:30 crc kubenswrapper[4962]: W1003 14:19:30.728450 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode678ef37_68d9_467f_81ec_bcd62272c6b2.slice/crio-6b8695e67f92fedadb96a720445912c147c7339322fbfa78c0d6d15d3ab66b9b WatchSource:0}: Error finding container 6b8695e67f92fedadb96a720445912c147c7339322fbfa78c0d6d15d3ab66b9b: Status 404 returned error can't find the container with id 6b8695e67f92fedadb96a720445912c147c7339322fbfa78c0d6d15d3ab66b9b Oct 03 14:19:30 crc kubenswrapper[4962]: I1003 14:19:30.954911 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"e678ef37-68d9-467f-81ec-bcd62272c6b2","Type":"ContainerStarted","Data":"1d1977cce5335826c7b75d35c92a052b683f012e01f24c0f9cd66c2741a4831b"} Oct 03 14:19:30 crc kubenswrapper[4962]: I1003 14:19:30.955249 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"e678ef37-68d9-467f-81ec-bcd62272c6b2","Type":"ContainerStarted","Data":"6b8695e67f92fedadb96a720445912c147c7339322fbfa78c0d6d15d3ab66b9b"} Oct 03 14:19:30 crc kubenswrapper[4962]: I1003 14:19:30.959519 4962 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"5cef6c4f-828f-4438-82a5-c1d42a7624a8","Type":"ContainerStarted","Data":"94ddb27deabfcfcd314169661d2faf9516694e9514d9cbb09963d87cc359ddc7"}
Oct 03 14:19:30 crc kubenswrapper[4962]: I1003 14:19:30.959568 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"5cef6c4f-828f-4438-82a5-c1d42a7624a8","Type":"ContainerStarted","Data":"0d6b5daa707b4f9d950b32b0cd3f41ce4d1aa87d4f353ee30f891c7ef15efefd"}
Oct 03 14:19:30 crc kubenswrapper[4962]: I1003 14:19:30.959582 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"5cef6c4f-828f-4438-82a5-c1d42a7624a8","Type":"ContainerStarted","Data":"5a286dcf92d5844935ef1c0372eaeefd49ae8cc40bac13bb4d2866850b822b44"}
Oct 03 14:19:30 crc kubenswrapper[4962]: I1003 14:19:30.970794 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e2639e05-a11a-4b8b-8042-462df3d59df7","Type":"ContainerStarted","Data":"707bc9a6ba2eed2b16f10570e65863fc4178a8ab2de53c8d5da95bf65a5164e5"}
Oct 03 14:19:30 crc kubenswrapper[4962]: I1003 14:19:30.970866 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e2639e05-a11a-4b8b-8042-462df3d59df7","Type":"ContainerStarted","Data":"91e4584e8cc5baf533b0d3c6cee21ffe98f1abfb2df6faed00e54ea555f4d0c2"}
Oct 03 14:19:30 crc kubenswrapper[4962]: I1003 14:19:30.970887 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e2639e05-a11a-4b8b-8042-462df3d59df7","Type":"ContainerStarted","Data":"ac385069ca0af1ee502e4cd0787409cf63a64889d744529e2ea2ade6f9d07465"}
Oct 03 14:19:30 crc kubenswrapper[4962]: I1003 14:19:30.974744 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"b64491d9-7298-4635-883b-0e20686dd5a4","Type":"ContainerStarted","Data":"db5352f123336a3ef6514f99e04f0520529aa3fb8c6bca28f4b082e4963de419"}
Oct 03 14:19:30 crc kubenswrapper[4962]: I1003 14:19:30.977432 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"00d0685e-721c-4362-8758-bb6f4d558db1","Type":"ContainerStarted","Data":"ebb857882a9e811e530c496014e2f8aa79c8f747e3e4fcb833fe99eecce48ca7"}
Oct 03 14:19:31 crc kubenswrapper[4962]: I1003 14:19:31.006932 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=4.006908212 podStartE2EDuration="4.006908212s" podCreationTimestamp="2025-10-03 14:19:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:19:31.000161825 +0000 UTC m=+5379.404059700" watchObservedRunningTime="2025-10-03 14:19:31.006908212 +0000 UTC m=+5379.410806047"
Oct 03 14:19:31 crc kubenswrapper[4962]: I1003 14:19:31.028293 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=4.02827451 podStartE2EDuration="4.02827451s" podCreationTimestamp="2025-10-03 14:19:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:19:31.025347684 +0000 UTC m=+5379.429245529" watchObservedRunningTime="2025-10-03 14:19:31.02827451 +0000 UTC m=+5379.432172365"
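[annotation] The pod_startup_latency_tracker.go:104 entries report podStartE2EDuration as observedRunningTime minus podCreationTimestamp, and the zeroed "0001-01-01" pull timestamps mean no image pull was needed (the images were already on the node, which is why all of these servers start in about four seconds). A sketch of the arithmetic using the ovsdbserver-nb-1 values above:

    // startuplatency.go - recompute the startup duration from the two timestamps
    // in the ovsdbserver-nb-1 entry; the tracker's own figure (4.02827451s) is
    // taken at watch-observation time, so this only matches approximately.
    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
        created, err := time.Parse(layout, "2025-10-03 14:19:27 +0000 UTC")
        if err != nil {
            panic(err)
        }
        running, err := time.Parse(layout, "2025-10-03 14:19:31.025347684 +0000 UTC")
        if err != nil {
            panic(err)
        }
        fmt.Println(running.Sub(created)) // 4.025347684s
    }

Oct 03 14:19:31 crc kubenswrapper[4962]: I1003 14:19:31.048845 4962 pod_startup_latency_tracker.go:104]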
"Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=4.048828558 podStartE2EDuration="4.048828558s" podCreationTimestamp="2025-10-03 14:19:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:19:31.046965639 +0000 UTC m=+5379.450863514" watchObservedRunningTime="2025-10-03 14:19:31.048828558 +0000 UTC m=+5379.452726393" Oct 03 14:19:31 crc kubenswrapper[4962]: I1003 14:19:31.069831 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.069812516 podStartE2EDuration="4.069812516s" podCreationTimestamp="2025-10-03 14:19:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:19:31.064726933 +0000 UTC m=+5379.468624788" watchObservedRunningTime="2025-10-03 14:19:31.069812516 +0000 UTC m=+5379.473710351" Oct 03 14:19:31 crc kubenswrapper[4962]: I1003 14:19:31.951936 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 03 14:19:31 crc kubenswrapper[4962]: I1003 14:19:31.993600 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"e678ef37-68d9-467f-81ec-bcd62272c6b2","Type":"ContainerStarted","Data":"99b779f66cd80ba3e969ffe7495c851893b913def9797509c997f3904cf37d2f"} Oct 03 14:19:32 crc kubenswrapper[4962]: I1003 14:19:32.007801 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2" Oct 03 14:19:32 crc kubenswrapper[4962]: I1003 14:19:32.012235 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1" Oct 03 14:19:32 crc kubenswrapper[4962]: I1003 14:19:32.022350 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=5.022324402 podStartE2EDuration="5.022324402s" podCreationTimestamp="2025-10-03 14:19:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:19:32.01383799 +0000 UTC m=+5380.417735885" watchObservedRunningTime="2025-10-03 14:19:32.022324402 +0000 UTC m=+5380.426222267" Oct 03 14:19:32 crc kubenswrapper[4962]: I1003 14:19:32.053469 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1" Oct 03 14:19:32 crc kubenswrapper[4962]: I1003 14:19:32.183030 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Oct 03 14:19:32 crc kubenswrapper[4962]: I1003 14:19:32.191251 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1" Oct 03 14:19:32 crc kubenswrapper[4962]: I1003 14:19:32.221414 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2" Oct 03 14:19:32 crc kubenswrapper[4962]: I1003 14:19:32.443236 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 03 14:19:32 crc kubenswrapper[4962]: I1003 14:19:32.999728 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2" Oct 03 14:19:33 crc kubenswrapper[4962]: I1003 14:19:33.000406 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ovsdbserver-nb-1"
Oct 03 14:19:33 crc kubenswrapper[4962]: I1003 14:19:33.952578 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Oct 03 14:19:34 crc kubenswrapper[4962]: I1003 14:19:34.007804 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2"
Oct 03 14:19:34 crc kubenswrapper[4962]: I1003 14:19:34.049737 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1"
Oct 03 14:19:34 crc kubenswrapper[4962]: I1003 14:19:34.058416 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2"
Oct 03 14:19:34 crc kubenswrapper[4962]: I1003 14:19:34.191259 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1"
Oct 03 14:19:34 crc kubenswrapper[4962]: I1003 14:19:34.237481 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-64848558ff-dk558"]
Oct 03 14:19:34 crc kubenswrapper[4962]: I1003 14:19:34.238914 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64848558ff-dk558"
Oct 03 14:19:34 crc kubenswrapper[4962]: I1003 14:19:34.243213 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Oct 03 14:19:34 crc kubenswrapper[4962]: I1003 14:19:34.245762 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64848558ff-dk558"]
Oct 03 14:19:34 crc kubenswrapper[4962]: I1003 14:19:34.296991 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41470ea5-a135-47b9-a999-1fcbd4a4d737-config\") pod \"dnsmasq-dns-64848558ff-dk558\" (UID: \"41470ea5-a135-47b9-a999-1fcbd4a4d737\") " pod="openstack/dnsmasq-dns-64848558ff-dk558"
Oct 03 14:19:34 crc kubenswrapper[4962]: I1003 14:19:34.297424 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/41470ea5-a135-47b9-a999-1fcbd4a4d737-dns-svc\") pod \"dnsmasq-dns-64848558ff-dk558\" (UID: \"41470ea5-a135-47b9-a999-1fcbd4a4d737\") " pod="openstack/dnsmasq-dns-64848558ff-dk558"
Oct 03 14:19:34 crc kubenswrapper[4962]: I1003 14:19:34.297456 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncd72\" (UniqueName: \"kubernetes.io/projected/41470ea5-a135-47b9-a999-1fcbd4a4d737-kube-api-access-ncd72\") pod \"dnsmasq-dns-64848558ff-dk558\" (UID: \"41470ea5-a135-47b9-a999-1fcbd4a4d737\") " pod="openstack/dnsmasq-dns-64848558ff-dk558"
Oct 03 14:19:34 crc kubenswrapper[4962]: I1003 14:19:34.297487 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/41470ea5-a135-47b9-a999-1fcbd4a4d737-ovsdbserver-nb\") pod \"dnsmasq-dns-64848558ff-dk558\" (UID: \"41470ea5-a135-47b9-a999-1fcbd4a4d737\") " pod="openstack/dnsmasq-dns-64848558ff-dk558"
Oct 03 14:19:34 crc kubenswrapper[4962]: I1003 14:19:34.364261 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64848558ff-dk558"]
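[annotation] The "SyncLoop DELETE" at 14:19:34.364261 arrives while volume setup for dnsmasq-dns-64848558ff-dk558 is still in flight, so the kubelet cancels that pod worker's context and the setup aborts with the error-level entry that follows ("context canceled"). Since the replacement pod dnsmasq-dns-76c87b4597-2bjln is created moments later with an additional ovsdbserver-sb volume, this looks like ordinary rollout churn rather than a mount failure. A stdlib-only sketch of a canceled context aborting a multi-step setup; all names are invented for illustration:

    // cancelsetup.go - why the next entry ends in "context canceled": an in-flight
    // step observes ctx.Done() once the DELETE cancels the pod worker's context.
    package main

    import (
        "context"
        "fmt"
        "time"
    )

    func mountAll(ctx context.Context, volumes []string) error {
        for _, v := range volumes {
            select {
            case <-ctx.Done():
                return fmt.Errorf("failed to process volumes=[]: %w", ctx.Err())
            case <-time.After(50 * time.Millisecond): // stand-in for one mount step
                fmt.Println("mounted", v)
            }
        }
        return nil
    }

    func main() {
        ctx, cancel := context.WithCancel(context.Background())
        go func() { // stand-in for the DELETE arriving mid-setup
            time.Sleep(120 * time.Millisecond)
            cancel()
        }()
        err := mountAll(ctx, []string{"config", "dns-svc", "kube-api-access-ncd72", "ovsdbserver-nb"})
        fmt.Println("Error syncing pod, skipping:", err)
    }

Oct 03 14:19:34 crc kubenswrapper[4962]: E1003 14:19:34.364908 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-ncd72 ovsdbserver-nb], unattached volumes=[], failed to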
process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-64848558ff-dk558" podUID="41470ea5-a135-47b9-a999-1fcbd4a4d737" Oct 03 14:19:34 crc kubenswrapper[4962]: I1003 14:19:34.390563 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76c87b4597-2bjln"] Oct 03 14:19:34 crc kubenswrapper[4962]: I1003 14:19:34.392245 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76c87b4597-2bjln" Oct 03 14:19:34 crc kubenswrapper[4962]: I1003 14:19:34.394898 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 03 14:19:34 crc kubenswrapper[4962]: I1003 14:19:34.399452 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/41470ea5-a135-47b9-a999-1fcbd4a4d737-dns-svc\") pod \"dnsmasq-dns-64848558ff-dk558\" (UID: \"41470ea5-a135-47b9-a999-1fcbd4a4d737\") " pod="openstack/dnsmasq-dns-64848558ff-dk558" Oct 03 14:19:34 crc kubenswrapper[4962]: I1003 14:19:34.399500 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncd72\" (UniqueName: \"kubernetes.io/projected/41470ea5-a135-47b9-a999-1fcbd4a4d737-kube-api-access-ncd72\") pod \"dnsmasq-dns-64848558ff-dk558\" (UID: \"41470ea5-a135-47b9-a999-1fcbd4a4d737\") " pod="openstack/dnsmasq-dns-64848558ff-dk558" Oct 03 14:19:34 crc kubenswrapper[4962]: I1003 14:19:34.399529 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/41470ea5-a135-47b9-a999-1fcbd4a4d737-ovsdbserver-nb\") pod \"dnsmasq-dns-64848558ff-dk558\" (UID: \"41470ea5-a135-47b9-a999-1fcbd4a4d737\") " pod="openstack/dnsmasq-dns-64848558ff-dk558" Oct 03 14:19:34 crc kubenswrapper[4962]: I1003 14:19:34.399630 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41470ea5-a135-47b9-a999-1fcbd4a4d737-config\") pod \"dnsmasq-dns-64848558ff-dk558\" (UID: \"41470ea5-a135-47b9-a999-1fcbd4a4d737\") " pod="openstack/dnsmasq-dns-64848558ff-dk558" Oct 03 14:19:34 crc kubenswrapper[4962]: I1003 14:19:34.400535 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41470ea5-a135-47b9-a999-1fcbd4a4d737-config\") pod \"dnsmasq-dns-64848558ff-dk558\" (UID: \"41470ea5-a135-47b9-a999-1fcbd4a4d737\") " pod="openstack/dnsmasq-dns-64848558ff-dk558" Oct 03 14:19:34 crc kubenswrapper[4962]: I1003 14:19:34.401161 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/41470ea5-a135-47b9-a999-1fcbd4a4d737-dns-svc\") pod \"dnsmasq-dns-64848558ff-dk558\" (UID: \"41470ea5-a135-47b9-a999-1fcbd4a4d737\") " pod="openstack/dnsmasq-dns-64848558ff-dk558" Oct 03 14:19:34 crc kubenswrapper[4962]: I1003 14:19:34.401522 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76c87b4597-2bjln"] Oct 03 14:19:34 crc kubenswrapper[4962]: I1003 14:19:34.401688 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/41470ea5-a135-47b9-a999-1fcbd4a4d737-ovsdbserver-nb\") pod \"dnsmasq-dns-64848558ff-dk558\" (UID: \"41470ea5-a135-47b9-a999-1fcbd4a4d737\") " pod="openstack/dnsmasq-dns-64848558ff-dk558" Oct 03 14:19:34 crc kubenswrapper[4962]: I1003 14:19:34.430497 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ncd72\" (UniqueName: \"kubernetes.io/projected/41470ea5-a135-47b9-a999-1fcbd4a4d737-kube-api-access-ncd72\") pod \"dnsmasq-dns-64848558ff-dk558\" (UID: \"41470ea5-a135-47b9-a999-1fcbd4a4d737\") " pod="openstack/dnsmasq-dns-64848558ff-dk558" Oct 03 14:19:34 crc kubenswrapper[4962]: I1003 14:19:34.443970 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 03 14:19:34 crc kubenswrapper[4962]: I1003 14:19:34.501790 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de03c973-dd77-4560-975d-5bcc19732dc2-config\") pod \"dnsmasq-dns-76c87b4597-2bjln\" (UID: \"de03c973-dd77-4560-975d-5bcc19732dc2\") " pod="openstack/dnsmasq-dns-76c87b4597-2bjln" Oct 03 14:19:34 crc kubenswrapper[4962]: I1003 14:19:34.501914 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/de03c973-dd77-4560-975d-5bcc19732dc2-ovsdbserver-sb\") pod \"dnsmasq-dns-76c87b4597-2bjln\" (UID: \"de03c973-dd77-4560-975d-5bcc19732dc2\") " pod="openstack/dnsmasq-dns-76c87b4597-2bjln" Oct 03 14:19:34 crc kubenswrapper[4962]: I1003 14:19:34.502045 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/de03c973-dd77-4560-975d-5bcc19732dc2-ovsdbserver-nb\") pod \"dnsmasq-dns-76c87b4597-2bjln\" (UID: \"de03c973-dd77-4560-975d-5bcc19732dc2\") " pod="openstack/dnsmasq-dns-76c87b4597-2bjln" Oct 03 14:19:34 crc kubenswrapper[4962]: I1003 14:19:34.502417 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de03c973-dd77-4560-975d-5bcc19732dc2-dns-svc\") pod \"dnsmasq-dns-76c87b4597-2bjln\" (UID: \"de03c973-dd77-4560-975d-5bcc19732dc2\") " pod="openstack/dnsmasq-dns-76c87b4597-2bjln" Oct 03 14:19:34 crc kubenswrapper[4962]: I1003 14:19:34.502500 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgrrb\" (UniqueName: \"kubernetes.io/projected/de03c973-dd77-4560-975d-5bcc19732dc2-kube-api-access-wgrrb\") pod \"dnsmasq-dns-76c87b4597-2bjln\" (UID: \"de03c973-dd77-4560-975d-5bcc19732dc2\") " pod="openstack/dnsmasq-dns-76c87b4597-2bjln" Oct 03 14:19:34 crc kubenswrapper[4962]: I1003 14:19:34.603821 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/de03c973-dd77-4560-975d-5bcc19732dc2-ovsdbserver-nb\") pod \"dnsmasq-dns-76c87b4597-2bjln\" (UID: \"de03c973-dd77-4560-975d-5bcc19732dc2\") " pod="openstack/dnsmasq-dns-76c87b4597-2bjln" Oct 03 14:19:34 crc kubenswrapper[4962]: I1003 14:19:34.603886 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de03c973-dd77-4560-975d-5bcc19732dc2-dns-svc\") pod \"dnsmasq-dns-76c87b4597-2bjln\" (UID: \"de03c973-dd77-4560-975d-5bcc19732dc2\") " pod="openstack/dnsmasq-dns-76c87b4597-2bjln" Oct 03 14:19:34 crc kubenswrapper[4962]: I1003 14:19:34.603916 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgrrb\" (UniqueName: \"kubernetes.io/projected/de03c973-dd77-4560-975d-5bcc19732dc2-kube-api-access-wgrrb\") pod 
\"dnsmasq-dns-76c87b4597-2bjln\" (UID: \"de03c973-dd77-4560-975d-5bcc19732dc2\") " pod="openstack/dnsmasq-dns-76c87b4597-2bjln" Oct 03 14:19:34 crc kubenswrapper[4962]: I1003 14:19:34.603992 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de03c973-dd77-4560-975d-5bcc19732dc2-config\") pod \"dnsmasq-dns-76c87b4597-2bjln\" (UID: \"de03c973-dd77-4560-975d-5bcc19732dc2\") " pod="openstack/dnsmasq-dns-76c87b4597-2bjln" Oct 03 14:19:34 crc kubenswrapper[4962]: I1003 14:19:34.604045 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/de03c973-dd77-4560-975d-5bcc19732dc2-ovsdbserver-sb\") pod \"dnsmasq-dns-76c87b4597-2bjln\" (UID: \"de03c973-dd77-4560-975d-5bcc19732dc2\") " pod="openstack/dnsmasq-dns-76c87b4597-2bjln" Oct 03 14:19:34 crc kubenswrapper[4962]: I1003 14:19:34.604780 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/de03c973-dd77-4560-975d-5bcc19732dc2-ovsdbserver-nb\") pod \"dnsmasq-dns-76c87b4597-2bjln\" (UID: \"de03c973-dd77-4560-975d-5bcc19732dc2\") " pod="openstack/dnsmasq-dns-76c87b4597-2bjln" Oct 03 14:19:34 crc kubenswrapper[4962]: I1003 14:19:34.604951 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/de03c973-dd77-4560-975d-5bcc19732dc2-ovsdbserver-sb\") pod \"dnsmasq-dns-76c87b4597-2bjln\" (UID: \"de03c973-dd77-4560-975d-5bcc19732dc2\") " pod="openstack/dnsmasq-dns-76c87b4597-2bjln" Oct 03 14:19:34 crc kubenswrapper[4962]: I1003 14:19:34.605399 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de03c973-dd77-4560-975d-5bcc19732dc2-config\") pod \"dnsmasq-dns-76c87b4597-2bjln\" (UID: \"de03c973-dd77-4560-975d-5bcc19732dc2\") " pod="openstack/dnsmasq-dns-76c87b4597-2bjln" Oct 03 14:19:34 crc kubenswrapper[4962]: I1003 14:19:34.605800 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de03c973-dd77-4560-975d-5bcc19732dc2-dns-svc\") pod \"dnsmasq-dns-76c87b4597-2bjln\" (UID: \"de03c973-dd77-4560-975d-5bcc19732dc2\") " pod="openstack/dnsmasq-dns-76c87b4597-2bjln" Oct 03 14:19:34 crc kubenswrapper[4962]: I1003 14:19:34.629774 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgrrb\" (UniqueName: \"kubernetes.io/projected/de03c973-dd77-4560-975d-5bcc19732dc2-kube-api-access-wgrrb\") pod \"dnsmasq-dns-76c87b4597-2bjln\" (UID: \"de03c973-dd77-4560-975d-5bcc19732dc2\") " pod="openstack/dnsmasq-dns-76c87b4597-2bjln" Oct 03 14:19:34 crc kubenswrapper[4962]: I1003 14:19:34.709283 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76c87b4597-2bjln" Oct 03 14:19:34 crc kubenswrapper[4962]: I1003 14:19:34.992350 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 03 14:19:35 crc kubenswrapper[4962]: I1003 14:19:35.031095 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64848558ff-dk558" Oct 03 14:19:35 crc kubenswrapper[4962]: I1003 14:19:35.040447 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 03 14:19:35 crc kubenswrapper[4962]: I1003 14:19:35.043934 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64848558ff-dk558" Oct 03 14:19:35 crc kubenswrapper[4962]: I1003 14:19:35.072230 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" Oct 03 14:19:35 crc kubenswrapper[4962]: I1003 14:19:35.124254 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41470ea5-a135-47b9-a999-1fcbd4a4d737-config\") pod \"41470ea5-a135-47b9-a999-1fcbd4a4d737\" (UID: \"41470ea5-a135-47b9-a999-1fcbd4a4d737\") " Oct 03 14:19:35 crc kubenswrapper[4962]: I1003 14:19:35.124329 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/41470ea5-a135-47b9-a999-1fcbd4a4d737-ovsdbserver-nb\") pod \"41470ea5-a135-47b9-a999-1fcbd4a4d737\" (UID: \"41470ea5-a135-47b9-a999-1fcbd4a4d737\") " Oct 03 14:19:35 crc kubenswrapper[4962]: I1003 14:19:35.124389 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/41470ea5-a135-47b9-a999-1fcbd4a4d737-dns-svc\") pod \"41470ea5-a135-47b9-a999-1fcbd4a4d737\" (UID: \"41470ea5-a135-47b9-a999-1fcbd4a4d737\") " Oct 03 14:19:35 crc kubenswrapper[4962]: I1003 14:19:35.124433 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncd72\" (UniqueName: \"kubernetes.io/projected/41470ea5-a135-47b9-a999-1fcbd4a4d737-kube-api-access-ncd72\") pod \"41470ea5-a135-47b9-a999-1fcbd4a4d737\" (UID: \"41470ea5-a135-47b9-a999-1fcbd4a4d737\") " Oct 03 14:19:35 crc kubenswrapper[4962]: I1003 14:19:35.124893 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41470ea5-a135-47b9-a999-1fcbd4a4d737-config" (OuterVolumeSpecName: "config") pod "41470ea5-a135-47b9-a999-1fcbd4a4d737" (UID: "41470ea5-a135-47b9-a999-1fcbd4a4d737"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:19:35 crc kubenswrapper[4962]: I1003 14:19:35.125015 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41470ea5-a135-47b9-a999-1fcbd4a4d737-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:35 crc kubenswrapper[4962]: I1003 14:19:35.125710 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41470ea5-a135-47b9-a999-1fcbd4a4d737-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "41470ea5-a135-47b9-a999-1fcbd4a4d737" (UID: "41470ea5-a135-47b9-a999-1fcbd4a4d737"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:19:35 crc kubenswrapper[4962]: I1003 14:19:35.125933 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41470ea5-a135-47b9-a999-1fcbd4a4d737-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "41470ea5-a135-47b9-a999-1fcbd4a4d737" (UID: "41470ea5-a135-47b9-a999-1fcbd4a4d737"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:19:35 crc kubenswrapper[4962]: I1003 14:19:35.132996 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41470ea5-a135-47b9-a999-1fcbd4a4d737-kube-api-access-ncd72" (OuterVolumeSpecName: "kube-api-access-ncd72") pod "41470ea5-a135-47b9-a999-1fcbd4a4d737" (UID: "41470ea5-a135-47b9-a999-1fcbd4a4d737"). InnerVolumeSpecName "kube-api-access-ncd72". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:19:35 crc kubenswrapper[4962]: I1003 14:19:35.159968 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76c87b4597-2bjln"] Oct 03 14:19:35 crc kubenswrapper[4962]: W1003 14:19:35.169686 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde03c973_dd77_4560_975d_5bcc19732dc2.slice/crio-c34cc3dbfa37846d82a501699a7d0c4c02a485ec3e365268c4f1ea7972ade419 WatchSource:0}: Error finding container c34cc3dbfa37846d82a501699a7d0c4c02a485ec3e365268c4f1ea7972ade419: Status 404 returned error can't find the container with id c34cc3dbfa37846d82a501699a7d0c4c02a485ec3e365268c4f1ea7972ade419 Oct 03 14:19:35 crc kubenswrapper[4962]: I1003 14:19:35.226910 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/41470ea5-a135-47b9-a999-1fcbd4a4d737-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:35 crc kubenswrapper[4962]: I1003 14:19:35.226975 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/41470ea5-a135-47b9-a999-1fcbd4a4d737-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:35 crc kubenswrapper[4962]: I1003 14:19:35.226993 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncd72\" (UniqueName: \"kubernetes.io/projected/41470ea5-a135-47b9-a999-1fcbd4a4d737-kube-api-access-ncd72\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:35 crc kubenswrapper[4962]: I1003 14:19:35.235310 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Oct 03 14:19:35 crc kubenswrapper[4962]: I1003 14:19:35.280269 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1" Oct 03 14:19:35 crc kubenswrapper[4962]: I1003 14:19:35.482864 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 03 14:19:35 crc kubenswrapper[4962]: I1003 14:19:35.520586 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 03 14:19:36 crc kubenswrapper[4962]: I1003 14:19:36.042339 4962 generic.go:334] "Generic (PLEG): container finished" podID="de03c973-dd77-4560-975d-5bcc19732dc2" containerID="777822b7a05c100043161800e0ec18b922130878bff5a8e640f5f851919eefd6" exitCode=0 Oct 03 14:19:36 crc kubenswrapper[4962]: I1003 14:19:36.042430 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76c87b4597-2bjln" event={"ID":"de03c973-dd77-4560-975d-5bcc19732dc2","Type":"ContainerDied","Data":"777822b7a05c100043161800e0ec18b922130878bff5a8e640f5f851919eefd6"} Oct 03 14:19:36 crc kubenswrapper[4962]: I1003 14:19:36.042481 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76c87b4597-2bjln" 
event={"ID":"de03c973-dd77-4560-975d-5bcc19732dc2","Type":"ContainerStarted","Data":"c34cc3dbfa37846d82a501699a7d0c4c02a485ec3e365268c4f1ea7972ade419"} Oct 03 14:19:36 crc kubenswrapper[4962]: I1003 14:19:36.042706 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64848558ff-dk558" Oct 03 14:19:36 crc kubenswrapper[4962]: I1003 14:19:36.123200 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2" Oct 03 14:19:36 crc kubenswrapper[4962]: I1003 14:19:36.265309 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64848558ff-dk558"] Oct 03 14:19:36 crc kubenswrapper[4962]: I1003 14:19:36.265339 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-64848558ff-dk558"] Oct 03 14:19:37 crc kubenswrapper[4962]: I1003 14:19:37.053156 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76c87b4597-2bjln" event={"ID":"de03c973-dd77-4560-975d-5bcc19732dc2","Type":"ContainerStarted","Data":"0fca765c7236442e03c45c96d908ffafb0a7287b43564182027ae691cf43ec71"} Oct 03 14:19:37 crc kubenswrapper[4962]: I1003 14:19:37.053671 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76c87b4597-2bjln" Oct 03 14:19:37 crc kubenswrapper[4962]: I1003 14:19:37.070882 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76c87b4597-2bjln" podStartSLOduration=3.070862627 podStartE2EDuration="3.070862627s" podCreationTimestamp="2025-10-03 14:19:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:19:37.070109157 +0000 UTC m=+5385.474006992" watchObservedRunningTime="2025-10-03 14:19:37.070862627 +0000 UTC m=+5385.474760472" Oct 03 14:19:38 crc kubenswrapper[4962]: I1003 14:19:38.238802 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41470ea5-a135-47b9-a999-1fcbd4a4d737" path="/var/lib/kubelet/pods/41470ea5-a135-47b9-a999-1fcbd4a4d737/volumes" Oct 03 14:19:38 crc kubenswrapper[4962]: I1003 14:19:38.851458 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Oct 03 14:19:38 crc kubenswrapper[4962]: I1003 14:19:38.852537 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Oct 03 14:19:38 crc kubenswrapper[4962]: I1003 14:19:38.854895 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Oct 03 14:19:38 crc kubenswrapper[4962]: I1003 14:19:38.856696 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Oct 03 14:19:39 crc kubenswrapper[4962]: I1003 14:19:39.002952 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1125d5ba-b7ab-4cf8-a170-c4dd98db7a40\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1125d5ba-b7ab-4cf8-a170-c4dd98db7a40\") pod \"ovn-copy-data\" (UID: \"ffd6efcb-6de8-451d-be3a-b5af7aa5f986\") " pod="openstack/ovn-copy-data" Oct 03 14:19:39 crc kubenswrapper[4962]: I1003 14:19:39.003784 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/ffd6efcb-6de8-451d-be3a-b5af7aa5f986-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"ffd6efcb-6de8-451d-be3a-b5af7aa5f986\") " pod="openstack/ovn-copy-data" Oct 03 14:19:39 crc kubenswrapper[4962]: I1003 14:19:39.003937 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkxpz\" (UniqueName: \"kubernetes.io/projected/ffd6efcb-6de8-451d-be3a-b5af7aa5f986-kube-api-access-fkxpz\") pod \"ovn-copy-data\" (UID: \"ffd6efcb-6de8-451d-be3a-b5af7aa5f986\") " pod="openstack/ovn-copy-data" Oct 03 14:19:39 crc kubenswrapper[4962]: I1003 14:19:39.105458 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/ffd6efcb-6de8-451d-be3a-b5af7aa5f986-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"ffd6efcb-6de8-451d-be3a-b5af7aa5f986\") " pod="openstack/ovn-copy-data" Oct 03 14:19:39 crc kubenswrapper[4962]: I1003 14:19:39.105798 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkxpz\" (UniqueName: \"kubernetes.io/projected/ffd6efcb-6de8-451d-be3a-b5af7aa5f986-kube-api-access-fkxpz\") pod \"ovn-copy-data\" (UID: \"ffd6efcb-6de8-451d-be3a-b5af7aa5f986\") " pod="openstack/ovn-copy-data" Oct 03 14:19:39 crc kubenswrapper[4962]: I1003 14:19:39.106006 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1125d5ba-b7ab-4cf8-a170-c4dd98db7a40\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1125d5ba-b7ab-4cf8-a170-c4dd98db7a40\") pod \"ovn-copy-data\" (UID: \"ffd6efcb-6de8-451d-be3a-b5af7aa5f986\") " pod="openstack/ovn-copy-data" Oct 03 14:19:39 crc kubenswrapper[4962]: I1003 14:19:39.110710 4962 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 03 14:19:39 crc kubenswrapper[4962]: I1003 14:19:39.110758 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1125d5ba-b7ab-4cf8-a170-c4dd98db7a40\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1125d5ba-b7ab-4cf8-a170-c4dd98db7a40\") pod \"ovn-copy-data\" (UID: \"ffd6efcb-6de8-451d-be3a-b5af7aa5f986\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4bd7c6b1a299040cbbf0624c0d5895ecd2bd635adde4f67b2383d8c9e02ebaf0/globalmount\"" pod="openstack/ovn-copy-data" Oct 03 14:19:39 crc kubenswrapper[4962]: I1003 14:19:39.111662 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/ffd6efcb-6de8-451d-be3a-b5af7aa5f986-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"ffd6efcb-6de8-451d-be3a-b5af7aa5f986\") " pod="openstack/ovn-copy-data" Oct 03 14:19:39 crc kubenswrapper[4962]: I1003 14:19:39.123765 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkxpz\" (UniqueName: \"kubernetes.io/projected/ffd6efcb-6de8-451d-be3a-b5af7aa5f986-kube-api-access-fkxpz\") pod \"ovn-copy-data\" (UID: \"ffd6efcb-6de8-451d-be3a-b5af7aa5f986\") " pod="openstack/ovn-copy-data" Oct 03 14:19:39 crc kubenswrapper[4962]: I1003 14:19:39.139631 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1125d5ba-b7ab-4cf8-a170-c4dd98db7a40\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1125d5ba-b7ab-4cf8-a170-c4dd98db7a40\") pod \"ovn-copy-data\" (UID: \"ffd6efcb-6de8-451d-be3a-b5af7aa5f986\") " pod="openstack/ovn-copy-data" Oct 03 14:19:39 crc kubenswrapper[4962]: I1003 14:19:39.212485 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Oct 03 14:19:39 crc kubenswrapper[4962]: W1003 14:19:39.683504 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podffd6efcb_6de8_451d_be3a_b5af7aa5f986.slice/crio-66d2415283c6b045538067a5e5867c7065a44decaa790d70cb3a1c8e84a24399 WatchSource:0}: Error finding container 66d2415283c6b045538067a5e5867c7065a44decaa790d70cb3a1c8e84a24399: Status 404 returned error can't find the container with id 66d2415283c6b045538067a5e5867c7065a44decaa790d70cb3a1c8e84a24399 Oct 03 14:19:39 crc kubenswrapper[4962]: I1003 14:19:39.689832 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Oct 03 14:19:40 crc kubenswrapper[4962]: I1003 14:19:40.079266 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"ffd6efcb-6de8-451d-be3a-b5af7aa5f986","Type":"ContainerStarted","Data":"d29a0045258b6cee9f60392c0ef9cc07e3c1623b32827db6e0c5a46ffa603b19"} Oct 03 14:19:40 crc kubenswrapper[4962]: I1003 14:19:40.079331 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"ffd6efcb-6de8-451d-be3a-b5af7aa5f986","Type":"ContainerStarted","Data":"66d2415283c6b045538067a5e5867c7065a44decaa790d70cb3a1c8e84a24399"} Oct 03 14:19:40 crc kubenswrapper[4962]: I1003 14:19:40.094776 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=3.094754703 podStartE2EDuration="3.094754703s" podCreationTimestamp="2025-10-03 14:19:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:19:40.090789469 +0000 UTC m=+5388.494687324" watchObservedRunningTime="2025-10-03 14:19:40.094754703 +0000 UTC m=+5388.498652538" Oct 03 14:19:44 crc kubenswrapper[4962]: I1003 14:19:44.729248 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76c87b4597-2bjln" Oct 03 14:19:44 crc kubenswrapper[4962]: I1003 14:19:44.800298 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-kphgc"] Oct 03 14:19:44 crc kubenswrapper[4962]: I1003 14:19:44.800671 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b7946d7b9-kphgc" podUID="fa9e70ee-665d-4c2c-839d-9ca4de39ad16" containerName="dnsmasq-dns" containerID="cri-o://80bcd65ea6c45d1def3c96932b1e02514f17c074a35bf9e6c7ae7c785308d78f" gracePeriod=10 Oct 03 14:19:45 crc kubenswrapper[4962]: I1003 14:19:45.144216 4962 generic.go:334] "Generic (PLEG): container finished" podID="fa9e70ee-665d-4c2c-839d-9ca4de39ad16" containerID="80bcd65ea6c45d1def3c96932b1e02514f17c074a35bf9e6c7ae7c785308d78f" exitCode=0 Oct 03 14:19:45 crc kubenswrapper[4962]: I1003 14:19:45.144527 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-kphgc" event={"ID":"fa9e70ee-665d-4c2c-839d-9ca4de39ad16","Type":"ContainerDied","Data":"80bcd65ea6c45d1def3c96932b1e02514f17c074a35bf9e6c7ae7c785308d78f"} Oct 03 14:19:45 crc kubenswrapper[4962]: I1003 14:19:45.154522 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 03 14:19:45 crc kubenswrapper[4962]: I1003 14:19:45.159572 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 03 14:19:45 crc kubenswrapper[4962]: I1003 14:19:45.163813 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 03 14:19:45 crc kubenswrapper[4962]: I1003 14:19:45.164453 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-kbnbj" Oct 03 14:19:45 crc kubenswrapper[4962]: I1003 14:19:45.164578 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 03 14:19:45 crc kubenswrapper[4962]: I1003 14:19:45.173442 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 03 14:19:45 crc kubenswrapper[4962]: I1003 14:19:45.235972 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/abc499fc-1616-4f38-b181-7c26bb38b71a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"abc499fc-1616-4f38-b181-7c26bb38b71a\") " pod="openstack/ovn-northd-0" Oct 03 14:19:45 crc kubenswrapper[4962]: I1003 14:19:45.236116 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmhxp\" (UniqueName: \"kubernetes.io/projected/abc499fc-1616-4f38-b181-7c26bb38b71a-kube-api-access-tmhxp\") pod \"ovn-northd-0\" (UID: \"abc499fc-1616-4f38-b181-7c26bb38b71a\") " pod="openstack/ovn-northd-0" Oct 03 14:19:45 crc kubenswrapper[4962]: I1003 14:19:45.236251 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abc499fc-1616-4f38-b181-7c26bb38b71a-config\") pod \"ovn-northd-0\" (UID: \"abc499fc-1616-4f38-b181-7c26bb38b71a\") " pod="openstack/ovn-northd-0" Oct 03 14:19:45 crc kubenswrapper[4962]: I1003 14:19:45.236370 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abc499fc-1616-4f38-b181-7c26bb38b71a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"abc499fc-1616-4f38-b181-7c26bb38b71a\") " pod="openstack/ovn-northd-0" Oct 03 14:19:45 crc kubenswrapper[4962]: I1003 14:19:45.236468 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/abc499fc-1616-4f38-b181-7c26bb38b71a-scripts\") pod \"ovn-northd-0\" (UID: \"abc499fc-1616-4f38-b181-7c26bb38b71a\") " pod="openstack/ovn-northd-0" Oct 03 14:19:45 crc kubenswrapper[4962]: I1003 14:19:45.285066 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-kphgc" Oct 03 14:19:45 crc kubenswrapper[4962]: I1003 14:19:45.337507 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abc499fc-1616-4f38-b181-7c26bb38b71a-config\") pod \"ovn-northd-0\" (UID: \"abc499fc-1616-4f38-b181-7c26bb38b71a\") " pod="openstack/ovn-northd-0" Oct 03 14:19:45 crc kubenswrapper[4962]: I1003 14:19:45.337585 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abc499fc-1616-4f38-b181-7c26bb38b71a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"abc499fc-1616-4f38-b181-7c26bb38b71a\") " pod="openstack/ovn-northd-0" Oct 03 14:19:45 crc kubenswrapper[4962]: I1003 14:19:45.337620 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/abc499fc-1616-4f38-b181-7c26bb38b71a-scripts\") pod \"ovn-northd-0\" (UID: \"abc499fc-1616-4f38-b181-7c26bb38b71a\") " pod="openstack/ovn-northd-0" Oct 03 14:19:45 crc kubenswrapper[4962]: I1003 14:19:45.337716 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/abc499fc-1616-4f38-b181-7c26bb38b71a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"abc499fc-1616-4f38-b181-7c26bb38b71a\") " pod="openstack/ovn-northd-0" Oct 03 14:19:45 crc kubenswrapper[4962]: I1003 14:19:45.337740 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmhxp\" (UniqueName: \"kubernetes.io/projected/abc499fc-1616-4f38-b181-7c26bb38b71a-kube-api-access-tmhxp\") pod \"ovn-northd-0\" (UID: \"abc499fc-1616-4f38-b181-7c26bb38b71a\") " pod="openstack/ovn-northd-0" Oct 03 14:19:45 crc kubenswrapper[4962]: I1003 14:19:45.338436 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/abc499fc-1616-4f38-b181-7c26bb38b71a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"abc499fc-1616-4f38-b181-7c26bb38b71a\") " pod="openstack/ovn-northd-0" Oct 03 14:19:45 crc kubenswrapper[4962]: I1003 14:19:45.338777 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/abc499fc-1616-4f38-b181-7c26bb38b71a-scripts\") pod \"ovn-northd-0\" (UID: \"abc499fc-1616-4f38-b181-7c26bb38b71a\") " pod="openstack/ovn-northd-0" Oct 03 14:19:45 crc kubenswrapper[4962]: I1003 14:19:45.339278 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abc499fc-1616-4f38-b181-7c26bb38b71a-config\") pod \"ovn-northd-0\" (UID: \"abc499fc-1616-4f38-b181-7c26bb38b71a\") " pod="openstack/ovn-northd-0" Oct 03 14:19:45 crc kubenswrapper[4962]: I1003 14:19:45.344916 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abc499fc-1616-4f38-b181-7c26bb38b71a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"abc499fc-1616-4f38-b181-7c26bb38b71a\") " pod="openstack/ovn-northd-0" Oct 03 14:19:45 crc kubenswrapper[4962]: I1003 14:19:45.359099 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmhxp\" (UniqueName: \"kubernetes.io/projected/abc499fc-1616-4f38-b181-7c26bb38b71a-kube-api-access-tmhxp\") pod \"ovn-northd-0\" (UID: \"abc499fc-1616-4f38-b181-7c26bb38b71a\") " 
pod="openstack/ovn-northd-0" Oct 03 14:19:45 crc kubenswrapper[4962]: I1003 14:19:45.439262 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa9e70ee-665d-4c2c-839d-9ca4de39ad16-config\") pod \"fa9e70ee-665d-4c2c-839d-9ca4de39ad16\" (UID: \"fa9e70ee-665d-4c2c-839d-9ca4de39ad16\") " Oct 03 14:19:45 crc kubenswrapper[4962]: I1003 14:19:45.439569 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa9e70ee-665d-4c2c-839d-9ca4de39ad16-dns-svc\") pod \"fa9e70ee-665d-4c2c-839d-9ca4de39ad16\" (UID: \"fa9e70ee-665d-4c2c-839d-9ca4de39ad16\") " Oct 03 14:19:45 crc kubenswrapper[4962]: I1003 14:19:45.439604 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgmvw\" (UniqueName: \"kubernetes.io/projected/fa9e70ee-665d-4c2c-839d-9ca4de39ad16-kube-api-access-zgmvw\") pod \"fa9e70ee-665d-4c2c-839d-9ca4de39ad16\" (UID: \"fa9e70ee-665d-4c2c-839d-9ca4de39ad16\") " Oct 03 14:19:45 crc kubenswrapper[4962]: I1003 14:19:45.442988 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa9e70ee-665d-4c2c-839d-9ca4de39ad16-kube-api-access-zgmvw" (OuterVolumeSpecName: "kube-api-access-zgmvw") pod "fa9e70ee-665d-4c2c-839d-9ca4de39ad16" (UID: "fa9e70ee-665d-4c2c-839d-9ca4de39ad16"). InnerVolumeSpecName "kube-api-access-zgmvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:19:45 crc kubenswrapper[4962]: I1003 14:19:45.472314 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa9e70ee-665d-4c2c-839d-9ca4de39ad16-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fa9e70ee-665d-4c2c-839d-9ca4de39ad16" (UID: "fa9e70ee-665d-4c2c-839d-9ca4de39ad16"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:19:45 crc kubenswrapper[4962]: I1003 14:19:45.474451 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa9e70ee-665d-4c2c-839d-9ca4de39ad16-config" (OuterVolumeSpecName: "config") pod "fa9e70ee-665d-4c2c-839d-9ca4de39ad16" (UID: "fa9e70ee-665d-4c2c-839d-9ca4de39ad16"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:19:45 crc kubenswrapper[4962]: I1003 14:19:45.486523 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 03 14:19:45 crc kubenswrapper[4962]: I1003 14:19:45.542114 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa9e70ee-665d-4c2c-839d-9ca4de39ad16-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:45 crc kubenswrapper[4962]: I1003 14:19:45.542148 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgmvw\" (UniqueName: \"kubernetes.io/projected/fa9e70ee-665d-4c2c-839d-9ca4de39ad16-kube-api-access-zgmvw\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:45 crc kubenswrapper[4962]: I1003 14:19:45.542159 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa9e70ee-665d-4c2c-839d-9ca4de39ad16-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:45 crc kubenswrapper[4962]: I1003 14:19:45.961481 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 03 14:19:45 crc kubenswrapper[4962]: W1003 14:19:45.963712 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podabc499fc_1616_4f38_b181_7c26bb38b71a.slice/crio-2ab95cada8f1c34eea4971219594deb10901e804f609a9e2049cd8a85e2aa693 WatchSource:0}: Error finding container 2ab95cada8f1c34eea4971219594deb10901e804f609a9e2049cd8a85e2aa693: Status 404 returned error can't find the container with id 2ab95cada8f1c34eea4971219594deb10901e804f609a9e2049cd8a85e2aa693 Oct 03 14:19:46 crc kubenswrapper[4962]: I1003 14:19:46.153275 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"abc499fc-1616-4f38-b181-7c26bb38b71a","Type":"ContainerStarted","Data":"2ab95cada8f1c34eea4971219594deb10901e804f609a9e2049cd8a85e2aa693"} Oct 03 14:19:46 crc kubenswrapper[4962]: I1003 14:19:46.156150 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-kphgc" event={"ID":"fa9e70ee-665d-4c2c-839d-9ca4de39ad16","Type":"ContainerDied","Data":"6b70ae36d4d29f562bb0c7b7b94e1446a9de784db46dc2a5ee13f45bff45a63d"} Oct 03 14:19:46 crc kubenswrapper[4962]: I1003 14:19:46.156184 4962 scope.go:117] "RemoveContainer" containerID="80bcd65ea6c45d1def3c96932b1e02514f17c074a35bf9e6c7ae7c785308d78f" Oct 03 14:19:46 crc kubenswrapper[4962]: I1003 14:19:46.156295 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-kphgc" Oct 03 14:19:46 crc kubenswrapper[4962]: I1003 14:19:46.182909 4962 scope.go:117] "RemoveContainer" containerID="6f26dd114c67bef234c8de8d46eb96c4eadf2eb30286b1c83308105c1f124f9e" Oct 03 14:19:46 crc kubenswrapper[4962]: I1003 14:19:46.186845 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-kphgc"] Oct 03 14:19:46 crc kubenswrapper[4962]: I1003 14:19:46.196346 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-kphgc"] Oct 03 14:19:46 crc kubenswrapper[4962]: I1003 14:19:46.238622 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa9e70ee-665d-4c2c-839d-9ca4de39ad16" path="/var/lib/kubelet/pods/fa9e70ee-665d-4c2c-839d-9ca4de39ad16/volumes" Oct 03 14:19:47 crc kubenswrapper[4962]: I1003 14:19:47.166073 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"abc499fc-1616-4f38-b181-7c26bb38b71a","Type":"ContainerStarted","Data":"c7c6fa61b2ae6dc7cfeb0a9d20350d35f99384051149888f4444c4502ff7ac24"} Oct 03 14:19:47 crc kubenswrapper[4962]: I1003 14:19:47.166321 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 03 14:19:47 crc kubenswrapper[4962]: I1003 14:19:47.166332 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"abc499fc-1616-4f38-b181-7c26bb38b71a","Type":"ContainerStarted","Data":"33dbe79a1c8bce1573194b2b28fcd33361913ca075e1b7974cf8bfc0432705f3"} Oct 03 14:19:47 crc kubenswrapper[4962]: I1003 14:19:47.186018 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.185993881 podStartE2EDuration="2.185993881s" podCreationTimestamp="2025-10-03 14:19:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:19:47.184863611 +0000 UTC m=+5395.588761476" watchObservedRunningTime="2025-10-03 14:19:47.185993881 +0000 UTC m=+5395.589891716" Oct 03 14:19:49 crc kubenswrapper[4962]: I1003 14:19:49.951880 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-wkp9s"] Oct 03 14:19:49 crc kubenswrapper[4962]: E1003 14:19:49.952557 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa9e70ee-665d-4c2c-839d-9ca4de39ad16" containerName="init" Oct 03 14:19:49 crc kubenswrapper[4962]: I1003 14:19:49.952576 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa9e70ee-665d-4c2c-839d-9ca4de39ad16" containerName="init" Oct 03 14:19:49 crc kubenswrapper[4962]: E1003 14:19:49.952590 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa9e70ee-665d-4c2c-839d-9ca4de39ad16" containerName="dnsmasq-dns" Oct 03 14:19:49 crc kubenswrapper[4962]: I1003 14:19:49.952598 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa9e70ee-665d-4c2c-839d-9ca4de39ad16" containerName="dnsmasq-dns" Oct 03 14:19:49 crc kubenswrapper[4962]: I1003 14:19:49.952770 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa9e70ee-665d-4c2c-839d-9ca4de39ad16" containerName="dnsmasq-dns" Oct 03 14:19:49 crc kubenswrapper[4962]: I1003 14:19:49.953327 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-wkp9s" Oct 03 14:19:49 crc kubenswrapper[4962]: I1003 14:19:49.963668 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-wkp9s"] Oct 03 14:19:50 crc kubenswrapper[4962]: I1003 14:19:50.014080 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpm8w\" (UniqueName: \"kubernetes.io/projected/6c239b20-5824-46cc-8500-27c9e8e69e82-kube-api-access-qpm8w\") pod \"keystone-db-create-wkp9s\" (UID: \"6c239b20-5824-46cc-8500-27c9e8e69e82\") " pod="openstack/keystone-db-create-wkp9s" Oct 03 14:19:50 crc kubenswrapper[4962]: I1003 14:19:50.115491 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpm8w\" (UniqueName: \"kubernetes.io/projected/6c239b20-5824-46cc-8500-27c9e8e69e82-kube-api-access-qpm8w\") pod \"keystone-db-create-wkp9s\" (UID: \"6c239b20-5824-46cc-8500-27c9e8e69e82\") " pod="openstack/keystone-db-create-wkp9s" Oct 03 14:19:50 crc kubenswrapper[4962]: I1003 14:19:50.136223 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpm8w\" (UniqueName: \"kubernetes.io/projected/6c239b20-5824-46cc-8500-27c9e8e69e82-kube-api-access-qpm8w\") pod \"keystone-db-create-wkp9s\" (UID: \"6c239b20-5824-46cc-8500-27c9e8e69e82\") " pod="openstack/keystone-db-create-wkp9s" Oct 03 14:19:50 crc kubenswrapper[4962]: I1003 14:19:50.283180 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-wkp9s" Oct 03 14:19:50 crc kubenswrapper[4962]: I1003 14:19:50.715357 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-wkp9s"] Oct 03 14:19:50 crc kubenswrapper[4962]: W1003 14:19:50.716492 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c239b20_5824_46cc_8500_27c9e8e69e82.slice/crio-ff831289f1a6e435098886ba12925bdf1e91f06c6c730474fa88d666f616fabc WatchSource:0}: Error finding container ff831289f1a6e435098886ba12925bdf1e91f06c6c730474fa88d666f616fabc: Status 404 returned error can't find the container with id ff831289f1a6e435098886ba12925bdf1e91f06c6c730474fa88d666f616fabc Oct 03 14:19:51 crc kubenswrapper[4962]: I1003 14:19:51.197382 4962 generic.go:334] "Generic (PLEG): container finished" podID="6c239b20-5824-46cc-8500-27c9e8e69e82" containerID="30afdada302e7ca15b11d4303e4a1291ebd138839adb0a8f18f2f330f894e843" exitCode=0 Oct 03 14:19:51 crc kubenswrapper[4962]: I1003 14:19:51.197429 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-wkp9s" event={"ID":"6c239b20-5824-46cc-8500-27c9e8e69e82","Type":"ContainerDied","Data":"30afdada302e7ca15b11d4303e4a1291ebd138839adb0a8f18f2f330f894e843"} Oct 03 14:19:51 crc kubenswrapper[4962]: I1003 14:19:51.197460 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-wkp9s" event={"ID":"6c239b20-5824-46cc-8500-27c9e8e69e82","Type":"ContainerStarted","Data":"ff831289f1a6e435098886ba12925bdf1e91f06c6c730474fa88d666f616fabc"} Oct 03 14:19:52 crc kubenswrapper[4962]: I1003 14:19:52.497816 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-wkp9s" Oct 03 14:19:52 crc kubenswrapper[4962]: I1003 14:19:52.550061 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpm8w\" (UniqueName: \"kubernetes.io/projected/6c239b20-5824-46cc-8500-27c9e8e69e82-kube-api-access-qpm8w\") pod \"6c239b20-5824-46cc-8500-27c9e8e69e82\" (UID: \"6c239b20-5824-46cc-8500-27c9e8e69e82\") " Oct 03 14:19:52 crc kubenswrapper[4962]: I1003 14:19:52.556070 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c239b20-5824-46cc-8500-27c9e8e69e82-kube-api-access-qpm8w" (OuterVolumeSpecName: "kube-api-access-qpm8w") pod "6c239b20-5824-46cc-8500-27c9e8e69e82" (UID: "6c239b20-5824-46cc-8500-27c9e8e69e82"). InnerVolumeSpecName "kube-api-access-qpm8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:19:52 crc kubenswrapper[4962]: I1003 14:19:52.652698 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpm8w\" (UniqueName: \"kubernetes.io/projected/6c239b20-5824-46cc-8500-27c9e8e69e82-kube-api-access-qpm8w\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:53 crc kubenswrapper[4962]: I1003 14:19:53.234709 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-wkp9s" event={"ID":"6c239b20-5824-46cc-8500-27c9e8e69e82","Type":"ContainerDied","Data":"ff831289f1a6e435098886ba12925bdf1e91f06c6c730474fa88d666f616fabc"} Oct 03 14:19:53 crc kubenswrapper[4962]: I1003 14:19:53.234949 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff831289f1a6e435098886ba12925bdf1e91f06c6c730474fa88d666f616fabc" Oct 03 14:19:53 crc kubenswrapper[4962]: I1003 14:19:53.234776 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-wkp9s" Oct 03 14:20:00 crc kubenswrapper[4962]: I1003 14:20:00.051596 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-c495-account-create-92jpw"] Oct 03 14:20:00 crc kubenswrapper[4962]: E1003 14:20:00.052686 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c239b20-5824-46cc-8500-27c9e8e69e82" containerName="mariadb-database-create" Oct 03 14:20:00 crc kubenswrapper[4962]: I1003 14:20:00.052712 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c239b20-5824-46cc-8500-27c9e8e69e82" containerName="mariadb-database-create" Oct 03 14:20:00 crc kubenswrapper[4962]: I1003 14:20:00.053028 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c239b20-5824-46cc-8500-27c9e8e69e82" containerName="mariadb-database-create" Oct 03 14:20:00 crc kubenswrapper[4962]: I1003 14:20:00.053912 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-c495-account-create-92jpw" Oct 03 14:20:00 crc kubenswrapper[4962]: I1003 14:20:00.058558 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 03 14:20:00 crc kubenswrapper[4962]: I1003 14:20:00.065777 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2nqk\" (UniqueName: \"kubernetes.io/projected/ac2133e1-e19a-41ea-9a20-8653fb57c519-kube-api-access-w2nqk\") pod \"keystone-c495-account-create-92jpw\" (UID: \"ac2133e1-e19a-41ea-9a20-8653fb57c519\") " pod="openstack/keystone-c495-account-create-92jpw" Oct 03 14:20:00 crc kubenswrapper[4962]: I1003 14:20:00.071825 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c495-account-create-92jpw"] Oct 03 14:20:00 crc kubenswrapper[4962]: I1003 14:20:00.166677 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2nqk\" (UniqueName: \"kubernetes.io/projected/ac2133e1-e19a-41ea-9a20-8653fb57c519-kube-api-access-w2nqk\") pod \"keystone-c495-account-create-92jpw\" (UID: \"ac2133e1-e19a-41ea-9a20-8653fb57c519\") " pod="openstack/keystone-c495-account-create-92jpw" Oct 03 14:20:00 crc kubenswrapper[4962]: I1003 14:20:00.185826 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2nqk\" (UniqueName: \"kubernetes.io/projected/ac2133e1-e19a-41ea-9a20-8653fb57c519-kube-api-access-w2nqk\") pod \"keystone-c495-account-create-92jpw\" (UID: \"ac2133e1-e19a-41ea-9a20-8653fb57c519\") " pod="openstack/keystone-c495-account-create-92jpw" Oct 03 14:20:00 crc kubenswrapper[4962]: I1003 14:20:00.421182 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c495-account-create-92jpw" Oct 03 14:20:00 crc kubenswrapper[4962]: I1003 14:20:00.552691 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 03 14:20:00 crc kubenswrapper[4962]: I1003 14:20:00.836987 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c495-account-create-92jpw"] Oct 03 14:20:01 crc kubenswrapper[4962]: I1003 14:20:01.293769 4962 generic.go:334] "Generic (PLEG): container finished" podID="ac2133e1-e19a-41ea-9a20-8653fb57c519" containerID="6679adbbce4585a7df247ce019ba5e41cc535f28d99e5ca0c55d071aa6c055c8" exitCode=0 Oct 03 14:20:01 crc kubenswrapper[4962]: I1003 14:20:01.293807 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c495-account-create-92jpw" event={"ID":"ac2133e1-e19a-41ea-9a20-8653fb57c519","Type":"ContainerDied","Data":"6679adbbce4585a7df247ce019ba5e41cc535f28d99e5ca0c55d071aa6c055c8"} Oct 03 14:20:01 crc kubenswrapper[4962]: I1003 14:20:01.293828 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c495-account-create-92jpw" event={"ID":"ac2133e1-e19a-41ea-9a20-8653fb57c519","Type":"ContainerStarted","Data":"ad94f792980c2787b7499df54450003281c4a41201e79f4f8b4bba41fbe14433"} Oct 03 14:20:02 crc kubenswrapper[4962]: I1003 14:20:02.651874 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-c495-account-create-92jpw" Oct 03 14:20:02 crc kubenswrapper[4962]: I1003 14:20:02.711515 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2nqk\" (UniqueName: \"kubernetes.io/projected/ac2133e1-e19a-41ea-9a20-8653fb57c519-kube-api-access-w2nqk\") pod \"ac2133e1-e19a-41ea-9a20-8653fb57c519\" (UID: \"ac2133e1-e19a-41ea-9a20-8653fb57c519\") " Oct 03 14:20:02 crc kubenswrapper[4962]: I1003 14:20:02.716429 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac2133e1-e19a-41ea-9a20-8653fb57c519-kube-api-access-w2nqk" (OuterVolumeSpecName: "kube-api-access-w2nqk") pod "ac2133e1-e19a-41ea-9a20-8653fb57c519" (UID: "ac2133e1-e19a-41ea-9a20-8653fb57c519"). InnerVolumeSpecName "kube-api-access-w2nqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:20:02 crc kubenswrapper[4962]: I1003 14:20:02.813467 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2nqk\" (UniqueName: \"kubernetes.io/projected/ac2133e1-e19a-41ea-9a20-8653fb57c519-kube-api-access-w2nqk\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:03 crc kubenswrapper[4962]: I1003 14:20:03.308407 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c495-account-create-92jpw" event={"ID":"ac2133e1-e19a-41ea-9a20-8653fb57c519","Type":"ContainerDied","Data":"ad94f792980c2787b7499df54450003281c4a41201e79f4f8b4bba41fbe14433"} Oct 03 14:20:03 crc kubenswrapper[4962]: I1003 14:20:03.308455 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad94f792980c2787b7499df54450003281c4a41201e79f4f8b4bba41fbe14433" Oct 03 14:20:03 crc kubenswrapper[4962]: I1003 14:20:03.308510 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c495-account-create-92jpw" Oct 03 14:20:05 crc kubenswrapper[4962]: I1003 14:20:05.518526 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-bhvfh"] Oct 03 14:20:05 crc kubenswrapper[4962]: E1003 14:20:05.519155 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac2133e1-e19a-41ea-9a20-8653fb57c519" containerName="mariadb-account-create" Oct 03 14:20:05 crc kubenswrapper[4962]: I1003 14:20:05.519167 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac2133e1-e19a-41ea-9a20-8653fb57c519" containerName="mariadb-account-create" Oct 03 14:20:05 crc kubenswrapper[4962]: I1003 14:20:05.519309 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac2133e1-e19a-41ea-9a20-8653fb57c519" containerName="mariadb-account-create" Oct 03 14:20:05 crc kubenswrapper[4962]: I1003 14:20:05.519831 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-bhvfh" Oct 03 14:20:05 crc kubenswrapper[4962]: I1003 14:20:05.524751 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-dgxx6" Oct 03 14:20:05 crc kubenswrapper[4962]: I1003 14:20:05.525009 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 03 14:20:05 crc kubenswrapper[4962]: I1003 14:20:05.525114 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 03 14:20:05 crc kubenswrapper[4962]: I1003 14:20:05.525217 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 03 14:20:05 crc kubenswrapper[4962]: I1003 14:20:05.532021 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-bhvfh"] Oct 03 14:20:05 crc kubenswrapper[4962]: I1003 14:20:05.561659 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt2vd\" (UniqueName: \"kubernetes.io/projected/7544721a-dd02-4e36-8660-402cb244510e-kube-api-access-lt2vd\") pod \"keystone-db-sync-bhvfh\" (UID: \"7544721a-dd02-4e36-8660-402cb244510e\") " pod="openstack/keystone-db-sync-bhvfh" Oct 03 14:20:05 crc kubenswrapper[4962]: I1003 14:20:05.562054 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7544721a-dd02-4e36-8660-402cb244510e-config-data\") pod \"keystone-db-sync-bhvfh\" (UID: \"7544721a-dd02-4e36-8660-402cb244510e\") " pod="openstack/keystone-db-sync-bhvfh" Oct 03 14:20:05 crc kubenswrapper[4962]: I1003 14:20:05.562160 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7544721a-dd02-4e36-8660-402cb244510e-combined-ca-bundle\") pod \"keystone-db-sync-bhvfh\" (UID: \"7544721a-dd02-4e36-8660-402cb244510e\") " pod="openstack/keystone-db-sync-bhvfh" Oct 03 14:20:05 crc kubenswrapper[4962]: I1003 14:20:05.662774 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7544721a-dd02-4e36-8660-402cb244510e-config-data\") pod \"keystone-db-sync-bhvfh\" (UID: \"7544721a-dd02-4e36-8660-402cb244510e\") " pod="openstack/keystone-db-sync-bhvfh" Oct 03 14:20:05 crc kubenswrapper[4962]: I1003 14:20:05.662830 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7544721a-dd02-4e36-8660-402cb244510e-combined-ca-bundle\") pod \"keystone-db-sync-bhvfh\" (UID: \"7544721a-dd02-4e36-8660-402cb244510e\") " pod="openstack/keystone-db-sync-bhvfh" Oct 03 14:20:05 crc kubenswrapper[4962]: I1003 14:20:05.662870 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt2vd\" (UniqueName: \"kubernetes.io/projected/7544721a-dd02-4e36-8660-402cb244510e-kube-api-access-lt2vd\") pod \"keystone-db-sync-bhvfh\" (UID: \"7544721a-dd02-4e36-8660-402cb244510e\") " pod="openstack/keystone-db-sync-bhvfh" Oct 03 14:20:05 crc kubenswrapper[4962]: I1003 14:20:05.668182 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7544721a-dd02-4e36-8660-402cb244510e-combined-ca-bundle\") pod \"keystone-db-sync-bhvfh\" (UID: \"7544721a-dd02-4e36-8660-402cb244510e\") " 
pod="openstack/keystone-db-sync-bhvfh" Oct 03 14:20:05 crc kubenswrapper[4962]: I1003 14:20:05.670762 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7544721a-dd02-4e36-8660-402cb244510e-config-data\") pod \"keystone-db-sync-bhvfh\" (UID: \"7544721a-dd02-4e36-8660-402cb244510e\") " pod="openstack/keystone-db-sync-bhvfh" Oct 03 14:20:05 crc kubenswrapper[4962]: I1003 14:20:05.687928 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt2vd\" (UniqueName: \"kubernetes.io/projected/7544721a-dd02-4e36-8660-402cb244510e-kube-api-access-lt2vd\") pod \"keystone-db-sync-bhvfh\" (UID: \"7544721a-dd02-4e36-8660-402cb244510e\") " pod="openstack/keystone-db-sync-bhvfh" Oct 03 14:20:05 crc kubenswrapper[4962]: I1003 14:20:05.840755 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-bhvfh" Oct 03 14:20:06 crc kubenswrapper[4962]: I1003 14:20:06.296140 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-bhvfh"] Oct 03 14:20:06 crc kubenswrapper[4962]: W1003 14:20:06.304954 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7544721a_dd02_4e36_8660_402cb244510e.slice/crio-5149277d1454f81cbf68a369439ed3f35278e78d9613af3525934e3fef5bfa61 WatchSource:0}: Error finding container 5149277d1454f81cbf68a369439ed3f35278e78d9613af3525934e3fef5bfa61: Status 404 returned error can't find the container with id 5149277d1454f81cbf68a369439ed3f35278e78d9613af3525934e3fef5bfa61 Oct 03 14:20:06 crc kubenswrapper[4962]: I1003 14:20:06.338007 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-bhvfh" event={"ID":"7544721a-dd02-4e36-8660-402cb244510e","Type":"ContainerStarted","Data":"5149277d1454f81cbf68a369439ed3f35278e78d9613af3525934e3fef5bfa61"} Oct 03 14:20:07 crc kubenswrapper[4962]: I1003 14:20:07.345678 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-bhvfh" event={"ID":"7544721a-dd02-4e36-8660-402cb244510e","Type":"ContainerStarted","Data":"9691264e81c813dea2599d0b28e55c130f559b47eee52175143c7b1c43887bcf"} Oct 03 14:20:07 crc kubenswrapper[4962]: I1003 14:20:07.381819 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-bhvfh" podStartSLOduration=2.381796067 podStartE2EDuration="2.381796067s" podCreationTimestamp="2025-10-03 14:20:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:20:07.374896797 +0000 UTC m=+5415.778794642" watchObservedRunningTime="2025-10-03 14:20:07.381796067 +0000 UTC m=+5415.785693912" Oct 03 14:20:08 crc kubenswrapper[4962]: I1003 14:20:08.353951 4962 generic.go:334] "Generic (PLEG): container finished" podID="7544721a-dd02-4e36-8660-402cb244510e" containerID="9691264e81c813dea2599d0b28e55c130f559b47eee52175143c7b1c43887bcf" exitCode=0 Oct 03 14:20:08 crc kubenswrapper[4962]: I1003 14:20:08.353991 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-bhvfh" event={"ID":"7544721a-dd02-4e36-8660-402cb244510e","Type":"ContainerDied","Data":"9691264e81c813dea2599d0b28e55c130f559b47eee52175143c7b1c43887bcf"} Oct 03 14:20:09 crc kubenswrapper[4962]: I1003 14:20:09.698336 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-bhvfh" Oct 03 14:20:09 crc kubenswrapper[4962]: I1003 14:20:09.726564 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7544721a-dd02-4e36-8660-402cb244510e-combined-ca-bundle\") pod \"7544721a-dd02-4e36-8660-402cb244510e\" (UID: \"7544721a-dd02-4e36-8660-402cb244510e\") " Oct 03 14:20:09 crc kubenswrapper[4962]: I1003 14:20:09.726716 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lt2vd\" (UniqueName: \"kubernetes.io/projected/7544721a-dd02-4e36-8660-402cb244510e-kube-api-access-lt2vd\") pod \"7544721a-dd02-4e36-8660-402cb244510e\" (UID: \"7544721a-dd02-4e36-8660-402cb244510e\") " Oct 03 14:20:09 crc kubenswrapper[4962]: I1003 14:20:09.726961 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7544721a-dd02-4e36-8660-402cb244510e-config-data\") pod \"7544721a-dd02-4e36-8660-402cb244510e\" (UID: \"7544721a-dd02-4e36-8660-402cb244510e\") " Oct 03 14:20:09 crc kubenswrapper[4962]: I1003 14:20:09.734997 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7544721a-dd02-4e36-8660-402cb244510e-kube-api-access-lt2vd" (OuterVolumeSpecName: "kube-api-access-lt2vd") pod "7544721a-dd02-4e36-8660-402cb244510e" (UID: "7544721a-dd02-4e36-8660-402cb244510e"). InnerVolumeSpecName "kube-api-access-lt2vd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:20:09 crc kubenswrapper[4962]: I1003 14:20:09.754957 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7544721a-dd02-4e36-8660-402cb244510e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7544721a-dd02-4e36-8660-402cb244510e" (UID: "7544721a-dd02-4e36-8660-402cb244510e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:20:09 crc kubenswrapper[4962]: I1003 14:20:09.779100 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7544721a-dd02-4e36-8660-402cb244510e-config-data" (OuterVolumeSpecName: "config-data") pod "7544721a-dd02-4e36-8660-402cb244510e" (UID: "7544721a-dd02-4e36-8660-402cb244510e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:20:09 crc kubenswrapper[4962]: I1003 14:20:09.787684 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-26sq2"] Oct 03 14:20:09 crc kubenswrapper[4962]: E1003 14:20:09.788260 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7544721a-dd02-4e36-8660-402cb244510e" containerName="keystone-db-sync" Oct 03 14:20:09 crc kubenswrapper[4962]: I1003 14:20:09.788348 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="7544721a-dd02-4e36-8660-402cb244510e" containerName="keystone-db-sync" Oct 03 14:20:09 crc kubenswrapper[4962]: I1003 14:20:09.788738 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="7544721a-dd02-4e36-8660-402cb244510e" containerName="keystone-db-sync" Oct 03 14:20:09 crc kubenswrapper[4962]: I1003 14:20:09.790249 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-26sq2" Oct 03 14:20:09 crc kubenswrapper[4962]: I1003 14:20:09.806861 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-26sq2"] Oct 03 14:20:09 crc kubenswrapper[4962]: I1003 14:20:09.829407 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh8gs\" (UniqueName: \"kubernetes.io/projected/7bd2b1ca-d49b-4981-a6d1-1ceb9135c125-kube-api-access-gh8gs\") pod \"redhat-marketplace-26sq2\" (UID: \"7bd2b1ca-d49b-4981-a6d1-1ceb9135c125\") " pod="openshift-marketplace/redhat-marketplace-26sq2" Oct 03 14:20:09 crc kubenswrapper[4962]: I1003 14:20:09.829617 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bd2b1ca-d49b-4981-a6d1-1ceb9135c125-utilities\") pod \"redhat-marketplace-26sq2\" (UID: \"7bd2b1ca-d49b-4981-a6d1-1ceb9135c125\") " pod="openshift-marketplace/redhat-marketplace-26sq2" Oct 03 14:20:09 crc kubenswrapper[4962]: I1003 14:20:09.829672 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bd2b1ca-d49b-4981-a6d1-1ceb9135c125-catalog-content\") pod \"redhat-marketplace-26sq2\" (UID: \"7bd2b1ca-d49b-4981-a6d1-1ceb9135c125\") " pod="openshift-marketplace/redhat-marketplace-26sq2" Oct 03 14:20:09 crc kubenswrapper[4962]: I1003 14:20:09.829841 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lt2vd\" (UniqueName: \"kubernetes.io/projected/7544721a-dd02-4e36-8660-402cb244510e-kube-api-access-lt2vd\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:09 crc kubenswrapper[4962]: I1003 14:20:09.829858 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7544721a-dd02-4e36-8660-402cb244510e-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:09 crc kubenswrapper[4962]: I1003 14:20:09.829872 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7544721a-dd02-4e36-8660-402cb244510e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:09 crc kubenswrapper[4962]: I1003 14:20:09.931726 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh8gs\" (UniqueName: \"kubernetes.io/projected/7bd2b1ca-d49b-4981-a6d1-1ceb9135c125-kube-api-access-gh8gs\") pod \"redhat-marketplace-26sq2\" (UID: \"7bd2b1ca-d49b-4981-a6d1-1ceb9135c125\") " pod="openshift-marketplace/redhat-marketplace-26sq2" Oct 03 14:20:09 crc kubenswrapper[4962]: I1003 14:20:09.932433 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bd2b1ca-d49b-4981-a6d1-1ceb9135c125-utilities\") pod \"redhat-marketplace-26sq2\" (UID: \"7bd2b1ca-d49b-4981-a6d1-1ceb9135c125\") " pod="openshift-marketplace/redhat-marketplace-26sq2" Oct 03 14:20:09 crc kubenswrapper[4962]: I1003 14:20:09.933035 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bd2b1ca-d49b-4981-a6d1-1ceb9135c125-catalog-content\") pod \"redhat-marketplace-26sq2\" (UID: \"7bd2b1ca-d49b-4981-a6d1-1ceb9135c125\") " pod="openshift-marketplace/redhat-marketplace-26sq2" Oct 03 14:20:09 crc kubenswrapper[4962]: I1003 14:20:09.932983 
4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bd2b1ca-d49b-4981-a6d1-1ceb9135c125-utilities\") pod \"redhat-marketplace-26sq2\" (UID: \"7bd2b1ca-d49b-4981-a6d1-1ceb9135c125\") " pod="openshift-marketplace/redhat-marketplace-26sq2" Oct 03 14:20:09 crc kubenswrapper[4962]: I1003 14:20:09.933405 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bd2b1ca-d49b-4981-a6d1-1ceb9135c125-catalog-content\") pod \"redhat-marketplace-26sq2\" (UID: \"7bd2b1ca-d49b-4981-a6d1-1ceb9135c125\") " pod="openshift-marketplace/redhat-marketplace-26sq2" Oct 03 14:20:09 crc kubenswrapper[4962]: I1003 14:20:09.949483 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh8gs\" (UniqueName: \"kubernetes.io/projected/7bd2b1ca-d49b-4981-a6d1-1ceb9135c125-kube-api-access-gh8gs\") pod \"redhat-marketplace-26sq2\" (UID: \"7bd2b1ca-d49b-4981-a6d1-1ceb9135c125\") " pod="openshift-marketplace/redhat-marketplace-26sq2" Oct 03 14:20:10 crc kubenswrapper[4962]: I1003 14:20:10.139716 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-26sq2" Oct 03 14:20:10 crc kubenswrapper[4962]: I1003 14:20:10.373402 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-bhvfh" event={"ID":"7544721a-dd02-4e36-8660-402cb244510e","Type":"ContainerDied","Data":"5149277d1454f81cbf68a369439ed3f35278e78d9613af3525934e3fef5bfa61"} Oct 03 14:20:10 crc kubenswrapper[4962]: I1003 14:20:10.373687 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5149277d1454f81cbf68a369439ed3f35278e78d9613af3525934e3fef5bfa61" Oct 03 14:20:10 crc kubenswrapper[4962]: I1003 14:20:10.373802 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-bhvfh" Oct 03 14:20:10 crc kubenswrapper[4962]: I1003 14:20:10.567265 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-26sq2"] Oct 03 14:20:10 crc kubenswrapper[4962]: W1003 14:20:10.574484 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bd2b1ca_d49b_4981_a6d1_1ceb9135c125.slice/crio-78bb44f7d805af606957e3f9334b28c80af73f47e663ab67f06e34eac29055f5 WatchSource:0}: Error finding container 78bb44f7d805af606957e3f9334b28c80af73f47e663ab67f06e34eac29055f5: Status 404 returned error can't find the container with id 78bb44f7d805af606957e3f9334b28c80af73f47e663ab67f06e34eac29055f5 Oct 03 14:20:10 crc kubenswrapper[4962]: I1003 14:20:10.978181 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78977f7cdf-6kqzb"] Oct 03 14:20:10 crc kubenswrapper[4962]: I1003 14:20:10.983819 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78977f7cdf-6kqzb" Oct 03 14:20:10 crc kubenswrapper[4962]: I1003 14:20:10.988396 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78977f7cdf-6kqzb"] Oct 03 14:20:10 crc kubenswrapper[4962]: I1003 14:20:10.998762 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-6t8dq"] Oct 03 14:20:11 crc kubenswrapper[4962]: I1003 14:20:11.001733 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-6t8dq" Oct 03 14:20:11 crc kubenswrapper[4962]: I1003 14:20:11.004826 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 03 14:20:11 crc kubenswrapper[4962]: I1003 14:20:11.007152 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-dgxx6" Oct 03 14:20:11 crc kubenswrapper[4962]: I1003 14:20:11.007340 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 03 14:20:11 crc kubenswrapper[4962]: I1003 14:20:11.007612 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 03 14:20:11 crc kubenswrapper[4962]: I1003 14:20:11.025100 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-6t8dq"] Oct 03 14:20:11 crc kubenswrapper[4962]: I1003 14:20:11.053406 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a576d489-e3d2-4414-9018-4639cf4b478e-scripts\") pod \"keystone-bootstrap-6t8dq\" (UID: \"a576d489-e3d2-4414-9018-4639cf4b478e\") " pod="openstack/keystone-bootstrap-6t8dq" Oct 03 14:20:11 crc kubenswrapper[4962]: I1003 14:20:11.053483 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpskj\" (UniqueName: \"kubernetes.io/projected/18b9a747-b5c9-434d-93ef-ec99ad3da244-kube-api-access-jpskj\") pod \"dnsmasq-dns-78977f7cdf-6kqzb\" (UID: \"18b9a747-b5c9-434d-93ef-ec99ad3da244\") " pod="openstack/dnsmasq-dns-78977f7cdf-6kqzb" Oct 03 14:20:11 crc kubenswrapper[4962]: I1003 14:20:11.053511 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18b9a747-b5c9-434d-93ef-ec99ad3da244-ovsdbserver-sb\") pod \"dnsmasq-dns-78977f7cdf-6kqzb\" (UID: \"18b9a747-b5c9-434d-93ef-ec99ad3da244\") " pod="openstack/dnsmasq-dns-78977f7cdf-6kqzb" Oct 03 14:20:11 crc kubenswrapper[4962]: I1003 14:20:11.053548 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j8w5\" (UniqueName: \"kubernetes.io/projected/a576d489-e3d2-4414-9018-4639cf4b478e-kube-api-access-9j8w5\") pod \"keystone-bootstrap-6t8dq\" (UID: \"a576d489-e3d2-4414-9018-4639cf4b478e\") " pod="openstack/keystone-bootstrap-6t8dq" Oct 03 14:20:11 crc kubenswrapper[4962]: I1003 14:20:11.053582 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a576d489-e3d2-4414-9018-4639cf4b478e-fernet-keys\") pod \"keystone-bootstrap-6t8dq\" (UID: \"a576d489-e3d2-4414-9018-4639cf4b478e\") " pod="openstack/keystone-bootstrap-6t8dq" Oct 03 14:20:11 crc kubenswrapper[4962]: I1003 14:20:11.053603 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a576d489-e3d2-4414-9018-4639cf4b478e-credential-keys\") pod \"keystone-bootstrap-6t8dq\" (UID: \"a576d489-e3d2-4414-9018-4639cf4b478e\") " pod="openstack/keystone-bootstrap-6t8dq" Oct 03 14:20:11 crc kubenswrapper[4962]: I1003 14:20:11.053622 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/18b9a747-b5c9-434d-93ef-ec99ad3da244-dns-svc\") pod \"dnsmasq-dns-78977f7cdf-6kqzb\" (UID: \"18b9a747-b5c9-434d-93ef-ec99ad3da244\") " pod="openstack/dnsmasq-dns-78977f7cdf-6kqzb" Oct 03 14:20:11 crc kubenswrapper[4962]: I1003 14:20:11.053661 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a576d489-e3d2-4414-9018-4639cf4b478e-combined-ca-bundle\") pod \"keystone-bootstrap-6t8dq\" (UID: \"a576d489-e3d2-4414-9018-4639cf4b478e\") " pod="openstack/keystone-bootstrap-6t8dq" Oct 03 14:20:11 crc kubenswrapper[4962]: I1003 14:20:11.053719 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18b9a747-b5c9-434d-93ef-ec99ad3da244-ovsdbserver-nb\") pod \"dnsmasq-dns-78977f7cdf-6kqzb\" (UID: \"18b9a747-b5c9-434d-93ef-ec99ad3da244\") " pod="openstack/dnsmasq-dns-78977f7cdf-6kqzb" Oct 03 14:20:11 crc kubenswrapper[4962]: I1003 14:20:11.053768 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a576d489-e3d2-4414-9018-4639cf4b478e-config-data\") pod \"keystone-bootstrap-6t8dq\" (UID: \"a576d489-e3d2-4414-9018-4639cf4b478e\") " pod="openstack/keystone-bootstrap-6t8dq" Oct 03 14:20:11 crc kubenswrapper[4962]: I1003 14:20:11.053793 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18b9a747-b5c9-434d-93ef-ec99ad3da244-config\") pod \"dnsmasq-dns-78977f7cdf-6kqzb\" (UID: \"18b9a747-b5c9-434d-93ef-ec99ad3da244\") " pod="openstack/dnsmasq-dns-78977f7cdf-6kqzb" Oct 03 14:20:11 crc kubenswrapper[4962]: I1003 14:20:11.154827 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18b9a747-b5c9-434d-93ef-ec99ad3da244-ovsdbserver-nb\") pod \"dnsmasq-dns-78977f7cdf-6kqzb\" (UID: \"18b9a747-b5c9-434d-93ef-ec99ad3da244\") " pod="openstack/dnsmasq-dns-78977f7cdf-6kqzb" Oct 03 14:20:11 crc kubenswrapper[4962]: I1003 14:20:11.154888 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a576d489-e3d2-4414-9018-4639cf4b478e-config-data\") pod \"keystone-bootstrap-6t8dq\" (UID: \"a576d489-e3d2-4414-9018-4639cf4b478e\") " pod="openstack/keystone-bootstrap-6t8dq" Oct 03 14:20:11 crc kubenswrapper[4962]: I1003 14:20:11.154923 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18b9a747-b5c9-434d-93ef-ec99ad3da244-config\") pod \"dnsmasq-dns-78977f7cdf-6kqzb\" (UID: \"18b9a747-b5c9-434d-93ef-ec99ad3da244\") " pod="openstack/dnsmasq-dns-78977f7cdf-6kqzb" Oct 03 14:20:11 crc kubenswrapper[4962]: I1003 14:20:11.154956 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a576d489-e3d2-4414-9018-4639cf4b478e-scripts\") pod \"keystone-bootstrap-6t8dq\" (UID: \"a576d489-e3d2-4414-9018-4639cf4b478e\") " pod="openstack/keystone-bootstrap-6t8dq" Oct 03 14:20:11 crc kubenswrapper[4962]: I1003 14:20:11.155018 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpskj\" (UniqueName: 
\"kubernetes.io/projected/18b9a747-b5c9-434d-93ef-ec99ad3da244-kube-api-access-jpskj\") pod \"dnsmasq-dns-78977f7cdf-6kqzb\" (UID: \"18b9a747-b5c9-434d-93ef-ec99ad3da244\") " pod="openstack/dnsmasq-dns-78977f7cdf-6kqzb" Oct 03 14:20:11 crc kubenswrapper[4962]: I1003 14:20:11.155047 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18b9a747-b5c9-434d-93ef-ec99ad3da244-ovsdbserver-sb\") pod \"dnsmasq-dns-78977f7cdf-6kqzb\" (UID: \"18b9a747-b5c9-434d-93ef-ec99ad3da244\") " pod="openstack/dnsmasq-dns-78977f7cdf-6kqzb" Oct 03 14:20:11 crc kubenswrapper[4962]: I1003 14:20:11.155090 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9j8w5\" (UniqueName: \"kubernetes.io/projected/a576d489-e3d2-4414-9018-4639cf4b478e-kube-api-access-9j8w5\") pod \"keystone-bootstrap-6t8dq\" (UID: \"a576d489-e3d2-4414-9018-4639cf4b478e\") " pod="openstack/keystone-bootstrap-6t8dq" Oct 03 14:20:11 crc kubenswrapper[4962]: I1003 14:20:11.155127 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a576d489-e3d2-4414-9018-4639cf4b478e-fernet-keys\") pod \"keystone-bootstrap-6t8dq\" (UID: \"a576d489-e3d2-4414-9018-4639cf4b478e\") " pod="openstack/keystone-bootstrap-6t8dq" Oct 03 14:20:11 crc kubenswrapper[4962]: I1003 14:20:11.155145 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a576d489-e3d2-4414-9018-4639cf4b478e-credential-keys\") pod \"keystone-bootstrap-6t8dq\" (UID: \"a576d489-e3d2-4414-9018-4639cf4b478e\") " pod="openstack/keystone-bootstrap-6t8dq" Oct 03 14:20:11 crc kubenswrapper[4962]: I1003 14:20:11.155161 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18b9a747-b5c9-434d-93ef-ec99ad3da244-dns-svc\") pod \"dnsmasq-dns-78977f7cdf-6kqzb\" (UID: \"18b9a747-b5c9-434d-93ef-ec99ad3da244\") " pod="openstack/dnsmasq-dns-78977f7cdf-6kqzb" Oct 03 14:20:11 crc kubenswrapper[4962]: I1003 14:20:11.155181 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a576d489-e3d2-4414-9018-4639cf4b478e-combined-ca-bundle\") pod \"keystone-bootstrap-6t8dq\" (UID: \"a576d489-e3d2-4414-9018-4639cf4b478e\") " pod="openstack/keystone-bootstrap-6t8dq" Oct 03 14:20:11 crc kubenswrapper[4962]: I1003 14:20:11.156713 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18b9a747-b5c9-434d-93ef-ec99ad3da244-dns-svc\") pod \"dnsmasq-dns-78977f7cdf-6kqzb\" (UID: \"18b9a747-b5c9-434d-93ef-ec99ad3da244\") " pod="openstack/dnsmasq-dns-78977f7cdf-6kqzb" Oct 03 14:20:11 crc kubenswrapper[4962]: I1003 14:20:11.156845 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18b9a747-b5c9-434d-93ef-ec99ad3da244-config\") pod \"dnsmasq-dns-78977f7cdf-6kqzb\" (UID: \"18b9a747-b5c9-434d-93ef-ec99ad3da244\") " pod="openstack/dnsmasq-dns-78977f7cdf-6kqzb" Oct 03 14:20:11 crc kubenswrapper[4962]: I1003 14:20:11.156872 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18b9a747-b5c9-434d-93ef-ec99ad3da244-ovsdbserver-sb\") pod \"dnsmasq-dns-78977f7cdf-6kqzb\" (UID: 
\"18b9a747-b5c9-434d-93ef-ec99ad3da244\") " pod="openstack/dnsmasq-dns-78977f7cdf-6kqzb" Oct 03 14:20:11 crc kubenswrapper[4962]: I1003 14:20:11.157065 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18b9a747-b5c9-434d-93ef-ec99ad3da244-ovsdbserver-nb\") pod \"dnsmasq-dns-78977f7cdf-6kqzb\" (UID: \"18b9a747-b5c9-434d-93ef-ec99ad3da244\") " pod="openstack/dnsmasq-dns-78977f7cdf-6kqzb" Oct 03 14:20:11 crc kubenswrapper[4962]: I1003 14:20:11.162740 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a576d489-e3d2-4414-9018-4639cf4b478e-fernet-keys\") pod \"keystone-bootstrap-6t8dq\" (UID: \"a576d489-e3d2-4414-9018-4639cf4b478e\") " pod="openstack/keystone-bootstrap-6t8dq" Oct 03 14:20:11 crc kubenswrapper[4962]: I1003 14:20:11.164694 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a576d489-e3d2-4414-9018-4639cf4b478e-credential-keys\") pod \"keystone-bootstrap-6t8dq\" (UID: \"a576d489-e3d2-4414-9018-4639cf4b478e\") " pod="openstack/keystone-bootstrap-6t8dq" Oct 03 14:20:11 crc kubenswrapper[4962]: I1003 14:20:11.167321 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a576d489-e3d2-4414-9018-4639cf4b478e-combined-ca-bundle\") pod \"keystone-bootstrap-6t8dq\" (UID: \"a576d489-e3d2-4414-9018-4639cf4b478e\") " pod="openstack/keystone-bootstrap-6t8dq" Oct 03 14:20:11 crc kubenswrapper[4962]: I1003 14:20:11.186028 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a576d489-e3d2-4414-9018-4639cf4b478e-scripts\") pod \"keystone-bootstrap-6t8dq\" (UID: \"a576d489-e3d2-4414-9018-4639cf4b478e\") " pod="openstack/keystone-bootstrap-6t8dq" Oct 03 14:20:11 crc kubenswrapper[4962]: I1003 14:20:11.186891 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a576d489-e3d2-4414-9018-4639cf4b478e-config-data\") pod \"keystone-bootstrap-6t8dq\" (UID: \"a576d489-e3d2-4414-9018-4639cf4b478e\") " pod="openstack/keystone-bootstrap-6t8dq" Oct 03 14:20:11 crc kubenswrapper[4962]: I1003 14:20:11.187803 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j8w5\" (UniqueName: \"kubernetes.io/projected/a576d489-e3d2-4414-9018-4639cf4b478e-kube-api-access-9j8w5\") pod \"keystone-bootstrap-6t8dq\" (UID: \"a576d489-e3d2-4414-9018-4639cf4b478e\") " pod="openstack/keystone-bootstrap-6t8dq" Oct 03 14:20:11 crc kubenswrapper[4962]: I1003 14:20:11.191328 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpskj\" (UniqueName: \"kubernetes.io/projected/18b9a747-b5c9-434d-93ef-ec99ad3da244-kube-api-access-jpskj\") pod \"dnsmasq-dns-78977f7cdf-6kqzb\" (UID: \"18b9a747-b5c9-434d-93ef-ec99ad3da244\") " pod="openstack/dnsmasq-dns-78977f7cdf-6kqzb" Oct 03 14:20:11 crc kubenswrapper[4962]: I1003 14:20:11.323698 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78977f7cdf-6kqzb" Oct 03 14:20:11 crc kubenswrapper[4962]: I1003 14:20:11.340731 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-6t8dq" Oct 03 14:20:11 crc kubenswrapper[4962]: I1003 14:20:11.386743 4962 generic.go:334] "Generic (PLEG): container finished" podID="7bd2b1ca-d49b-4981-a6d1-1ceb9135c125" containerID="2d472b857a5b6add06b5fdcc17440e1ee0c55b3f5d23bc00381ceeed20a60552" exitCode=0 Oct 03 14:20:11 crc kubenswrapper[4962]: I1003 14:20:11.386786 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-26sq2" event={"ID":"7bd2b1ca-d49b-4981-a6d1-1ceb9135c125","Type":"ContainerDied","Data":"2d472b857a5b6add06b5fdcc17440e1ee0c55b3f5d23bc00381ceeed20a60552"} Oct 03 14:20:11 crc kubenswrapper[4962]: I1003 14:20:11.386813 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-26sq2" event={"ID":"7bd2b1ca-d49b-4981-a6d1-1ceb9135c125","Type":"ContainerStarted","Data":"78bb44f7d805af606957e3f9334b28c80af73f47e663ab67f06e34eac29055f5"} Oct 03 14:20:11 crc kubenswrapper[4962]: I1003 14:20:11.388471 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 14:20:11 crc kubenswrapper[4962]: I1003 14:20:11.827359 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78977f7cdf-6kqzb"] Oct 03 14:20:11 crc kubenswrapper[4962]: W1003 14:20:11.831881 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda576d489_e3d2_4414_9018_4639cf4b478e.slice/crio-69932bd3aae89f6557c4b37d7de2238d9d04ad0341554d22c183af5b822d1c8d WatchSource:0}: Error finding container 69932bd3aae89f6557c4b37d7de2238d9d04ad0341554d22c183af5b822d1c8d: Status 404 returned error can't find the container with id 69932bd3aae89f6557c4b37d7de2238d9d04ad0341554d22c183af5b822d1c8d Oct 03 14:20:11 crc kubenswrapper[4962]: I1003 14:20:11.832780 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-6t8dq"] Oct 03 14:20:11 crc kubenswrapper[4962]: W1003 14:20:11.837549 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18b9a747_b5c9_434d_93ef_ec99ad3da244.slice/crio-1b6f1c853be17178b647bd5a2292772e149c0b34fef31bb1cc376e00292dcdae WatchSource:0}: Error finding container 1b6f1c853be17178b647bd5a2292772e149c0b34fef31bb1cc376e00292dcdae: Status 404 returned error can't find the container with id 1b6f1c853be17178b647bd5a2292772e149c0b34fef31bb1cc376e00292dcdae Oct 03 14:20:12 crc kubenswrapper[4962]: I1003 14:20:12.394297 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6t8dq" event={"ID":"a576d489-e3d2-4414-9018-4639cf4b478e","Type":"ContainerStarted","Data":"2d3d3b215770e96653eccb58336c037e1515850ead95ae01bcff3479a9a85113"} Oct 03 14:20:12 crc kubenswrapper[4962]: I1003 14:20:12.394611 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6t8dq" event={"ID":"a576d489-e3d2-4414-9018-4639cf4b478e","Type":"ContainerStarted","Data":"69932bd3aae89f6557c4b37d7de2238d9d04ad0341554d22c183af5b822d1c8d"} Oct 03 14:20:12 crc kubenswrapper[4962]: I1003 14:20:12.396899 4962 generic.go:334] "Generic (PLEG): container finished" podID="7bd2b1ca-d49b-4981-a6d1-1ceb9135c125" containerID="3a619077f6474f708048b10a07217a3f611ce4181ea5d341a6710e7afa59265d" exitCode=0 Oct 03 14:20:12 crc kubenswrapper[4962]: I1003 14:20:12.396948 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-26sq2" event={"ID":"7bd2b1ca-d49b-4981-a6d1-1ceb9135c125","Type":"ContainerDied","Data":"3a619077f6474f708048b10a07217a3f611ce4181ea5d341a6710e7afa59265d"} Oct 03 14:20:12 crc kubenswrapper[4962]: I1003 14:20:12.400192 4962 generic.go:334] "Generic (PLEG): container finished" podID="18b9a747-b5c9-434d-93ef-ec99ad3da244" containerID="f1ee03f9aeadc08915244e702c734d5f282c46dd6b9229bf03a1ad1ad1317726" exitCode=0 Oct 03 14:20:12 crc kubenswrapper[4962]: I1003 14:20:12.400250 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78977f7cdf-6kqzb" event={"ID":"18b9a747-b5c9-434d-93ef-ec99ad3da244","Type":"ContainerDied","Data":"f1ee03f9aeadc08915244e702c734d5f282c46dd6b9229bf03a1ad1ad1317726"} Oct 03 14:20:12 crc kubenswrapper[4962]: I1003 14:20:12.400270 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78977f7cdf-6kqzb" event={"ID":"18b9a747-b5c9-434d-93ef-ec99ad3da244","Type":"ContainerStarted","Data":"1b6f1c853be17178b647bd5a2292772e149c0b34fef31bb1cc376e00292dcdae"} Oct 03 14:20:12 crc kubenswrapper[4962]: I1003 14:20:12.420050 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-6t8dq" podStartSLOduration=2.420024403 podStartE2EDuration="2.420024403s" podCreationTimestamp="2025-10-03 14:20:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:20:12.414493628 +0000 UTC m=+5420.818391483" watchObservedRunningTime="2025-10-03 14:20:12.420024403 +0000 UTC m=+5420.823922238" Oct 03 14:20:12 crc kubenswrapper[4962]: E1003 14:20:12.463591 4962 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bd2b1ca_d49b_4981_a6d1_1ceb9135c125.slice/crio-conmon-3a619077f6474f708048b10a07217a3f611ce4181ea5d341a6710e7afa59265d.scope\": RecentStats: unable to find data in memory cache]" Oct 03 14:20:13 crc kubenswrapper[4962]: I1003 14:20:13.425081 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-26sq2" event={"ID":"7bd2b1ca-d49b-4981-a6d1-1ceb9135c125","Type":"ContainerStarted","Data":"3b6d980609a2d40cd6c413ed2783db02c1adff3063035faeba1074b342f03522"} Oct 03 14:20:13 crc kubenswrapper[4962]: I1003 14:20:13.427214 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78977f7cdf-6kqzb" event={"ID":"18b9a747-b5c9-434d-93ef-ec99ad3da244","Type":"ContainerStarted","Data":"b0395cdc8d895044a72c25b3ffd359c77b2d298457ad7db73d98625d06105e9c"} Oct 03 14:20:13 crc kubenswrapper[4962]: I1003 14:20:13.427511 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78977f7cdf-6kqzb" Oct 03 14:20:13 crc kubenswrapper[4962]: I1003 14:20:13.446392 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-26sq2" podStartSLOduration=2.852167616 podStartE2EDuration="4.446372239s" podCreationTimestamp="2025-10-03 14:20:09 +0000 UTC" firstStartedPulling="2025-10-03 14:20:11.388279966 +0000 UTC m=+5419.792177801" lastFinishedPulling="2025-10-03 14:20:12.982484589 +0000 UTC m=+5421.386382424" observedRunningTime="2025-10-03 14:20:13.443630017 +0000 UTC m=+5421.847527872" watchObservedRunningTime="2025-10-03 14:20:13.446372239 +0000 UTC m=+5421.850270074" Oct 03 14:20:13 crc 
kubenswrapper[4962]: I1003 14:20:13.467285 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78977f7cdf-6kqzb" podStartSLOduration=3.467268285 podStartE2EDuration="3.467268285s" podCreationTimestamp="2025-10-03 14:20:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:20:13.462585473 +0000 UTC m=+5421.866483328" watchObservedRunningTime="2025-10-03 14:20:13.467268285 +0000 UTC m=+5421.871166110" Oct 03 14:20:15 crc kubenswrapper[4962]: I1003 14:20:15.444960 4962 generic.go:334] "Generic (PLEG): container finished" podID="a576d489-e3d2-4414-9018-4639cf4b478e" containerID="2d3d3b215770e96653eccb58336c037e1515850ead95ae01bcff3479a9a85113" exitCode=0 Oct 03 14:20:15 crc kubenswrapper[4962]: I1003 14:20:15.445056 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6t8dq" event={"ID":"a576d489-e3d2-4414-9018-4639cf4b478e","Type":"ContainerDied","Data":"2d3d3b215770e96653eccb58336c037e1515850ead95ae01bcff3479a9a85113"} Oct 03 14:20:16 crc kubenswrapper[4962]: I1003 14:20:16.756390 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-6t8dq" Oct 03 14:20:16 crc kubenswrapper[4962]: I1003 14:20:16.864225 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a576d489-e3d2-4414-9018-4639cf4b478e-combined-ca-bundle\") pod \"a576d489-e3d2-4414-9018-4639cf4b478e\" (UID: \"a576d489-e3d2-4414-9018-4639cf4b478e\") " Oct 03 14:20:16 crc kubenswrapper[4962]: I1003 14:20:16.864294 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9j8w5\" (UniqueName: \"kubernetes.io/projected/a576d489-e3d2-4414-9018-4639cf4b478e-kube-api-access-9j8w5\") pod \"a576d489-e3d2-4414-9018-4639cf4b478e\" (UID: \"a576d489-e3d2-4414-9018-4639cf4b478e\") " Oct 03 14:20:16 crc kubenswrapper[4962]: I1003 14:20:16.864376 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a576d489-e3d2-4414-9018-4639cf4b478e-config-data\") pod \"a576d489-e3d2-4414-9018-4639cf4b478e\" (UID: \"a576d489-e3d2-4414-9018-4639cf4b478e\") " Oct 03 14:20:16 crc kubenswrapper[4962]: I1003 14:20:16.864430 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a576d489-e3d2-4414-9018-4639cf4b478e-scripts\") pod \"a576d489-e3d2-4414-9018-4639cf4b478e\" (UID: \"a576d489-e3d2-4414-9018-4639cf4b478e\") " Oct 03 14:20:16 crc kubenswrapper[4962]: I1003 14:20:16.864455 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a576d489-e3d2-4414-9018-4639cf4b478e-credential-keys\") pod \"a576d489-e3d2-4414-9018-4639cf4b478e\" (UID: \"a576d489-e3d2-4414-9018-4639cf4b478e\") " Oct 03 14:20:16 crc kubenswrapper[4962]: I1003 14:20:16.864495 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a576d489-e3d2-4414-9018-4639cf4b478e-fernet-keys\") pod \"a576d489-e3d2-4414-9018-4639cf4b478e\" (UID: \"a576d489-e3d2-4414-9018-4639cf4b478e\") " Oct 03 14:20:16 crc kubenswrapper[4962]: I1003 14:20:16.871117 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/a576d489-e3d2-4414-9018-4639cf4b478e-kube-api-access-9j8w5" (OuterVolumeSpecName: "kube-api-access-9j8w5") pod "a576d489-e3d2-4414-9018-4639cf4b478e" (UID: "a576d489-e3d2-4414-9018-4639cf4b478e"). InnerVolumeSpecName "kube-api-access-9j8w5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:20:16 crc kubenswrapper[4962]: I1003 14:20:16.871137 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a576d489-e3d2-4414-9018-4639cf4b478e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a576d489-e3d2-4414-9018-4639cf4b478e" (UID: "a576d489-e3d2-4414-9018-4639cf4b478e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:20:16 crc kubenswrapper[4962]: I1003 14:20:16.871366 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a576d489-e3d2-4414-9018-4639cf4b478e-scripts" (OuterVolumeSpecName: "scripts") pod "a576d489-e3d2-4414-9018-4639cf4b478e" (UID: "a576d489-e3d2-4414-9018-4639cf4b478e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:20:16 crc kubenswrapper[4962]: I1003 14:20:16.871815 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a576d489-e3d2-4414-9018-4639cf4b478e-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "a576d489-e3d2-4414-9018-4639cf4b478e" (UID: "a576d489-e3d2-4414-9018-4639cf4b478e"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:20:16 crc kubenswrapper[4962]: I1003 14:20:16.888113 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a576d489-e3d2-4414-9018-4639cf4b478e-config-data" (OuterVolumeSpecName: "config-data") pod "a576d489-e3d2-4414-9018-4639cf4b478e" (UID: "a576d489-e3d2-4414-9018-4639cf4b478e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:20:16 crc kubenswrapper[4962]: I1003 14:20:16.911191 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a576d489-e3d2-4414-9018-4639cf4b478e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a576d489-e3d2-4414-9018-4639cf4b478e" (UID: "a576d489-e3d2-4414-9018-4639cf4b478e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:20:16 crc kubenswrapper[4962]: I1003 14:20:16.966802 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a576d489-e3d2-4414-9018-4639cf4b478e-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:16 crc kubenswrapper[4962]: I1003 14:20:16.966842 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a576d489-e3d2-4414-9018-4639cf4b478e-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:16 crc kubenswrapper[4962]: I1003 14:20:16.966851 4962 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a576d489-e3d2-4414-9018-4639cf4b478e-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:16 crc kubenswrapper[4962]: I1003 14:20:16.966861 4962 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a576d489-e3d2-4414-9018-4639cf4b478e-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:16 crc kubenswrapper[4962]: I1003 14:20:16.966869 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a576d489-e3d2-4414-9018-4639cf4b478e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:16 crc kubenswrapper[4962]: I1003 14:20:16.966877 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9j8w5\" (UniqueName: \"kubernetes.io/projected/a576d489-e3d2-4414-9018-4639cf4b478e-kube-api-access-9j8w5\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:17 crc kubenswrapper[4962]: I1003 14:20:17.467778 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6t8dq" event={"ID":"a576d489-e3d2-4414-9018-4639cf4b478e","Type":"ContainerDied","Data":"69932bd3aae89f6557c4b37d7de2238d9d04ad0341554d22c183af5b822d1c8d"} Oct 03 14:20:17 crc kubenswrapper[4962]: I1003 14:20:17.467831 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69932bd3aae89f6557c4b37d7de2238d9d04ad0341554d22c183af5b822d1c8d" Oct 03 14:20:17 crc kubenswrapper[4962]: I1003 14:20:17.467868 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-6t8dq" Oct 03 14:20:17 crc kubenswrapper[4962]: I1003 14:20:17.540742 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-6t8dq"] Oct 03 14:20:17 crc kubenswrapper[4962]: I1003 14:20:17.548836 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-6t8dq"] Oct 03 14:20:17 crc kubenswrapper[4962]: I1003 14:20:17.629252 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-gz4fx"] Oct 03 14:20:17 crc kubenswrapper[4962]: E1003 14:20:17.629785 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a576d489-e3d2-4414-9018-4639cf4b478e" containerName="keystone-bootstrap" Oct 03 14:20:17 crc kubenswrapper[4962]: I1003 14:20:17.629858 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a576d489-e3d2-4414-9018-4639cf4b478e" containerName="keystone-bootstrap" Oct 03 14:20:17 crc kubenswrapper[4962]: I1003 14:20:17.630107 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a576d489-e3d2-4414-9018-4639cf4b478e" containerName="keystone-bootstrap" Oct 03 14:20:17 crc kubenswrapper[4962]: I1003 14:20:17.630728 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-gz4fx" Oct 03 14:20:17 crc kubenswrapper[4962]: I1003 14:20:17.634177 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 03 14:20:17 crc kubenswrapper[4962]: I1003 14:20:17.634892 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 03 14:20:17 crc kubenswrapper[4962]: I1003 14:20:17.635082 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-dgxx6" Oct 03 14:20:17 crc kubenswrapper[4962]: I1003 14:20:17.635238 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 03 14:20:17 crc kubenswrapper[4962]: I1003 14:20:17.641447 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-gz4fx"] Oct 03 14:20:17 crc kubenswrapper[4962]: I1003 14:20:17.779798 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a67e74ba-efcb-4f16-930f-57335376321f-fernet-keys\") pod \"keystone-bootstrap-gz4fx\" (UID: \"a67e74ba-efcb-4f16-930f-57335376321f\") " pod="openstack/keystone-bootstrap-gz4fx" Oct 03 14:20:17 crc kubenswrapper[4962]: I1003 14:20:17.779895 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a67e74ba-efcb-4f16-930f-57335376321f-combined-ca-bundle\") pod \"keystone-bootstrap-gz4fx\" (UID: \"a67e74ba-efcb-4f16-930f-57335376321f\") " pod="openstack/keystone-bootstrap-gz4fx" Oct 03 14:20:17 crc kubenswrapper[4962]: I1003 14:20:17.779934 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a67e74ba-efcb-4f16-930f-57335376321f-scripts\") pod \"keystone-bootstrap-gz4fx\" (UID: \"a67e74ba-efcb-4f16-930f-57335376321f\") " pod="openstack/keystone-bootstrap-gz4fx" Oct 03 14:20:17 crc kubenswrapper[4962]: I1003 14:20:17.779971 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkt4m\" (UniqueName: 
\"kubernetes.io/projected/a67e74ba-efcb-4f16-930f-57335376321f-kube-api-access-zkt4m\") pod \"keystone-bootstrap-gz4fx\" (UID: \"a67e74ba-efcb-4f16-930f-57335376321f\") " pod="openstack/keystone-bootstrap-gz4fx" Oct 03 14:20:17 crc kubenswrapper[4962]: I1003 14:20:17.780018 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a67e74ba-efcb-4f16-930f-57335376321f-credential-keys\") pod \"keystone-bootstrap-gz4fx\" (UID: \"a67e74ba-efcb-4f16-930f-57335376321f\") " pod="openstack/keystone-bootstrap-gz4fx" Oct 03 14:20:17 crc kubenswrapper[4962]: I1003 14:20:17.780180 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a67e74ba-efcb-4f16-930f-57335376321f-config-data\") pod \"keystone-bootstrap-gz4fx\" (UID: \"a67e74ba-efcb-4f16-930f-57335376321f\") " pod="openstack/keystone-bootstrap-gz4fx" Oct 03 14:20:17 crc kubenswrapper[4962]: I1003 14:20:17.881851 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a67e74ba-efcb-4f16-930f-57335376321f-config-data\") pod \"keystone-bootstrap-gz4fx\" (UID: \"a67e74ba-efcb-4f16-930f-57335376321f\") " pod="openstack/keystone-bootstrap-gz4fx" Oct 03 14:20:17 crc kubenswrapper[4962]: I1003 14:20:17.882217 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a67e74ba-efcb-4f16-930f-57335376321f-fernet-keys\") pod \"keystone-bootstrap-gz4fx\" (UID: \"a67e74ba-efcb-4f16-930f-57335376321f\") " pod="openstack/keystone-bootstrap-gz4fx" Oct 03 14:20:17 crc kubenswrapper[4962]: I1003 14:20:17.882286 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a67e74ba-efcb-4f16-930f-57335376321f-combined-ca-bundle\") pod \"keystone-bootstrap-gz4fx\" (UID: \"a67e74ba-efcb-4f16-930f-57335376321f\") " pod="openstack/keystone-bootstrap-gz4fx" Oct 03 14:20:17 crc kubenswrapper[4962]: I1003 14:20:17.882331 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a67e74ba-efcb-4f16-930f-57335376321f-scripts\") pod \"keystone-bootstrap-gz4fx\" (UID: \"a67e74ba-efcb-4f16-930f-57335376321f\") " pod="openstack/keystone-bootstrap-gz4fx" Oct 03 14:20:17 crc kubenswrapper[4962]: I1003 14:20:17.882379 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkt4m\" (UniqueName: \"kubernetes.io/projected/a67e74ba-efcb-4f16-930f-57335376321f-kube-api-access-zkt4m\") pod \"keystone-bootstrap-gz4fx\" (UID: \"a67e74ba-efcb-4f16-930f-57335376321f\") " pod="openstack/keystone-bootstrap-gz4fx" Oct 03 14:20:17 crc kubenswrapper[4962]: I1003 14:20:17.882429 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a67e74ba-efcb-4f16-930f-57335376321f-credential-keys\") pod \"keystone-bootstrap-gz4fx\" (UID: \"a67e74ba-efcb-4f16-930f-57335376321f\") " pod="openstack/keystone-bootstrap-gz4fx" Oct 03 14:20:17 crc kubenswrapper[4962]: I1003 14:20:17.887075 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a67e74ba-efcb-4f16-930f-57335376321f-combined-ca-bundle\") pod 
\"keystone-bootstrap-gz4fx\" (UID: \"a67e74ba-efcb-4f16-930f-57335376321f\") " pod="openstack/keystone-bootstrap-gz4fx" Oct 03 14:20:17 crc kubenswrapper[4962]: I1003 14:20:17.888608 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a67e74ba-efcb-4f16-930f-57335376321f-scripts\") pod \"keystone-bootstrap-gz4fx\" (UID: \"a67e74ba-efcb-4f16-930f-57335376321f\") " pod="openstack/keystone-bootstrap-gz4fx" Oct 03 14:20:17 crc kubenswrapper[4962]: I1003 14:20:17.890326 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a67e74ba-efcb-4f16-930f-57335376321f-credential-keys\") pod \"keystone-bootstrap-gz4fx\" (UID: \"a67e74ba-efcb-4f16-930f-57335376321f\") " pod="openstack/keystone-bootstrap-gz4fx" Oct 03 14:20:17 crc kubenswrapper[4962]: I1003 14:20:17.891179 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a67e74ba-efcb-4f16-930f-57335376321f-fernet-keys\") pod \"keystone-bootstrap-gz4fx\" (UID: \"a67e74ba-efcb-4f16-930f-57335376321f\") " pod="openstack/keystone-bootstrap-gz4fx" Oct 03 14:20:17 crc kubenswrapper[4962]: I1003 14:20:17.893565 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a67e74ba-efcb-4f16-930f-57335376321f-config-data\") pod \"keystone-bootstrap-gz4fx\" (UID: \"a67e74ba-efcb-4f16-930f-57335376321f\") " pod="openstack/keystone-bootstrap-gz4fx" Oct 03 14:20:17 crc kubenswrapper[4962]: I1003 14:20:17.900011 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkt4m\" (UniqueName: \"kubernetes.io/projected/a67e74ba-efcb-4f16-930f-57335376321f-kube-api-access-zkt4m\") pod \"keystone-bootstrap-gz4fx\" (UID: \"a67e74ba-efcb-4f16-930f-57335376321f\") " pod="openstack/keystone-bootstrap-gz4fx" Oct 03 14:20:17 crc kubenswrapper[4962]: I1003 14:20:17.950186 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-gz4fx" Oct 03 14:20:18 crc kubenswrapper[4962]: I1003 14:20:18.236848 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a576d489-e3d2-4414-9018-4639cf4b478e" path="/var/lib/kubelet/pods/a576d489-e3d2-4414-9018-4639cf4b478e/volumes" Oct 03 14:20:18 crc kubenswrapper[4962]: I1003 14:20:18.413453 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-gz4fx"] Oct 03 14:20:18 crc kubenswrapper[4962]: I1003 14:20:18.478001 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gz4fx" event={"ID":"a67e74ba-efcb-4f16-930f-57335376321f","Type":"ContainerStarted","Data":"fdd5a33f46a79a221477d53b0b130270d574f6461d531500494f46493383a491"} Oct 03 14:20:19 crc kubenswrapper[4962]: I1003 14:20:19.491508 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gz4fx" event={"ID":"a67e74ba-efcb-4f16-930f-57335376321f","Type":"ContainerStarted","Data":"b44a99512bc5d00d2cd67f149868b4f5c5f65bfc3c2413e6b9d655726c00be3d"} Oct 03 14:20:19 crc kubenswrapper[4962]: I1003 14:20:19.515364 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-gz4fx" podStartSLOduration=2.515347266 podStartE2EDuration="2.515347266s" podCreationTimestamp="2025-10-03 14:20:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:20:19.510326705 +0000 UTC m=+5427.914224540" watchObservedRunningTime="2025-10-03 14:20:19.515347266 +0000 UTC m=+5427.919245101" Oct 03 14:20:20 crc kubenswrapper[4962]: I1003 14:20:20.140853 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-26sq2" Oct 03 14:20:20 crc kubenswrapper[4962]: I1003 14:20:20.141425 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-26sq2" Oct 03 14:20:20 crc kubenswrapper[4962]: I1003 14:20:20.180936 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-26sq2" Oct 03 14:20:20 crc kubenswrapper[4962]: I1003 14:20:20.555092 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-26sq2" Oct 03 14:20:20 crc kubenswrapper[4962]: I1003 14:20:20.601030 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-26sq2"] Oct 03 14:20:21 crc kubenswrapper[4962]: I1003 14:20:21.325883 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-78977f7cdf-6kqzb" Oct 03 14:20:21 crc kubenswrapper[4962]: I1003 14:20:21.384536 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76c87b4597-2bjln"] Oct 03 14:20:21 crc kubenswrapper[4962]: I1003 14:20:21.384874 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76c87b4597-2bjln" podUID="de03c973-dd77-4560-975d-5bcc19732dc2" containerName="dnsmasq-dns" containerID="cri-o://0fca765c7236442e03c45c96d908ffafb0a7287b43564182027ae691cf43ec71" gracePeriod=10 Oct 03 14:20:21 crc kubenswrapper[4962]: I1003 14:20:21.511364 4962 generic.go:334] "Generic (PLEG): container finished" podID="a67e74ba-efcb-4f16-930f-57335376321f" containerID="b44a99512bc5d00d2cd67f149868b4f5c5f65bfc3c2413e6b9d655726c00be3d" exitCode=0 
Oct 03 14:20:21 crc kubenswrapper[4962]: I1003 14:20:21.511425 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gz4fx" event={"ID":"a67e74ba-efcb-4f16-930f-57335376321f","Type":"ContainerDied","Data":"b44a99512bc5d00d2cd67f149868b4f5c5f65bfc3c2413e6b9d655726c00be3d"}
Oct 03 14:20:21 crc kubenswrapper[4962]: I1003 14:20:21.513422 4962 generic.go:334] "Generic (PLEG): container finished" podID="de03c973-dd77-4560-975d-5bcc19732dc2" containerID="0fca765c7236442e03c45c96d908ffafb0a7287b43564182027ae691cf43ec71" exitCode=0
Oct 03 14:20:21 crc kubenswrapper[4962]: I1003 14:20:21.514045 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76c87b4597-2bjln" event={"ID":"de03c973-dd77-4560-975d-5bcc19732dc2","Type":"ContainerDied","Data":"0fca765c7236442e03c45c96d908ffafb0a7287b43564182027ae691cf43ec71"}
Oct 03 14:20:21 crc kubenswrapper[4962]: I1003 14:20:21.800985 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76c87b4597-2bjln"
Oct 03 14:20:21 crc kubenswrapper[4962]: I1003 14:20:21.978714 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgrrb\" (UniqueName: \"kubernetes.io/projected/de03c973-dd77-4560-975d-5bcc19732dc2-kube-api-access-wgrrb\") pod \"de03c973-dd77-4560-975d-5bcc19732dc2\" (UID: \"de03c973-dd77-4560-975d-5bcc19732dc2\") "
Oct 03 14:20:21 crc kubenswrapper[4962]: I1003 14:20:21.978818 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de03c973-dd77-4560-975d-5bcc19732dc2-config\") pod \"de03c973-dd77-4560-975d-5bcc19732dc2\" (UID: \"de03c973-dd77-4560-975d-5bcc19732dc2\") "
Oct 03 14:20:21 crc kubenswrapper[4962]: I1003 14:20:21.978966 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de03c973-dd77-4560-975d-5bcc19732dc2-dns-svc\") pod \"de03c973-dd77-4560-975d-5bcc19732dc2\" (UID: \"de03c973-dd77-4560-975d-5bcc19732dc2\") "
Oct 03 14:20:21 crc kubenswrapper[4962]: I1003 14:20:21.979036 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/de03c973-dd77-4560-975d-5bcc19732dc2-ovsdbserver-sb\") pod \"de03c973-dd77-4560-975d-5bcc19732dc2\" (UID: \"de03c973-dd77-4560-975d-5bcc19732dc2\") "
Oct 03 14:20:21 crc kubenswrapper[4962]: I1003 14:20:21.979097 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/de03c973-dd77-4560-975d-5bcc19732dc2-ovsdbserver-nb\") pod \"de03c973-dd77-4560-975d-5bcc19732dc2\" (UID: \"de03c973-dd77-4560-975d-5bcc19732dc2\") "
Oct 03 14:20:21 crc kubenswrapper[4962]: I1003 14:20:21.987974 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de03c973-dd77-4560-975d-5bcc19732dc2-kube-api-access-wgrrb" (OuterVolumeSpecName: "kube-api-access-wgrrb") pod "de03c973-dd77-4560-975d-5bcc19732dc2" (UID: "de03c973-dd77-4560-975d-5bcc19732dc2"). InnerVolumeSpecName "kube-api-access-wgrrb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:20:22 crc kubenswrapper[4962]: I1003 14:20:22.026224 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de03c973-dd77-4560-975d-5bcc19732dc2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "de03c973-dd77-4560-975d-5bcc19732dc2" (UID: "de03c973-dd77-4560-975d-5bcc19732dc2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:20:22 crc kubenswrapper[4962]: I1003 14:20:22.034266 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de03c973-dd77-4560-975d-5bcc19732dc2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "de03c973-dd77-4560-975d-5bcc19732dc2" (UID: "de03c973-dd77-4560-975d-5bcc19732dc2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:20:22 crc kubenswrapper[4962]: I1003 14:20:22.034652 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de03c973-dd77-4560-975d-5bcc19732dc2-config" (OuterVolumeSpecName: "config") pod "de03c973-dd77-4560-975d-5bcc19732dc2" (UID: "de03c973-dd77-4560-975d-5bcc19732dc2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:20:22 crc kubenswrapper[4962]: I1003 14:20:22.041921 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de03c973-dd77-4560-975d-5bcc19732dc2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "de03c973-dd77-4560-975d-5bcc19732dc2" (UID: "de03c973-dd77-4560-975d-5bcc19732dc2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:20:22 crc kubenswrapper[4962]: I1003 14:20:22.080834 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/de03c973-dd77-4560-975d-5bcc19732dc2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 03 14:20:22 crc kubenswrapper[4962]: I1003 14:20:22.080863 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/de03c973-dd77-4560-975d-5bcc19732dc2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 03 14:20:22 crc kubenswrapper[4962]: I1003 14:20:22.080872 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgrrb\" (UniqueName: \"kubernetes.io/projected/de03c973-dd77-4560-975d-5bcc19732dc2-kube-api-access-wgrrb\") on node \"crc\" DevicePath \"\""
Oct 03 14:20:22 crc kubenswrapper[4962]: I1003 14:20:22.080882 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de03c973-dd77-4560-975d-5bcc19732dc2-config\") on node \"crc\" DevicePath \"\""
Oct 03 14:20:22 crc kubenswrapper[4962]: I1003 14:20:22.080892 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de03c973-dd77-4560-975d-5bcc19732dc2-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 03 14:20:22 crc kubenswrapper[4962]: I1003 14:20:22.523245 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76c87b4597-2bjln" event={"ID":"de03c973-dd77-4560-975d-5bcc19732dc2","Type":"ContainerDied","Data":"c34cc3dbfa37846d82a501699a7d0c4c02a485ec3e365268c4f1ea7972ade419"}
Oct 03 14:20:22 crc kubenswrapper[4962]: I1003 14:20:22.523311 4962 scope.go:117] "RemoveContainer" containerID="0fca765c7236442e03c45c96d908ffafb0a7287b43564182027ae691cf43ec71"
Oct 03 14:20:22 crc kubenswrapper[4962]: I1003 14:20:22.523409 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76c87b4597-2bjln"
Oct 03 14:20:22 crc kubenswrapper[4962]: I1003 14:20:22.523501 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-26sq2" podUID="7bd2b1ca-d49b-4981-a6d1-1ceb9135c125" containerName="registry-server" containerID="cri-o://3b6d980609a2d40cd6c413ed2783db02c1adff3063035faeba1074b342f03522" gracePeriod=2
Oct 03 14:20:22 crc kubenswrapper[4962]: I1003 14:20:22.577792 4962 scope.go:117] "RemoveContainer" containerID="777822b7a05c100043161800e0ec18b922130878bff5a8e640f5f851919eefd6"
Oct 03 14:20:22 crc kubenswrapper[4962]: I1003 14:20:22.580557 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76c87b4597-2bjln"]
Oct 03 14:20:22 crc kubenswrapper[4962]: I1003 14:20:22.596550 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76c87b4597-2bjln"]
Oct 03 14:20:22 crc kubenswrapper[4962]: E1003 14:20:22.759923 4962 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bd2b1ca_d49b_4981_a6d1_1ceb9135c125.slice/crio-conmon-3b6d980609a2d40cd6c413ed2783db02c1adff3063035faeba1074b342f03522.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bd2b1ca_d49b_4981_a6d1_1ceb9135c125.slice/crio-3b6d980609a2d40cd6c413ed2783db02c1adff3063035faeba1074b342f03522.scope\": RecentStats: unable to find data in memory cache]"
Oct 03 14:20:22 crc kubenswrapper[4962]: I1003 14:20:22.827216 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-gz4fx"
Oct 03 14:20:22 crc kubenswrapper[4962]: I1003 14:20:22.937775 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-26sq2"
Oct 03 14:20:22 crc kubenswrapper[4962]: I1003 14:20:22.997719 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a67e74ba-efcb-4f16-930f-57335376321f-credential-keys\") pod \"a67e74ba-efcb-4f16-930f-57335376321f\" (UID: \"a67e74ba-efcb-4f16-930f-57335376321f\") "
Oct 03 14:20:22 crc kubenswrapper[4962]: I1003 14:20:22.997796 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a67e74ba-efcb-4f16-930f-57335376321f-fernet-keys\") pod \"a67e74ba-efcb-4f16-930f-57335376321f\" (UID: \"a67e74ba-efcb-4f16-930f-57335376321f\") "
Oct 03 14:20:22 crc kubenswrapper[4962]: I1003 14:20:22.997859 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a67e74ba-efcb-4f16-930f-57335376321f-config-data\") pod \"a67e74ba-efcb-4f16-930f-57335376321f\" (UID: \"a67e74ba-efcb-4f16-930f-57335376321f\") "
Oct 03 14:20:22 crc kubenswrapper[4962]: I1003 14:20:22.997974 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a67e74ba-efcb-4f16-930f-57335376321f-combined-ca-bundle\") pod \"a67e74ba-efcb-4f16-930f-57335376321f\" (UID: \"a67e74ba-efcb-4f16-930f-57335376321f\") "
Oct 03 14:20:22 crc kubenswrapper[4962]: I1003 14:20:22.998055 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkt4m\" (UniqueName: \"kubernetes.io/projected/a67e74ba-efcb-4f16-930f-57335376321f-kube-api-access-zkt4m\") pod \"a67e74ba-efcb-4f16-930f-57335376321f\" (UID: \"a67e74ba-efcb-4f16-930f-57335376321f\") "
Oct 03 14:20:22 crc kubenswrapper[4962]: I1003 14:20:22.998093 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a67e74ba-efcb-4f16-930f-57335376321f-scripts\") pod \"a67e74ba-efcb-4f16-930f-57335376321f\" (UID: \"a67e74ba-efcb-4f16-930f-57335376321f\") "
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.003286 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a67e74ba-efcb-4f16-930f-57335376321f-scripts" (OuterVolumeSpecName: "scripts") pod "a67e74ba-efcb-4f16-930f-57335376321f" (UID: "a67e74ba-efcb-4f16-930f-57335376321f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.004009 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a67e74ba-efcb-4f16-930f-57335376321f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a67e74ba-efcb-4f16-930f-57335376321f" (UID: "a67e74ba-efcb-4f16-930f-57335376321f"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.004852 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a67e74ba-efcb-4f16-930f-57335376321f-kube-api-access-zkt4m" (OuterVolumeSpecName: "kube-api-access-zkt4m") pod "a67e74ba-efcb-4f16-930f-57335376321f" (UID: "a67e74ba-efcb-4f16-930f-57335376321f"). InnerVolumeSpecName "kube-api-access-zkt4m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.006233 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a67e74ba-efcb-4f16-930f-57335376321f-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "a67e74ba-efcb-4f16-930f-57335376321f" (UID: "a67e74ba-efcb-4f16-930f-57335376321f"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.027310 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a67e74ba-efcb-4f16-930f-57335376321f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a67e74ba-efcb-4f16-930f-57335376321f" (UID: "a67e74ba-efcb-4f16-930f-57335376321f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.031814 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a67e74ba-efcb-4f16-930f-57335376321f-config-data" (OuterVolumeSpecName: "config-data") pod "a67e74ba-efcb-4f16-930f-57335376321f" (UID: "a67e74ba-efcb-4f16-930f-57335376321f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.099176 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gh8gs\" (UniqueName: \"kubernetes.io/projected/7bd2b1ca-d49b-4981-a6d1-1ceb9135c125-kube-api-access-gh8gs\") pod \"7bd2b1ca-d49b-4981-a6d1-1ceb9135c125\" (UID: \"7bd2b1ca-d49b-4981-a6d1-1ceb9135c125\") "
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.099232 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bd2b1ca-d49b-4981-a6d1-1ceb9135c125-utilities\") pod \"7bd2b1ca-d49b-4981-a6d1-1ceb9135c125\" (UID: \"7bd2b1ca-d49b-4981-a6d1-1ceb9135c125\") "
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.099321 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bd2b1ca-d49b-4981-a6d1-1ceb9135c125-catalog-content\") pod \"7bd2b1ca-d49b-4981-a6d1-1ceb9135c125\" (UID: \"7bd2b1ca-d49b-4981-a6d1-1ceb9135c125\") "
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.099593 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkt4m\" (UniqueName: \"kubernetes.io/projected/a67e74ba-efcb-4f16-930f-57335376321f-kube-api-access-zkt4m\") on node \"crc\" DevicePath \"\""
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.099609 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a67e74ba-efcb-4f16-930f-57335376321f-scripts\") on node \"crc\" DevicePath \"\""
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.099619 4962 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a67e74ba-efcb-4f16-930f-57335376321f-credential-keys\") on node \"crc\" DevicePath \"\""
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.099627 4962 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a67e74ba-efcb-4f16-930f-57335376321f-fernet-keys\") on node \"crc\" DevicePath \"\""
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.099659 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a67e74ba-efcb-4f16-930f-57335376321f-config-data\") on node \"crc\" DevicePath \"\""
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.099675 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a67e74ba-efcb-4f16-930f-57335376321f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.100271 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bd2b1ca-d49b-4981-a6d1-1ceb9135c125-utilities" (OuterVolumeSpecName: "utilities") pod "7bd2b1ca-d49b-4981-a6d1-1ceb9135c125" (UID: "7bd2b1ca-d49b-4981-a6d1-1ceb9135c125"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.102338 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bd2b1ca-d49b-4981-a6d1-1ceb9135c125-kube-api-access-gh8gs" (OuterVolumeSpecName: "kube-api-access-gh8gs") pod "7bd2b1ca-d49b-4981-a6d1-1ceb9135c125" (UID: "7bd2b1ca-d49b-4981-a6d1-1ceb9135c125"). InnerVolumeSpecName "kube-api-access-gh8gs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.111647 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bd2b1ca-d49b-4981-a6d1-1ceb9135c125-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7bd2b1ca-d49b-4981-a6d1-1ceb9135c125" (UID: "7bd2b1ca-d49b-4981-a6d1-1ceb9135c125"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.201992 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gh8gs\" (UniqueName: \"kubernetes.io/projected/7bd2b1ca-d49b-4981-a6d1-1ceb9135c125-kube-api-access-gh8gs\") on node \"crc\" DevicePath \"\""
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.202062 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bd2b1ca-d49b-4981-a6d1-1ceb9135c125-utilities\") on node \"crc\" DevicePath \"\""
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.202091 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bd2b1ca-d49b-4981-a6d1-1ceb9135c125-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.537015 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gz4fx" event={"ID":"a67e74ba-efcb-4f16-930f-57335376321f","Type":"ContainerDied","Data":"fdd5a33f46a79a221477d53b0b130270d574f6461d531500494f46493383a491"}
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.537259 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdd5a33f46a79a221477d53b0b130270d574f6461d531500494f46493383a491"
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.537077 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-gz4fx"
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.540516 4962 generic.go:334] "Generic (PLEG): container finished" podID="7bd2b1ca-d49b-4981-a6d1-1ceb9135c125" containerID="3b6d980609a2d40cd6c413ed2783db02c1adff3063035faeba1074b342f03522" exitCode=0
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.540574 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-26sq2" event={"ID":"7bd2b1ca-d49b-4981-a6d1-1ceb9135c125","Type":"ContainerDied","Data":"3b6d980609a2d40cd6c413ed2783db02c1adff3063035faeba1074b342f03522"}
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.540602 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-26sq2" event={"ID":"7bd2b1ca-d49b-4981-a6d1-1ceb9135c125","Type":"ContainerDied","Data":"78bb44f7d805af606957e3f9334b28c80af73f47e663ab67f06e34eac29055f5"}
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.540661 4962 scope.go:117] "RemoveContainer" containerID="3b6d980609a2d40cd6c413ed2783db02c1adff3063035faeba1074b342f03522"
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.540758 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-26sq2"
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.573682 4962 scope.go:117] "RemoveContainer" containerID="3a619077f6474f708048b10a07217a3f611ce4181ea5d341a6710e7afa59265d"
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.597273 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-26sq2"]
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.615687 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-26sq2"]
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.624965 4962 scope.go:117] "RemoveContainer" containerID="2d472b857a5b6add06b5fdcc17440e1ee0c55b3f5d23bc00381ceeed20a60552"
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.630839 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5bcc869749-z5gsq"]
Oct 03 14:20:23 crc kubenswrapper[4962]: E1003 14:20:23.631280 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bd2b1ca-d49b-4981-a6d1-1ceb9135c125" containerName="extract-utilities"
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.631307 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bd2b1ca-d49b-4981-a6d1-1ceb9135c125" containerName="extract-utilities"
Oct 03 14:20:23 crc kubenswrapper[4962]: E1003 14:20:23.631325 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bd2b1ca-d49b-4981-a6d1-1ceb9135c125" containerName="registry-server"
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.631336 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bd2b1ca-d49b-4981-a6d1-1ceb9135c125" containerName="registry-server"
Oct 03 14:20:23 crc kubenswrapper[4962]: E1003 14:20:23.631351 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de03c973-dd77-4560-975d-5bcc19732dc2" containerName="dnsmasq-dns"
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.631362 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="de03c973-dd77-4560-975d-5bcc19732dc2" containerName="dnsmasq-dns"
Oct 03 14:20:23 crc kubenswrapper[4962]: E1003 14:20:23.631386 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bd2b1ca-d49b-4981-a6d1-1ceb9135c125" containerName="extract-content"
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.631397 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bd2b1ca-d49b-4981-a6d1-1ceb9135c125" containerName="extract-content"
Oct 03 14:20:23 crc kubenswrapper[4962]: E1003 14:20:23.631424 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de03c973-dd77-4560-975d-5bcc19732dc2" containerName="init"
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.631434 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="de03c973-dd77-4560-975d-5bcc19732dc2" containerName="init"
Oct 03 14:20:23 crc kubenswrapper[4962]: E1003 14:20:23.631446 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a67e74ba-efcb-4f16-930f-57335376321f" containerName="keystone-bootstrap"
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.631457 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a67e74ba-efcb-4f16-930f-57335376321f" containerName="keystone-bootstrap"
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.631744 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a67e74ba-efcb-4f16-930f-57335376321f" containerName="keystone-bootstrap"
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.631773 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bd2b1ca-d49b-4981-a6d1-1ceb9135c125" containerName="registry-server"
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.631807 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="de03c973-dd77-4560-975d-5bcc19732dc2" containerName="dnsmasq-dns"
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.632615 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5bcc869749-z5gsq"
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.636099 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.636378 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-dgxx6"
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.636581 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.636801 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.645226 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5bcc869749-z5gsq"]
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.681062 4962 scope.go:117] "RemoveContainer" containerID="3b6d980609a2d40cd6c413ed2783db02c1adff3063035faeba1074b342f03522"
Oct 03 14:20:23 crc kubenswrapper[4962]: E1003 14:20:23.681439 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b6d980609a2d40cd6c413ed2783db02c1adff3063035faeba1074b342f03522\": container with ID starting with 3b6d980609a2d40cd6c413ed2783db02c1adff3063035faeba1074b342f03522 not found: ID does not exist" containerID="3b6d980609a2d40cd6c413ed2783db02c1adff3063035faeba1074b342f03522"
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.681468 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b6d980609a2d40cd6c413ed2783db02c1adff3063035faeba1074b342f03522"} err="failed to get container status \"3b6d980609a2d40cd6c413ed2783db02c1adff3063035faeba1074b342f03522\": rpc error: code = NotFound desc = could not find container \"3b6d980609a2d40cd6c413ed2783db02c1adff3063035faeba1074b342f03522\": container with ID starting with 3b6d980609a2d40cd6c413ed2783db02c1adff3063035faeba1074b342f03522 not found: ID does not exist"
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.681490 4962 scope.go:117] "RemoveContainer" containerID="3a619077f6474f708048b10a07217a3f611ce4181ea5d341a6710e7afa59265d"
Oct 03 14:20:23 crc kubenswrapper[4962]: E1003 14:20:23.681951 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a619077f6474f708048b10a07217a3f611ce4181ea5d341a6710e7afa59265d\": container with ID starting with 3a619077f6474f708048b10a07217a3f611ce4181ea5d341a6710e7afa59265d not found: ID does not exist" containerID="3a619077f6474f708048b10a07217a3f611ce4181ea5d341a6710e7afa59265d"
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.681968 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a619077f6474f708048b10a07217a3f611ce4181ea5d341a6710e7afa59265d"} err="failed to get container status \"3a619077f6474f708048b10a07217a3f611ce4181ea5d341a6710e7afa59265d\": rpc error: code = NotFound desc = could not find container \"3a619077f6474f708048b10a07217a3f611ce4181ea5d341a6710e7afa59265d\": container with ID starting with 3a619077f6474f708048b10a07217a3f611ce4181ea5d341a6710e7afa59265d not found: ID does not exist"
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.681987 4962 scope.go:117] "RemoveContainer" containerID="2d472b857a5b6add06b5fdcc17440e1ee0c55b3f5d23bc00381ceeed20a60552"
Oct 03 14:20:23 crc kubenswrapper[4962]: E1003 14:20:23.682335 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d472b857a5b6add06b5fdcc17440e1ee0c55b3f5d23bc00381ceeed20a60552\": container with ID starting with 2d472b857a5b6add06b5fdcc17440e1ee0c55b3f5d23bc00381ceeed20a60552 not found: ID does not exist" containerID="2d472b857a5b6add06b5fdcc17440e1ee0c55b3f5d23bc00381ceeed20a60552"
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.682353 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d472b857a5b6add06b5fdcc17440e1ee0c55b3f5d23bc00381ceeed20a60552"} err="failed to get container status \"2d472b857a5b6add06b5fdcc17440e1ee0c55b3f5d23bc00381ceeed20a60552\": rpc error: code = NotFound desc = could not find container \"2d472b857a5b6add06b5fdcc17440e1ee0c55b3f5d23bc00381ceeed20a60552\": container with ID starting with 2d472b857a5b6add06b5fdcc17440e1ee0c55b3f5d23bc00381ceeed20a60552 not found: ID does not exist"
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.709862 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d25d167-3bc9-436d-86ad-36da4b9a2a88-scripts\") pod \"keystone-5bcc869749-z5gsq\" (UID: \"4d25d167-3bc9-436d-86ad-36da4b9a2a88\") " pod="openstack/keystone-5bcc869749-z5gsq"
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.709903 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4d25d167-3bc9-436d-86ad-36da4b9a2a88-credential-keys\") pod \"keystone-5bcc869749-z5gsq\" (UID: \"4d25d167-3bc9-436d-86ad-36da4b9a2a88\") " pod="openstack/keystone-5bcc869749-z5gsq"
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.709926 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4d25d167-3bc9-436d-86ad-36da4b9a2a88-fernet-keys\") pod \"keystone-5bcc869749-z5gsq\" (UID: \"4d25d167-3bc9-436d-86ad-36da4b9a2a88\") " pod="openstack/keystone-5bcc869749-z5gsq"
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.709985 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbrwm\" (UniqueName: \"kubernetes.io/projected/4d25d167-3bc9-436d-86ad-36da4b9a2a88-kube-api-access-fbrwm\") pod \"keystone-5bcc869749-z5gsq\" (UID: \"4d25d167-3bc9-436d-86ad-36da4b9a2a88\") " pod="openstack/keystone-5bcc869749-z5gsq"
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.710088 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d25d167-3bc9-436d-86ad-36da4b9a2a88-combined-ca-bundle\") pod \"keystone-5bcc869749-z5gsq\" (UID: \"4d25d167-3bc9-436d-86ad-36da4b9a2a88\") " pod="openstack/keystone-5bcc869749-z5gsq"
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.710131 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d25d167-3bc9-436d-86ad-36da4b9a2a88-config-data\") pod \"keystone-5bcc869749-z5gsq\" (UID: \"4d25d167-3bc9-436d-86ad-36da4b9a2a88\") " pod="openstack/keystone-5bcc869749-z5gsq"
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.812976 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d25d167-3bc9-436d-86ad-36da4b9a2a88-combined-ca-bundle\") pod \"keystone-5bcc869749-z5gsq\" (UID: \"4d25d167-3bc9-436d-86ad-36da4b9a2a88\") " pod="openstack/keystone-5bcc869749-z5gsq"
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.813055 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d25d167-3bc9-436d-86ad-36da4b9a2a88-config-data\") pod \"keystone-5bcc869749-z5gsq\" (UID: \"4d25d167-3bc9-436d-86ad-36da4b9a2a88\") " pod="openstack/keystone-5bcc869749-z5gsq"
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.813266 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d25d167-3bc9-436d-86ad-36da4b9a2a88-scripts\") pod \"keystone-5bcc869749-z5gsq\" (UID: \"4d25d167-3bc9-436d-86ad-36da4b9a2a88\") " pod="openstack/keystone-5bcc869749-z5gsq"
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.813324 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4d25d167-3bc9-436d-86ad-36da4b9a2a88-credential-keys\") pod \"keystone-5bcc869749-z5gsq\" (UID: \"4d25d167-3bc9-436d-86ad-36da4b9a2a88\") " pod="openstack/keystone-5bcc869749-z5gsq"
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.813354 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4d25d167-3bc9-436d-86ad-36da4b9a2a88-fernet-keys\") pod \"keystone-5bcc869749-z5gsq\" (UID: \"4d25d167-3bc9-436d-86ad-36da4b9a2a88\") " pod="openstack/keystone-5bcc869749-z5gsq"
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.813382 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbrwm\" (UniqueName: \"kubernetes.io/projected/4d25d167-3bc9-436d-86ad-36da4b9a2a88-kube-api-access-fbrwm\") pod \"keystone-5bcc869749-z5gsq\" (UID: \"4d25d167-3bc9-436d-86ad-36da4b9a2a88\") " pod="openstack/keystone-5bcc869749-z5gsq"
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.816885 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4d25d167-3bc9-436d-86ad-36da4b9a2a88-credential-keys\") pod \"keystone-5bcc869749-z5gsq\" (UID: \"4d25d167-3bc9-436d-86ad-36da4b9a2a88\") " pod="openstack/keystone-5bcc869749-z5gsq"
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.817028 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d25d167-3bc9-436d-86ad-36da4b9a2a88-combined-ca-bundle\") pod \"keystone-5bcc869749-z5gsq\" (UID: \"4d25d167-3bc9-436d-86ad-36da4b9a2a88\") " pod="openstack/keystone-5bcc869749-z5gsq"
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.818306 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4d25d167-3bc9-436d-86ad-36da4b9a2a88-fernet-keys\") pod \"keystone-5bcc869749-z5gsq\" (UID: \"4d25d167-3bc9-436d-86ad-36da4b9a2a88\") " pod="openstack/keystone-5bcc869749-z5gsq"
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.818428 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d25d167-3bc9-436d-86ad-36da4b9a2a88-config-data\") pod \"keystone-5bcc869749-z5gsq\" (UID: \"4d25d167-3bc9-436d-86ad-36da4b9a2a88\") " pod="openstack/keystone-5bcc869749-z5gsq"
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.818825 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d25d167-3bc9-436d-86ad-36da4b9a2a88-scripts\") pod \"keystone-5bcc869749-z5gsq\" (UID: \"4d25d167-3bc9-436d-86ad-36da4b9a2a88\") " pod="openstack/keystone-5bcc869749-z5gsq"
Oct 03 14:20:23 crc kubenswrapper[4962]: I1003 14:20:23.834592 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbrwm\" (UniqueName: \"kubernetes.io/projected/4d25d167-3bc9-436d-86ad-36da4b9a2a88-kube-api-access-fbrwm\") pod \"keystone-5bcc869749-z5gsq\" (UID: \"4d25d167-3bc9-436d-86ad-36da4b9a2a88\") " pod="openstack/keystone-5bcc869749-z5gsq"
Oct 03 14:20:24 crc kubenswrapper[4962]: I1003 14:20:24.005725 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5bcc869749-z5gsq"
Oct 03 14:20:24 crc kubenswrapper[4962]: I1003 14:20:24.239984 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bd2b1ca-d49b-4981-a6d1-1ceb9135c125" path="/var/lib/kubelet/pods/7bd2b1ca-d49b-4981-a6d1-1ceb9135c125/volumes"
Oct 03 14:20:24 crc kubenswrapper[4962]: I1003 14:20:24.241978 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de03c973-dd77-4560-975d-5bcc19732dc2" path="/var/lib/kubelet/pods/de03c973-dd77-4560-975d-5bcc19732dc2/volumes"
Oct 03 14:20:24 crc kubenswrapper[4962]: I1003 14:20:24.455249 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5bcc869749-z5gsq"]
Oct 03 14:20:24 crc kubenswrapper[4962]: I1003 14:20:24.554726 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5bcc869749-z5gsq" event={"ID":"4d25d167-3bc9-436d-86ad-36da4b9a2a88","Type":"ContainerStarted","Data":"db263483711a566c49adb95b0020e695c458f7001c268ddffcb9b182cca632e1"}
Oct 03 14:20:25 crc kubenswrapper[4962]: I1003 14:20:25.562989 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5bcc869749-z5gsq" event={"ID":"4d25d167-3bc9-436d-86ad-36da4b9a2a88","Type":"ContainerStarted","Data":"816757220a36738c4fc550941c9b319ecba17ddadc682fd328b529833219100d"}
Oct 03 14:20:25 crc kubenswrapper[4962]: I1003 14:20:25.563709 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5bcc869749-z5gsq"
Oct 03 14:20:25 crc kubenswrapper[4962]: I1003 14:20:25.589134 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5bcc869749-z5gsq" podStartSLOduration=2.589108031 podStartE2EDuration="2.589108031s" podCreationTimestamp="2025-10-03 14:20:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:20:25.582893469 +0000 UTC m=+5433.986791304" watchObservedRunningTime="2025-10-03 14:20:25.589108031 +0000 UTC m=+5433.993005866"
Oct 03 14:20:55 crc kubenswrapper[4962]: I1003 14:20:55.610524 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5bcc869749-z5gsq"
Oct 03 14:21:00 crc kubenswrapper[4962]: I1003 14:21:00.077143 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Oct 03 14:21:00 crc kubenswrapper[4962]: I1003 14:21:00.079033 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Oct 03 14:21:00 crc kubenswrapper[4962]: I1003 14:21:00.080552 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-p49fp"
Oct 03 14:21:00 crc kubenswrapper[4962]: I1003 14:21:00.081463 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Oct 03 14:21:00 crc kubenswrapper[4962]: I1003 14:21:00.081467 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Oct 03 14:21:00 crc kubenswrapper[4962]: I1003 14:21:00.086087 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Oct 03 14:21:00 crc kubenswrapper[4962]: I1003 14:21:00.228083 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jpqk\" (UniqueName: \"kubernetes.io/projected/c583a81c-2241-4f3f-a190-2b90cff0b4db-kube-api-access-6jpqk\") pod \"openstackclient\" (UID: \"c583a81c-2241-4f3f-a190-2b90cff0b4db\") " pod="openstack/openstackclient"
Oct 03 14:21:00 crc kubenswrapper[4962]: I1003 14:21:00.228158 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c583a81c-2241-4f3f-a190-2b90cff0b4db-openstack-config-secret\") pod \"openstackclient\" (UID: \"c583a81c-2241-4f3f-a190-2b90cff0b4db\") " pod="openstack/openstackclient"
Oct 03 14:21:00 crc kubenswrapper[4962]: I1003 14:21:00.228275 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c583a81c-2241-4f3f-a190-2b90cff0b4db-openstack-config\") pod \"openstackclient\" (UID: \"c583a81c-2241-4f3f-a190-2b90cff0b4db\") " pod="openstack/openstackclient"
Oct 03 14:21:00 crc kubenswrapper[4962]: I1003 14:21:00.330414 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c583a81c-2241-4f3f-a190-2b90cff0b4db-openstack-config\") pod \"openstackclient\" (UID: \"c583a81c-2241-4f3f-a190-2b90cff0b4db\") " pod="openstack/openstackclient"
Oct 03 14:21:00 crc kubenswrapper[4962]: I1003 14:21:00.331115 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c583a81c-2241-4f3f-a190-2b90cff0b4db-openstack-config\") pod \"openstackclient\" (UID: \"c583a81c-2241-4f3f-a190-2b90cff0b4db\") " pod="openstack/openstackclient"
Oct 03 14:21:00 crc kubenswrapper[4962]: I1003 14:21:00.331601 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jpqk\" (UniqueName: \"kubernetes.io/projected/c583a81c-2241-4f3f-a190-2b90cff0b4db-kube-api-access-6jpqk\") pod \"openstackclient\" (UID: \"c583a81c-2241-4f3f-a190-2b90cff0b4db\") " pod="openstack/openstackclient"
Oct 03 14:21:00 crc kubenswrapper[4962]: I1003 14:21:00.331902 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c583a81c-2241-4f3f-a190-2b90cff0b4db-openstack-config-secret\") pod \"openstackclient\" (UID: \"c583a81c-2241-4f3f-a190-2b90cff0b4db\") " pod="openstack/openstackclient"
Oct 03 14:21:00 crc kubenswrapper[4962]: I1003 14:21:00.338103 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c583a81c-2241-4f3f-a190-2b90cff0b4db-openstack-config-secret\") pod \"openstackclient\" (UID: \"c583a81c-2241-4f3f-a190-2b90cff0b4db\") " pod="openstack/openstackclient"
Oct 03 14:21:00 crc kubenswrapper[4962]: I1003 14:21:00.348027 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jpqk\" (UniqueName: \"kubernetes.io/projected/c583a81c-2241-4f3f-a190-2b90cff0b4db-kube-api-access-6jpqk\") pod \"openstackclient\" (UID: \"c583a81c-2241-4f3f-a190-2b90cff0b4db\") " pod="openstack/openstackclient"
Oct 03 14:21:00 crc kubenswrapper[4962]: I1003 14:21:00.401344 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Oct 03 14:21:00 crc kubenswrapper[4962]: I1003 14:21:00.580973 4962 scope.go:117] "RemoveContainer" containerID="34d9749e8d5764e8be5c836d09d86b0f52d5e5ef5899e8f79cdcf248d3592323"
Oct 03 14:21:00 crc kubenswrapper[4962]: I1003 14:21:00.616263 4962 scope.go:117] "RemoveContainer" containerID="3761e0efd9c845d897100c81fefe7c049bf5a5027429c42e04585b63b5a307c9"
Oct 03 14:21:00 crc kubenswrapper[4962]: I1003 14:21:00.636989 4962 scope.go:117] "RemoveContainer" containerID="b6787005a97e750f7a69212a26ea8171fd15215b2955a70357c0652f5a897ca5"
Oct 03 14:21:00 crc kubenswrapper[4962]: I1003 14:21:00.655363 4962 scope.go:117] "RemoveContainer" containerID="b08a8f93526e17aab47e01efa43d0c3e6556618901e5d77bd7871b90c85a0c76"
Oct 03 14:21:00 crc kubenswrapper[4962]: I1003 14:21:00.671406 4962 scope.go:117] "RemoveContainer" containerID="df926cb0d567cee684d7129fdbd9a8927c475803ed36a5e239cf7dca6366f36a"
Oct 03 14:21:00 crc kubenswrapper[4962]: I1003 14:21:00.810799 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Oct 03 14:21:00 crc kubenswrapper[4962]: W1003 14:21:00.816857 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc583a81c_2241_4f3f_a190_2b90cff0b4db.slice/crio-7a5c88ee8385d136b58000b09a7a620c8fdd56d1ad775d26a94630826724439a WatchSource:0}: Error finding container 7a5c88ee8385d136b58000b09a7a620c8fdd56d1ad775d26a94630826724439a: Status 404 returned error can't find the container with id 7a5c88ee8385d136b58000b09a7a620c8fdd56d1ad775d26a94630826724439a
Oct 03 14:21:00 crc kubenswrapper[4962]: I1003 14:21:00.842941 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"c583a81c-2241-4f3f-a190-2b90cff0b4db","Type":"ContainerStarted","Data":"7a5c88ee8385d136b58000b09a7a620c8fdd56d1ad775d26a94630826724439a"}
Oct 03 14:21:01 crc kubenswrapper[4962]: I1003 14:21:01.851820 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"c583a81c-2241-4f3f-a190-2b90cff0b4db","Type":"ContainerStarted","Data":"71817b2098a855f7b56344bd1399519c0ace5645fa5371c1eda128acb86a9434"}
Oct 03 14:21:01 crc kubenswrapper[4962]: I1003 14:21:01.872304 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.872283782 podStartE2EDuration="1.872283782s" podCreationTimestamp="2025-10-03 14:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:21:01.866774018 +0000 UTC m=+5470.270671873" watchObservedRunningTime="2025-10-03 14:21:01.872283782 +0000 UTC m=+5470.276181607"
Oct 03 14:21:54 crc kubenswrapper[4962]: I1003 14:21:54.661558 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 14:21:54 crc kubenswrapper[4962]: I1003 14:21:54.662603 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 14:22:15 crc kubenswrapper[4962]: E1003 14:22:15.534030 4962 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.129.56.172:52952->38.129.56.172:43051: write tcp 38.129.56.172:52952->38.129.56.172:43051: write: broken pipe
Oct 03 14:22:24 crc kubenswrapper[4962]: I1003 14:22:24.659851 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 14:22:24 crc kubenswrapper[4962]: I1003 14:22:24.660327 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 14:22:31 crc kubenswrapper[4962]: I1003 14:22:31.193371 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-b6q9v"]
Oct 03 14:22:31 crc kubenswrapper[4962]: I1003 14:22:31.195893 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b6q9v"
Oct 03 14:22:31 crc kubenswrapper[4962]: I1003 14:22:31.204127 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b6q9v"]
Oct 03 14:22:31 crc kubenswrapper[4962]: I1003 14:22:31.349438 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa6a62f4-1822-46d4-94a5-dcf599004ce3-utilities\") pod \"redhat-operators-b6q9v\" (UID: \"aa6a62f4-1822-46d4-94a5-dcf599004ce3\") " pod="openshift-marketplace/redhat-operators-b6q9v"
Oct 03 14:22:31 crc kubenswrapper[4962]: I1003 14:22:31.349534 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa6a62f4-1822-46d4-94a5-dcf599004ce3-catalog-content\") pod \"redhat-operators-b6q9v\" (UID: \"aa6a62f4-1822-46d4-94a5-dcf599004ce3\") " pod="openshift-marketplace/redhat-operators-b6q9v"
Oct 03 14:22:31 crc kubenswrapper[4962]: I1003 14:22:31.349587 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9brf\" (UniqueName: \"kubernetes.io/projected/aa6a62f4-1822-46d4-94a5-dcf599004ce3-kube-api-access-g9brf\") pod \"redhat-operators-b6q9v\" (UID: \"aa6a62f4-1822-46d4-94a5-dcf599004ce3\") " pod="openshift-marketplace/redhat-operators-b6q9v"
Oct 03 14:22:31 crc kubenswrapper[4962]: I1003 14:22:31.450888 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa6a62f4-1822-46d4-94a5-dcf599004ce3-utilities\") pod \"redhat-operators-b6q9v\" (UID: \"aa6a62f4-1822-46d4-94a5-dcf599004ce3\") " pod="openshift-marketplace/redhat-operators-b6q9v"
Oct 03 14:22:31 crc kubenswrapper[4962]: I1003 14:22:31.450987 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa6a62f4-1822-46d4-94a5-dcf599004ce3-catalog-content\") pod \"redhat-operators-b6q9v\" (UID: \"aa6a62f4-1822-46d4-94a5-dcf599004ce3\") " pod="openshift-marketplace/redhat-operators-b6q9v"
Oct 03 14:22:31 crc kubenswrapper[4962]: I1003 14:22:31.451034 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9brf\" (UniqueName: \"kubernetes.io/projected/aa6a62f4-1822-46d4-94a5-dcf599004ce3-kube-api-access-g9brf\") pod \"redhat-operators-b6q9v\" (UID: \"aa6a62f4-1822-46d4-94a5-dcf599004ce3\") " pod="openshift-marketplace/redhat-operators-b6q9v"
Oct 03 14:22:31 crc kubenswrapper[4962]: I1003 14:22:31.451494 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa6a62f4-1822-46d4-94a5-dcf599004ce3-utilities\") pod \"redhat-operators-b6q9v\" (UID: \"aa6a62f4-1822-46d4-94a5-dcf599004ce3\") " pod="openshift-marketplace/redhat-operators-b6q9v"
Oct 03 14:22:31 crc kubenswrapper[4962]: I1003 14:22:31.451612 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa6a62f4-1822-46d4-94a5-dcf599004ce3-catalog-content\") pod \"redhat-operators-b6q9v\" (UID: \"aa6a62f4-1822-46d4-94a5-dcf599004ce3\") " pod="openshift-marketplace/redhat-operators-b6q9v"
Oct 03 14:22:31 crc kubenswrapper[4962]: I1003 14:22:31.470119 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9brf\" (UniqueName: \"kubernetes.io/projected/aa6a62f4-1822-46d4-94a5-dcf599004ce3-kube-api-access-g9brf\") pod \"redhat-operators-b6q9v\" (UID: \"aa6a62f4-1822-46d4-94a5-dcf599004ce3\") " pod="openshift-marketplace/redhat-operators-b6q9v"
Oct 03 14:22:31 crc kubenswrapper[4962]: I1003 14:22:31.518842 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b6q9v"
Oct 03 14:22:32 crc kubenswrapper[4962]: I1003 14:22:32.062093 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b6q9v"]
Oct 03 14:22:32 crc kubenswrapper[4962]: I1003 14:22:32.638842 4962 generic.go:334] "Generic (PLEG): container finished" podID="aa6a62f4-1822-46d4-94a5-dcf599004ce3" containerID="b3703ac65d644d2e43995b397ca36a87675f4b6a7085de6026fbfece2f47a34f" exitCode=0
Oct 03 14:22:32 crc kubenswrapper[4962]: I1003 14:22:32.638908 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b6q9v" event={"ID":"aa6a62f4-1822-46d4-94a5-dcf599004ce3","Type":"ContainerDied","Data":"b3703ac65d644d2e43995b397ca36a87675f4b6a7085de6026fbfece2f47a34f"}
Oct 03 14:22:32 crc kubenswrapper[4962]: I1003 14:22:32.639789 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b6q9v" event={"ID":"aa6a62f4-1822-46d4-94a5-dcf599004ce3","Type":"ContainerStarted","Data":"88f058d315635e93a254d49f1dd2eeba4e683f96e92888f95f40843e266696d6"}
Oct 03 14:22:34 crc kubenswrapper[4962]: I1003 14:22:34.665114 4962 generic.go:334] "Generic (PLEG): container finished" podID="aa6a62f4-1822-46d4-94a5-dcf599004ce3" containerID="6568c0a0fba0bcb4e6da60c665aaeca2c7d4848069df54f0ba5a1b76f0f90f38" exitCode=0
Oct 03 14:22:34 crc kubenswrapper[4962]: I1003 14:22:34.665188 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b6q9v" event={"ID":"aa6a62f4-1822-46d4-94a5-dcf599004ce3","Type":"ContainerDied","Data":"6568c0a0fba0bcb4e6da60c665aaeca2c7d4848069df54f0ba5a1b76f0f90f38"}
Oct 03 14:22:35 crc kubenswrapper[4962]: I1003 14:22:35.676335 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b6q9v" event={"ID":"aa6a62f4-1822-46d4-94a5-dcf599004ce3","Type":"ContainerStarted","Data":"13c5f442ccdc3a856adea851daca10d0fda2406a7f677a48830755393503c65f"}
Oct 03 14:22:35 crc kubenswrapper[4962]: I1003 14:22:35.697216 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-b6q9v" podStartSLOduration=2.265171053 podStartE2EDuration="4.697195394s" podCreationTimestamp="2025-10-03 14:22:31 +0000 UTC" firstStartedPulling="2025-10-03 14:22:32.640458559 +0000 UTC m=+5561.044356404" lastFinishedPulling="2025-10-03 14:22:35.07248289 +0000 UTC m=+5563.476380745" observedRunningTime="2025-10-03 14:22:35.696592158 +0000 UTC m=+5564.100489993" watchObservedRunningTime="2025-10-03 14:22:35.697195394 +0000 UTC m=+5564.101093229"
Oct 03 14:22:41 crc kubenswrapper[4962]: I1003 14:22:41.519164 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-b6q9v"
Oct 03 14:22:41 crc kubenswrapper[4962]: I1003 14:22:41.519803 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-b6q9v"
Oct 03 14:22:41 crc kubenswrapper[4962]: I1003 14:22:41.558402 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-b6q9v"
Oct 03 14:22:41 crc kubenswrapper[4962]: I1003 14:22:41.754582 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-b6q9v"
Oct 03 14:22:41 crc kubenswrapper[4962]: I1003 14:22:41.794459 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b6q9v"]
Oct 03 14:22:43 crc kubenswrapper[4962]: I1003 14:22:43.545725 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-zxh9l"]
Oct 03 14:22:43 crc kubenswrapper[4962]: I1003 14:22:43.546915 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-zxh9l"
Oct 03 14:22:43 crc kubenswrapper[4962]: I1003 14:22:43.555766 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-zxh9l"]
Oct 03 14:22:43 crc kubenswrapper[4962]: I1003 14:22:43.663209 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqpsh\" (UniqueName: \"kubernetes.io/projected/2f87af9f-c72a-4c57-8d18-4e8a38eb9a9f-kube-api-access-kqpsh\") pod \"barbican-db-create-zxh9l\" (UID: \"2f87af9f-c72a-4c57-8d18-4e8a38eb9a9f\") " pod="openstack/barbican-db-create-zxh9l"
Oct 03 14:22:43 crc kubenswrapper[4962]: I1003 14:22:43.729155 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-b6q9v" podUID="aa6a62f4-1822-46d4-94a5-dcf599004ce3" containerName="registry-server" containerID="cri-o://13c5f442ccdc3a856adea851daca10d0fda2406a7f677a48830755393503c65f" gracePeriod=2
Oct 03 14:22:43 crc kubenswrapper[4962]: I1003 14:22:43.765523 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqpsh\" (UniqueName: \"kubernetes.io/projected/2f87af9f-c72a-4c57-8d18-4e8a38eb9a9f-kube-api-access-kqpsh\") pod \"barbican-db-create-zxh9l\" (UID: \"2f87af9f-c72a-4c57-8d18-4e8a38eb9a9f\") " pod="openstack/barbican-db-create-zxh9l"
Oct 03 14:22:43 crc kubenswrapper[4962]: I1003 14:22:43.782921 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqpsh\" (UniqueName: \"kubernetes.io/projected/2f87af9f-c72a-4c57-8d18-4e8a38eb9a9f-kube-api-access-kqpsh\") pod \"barbican-db-create-zxh9l\" (UID: \"2f87af9f-c72a-4c57-8d18-4e8a38eb9a9f\") " pod="openstack/barbican-db-create-zxh9l"
Oct 03 14:22:43 crc kubenswrapper[4962]: I1003 14:22:43.864722 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-zxh9l"
Oct 03 14:22:44 crc kubenswrapper[4962]: I1003 14:22:44.151773 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b6q9v"
Oct 03 14:22:44 crc kubenswrapper[4962]: I1003 14:22:44.181660 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa6a62f4-1822-46d4-94a5-dcf599004ce3-utilities\") pod \"aa6a62f4-1822-46d4-94a5-dcf599004ce3\" (UID: \"aa6a62f4-1822-46d4-94a5-dcf599004ce3\") "
Oct 03 14:22:44 crc kubenswrapper[4962]: I1003 14:22:44.181774 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa6a62f4-1822-46d4-94a5-dcf599004ce3-catalog-content\") pod \"aa6a62f4-1822-46d4-94a5-dcf599004ce3\" (UID: \"aa6a62f4-1822-46d4-94a5-dcf599004ce3\") "
Oct 03 14:22:44 crc kubenswrapper[4962]: I1003 14:22:44.181803 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9brf\" (UniqueName: \"kubernetes.io/projected/aa6a62f4-1822-46d4-94a5-dcf599004ce3-kube-api-access-g9brf\") pod \"aa6a62f4-1822-46d4-94a5-dcf599004ce3\" (UID: \"aa6a62f4-1822-46d4-94a5-dcf599004ce3\") "
Oct 03 14:22:44 crc kubenswrapper[4962]: I1003 14:22:44.188051 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa6a62f4-1822-46d4-94a5-dcf599004ce3-kube-api-access-g9brf" (OuterVolumeSpecName: "kube-api-access-g9brf") pod "aa6a62f4-1822-46d4-94a5-dcf599004ce3" (UID: "aa6a62f4-1822-46d4-94a5-dcf599004ce3"). InnerVolumeSpecName "kube-api-access-g9brf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:22:44 crc kubenswrapper[4962]: I1003 14:22:44.190269 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa6a62f4-1822-46d4-94a5-dcf599004ce3-utilities" (OuterVolumeSpecName: "utilities") pod "aa6a62f4-1822-46d4-94a5-dcf599004ce3" (UID: "aa6a62f4-1822-46d4-94a5-dcf599004ce3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:22:44 crc kubenswrapper[4962]: I1003 14:22:44.283221 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa6a62f4-1822-46d4-94a5-dcf599004ce3-utilities\") on node \"crc\" DevicePath \"\""
Oct 03 14:22:44 crc kubenswrapper[4962]: I1003 14:22:44.283256 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9brf\" (UniqueName: \"kubernetes.io/projected/aa6a62f4-1822-46d4-94a5-dcf599004ce3-kube-api-access-g9brf\") on node \"crc\" DevicePath \"\""
Oct 03 14:22:44 crc kubenswrapper[4962]: I1003 14:22:44.290182 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa6a62f4-1822-46d4-94a5-dcf599004ce3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aa6a62f4-1822-46d4-94a5-dcf599004ce3" (UID: "aa6a62f4-1822-46d4-94a5-dcf599004ce3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:22:44 crc kubenswrapper[4962]: I1003 14:22:44.315574 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-zxh9l"]
Oct 03 14:22:44 crc kubenswrapper[4962]: W1003 14:22:44.325907 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f87af9f_c72a_4c57_8d18_4e8a38eb9a9f.slice/crio-88448984d6ff2bf3bc9d9d35d245aa1b283c1bc43bc5c26643bd087febb5ed7a WatchSource:0}: Error finding container 88448984d6ff2bf3bc9d9d35d245aa1b283c1bc43bc5c26643bd087febb5ed7a: Status 404 returned error can't find the container with id 88448984d6ff2bf3bc9d9d35d245aa1b283c1bc43bc5c26643bd087febb5ed7a
Oct 03 14:22:44 crc kubenswrapper[4962]: I1003 14:22:44.384304 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa6a62f4-1822-46d4-94a5-dcf599004ce3-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 03 14:22:44 crc kubenswrapper[4962]: I1003 14:22:44.752718 4962 generic.go:334] "Generic (PLEG): container finished" podID="aa6a62f4-1822-46d4-94a5-dcf599004ce3" containerID="13c5f442ccdc3a856adea851daca10d0fda2406a7f677a48830755393503c65f" exitCode=0
Oct 03 14:22:44 crc kubenswrapper[4962]: I1003 14:22:44.752923 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b6q9v" event={"ID":"aa6a62f4-1822-46d4-94a5-dcf599004ce3","Type":"ContainerDied","Data":"13c5f442ccdc3a856adea851daca10d0fda2406a7f677a48830755393503c65f"}
Oct 03 14:22:44 crc kubenswrapper[4962]: I1003 14:22:44.753081 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b6q9v"
Oct 03 14:22:44 crc kubenswrapper[4962]: I1003 14:22:44.753844 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b6q9v" event={"ID":"aa6a62f4-1822-46d4-94a5-dcf599004ce3","Type":"ContainerDied","Data":"88f058d315635e93a254d49f1dd2eeba4e683f96e92888f95f40843e266696d6"}
Oct 03 14:22:44 crc kubenswrapper[4962]: I1003 14:22:44.753928 4962 scope.go:117] "RemoveContainer" containerID="13c5f442ccdc3a856adea851daca10d0fda2406a7f677a48830755393503c65f"
Oct 03 14:22:44 crc kubenswrapper[4962]: I1003 14:22:44.758163 4962 generic.go:334] "Generic (PLEG): container finished" podID="2f87af9f-c72a-4c57-8d18-4e8a38eb9a9f" containerID="2150916b7e19a8427c2ec04ae45725eb127e65c44b56ebfe67b4e9eef7b9e6f3" exitCode=0
Oct 03 14:22:44 crc kubenswrapper[4962]: I1003 14:22:44.758207 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-zxh9l" event={"ID":"2f87af9f-c72a-4c57-8d18-4e8a38eb9a9f","Type":"ContainerDied","Data":"2150916b7e19a8427c2ec04ae45725eb127e65c44b56ebfe67b4e9eef7b9e6f3"}
Oct 03 14:22:44 crc kubenswrapper[4962]: I1003 14:22:44.758235 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-zxh9l" event={"ID":"2f87af9f-c72a-4c57-8d18-4e8a38eb9a9f","Type":"ContainerStarted","Data":"88448984d6ff2bf3bc9d9d35d245aa1b283c1bc43bc5c26643bd087febb5ed7a"}
Oct 03 14:22:44 crc kubenswrapper[4962]: I1003 14:22:44.777143 4962 scope.go:117] "RemoveContainer" containerID="6568c0a0fba0bcb4e6da60c665aaeca2c7d4848069df54f0ba5a1b76f0f90f38"
Oct 03 14:22:44 crc kubenswrapper[4962]: I1003 14:22:44.796137 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b6q9v"]
Oct 03 14:22:44 crc kubenswrapper[4962]: I1003 14:22:44.800929 4962 scope.go:117] "RemoveContainer" containerID="b3703ac65d644d2e43995b397ca36a87675f4b6a7085de6026fbfece2f47a34f"
Oct 03 14:22:44 crc kubenswrapper[4962]: I1003 14:22:44.805906 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-b6q9v"]
Oct 03 14:22:44 crc kubenswrapper[4962]: I1003 14:22:44.816803 4962 scope.go:117] "RemoveContainer" containerID="13c5f442ccdc3a856adea851daca10d0fda2406a7f677a48830755393503c65f"
Oct 03 14:22:44 crc kubenswrapper[4962]: E1003 14:22:44.817231 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13c5f442ccdc3a856adea851daca10d0fda2406a7f677a48830755393503c65f\": container with ID starting with 13c5f442ccdc3a856adea851daca10d0fda2406a7f677a48830755393503c65f not found: ID does not exist" containerID="13c5f442ccdc3a856adea851daca10d0fda2406a7f677a48830755393503c65f"
Oct 03 14:22:44 crc kubenswrapper[4962]: I1003 14:22:44.817271 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13c5f442ccdc3a856adea851daca10d0fda2406a7f677a48830755393503c65f"} err="failed to get container status \"13c5f442ccdc3a856adea851daca10d0fda2406a7f677a48830755393503c65f\": rpc error: code = NotFound desc = could not find container \"13c5f442ccdc3a856adea851daca10d0fda2406a7f677a48830755393503c65f\": container with ID starting with 13c5f442ccdc3a856adea851daca10d0fda2406a7f677a48830755393503c65f not found: ID does not exist"
Oct 03 14:22:44 crc kubenswrapper[4962]: I1003 14:22:44.817296 4962 scope.go:117] "RemoveContainer" containerID="6568c0a0fba0bcb4e6da60c665aaeca2c7d4848069df54f0ba5a1b76f0f90f38"
Oct 03 14:22:44 crc kubenswrapper[4962]: E1003 14:22:44.817512 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6568c0a0fba0bcb4e6da60c665aaeca2c7d4848069df54f0ba5a1b76f0f90f38\": container with ID starting with 6568c0a0fba0bcb4e6da60c665aaeca2c7d4848069df54f0ba5a1b76f0f90f38 not found: ID does not exist" containerID="6568c0a0fba0bcb4e6da60c665aaeca2c7d4848069df54f0ba5a1b76f0f90f38"
Oct 03 14:22:44 crc kubenswrapper[4962]: I1003 14:22:44.817535 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6568c0a0fba0bcb4e6da60c665aaeca2c7d4848069df54f0ba5a1b76f0f90f38"} err="failed to get container status \"6568c0a0fba0bcb4e6da60c665aaeca2c7d4848069df54f0ba5a1b76f0f90f38\": rpc error: code = NotFound desc = could not find container \"6568c0a0fba0bcb4e6da60c665aaeca2c7d4848069df54f0ba5a1b76f0f90f38\": container with ID starting with 6568c0a0fba0bcb4e6da60c665aaeca2c7d4848069df54f0ba5a1b76f0f90f38 not found: ID does not exist"
Oct 03 14:22:44 crc kubenswrapper[4962]: I1003 14:22:44.817547 4962 scope.go:117] "RemoveContainer" containerID="b3703ac65d644d2e43995b397ca36a87675f4b6a7085de6026fbfece2f47a34f"
Oct 03 14:22:44 crc kubenswrapper[4962]: E1003 14:22:44.817808 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3703ac65d644d2e43995b397ca36a87675f4b6a7085de6026fbfece2f47a34f\": container with ID starting with b3703ac65d644d2e43995b397ca36a87675f4b6a7085de6026fbfece2f47a34f not found: ID does not exist" containerID="b3703ac65d644d2e43995b397ca36a87675f4b6a7085de6026fbfece2f47a34f"
Oct 03 14:22:44 crc kubenswrapper[4962]: I1003 14:22:44.817839 4962 pod_container_deletor.go:53]
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3703ac65d644d2e43995b397ca36a87675f4b6a7085de6026fbfece2f47a34f"} err="failed to get container status \"b3703ac65d644d2e43995b397ca36a87675f4b6a7085de6026fbfece2f47a34f\": rpc error: code = NotFound desc = could not find container \"b3703ac65d644d2e43995b397ca36a87675f4b6a7085de6026fbfece2f47a34f\": container with ID starting with b3703ac65d644d2e43995b397ca36a87675f4b6a7085de6026fbfece2f47a34f not found: ID does not exist" Oct 03 14:22:46 crc kubenswrapper[4962]: I1003 14:22:46.091741 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-zxh9l" Oct 03 14:22:46 crc kubenswrapper[4962]: I1003 14:22:46.109885 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqpsh\" (UniqueName: \"kubernetes.io/projected/2f87af9f-c72a-4c57-8d18-4e8a38eb9a9f-kube-api-access-kqpsh\") pod \"2f87af9f-c72a-4c57-8d18-4e8a38eb9a9f\" (UID: \"2f87af9f-c72a-4c57-8d18-4e8a38eb9a9f\") " Oct 03 14:22:46 crc kubenswrapper[4962]: I1003 14:22:46.115423 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f87af9f-c72a-4c57-8d18-4e8a38eb9a9f-kube-api-access-kqpsh" (OuterVolumeSpecName: "kube-api-access-kqpsh") pod "2f87af9f-c72a-4c57-8d18-4e8a38eb9a9f" (UID: "2f87af9f-c72a-4c57-8d18-4e8a38eb9a9f"). InnerVolumeSpecName "kube-api-access-kqpsh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:22:46 crc kubenswrapper[4962]: I1003 14:22:46.211324 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqpsh\" (UniqueName: \"kubernetes.io/projected/2f87af9f-c72a-4c57-8d18-4e8a38eb9a9f-kube-api-access-kqpsh\") on node \"crc\" DevicePath \"\"" Oct 03 14:22:46 crc kubenswrapper[4962]: I1003 14:22:46.237255 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa6a62f4-1822-46d4-94a5-dcf599004ce3" path="/var/lib/kubelet/pods/aa6a62f4-1822-46d4-94a5-dcf599004ce3/volumes" Oct 03 14:22:46 crc kubenswrapper[4962]: I1003 14:22:46.780789 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-zxh9l" event={"ID":"2f87af9f-c72a-4c57-8d18-4e8a38eb9a9f","Type":"ContainerDied","Data":"88448984d6ff2bf3bc9d9d35d245aa1b283c1bc43bc5c26643bd087febb5ed7a"} Oct 03 14:22:46 crc kubenswrapper[4962]: I1003 14:22:46.780832 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88448984d6ff2bf3bc9d9d35d245aa1b283c1bc43bc5c26643bd087febb5ed7a" Oct 03 14:22:46 crc kubenswrapper[4962]: I1003 14:22:46.780883 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-zxh9l" Oct 03 14:22:53 crc kubenswrapper[4962]: I1003 14:22:53.661502 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-6456-account-create-6t7dd"] Oct 03 14:22:53 crc kubenswrapper[4962]: E1003 14:22:53.662950 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f87af9f-c72a-4c57-8d18-4e8a38eb9a9f" containerName="mariadb-database-create" Oct 03 14:22:53 crc kubenswrapper[4962]: I1003 14:22:53.662972 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f87af9f-c72a-4c57-8d18-4e8a38eb9a9f" containerName="mariadb-database-create" Oct 03 14:22:53 crc kubenswrapper[4962]: E1003 14:22:53.663005 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa6a62f4-1822-46d4-94a5-dcf599004ce3" containerName="extract-content" Oct 03 14:22:53 crc kubenswrapper[4962]: I1003 14:22:53.663016 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa6a62f4-1822-46d4-94a5-dcf599004ce3" containerName="extract-content" Oct 03 14:22:53 crc kubenswrapper[4962]: E1003 14:22:53.663039 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa6a62f4-1822-46d4-94a5-dcf599004ce3" containerName="extract-utilities" Oct 03 14:22:53 crc kubenswrapper[4962]: I1003 14:22:53.663049 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa6a62f4-1822-46d4-94a5-dcf599004ce3" containerName="extract-utilities" Oct 03 14:22:53 crc kubenswrapper[4962]: E1003 14:22:53.663076 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa6a62f4-1822-46d4-94a5-dcf599004ce3" containerName="registry-server" Oct 03 14:22:53 crc kubenswrapper[4962]: I1003 14:22:53.663086 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa6a62f4-1822-46d4-94a5-dcf599004ce3" containerName="registry-server" Oct 03 14:22:53 crc kubenswrapper[4962]: I1003 14:22:53.663456 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa6a62f4-1822-46d4-94a5-dcf599004ce3" containerName="registry-server" Oct 03 14:22:53 crc kubenswrapper[4962]: I1003 14:22:53.663484 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f87af9f-c72a-4c57-8d18-4e8a38eb9a9f" containerName="mariadb-database-create" Oct 03 14:22:53 crc kubenswrapper[4962]: I1003 14:22:53.667799 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-6456-account-create-6t7dd" Oct 03 14:22:53 crc kubenswrapper[4962]: I1003 14:22:53.670111 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 03 14:22:53 crc kubenswrapper[4962]: I1003 14:22:53.683037 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-6456-account-create-6t7dd"] Oct 03 14:22:53 crc kubenswrapper[4962]: I1003 14:22:53.836501 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j46h6\" (UniqueName: \"kubernetes.io/projected/b17774ad-1948-4212-bf5a-84c3a1b4b771-kube-api-access-j46h6\") pod \"barbican-6456-account-create-6t7dd\" (UID: \"b17774ad-1948-4212-bf5a-84c3a1b4b771\") " pod="openstack/barbican-6456-account-create-6t7dd" Oct 03 14:22:53 crc kubenswrapper[4962]: I1003 14:22:53.939376 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j46h6\" (UniqueName: \"kubernetes.io/projected/b17774ad-1948-4212-bf5a-84c3a1b4b771-kube-api-access-j46h6\") pod \"barbican-6456-account-create-6t7dd\" (UID: \"b17774ad-1948-4212-bf5a-84c3a1b4b771\") " pod="openstack/barbican-6456-account-create-6t7dd" Oct 03 14:22:53 crc kubenswrapper[4962]: I1003 14:22:53.958273 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j46h6\" (UniqueName: \"kubernetes.io/projected/b17774ad-1948-4212-bf5a-84c3a1b4b771-kube-api-access-j46h6\") pod \"barbican-6456-account-create-6t7dd\" (UID: \"b17774ad-1948-4212-bf5a-84c3a1b4b771\") " pod="openstack/barbican-6456-account-create-6t7dd" Oct 03 14:22:53 crc kubenswrapper[4962]: I1003 14:22:53.997472 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-6456-account-create-6t7dd" Oct 03 14:22:54 crc kubenswrapper[4962]: I1003 14:22:54.416978 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-6456-account-create-6t7dd"] Oct 03 14:22:54 crc kubenswrapper[4962]: W1003 14:22:54.427873 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb17774ad_1948_4212_bf5a_84c3a1b4b771.slice/crio-171509ac33c40145c5b1d8f0588fa34fb424b3cfae24c4d3acf36fa7fa2381a2 WatchSource:0}: Error finding container 171509ac33c40145c5b1d8f0588fa34fb424b3cfae24c4d3acf36fa7fa2381a2: Status 404 returned error can't find the container with id 171509ac33c40145c5b1d8f0588fa34fb424b3cfae24c4d3acf36fa7fa2381a2 Oct 03 14:22:54 crc kubenswrapper[4962]: I1003 14:22:54.659835 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:22:54 crc kubenswrapper[4962]: I1003 14:22:54.659888 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:22:54 crc kubenswrapper[4962]: I1003 14:22:54.659939 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-46vck" Oct 03 14:22:54 crc kubenswrapper[4962]: I1003 14:22:54.860668 4962 generic.go:334] "Generic (PLEG): container finished" podID="b17774ad-1948-4212-bf5a-84c3a1b4b771" containerID="82046b4c930316d9215712bf450a721d4933b2325ca212a1135f291351147f12" exitCode=0 Oct 03 14:22:54 crc kubenswrapper[4962]: I1003 14:22:54.860710 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6456-account-create-6t7dd" event={"ID":"b17774ad-1948-4212-bf5a-84c3a1b4b771","Type":"ContainerDied","Data":"82046b4c930316d9215712bf450a721d4933b2325ca212a1135f291351147f12"} Oct 03 14:22:54 crc kubenswrapper[4962]: I1003 14:22:54.860826 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6456-account-create-6t7dd" event={"ID":"b17774ad-1948-4212-bf5a-84c3a1b4b771","Type":"ContainerStarted","Data":"171509ac33c40145c5b1d8f0588fa34fb424b3cfae24c4d3acf36fa7fa2381a2"} Oct 03 14:22:54 crc kubenswrapper[4962]: I1003 14:22:54.861501 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"66e15d378b389fff3661c27f240d5e7dda70a518dfc3f92072db88df889fc781"} pod="openshift-machine-config-operator/machine-config-daemon-46vck" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 14:22:54 crc kubenswrapper[4962]: I1003 14:22:54.861572 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" containerID="cri-o://66e15d378b389fff3661c27f240d5e7dda70a518dfc3f92072db88df889fc781" gracePeriod=600 Oct 03 14:22:55 crc kubenswrapper[4962]: I1003 14:22:55.874054 4962 generic.go:334] 
"Generic (PLEG): container finished" podID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerID="66e15d378b389fff3661c27f240d5e7dda70a518dfc3f92072db88df889fc781" exitCode=0 Oct 03 14:22:55 crc kubenswrapper[4962]: I1003 14:22:55.875436 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerDied","Data":"66e15d378b389fff3661c27f240d5e7dda70a518dfc3f92072db88df889fc781"} Oct 03 14:22:55 crc kubenswrapper[4962]: I1003 14:22:55.875491 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerStarted","Data":"d7dae192729a16d2e0ed7a375c7ee4990d673d8d62f560c5335516db0b1730db"} Oct 03 14:22:55 crc kubenswrapper[4962]: I1003 14:22:55.875517 4962 scope.go:117] "RemoveContainer" containerID="a52df5deff87c27e42ab4abdf36cdf07c37d1bccfe3bbd74005bdf3b8177d5cc" Oct 03 14:22:56 crc kubenswrapper[4962]: I1003 14:22:56.206464 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6456-account-create-6t7dd" Oct 03 14:22:56 crc kubenswrapper[4962]: I1003 14:22:56.377161 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j46h6\" (UniqueName: \"kubernetes.io/projected/b17774ad-1948-4212-bf5a-84c3a1b4b771-kube-api-access-j46h6\") pod \"b17774ad-1948-4212-bf5a-84c3a1b4b771\" (UID: \"b17774ad-1948-4212-bf5a-84c3a1b4b771\") " Oct 03 14:22:56 crc kubenswrapper[4962]: I1003 14:22:56.383584 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b17774ad-1948-4212-bf5a-84c3a1b4b771-kube-api-access-j46h6" (OuterVolumeSpecName: "kube-api-access-j46h6") pod "b17774ad-1948-4212-bf5a-84c3a1b4b771" (UID: "b17774ad-1948-4212-bf5a-84c3a1b4b771"). InnerVolumeSpecName "kube-api-access-j46h6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:22:56 crc kubenswrapper[4962]: I1003 14:22:56.480816 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j46h6\" (UniqueName: \"kubernetes.io/projected/b17774ad-1948-4212-bf5a-84c3a1b4b771-kube-api-access-j46h6\") on node \"crc\" DevicePath \"\"" Oct 03 14:22:56 crc kubenswrapper[4962]: I1003 14:22:56.889601 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6456-account-create-6t7dd" event={"ID":"b17774ad-1948-4212-bf5a-84c3a1b4b771","Type":"ContainerDied","Data":"171509ac33c40145c5b1d8f0588fa34fb424b3cfae24c4d3acf36fa7fa2381a2"} Oct 03 14:22:56 crc kubenswrapper[4962]: I1003 14:22:56.889894 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="171509ac33c40145c5b1d8f0588fa34fb424b3cfae24c4d3acf36fa7fa2381a2" Oct 03 14:22:56 crc kubenswrapper[4962]: I1003 14:22:56.889939 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-6456-account-create-6t7dd" Oct 03 14:22:58 crc kubenswrapper[4962]: I1003 14:22:58.936824 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-t6pd5"] Oct 03 14:22:58 crc kubenswrapper[4962]: E1003 14:22:58.937752 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b17774ad-1948-4212-bf5a-84c3a1b4b771" containerName="mariadb-account-create" Oct 03 14:22:58 crc kubenswrapper[4962]: I1003 14:22:58.937768 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b17774ad-1948-4212-bf5a-84c3a1b4b771" containerName="mariadb-account-create" Oct 03 14:22:58 crc kubenswrapper[4962]: I1003 14:22:58.937977 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b17774ad-1948-4212-bf5a-84c3a1b4b771" containerName="mariadb-account-create" Oct 03 14:22:58 crc kubenswrapper[4962]: I1003 14:22:58.938868 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-t6pd5" Oct 03 14:22:58 crc kubenswrapper[4962]: I1003 14:22:58.943007 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 03 14:22:58 crc kubenswrapper[4962]: I1003 14:22:58.943058 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-6nkfw" Oct 03 14:22:58 crc kubenswrapper[4962]: I1003 14:22:58.950531 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-t6pd5"] Oct 03 14:22:59 crc kubenswrapper[4962]: I1003 14:22:59.028319 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3d71ba1f-f85a-4de7-8a9c-c86903b92300-db-sync-config-data\") pod \"barbican-db-sync-t6pd5\" (UID: \"3d71ba1f-f85a-4de7-8a9c-c86903b92300\") " pod="openstack/barbican-db-sync-t6pd5" Oct 03 14:22:59 crc kubenswrapper[4962]: I1003 14:22:59.028376 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d71ba1f-f85a-4de7-8a9c-c86903b92300-combined-ca-bundle\") pod \"barbican-db-sync-t6pd5\" (UID: \"3d71ba1f-f85a-4de7-8a9c-c86903b92300\") " pod="openstack/barbican-db-sync-t6pd5" Oct 03 14:22:59 crc kubenswrapper[4962]: I1003 14:22:59.028400 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjm6l\" (UniqueName: \"kubernetes.io/projected/3d71ba1f-f85a-4de7-8a9c-c86903b92300-kube-api-access-vjm6l\") pod \"barbican-db-sync-t6pd5\" (UID: \"3d71ba1f-f85a-4de7-8a9c-c86903b92300\") " pod="openstack/barbican-db-sync-t6pd5" Oct 03 14:22:59 crc kubenswrapper[4962]: I1003 14:22:59.129598 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3d71ba1f-f85a-4de7-8a9c-c86903b92300-db-sync-config-data\") pod \"barbican-db-sync-t6pd5\" (UID: \"3d71ba1f-f85a-4de7-8a9c-c86903b92300\") " pod="openstack/barbican-db-sync-t6pd5" Oct 03 14:22:59 crc kubenswrapper[4962]: I1003 14:22:59.129688 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d71ba1f-f85a-4de7-8a9c-c86903b92300-combined-ca-bundle\") pod \"barbican-db-sync-t6pd5\" (UID: \"3d71ba1f-f85a-4de7-8a9c-c86903b92300\") " pod="openstack/barbican-db-sync-t6pd5" Oct 03 14:22:59 crc kubenswrapper[4962]: 
I1003 14:22:59.129712 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjm6l\" (UniqueName: \"kubernetes.io/projected/3d71ba1f-f85a-4de7-8a9c-c86903b92300-kube-api-access-vjm6l\") pod \"barbican-db-sync-t6pd5\" (UID: \"3d71ba1f-f85a-4de7-8a9c-c86903b92300\") " pod="openstack/barbican-db-sync-t6pd5" Oct 03 14:22:59 crc kubenswrapper[4962]: I1003 14:22:59.138272 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d71ba1f-f85a-4de7-8a9c-c86903b92300-combined-ca-bundle\") pod \"barbican-db-sync-t6pd5\" (UID: \"3d71ba1f-f85a-4de7-8a9c-c86903b92300\") " pod="openstack/barbican-db-sync-t6pd5" Oct 03 14:22:59 crc kubenswrapper[4962]: I1003 14:22:59.146135 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3d71ba1f-f85a-4de7-8a9c-c86903b92300-db-sync-config-data\") pod \"barbican-db-sync-t6pd5\" (UID: \"3d71ba1f-f85a-4de7-8a9c-c86903b92300\") " pod="openstack/barbican-db-sync-t6pd5" Oct 03 14:22:59 crc kubenswrapper[4962]: I1003 14:22:59.184697 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjm6l\" (UniqueName: \"kubernetes.io/projected/3d71ba1f-f85a-4de7-8a9c-c86903b92300-kube-api-access-vjm6l\") pod \"barbican-db-sync-t6pd5\" (UID: \"3d71ba1f-f85a-4de7-8a9c-c86903b92300\") " pod="openstack/barbican-db-sync-t6pd5" Oct 03 14:22:59 crc kubenswrapper[4962]: I1003 14:22:59.300233 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-t6pd5" Oct 03 14:22:59 crc kubenswrapper[4962]: I1003 14:22:59.759970 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-t6pd5"] Oct 03 14:22:59 crc kubenswrapper[4962]: W1003 14:22:59.764468 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d71ba1f_f85a_4de7_8a9c_c86903b92300.slice/crio-d5abf98718176265d8458080d5639ebd8f56a6fdc1d505b51635176965c1feb5 WatchSource:0}: Error finding container d5abf98718176265d8458080d5639ebd8f56a6fdc1d505b51635176965c1feb5: Status 404 returned error can't find the container with id d5abf98718176265d8458080d5639ebd8f56a6fdc1d505b51635176965c1feb5 Oct 03 14:22:59 crc kubenswrapper[4962]: I1003 14:22:59.929746 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-t6pd5" event={"ID":"3d71ba1f-f85a-4de7-8a9c-c86903b92300","Type":"ContainerStarted","Data":"176e2bc990a8132aa13de7a6f3100c02d4e785cf990da2661d22a4c7d11b34f0"} Oct 03 14:22:59 crc kubenswrapper[4962]: I1003 14:22:59.930087 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-t6pd5" event={"ID":"3d71ba1f-f85a-4de7-8a9c-c86903b92300","Type":"ContainerStarted","Data":"d5abf98718176265d8458080d5639ebd8f56a6fdc1d505b51635176965c1feb5"} Oct 03 14:22:59 crc kubenswrapper[4962]: I1003 14:22:59.943551 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-t6pd5" podStartSLOduration=1.9435344620000001 podStartE2EDuration="1.943534462s" podCreationTimestamp="2025-10-03 14:22:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:22:59.942959427 +0000 UTC m=+5588.346857262" watchObservedRunningTime="2025-10-03 14:22:59.943534462 +0000 UTC m=+5588.347432297" Oct 
03 14:23:01 crc kubenswrapper[4962]: I1003 14:23:01.946929 4962 generic.go:334] "Generic (PLEG): container finished" podID="3d71ba1f-f85a-4de7-8a9c-c86903b92300" containerID="176e2bc990a8132aa13de7a6f3100c02d4e785cf990da2661d22a4c7d11b34f0" exitCode=0 Oct 03 14:23:01 crc kubenswrapper[4962]: I1003 14:23:01.946980 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-t6pd5" event={"ID":"3d71ba1f-f85a-4de7-8a9c-c86903b92300","Type":"ContainerDied","Data":"176e2bc990a8132aa13de7a6f3100c02d4e785cf990da2661d22a4c7d11b34f0"} Oct 03 14:23:03 crc kubenswrapper[4962]: I1003 14:23:03.294146 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-t6pd5" Oct 03 14:23:03 crc kubenswrapper[4962]: I1003 14:23:03.397739 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3d71ba1f-f85a-4de7-8a9c-c86903b92300-db-sync-config-data\") pod \"3d71ba1f-f85a-4de7-8a9c-c86903b92300\" (UID: \"3d71ba1f-f85a-4de7-8a9c-c86903b92300\") " Oct 03 14:23:03 crc kubenswrapper[4962]: I1003 14:23:03.397837 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d71ba1f-f85a-4de7-8a9c-c86903b92300-combined-ca-bundle\") pod \"3d71ba1f-f85a-4de7-8a9c-c86903b92300\" (UID: \"3d71ba1f-f85a-4de7-8a9c-c86903b92300\") " Oct 03 14:23:03 crc kubenswrapper[4962]: I1003 14:23:03.397962 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjm6l\" (UniqueName: \"kubernetes.io/projected/3d71ba1f-f85a-4de7-8a9c-c86903b92300-kube-api-access-vjm6l\") pod \"3d71ba1f-f85a-4de7-8a9c-c86903b92300\" (UID: \"3d71ba1f-f85a-4de7-8a9c-c86903b92300\") " Oct 03 14:23:03 crc kubenswrapper[4962]: I1003 14:23:03.402862 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d71ba1f-f85a-4de7-8a9c-c86903b92300-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "3d71ba1f-f85a-4de7-8a9c-c86903b92300" (UID: "3d71ba1f-f85a-4de7-8a9c-c86903b92300"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:23:03 crc kubenswrapper[4962]: I1003 14:23:03.402997 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d71ba1f-f85a-4de7-8a9c-c86903b92300-kube-api-access-vjm6l" (OuterVolumeSpecName: "kube-api-access-vjm6l") pod "3d71ba1f-f85a-4de7-8a9c-c86903b92300" (UID: "3d71ba1f-f85a-4de7-8a9c-c86903b92300"). InnerVolumeSpecName "kube-api-access-vjm6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:23:03 crc kubenswrapper[4962]: I1003 14:23:03.418304 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d71ba1f-f85a-4de7-8a9c-c86903b92300-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d71ba1f-f85a-4de7-8a9c-c86903b92300" (UID: "3d71ba1f-f85a-4de7-8a9c-c86903b92300"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:23:03 crc kubenswrapper[4962]: I1003 14:23:03.500586 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d71ba1f-f85a-4de7-8a9c-c86903b92300-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:03 crc kubenswrapper[4962]: I1003 14:23:03.500668 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjm6l\" (UniqueName: \"kubernetes.io/projected/3d71ba1f-f85a-4de7-8a9c-c86903b92300-kube-api-access-vjm6l\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:03 crc kubenswrapper[4962]: I1003 14:23:03.500697 4962 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3d71ba1f-f85a-4de7-8a9c-c86903b92300-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.000479 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-t6pd5" event={"ID":"3d71ba1f-f85a-4de7-8a9c-c86903b92300","Type":"ContainerDied","Data":"d5abf98718176265d8458080d5639ebd8f56a6fdc1d505b51635176965c1feb5"} Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.000813 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5abf98718176265d8458080d5639ebd8f56a6fdc1d505b51635176965c1feb5" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.000547 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-t6pd5" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.190181 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-798f8c695f-slvjr"] Oct 03 14:23:04 crc kubenswrapper[4962]: E1003 14:23:04.190578 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d71ba1f-f85a-4de7-8a9c-c86903b92300" containerName="barbican-db-sync" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.190604 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d71ba1f-f85a-4de7-8a9c-c86903b92300" containerName="barbican-db-sync" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.195409 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d71ba1f-f85a-4de7-8a9c-c86903b92300" containerName="barbican-db-sync" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.196585 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-798f8c695f-slvjr" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.201422 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.201594 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-6nkfw" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.201726 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.213563 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7084aec7-12fb-401b-b866-4066ebe0e546-config-data-custom\") pod \"barbican-worker-798f8c695f-slvjr\" (UID: \"7084aec7-12fb-401b-b866-4066ebe0e546\") " pod="openstack/barbican-worker-798f8c695f-slvjr" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.213676 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pz2b\" (UniqueName: \"kubernetes.io/projected/7084aec7-12fb-401b-b866-4066ebe0e546-kube-api-access-7pz2b\") pod \"barbican-worker-798f8c695f-slvjr\" (UID: \"7084aec7-12fb-401b-b866-4066ebe0e546\") " pod="openstack/barbican-worker-798f8c695f-slvjr" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.213734 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7084aec7-12fb-401b-b866-4066ebe0e546-logs\") pod \"barbican-worker-798f8c695f-slvjr\" (UID: \"7084aec7-12fb-401b-b866-4066ebe0e546\") " pod="openstack/barbican-worker-798f8c695f-slvjr" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.213758 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7084aec7-12fb-401b-b866-4066ebe0e546-config-data\") pod \"barbican-worker-798f8c695f-slvjr\" (UID: \"7084aec7-12fb-401b-b866-4066ebe0e546\") " pod="openstack/barbican-worker-798f8c695f-slvjr" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.213828 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7084aec7-12fb-401b-b866-4066ebe0e546-combined-ca-bundle\") pod \"barbican-worker-798f8c695f-slvjr\" (UID: \"7084aec7-12fb-401b-b866-4066ebe0e546\") " pod="openstack/barbican-worker-798f8c695f-slvjr" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.223587 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-798f8c695f-slvjr"] Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.254847 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-56b6f686d6-9ntzw"] Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.263536 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-56b6f686d6-9ntzw"] Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.263629 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-56b6f686d6-9ntzw" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.268769 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.315826 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbqr6\" (UniqueName: \"kubernetes.io/projected/cf3b7d67-1f8c-48f6-b2bb-d5652a5129cf-kube-api-access-xbqr6\") pod \"barbican-keystone-listener-56b6f686d6-9ntzw\" (UID: \"cf3b7d67-1f8c-48f6-b2bb-d5652a5129cf\") " pod="openstack/barbican-keystone-listener-56b6f686d6-9ntzw" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.315904 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pz2b\" (UniqueName: \"kubernetes.io/projected/7084aec7-12fb-401b-b866-4066ebe0e546-kube-api-access-7pz2b\") pod \"barbican-worker-798f8c695f-slvjr\" (UID: \"7084aec7-12fb-401b-b866-4066ebe0e546\") " pod="openstack/barbican-worker-798f8c695f-slvjr" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.315964 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf3b7d67-1f8c-48f6-b2bb-d5652a5129cf-config-data-custom\") pod \"barbican-keystone-listener-56b6f686d6-9ntzw\" (UID: \"cf3b7d67-1f8c-48f6-b2bb-d5652a5129cf\") " pod="openstack/barbican-keystone-listener-56b6f686d6-9ntzw" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.316012 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7084aec7-12fb-401b-b866-4066ebe0e546-logs\") pod \"barbican-worker-798f8c695f-slvjr\" (UID: \"7084aec7-12fb-401b-b866-4066ebe0e546\") " pod="openstack/barbican-worker-798f8c695f-slvjr" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.316037 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7084aec7-12fb-401b-b866-4066ebe0e546-config-data\") pod \"barbican-worker-798f8c695f-slvjr\" (UID: \"7084aec7-12fb-401b-b866-4066ebe0e546\") " pod="openstack/barbican-worker-798f8c695f-slvjr" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.316067 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf3b7d67-1f8c-48f6-b2bb-d5652a5129cf-combined-ca-bundle\") pod \"barbican-keystone-listener-56b6f686d6-9ntzw\" (UID: \"cf3b7d67-1f8c-48f6-b2bb-d5652a5129cf\") " pod="openstack/barbican-keystone-listener-56b6f686d6-9ntzw" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.316139 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf3b7d67-1f8c-48f6-b2bb-d5652a5129cf-logs\") pod \"barbican-keystone-listener-56b6f686d6-9ntzw\" (UID: \"cf3b7d67-1f8c-48f6-b2bb-d5652a5129cf\") " pod="openstack/barbican-keystone-listener-56b6f686d6-9ntzw" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.316166 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7084aec7-12fb-401b-b866-4066ebe0e546-combined-ca-bundle\") pod \"barbican-worker-798f8c695f-slvjr\" (UID: \"7084aec7-12fb-401b-b866-4066ebe0e546\") " 
pod="openstack/barbican-worker-798f8c695f-slvjr" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.316225 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf3b7d67-1f8c-48f6-b2bb-d5652a5129cf-config-data\") pod \"barbican-keystone-listener-56b6f686d6-9ntzw\" (UID: \"cf3b7d67-1f8c-48f6-b2bb-d5652a5129cf\") " pod="openstack/barbican-keystone-listener-56b6f686d6-9ntzw" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.316269 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7084aec7-12fb-401b-b866-4066ebe0e546-config-data-custom\") pod \"barbican-worker-798f8c695f-slvjr\" (UID: \"7084aec7-12fb-401b-b866-4066ebe0e546\") " pod="openstack/barbican-worker-798f8c695f-slvjr" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.318102 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7084aec7-12fb-401b-b866-4066ebe0e546-logs\") pod \"barbican-worker-798f8c695f-slvjr\" (UID: \"7084aec7-12fb-401b-b866-4066ebe0e546\") " pod="openstack/barbican-worker-798f8c695f-slvjr" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.325944 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7084aec7-12fb-401b-b866-4066ebe0e546-config-data-custom\") pod \"barbican-worker-798f8c695f-slvjr\" (UID: \"7084aec7-12fb-401b-b866-4066ebe0e546\") " pod="openstack/barbican-worker-798f8c695f-slvjr" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.328729 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7084aec7-12fb-401b-b866-4066ebe0e546-combined-ca-bundle\") pod \"barbican-worker-798f8c695f-slvjr\" (UID: \"7084aec7-12fb-401b-b866-4066ebe0e546\") " pod="openstack/barbican-worker-798f8c695f-slvjr" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.328798 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6874dd89cc-gzw44"] Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.331451 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6874dd89cc-gzw44" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.344194 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6874dd89cc-gzw44"] Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.346370 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pz2b\" (UniqueName: \"kubernetes.io/projected/7084aec7-12fb-401b-b866-4066ebe0e546-kube-api-access-7pz2b\") pod \"barbican-worker-798f8c695f-slvjr\" (UID: \"7084aec7-12fb-401b-b866-4066ebe0e546\") " pod="openstack/barbican-worker-798f8c695f-slvjr" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.361760 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7084aec7-12fb-401b-b866-4066ebe0e546-config-data\") pod \"barbican-worker-798f8c695f-slvjr\" (UID: \"7084aec7-12fb-401b-b866-4066ebe0e546\") " pod="openstack/barbican-worker-798f8c695f-slvjr" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.417704 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d3f07b8-8ccc-42e1-a570-e9924e18d67a-config\") pod \"dnsmasq-dns-6874dd89cc-gzw44\" (UID: \"8d3f07b8-8ccc-42e1-a570-e9924e18d67a\") " pod="openstack/dnsmasq-dns-6874dd89cc-gzw44" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.417757 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqws9\" (UniqueName: \"kubernetes.io/projected/8d3f07b8-8ccc-42e1-a570-e9924e18d67a-kube-api-access-cqws9\") pod \"dnsmasq-dns-6874dd89cc-gzw44\" (UID: \"8d3f07b8-8ccc-42e1-a570-e9924e18d67a\") " pod="openstack/dnsmasq-dns-6874dd89cc-gzw44" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.417791 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8d3f07b8-8ccc-42e1-a570-e9924e18d67a-ovsdbserver-nb\") pod \"dnsmasq-dns-6874dd89cc-gzw44\" (UID: \"8d3f07b8-8ccc-42e1-a570-e9924e18d67a\") " pod="openstack/dnsmasq-dns-6874dd89cc-gzw44" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.417833 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf3b7d67-1f8c-48f6-b2bb-d5652a5129cf-combined-ca-bundle\") pod \"barbican-keystone-listener-56b6f686d6-9ntzw\" (UID: \"cf3b7d67-1f8c-48f6-b2bb-d5652a5129cf\") " pod="openstack/barbican-keystone-listener-56b6f686d6-9ntzw" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.417910 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf3b7d67-1f8c-48f6-b2bb-d5652a5129cf-logs\") pod \"barbican-keystone-listener-56b6f686d6-9ntzw\" (UID: \"cf3b7d67-1f8c-48f6-b2bb-d5652a5129cf\") " pod="openstack/barbican-keystone-listener-56b6f686d6-9ntzw" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.417941 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8d3f07b8-8ccc-42e1-a570-e9924e18d67a-ovsdbserver-sb\") pod \"dnsmasq-dns-6874dd89cc-gzw44\" (UID: \"8d3f07b8-8ccc-42e1-a570-e9924e18d67a\") " pod="openstack/dnsmasq-dns-6874dd89cc-gzw44" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.418001 4962 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf3b7d67-1f8c-48f6-b2bb-d5652a5129cf-config-data\") pod \"barbican-keystone-listener-56b6f686d6-9ntzw\" (UID: \"cf3b7d67-1f8c-48f6-b2bb-d5652a5129cf\") " pod="openstack/barbican-keystone-listener-56b6f686d6-9ntzw" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.418046 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d3f07b8-8ccc-42e1-a570-e9924e18d67a-dns-svc\") pod \"dnsmasq-dns-6874dd89cc-gzw44\" (UID: \"8d3f07b8-8ccc-42e1-a570-e9924e18d67a\") " pod="openstack/dnsmasq-dns-6874dd89cc-gzw44" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.418086 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbqr6\" (UniqueName: \"kubernetes.io/projected/cf3b7d67-1f8c-48f6-b2bb-d5652a5129cf-kube-api-access-xbqr6\") pod \"barbican-keystone-listener-56b6f686d6-9ntzw\" (UID: \"cf3b7d67-1f8c-48f6-b2bb-d5652a5129cf\") " pod="openstack/barbican-keystone-listener-56b6f686d6-9ntzw" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.418162 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf3b7d67-1f8c-48f6-b2bb-d5652a5129cf-config-data-custom\") pod \"barbican-keystone-listener-56b6f686d6-9ntzw\" (UID: \"cf3b7d67-1f8c-48f6-b2bb-d5652a5129cf\") " pod="openstack/barbican-keystone-listener-56b6f686d6-9ntzw" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.420072 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf3b7d67-1f8c-48f6-b2bb-d5652a5129cf-logs\") pod \"barbican-keystone-listener-56b6f686d6-9ntzw\" (UID: \"cf3b7d67-1f8c-48f6-b2bb-d5652a5129cf\") " pod="openstack/barbican-keystone-listener-56b6f686d6-9ntzw" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.423013 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6dfcdc79bb-5d7lt"] Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.423387 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf3b7d67-1f8c-48f6-b2bb-d5652a5129cf-config-data\") pod \"barbican-keystone-listener-56b6f686d6-9ntzw\" (UID: \"cf3b7d67-1f8c-48f6-b2bb-d5652a5129cf\") " pod="openstack/barbican-keystone-listener-56b6f686d6-9ntzw" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.423026 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf3b7d67-1f8c-48f6-b2bb-d5652a5129cf-config-data-custom\") pod \"barbican-keystone-listener-56b6f686d6-9ntzw\" (UID: \"cf3b7d67-1f8c-48f6-b2bb-d5652a5129cf\") " pod="openstack/barbican-keystone-listener-56b6f686d6-9ntzw" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.424372 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6dfcdc79bb-5d7lt" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.426224 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf3b7d67-1f8c-48f6-b2bb-d5652a5129cf-combined-ca-bundle\") pod \"barbican-keystone-listener-56b6f686d6-9ntzw\" (UID: \"cf3b7d67-1f8c-48f6-b2bb-d5652a5129cf\") " pod="openstack/barbican-keystone-listener-56b6f686d6-9ntzw" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.428460 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.433461 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6dfcdc79bb-5d7lt"] Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.441488 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbqr6\" (UniqueName: \"kubernetes.io/projected/cf3b7d67-1f8c-48f6-b2bb-d5652a5129cf-kube-api-access-xbqr6\") pod \"barbican-keystone-listener-56b6f686d6-9ntzw\" (UID: \"cf3b7d67-1f8c-48f6-b2bb-d5652a5129cf\") " pod="openstack/barbican-keystone-listener-56b6f686d6-9ntzw" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.519501 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d3f07b8-8ccc-42e1-a570-e9924e18d67a-config\") pod \"dnsmasq-dns-6874dd89cc-gzw44\" (UID: \"8d3f07b8-8ccc-42e1-a570-e9924e18d67a\") " pod="openstack/dnsmasq-dns-6874dd89cc-gzw44" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.519614 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqws9\" (UniqueName: \"kubernetes.io/projected/8d3f07b8-8ccc-42e1-a570-e9924e18d67a-kube-api-access-cqws9\") pod \"dnsmasq-dns-6874dd89cc-gzw44\" (UID: \"8d3f07b8-8ccc-42e1-a570-e9924e18d67a\") " pod="openstack/dnsmasq-dns-6874dd89cc-gzw44" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.520409 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d3f07b8-8ccc-42e1-a570-e9924e18d67a-config\") pod \"dnsmasq-dns-6874dd89cc-gzw44\" (UID: \"8d3f07b8-8ccc-42e1-a570-e9924e18d67a\") " pod="openstack/dnsmasq-dns-6874dd89cc-gzw44" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.520452 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8d3f07b8-8ccc-42e1-a570-e9924e18d67a-ovsdbserver-nb\") pod \"dnsmasq-dns-6874dd89cc-gzw44\" (UID: \"8d3f07b8-8ccc-42e1-a570-e9924e18d67a\") " pod="openstack/dnsmasq-dns-6874dd89cc-gzw44" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.520485 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdtr7\" (UniqueName: \"kubernetes.io/projected/bb24c6d5-37e0-46ab-9a4b-1a19f3c77110-kube-api-access-pdtr7\") pod \"barbican-api-6dfcdc79bb-5d7lt\" (UID: \"bb24c6d5-37e0-46ab-9a4b-1a19f3c77110\") " pod="openstack/barbican-api-6dfcdc79bb-5d7lt" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.520507 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb24c6d5-37e0-46ab-9a4b-1a19f3c77110-config-data\") pod \"barbican-api-6dfcdc79bb-5d7lt\" (UID: \"bb24c6d5-37e0-46ab-9a4b-1a19f3c77110\") " 
pod="openstack/barbican-api-6dfcdc79bb-5d7lt" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.520533 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb24c6d5-37e0-46ab-9a4b-1a19f3c77110-config-data-custom\") pod \"barbican-api-6dfcdc79bb-5d7lt\" (UID: \"bb24c6d5-37e0-46ab-9a4b-1a19f3c77110\") " pod="openstack/barbican-api-6dfcdc79bb-5d7lt" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.520574 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb24c6d5-37e0-46ab-9a4b-1a19f3c77110-combined-ca-bundle\") pod \"barbican-api-6dfcdc79bb-5d7lt\" (UID: \"bb24c6d5-37e0-46ab-9a4b-1a19f3c77110\") " pod="openstack/barbican-api-6dfcdc79bb-5d7lt" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.520595 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8d3f07b8-8ccc-42e1-a570-e9924e18d67a-ovsdbserver-sb\") pod \"dnsmasq-dns-6874dd89cc-gzw44\" (UID: \"8d3f07b8-8ccc-42e1-a570-e9924e18d67a\") " pod="openstack/dnsmasq-dns-6874dd89cc-gzw44" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.520648 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb24c6d5-37e0-46ab-9a4b-1a19f3c77110-logs\") pod \"barbican-api-6dfcdc79bb-5d7lt\" (UID: \"bb24c6d5-37e0-46ab-9a4b-1a19f3c77110\") " pod="openstack/barbican-api-6dfcdc79bb-5d7lt" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.520686 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d3f07b8-8ccc-42e1-a570-e9924e18d67a-dns-svc\") pod \"dnsmasq-dns-6874dd89cc-gzw44\" (UID: \"8d3f07b8-8ccc-42e1-a570-e9924e18d67a\") " pod="openstack/dnsmasq-dns-6874dd89cc-gzw44" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.521241 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8d3f07b8-8ccc-42e1-a570-e9924e18d67a-ovsdbserver-nb\") pod \"dnsmasq-dns-6874dd89cc-gzw44\" (UID: \"8d3f07b8-8ccc-42e1-a570-e9924e18d67a\") " pod="openstack/dnsmasq-dns-6874dd89cc-gzw44" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.521445 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8d3f07b8-8ccc-42e1-a570-e9924e18d67a-ovsdbserver-sb\") pod \"dnsmasq-dns-6874dd89cc-gzw44\" (UID: \"8d3f07b8-8ccc-42e1-a570-e9924e18d67a\") " pod="openstack/dnsmasq-dns-6874dd89cc-gzw44" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.521482 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d3f07b8-8ccc-42e1-a570-e9924e18d67a-dns-svc\") pod \"dnsmasq-dns-6874dd89cc-gzw44\" (UID: \"8d3f07b8-8ccc-42e1-a570-e9924e18d67a\") " pod="openstack/dnsmasq-dns-6874dd89cc-gzw44" Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.536321 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqws9\" (UniqueName: \"kubernetes.io/projected/8d3f07b8-8ccc-42e1-a570-e9924e18d67a-kube-api-access-cqws9\") pod \"dnsmasq-dns-6874dd89cc-gzw44\" (UID: \"8d3f07b8-8ccc-42e1-a570-e9924e18d67a\") " pod="openstack/dnsmasq-dns-6874dd89cc-gzw44" 
Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.553076 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-798f8c695f-slvjr"
Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.600811 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-56b6f686d6-9ntzw"
Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.622809 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb24c6d5-37e0-46ab-9a4b-1a19f3c77110-config-data-custom\") pod \"barbican-api-6dfcdc79bb-5d7lt\" (UID: \"bb24c6d5-37e0-46ab-9a4b-1a19f3c77110\") " pod="openstack/barbican-api-6dfcdc79bb-5d7lt"
Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.622893 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb24c6d5-37e0-46ab-9a4b-1a19f3c77110-combined-ca-bundle\") pod \"barbican-api-6dfcdc79bb-5d7lt\" (UID: \"bb24c6d5-37e0-46ab-9a4b-1a19f3c77110\") " pod="openstack/barbican-api-6dfcdc79bb-5d7lt"
Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.622950 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb24c6d5-37e0-46ab-9a4b-1a19f3c77110-logs\") pod \"barbican-api-6dfcdc79bb-5d7lt\" (UID: \"bb24c6d5-37e0-46ab-9a4b-1a19f3c77110\") " pod="openstack/barbican-api-6dfcdc79bb-5d7lt"
Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.623061 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdtr7\" (UniqueName: \"kubernetes.io/projected/bb24c6d5-37e0-46ab-9a4b-1a19f3c77110-kube-api-access-pdtr7\") pod \"barbican-api-6dfcdc79bb-5d7lt\" (UID: \"bb24c6d5-37e0-46ab-9a4b-1a19f3c77110\") " pod="openstack/barbican-api-6dfcdc79bb-5d7lt"
Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.623088 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb24c6d5-37e0-46ab-9a4b-1a19f3c77110-config-data\") pod \"barbican-api-6dfcdc79bb-5d7lt\" (UID: \"bb24c6d5-37e0-46ab-9a4b-1a19f3c77110\") " pod="openstack/barbican-api-6dfcdc79bb-5d7lt"
Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.623828 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb24c6d5-37e0-46ab-9a4b-1a19f3c77110-logs\") pod \"barbican-api-6dfcdc79bb-5d7lt\" (UID: \"bb24c6d5-37e0-46ab-9a4b-1a19f3c77110\") " pod="openstack/barbican-api-6dfcdc79bb-5d7lt"
Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.626933 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb24c6d5-37e0-46ab-9a4b-1a19f3c77110-config-data\") pod \"barbican-api-6dfcdc79bb-5d7lt\" (UID: \"bb24c6d5-37e0-46ab-9a4b-1a19f3c77110\") " pod="openstack/barbican-api-6dfcdc79bb-5d7lt"
Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.627026 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb24c6d5-37e0-46ab-9a4b-1a19f3c77110-combined-ca-bundle\") pod \"barbican-api-6dfcdc79bb-5d7lt\" (UID: \"bb24c6d5-37e0-46ab-9a4b-1a19f3c77110\") " pod="openstack/barbican-api-6dfcdc79bb-5d7lt"
Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.627469 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb24c6d5-37e0-46ab-9a4b-1a19f3c77110-config-data-custom\") pod \"barbican-api-6dfcdc79bb-5d7lt\" (UID: \"bb24c6d5-37e0-46ab-9a4b-1a19f3c77110\") " pod="openstack/barbican-api-6dfcdc79bb-5d7lt"
Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.646122 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdtr7\" (UniqueName: \"kubernetes.io/projected/bb24c6d5-37e0-46ab-9a4b-1a19f3c77110-kube-api-access-pdtr7\") pod \"barbican-api-6dfcdc79bb-5d7lt\" (UID: \"bb24c6d5-37e0-46ab-9a4b-1a19f3c77110\") " pod="openstack/barbican-api-6dfcdc79bb-5d7lt"
Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.790191 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6874dd89cc-gzw44"
Oct 03 14:23:04 crc kubenswrapper[4962]: I1003 14:23:04.800727 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6dfcdc79bb-5d7lt"
Oct 03 14:23:05 crc kubenswrapper[4962]: I1003 14:23:05.011409 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-798f8c695f-slvjr"]
Oct 03 14:23:05 crc kubenswrapper[4962]: I1003 14:23:05.053772 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6874dd89cc-gzw44"]
Oct 03 14:23:05 crc kubenswrapper[4962]: I1003 14:23:05.103531 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-56b6f686d6-9ntzw"]
Oct 03 14:23:05 crc kubenswrapper[4962]: W1003 14:23:05.112531 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf3b7d67_1f8c_48f6_b2bb_d5652a5129cf.slice/crio-b2e3df56bfa2aec34a3665bd1cab4deb9ad9e199a0233d9f2cbe24713a342c47 WatchSource:0}: Error finding container b2e3df56bfa2aec34a3665bd1cab4deb9ad9e199a0233d9f2cbe24713a342c47: Status 404 returned error can't find the container with id b2e3df56bfa2aec34a3665bd1cab4deb9ad9e199a0233d9f2cbe24713a342c47
Oct 03 14:23:05 crc kubenswrapper[4962]: I1003 14:23:05.346630 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6dfcdc79bb-5d7lt"]
Oct 03 14:23:06 crc kubenswrapper[4962]: I1003 14:23:06.025774 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6dfcdc79bb-5d7lt" event={"ID":"bb24c6d5-37e0-46ab-9a4b-1a19f3c77110","Type":"ContainerStarted","Data":"a6b29d6746ba06a0847d624bbe7a0a305460e24c1c3a87698b7f9df9b2386794"}
Oct 03 14:23:06 crc kubenswrapper[4962]: I1003 14:23:06.026129 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6dfcdc79bb-5d7lt" event={"ID":"bb24c6d5-37e0-46ab-9a4b-1a19f3c77110","Type":"ContainerStarted","Data":"52afc5079674f63488db2476fede3ceb73822ce6e56b66854c12a6510af9706e"}
Oct 03 14:23:06 crc kubenswrapper[4962]: I1003 14:23:06.026139 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6dfcdc79bb-5d7lt" event={"ID":"bb24c6d5-37e0-46ab-9a4b-1a19f3c77110","Type":"ContainerStarted","Data":"3d284ecc1b1d3f70aa0dd3e0c6feefe34485599ef2a551465212844bba80a783"}
Oct 03 14:23:06 crc kubenswrapper[4962]: I1003 14:23:06.026185 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6dfcdc79bb-5d7lt"
Oct 03 14:23:06 crc kubenswrapper[4962]: I1003 14:23:06.026207 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6dfcdc79bb-5d7lt"
Oct 03 14:23:06 crc kubenswrapper[4962]: I1003 14:23:06.028272 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-56b6f686d6-9ntzw" event={"ID":"cf3b7d67-1f8c-48f6-b2bb-d5652a5129cf","Type":"ContainerStarted","Data":"a618a5fd30464633e3620addede31f885ee5c2c7369e5ba129daf41e5f432f9d"}
Oct 03 14:23:06 crc kubenswrapper[4962]: I1003 14:23:06.028344 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-56b6f686d6-9ntzw" event={"ID":"cf3b7d67-1f8c-48f6-b2bb-d5652a5129cf","Type":"ContainerStarted","Data":"5b541227f703e3c8203d75831595add678f8d32af24e20c389f12258e2d05e80"}
Oct 03 14:23:06 crc kubenswrapper[4962]: I1003 14:23:06.028358 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-56b6f686d6-9ntzw" event={"ID":"cf3b7d67-1f8c-48f6-b2bb-d5652a5129cf","Type":"ContainerStarted","Data":"b2e3df56bfa2aec34a3665bd1cab4deb9ad9e199a0233d9f2cbe24713a342c47"}
Oct 03 14:23:06 crc kubenswrapper[4962]: I1003 14:23:06.033588 4962 generic.go:334] "Generic (PLEG): container finished" podID="8d3f07b8-8ccc-42e1-a570-e9924e18d67a" containerID="d4f4082f870708a2fcc272fa0b1e796578d12133405040ea55c777e70faddeb2" exitCode=0
Oct 03 14:23:06 crc kubenswrapper[4962]: I1003 14:23:06.033689 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6874dd89cc-gzw44" event={"ID":"8d3f07b8-8ccc-42e1-a570-e9924e18d67a","Type":"ContainerDied","Data":"d4f4082f870708a2fcc272fa0b1e796578d12133405040ea55c777e70faddeb2"}
Oct 03 14:23:06 crc kubenswrapper[4962]: I1003 14:23:06.033727 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6874dd89cc-gzw44" event={"ID":"8d3f07b8-8ccc-42e1-a570-e9924e18d67a","Type":"ContainerStarted","Data":"a383b7260d000de32474d993725678155052f2ead3609f6ed0ca3587a59d351f"}
Oct 03 14:23:06 crc kubenswrapper[4962]: I1003 14:23:06.036209 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-798f8c695f-slvjr" event={"ID":"7084aec7-12fb-401b-b866-4066ebe0e546","Type":"ContainerStarted","Data":"d28f7433d4e80ea226014d694c75c6e926fe1fadc1108b26462ec0d2295719b0"}
Oct 03 14:23:06 crc kubenswrapper[4962]: I1003 14:23:06.036294 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-798f8c695f-slvjr" event={"ID":"7084aec7-12fb-401b-b866-4066ebe0e546","Type":"ContainerStarted","Data":"6b4b3a8ff4d209eee3174f1e3f5d1f653209309af8c85b46b55261ecd54829b0"}
Oct 03 14:23:06 crc kubenswrapper[4962]: I1003 14:23:06.036319 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-798f8c695f-slvjr" event={"ID":"7084aec7-12fb-401b-b866-4066ebe0e546","Type":"ContainerStarted","Data":"339e2efcbcf580ad2d3975438cce364b2c0c85d3b5fcd96cb7f788bc92feea24"}
Oct 03 14:23:06 crc kubenswrapper[4962]: I1003 14:23:06.074484 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6dfcdc79bb-5d7lt" podStartSLOduration=2.074462919 podStartE2EDuration="2.074462919s" podCreationTimestamp="2025-10-03 14:23:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:23:06.059881267 +0000 UTC m=+5594.463779172" watchObservedRunningTime="2025-10-03 14:23:06.074462919 +0000 UTC m=+5594.478360744"
Oct 03 14:23:06 crc kubenswrapper[4962]: I1003 14:23:06.135656 4962
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-56b6f686d6-9ntzw" podStartSLOduration=2.135619888 podStartE2EDuration="2.135619888s" podCreationTimestamp="2025-10-03 14:23:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:23:06.117238937 +0000 UTC m=+5594.521136772" watchObservedRunningTime="2025-10-03 14:23:06.135619888 +0000 UTC m=+5594.539517723" Oct 03 14:23:06 crc kubenswrapper[4962]: I1003 14:23:06.143497 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-798f8c695f-slvjr" podStartSLOduration=2.143477033 podStartE2EDuration="2.143477033s" podCreationTimestamp="2025-10-03 14:23:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:23:06.141301866 +0000 UTC m=+5594.545199711" watchObservedRunningTime="2025-10-03 14:23:06.143477033 +0000 UTC m=+5594.547374868" Oct 03 14:23:07 crc kubenswrapper[4962]: I1003 14:23:07.048720 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6874dd89cc-gzw44" event={"ID":"8d3f07b8-8ccc-42e1-a570-e9924e18d67a","Type":"ContainerStarted","Data":"dc23d3aedf1b9deaa79c7b82720ee2bd432d71729accacac5487fd22560678e2"} Oct 03 14:23:07 crc kubenswrapper[4962]: I1003 14:23:07.049774 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6874dd89cc-gzw44" Oct 03 14:23:07 crc kubenswrapper[4962]: I1003 14:23:07.074506 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6874dd89cc-gzw44" podStartSLOduration=3.074487677 podStartE2EDuration="3.074487677s" podCreationTimestamp="2025-10-03 14:23:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:23:07.065573914 +0000 UTC m=+5595.469471759" watchObservedRunningTime="2025-10-03 14:23:07.074487677 +0000 UTC m=+5595.478385512" Oct 03 14:23:07 crc kubenswrapper[4962]: I1003 14:23:07.197129 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ngfs2"] Oct 03 14:23:07 crc kubenswrapper[4962]: I1003 14:23:07.198943 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ngfs2" Oct 03 14:23:07 crc kubenswrapper[4962]: I1003 14:23:07.218292 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ngfs2"] Oct 03 14:23:07 crc kubenswrapper[4962]: I1003 14:23:07.261824 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5add17b4-4bbc-4bec-9373-9646e2402fe9-catalog-content\") pod \"certified-operators-ngfs2\" (UID: \"5add17b4-4bbc-4bec-9373-9646e2402fe9\") " pod="openshift-marketplace/certified-operators-ngfs2" Oct 03 14:23:07 crc kubenswrapper[4962]: I1003 14:23:07.261923 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5add17b4-4bbc-4bec-9373-9646e2402fe9-utilities\") pod \"certified-operators-ngfs2\" (UID: \"5add17b4-4bbc-4bec-9373-9646e2402fe9\") " pod="openshift-marketplace/certified-operators-ngfs2" Oct 03 14:23:07 crc kubenswrapper[4962]: I1003 14:23:07.262017 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxxlw\" (UniqueName: \"kubernetes.io/projected/5add17b4-4bbc-4bec-9373-9646e2402fe9-kube-api-access-nxxlw\") pod \"certified-operators-ngfs2\" (UID: \"5add17b4-4bbc-4bec-9373-9646e2402fe9\") " pod="openshift-marketplace/certified-operators-ngfs2" Oct 03 14:23:07 crc kubenswrapper[4962]: I1003 14:23:07.363287 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5add17b4-4bbc-4bec-9373-9646e2402fe9-catalog-content\") pod \"certified-operators-ngfs2\" (UID: \"5add17b4-4bbc-4bec-9373-9646e2402fe9\") " pod="openshift-marketplace/certified-operators-ngfs2" Oct 03 14:23:07 crc kubenswrapper[4962]: I1003 14:23:07.363366 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5add17b4-4bbc-4bec-9373-9646e2402fe9-utilities\") pod \"certified-operators-ngfs2\" (UID: \"5add17b4-4bbc-4bec-9373-9646e2402fe9\") " pod="openshift-marketplace/certified-operators-ngfs2" Oct 03 14:23:07 crc kubenswrapper[4962]: I1003 14:23:07.363447 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxxlw\" (UniqueName: \"kubernetes.io/projected/5add17b4-4bbc-4bec-9373-9646e2402fe9-kube-api-access-nxxlw\") pod \"certified-operators-ngfs2\" (UID: \"5add17b4-4bbc-4bec-9373-9646e2402fe9\") " pod="openshift-marketplace/certified-operators-ngfs2" Oct 03 14:23:07 crc kubenswrapper[4962]: I1003 14:23:07.363771 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5add17b4-4bbc-4bec-9373-9646e2402fe9-catalog-content\") pod \"certified-operators-ngfs2\" (UID: \"5add17b4-4bbc-4bec-9373-9646e2402fe9\") " pod="openshift-marketplace/certified-operators-ngfs2" Oct 03 14:23:07 crc kubenswrapper[4962]: I1003 14:23:07.364436 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5add17b4-4bbc-4bec-9373-9646e2402fe9-utilities\") pod \"certified-operators-ngfs2\" (UID: \"5add17b4-4bbc-4bec-9373-9646e2402fe9\") " pod="openshift-marketplace/certified-operators-ngfs2" Oct 03 14:23:07 crc kubenswrapper[4962]: I1003 14:23:07.382453 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-nxxlw\" (UniqueName: \"kubernetes.io/projected/5add17b4-4bbc-4bec-9373-9646e2402fe9-kube-api-access-nxxlw\") pod \"certified-operators-ngfs2\" (UID: \"5add17b4-4bbc-4bec-9373-9646e2402fe9\") " pod="openshift-marketplace/certified-operators-ngfs2" Oct 03 14:23:07 crc kubenswrapper[4962]: I1003 14:23:07.518683 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ngfs2" Oct 03 14:23:07 crc kubenswrapper[4962]: I1003 14:23:07.967184 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ngfs2"] Oct 03 14:23:08 crc kubenswrapper[4962]: I1003 14:23:08.064895 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngfs2" event={"ID":"5add17b4-4bbc-4bec-9373-9646e2402fe9","Type":"ContainerStarted","Data":"ad900e0f23ff8a1530718c75534f33d03e341277ace946833eec9566b5ab89e3"} Oct 03 14:23:09 crc kubenswrapper[4962]: I1003 14:23:09.084305 4962 generic.go:334] "Generic (PLEG): container finished" podID="5add17b4-4bbc-4bec-9373-9646e2402fe9" containerID="663ddcb9e02ad7ea9cc1ecf92ac9383a21a3c4227c942273ef97f6ae07ff2320" exitCode=0 Oct 03 14:23:09 crc kubenswrapper[4962]: I1003 14:23:09.085187 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngfs2" event={"ID":"5add17b4-4bbc-4bec-9373-9646e2402fe9","Type":"ContainerDied","Data":"663ddcb9e02ad7ea9cc1ecf92ac9383a21a3c4227c942273ef97f6ae07ff2320"} Oct 03 14:23:10 crc kubenswrapper[4962]: I1003 14:23:10.097531 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngfs2" event={"ID":"5add17b4-4bbc-4bec-9373-9646e2402fe9","Type":"ContainerStarted","Data":"cfe0d0d38de4ca335cd91aaccd5ee339440adfea2b97bc4f41ac4b6a0368c5c7"} Oct 03 14:23:11 crc kubenswrapper[4962]: I1003 14:23:11.112896 4962 generic.go:334] "Generic (PLEG): container finished" podID="5add17b4-4bbc-4bec-9373-9646e2402fe9" containerID="cfe0d0d38de4ca335cd91aaccd5ee339440adfea2b97bc4f41ac4b6a0368c5c7" exitCode=0 Oct 03 14:23:11 crc kubenswrapper[4962]: I1003 14:23:11.112955 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngfs2" event={"ID":"5add17b4-4bbc-4bec-9373-9646e2402fe9","Type":"ContainerDied","Data":"cfe0d0d38de4ca335cd91aaccd5ee339440adfea2b97bc4f41ac4b6a0368c5c7"} Oct 03 14:23:12 crc kubenswrapper[4962]: I1003 14:23:12.124150 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngfs2" event={"ID":"5add17b4-4bbc-4bec-9373-9646e2402fe9","Type":"ContainerStarted","Data":"a661d19d344dfa275f29c94c6a9b5d889adbba415e7d36fa99184d813908cb66"} Oct 03 14:23:12 crc kubenswrapper[4962]: I1003 14:23:12.149378 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ngfs2" podStartSLOduration=2.562334847 podStartE2EDuration="5.149358231s" podCreationTimestamp="2025-10-03 14:23:07 +0000 UTC" firstStartedPulling="2025-10-03 14:23:09.089014631 +0000 UTC m=+5597.492912466" lastFinishedPulling="2025-10-03 14:23:11.676037985 +0000 UTC m=+5600.079935850" observedRunningTime="2025-10-03 14:23:12.142733768 +0000 UTC m=+5600.546631623" watchObservedRunningTime="2025-10-03 14:23:12.149358231 +0000 UTC m=+5600.553256076" Oct 03 14:23:14 crc kubenswrapper[4962]: I1003 14:23:14.793750 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/dnsmasq-dns-6874dd89cc-gzw44" Oct 03 14:23:14 crc kubenswrapper[4962]: I1003 14:23:14.865783 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78977f7cdf-6kqzb"] Oct 03 14:23:14 crc kubenswrapper[4962]: I1003 14:23:14.866194 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-78977f7cdf-6kqzb" podUID="18b9a747-b5c9-434d-93ef-ec99ad3da244" containerName="dnsmasq-dns" containerID="cri-o://b0395cdc8d895044a72c25b3ffd359c77b2d298457ad7db73d98625d06105e9c" gracePeriod=10 Oct 03 14:23:15 crc kubenswrapper[4962]: I1003 14:23:15.175199 4962 generic.go:334] "Generic (PLEG): container finished" podID="18b9a747-b5c9-434d-93ef-ec99ad3da244" containerID="b0395cdc8d895044a72c25b3ffd359c77b2d298457ad7db73d98625d06105e9c" exitCode=0 Oct 03 14:23:15 crc kubenswrapper[4962]: I1003 14:23:15.175258 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78977f7cdf-6kqzb" event={"ID":"18b9a747-b5c9-434d-93ef-ec99ad3da244","Type":"ContainerDied","Data":"b0395cdc8d895044a72c25b3ffd359c77b2d298457ad7db73d98625d06105e9c"} Oct 03 14:23:15 crc kubenswrapper[4962]: I1003 14:23:15.340324 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78977f7cdf-6kqzb" Oct 03 14:23:15 crc kubenswrapper[4962]: I1003 14:23:15.421797 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18b9a747-b5c9-434d-93ef-ec99ad3da244-dns-svc\") pod \"18b9a747-b5c9-434d-93ef-ec99ad3da244\" (UID: \"18b9a747-b5c9-434d-93ef-ec99ad3da244\") " Oct 03 14:23:15 crc kubenswrapper[4962]: I1003 14:23:15.421891 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpskj\" (UniqueName: \"kubernetes.io/projected/18b9a747-b5c9-434d-93ef-ec99ad3da244-kube-api-access-jpskj\") pod \"18b9a747-b5c9-434d-93ef-ec99ad3da244\" (UID: \"18b9a747-b5c9-434d-93ef-ec99ad3da244\") " Oct 03 14:23:15 crc kubenswrapper[4962]: I1003 14:23:15.422203 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18b9a747-b5c9-434d-93ef-ec99ad3da244-ovsdbserver-sb\") pod \"18b9a747-b5c9-434d-93ef-ec99ad3da244\" (UID: \"18b9a747-b5c9-434d-93ef-ec99ad3da244\") " Oct 03 14:23:15 crc kubenswrapper[4962]: I1003 14:23:15.422426 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18b9a747-b5c9-434d-93ef-ec99ad3da244-ovsdbserver-nb\") pod \"18b9a747-b5c9-434d-93ef-ec99ad3da244\" (UID: \"18b9a747-b5c9-434d-93ef-ec99ad3da244\") " Oct 03 14:23:15 crc kubenswrapper[4962]: I1003 14:23:15.422473 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18b9a747-b5c9-434d-93ef-ec99ad3da244-config\") pod \"18b9a747-b5c9-434d-93ef-ec99ad3da244\" (UID: \"18b9a747-b5c9-434d-93ef-ec99ad3da244\") " Oct 03 14:23:15 crc kubenswrapper[4962]: I1003 14:23:15.434830 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18b9a747-b5c9-434d-93ef-ec99ad3da244-kube-api-access-jpskj" (OuterVolumeSpecName: "kube-api-access-jpskj") pod "18b9a747-b5c9-434d-93ef-ec99ad3da244" (UID: "18b9a747-b5c9-434d-93ef-ec99ad3da244"). InnerVolumeSpecName "kube-api-access-jpskj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:23:15 crc kubenswrapper[4962]: I1003 14:23:15.464927 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18b9a747-b5c9-434d-93ef-ec99ad3da244-config" (OuterVolumeSpecName: "config") pod "18b9a747-b5c9-434d-93ef-ec99ad3da244" (UID: "18b9a747-b5c9-434d-93ef-ec99ad3da244"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:23:15 crc kubenswrapper[4962]: I1003 14:23:15.465841 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18b9a747-b5c9-434d-93ef-ec99ad3da244-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "18b9a747-b5c9-434d-93ef-ec99ad3da244" (UID: "18b9a747-b5c9-434d-93ef-ec99ad3da244"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:23:15 crc kubenswrapper[4962]: I1003 14:23:15.484599 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18b9a747-b5c9-434d-93ef-ec99ad3da244-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "18b9a747-b5c9-434d-93ef-ec99ad3da244" (UID: "18b9a747-b5c9-434d-93ef-ec99ad3da244"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:23:15 crc kubenswrapper[4962]: I1003 14:23:15.486043 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18b9a747-b5c9-434d-93ef-ec99ad3da244-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "18b9a747-b5c9-434d-93ef-ec99ad3da244" (UID: "18b9a747-b5c9-434d-93ef-ec99ad3da244"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:23:15 crc kubenswrapper[4962]: I1003 14:23:15.525817 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18b9a747-b5c9-434d-93ef-ec99ad3da244-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:15 crc kubenswrapper[4962]: I1003 14:23:15.525869 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18b9a747-b5c9-434d-93ef-ec99ad3da244-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:15 crc kubenswrapper[4962]: I1003 14:23:15.525883 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18b9a747-b5c9-434d-93ef-ec99ad3da244-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:15 crc kubenswrapper[4962]: I1003 14:23:15.525896 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18b9a747-b5c9-434d-93ef-ec99ad3da244-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:15 crc kubenswrapper[4962]: I1003 14:23:15.525909 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpskj\" (UniqueName: \"kubernetes.io/projected/18b9a747-b5c9-434d-93ef-ec99ad3da244-kube-api-access-jpskj\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:16 crc kubenswrapper[4962]: I1003 14:23:16.183630 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78977f7cdf-6kqzb" event={"ID":"18b9a747-b5c9-434d-93ef-ec99ad3da244","Type":"ContainerDied","Data":"1b6f1c853be17178b647bd5a2292772e149c0b34fef31bb1cc376e00292dcdae"} Oct 03 14:23:16 crc kubenswrapper[4962]: I1003 14:23:16.183684 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78977f7cdf-6kqzb" Oct 03 14:23:16 crc kubenswrapper[4962]: I1003 14:23:16.183726 4962 scope.go:117] "RemoveContainer" containerID="b0395cdc8d895044a72c25b3ffd359c77b2d298457ad7db73d98625d06105e9c" Oct 03 14:23:16 crc kubenswrapper[4962]: I1003 14:23:16.220145 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78977f7cdf-6kqzb"] Oct 03 14:23:16 crc kubenswrapper[4962]: I1003 14:23:16.226321 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78977f7cdf-6kqzb"] Oct 03 14:23:16 crc kubenswrapper[4962]: I1003 14:23:16.235950 4962 scope.go:117] "RemoveContainer" containerID="f1ee03f9aeadc08915244e702c734d5f282c46dd6b9229bf03a1ad1ad1317726" Oct 03 14:23:16 crc kubenswrapper[4962]: I1003 14:23:16.254409 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18b9a747-b5c9-434d-93ef-ec99ad3da244" path="/var/lib/kubelet/pods/18b9a747-b5c9-434d-93ef-ec99ad3da244/volumes" Oct 03 14:23:16 crc kubenswrapper[4962]: I1003 14:23:16.330754 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6dfcdc79bb-5d7lt" Oct 03 14:23:16 crc kubenswrapper[4962]: I1003 14:23:16.372017 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6dfcdc79bb-5d7lt" Oct 03 14:23:17 crc kubenswrapper[4962]: I1003 14:23:17.519286 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ngfs2" Oct 03 14:23:17 crc kubenswrapper[4962]: I1003 14:23:17.520006 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ngfs2" Oct 03 14:23:17 crc kubenswrapper[4962]: I1003 14:23:17.576596 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ngfs2" Oct 03 14:23:18 crc kubenswrapper[4962]: I1003 14:23:18.280147 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ngfs2" Oct 03 14:23:18 crc kubenswrapper[4962]: I1003 14:23:18.327015 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ngfs2"] Oct 03 14:23:20 crc kubenswrapper[4962]: I1003 14:23:20.222138 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ngfs2" podUID="5add17b4-4bbc-4bec-9373-9646e2402fe9" containerName="registry-server" containerID="cri-o://a661d19d344dfa275f29c94c6a9b5d889adbba415e7d36fa99184d813908cb66" gracePeriod=2 Oct 03 14:23:20 crc kubenswrapper[4962]: I1003 14:23:20.771993 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ngfs2" Oct 03 14:23:20 crc kubenswrapper[4962]: I1003 14:23:20.927473 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5add17b4-4bbc-4bec-9373-9646e2402fe9-utilities\") pod \"5add17b4-4bbc-4bec-9373-9646e2402fe9\" (UID: \"5add17b4-4bbc-4bec-9373-9646e2402fe9\") " Oct 03 14:23:20 crc kubenswrapper[4962]: I1003 14:23:20.927592 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5add17b4-4bbc-4bec-9373-9646e2402fe9-catalog-content\") pod \"5add17b4-4bbc-4bec-9373-9646e2402fe9\" (UID: \"5add17b4-4bbc-4bec-9373-9646e2402fe9\") " Oct 03 14:23:20 crc kubenswrapper[4962]: I1003 14:23:20.927666 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxxlw\" (UniqueName: \"kubernetes.io/projected/5add17b4-4bbc-4bec-9373-9646e2402fe9-kube-api-access-nxxlw\") pod \"5add17b4-4bbc-4bec-9373-9646e2402fe9\" (UID: \"5add17b4-4bbc-4bec-9373-9646e2402fe9\") " Oct 03 14:23:20 crc kubenswrapper[4962]: I1003 14:23:20.928340 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5add17b4-4bbc-4bec-9373-9646e2402fe9-utilities" (OuterVolumeSpecName: "utilities") pod "5add17b4-4bbc-4bec-9373-9646e2402fe9" (UID: "5add17b4-4bbc-4bec-9373-9646e2402fe9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:23:20 crc kubenswrapper[4962]: I1003 14:23:20.929023 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5add17b4-4bbc-4bec-9373-9646e2402fe9-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:20 crc kubenswrapper[4962]: I1003 14:23:20.941885 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5add17b4-4bbc-4bec-9373-9646e2402fe9-kube-api-access-nxxlw" (OuterVolumeSpecName: "kube-api-access-nxxlw") pod "5add17b4-4bbc-4bec-9373-9646e2402fe9" (UID: "5add17b4-4bbc-4bec-9373-9646e2402fe9"). InnerVolumeSpecName "kube-api-access-nxxlw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:23:20 crc kubenswrapper[4962]: I1003 14:23:20.995065 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5add17b4-4bbc-4bec-9373-9646e2402fe9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5add17b4-4bbc-4bec-9373-9646e2402fe9" (UID: "5add17b4-4bbc-4bec-9373-9646e2402fe9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:23:21 crc kubenswrapper[4962]: I1003 14:23:21.030911 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5add17b4-4bbc-4bec-9373-9646e2402fe9-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:21 crc kubenswrapper[4962]: I1003 14:23:21.030956 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxxlw\" (UniqueName: \"kubernetes.io/projected/5add17b4-4bbc-4bec-9373-9646e2402fe9-kube-api-access-nxxlw\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:21 crc kubenswrapper[4962]: I1003 14:23:21.231717 4962 generic.go:334] "Generic (PLEG): container finished" podID="5add17b4-4bbc-4bec-9373-9646e2402fe9" containerID="a661d19d344dfa275f29c94c6a9b5d889adbba415e7d36fa99184d813908cb66" exitCode=0 Oct 03 14:23:21 crc kubenswrapper[4962]: I1003 14:23:21.231763 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngfs2" event={"ID":"5add17b4-4bbc-4bec-9373-9646e2402fe9","Type":"ContainerDied","Data":"a661d19d344dfa275f29c94c6a9b5d889adbba415e7d36fa99184d813908cb66"} Oct 03 14:23:21 crc kubenswrapper[4962]: I1003 14:23:21.231801 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngfs2" event={"ID":"5add17b4-4bbc-4bec-9373-9646e2402fe9","Type":"ContainerDied","Data":"ad900e0f23ff8a1530718c75534f33d03e341277ace946833eec9566b5ab89e3"} Oct 03 14:23:21 crc kubenswrapper[4962]: I1003 14:23:21.231819 4962 scope.go:117] "RemoveContainer" containerID="a661d19d344dfa275f29c94c6a9b5d889adbba415e7d36fa99184d813908cb66" Oct 03 14:23:21 crc kubenswrapper[4962]: I1003 14:23:21.231841 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ngfs2" Oct 03 14:23:21 crc kubenswrapper[4962]: I1003 14:23:21.261138 4962 scope.go:117] "RemoveContainer" containerID="cfe0d0d38de4ca335cd91aaccd5ee339440adfea2b97bc4f41ac4b6a0368c5c7" Oct 03 14:23:21 crc kubenswrapper[4962]: I1003 14:23:21.272055 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ngfs2"] Oct 03 14:23:21 crc kubenswrapper[4962]: I1003 14:23:21.283377 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ngfs2"] Oct 03 14:23:21 crc kubenswrapper[4962]: I1003 14:23:21.301667 4962 scope.go:117] "RemoveContainer" containerID="663ddcb9e02ad7ea9cc1ecf92ac9383a21a3c4227c942273ef97f6ae07ff2320" Oct 03 14:23:21 crc kubenswrapper[4962]: I1003 14:23:21.331439 4962 scope.go:117] "RemoveContainer" containerID="a661d19d344dfa275f29c94c6a9b5d889adbba415e7d36fa99184d813908cb66" Oct 03 14:23:21 crc kubenswrapper[4962]: E1003 14:23:21.332012 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a661d19d344dfa275f29c94c6a9b5d889adbba415e7d36fa99184d813908cb66\": container with ID starting with a661d19d344dfa275f29c94c6a9b5d889adbba415e7d36fa99184d813908cb66 not found: ID does not exist" containerID="a661d19d344dfa275f29c94c6a9b5d889adbba415e7d36fa99184d813908cb66" Oct 03 14:23:21 crc kubenswrapper[4962]: I1003 14:23:21.332048 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a661d19d344dfa275f29c94c6a9b5d889adbba415e7d36fa99184d813908cb66"} err="failed to get container status \"a661d19d344dfa275f29c94c6a9b5d889adbba415e7d36fa99184d813908cb66\": rpc error: code = NotFound desc = could not find container \"a661d19d344dfa275f29c94c6a9b5d889adbba415e7d36fa99184d813908cb66\": container with ID starting with a661d19d344dfa275f29c94c6a9b5d889adbba415e7d36fa99184d813908cb66 not found: ID does not exist" Oct 03 14:23:21 crc kubenswrapper[4962]: I1003 14:23:21.332074 4962 scope.go:117] "RemoveContainer" containerID="cfe0d0d38de4ca335cd91aaccd5ee339440adfea2b97bc4f41ac4b6a0368c5c7" Oct 03 14:23:21 crc kubenswrapper[4962]: E1003 14:23:21.332538 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfe0d0d38de4ca335cd91aaccd5ee339440adfea2b97bc4f41ac4b6a0368c5c7\": container with ID starting with cfe0d0d38de4ca335cd91aaccd5ee339440adfea2b97bc4f41ac4b6a0368c5c7 not found: ID does not exist" containerID="cfe0d0d38de4ca335cd91aaccd5ee339440adfea2b97bc4f41ac4b6a0368c5c7" Oct 03 14:23:21 crc kubenswrapper[4962]: I1003 14:23:21.332561 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfe0d0d38de4ca335cd91aaccd5ee339440adfea2b97bc4f41ac4b6a0368c5c7"} err="failed to get container status \"cfe0d0d38de4ca335cd91aaccd5ee339440adfea2b97bc4f41ac4b6a0368c5c7\": rpc error: code = NotFound desc = could not find container \"cfe0d0d38de4ca335cd91aaccd5ee339440adfea2b97bc4f41ac4b6a0368c5c7\": container with ID starting with cfe0d0d38de4ca335cd91aaccd5ee339440adfea2b97bc4f41ac4b6a0368c5c7 not found: ID does not exist" Oct 03 14:23:21 crc kubenswrapper[4962]: I1003 14:23:21.332580 4962 scope.go:117] "RemoveContainer" containerID="663ddcb9e02ad7ea9cc1ecf92ac9383a21a3c4227c942273ef97f6ae07ff2320" Oct 03 14:23:21 crc kubenswrapper[4962]: E1003 14:23:21.333173 4962 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"663ddcb9e02ad7ea9cc1ecf92ac9383a21a3c4227c942273ef97f6ae07ff2320\": container with ID starting with 663ddcb9e02ad7ea9cc1ecf92ac9383a21a3c4227c942273ef97f6ae07ff2320 not found: ID does not exist" containerID="663ddcb9e02ad7ea9cc1ecf92ac9383a21a3c4227c942273ef97f6ae07ff2320" Oct 03 14:23:21 crc kubenswrapper[4962]: I1003 14:23:21.333217 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"663ddcb9e02ad7ea9cc1ecf92ac9383a21a3c4227c942273ef97f6ae07ff2320"} err="failed to get container status \"663ddcb9e02ad7ea9cc1ecf92ac9383a21a3c4227c942273ef97f6ae07ff2320\": rpc error: code = NotFound desc = could not find container \"663ddcb9e02ad7ea9cc1ecf92ac9383a21a3c4227c942273ef97f6ae07ff2320\": container with ID starting with 663ddcb9e02ad7ea9cc1ecf92ac9383a21a3c4227c942273ef97f6ae07ff2320 not found: ID does not exist" Oct 03 14:23:22 crc kubenswrapper[4962]: I1003 14:23:22.243655 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5add17b4-4bbc-4bec-9373-9646e2402fe9" path="/var/lib/kubelet/pods/5add17b4-4bbc-4bec-9373-9646e2402fe9/volumes" Oct 03 14:23:30 crc kubenswrapper[4962]: I1003 14:23:30.001529 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-rbcsx"] Oct 03 14:23:30 crc kubenswrapper[4962]: E1003 14:23:30.002538 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18b9a747-b5c9-434d-93ef-ec99ad3da244" containerName="init" Oct 03 14:23:30 crc kubenswrapper[4962]: I1003 14:23:30.002553 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="18b9a747-b5c9-434d-93ef-ec99ad3da244" containerName="init" Oct 03 14:23:30 crc kubenswrapper[4962]: E1003 14:23:30.002560 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18b9a747-b5c9-434d-93ef-ec99ad3da244" containerName="dnsmasq-dns" Oct 03 14:23:30 crc kubenswrapper[4962]: I1003 14:23:30.002566 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="18b9a747-b5c9-434d-93ef-ec99ad3da244" containerName="dnsmasq-dns" Oct 03 14:23:30 crc kubenswrapper[4962]: E1003 14:23:30.002585 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5add17b4-4bbc-4bec-9373-9646e2402fe9" containerName="extract-content" Oct 03 14:23:30 crc kubenswrapper[4962]: I1003 14:23:30.002591 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="5add17b4-4bbc-4bec-9373-9646e2402fe9" containerName="extract-content" Oct 03 14:23:30 crc kubenswrapper[4962]: E1003 14:23:30.002605 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5add17b4-4bbc-4bec-9373-9646e2402fe9" containerName="extract-utilities" Oct 03 14:23:30 crc kubenswrapper[4962]: I1003 14:23:30.002611 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="5add17b4-4bbc-4bec-9373-9646e2402fe9" containerName="extract-utilities" Oct 03 14:23:30 crc kubenswrapper[4962]: E1003 14:23:30.002627 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5add17b4-4bbc-4bec-9373-9646e2402fe9" containerName="registry-server" Oct 03 14:23:30 crc kubenswrapper[4962]: I1003 14:23:30.002649 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="5add17b4-4bbc-4bec-9373-9646e2402fe9" containerName="registry-server" Oct 03 14:23:30 crc kubenswrapper[4962]: I1003 14:23:30.002797 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="5add17b4-4bbc-4bec-9373-9646e2402fe9" containerName="registry-server" Oct 03 14:23:30 crc kubenswrapper[4962]: 
I1003 14:23:30.002822 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="18b9a747-b5c9-434d-93ef-ec99ad3da244" containerName="dnsmasq-dns" Oct 03 14:23:30 crc kubenswrapper[4962]: I1003 14:23:30.003629 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-rbcsx" Oct 03 14:23:30 crc kubenswrapper[4962]: I1003 14:23:30.007617 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-rbcsx"] Oct 03 14:23:30 crc kubenswrapper[4962]: I1003 14:23:30.078924 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v45zf\" (UniqueName: \"kubernetes.io/projected/089bba13-506d-4b03-8b69-46e49102539a-kube-api-access-v45zf\") pod \"neutron-db-create-rbcsx\" (UID: \"089bba13-506d-4b03-8b69-46e49102539a\") " pod="openstack/neutron-db-create-rbcsx" Oct 03 14:23:30 crc kubenswrapper[4962]: I1003 14:23:30.181545 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v45zf\" (UniqueName: \"kubernetes.io/projected/089bba13-506d-4b03-8b69-46e49102539a-kube-api-access-v45zf\") pod \"neutron-db-create-rbcsx\" (UID: \"089bba13-506d-4b03-8b69-46e49102539a\") " pod="openstack/neutron-db-create-rbcsx" Oct 03 14:23:30 crc kubenswrapper[4962]: I1003 14:23:30.198893 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v45zf\" (UniqueName: \"kubernetes.io/projected/089bba13-506d-4b03-8b69-46e49102539a-kube-api-access-v45zf\") pod \"neutron-db-create-rbcsx\" (UID: \"089bba13-506d-4b03-8b69-46e49102539a\") " pod="openstack/neutron-db-create-rbcsx" Oct 03 14:23:30 crc kubenswrapper[4962]: I1003 14:23:30.330116 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-rbcsx" Oct 03 14:23:30 crc kubenswrapper[4962]: I1003 14:23:30.803303 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-rbcsx"] Oct 03 14:23:31 crc kubenswrapper[4962]: I1003 14:23:31.303920 4962 generic.go:334] "Generic (PLEG): container finished" podID="089bba13-506d-4b03-8b69-46e49102539a" containerID="45f2d7888f216cb137768205351d3f09eecc33645aebd354d812696b2cb8502d" exitCode=0 Oct 03 14:23:31 crc kubenswrapper[4962]: I1003 14:23:31.303976 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-rbcsx" event={"ID":"089bba13-506d-4b03-8b69-46e49102539a","Type":"ContainerDied","Data":"45f2d7888f216cb137768205351d3f09eecc33645aebd354d812696b2cb8502d"} Oct 03 14:23:31 crc kubenswrapper[4962]: I1003 14:23:31.304009 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-rbcsx" event={"ID":"089bba13-506d-4b03-8b69-46e49102539a","Type":"ContainerStarted","Data":"7094b53da1323a27f2f925e49f4763b932a1a8e1ecce0557dee08dfe437eca00"} Oct 03 14:23:32 crc kubenswrapper[4962]: I1003 14:23:32.584987 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-rbcsx" Oct 03 14:23:32 crc kubenswrapper[4962]: I1003 14:23:32.625239 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v45zf\" (UniqueName: \"kubernetes.io/projected/089bba13-506d-4b03-8b69-46e49102539a-kube-api-access-v45zf\") pod \"089bba13-506d-4b03-8b69-46e49102539a\" (UID: \"089bba13-506d-4b03-8b69-46e49102539a\") " Oct 03 14:23:32 crc kubenswrapper[4962]: I1003 14:23:32.651004 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/089bba13-506d-4b03-8b69-46e49102539a-kube-api-access-v45zf" (OuterVolumeSpecName: "kube-api-access-v45zf") pod "089bba13-506d-4b03-8b69-46e49102539a" (UID: "089bba13-506d-4b03-8b69-46e49102539a"). InnerVolumeSpecName "kube-api-access-v45zf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:23:32 crc kubenswrapper[4962]: I1003 14:23:32.727159 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v45zf\" (UniqueName: \"kubernetes.io/projected/089bba13-506d-4b03-8b69-46e49102539a-kube-api-access-v45zf\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:33 crc kubenswrapper[4962]: I1003 14:23:33.319652 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-rbcsx" event={"ID":"089bba13-506d-4b03-8b69-46e49102539a","Type":"ContainerDied","Data":"7094b53da1323a27f2f925e49f4763b932a1a8e1ecce0557dee08dfe437eca00"} Oct 03 14:23:33 crc kubenswrapper[4962]: I1003 14:23:33.319696 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7094b53da1323a27f2f925e49f4763b932a1a8e1ecce0557dee08dfe437eca00" Oct 03 14:23:33 crc kubenswrapper[4962]: I1003 14:23:33.319739 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-rbcsx" Oct 03 14:23:40 crc kubenswrapper[4962]: I1003 14:23:40.047070 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-1dbf-account-create-tdd5n"] Oct 03 14:23:40 crc kubenswrapper[4962]: E1003 14:23:40.048030 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="089bba13-506d-4b03-8b69-46e49102539a" containerName="mariadb-database-create" Oct 03 14:23:40 crc kubenswrapper[4962]: I1003 14:23:40.048048 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="089bba13-506d-4b03-8b69-46e49102539a" containerName="mariadb-database-create" Oct 03 14:23:40 crc kubenswrapper[4962]: I1003 14:23:40.048274 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="089bba13-506d-4b03-8b69-46e49102539a" containerName="mariadb-database-create" Oct 03 14:23:40 crc kubenswrapper[4962]: I1003 14:23:40.049039 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-1dbf-account-create-tdd5n" Oct 03 14:23:40 crc kubenswrapper[4962]: I1003 14:23:40.051789 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 03 14:23:40 crc kubenswrapper[4962]: I1003 14:23:40.058246 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-1dbf-account-create-tdd5n"] Oct 03 14:23:40 crc kubenswrapper[4962]: I1003 14:23:40.144998 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rwnw\" (UniqueName: \"kubernetes.io/projected/fce07167-a5cd-4c88-b07a-6daedc6888ff-kube-api-access-7rwnw\") pod \"neutron-1dbf-account-create-tdd5n\" (UID: \"fce07167-a5cd-4c88-b07a-6daedc6888ff\") " pod="openstack/neutron-1dbf-account-create-tdd5n" Oct 03 14:23:40 crc kubenswrapper[4962]: I1003 14:23:40.246246 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rwnw\" (UniqueName: \"kubernetes.io/projected/fce07167-a5cd-4c88-b07a-6daedc6888ff-kube-api-access-7rwnw\") pod \"neutron-1dbf-account-create-tdd5n\" (UID: \"fce07167-a5cd-4c88-b07a-6daedc6888ff\") " pod="openstack/neutron-1dbf-account-create-tdd5n" Oct 03 14:23:40 crc kubenswrapper[4962]: I1003 14:23:40.265067 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rwnw\" (UniqueName: \"kubernetes.io/projected/fce07167-a5cd-4c88-b07a-6daedc6888ff-kube-api-access-7rwnw\") pod \"neutron-1dbf-account-create-tdd5n\" (UID: \"fce07167-a5cd-4c88-b07a-6daedc6888ff\") " pod="openstack/neutron-1dbf-account-create-tdd5n" Oct 03 14:23:40 crc kubenswrapper[4962]: I1003 14:23:40.369812 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1dbf-account-create-tdd5n" Oct 03 14:23:40 crc kubenswrapper[4962]: I1003 14:23:40.805854 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-1dbf-account-create-tdd5n"] Oct 03 14:23:41 crc kubenswrapper[4962]: I1003 14:23:41.386493 4962 generic.go:334] "Generic (PLEG): container finished" podID="fce07167-a5cd-4c88-b07a-6daedc6888ff" containerID="97670cf30456e344ea9e34ccb387ae3ec34ac566de098185d45c066d9076cf5b" exitCode=0 Oct 03 14:23:41 crc kubenswrapper[4962]: I1003 14:23:41.386548 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1dbf-account-create-tdd5n" event={"ID":"fce07167-a5cd-4c88-b07a-6daedc6888ff","Type":"ContainerDied","Data":"97670cf30456e344ea9e34ccb387ae3ec34ac566de098185d45c066d9076cf5b"} Oct 03 14:23:41 crc kubenswrapper[4962]: I1003 14:23:41.386792 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1dbf-account-create-tdd5n" event={"ID":"fce07167-a5cd-4c88-b07a-6daedc6888ff","Type":"ContainerStarted","Data":"cbb2ada45107415f7901e0e3add534707959dc782958d1dff5207a56962bcc2b"} Oct 03 14:23:42 crc kubenswrapper[4962]: I1003 14:23:42.690911 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-1dbf-account-create-tdd5n" Oct 03 14:23:42 crc kubenswrapper[4962]: I1003 14:23:42.795211 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rwnw\" (UniqueName: \"kubernetes.io/projected/fce07167-a5cd-4c88-b07a-6daedc6888ff-kube-api-access-7rwnw\") pod \"fce07167-a5cd-4c88-b07a-6daedc6888ff\" (UID: \"fce07167-a5cd-4c88-b07a-6daedc6888ff\") " Oct 03 14:23:42 crc kubenswrapper[4962]: I1003 14:23:42.800268 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fce07167-a5cd-4c88-b07a-6daedc6888ff-kube-api-access-7rwnw" (OuterVolumeSpecName: "kube-api-access-7rwnw") pod "fce07167-a5cd-4c88-b07a-6daedc6888ff" (UID: "fce07167-a5cd-4c88-b07a-6daedc6888ff"). InnerVolumeSpecName "kube-api-access-7rwnw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:23:42 crc kubenswrapper[4962]: I1003 14:23:42.898988 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rwnw\" (UniqueName: \"kubernetes.io/projected/fce07167-a5cd-4c88-b07a-6daedc6888ff-kube-api-access-7rwnw\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:43 crc kubenswrapper[4962]: I1003 14:23:43.409724 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1dbf-account-create-tdd5n" event={"ID":"fce07167-a5cd-4c88-b07a-6daedc6888ff","Type":"ContainerDied","Data":"cbb2ada45107415f7901e0e3add534707959dc782958d1dff5207a56962bcc2b"} Oct 03 14:23:43 crc kubenswrapper[4962]: I1003 14:23:43.409761 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cbb2ada45107415f7901e0e3add534707959dc782958d1dff5207a56962bcc2b" Oct 03 14:23:43 crc kubenswrapper[4962]: I1003 14:23:43.409831 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1dbf-account-create-tdd5n" Oct 03 14:23:45 crc kubenswrapper[4962]: I1003 14:23:45.373618 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-qcmsk"] Oct 03 14:23:45 crc kubenswrapper[4962]: E1003 14:23:45.374180 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fce07167-a5cd-4c88-b07a-6daedc6888ff" containerName="mariadb-account-create" Oct 03 14:23:45 crc kubenswrapper[4962]: I1003 14:23:45.374192 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="fce07167-a5cd-4c88-b07a-6daedc6888ff" containerName="mariadb-account-create" Oct 03 14:23:45 crc kubenswrapper[4962]: I1003 14:23:45.374361 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="fce07167-a5cd-4c88-b07a-6daedc6888ff" containerName="mariadb-account-create" Oct 03 14:23:45 crc kubenswrapper[4962]: I1003 14:23:45.374934 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-qcmsk" Oct 03 14:23:45 crc kubenswrapper[4962]: I1003 14:23:45.378531 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-w4fpr" Oct 03 14:23:45 crc kubenswrapper[4962]: I1003 14:23:45.378568 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 03 14:23:45 crc kubenswrapper[4962]: I1003 14:23:45.379205 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 03 14:23:45 crc kubenswrapper[4962]: I1003 14:23:45.387674 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-qcmsk"] Oct 03 14:23:45 crc kubenswrapper[4962]: I1003 14:23:45.447573 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/44a69f85-585f-430d-bd92-311a41410a8b-config\") pod \"neutron-db-sync-qcmsk\" (UID: \"44a69f85-585f-430d-bd92-311a41410a8b\") " pod="openstack/neutron-db-sync-qcmsk" Oct 03 14:23:45 crc kubenswrapper[4962]: I1003 14:23:45.447918 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44a69f85-585f-430d-bd92-311a41410a8b-combined-ca-bundle\") pod \"neutron-db-sync-qcmsk\" (UID: \"44a69f85-585f-430d-bd92-311a41410a8b\") " pod="openstack/neutron-db-sync-qcmsk" Oct 03 14:23:45 crc kubenswrapper[4962]: I1003 14:23:45.448008 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjf57\" (UniqueName: \"kubernetes.io/projected/44a69f85-585f-430d-bd92-311a41410a8b-kube-api-access-wjf57\") pod \"neutron-db-sync-qcmsk\" (UID: \"44a69f85-585f-430d-bd92-311a41410a8b\") " pod="openstack/neutron-db-sync-qcmsk" Oct 03 14:23:45 crc kubenswrapper[4962]: I1003 14:23:45.549426 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44a69f85-585f-430d-bd92-311a41410a8b-combined-ca-bundle\") pod \"neutron-db-sync-qcmsk\" (UID: \"44a69f85-585f-430d-bd92-311a41410a8b\") " pod="openstack/neutron-db-sync-qcmsk" Oct 03 14:23:45 crc kubenswrapper[4962]: I1003 14:23:45.549753 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjf57\" (UniqueName: \"kubernetes.io/projected/44a69f85-585f-430d-bd92-311a41410a8b-kube-api-access-wjf57\") pod \"neutron-db-sync-qcmsk\" (UID: \"44a69f85-585f-430d-bd92-311a41410a8b\") " pod="openstack/neutron-db-sync-qcmsk" Oct 03 14:23:45 crc kubenswrapper[4962]: I1003 14:23:45.549845 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/44a69f85-585f-430d-bd92-311a41410a8b-config\") pod \"neutron-db-sync-qcmsk\" (UID: \"44a69f85-585f-430d-bd92-311a41410a8b\") " pod="openstack/neutron-db-sync-qcmsk" Oct 03 14:23:45 crc kubenswrapper[4962]: I1003 14:23:45.555200 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/44a69f85-585f-430d-bd92-311a41410a8b-config\") pod \"neutron-db-sync-qcmsk\" (UID: \"44a69f85-585f-430d-bd92-311a41410a8b\") " pod="openstack/neutron-db-sync-qcmsk" Oct 03 14:23:45 crc kubenswrapper[4962]: I1003 14:23:45.555201 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/44a69f85-585f-430d-bd92-311a41410a8b-combined-ca-bundle\") pod \"neutron-db-sync-qcmsk\" (UID: \"44a69f85-585f-430d-bd92-311a41410a8b\") " pod="openstack/neutron-db-sync-qcmsk" Oct 03 14:23:45 crc kubenswrapper[4962]: I1003 14:23:45.568589 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjf57\" (UniqueName: \"kubernetes.io/projected/44a69f85-585f-430d-bd92-311a41410a8b-kube-api-access-wjf57\") pod \"neutron-db-sync-qcmsk\" (UID: \"44a69f85-585f-430d-bd92-311a41410a8b\") " pod="openstack/neutron-db-sync-qcmsk" Oct 03 14:23:45 crc kubenswrapper[4962]: I1003 14:23:45.729036 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-qcmsk" Oct 03 14:23:46 crc kubenswrapper[4962]: I1003 14:23:46.185626 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-qcmsk"] Oct 03 14:23:46 crc kubenswrapper[4962]: W1003 14:23:46.201863 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44a69f85_585f_430d_bd92_311a41410a8b.slice/crio-22132341e1499fc1ab8deb28256e527c6011bcbdea45ec705c423b4d3147dfc3 WatchSource:0}: Error finding container 22132341e1499fc1ab8deb28256e527c6011bcbdea45ec705c423b4d3147dfc3: Status 404 returned error can't find the container with id 22132341e1499fc1ab8deb28256e527c6011bcbdea45ec705c423b4d3147dfc3 Oct 03 14:23:46 crc kubenswrapper[4962]: I1003 14:23:46.443650 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-qcmsk" event={"ID":"44a69f85-585f-430d-bd92-311a41410a8b","Type":"ContainerStarted","Data":"373228e79da2587786749685118a5a45e03f6a2255d81587683b66d99706a139"} Oct 03 14:23:46 crc kubenswrapper[4962]: I1003 14:23:46.444014 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-qcmsk" event={"ID":"44a69f85-585f-430d-bd92-311a41410a8b","Type":"ContainerStarted","Data":"22132341e1499fc1ab8deb28256e527c6011bcbdea45ec705c423b4d3147dfc3"} Oct 03 14:23:46 crc kubenswrapper[4962]: I1003 14:23:46.466752 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-qcmsk" podStartSLOduration=1.466736399 podStartE2EDuration="1.466736399s" podCreationTimestamp="2025-10-03 14:23:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:23:46.462044597 +0000 UTC m=+5634.865942432" watchObservedRunningTime="2025-10-03 14:23:46.466736399 +0000 UTC m=+5634.870634244" Oct 03 14:23:50 crc kubenswrapper[4962]: I1003 14:23:50.483542 4962 generic.go:334] "Generic (PLEG): container finished" podID="44a69f85-585f-430d-bd92-311a41410a8b" containerID="373228e79da2587786749685118a5a45e03f6a2255d81587683b66d99706a139" exitCode=0 Oct 03 14:23:50 crc kubenswrapper[4962]: I1003 14:23:50.483653 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-qcmsk" event={"ID":"44a69f85-585f-430d-bd92-311a41410a8b","Type":"ContainerDied","Data":"373228e79da2587786749685118a5a45e03f6a2255d81587683b66d99706a139"} Oct 03 14:23:51 crc kubenswrapper[4962]: I1003 14:23:51.876293 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-qcmsk" Oct 03 14:23:51 crc kubenswrapper[4962]: I1003 14:23:51.999442 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/44a69f85-585f-430d-bd92-311a41410a8b-config\") pod \"44a69f85-585f-430d-bd92-311a41410a8b\" (UID: \"44a69f85-585f-430d-bd92-311a41410a8b\") " Oct 03 14:23:51 crc kubenswrapper[4962]: I1003 14:23:51.999490 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjf57\" (UniqueName: \"kubernetes.io/projected/44a69f85-585f-430d-bd92-311a41410a8b-kube-api-access-wjf57\") pod \"44a69f85-585f-430d-bd92-311a41410a8b\" (UID: \"44a69f85-585f-430d-bd92-311a41410a8b\") " Oct 03 14:23:51 crc kubenswrapper[4962]: I1003 14:23:51.999740 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44a69f85-585f-430d-bd92-311a41410a8b-combined-ca-bundle\") pod \"44a69f85-585f-430d-bd92-311a41410a8b\" (UID: \"44a69f85-585f-430d-bd92-311a41410a8b\") " Oct 03 14:23:52 crc kubenswrapper[4962]: I1003 14:23:52.027117 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44a69f85-585f-430d-bd92-311a41410a8b-config" (OuterVolumeSpecName: "config") pod "44a69f85-585f-430d-bd92-311a41410a8b" (UID: "44a69f85-585f-430d-bd92-311a41410a8b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:23:52 crc kubenswrapper[4962]: I1003 14:23:52.029921 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44a69f85-585f-430d-bd92-311a41410a8b-kube-api-access-wjf57" (OuterVolumeSpecName: "kube-api-access-wjf57") pod "44a69f85-585f-430d-bd92-311a41410a8b" (UID: "44a69f85-585f-430d-bd92-311a41410a8b"). InnerVolumeSpecName "kube-api-access-wjf57". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:23:52 crc kubenswrapper[4962]: I1003 14:23:52.039437 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44a69f85-585f-430d-bd92-311a41410a8b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "44a69f85-585f-430d-bd92-311a41410a8b" (UID: "44a69f85-585f-430d-bd92-311a41410a8b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:23:52 crc kubenswrapper[4962]: I1003 14:23:52.102318 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44a69f85-585f-430d-bd92-311a41410a8b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:52 crc kubenswrapper[4962]: I1003 14:23:52.102381 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/44a69f85-585f-430d-bd92-311a41410a8b-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:52 crc kubenswrapper[4962]: I1003 14:23:52.102394 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjf57\" (UniqueName: \"kubernetes.io/projected/44a69f85-585f-430d-bd92-311a41410a8b-kube-api-access-wjf57\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:52 crc kubenswrapper[4962]: I1003 14:23:52.501594 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-qcmsk" event={"ID":"44a69f85-585f-430d-bd92-311a41410a8b","Type":"ContainerDied","Data":"22132341e1499fc1ab8deb28256e527c6011bcbdea45ec705c423b4d3147dfc3"} Oct 03 14:23:52 crc kubenswrapper[4962]: I1003 14:23:52.501934 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22132341e1499fc1ab8deb28256e527c6011bcbdea45ec705c423b4d3147dfc3" Oct 03 14:23:52 crc kubenswrapper[4962]: I1003 14:23:52.501673 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-qcmsk" Oct 03 14:23:52 crc kubenswrapper[4962]: I1003 14:23:52.702530 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-646c67b865-cs96q"] Oct 03 14:23:52 crc kubenswrapper[4962]: E1003 14:23:52.702966 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44a69f85-585f-430d-bd92-311a41410a8b" containerName="neutron-db-sync" Oct 03 14:23:52 crc kubenswrapper[4962]: I1003 14:23:52.702984 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="44a69f85-585f-430d-bd92-311a41410a8b" containerName="neutron-db-sync" Oct 03 14:23:52 crc kubenswrapper[4962]: I1003 14:23:52.703212 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="44a69f85-585f-430d-bd92-311a41410a8b" containerName="neutron-db-sync" Oct 03 14:23:52 crc kubenswrapper[4962]: I1003 14:23:52.704181 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-646c67b865-cs96q" Oct 03 14:23:52 crc kubenswrapper[4962]: I1003 14:23:52.708937 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-646c67b865-cs96q"] Oct 03 14:23:52 crc kubenswrapper[4962]: I1003 14:23:52.715977 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6bf5469677-7cw9c"] Oct 03 14:23:52 crc kubenswrapper[4962]: I1003 14:23:52.718211 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6bf5469677-7cw9c" Oct 03 14:23:52 crc kubenswrapper[4962]: I1003 14:23:52.720103 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-w4fpr" Oct 03 14:23:52 crc kubenswrapper[4962]: I1003 14:23:52.720602 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 03 14:23:52 crc kubenswrapper[4962]: I1003 14:23:52.720754 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 03 14:23:52 crc kubenswrapper[4962]: I1003 14:23:52.724355 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6bf5469677-7cw9c"] Oct 03 14:23:52 crc kubenswrapper[4962]: I1003 14:23:52.812819 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx6pf\" (UniqueName: \"kubernetes.io/projected/66f387f8-5749-4493-9748-a9f0bb6352c1-kube-api-access-vx6pf\") pod \"dnsmasq-dns-646c67b865-cs96q\" (UID: \"66f387f8-5749-4493-9748-a9f0bb6352c1\") " pod="openstack/dnsmasq-dns-646c67b865-cs96q" Oct 03 14:23:52 crc kubenswrapper[4962]: I1003 14:23:52.812889 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fd0baaee-01a4-4b74-8b39-cbcc5aaba7c3-httpd-config\") pod \"neutron-6bf5469677-7cw9c\" (UID: \"fd0baaee-01a4-4b74-8b39-cbcc5aaba7c3\") " pod="openstack/neutron-6bf5469677-7cw9c" Oct 03 14:23:52 crc kubenswrapper[4962]: I1003 14:23:52.812926 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66f387f8-5749-4493-9748-a9f0bb6352c1-ovsdbserver-nb\") pod \"dnsmasq-dns-646c67b865-cs96q\" (UID: \"66f387f8-5749-4493-9748-a9f0bb6352c1\") " pod="openstack/dnsmasq-dns-646c67b865-cs96q" Oct 03 14:23:52 crc kubenswrapper[4962]: I1003 14:23:52.812975 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fd0baaee-01a4-4b74-8b39-cbcc5aaba7c3-config\") pod \"neutron-6bf5469677-7cw9c\" (UID: \"fd0baaee-01a4-4b74-8b39-cbcc5aaba7c3\") " pod="openstack/neutron-6bf5469677-7cw9c" Oct 03 14:23:52 crc kubenswrapper[4962]: I1003 14:23:52.813055 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd0baaee-01a4-4b74-8b39-cbcc5aaba7c3-combined-ca-bundle\") pod \"neutron-6bf5469677-7cw9c\" (UID: \"fd0baaee-01a4-4b74-8b39-cbcc5aaba7c3\") " pod="openstack/neutron-6bf5469677-7cw9c" Oct 03 14:23:52 crc kubenswrapper[4962]: I1003 14:23:52.813095 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95wtx\" (UniqueName: \"kubernetes.io/projected/fd0baaee-01a4-4b74-8b39-cbcc5aaba7c3-kube-api-access-95wtx\") pod \"neutron-6bf5469677-7cw9c\" (UID: \"fd0baaee-01a4-4b74-8b39-cbcc5aaba7c3\") " pod="openstack/neutron-6bf5469677-7cw9c" Oct 03 14:23:52 crc kubenswrapper[4962]: I1003 14:23:52.813117 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66f387f8-5749-4493-9748-a9f0bb6352c1-config\") pod \"dnsmasq-dns-646c67b865-cs96q\" (UID: \"66f387f8-5749-4493-9748-a9f0bb6352c1\") " pod="openstack/dnsmasq-dns-646c67b865-cs96q" Oct 03 
14:23:52 crc kubenswrapper[4962]: I1003 14:23:52.813345 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66f387f8-5749-4493-9748-a9f0bb6352c1-ovsdbserver-sb\") pod \"dnsmasq-dns-646c67b865-cs96q\" (UID: \"66f387f8-5749-4493-9748-a9f0bb6352c1\") " pod="openstack/dnsmasq-dns-646c67b865-cs96q" Oct 03 14:23:52 crc kubenswrapper[4962]: I1003 14:23:52.813451 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66f387f8-5749-4493-9748-a9f0bb6352c1-dns-svc\") pod \"dnsmasq-dns-646c67b865-cs96q\" (UID: \"66f387f8-5749-4493-9748-a9f0bb6352c1\") " pod="openstack/dnsmasq-dns-646c67b865-cs96q" Oct 03 14:23:52 crc kubenswrapper[4962]: I1003 14:23:52.914744 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd0baaee-01a4-4b74-8b39-cbcc5aaba7c3-combined-ca-bundle\") pod \"neutron-6bf5469677-7cw9c\" (UID: \"fd0baaee-01a4-4b74-8b39-cbcc5aaba7c3\") " pod="openstack/neutron-6bf5469677-7cw9c" Oct 03 14:23:52 crc kubenswrapper[4962]: I1003 14:23:52.914798 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95wtx\" (UniqueName: \"kubernetes.io/projected/fd0baaee-01a4-4b74-8b39-cbcc5aaba7c3-kube-api-access-95wtx\") pod \"neutron-6bf5469677-7cw9c\" (UID: \"fd0baaee-01a4-4b74-8b39-cbcc5aaba7c3\") " pod="openstack/neutron-6bf5469677-7cw9c" Oct 03 14:23:52 crc kubenswrapper[4962]: I1003 14:23:52.914824 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66f387f8-5749-4493-9748-a9f0bb6352c1-config\") pod \"dnsmasq-dns-646c67b865-cs96q\" (UID: \"66f387f8-5749-4493-9748-a9f0bb6352c1\") " pod="openstack/dnsmasq-dns-646c67b865-cs96q" Oct 03 14:23:52 crc kubenswrapper[4962]: I1003 14:23:52.914888 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66f387f8-5749-4493-9748-a9f0bb6352c1-ovsdbserver-sb\") pod \"dnsmasq-dns-646c67b865-cs96q\" (UID: \"66f387f8-5749-4493-9748-a9f0bb6352c1\") " pod="openstack/dnsmasq-dns-646c67b865-cs96q" Oct 03 14:23:52 crc kubenswrapper[4962]: I1003 14:23:52.914927 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66f387f8-5749-4493-9748-a9f0bb6352c1-dns-svc\") pod \"dnsmasq-dns-646c67b865-cs96q\" (UID: \"66f387f8-5749-4493-9748-a9f0bb6352c1\") " pod="openstack/dnsmasq-dns-646c67b865-cs96q" Oct 03 14:23:52 crc kubenswrapper[4962]: I1003 14:23:52.914975 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx6pf\" (UniqueName: \"kubernetes.io/projected/66f387f8-5749-4493-9748-a9f0bb6352c1-kube-api-access-vx6pf\") pod \"dnsmasq-dns-646c67b865-cs96q\" (UID: \"66f387f8-5749-4493-9748-a9f0bb6352c1\") " pod="openstack/dnsmasq-dns-646c67b865-cs96q" Oct 03 14:23:52 crc kubenswrapper[4962]: I1003 14:23:52.915015 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fd0baaee-01a4-4b74-8b39-cbcc5aaba7c3-httpd-config\") pod \"neutron-6bf5469677-7cw9c\" (UID: \"fd0baaee-01a4-4b74-8b39-cbcc5aaba7c3\") " pod="openstack/neutron-6bf5469677-7cw9c" Oct 03 14:23:52 crc kubenswrapper[4962]: I1003 14:23:52.915042 
4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66f387f8-5749-4493-9748-a9f0bb6352c1-ovsdbserver-nb\") pod \"dnsmasq-dns-646c67b865-cs96q\" (UID: \"66f387f8-5749-4493-9748-a9f0bb6352c1\") " pod="openstack/dnsmasq-dns-646c67b865-cs96q" Oct 03 14:23:52 crc kubenswrapper[4962]: I1003 14:23:52.915105 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fd0baaee-01a4-4b74-8b39-cbcc5aaba7c3-config\") pod \"neutron-6bf5469677-7cw9c\" (UID: \"fd0baaee-01a4-4b74-8b39-cbcc5aaba7c3\") " pod="openstack/neutron-6bf5469677-7cw9c" Oct 03 14:23:52 crc kubenswrapper[4962]: I1003 14:23:52.915775 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66f387f8-5749-4493-9748-a9f0bb6352c1-ovsdbserver-sb\") pod \"dnsmasq-dns-646c67b865-cs96q\" (UID: \"66f387f8-5749-4493-9748-a9f0bb6352c1\") " pod="openstack/dnsmasq-dns-646c67b865-cs96q" Oct 03 14:23:52 crc kubenswrapper[4962]: I1003 14:23:52.915863 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66f387f8-5749-4493-9748-a9f0bb6352c1-config\") pod \"dnsmasq-dns-646c67b865-cs96q\" (UID: \"66f387f8-5749-4493-9748-a9f0bb6352c1\") " pod="openstack/dnsmasq-dns-646c67b865-cs96q" Oct 03 14:23:52 crc kubenswrapper[4962]: I1003 14:23:52.916204 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66f387f8-5749-4493-9748-a9f0bb6352c1-dns-svc\") pod \"dnsmasq-dns-646c67b865-cs96q\" (UID: \"66f387f8-5749-4493-9748-a9f0bb6352c1\") " pod="openstack/dnsmasq-dns-646c67b865-cs96q" Oct 03 14:23:52 crc kubenswrapper[4962]: I1003 14:23:52.916464 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66f387f8-5749-4493-9748-a9f0bb6352c1-ovsdbserver-nb\") pod \"dnsmasq-dns-646c67b865-cs96q\" (UID: \"66f387f8-5749-4493-9748-a9f0bb6352c1\") " pod="openstack/dnsmasq-dns-646c67b865-cs96q" Oct 03 14:23:52 crc kubenswrapper[4962]: I1003 14:23:52.922425 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/fd0baaee-01a4-4b74-8b39-cbcc5aaba7c3-config\") pod \"neutron-6bf5469677-7cw9c\" (UID: \"fd0baaee-01a4-4b74-8b39-cbcc5aaba7c3\") " pod="openstack/neutron-6bf5469677-7cw9c" Oct 03 14:23:52 crc kubenswrapper[4962]: I1003 14:23:52.927295 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fd0baaee-01a4-4b74-8b39-cbcc5aaba7c3-httpd-config\") pod \"neutron-6bf5469677-7cw9c\" (UID: \"fd0baaee-01a4-4b74-8b39-cbcc5aaba7c3\") " pod="openstack/neutron-6bf5469677-7cw9c" Oct 03 14:23:52 crc kubenswrapper[4962]: I1003 14:23:52.927609 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd0baaee-01a4-4b74-8b39-cbcc5aaba7c3-combined-ca-bundle\") pod \"neutron-6bf5469677-7cw9c\" (UID: \"fd0baaee-01a4-4b74-8b39-cbcc5aaba7c3\") " pod="openstack/neutron-6bf5469677-7cw9c" Oct 03 14:23:52 crc kubenswrapper[4962]: I1003 14:23:52.933765 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95wtx\" (UniqueName: \"kubernetes.io/projected/fd0baaee-01a4-4b74-8b39-cbcc5aaba7c3-kube-api-access-95wtx\") pod 
\"neutron-6bf5469677-7cw9c\" (UID: \"fd0baaee-01a4-4b74-8b39-cbcc5aaba7c3\") " pod="openstack/neutron-6bf5469677-7cw9c" Oct 03 14:23:52 crc kubenswrapper[4962]: I1003 14:23:52.936398 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx6pf\" (UniqueName: \"kubernetes.io/projected/66f387f8-5749-4493-9748-a9f0bb6352c1-kube-api-access-vx6pf\") pod \"dnsmasq-dns-646c67b865-cs96q\" (UID: \"66f387f8-5749-4493-9748-a9f0bb6352c1\") " pod="openstack/dnsmasq-dns-646c67b865-cs96q" Oct 03 14:23:53 crc kubenswrapper[4962]: I1003 14:23:53.022038 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-646c67b865-cs96q" Oct 03 14:23:53 crc kubenswrapper[4962]: I1003 14:23:53.043284 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6bf5469677-7cw9c" Oct 03 14:23:53 crc kubenswrapper[4962]: I1003 14:23:53.519757 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-646c67b865-cs96q"] Oct 03 14:23:53 crc kubenswrapper[4962]: W1003 14:23:53.525937 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66f387f8_5749_4493_9748_a9f0bb6352c1.slice/crio-06529b70d0ece10f5cb79fd0a25b594045ca7e9e4f7817f2a03374f5c320b6a9 WatchSource:0}: Error finding container 06529b70d0ece10f5cb79fd0a25b594045ca7e9e4f7817f2a03374f5c320b6a9: Status 404 returned error can't find the container with id 06529b70d0ece10f5cb79fd0a25b594045ca7e9e4f7817f2a03374f5c320b6a9 Oct 03 14:23:53 crc kubenswrapper[4962]: I1003 14:23:53.708841 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6bf5469677-7cw9c"] Oct 03 14:23:54 crc kubenswrapper[4962]: I1003 14:23:54.520063 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bf5469677-7cw9c" event={"ID":"fd0baaee-01a4-4b74-8b39-cbcc5aaba7c3","Type":"ContainerStarted","Data":"e513bc70b6eca647eb94eb97e8593769c422a57a2c9d70fb199a68ac8295e23b"} Oct 03 14:23:54 crc kubenswrapper[4962]: I1003 14:23:54.520322 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bf5469677-7cw9c" event={"ID":"fd0baaee-01a4-4b74-8b39-cbcc5aaba7c3","Type":"ContainerStarted","Data":"12eae4beaa9cc84f2ae95f99f420e426a236398ade909ac853decc501bb9e854"} Oct 03 14:23:54 crc kubenswrapper[4962]: I1003 14:23:54.520335 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bf5469677-7cw9c" event={"ID":"fd0baaee-01a4-4b74-8b39-cbcc5aaba7c3","Type":"ContainerStarted","Data":"1728673703cbf02b845108366c76f781f06140805d4be0894f90b50228de9b0b"} Oct 03 14:23:54 crc kubenswrapper[4962]: I1003 14:23:54.520384 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6bf5469677-7cw9c" Oct 03 14:23:54 crc kubenswrapper[4962]: I1003 14:23:54.526391 4962 generic.go:334] "Generic (PLEG): container finished" podID="66f387f8-5749-4493-9748-a9f0bb6352c1" containerID="6c9eb6dbc628a16b1eacd5760adbced702cc1745bf43b5743d6a45db956a4918" exitCode=0 Oct 03 14:23:54 crc kubenswrapper[4962]: I1003 14:23:54.526468 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-646c67b865-cs96q" event={"ID":"66f387f8-5749-4493-9748-a9f0bb6352c1","Type":"ContainerDied","Data":"6c9eb6dbc628a16b1eacd5760adbced702cc1745bf43b5743d6a45db956a4918"} Oct 03 14:23:54 crc kubenswrapper[4962]: I1003 14:23:54.526730 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-646c67b865-cs96q" event={"ID":"66f387f8-5749-4493-9748-a9f0bb6352c1","Type":"ContainerStarted","Data":"06529b70d0ece10f5cb79fd0a25b594045ca7e9e4f7817f2a03374f5c320b6a9"} Oct 03 14:23:54 crc kubenswrapper[4962]: I1003 14:23:54.555970 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6bf5469677-7cw9c" podStartSLOduration=2.555945039 podStartE2EDuration="2.555945039s" podCreationTimestamp="2025-10-03 14:23:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:23:54.544549611 +0000 UTC m=+5642.948447446" watchObservedRunningTime="2025-10-03 14:23:54.555945039 +0000 UTC m=+5642.959842874" Oct 03 14:23:55 crc kubenswrapper[4962]: I1003 14:23:55.535790 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-646c67b865-cs96q" event={"ID":"66f387f8-5749-4493-9748-a9f0bb6352c1","Type":"ContainerStarted","Data":"df56b8c4b474556c28208370edde8c85271a606b06ed92f520673dce62c32042"} Oct 03 14:23:55 crc kubenswrapper[4962]: I1003 14:23:55.560132 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-646c67b865-cs96q" podStartSLOduration=3.560114336 podStartE2EDuration="3.560114336s" podCreationTimestamp="2025-10-03 14:23:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:23:55.554803867 +0000 UTC m=+5643.958701712" watchObservedRunningTime="2025-10-03 14:23:55.560114336 +0000 UTC m=+5643.964012171" Oct 03 14:23:56 crc kubenswrapper[4962]: I1003 14:23:56.544035 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-646c67b865-cs96q" Oct 03 14:23:57 crc kubenswrapper[4962]: E1003 14:23:57.472951 4962 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44a69f85_585f_430d_bd92_311a41410a8b.slice\": RecentStats: unable to find data in memory cache]" Oct 03 14:24:03 crc kubenswrapper[4962]: I1003 14:24:03.023814 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-646c67b865-cs96q" Oct 03 14:24:03 crc kubenswrapper[4962]: I1003 14:24:03.070892 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6874dd89cc-gzw44"] Oct 03 14:24:03 crc kubenswrapper[4962]: I1003 14:24:03.071162 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6874dd89cc-gzw44" podUID="8d3f07b8-8ccc-42e1-a570-e9924e18d67a" containerName="dnsmasq-dns" containerID="cri-o://dc23d3aedf1b9deaa79c7b82720ee2bd432d71729accacac5487fd22560678e2" gracePeriod=10 Oct 03 14:24:03 crc kubenswrapper[4962]: I1003 14:24:03.565284 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6874dd89cc-gzw44" Oct 03 14:24:03 crc kubenswrapper[4962]: I1003 14:24:03.594910 4962 generic.go:334] "Generic (PLEG): container finished" podID="8d3f07b8-8ccc-42e1-a570-e9924e18d67a" containerID="dc23d3aedf1b9deaa79c7b82720ee2bd432d71729accacac5487fd22560678e2" exitCode=0 Oct 03 14:24:03 crc kubenswrapper[4962]: I1003 14:24:03.594959 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6874dd89cc-gzw44" event={"ID":"8d3f07b8-8ccc-42e1-a570-e9924e18d67a","Type":"ContainerDied","Data":"dc23d3aedf1b9deaa79c7b82720ee2bd432d71729accacac5487fd22560678e2"} Oct 03 14:24:03 crc kubenswrapper[4962]: I1003 14:24:03.594992 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6874dd89cc-gzw44" event={"ID":"8d3f07b8-8ccc-42e1-a570-e9924e18d67a","Type":"ContainerDied","Data":"a383b7260d000de32474d993725678155052f2ead3609f6ed0ca3587a59d351f"} Oct 03 14:24:03 crc kubenswrapper[4962]: I1003 14:24:03.595013 4962 scope.go:117] "RemoveContainer" containerID="dc23d3aedf1b9deaa79c7b82720ee2bd432d71729accacac5487fd22560678e2" Oct 03 14:24:03 crc kubenswrapper[4962]: I1003 14:24:03.595007 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6874dd89cc-gzw44" Oct 03 14:24:03 crc kubenswrapper[4962]: I1003 14:24:03.622179 4962 scope.go:117] "RemoveContainer" containerID="d4f4082f870708a2fcc272fa0b1e796578d12133405040ea55c777e70faddeb2" Oct 03 14:24:03 crc kubenswrapper[4962]: I1003 14:24:03.641236 4962 scope.go:117] "RemoveContainer" containerID="dc23d3aedf1b9deaa79c7b82720ee2bd432d71729accacac5487fd22560678e2" Oct 03 14:24:03 crc kubenswrapper[4962]: E1003 14:24:03.641825 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc23d3aedf1b9deaa79c7b82720ee2bd432d71729accacac5487fd22560678e2\": container with ID starting with dc23d3aedf1b9deaa79c7b82720ee2bd432d71729accacac5487fd22560678e2 not found: ID does not exist" containerID="dc23d3aedf1b9deaa79c7b82720ee2bd432d71729accacac5487fd22560678e2" Oct 03 14:24:03 crc kubenswrapper[4962]: I1003 14:24:03.641863 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc23d3aedf1b9deaa79c7b82720ee2bd432d71729accacac5487fd22560678e2"} err="failed to get container status \"dc23d3aedf1b9deaa79c7b82720ee2bd432d71729accacac5487fd22560678e2\": rpc error: code = NotFound desc = could not find container \"dc23d3aedf1b9deaa79c7b82720ee2bd432d71729accacac5487fd22560678e2\": container with ID starting with dc23d3aedf1b9deaa79c7b82720ee2bd432d71729accacac5487fd22560678e2 not found: ID does not exist" Oct 03 14:24:03 crc kubenswrapper[4962]: I1003 14:24:03.641888 4962 scope.go:117] "RemoveContainer" containerID="d4f4082f870708a2fcc272fa0b1e796578d12133405040ea55c777e70faddeb2" Oct 03 14:24:03 crc kubenswrapper[4962]: E1003 14:24:03.642294 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4f4082f870708a2fcc272fa0b1e796578d12133405040ea55c777e70faddeb2\": container with ID starting with d4f4082f870708a2fcc272fa0b1e796578d12133405040ea55c777e70faddeb2 not found: ID does not exist" containerID="d4f4082f870708a2fcc272fa0b1e796578d12133405040ea55c777e70faddeb2" Oct 03 14:24:03 crc kubenswrapper[4962]: I1003 14:24:03.642322 4962 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d4f4082f870708a2fcc272fa0b1e796578d12133405040ea55c777e70faddeb2"} err="failed to get container status \"d4f4082f870708a2fcc272fa0b1e796578d12133405040ea55c777e70faddeb2\": rpc error: code = NotFound desc = could not find container \"d4f4082f870708a2fcc272fa0b1e796578d12133405040ea55c777e70faddeb2\": container with ID starting with d4f4082f870708a2fcc272fa0b1e796578d12133405040ea55c777e70faddeb2 not found: ID does not exist" Oct 03 14:24:03 crc kubenswrapper[4962]: I1003 14:24:03.712350 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d3f07b8-8ccc-42e1-a570-e9924e18d67a-config\") pod \"8d3f07b8-8ccc-42e1-a570-e9924e18d67a\" (UID: \"8d3f07b8-8ccc-42e1-a570-e9924e18d67a\") " Oct 03 14:24:03 crc kubenswrapper[4962]: I1003 14:24:03.712482 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8d3f07b8-8ccc-42e1-a570-e9924e18d67a-ovsdbserver-nb\") pod \"8d3f07b8-8ccc-42e1-a570-e9924e18d67a\" (UID: \"8d3f07b8-8ccc-42e1-a570-e9924e18d67a\") " Oct 03 14:24:03 crc kubenswrapper[4962]: I1003 14:24:03.712546 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d3f07b8-8ccc-42e1-a570-e9924e18d67a-dns-svc\") pod \"8d3f07b8-8ccc-42e1-a570-e9924e18d67a\" (UID: \"8d3f07b8-8ccc-42e1-a570-e9924e18d67a\") " Oct 03 14:24:03 crc kubenswrapper[4962]: I1003 14:24:03.712577 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8d3f07b8-8ccc-42e1-a570-e9924e18d67a-ovsdbserver-sb\") pod \"8d3f07b8-8ccc-42e1-a570-e9924e18d67a\" (UID: \"8d3f07b8-8ccc-42e1-a570-e9924e18d67a\") " Oct 03 14:24:03 crc kubenswrapper[4962]: I1003 14:24:03.712615 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqws9\" (UniqueName: \"kubernetes.io/projected/8d3f07b8-8ccc-42e1-a570-e9924e18d67a-kube-api-access-cqws9\") pod \"8d3f07b8-8ccc-42e1-a570-e9924e18d67a\" (UID: \"8d3f07b8-8ccc-42e1-a570-e9924e18d67a\") " Oct 03 14:24:03 crc kubenswrapper[4962]: I1003 14:24:03.718125 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d3f07b8-8ccc-42e1-a570-e9924e18d67a-kube-api-access-cqws9" (OuterVolumeSpecName: "kube-api-access-cqws9") pod "8d3f07b8-8ccc-42e1-a570-e9924e18d67a" (UID: "8d3f07b8-8ccc-42e1-a570-e9924e18d67a"). InnerVolumeSpecName "kube-api-access-cqws9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:24:03 crc kubenswrapper[4962]: I1003 14:24:03.754919 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d3f07b8-8ccc-42e1-a570-e9924e18d67a-config" (OuterVolumeSpecName: "config") pod "8d3f07b8-8ccc-42e1-a570-e9924e18d67a" (UID: "8d3f07b8-8ccc-42e1-a570-e9924e18d67a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:24:03 crc kubenswrapper[4962]: I1003 14:24:03.761208 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d3f07b8-8ccc-42e1-a570-e9924e18d67a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8d3f07b8-8ccc-42e1-a570-e9924e18d67a" (UID: "8d3f07b8-8ccc-42e1-a570-e9924e18d67a"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:24:03 crc kubenswrapper[4962]: I1003 14:24:03.769721 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d3f07b8-8ccc-42e1-a570-e9924e18d67a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8d3f07b8-8ccc-42e1-a570-e9924e18d67a" (UID: "8d3f07b8-8ccc-42e1-a570-e9924e18d67a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:24:03 crc kubenswrapper[4962]: I1003 14:24:03.773679 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d3f07b8-8ccc-42e1-a570-e9924e18d67a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8d3f07b8-8ccc-42e1-a570-e9924e18d67a" (UID: "8d3f07b8-8ccc-42e1-a570-e9924e18d67a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:24:03 crc kubenswrapper[4962]: I1003 14:24:03.814921 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqws9\" (UniqueName: \"kubernetes.io/projected/8d3f07b8-8ccc-42e1-a570-e9924e18d67a-kube-api-access-cqws9\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:03 crc kubenswrapper[4962]: I1003 14:24:03.814963 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d3f07b8-8ccc-42e1-a570-e9924e18d67a-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:03 crc kubenswrapper[4962]: I1003 14:24:03.814973 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8d3f07b8-8ccc-42e1-a570-e9924e18d67a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:03 crc kubenswrapper[4962]: I1003 14:24:03.814981 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d3f07b8-8ccc-42e1-a570-e9924e18d67a-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:03 crc kubenswrapper[4962]: I1003 14:24:03.814991 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8d3f07b8-8ccc-42e1-a570-e9924e18d67a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:03 crc kubenswrapper[4962]: I1003 14:24:03.944962 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6874dd89cc-gzw44"] Oct 03 14:24:03 crc kubenswrapper[4962]: I1003 14:24:03.954375 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6874dd89cc-gzw44"] Oct 03 14:24:04 crc kubenswrapper[4962]: I1003 14:24:04.236359 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d3f07b8-8ccc-42e1-a570-e9924e18d67a" path="/var/lib/kubelet/pods/8d3f07b8-8ccc-42e1-a570-e9924e18d67a/volumes" Oct 03 14:24:07 crc kubenswrapper[4962]: E1003 14:24:07.690842 4962 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44a69f85_585f_430d_bd92_311a41410a8b.slice\": RecentStats: unable to find data in memory cache]" Oct 03 14:24:17 crc kubenswrapper[4962]: E1003 14:24:17.896458 4962 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44a69f85_585f_430d_bd92_311a41410a8b.slice\": RecentStats: unable to find data in memory cache]" Oct 03 14:24:23 crc 
Oct 03 14:24:23 crc kubenswrapper[4962]: I1003 14:24:23.052297 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6bf5469677-7cw9c"
Oct 03 14:24:28 crc kubenswrapper[4962]: E1003 14:24:28.088732 4962 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44a69f85_585f_430d_bd92_311a41410a8b.slice\": RecentStats: unable to find data in memory cache]"
Oct 03 14:24:30 crc kubenswrapper[4962]: I1003 14:24:30.597938 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-xr8j4"]
Oct 03 14:24:30 crc kubenswrapper[4962]: E1003 14:24:30.599418 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d3f07b8-8ccc-42e1-a570-e9924e18d67a" containerName="init"
Oct 03 14:24:30 crc kubenswrapper[4962]: I1003 14:24:30.599495 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d3f07b8-8ccc-42e1-a570-e9924e18d67a" containerName="init"
Oct 03 14:24:30 crc kubenswrapper[4962]: E1003 14:24:30.599577 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d3f07b8-8ccc-42e1-a570-e9924e18d67a" containerName="dnsmasq-dns"
Oct 03 14:24:30 crc kubenswrapper[4962]: I1003 14:24:30.599652 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d3f07b8-8ccc-42e1-a570-e9924e18d67a" containerName="dnsmasq-dns"
Oct 03 14:24:30 crc kubenswrapper[4962]: I1003 14:24:30.599886 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d3f07b8-8ccc-42e1-a570-e9924e18d67a" containerName="dnsmasq-dns"
Oct 03 14:24:30 crc kubenswrapper[4962]: I1003 14:24:30.600512 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-xr8j4"
Oct 03 14:24:30 crc kubenswrapper[4962]: I1003 14:24:30.606534 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-xr8j4"]
Oct 03 14:24:30 crc kubenswrapper[4962]: I1003 14:24:30.673257 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm6jq\" (UniqueName: \"kubernetes.io/projected/d35a75ca-500f-46c6-b933-02e980a88fe5-kube-api-access-pm6jq\") pod \"glance-db-create-xr8j4\" (UID: \"d35a75ca-500f-46c6-b933-02e980a88fe5\") " pod="openstack/glance-db-create-xr8j4"
Oct 03 14:24:30 crc kubenswrapper[4962]: I1003 14:24:30.775037 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm6jq\" (UniqueName: \"kubernetes.io/projected/d35a75ca-500f-46c6-b933-02e980a88fe5-kube-api-access-pm6jq\") pod \"glance-db-create-xr8j4\" (UID: \"d35a75ca-500f-46c6-b933-02e980a88fe5\") " pod="openstack/glance-db-create-xr8j4"
Oct 03 14:24:30 crc kubenswrapper[4962]: I1003 14:24:30.795337 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm6jq\" (UniqueName: \"kubernetes.io/projected/d35a75ca-500f-46c6-b933-02e980a88fe5-kube-api-access-pm6jq\") pod \"glance-db-create-xr8j4\" (UID: \"d35a75ca-500f-46c6-b933-02e980a88fe5\") " pod="openstack/glance-db-create-xr8j4"
Oct 03 14:24:30 crc kubenswrapper[4962]: I1003 14:24:30.920055 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-xr8j4"
Oct 03 14:24:31 crc kubenswrapper[4962]: I1003 14:24:31.402712 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-xr8j4"]
Oct 03 14:24:31 crc kubenswrapper[4962]: I1003 14:24:31.813061 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-xr8j4" event={"ID":"d35a75ca-500f-46c6-b933-02e980a88fe5","Type":"ContainerStarted","Data":"69643303cefdd941ae859ff917a25e89598c7a5baa3be5eb0c60c200b92e8c4d"}
Oct 03 14:24:31 crc kubenswrapper[4962]: I1003 14:24:31.813097 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-xr8j4" event={"ID":"d35a75ca-500f-46c6-b933-02e980a88fe5","Type":"ContainerStarted","Data":"15210a05a020532c56f5ff0da63e2a428c12ee91a79202e174d07a15087d2a96"}
Oct 03 14:24:31 crc kubenswrapper[4962]: I1003 14:24:31.831556 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-xr8j4" podStartSLOduration=1.83153465 podStartE2EDuration="1.83153465s" podCreationTimestamp="2025-10-03 14:24:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:24:31.828652785 +0000 UTC m=+5680.232550630" watchObservedRunningTime="2025-10-03 14:24:31.83153465 +0000 UTC m=+5680.235432485"
Oct 03 14:24:32 crc kubenswrapper[4962]: I1003 14:24:32.820725 4962 generic.go:334] "Generic (PLEG): container finished" podID="d35a75ca-500f-46c6-b933-02e980a88fe5" containerID="69643303cefdd941ae859ff917a25e89598c7a5baa3be5eb0c60c200b92e8c4d" exitCode=0
Oct 03 14:24:32 crc kubenswrapper[4962]: I1003 14:24:32.820984 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-xr8j4" event={"ID":"d35a75ca-500f-46c6-b933-02e980a88fe5","Type":"ContainerDied","Data":"69643303cefdd941ae859ff917a25e89598c7a5baa3be5eb0c60c200b92e8c4d"}
Oct 03 14:24:34 crc kubenswrapper[4962]: I1003 14:24:34.222465 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-xr8j4"
Oct 03 14:24:34 crc kubenswrapper[4962]: I1003 14:24:34.335478 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pm6jq\" (UniqueName: \"kubernetes.io/projected/d35a75ca-500f-46c6-b933-02e980a88fe5-kube-api-access-pm6jq\") pod \"d35a75ca-500f-46c6-b933-02e980a88fe5\" (UID: \"d35a75ca-500f-46c6-b933-02e980a88fe5\") "
Oct 03 14:24:34 crc kubenswrapper[4962]: I1003 14:24:34.346036 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d35a75ca-500f-46c6-b933-02e980a88fe5-kube-api-access-pm6jq" (OuterVolumeSpecName: "kube-api-access-pm6jq") pod "d35a75ca-500f-46c6-b933-02e980a88fe5" (UID: "d35a75ca-500f-46c6-b933-02e980a88fe5"). InnerVolumeSpecName "kube-api-access-pm6jq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:24:34 crc kubenswrapper[4962]: I1003 14:24:34.438470 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pm6jq\" (UniqueName: \"kubernetes.io/projected/d35a75ca-500f-46c6-b933-02e980a88fe5-kube-api-access-pm6jq\") on node \"crc\" DevicePath \"\""
Oct 03 14:24:34 crc kubenswrapper[4962]: I1003 14:24:34.836901 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-xr8j4" event={"ID":"d35a75ca-500f-46c6-b933-02e980a88fe5","Type":"ContainerDied","Data":"15210a05a020532c56f5ff0da63e2a428c12ee91a79202e174d07a15087d2a96"}
Oct 03 14:24:34 crc kubenswrapper[4962]: I1003 14:24:34.836941 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15210a05a020532c56f5ff0da63e2a428c12ee91a79202e174d07a15087d2a96"
Oct 03 14:24:34 crc kubenswrapper[4962]: I1003 14:24:34.836966 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-xr8j4"
Oct 03 14:24:38 crc kubenswrapper[4962]: E1003 14:24:38.324168 4962 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44a69f85_585f_430d_bd92_311a41410a8b.slice\": RecentStats: unable to find data in memory cache]"
Oct 03 14:24:40 crc kubenswrapper[4962]: I1003 14:24:40.726183 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-4674-account-create-g99l4"]
Oct 03 14:24:40 crc kubenswrapper[4962]: E1003 14:24:40.727124 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d35a75ca-500f-46c6-b933-02e980a88fe5" containerName="mariadb-database-create"
Oct 03 14:24:40 crc kubenswrapper[4962]: I1003 14:24:40.727148 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="d35a75ca-500f-46c6-b933-02e980a88fe5" containerName="mariadb-database-create"
Oct 03 14:24:40 crc kubenswrapper[4962]: I1003 14:24:40.727476 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="d35a75ca-500f-46c6-b933-02e980a88fe5" containerName="mariadb-database-create"
Oct 03 14:24:40 crc kubenswrapper[4962]: I1003 14:24:40.728296 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-4674-account-create-g99l4"
Need to start a new one" pod="openstack/glance-4674-account-create-g99l4" Oct 03 14:24:40 crc kubenswrapper[4962]: I1003 14:24:40.731264 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 03 14:24:40 crc kubenswrapper[4962]: I1003 14:24:40.734820 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-4674-account-create-g99l4"] Oct 03 14:24:40 crc kubenswrapper[4962]: I1003 14:24:40.753599 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh8pn\" (UniqueName: \"kubernetes.io/projected/db99eb0e-0fce-44df-8c81-19936bc939e1-kube-api-access-sh8pn\") pod \"glance-4674-account-create-g99l4\" (UID: \"db99eb0e-0fce-44df-8c81-19936bc939e1\") " pod="openstack/glance-4674-account-create-g99l4" Oct 03 14:24:40 crc kubenswrapper[4962]: I1003 14:24:40.855964 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh8pn\" (UniqueName: \"kubernetes.io/projected/db99eb0e-0fce-44df-8c81-19936bc939e1-kube-api-access-sh8pn\") pod \"glance-4674-account-create-g99l4\" (UID: \"db99eb0e-0fce-44df-8c81-19936bc939e1\") " pod="openstack/glance-4674-account-create-g99l4" Oct 03 14:24:40 crc kubenswrapper[4962]: I1003 14:24:40.878631 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh8pn\" (UniqueName: \"kubernetes.io/projected/db99eb0e-0fce-44df-8c81-19936bc939e1-kube-api-access-sh8pn\") pod \"glance-4674-account-create-g99l4\" (UID: \"db99eb0e-0fce-44df-8c81-19936bc939e1\") " pod="openstack/glance-4674-account-create-g99l4" Oct 03 14:24:41 crc kubenswrapper[4962]: I1003 14:24:41.048341 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-4674-account-create-g99l4" Oct 03 14:24:41 crc kubenswrapper[4962]: I1003 14:24:41.485223 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-4674-account-create-g99l4"] Oct 03 14:24:41 crc kubenswrapper[4962]: I1003 14:24:41.888214 4962 generic.go:334] "Generic (PLEG): container finished" podID="db99eb0e-0fce-44df-8c81-19936bc939e1" containerID="2f5b5a163b992f0535467c25207302cf628d40694517c4508c3788dbf65d42d9" exitCode=0 Oct 03 14:24:41 crc kubenswrapper[4962]: I1003 14:24:41.888302 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4674-account-create-g99l4" event={"ID":"db99eb0e-0fce-44df-8c81-19936bc939e1","Type":"ContainerDied","Data":"2f5b5a163b992f0535467c25207302cf628d40694517c4508c3788dbf65d42d9"} Oct 03 14:24:41 crc kubenswrapper[4962]: I1003 14:24:41.888328 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4674-account-create-g99l4" event={"ID":"db99eb0e-0fce-44df-8c81-19936bc939e1","Type":"ContainerStarted","Data":"162955f28d3a71e1a8cf9bdff7ad01e4d11b15edf3aab48b4a284c50d72c7a85"} Oct 03 14:24:43 crc kubenswrapper[4962]: I1003 14:24:43.329747 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-4674-account-create-g99l4" Oct 03 14:24:43 crc kubenswrapper[4962]: I1003 14:24:43.495450 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sh8pn\" (UniqueName: \"kubernetes.io/projected/db99eb0e-0fce-44df-8c81-19936bc939e1-kube-api-access-sh8pn\") pod \"db99eb0e-0fce-44df-8c81-19936bc939e1\" (UID: \"db99eb0e-0fce-44df-8c81-19936bc939e1\") " Oct 03 14:24:43 crc kubenswrapper[4962]: I1003 14:24:43.501950 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db99eb0e-0fce-44df-8c81-19936bc939e1-kube-api-access-sh8pn" (OuterVolumeSpecName: "kube-api-access-sh8pn") pod "db99eb0e-0fce-44df-8c81-19936bc939e1" (UID: "db99eb0e-0fce-44df-8c81-19936bc939e1"). InnerVolumeSpecName "kube-api-access-sh8pn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:24:43 crc kubenswrapper[4962]: I1003 14:24:43.597493 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sh8pn\" (UniqueName: \"kubernetes.io/projected/db99eb0e-0fce-44df-8c81-19936bc939e1-kube-api-access-sh8pn\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:43 crc kubenswrapper[4962]: I1003 14:24:43.909774 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4674-account-create-g99l4" event={"ID":"db99eb0e-0fce-44df-8c81-19936bc939e1","Type":"ContainerDied","Data":"162955f28d3a71e1a8cf9bdff7ad01e4d11b15edf3aab48b4a284c50d72c7a85"} Oct 03 14:24:43 crc kubenswrapper[4962]: I1003 14:24:43.910111 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="162955f28d3a71e1a8cf9bdff7ad01e4d11b15edf3aab48b4a284c50d72c7a85" Oct 03 14:24:43 crc kubenswrapper[4962]: I1003 14:24:43.909906 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-4674-account-create-g99l4" Oct 03 14:24:45 crc kubenswrapper[4962]: I1003 14:24:45.856805 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-lblt6"] Oct 03 14:24:45 crc kubenswrapper[4962]: E1003 14:24:45.857306 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db99eb0e-0fce-44df-8c81-19936bc939e1" containerName="mariadb-account-create" Oct 03 14:24:45 crc kubenswrapper[4962]: I1003 14:24:45.857323 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="db99eb0e-0fce-44df-8c81-19936bc939e1" containerName="mariadb-account-create" Oct 03 14:24:45 crc kubenswrapper[4962]: I1003 14:24:45.857540 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="db99eb0e-0fce-44df-8c81-19936bc939e1" containerName="mariadb-account-create" Oct 03 14:24:45 crc kubenswrapper[4962]: I1003 14:24:45.858212 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-lblt6" Oct 03 14:24:45 crc kubenswrapper[4962]: I1003 14:24:45.861076 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-4pr5m" Oct 03 14:24:45 crc kubenswrapper[4962]: I1003 14:24:45.867298 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 03 14:24:45 crc kubenswrapper[4962]: I1003 14:24:45.915349 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-lblt6"] Oct 03 14:24:46 crc kubenswrapper[4962]: I1003 14:24:46.037227 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa6b2f6a-d9bd-456a-9842-b3975ea2a778-config-data\") pod \"glance-db-sync-lblt6\" (UID: \"fa6b2f6a-d9bd-456a-9842-b3975ea2a778\") " pod="openstack/glance-db-sync-lblt6" Oct 03 14:24:46 crc kubenswrapper[4962]: I1003 14:24:46.037348 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9b6v\" (UniqueName: \"kubernetes.io/projected/fa6b2f6a-d9bd-456a-9842-b3975ea2a778-kube-api-access-p9b6v\") pod \"glance-db-sync-lblt6\" (UID: \"fa6b2f6a-d9bd-456a-9842-b3975ea2a778\") " pod="openstack/glance-db-sync-lblt6" Oct 03 14:24:46 crc kubenswrapper[4962]: I1003 14:24:46.037726 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fa6b2f6a-d9bd-456a-9842-b3975ea2a778-db-sync-config-data\") pod \"glance-db-sync-lblt6\" (UID: \"fa6b2f6a-d9bd-456a-9842-b3975ea2a778\") " pod="openstack/glance-db-sync-lblt6" Oct 03 14:24:46 crc kubenswrapper[4962]: I1003 14:24:46.037877 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa6b2f6a-d9bd-456a-9842-b3975ea2a778-combined-ca-bundle\") pod \"glance-db-sync-lblt6\" (UID: \"fa6b2f6a-d9bd-456a-9842-b3975ea2a778\") " pod="openstack/glance-db-sync-lblt6" Oct 03 14:24:46 crc kubenswrapper[4962]: I1003 14:24:46.140357 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9b6v\" (UniqueName: \"kubernetes.io/projected/fa6b2f6a-d9bd-456a-9842-b3975ea2a778-kube-api-access-p9b6v\") pod \"glance-db-sync-lblt6\" (UID: \"fa6b2f6a-d9bd-456a-9842-b3975ea2a778\") " pod="openstack/glance-db-sync-lblt6" Oct 03 14:24:46 crc kubenswrapper[4962]: I1003 14:24:46.140570 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fa6b2f6a-d9bd-456a-9842-b3975ea2a778-db-sync-config-data\") pod \"glance-db-sync-lblt6\" (UID: \"fa6b2f6a-d9bd-456a-9842-b3975ea2a778\") " pod="openstack/glance-db-sync-lblt6" Oct 03 14:24:46 crc kubenswrapper[4962]: I1003 14:24:46.140629 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa6b2f6a-d9bd-456a-9842-b3975ea2a778-combined-ca-bundle\") pod \"glance-db-sync-lblt6\" (UID: \"fa6b2f6a-d9bd-456a-9842-b3975ea2a778\") " pod="openstack/glance-db-sync-lblt6" Oct 03 14:24:46 crc kubenswrapper[4962]: I1003 14:24:46.140707 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa6b2f6a-d9bd-456a-9842-b3975ea2a778-config-data\") pod 
\"glance-db-sync-lblt6\" (UID: \"fa6b2f6a-d9bd-456a-9842-b3975ea2a778\") " pod="openstack/glance-db-sync-lblt6" Oct 03 14:24:46 crc kubenswrapper[4962]: I1003 14:24:46.147988 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fa6b2f6a-d9bd-456a-9842-b3975ea2a778-db-sync-config-data\") pod \"glance-db-sync-lblt6\" (UID: \"fa6b2f6a-d9bd-456a-9842-b3975ea2a778\") " pod="openstack/glance-db-sync-lblt6" Oct 03 14:24:46 crc kubenswrapper[4962]: I1003 14:24:46.158771 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa6b2f6a-d9bd-456a-9842-b3975ea2a778-config-data\") pod \"glance-db-sync-lblt6\" (UID: \"fa6b2f6a-d9bd-456a-9842-b3975ea2a778\") " pod="openstack/glance-db-sync-lblt6" Oct 03 14:24:46 crc kubenswrapper[4962]: I1003 14:24:46.168523 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9b6v\" (UniqueName: \"kubernetes.io/projected/fa6b2f6a-d9bd-456a-9842-b3975ea2a778-kube-api-access-p9b6v\") pod \"glance-db-sync-lblt6\" (UID: \"fa6b2f6a-d9bd-456a-9842-b3975ea2a778\") " pod="openstack/glance-db-sync-lblt6" Oct 03 14:24:46 crc kubenswrapper[4962]: I1003 14:24:46.189583 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa6b2f6a-d9bd-456a-9842-b3975ea2a778-combined-ca-bundle\") pod \"glance-db-sync-lblt6\" (UID: \"fa6b2f6a-d9bd-456a-9842-b3975ea2a778\") " pod="openstack/glance-db-sync-lblt6" Oct 03 14:24:46 crc kubenswrapper[4962]: I1003 14:24:46.211802 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-lblt6" Oct 03 14:24:46 crc kubenswrapper[4962]: I1003 14:24:46.819021 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-lblt6"] Oct 03 14:24:46 crc kubenswrapper[4962]: I1003 14:24:46.939541 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-lblt6" event={"ID":"fa6b2f6a-d9bd-456a-9842-b3975ea2a778","Type":"ContainerStarted","Data":"0871e6c1cf78c9479da26a3c883dd6d87167c51c77edd185e3d027e6461fd0ee"} Oct 03 14:24:47 crc kubenswrapper[4962]: I1003 14:24:47.949651 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-lblt6" event={"ID":"fa6b2f6a-d9bd-456a-9842-b3975ea2a778","Type":"ContainerStarted","Data":"9e9853734eb445883ad10f8e551909f5dd2d34ba9ed862ce3c6bca2cc1514f1b"} Oct 03 14:24:47 crc kubenswrapper[4962]: I1003 14:24:47.964872 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-lblt6" podStartSLOduration=2.964855723 podStartE2EDuration="2.964855723s" podCreationTimestamp="2025-10-03 14:24:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:24:47.960833837 +0000 UTC m=+5696.364731682" watchObservedRunningTime="2025-10-03 14:24:47.964855723 +0000 UTC m=+5696.368753558" Oct 03 14:24:48 crc kubenswrapper[4962]: E1003 14:24:48.555795 4962 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44a69f85_585f_430d_bd92_311a41410a8b.slice\": RecentStats: unable to find data in memory cache]" Oct 03 14:24:51 crc kubenswrapper[4962]: I1003 14:24:51.986044 4962 generic.go:334] "Generic (PLEG): container 
finished" podID="fa6b2f6a-d9bd-456a-9842-b3975ea2a778" containerID="9e9853734eb445883ad10f8e551909f5dd2d34ba9ed862ce3c6bca2cc1514f1b" exitCode=0 Oct 03 14:24:51 crc kubenswrapper[4962]: I1003 14:24:51.986161 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-lblt6" event={"ID":"fa6b2f6a-d9bd-456a-9842-b3975ea2a778","Type":"ContainerDied","Data":"9e9853734eb445883ad10f8e551909f5dd2d34ba9ed862ce3c6bca2cc1514f1b"} Oct 03 14:24:53 crc kubenswrapper[4962]: I1003 14:24:53.340353 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-lblt6" Oct 03 14:24:53 crc kubenswrapper[4962]: I1003 14:24:53.482931 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fa6b2f6a-d9bd-456a-9842-b3975ea2a778-db-sync-config-data\") pod \"fa6b2f6a-d9bd-456a-9842-b3975ea2a778\" (UID: \"fa6b2f6a-d9bd-456a-9842-b3975ea2a778\") " Oct 03 14:24:53 crc kubenswrapper[4962]: I1003 14:24:53.483092 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa6b2f6a-d9bd-456a-9842-b3975ea2a778-combined-ca-bundle\") pod \"fa6b2f6a-d9bd-456a-9842-b3975ea2a778\" (UID: \"fa6b2f6a-d9bd-456a-9842-b3975ea2a778\") " Oct 03 14:24:53 crc kubenswrapper[4962]: I1003 14:24:53.483172 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa6b2f6a-d9bd-456a-9842-b3975ea2a778-config-data\") pod \"fa6b2f6a-d9bd-456a-9842-b3975ea2a778\" (UID: \"fa6b2f6a-d9bd-456a-9842-b3975ea2a778\") " Oct 03 14:24:53 crc kubenswrapper[4962]: I1003 14:24:53.483258 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9b6v\" (UniqueName: \"kubernetes.io/projected/fa6b2f6a-d9bd-456a-9842-b3975ea2a778-kube-api-access-p9b6v\") pod \"fa6b2f6a-d9bd-456a-9842-b3975ea2a778\" (UID: \"fa6b2f6a-d9bd-456a-9842-b3975ea2a778\") " Oct 03 14:24:53 crc kubenswrapper[4962]: I1003 14:24:53.488860 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa6b2f6a-d9bd-456a-9842-b3975ea2a778-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "fa6b2f6a-d9bd-456a-9842-b3975ea2a778" (UID: "fa6b2f6a-d9bd-456a-9842-b3975ea2a778"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:24:53 crc kubenswrapper[4962]: I1003 14:24:53.490884 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa6b2f6a-d9bd-456a-9842-b3975ea2a778-kube-api-access-p9b6v" (OuterVolumeSpecName: "kube-api-access-p9b6v") pod "fa6b2f6a-d9bd-456a-9842-b3975ea2a778" (UID: "fa6b2f6a-d9bd-456a-9842-b3975ea2a778"). InnerVolumeSpecName "kube-api-access-p9b6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:24:53 crc kubenswrapper[4962]: I1003 14:24:53.506554 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa6b2f6a-d9bd-456a-9842-b3975ea2a778-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fa6b2f6a-d9bd-456a-9842-b3975ea2a778" (UID: "fa6b2f6a-d9bd-456a-9842-b3975ea2a778"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:24:53 crc kubenswrapper[4962]: I1003 14:24:53.526207 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa6b2f6a-d9bd-456a-9842-b3975ea2a778-config-data" (OuterVolumeSpecName: "config-data") pod "fa6b2f6a-d9bd-456a-9842-b3975ea2a778" (UID: "fa6b2f6a-d9bd-456a-9842-b3975ea2a778"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:24:53 crc kubenswrapper[4962]: I1003 14:24:53.584864 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa6b2f6a-d9bd-456a-9842-b3975ea2a778-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:53 crc kubenswrapper[4962]: I1003 14:24:53.584902 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9b6v\" (UniqueName: \"kubernetes.io/projected/fa6b2f6a-d9bd-456a-9842-b3975ea2a778-kube-api-access-p9b6v\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:53 crc kubenswrapper[4962]: I1003 14:24:53.584912 4962 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fa6b2f6a-d9bd-456a-9842-b3975ea2a778-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:53 crc kubenswrapper[4962]: I1003 14:24:53.584922 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa6b2f6a-d9bd-456a-9842-b3975ea2a778-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.000500 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-lblt6" event={"ID":"fa6b2f6a-d9bd-456a-9842-b3975ea2a778","Type":"ContainerDied","Data":"0871e6c1cf78c9479da26a3c883dd6d87167c51c77edd185e3d027e6461fd0ee"} Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.000794 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0871e6c1cf78c9479da26a3c883dd6d87167c51c77edd185e3d027e6461fd0ee" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.000538 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-lblt6" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.289530 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 14:24:54 crc kubenswrapper[4962]: E1003 14:24:54.290046 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa6b2f6a-d9bd-456a-9842-b3975ea2a778" containerName="glance-db-sync" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.290063 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa6b2f6a-d9bd-456a-9842-b3975ea2a778" containerName="glance-db-sync" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.290286 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa6b2f6a-d9bd-456a-9842-b3975ea2a778" containerName="glance-db-sync" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.291870 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.294810 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.295012 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-4pr5m" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.295237 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.297275 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.314672 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.403102 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7j86\" (UniqueName: \"kubernetes.io/projected/eaeeaea5-32cf-42bb-97cf-10c8fea2e026-kube-api-access-f7j86\") pod \"glance-default-external-api-0\" (UID: \"eaeeaea5-32cf-42bb-97cf-10c8fea2e026\") " pod="openstack/glance-default-external-api-0" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.403173 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaeeaea5-32cf-42bb-97cf-10c8fea2e026-config-data\") pod \"glance-default-external-api-0\" (UID: \"eaeeaea5-32cf-42bb-97cf-10c8fea2e026\") " pod="openstack/glance-default-external-api-0" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.403203 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/eaeeaea5-32cf-42bb-97cf-10c8fea2e026-ceph\") pod \"glance-default-external-api-0\" (UID: \"eaeeaea5-32cf-42bb-97cf-10c8fea2e026\") " pod="openstack/glance-default-external-api-0" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.403267 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eaeeaea5-32cf-42bb-97cf-10c8fea2e026-logs\") pod \"glance-default-external-api-0\" (UID: \"eaeeaea5-32cf-42bb-97cf-10c8fea2e026\") " pod="openstack/glance-default-external-api-0" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.403318 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eaeeaea5-32cf-42bb-97cf-10c8fea2e026-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"eaeeaea5-32cf-42bb-97cf-10c8fea2e026\") " pod="openstack/glance-default-external-api-0" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.403366 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaeeaea5-32cf-42bb-97cf-10c8fea2e026-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"eaeeaea5-32cf-42bb-97cf-10c8fea2e026\") " pod="openstack/glance-default-external-api-0" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.403408 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/eaeeaea5-32cf-42bb-97cf-10c8fea2e026-scripts\") pod \"glance-default-external-api-0\" (UID: \"eaeeaea5-32cf-42bb-97cf-10c8fea2e026\") " pod="openstack/glance-default-external-api-0" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.418392 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f5596d8dc-rzq76"] Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.445341 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f5596d8dc-rzq76" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.457611 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f5596d8dc-rzq76"] Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.480993 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.482798 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.485994 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.512219 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.518322 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eaeeaea5-32cf-42bb-97cf-10c8fea2e026-logs\") pod \"glance-default-external-api-0\" (UID: \"eaeeaea5-32cf-42bb-97cf-10c8fea2e026\") " pod="openstack/glance-default-external-api-0" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.518385 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7034962f-7ee1-44f1-8dba-bae0bdb8911d-ovsdbserver-nb\") pod \"dnsmasq-dns-6f5596d8dc-rzq76\" (UID: \"7034962f-7ee1-44f1-8dba-bae0bdb8911d\") " pod="openstack/dnsmasq-dns-6f5596d8dc-rzq76" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.518491 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eaeeaea5-32cf-42bb-97cf-10c8fea2e026-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"eaeeaea5-32cf-42bb-97cf-10c8fea2e026\") " pod="openstack/glance-default-external-api-0" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.518576 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7034962f-7ee1-44f1-8dba-bae0bdb8911d-ovsdbserver-sb\") pod \"dnsmasq-dns-6f5596d8dc-rzq76\" (UID: \"7034962f-7ee1-44f1-8dba-bae0bdb8911d\") " pod="openstack/dnsmasq-dns-6f5596d8dc-rzq76" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.518602 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7034962f-7ee1-44f1-8dba-bae0bdb8911d-config\") pod \"dnsmasq-dns-6f5596d8dc-rzq76\" (UID: \"7034962f-7ee1-44f1-8dba-bae0bdb8911d\") " pod="openstack/dnsmasq-dns-6f5596d8dc-rzq76" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.518749 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/7034962f-7ee1-44f1-8dba-bae0bdb8911d-dns-svc\") pod \"dnsmasq-dns-6f5596d8dc-rzq76\" (UID: \"7034962f-7ee1-44f1-8dba-bae0bdb8911d\") " pod="openstack/dnsmasq-dns-6f5596d8dc-rzq76" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.518775 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmg4d\" (UniqueName: \"kubernetes.io/projected/7034962f-7ee1-44f1-8dba-bae0bdb8911d-kube-api-access-bmg4d\") pod \"dnsmasq-dns-6f5596d8dc-rzq76\" (UID: \"7034962f-7ee1-44f1-8dba-bae0bdb8911d\") " pod="openstack/dnsmasq-dns-6f5596d8dc-rzq76" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.518809 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaeeaea5-32cf-42bb-97cf-10c8fea2e026-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"eaeeaea5-32cf-42bb-97cf-10c8fea2e026\") " pod="openstack/glance-default-external-api-0" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.518863 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eaeeaea5-32cf-42bb-97cf-10c8fea2e026-scripts\") pod \"glance-default-external-api-0\" (UID: \"eaeeaea5-32cf-42bb-97cf-10c8fea2e026\") " pod="openstack/glance-default-external-api-0" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.518920 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7j86\" (UniqueName: \"kubernetes.io/projected/eaeeaea5-32cf-42bb-97cf-10c8fea2e026-kube-api-access-f7j86\") pod \"glance-default-external-api-0\" (UID: \"eaeeaea5-32cf-42bb-97cf-10c8fea2e026\") " pod="openstack/glance-default-external-api-0" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.518936 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eaeeaea5-32cf-42bb-97cf-10c8fea2e026-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"eaeeaea5-32cf-42bb-97cf-10c8fea2e026\") " pod="openstack/glance-default-external-api-0" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.518976 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaeeaea5-32cf-42bb-97cf-10c8fea2e026-config-data\") pod \"glance-default-external-api-0\" (UID: \"eaeeaea5-32cf-42bb-97cf-10c8fea2e026\") " pod="openstack/glance-default-external-api-0" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.519023 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/eaeeaea5-32cf-42bb-97cf-10c8fea2e026-ceph\") pod \"glance-default-external-api-0\" (UID: \"eaeeaea5-32cf-42bb-97cf-10c8fea2e026\") " pod="openstack/glance-default-external-api-0" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.519761 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eaeeaea5-32cf-42bb-97cf-10c8fea2e026-logs\") pod \"glance-default-external-api-0\" (UID: \"eaeeaea5-32cf-42bb-97cf-10c8fea2e026\") " pod="openstack/glance-default-external-api-0" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.529932 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/eaeeaea5-32cf-42bb-97cf-10c8fea2e026-ceph\") pod 
\"glance-default-external-api-0\" (UID: \"eaeeaea5-32cf-42bb-97cf-10c8fea2e026\") " pod="openstack/glance-default-external-api-0" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.530700 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaeeaea5-32cf-42bb-97cf-10c8fea2e026-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"eaeeaea5-32cf-42bb-97cf-10c8fea2e026\") " pod="openstack/glance-default-external-api-0" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.531603 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaeeaea5-32cf-42bb-97cf-10c8fea2e026-config-data\") pod \"glance-default-external-api-0\" (UID: \"eaeeaea5-32cf-42bb-97cf-10c8fea2e026\") " pod="openstack/glance-default-external-api-0" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.544120 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eaeeaea5-32cf-42bb-97cf-10c8fea2e026-scripts\") pod \"glance-default-external-api-0\" (UID: \"eaeeaea5-32cf-42bb-97cf-10c8fea2e026\") " pod="openstack/glance-default-external-api-0" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.544818 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7j86\" (UniqueName: \"kubernetes.io/projected/eaeeaea5-32cf-42bb-97cf-10c8fea2e026-kube-api-access-f7j86\") pod \"glance-default-external-api-0\" (UID: \"eaeeaea5-32cf-42bb-97cf-10c8fea2e026\") " pod="openstack/glance-default-external-api-0" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.620337 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1beb556-4304-426c-abe8-0cdd5982362f-logs\") pod \"glance-default-internal-api-0\" (UID: \"a1beb556-4304-426c-abe8-0cdd5982362f\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.620693 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a1beb556-4304-426c-abe8-0cdd5982362f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a1beb556-4304-426c-abe8-0cdd5982362f\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.620754 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1beb556-4304-426c-abe8-0cdd5982362f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a1beb556-4304-426c-abe8-0cdd5982362f\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.620786 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1beb556-4304-426c-abe8-0cdd5982362f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a1beb556-4304-426c-abe8-0cdd5982362f\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.620820 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7034962f-7ee1-44f1-8dba-bae0bdb8911d-ovsdbserver-nb\") pod \"dnsmasq-dns-6f5596d8dc-rzq76\" (UID: 
\"7034962f-7ee1-44f1-8dba-bae0bdb8911d\") " pod="openstack/dnsmasq-dns-6f5596d8dc-rzq76" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.620865 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a1beb556-4304-426c-abe8-0cdd5982362f-ceph\") pod \"glance-default-internal-api-0\" (UID: \"a1beb556-4304-426c-abe8-0cdd5982362f\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.620887 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1beb556-4304-426c-abe8-0cdd5982362f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a1beb556-4304-426c-abe8-0cdd5982362f\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.620909 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7034962f-7ee1-44f1-8dba-bae0bdb8911d-ovsdbserver-sb\") pod \"dnsmasq-dns-6f5596d8dc-rzq76\" (UID: \"7034962f-7ee1-44f1-8dba-bae0bdb8911d\") " pod="openstack/dnsmasq-dns-6f5596d8dc-rzq76" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.620975 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7034962f-7ee1-44f1-8dba-bae0bdb8911d-config\") pod \"dnsmasq-dns-6f5596d8dc-rzq76\" (UID: \"7034962f-7ee1-44f1-8dba-bae0bdb8911d\") " pod="openstack/dnsmasq-dns-6f5596d8dc-rzq76" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.621044 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wws2q\" (UniqueName: \"kubernetes.io/projected/a1beb556-4304-426c-abe8-0cdd5982362f-kube-api-access-wws2q\") pod \"glance-default-internal-api-0\" (UID: \"a1beb556-4304-426c-abe8-0cdd5982362f\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.621072 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7034962f-7ee1-44f1-8dba-bae0bdb8911d-dns-svc\") pod \"dnsmasq-dns-6f5596d8dc-rzq76\" (UID: \"7034962f-7ee1-44f1-8dba-bae0bdb8911d\") " pod="openstack/dnsmasq-dns-6f5596d8dc-rzq76" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.621095 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmg4d\" (UniqueName: \"kubernetes.io/projected/7034962f-7ee1-44f1-8dba-bae0bdb8911d-kube-api-access-bmg4d\") pod \"dnsmasq-dns-6f5596d8dc-rzq76\" (UID: \"7034962f-7ee1-44f1-8dba-bae0bdb8911d\") " pod="openstack/dnsmasq-dns-6f5596d8dc-rzq76" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.621749 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7034962f-7ee1-44f1-8dba-bae0bdb8911d-ovsdbserver-nb\") pod \"dnsmasq-dns-6f5596d8dc-rzq76\" (UID: \"7034962f-7ee1-44f1-8dba-bae0bdb8911d\") " pod="openstack/dnsmasq-dns-6f5596d8dc-rzq76" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.621784 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7034962f-7ee1-44f1-8dba-bae0bdb8911d-ovsdbserver-sb\") pod \"dnsmasq-dns-6f5596d8dc-rzq76\" (UID: 
\"7034962f-7ee1-44f1-8dba-bae0bdb8911d\") " pod="openstack/dnsmasq-dns-6f5596d8dc-rzq76" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.622047 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7034962f-7ee1-44f1-8dba-bae0bdb8911d-config\") pod \"dnsmasq-dns-6f5596d8dc-rzq76\" (UID: \"7034962f-7ee1-44f1-8dba-bae0bdb8911d\") " pod="openstack/dnsmasq-dns-6f5596d8dc-rzq76" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.622239 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7034962f-7ee1-44f1-8dba-bae0bdb8911d-dns-svc\") pod \"dnsmasq-dns-6f5596d8dc-rzq76\" (UID: \"7034962f-7ee1-44f1-8dba-bae0bdb8911d\") " pod="openstack/dnsmasq-dns-6f5596d8dc-rzq76" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.629078 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.644857 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmg4d\" (UniqueName: \"kubernetes.io/projected/7034962f-7ee1-44f1-8dba-bae0bdb8911d-kube-api-access-bmg4d\") pod \"dnsmasq-dns-6f5596d8dc-rzq76\" (UID: \"7034962f-7ee1-44f1-8dba-bae0bdb8911d\") " pod="openstack/dnsmasq-dns-6f5596d8dc-rzq76" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.723248 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1beb556-4304-426c-abe8-0cdd5982362f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a1beb556-4304-426c-abe8-0cdd5982362f\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.723327 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1beb556-4304-426c-abe8-0cdd5982362f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a1beb556-4304-426c-abe8-0cdd5982362f\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.723400 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a1beb556-4304-426c-abe8-0cdd5982362f-ceph\") pod \"glance-default-internal-api-0\" (UID: \"a1beb556-4304-426c-abe8-0cdd5982362f\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.723423 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1beb556-4304-426c-abe8-0cdd5982362f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a1beb556-4304-426c-abe8-0cdd5982362f\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.723458 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wws2q\" (UniqueName: \"kubernetes.io/projected/a1beb556-4304-426c-abe8-0cdd5982362f-kube-api-access-wws2q\") pod \"glance-default-internal-api-0\" (UID: \"a1beb556-4304-426c-abe8-0cdd5982362f\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.723510 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/a1beb556-4304-426c-abe8-0cdd5982362f-logs\") pod \"glance-default-internal-api-0\" (UID: \"a1beb556-4304-426c-abe8-0cdd5982362f\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.723874 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a1beb556-4304-426c-abe8-0cdd5982362f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a1beb556-4304-426c-abe8-0cdd5982362f\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.724258 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1beb556-4304-426c-abe8-0cdd5982362f-logs\") pod \"glance-default-internal-api-0\" (UID: \"a1beb556-4304-426c-abe8-0cdd5982362f\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.724317 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a1beb556-4304-426c-abe8-0cdd5982362f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a1beb556-4304-426c-abe8-0cdd5982362f\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.727233 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1beb556-4304-426c-abe8-0cdd5982362f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a1beb556-4304-426c-abe8-0cdd5982362f\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.727785 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1beb556-4304-426c-abe8-0cdd5982362f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a1beb556-4304-426c-abe8-0cdd5982362f\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.732589 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a1beb556-4304-426c-abe8-0cdd5982362f-ceph\") pod \"glance-default-internal-api-0\" (UID: \"a1beb556-4304-426c-abe8-0cdd5982362f\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.733877 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1beb556-4304-426c-abe8-0cdd5982362f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a1beb556-4304-426c-abe8-0cdd5982362f\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.748997 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wws2q\" (UniqueName: \"kubernetes.io/projected/a1beb556-4304-426c-abe8-0cdd5982362f-kube-api-access-wws2q\") pod \"glance-default-internal-api-0\" (UID: \"a1beb556-4304-426c-abe8-0cdd5982362f\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.788765 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f5596d8dc-rzq76" Oct 03 14:24:54 crc kubenswrapper[4962]: I1003 14:24:54.818265 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 14:24:55 crc kubenswrapper[4962]: I1003 14:24:55.089595 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 14:24:55 crc kubenswrapper[4962]: I1003 14:24:55.427509 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f5596d8dc-rzq76"] Oct 03 14:24:55 crc kubenswrapper[4962]: W1003 14:24:55.436871 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7034962f_7ee1_44f1_8dba_bae0bdb8911d.slice/crio-4b96b8ccd5a25f0e6f222d354d4f49e0d79e29ea0a9c61aa76688a3ccfc1ed13 WatchSource:0}: Error finding container 4b96b8ccd5a25f0e6f222d354d4f49e0d79e29ea0a9c61aa76688a3ccfc1ed13: Status 404 returned error can't find the container with id 4b96b8ccd5a25f0e6f222d354d4f49e0d79e29ea0a9c61aa76688a3ccfc1ed13 Oct 03 14:24:55 crc kubenswrapper[4962]: I1003 14:24:55.483824 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 14:24:55 crc kubenswrapper[4962]: W1003 14:24:55.501411 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1beb556_4304_426c_abe8_0cdd5982362f.slice/crio-3c161dc7ad6d7b17e131e7b471b11ed20286f0eee926e3e8d171883e19a21873 WatchSource:0}: Error finding container 3c161dc7ad6d7b17e131e7b471b11ed20286f0eee926e3e8d171883e19a21873: Status 404 returned error can't find the container with id 3c161dc7ad6d7b17e131e7b471b11ed20286f0eee926e3e8d171883e19a21873 Oct 03 14:24:56 crc kubenswrapper[4962]: I1003 14:24:56.030870 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"eaeeaea5-32cf-42bb-97cf-10c8fea2e026","Type":"ContainerStarted","Data":"0799e6389d9cbb2284a14328b852dfdd0a91438226541ed2a91dce51a7e157a9"} Oct 03 14:24:56 crc kubenswrapper[4962]: I1003 14:24:56.031230 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"eaeeaea5-32cf-42bb-97cf-10c8fea2e026","Type":"ContainerStarted","Data":"a13b1d14b8c29b970c245ff10208b2dad13886d575c7275d14a25eb5eb654398"} Oct 03 14:24:56 crc kubenswrapper[4962]: I1003 14:24:56.035658 4962 generic.go:334] "Generic (PLEG): container finished" podID="7034962f-7ee1-44f1-8dba-bae0bdb8911d" containerID="12f3d9673259981e6776e51544fe5efb5ce1da57a8056cee9f5b866b2fef856a" exitCode=0 Oct 03 14:24:56 crc kubenswrapper[4962]: I1003 14:24:56.035727 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f5596d8dc-rzq76" event={"ID":"7034962f-7ee1-44f1-8dba-bae0bdb8911d","Type":"ContainerDied","Data":"12f3d9673259981e6776e51544fe5efb5ce1da57a8056cee9f5b866b2fef856a"} Oct 03 14:24:56 crc kubenswrapper[4962]: I1003 14:24:56.035756 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f5596d8dc-rzq76" event={"ID":"7034962f-7ee1-44f1-8dba-bae0bdb8911d","Type":"ContainerStarted","Data":"4b96b8ccd5a25f0e6f222d354d4f49e0d79e29ea0a9c61aa76688a3ccfc1ed13"} Oct 03 14:24:56 crc kubenswrapper[4962]: I1003 14:24:56.039945 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a1beb556-4304-426c-abe8-0cdd5982362f","Type":"ContainerStarted","Data":"3c161dc7ad6d7b17e131e7b471b11ed20286f0eee926e3e8d171883e19a21873"} Oct 03 14:24:56 crc kubenswrapper[4962]: I1003 14:24:56.068303 4962 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 14:24:57 crc kubenswrapper[4962]: I1003 14:24:57.049579 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"eaeeaea5-32cf-42bb-97cf-10c8fea2e026","Type":"ContainerStarted","Data":"fff28f1d59472c9b2a625a7fdeec3b415db24d7a2fa50d241fd4709342aea8e1"} Oct 03 14:24:57 crc kubenswrapper[4962]: I1003 14:24:57.049731 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="eaeeaea5-32cf-42bb-97cf-10c8fea2e026" containerName="glance-log" containerID="cri-o://0799e6389d9cbb2284a14328b852dfdd0a91438226541ed2a91dce51a7e157a9" gracePeriod=30 Oct 03 14:24:57 crc kubenswrapper[4962]: I1003 14:24:57.049759 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="eaeeaea5-32cf-42bb-97cf-10c8fea2e026" containerName="glance-httpd" containerID="cri-o://fff28f1d59472c9b2a625a7fdeec3b415db24d7a2fa50d241fd4709342aea8e1" gracePeriod=30 Oct 03 14:24:57 crc kubenswrapper[4962]: I1003 14:24:57.053327 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f5596d8dc-rzq76" event={"ID":"7034962f-7ee1-44f1-8dba-bae0bdb8911d","Type":"ContainerStarted","Data":"386b0e4df84705e33b62dcd6c26a4882c64afa61356cf7ec04674a1990d9aa7d"} Oct 03 14:24:57 crc kubenswrapper[4962]: I1003 14:24:57.053456 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6f5596d8dc-rzq76" Oct 03 14:24:57 crc kubenswrapper[4962]: I1003 14:24:57.054829 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a1beb556-4304-426c-abe8-0cdd5982362f","Type":"ContainerStarted","Data":"f8d6bf42039f7179859595429b1ac8f225e58a769696a64a061b157507a0cd55"} Oct 03 14:24:57 crc kubenswrapper[4962]: I1003 14:24:57.074505 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.074484128 podStartE2EDuration="3.074484128s" podCreationTimestamp="2025-10-03 14:24:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:24:57.069353234 +0000 UTC m=+5705.473251079" watchObservedRunningTime="2025-10-03 14:24:57.074484128 +0000 UTC m=+5705.478381963" Oct 03 14:24:57 crc kubenswrapper[4962]: I1003 14:24:57.091211 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6f5596d8dc-rzq76" podStartSLOduration=3.091186055 podStartE2EDuration="3.091186055s" podCreationTimestamp="2025-10-03 14:24:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:24:57.086311618 +0000 UTC m=+5705.490209473" watchObservedRunningTime="2025-10-03 14:24:57.091186055 +0000 UTC m=+5705.495083890" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.047330 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.075840 4962 generic.go:334] "Generic (PLEG): container finished" podID="eaeeaea5-32cf-42bb-97cf-10c8fea2e026" containerID="fff28f1d59472c9b2a625a7fdeec3b415db24d7a2fa50d241fd4709342aea8e1" exitCode=143 Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.075874 4962 generic.go:334] "Generic (PLEG): container finished" podID="eaeeaea5-32cf-42bb-97cf-10c8fea2e026" containerID="0799e6389d9cbb2284a14328b852dfdd0a91438226541ed2a91dce51a7e157a9" exitCode=143 Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.075918 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"eaeeaea5-32cf-42bb-97cf-10c8fea2e026","Type":"ContainerDied","Data":"fff28f1d59472c9b2a625a7fdeec3b415db24d7a2fa50d241fd4709342aea8e1"} Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.075946 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"eaeeaea5-32cf-42bb-97cf-10c8fea2e026","Type":"ContainerDied","Data":"0799e6389d9cbb2284a14328b852dfdd0a91438226541ed2a91dce51a7e157a9"} Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.075956 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"eaeeaea5-32cf-42bb-97cf-10c8fea2e026","Type":"ContainerDied","Data":"a13b1d14b8c29b970c245ff10208b2dad13886d575c7275d14a25eb5eb654398"} Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.075979 4962 scope.go:117] "RemoveContainer" containerID="fff28f1d59472c9b2a625a7fdeec3b415db24d7a2fa50d241fd4709342aea8e1" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.076090 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.095023 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a1beb556-4304-426c-abe8-0cdd5982362f","Type":"ContainerStarted","Data":"23add12d374e5cf2860812eac82e2a570d60305480138f54c600c570c3656104"} Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.120255 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.120233702 podStartE2EDuration="4.120233702s" podCreationTimestamp="2025-10-03 14:24:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:24:58.113125376 +0000 UTC m=+5706.517023211" watchObservedRunningTime="2025-10-03 14:24:58.120233702 +0000 UTC m=+5706.524131537" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.137582 4962 scope.go:117] "RemoveContainer" containerID="0799e6389d9cbb2284a14328b852dfdd0a91438226541ed2a91dce51a7e157a9" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.157051 4962 scope.go:117] "RemoveContainer" containerID="fff28f1d59472c9b2a625a7fdeec3b415db24d7a2fa50d241fd4709342aea8e1" Oct 03 14:24:58 crc kubenswrapper[4962]: E1003 14:24:58.157566 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fff28f1d59472c9b2a625a7fdeec3b415db24d7a2fa50d241fd4709342aea8e1\": container with ID starting with fff28f1d59472c9b2a625a7fdeec3b415db24d7a2fa50d241fd4709342aea8e1 not found: ID does not exist" containerID="fff28f1d59472c9b2a625a7fdeec3b415db24d7a2fa50d241fd4709342aea8e1" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.157617 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fff28f1d59472c9b2a625a7fdeec3b415db24d7a2fa50d241fd4709342aea8e1"} err="failed to get container status \"fff28f1d59472c9b2a625a7fdeec3b415db24d7a2fa50d241fd4709342aea8e1\": rpc error: code = NotFound desc = could not find container \"fff28f1d59472c9b2a625a7fdeec3b415db24d7a2fa50d241fd4709342aea8e1\": container with ID starting with fff28f1d59472c9b2a625a7fdeec3b415db24d7a2fa50d241fd4709342aea8e1 not found: ID does not exist" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.157670 4962 scope.go:117] "RemoveContainer" containerID="0799e6389d9cbb2284a14328b852dfdd0a91438226541ed2a91dce51a7e157a9" Oct 03 14:24:58 crc kubenswrapper[4962]: E1003 14:24:58.158072 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0799e6389d9cbb2284a14328b852dfdd0a91438226541ed2a91dce51a7e157a9\": container with ID starting with 0799e6389d9cbb2284a14328b852dfdd0a91438226541ed2a91dce51a7e157a9 not found: ID does not exist" containerID="0799e6389d9cbb2284a14328b852dfdd0a91438226541ed2a91dce51a7e157a9" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.158112 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0799e6389d9cbb2284a14328b852dfdd0a91438226541ed2a91dce51a7e157a9"} err="failed to get container status \"0799e6389d9cbb2284a14328b852dfdd0a91438226541ed2a91dce51a7e157a9\": rpc error: code = NotFound desc = could not find container \"0799e6389d9cbb2284a14328b852dfdd0a91438226541ed2a91dce51a7e157a9\": container with ID starting with 
0799e6389d9cbb2284a14328b852dfdd0a91438226541ed2a91dce51a7e157a9 not found: ID does not exist" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.158140 4962 scope.go:117] "RemoveContainer" containerID="fff28f1d59472c9b2a625a7fdeec3b415db24d7a2fa50d241fd4709342aea8e1" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.158527 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fff28f1d59472c9b2a625a7fdeec3b415db24d7a2fa50d241fd4709342aea8e1"} err="failed to get container status \"fff28f1d59472c9b2a625a7fdeec3b415db24d7a2fa50d241fd4709342aea8e1\": rpc error: code = NotFound desc = could not find container \"fff28f1d59472c9b2a625a7fdeec3b415db24d7a2fa50d241fd4709342aea8e1\": container with ID starting with fff28f1d59472c9b2a625a7fdeec3b415db24d7a2fa50d241fd4709342aea8e1 not found: ID does not exist" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.158567 4962 scope.go:117] "RemoveContainer" containerID="0799e6389d9cbb2284a14328b852dfdd0a91438226541ed2a91dce51a7e157a9" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.158943 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0799e6389d9cbb2284a14328b852dfdd0a91438226541ed2a91dce51a7e157a9"} err="failed to get container status \"0799e6389d9cbb2284a14328b852dfdd0a91438226541ed2a91dce51a7e157a9\": rpc error: code = NotFound desc = could not find container \"0799e6389d9cbb2284a14328b852dfdd0a91438226541ed2a91dce51a7e157a9\": container with ID starting with 0799e6389d9cbb2284a14328b852dfdd0a91438226541ed2a91dce51a7e157a9 not found: ID does not exist" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.191949 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eaeeaea5-32cf-42bb-97cf-10c8fea2e026-scripts\") pod \"eaeeaea5-32cf-42bb-97cf-10c8fea2e026\" (UID: \"eaeeaea5-32cf-42bb-97cf-10c8fea2e026\") " Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.192094 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaeeaea5-32cf-42bb-97cf-10c8fea2e026-config-data\") pod \"eaeeaea5-32cf-42bb-97cf-10c8fea2e026\" (UID: \"eaeeaea5-32cf-42bb-97cf-10c8fea2e026\") " Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.192241 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7j86\" (UniqueName: \"kubernetes.io/projected/eaeeaea5-32cf-42bb-97cf-10c8fea2e026-kube-api-access-f7j86\") pod \"eaeeaea5-32cf-42bb-97cf-10c8fea2e026\" (UID: \"eaeeaea5-32cf-42bb-97cf-10c8fea2e026\") " Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.192355 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eaeeaea5-32cf-42bb-97cf-10c8fea2e026-httpd-run\") pod \"eaeeaea5-32cf-42bb-97cf-10c8fea2e026\" (UID: \"eaeeaea5-32cf-42bb-97cf-10c8fea2e026\") " Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.192384 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaeeaea5-32cf-42bb-97cf-10c8fea2e026-combined-ca-bundle\") pod \"eaeeaea5-32cf-42bb-97cf-10c8fea2e026\" (UID: \"eaeeaea5-32cf-42bb-97cf-10c8fea2e026\") " Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.192412 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" 
(UniqueName: \"kubernetes.io/projected/eaeeaea5-32cf-42bb-97cf-10c8fea2e026-ceph\") pod \"eaeeaea5-32cf-42bb-97cf-10c8fea2e026\" (UID: \"eaeeaea5-32cf-42bb-97cf-10c8fea2e026\") " Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.192441 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eaeeaea5-32cf-42bb-97cf-10c8fea2e026-logs\") pod \"eaeeaea5-32cf-42bb-97cf-10c8fea2e026\" (UID: \"eaeeaea5-32cf-42bb-97cf-10c8fea2e026\") " Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.194044 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eaeeaea5-32cf-42bb-97cf-10c8fea2e026-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "eaeeaea5-32cf-42bb-97cf-10c8fea2e026" (UID: "eaeeaea5-32cf-42bb-97cf-10c8fea2e026"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.194899 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eaeeaea5-32cf-42bb-97cf-10c8fea2e026-logs" (OuterVolumeSpecName: "logs") pod "eaeeaea5-32cf-42bb-97cf-10c8fea2e026" (UID: "eaeeaea5-32cf-42bb-97cf-10c8fea2e026"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.198858 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaeeaea5-32cf-42bb-97cf-10c8fea2e026-scripts" (OuterVolumeSpecName: "scripts") pod "eaeeaea5-32cf-42bb-97cf-10c8fea2e026" (UID: "eaeeaea5-32cf-42bb-97cf-10c8fea2e026"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.198882 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaeeaea5-32cf-42bb-97cf-10c8fea2e026-ceph" (OuterVolumeSpecName: "ceph") pod "eaeeaea5-32cf-42bb-97cf-10c8fea2e026" (UID: "eaeeaea5-32cf-42bb-97cf-10c8fea2e026"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.211412 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaeeaea5-32cf-42bb-97cf-10c8fea2e026-kube-api-access-f7j86" (OuterVolumeSpecName: "kube-api-access-f7j86") pod "eaeeaea5-32cf-42bb-97cf-10c8fea2e026" (UID: "eaeeaea5-32cf-42bb-97cf-10c8fea2e026"). InnerVolumeSpecName "kube-api-access-f7j86". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.217441 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaeeaea5-32cf-42bb-97cf-10c8fea2e026-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eaeeaea5-32cf-42bb-97cf-10c8fea2e026" (UID: "eaeeaea5-32cf-42bb-97cf-10c8fea2e026"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.254712 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaeeaea5-32cf-42bb-97cf-10c8fea2e026-config-data" (OuterVolumeSpecName: "config-data") pod "eaeeaea5-32cf-42bb-97cf-10c8fea2e026" (UID: "eaeeaea5-32cf-42bb-97cf-10c8fea2e026"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.295473 4962 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eaeeaea5-32cf-42bb-97cf-10c8fea2e026-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.295537 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaeeaea5-32cf-42bb-97cf-10c8fea2e026-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.295577 4962 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/eaeeaea5-32cf-42bb-97cf-10c8fea2e026-ceph\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.295589 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eaeeaea5-32cf-42bb-97cf-10c8fea2e026-logs\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.295601 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eaeeaea5-32cf-42bb-97cf-10c8fea2e026-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.295615 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaeeaea5-32cf-42bb-97cf-10c8fea2e026-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.295686 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7j86\" (UniqueName: \"kubernetes.io/projected/eaeeaea5-32cf-42bb-97cf-10c8fea2e026-kube-api-access-f7j86\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.416399 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.423751 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.439484 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 14:24:58 crc kubenswrapper[4962]: E1003 14:24:58.439941 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaeeaea5-32cf-42bb-97cf-10c8fea2e026" containerName="glance-httpd" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.439966 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaeeaea5-32cf-42bb-97cf-10c8fea2e026" containerName="glance-httpd" Oct 03 14:24:58 crc kubenswrapper[4962]: E1003 14:24:58.440009 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaeeaea5-32cf-42bb-97cf-10c8fea2e026" containerName="glance-log" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.440018 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaeeaea5-32cf-42bb-97cf-10c8fea2e026" containerName="glance-log" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.440230 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaeeaea5-32cf-42bb-97cf-10c8fea2e026" containerName="glance-httpd" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.440256 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaeeaea5-32cf-42bb-97cf-10c8fea2e026" containerName="glance-log" Oct 03 
14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.441333 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.443891 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.455971 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.501016 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2c967c3d-1f67-42ff-9849-dcd585648bb8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2c967c3d-1f67-42ff-9849-dcd585648bb8\") " pod="openstack/glance-default-external-api-0" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.501087 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c967c3d-1f67-42ff-9849-dcd585648bb8-config-data\") pod \"glance-default-external-api-0\" (UID: \"2c967c3d-1f67-42ff-9849-dcd585648bb8\") " pod="openstack/glance-default-external-api-0" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.501151 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c967c3d-1f67-42ff-9849-dcd585648bb8-scripts\") pod \"glance-default-external-api-0\" (UID: \"2c967c3d-1f67-42ff-9849-dcd585648bb8\") " pod="openstack/glance-default-external-api-0" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.501175 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c967c3d-1f67-42ff-9849-dcd585648bb8-logs\") pod \"glance-default-external-api-0\" (UID: \"2c967c3d-1f67-42ff-9849-dcd585648bb8\") " pod="openstack/glance-default-external-api-0" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.501342 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbzrt\" (UniqueName: \"kubernetes.io/projected/2c967c3d-1f67-42ff-9849-dcd585648bb8-kube-api-access-mbzrt\") pod \"glance-default-external-api-0\" (UID: \"2c967c3d-1f67-42ff-9849-dcd585648bb8\") " pod="openstack/glance-default-external-api-0" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.501390 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c967c3d-1f67-42ff-9849-dcd585648bb8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2c967c3d-1f67-42ff-9849-dcd585648bb8\") " pod="openstack/glance-default-external-api-0" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.501443 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2c967c3d-1f67-42ff-9849-dcd585648bb8-ceph\") pod \"glance-default-external-api-0\" (UID: \"2c967c3d-1f67-42ff-9849-dcd585648bb8\") " pod="openstack/glance-default-external-api-0" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.602722 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbzrt\" (UniqueName: 
\"kubernetes.io/projected/2c967c3d-1f67-42ff-9849-dcd585648bb8-kube-api-access-mbzrt\") pod \"glance-default-external-api-0\" (UID: \"2c967c3d-1f67-42ff-9849-dcd585648bb8\") " pod="openstack/glance-default-external-api-0" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.602782 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c967c3d-1f67-42ff-9849-dcd585648bb8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2c967c3d-1f67-42ff-9849-dcd585648bb8\") " pod="openstack/glance-default-external-api-0" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.602812 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2c967c3d-1f67-42ff-9849-dcd585648bb8-ceph\") pod \"glance-default-external-api-0\" (UID: \"2c967c3d-1f67-42ff-9849-dcd585648bb8\") " pod="openstack/glance-default-external-api-0" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.602908 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2c967c3d-1f67-42ff-9849-dcd585648bb8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2c967c3d-1f67-42ff-9849-dcd585648bb8\") " pod="openstack/glance-default-external-api-0" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.602947 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c967c3d-1f67-42ff-9849-dcd585648bb8-config-data\") pod \"glance-default-external-api-0\" (UID: \"2c967c3d-1f67-42ff-9849-dcd585648bb8\") " pod="openstack/glance-default-external-api-0" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.602992 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c967c3d-1f67-42ff-9849-dcd585648bb8-scripts\") pod \"glance-default-external-api-0\" (UID: \"2c967c3d-1f67-42ff-9849-dcd585648bb8\") " pod="openstack/glance-default-external-api-0" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.603014 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c967c3d-1f67-42ff-9849-dcd585648bb8-logs\") pod \"glance-default-external-api-0\" (UID: \"2c967c3d-1f67-42ff-9849-dcd585648bb8\") " pod="openstack/glance-default-external-api-0" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.603595 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c967c3d-1f67-42ff-9849-dcd585648bb8-logs\") pod \"glance-default-external-api-0\" (UID: \"2c967c3d-1f67-42ff-9849-dcd585648bb8\") " pod="openstack/glance-default-external-api-0" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.603697 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2c967c3d-1f67-42ff-9849-dcd585648bb8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2c967c3d-1f67-42ff-9849-dcd585648bb8\") " pod="openstack/glance-default-external-api-0" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.608240 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c967c3d-1f67-42ff-9849-dcd585648bb8-config-data\") pod \"glance-default-external-api-0\" (UID: 
\"2c967c3d-1f67-42ff-9849-dcd585648bb8\") " pod="openstack/glance-default-external-api-0" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.608336 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c967c3d-1f67-42ff-9849-dcd585648bb8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2c967c3d-1f67-42ff-9849-dcd585648bb8\") " pod="openstack/glance-default-external-api-0" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.608803 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2c967c3d-1f67-42ff-9849-dcd585648bb8-ceph\") pod \"glance-default-external-api-0\" (UID: \"2c967c3d-1f67-42ff-9849-dcd585648bb8\") " pod="openstack/glance-default-external-api-0" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.614878 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c967c3d-1f67-42ff-9849-dcd585648bb8-scripts\") pod \"glance-default-external-api-0\" (UID: \"2c967c3d-1f67-42ff-9849-dcd585648bb8\") " pod="openstack/glance-default-external-api-0" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.621904 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbzrt\" (UniqueName: \"kubernetes.io/projected/2c967c3d-1f67-42ff-9849-dcd585648bb8-kube-api-access-mbzrt\") pod \"glance-default-external-api-0\" (UID: \"2c967c3d-1f67-42ff-9849-dcd585648bb8\") " pod="openstack/glance-default-external-api-0" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.763916 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-np2k7"] Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.765667 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-np2k7" Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.771003 4962 util.go:30] "No sandbox for pod can be found. 
Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.771003 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.775430 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-np2k7"]
Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.818223 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.912011 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a-utilities\") pod \"community-operators-np2k7\" (UID: \"3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a\") " pod="openshift-marketplace/community-operators-np2k7"
Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.912599 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w68h8\" (UniqueName: \"kubernetes.io/projected/3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a-kube-api-access-w68h8\") pod \"community-operators-np2k7\" (UID: \"3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a\") " pod="openshift-marketplace/community-operators-np2k7"
Oct 03 14:24:58 crc kubenswrapper[4962]: I1003 14:24:58.912625 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a-catalog-content\") pod \"community-operators-np2k7\" (UID: \"3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a\") " pod="openshift-marketplace/community-operators-np2k7"
Oct 03 14:24:59 crc kubenswrapper[4962]: I1003 14:24:59.016266 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a-utilities\") pod \"community-operators-np2k7\" (UID: \"3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a\") " pod="openshift-marketplace/community-operators-np2k7"
Oct 03 14:24:59 crc kubenswrapper[4962]: I1003 14:24:59.016409 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w68h8\" (UniqueName: \"kubernetes.io/projected/3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a-kube-api-access-w68h8\") pod \"community-operators-np2k7\" (UID: \"3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a\") " pod="openshift-marketplace/community-operators-np2k7"
Oct 03 14:24:59 crc kubenswrapper[4962]: I1003 14:24:59.016435 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a-catalog-content\") pod \"community-operators-np2k7\" (UID: \"3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a\") " pod="openshift-marketplace/community-operators-np2k7"
Oct 03 14:24:59 crc kubenswrapper[4962]: I1003 14:24:59.016955 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a-utilities\") pod \"community-operators-np2k7\" (UID: \"3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a\") " pod="openshift-marketplace/community-operators-np2k7"
Oct 03 14:24:59 crc kubenswrapper[4962]: I1003 14:24:59.017042 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a-catalog-content\") pod \"community-operators-np2k7\" (UID: \"3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a\") " pod="openshift-marketplace/community-operators-np2k7"
Oct 03 14:24:59 crc kubenswrapper[4962]: I1003 14:24:59.053065 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w68h8\" (UniqueName: \"kubernetes.io/projected/3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a-kube-api-access-w68h8\") pod \"community-operators-np2k7\" (UID: \"3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a\") " pod="openshift-marketplace/community-operators-np2k7"
Oct 03 14:24:59 crc kubenswrapper[4962]: I1003 14:24:59.084353 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-np2k7"
Oct 03 14:24:59 crc kubenswrapper[4962]: I1003 14:24:59.406692 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 03 14:24:59 crc kubenswrapper[4962]: W1003 14:24:59.408277 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c967c3d_1f67_42ff_9849_dcd585648bb8.slice/crio-5fd722d43f15ab83b3a8c49abe9530a5c27ee768fef9cc6cd388242ac41383b2 WatchSource:0}: Error finding container 5fd722d43f15ab83b3a8c49abe9530a5c27ee768fef9cc6cd388242ac41383b2: Status 404 returned error can't find the container with id 5fd722d43f15ab83b3a8c49abe9530a5c27ee768fef9cc6cd388242ac41383b2
Oct 03 14:24:59 crc kubenswrapper[4962]: I1003 14:24:59.655189 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-np2k7"]
Oct 03 14:24:59 crc kubenswrapper[4962]: W1003 14:24:59.664528 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d8a1c23_fc60_4eb3_b9a9_289ddf603d1a.slice/crio-40a9f67956718a2b6a42a165da545f13b3b0378efbd0eb77978c8f0f0c79ce3b WatchSource:0}: Error finding container 40a9f67956718a2b6a42a165da545f13b3b0378efbd0eb77978c8f0f0c79ce3b: Status 404 returned error can't find the container with id 40a9f67956718a2b6a42a165da545f13b3b0378efbd0eb77978c8f0f0c79ce3b
Oct 03 14:25:00 crc kubenswrapper[4962]: I1003 14:25:00.116729 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2c967c3d-1f67-42ff-9849-dcd585648bb8","Type":"ContainerStarted","Data":"9b7d13c58dc06972aa0112710d144a8e1c55f80e3094ec53d1dc34ed57b57252"}
Oct 03 14:25:00 crc kubenswrapper[4962]: I1003 14:25:00.117186 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2c967c3d-1f67-42ff-9849-dcd585648bb8","Type":"ContainerStarted","Data":"5fd722d43f15ab83b3a8c49abe9530a5c27ee768fef9cc6cd388242ac41383b2"}
Oct 03 14:25:00 crc kubenswrapper[4962]: I1003 14:25:00.120228 4962 generic.go:334] "Generic (PLEG): container finished" podID="3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a" containerID="271c584d5c1319fa999b3aab62f12f653174006fa309a1d3edabafc1c3451eb1" exitCode=0
Oct 03 14:25:00 crc kubenswrapper[4962]: I1003 14:25:00.120475 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a1beb556-4304-426c-abe8-0cdd5982362f" containerName="glance-log" containerID="cri-o://f8d6bf42039f7179859595429b1ac8f225e58a769696a64a061b157507a0cd55" gracePeriod=30
event={"ID":"3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a","Type":"ContainerDied","Data":"271c584d5c1319fa999b3aab62f12f653174006fa309a1d3edabafc1c3451eb1"} Oct 03 14:25:00 crc kubenswrapper[4962]: I1003 14:25:00.121336 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-np2k7" event={"ID":"3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a","Type":"ContainerStarted","Data":"40a9f67956718a2b6a42a165da545f13b3b0378efbd0eb77978c8f0f0c79ce3b"} Oct 03 14:25:00 crc kubenswrapper[4962]: I1003 14:25:00.121254 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a1beb556-4304-426c-abe8-0cdd5982362f" containerName="glance-httpd" containerID="cri-o://23add12d374e5cf2860812eac82e2a570d60305480138f54c600c570c3656104" gracePeriod=30 Oct 03 14:25:00 crc kubenswrapper[4962]: I1003 14:25:00.251404 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eaeeaea5-32cf-42bb-97cf-10c8fea2e026" path="/var/lib/kubelet/pods/eaeeaea5-32cf-42bb-97cf-10c8fea2e026/volumes" Oct 03 14:25:00 crc kubenswrapper[4962]: I1003 14:25:00.755470 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 14:25:00 crc kubenswrapper[4962]: I1003 14:25:00.856594 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a1beb556-4304-426c-abe8-0cdd5982362f-ceph\") pod \"a1beb556-4304-426c-abe8-0cdd5982362f\" (UID: \"a1beb556-4304-426c-abe8-0cdd5982362f\") " Oct 03 14:25:00 crc kubenswrapper[4962]: I1003 14:25:00.856681 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a1beb556-4304-426c-abe8-0cdd5982362f-httpd-run\") pod \"a1beb556-4304-426c-abe8-0cdd5982362f\" (UID: \"a1beb556-4304-426c-abe8-0cdd5982362f\") " Oct 03 14:25:00 crc kubenswrapper[4962]: I1003 14:25:00.856710 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1beb556-4304-426c-abe8-0cdd5982362f-combined-ca-bundle\") pod \"a1beb556-4304-426c-abe8-0cdd5982362f\" (UID: \"a1beb556-4304-426c-abe8-0cdd5982362f\") " Oct 03 14:25:00 crc kubenswrapper[4962]: I1003 14:25:00.856772 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wws2q\" (UniqueName: \"kubernetes.io/projected/a1beb556-4304-426c-abe8-0cdd5982362f-kube-api-access-wws2q\") pod \"a1beb556-4304-426c-abe8-0cdd5982362f\" (UID: \"a1beb556-4304-426c-abe8-0cdd5982362f\") " Oct 03 14:25:00 crc kubenswrapper[4962]: I1003 14:25:00.856795 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1beb556-4304-426c-abe8-0cdd5982362f-logs\") pod \"a1beb556-4304-426c-abe8-0cdd5982362f\" (UID: \"a1beb556-4304-426c-abe8-0cdd5982362f\") " Oct 03 14:25:00 crc kubenswrapper[4962]: I1003 14:25:00.856816 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1beb556-4304-426c-abe8-0cdd5982362f-scripts\") pod \"a1beb556-4304-426c-abe8-0cdd5982362f\" (UID: \"a1beb556-4304-426c-abe8-0cdd5982362f\") " Oct 03 14:25:00 crc kubenswrapper[4962]: I1003 14:25:00.856897 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a1beb556-4304-426c-abe8-0cdd5982362f-config-data\") pod \"a1beb556-4304-426c-abe8-0cdd5982362f\" (UID: \"a1beb556-4304-426c-abe8-0cdd5982362f\") " Oct 03 14:25:00 crc kubenswrapper[4962]: I1003 14:25:00.857274 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1beb556-4304-426c-abe8-0cdd5982362f-logs" (OuterVolumeSpecName: "logs") pod "a1beb556-4304-426c-abe8-0cdd5982362f" (UID: "a1beb556-4304-426c-abe8-0cdd5982362f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:25:00 crc kubenswrapper[4962]: I1003 14:25:00.857337 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1beb556-4304-426c-abe8-0cdd5982362f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a1beb556-4304-426c-abe8-0cdd5982362f" (UID: "a1beb556-4304-426c-abe8-0cdd5982362f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:25:00 crc kubenswrapper[4962]: I1003 14:25:00.860668 4962 scope.go:117] "RemoveContainer" containerID="aa293120959a35c6da56ae0dd6f9cf379fde8723b4c11f771d10e57cb7ef494f" Oct 03 14:25:00 crc kubenswrapper[4962]: I1003 14:25:00.862777 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1beb556-4304-426c-abe8-0cdd5982362f-scripts" (OuterVolumeSpecName: "scripts") pod "a1beb556-4304-426c-abe8-0cdd5982362f" (UID: "a1beb556-4304-426c-abe8-0cdd5982362f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:25:00 crc kubenswrapper[4962]: I1003 14:25:00.869817 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1beb556-4304-426c-abe8-0cdd5982362f-ceph" (OuterVolumeSpecName: "ceph") pod "a1beb556-4304-426c-abe8-0cdd5982362f" (UID: "a1beb556-4304-426c-abe8-0cdd5982362f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:25:00 crc kubenswrapper[4962]: I1003 14:25:00.871996 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1beb556-4304-426c-abe8-0cdd5982362f-kube-api-access-wws2q" (OuterVolumeSpecName: "kube-api-access-wws2q") pod "a1beb556-4304-426c-abe8-0cdd5982362f" (UID: "a1beb556-4304-426c-abe8-0cdd5982362f"). InnerVolumeSpecName "kube-api-access-wws2q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:25:00 crc kubenswrapper[4962]: I1003 14:25:00.893970 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1beb556-4304-426c-abe8-0cdd5982362f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a1beb556-4304-426c-abe8-0cdd5982362f" (UID: "a1beb556-4304-426c-abe8-0cdd5982362f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:25:00 crc kubenswrapper[4962]: I1003 14:25:00.931769 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1beb556-4304-426c-abe8-0cdd5982362f-config-data" (OuterVolumeSpecName: "config-data") pod "a1beb556-4304-426c-abe8-0cdd5982362f" (UID: "a1beb556-4304-426c-abe8-0cdd5982362f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:25:00 crc kubenswrapper[4962]: I1003 14:25:00.961101 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1beb556-4304-426c-abe8-0cdd5982362f-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:25:00 crc kubenswrapper[4962]: I1003 14:25:00.961176 4962 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a1beb556-4304-426c-abe8-0cdd5982362f-ceph\") on node \"crc\" DevicePath \"\"" Oct 03 14:25:00 crc kubenswrapper[4962]: I1003 14:25:00.961188 4962 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a1beb556-4304-426c-abe8-0cdd5982362f-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 14:25:00 crc kubenswrapper[4962]: I1003 14:25:00.961202 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1beb556-4304-426c-abe8-0cdd5982362f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:25:00 crc kubenswrapper[4962]: I1003 14:25:00.961219 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wws2q\" (UniqueName: \"kubernetes.io/projected/a1beb556-4304-426c-abe8-0cdd5982362f-kube-api-access-wws2q\") on node \"crc\" DevicePath \"\"" Oct 03 14:25:00 crc kubenswrapper[4962]: I1003 14:25:00.961228 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1beb556-4304-426c-abe8-0cdd5982362f-logs\") on node \"crc\" DevicePath \"\"" Oct 03 14:25:00 crc kubenswrapper[4962]: I1003 14:25:00.961238 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1beb556-4304-426c-abe8-0cdd5982362f-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.134309 4962 generic.go:334] "Generic (PLEG): container finished" podID="a1beb556-4304-426c-abe8-0cdd5982362f" containerID="23add12d374e5cf2860812eac82e2a570d60305480138f54c600c570c3656104" exitCode=0 Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.134374 4962 generic.go:334] "Generic (PLEG): container finished" podID="a1beb556-4304-426c-abe8-0cdd5982362f" containerID="f8d6bf42039f7179859595429b1ac8f225e58a769696a64a061b157507a0cd55" exitCode=143 Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.134419 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a1beb556-4304-426c-abe8-0cdd5982362f","Type":"ContainerDied","Data":"23add12d374e5cf2860812eac82e2a570d60305480138f54c600c570c3656104"} Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.134467 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a1beb556-4304-426c-abe8-0cdd5982362f","Type":"ContainerDied","Data":"f8d6bf42039f7179859595429b1ac8f225e58a769696a64a061b157507a0cd55"} Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.134481 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a1beb556-4304-426c-abe8-0cdd5982362f","Type":"ContainerDied","Data":"3c161dc7ad6d7b17e131e7b471b11ed20286f0eee926e3e8d171883e19a21873"} Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.134502 4962 scope.go:117] "RemoveContainer" containerID="23add12d374e5cf2860812eac82e2a570d60305480138f54c600c570c3656104" Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 
Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.136261 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.137891 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2c967c3d-1f67-42ff-9849-dcd585648bb8","Type":"ContainerStarted","Data":"c9f09c18f69924294689c67ad8cd1e601b8e4deb075871b299c705dc3c27e121"}
Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.141044 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-np2k7" event={"ID":"3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a","Type":"ContainerStarted","Data":"4097cde9893631e77e986e5646dc6cd8ced78baba4c10a5fb4b01075c45468e9"}
Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.159986 4962 scope.go:117] "RemoveContainer" containerID="f8d6bf42039f7179859595429b1ac8f225e58a769696a64a061b157507a0cd55"
Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.174458 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.174434241 podStartE2EDuration="3.174434241s" podCreationTimestamp="2025-10-03 14:24:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:25:01.161122843 +0000 UTC m=+5709.565020678" watchObservedRunningTime="2025-10-03 14:25:01.174434241 +0000 UTC m=+5709.578332076"
Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.233702 4962 scope.go:117] "RemoveContainer" containerID="23add12d374e5cf2860812eac82e2a570d60305480138f54c600c570c3656104"
Oct 03 14:25:01 crc kubenswrapper[4962]: E1003 14:25:01.235651 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23add12d374e5cf2860812eac82e2a570d60305480138f54c600c570c3656104\": container with ID starting with 23add12d374e5cf2860812eac82e2a570d60305480138f54c600c570c3656104 not found: ID does not exist" containerID="23add12d374e5cf2860812eac82e2a570d60305480138f54c600c570c3656104"
Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.235689 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23add12d374e5cf2860812eac82e2a570d60305480138f54c600c570c3656104"} err="failed to get container status \"23add12d374e5cf2860812eac82e2a570d60305480138f54c600c570c3656104\": rpc error: code = NotFound desc = could not find container \"23add12d374e5cf2860812eac82e2a570d60305480138f54c600c570c3656104\": container with ID starting with 23add12d374e5cf2860812eac82e2a570d60305480138f54c600c570c3656104 not found: ID does not exist"
Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.235718 4962 scope.go:117] "RemoveContainer" containerID="f8d6bf42039f7179859595429b1ac8f225e58a769696a64a061b157507a0cd55"
Oct 03 14:25:01 crc kubenswrapper[4962]: E1003 14:25:01.236139 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8d6bf42039f7179859595429b1ac8f225e58a769696a64a061b157507a0cd55\": container with ID starting with f8d6bf42039f7179859595429b1ac8f225e58a769696a64a061b157507a0cd55 not found: ID does not exist" containerID="f8d6bf42039f7179859595429b1ac8f225e58a769696a64a061b157507a0cd55"
Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.236208 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8d6bf42039f7179859595429b1ac8f225e58a769696a64a061b157507a0cd55"} err="failed to get container status \"f8d6bf42039f7179859595429b1ac8f225e58a769696a64a061b157507a0cd55\": rpc error: code = NotFound desc = could not find container \"f8d6bf42039f7179859595429b1ac8f225e58a769696a64a061b157507a0cd55\": container with ID starting with f8d6bf42039f7179859595429b1ac8f225e58a769696a64a061b157507a0cd55 not found: ID does not exist"
Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.236262 4962 scope.go:117] "RemoveContainer" containerID="23add12d374e5cf2860812eac82e2a570d60305480138f54c600c570c3656104"
Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.236695 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23add12d374e5cf2860812eac82e2a570d60305480138f54c600c570c3656104"} err="failed to get container status \"23add12d374e5cf2860812eac82e2a570d60305480138f54c600c570c3656104\": rpc error: code = NotFound desc = could not find container \"23add12d374e5cf2860812eac82e2a570d60305480138f54c600c570c3656104\": container with ID starting with 23add12d374e5cf2860812eac82e2a570d60305480138f54c600c570c3656104 not found: ID does not exist"
Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.236736 4962 scope.go:117] "RemoveContainer" containerID="f8d6bf42039f7179859595429b1ac8f225e58a769696a64a061b157507a0cd55"
Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.236968 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8d6bf42039f7179859595429b1ac8f225e58a769696a64a061b157507a0cd55"} err="failed to get container status \"f8d6bf42039f7179859595429b1ac8f225e58a769696a64a061b157507a0cd55\": rpc error: code = NotFound desc = could not find container \"f8d6bf42039f7179859595429b1ac8f225e58a769696a64a061b157507a0cd55\": container with ID starting with f8d6bf42039f7179859595429b1ac8f225e58a769696a64a061b157507a0cd55 not found: ID does not exist"
Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.238906 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.250519 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.259842 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 03 14:25:01 crc kubenswrapper[4962]: E1003 14:25:01.260234 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1beb556-4304-426c-abe8-0cdd5982362f" containerName="glance-httpd"
Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.260255 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1beb556-4304-426c-abe8-0cdd5982362f" containerName="glance-httpd"
Oct 03 14:25:01 crc kubenswrapper[4962]: E1003 14:25:01.260310 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1beb556-4304-426c-abe8-0cdd5982362f" containerName="glance-log"
Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.260319 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1beb556-4304-426c-abe8-0cdd5982362f" containerName="glance-log"
Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.260509 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1beb556-4304-426c-abe8-0cdd5982362f" containerName="glance-log"
Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.260545 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1beb556-4304-426c-abe8-0cdd5982362f" containerName="glance-httpd"
Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.261872 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.266671 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.278611 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.370227 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab453a1b-a691-409f-8826-ef5286ea8efc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ab453a1b-a691-409f-8826-ef5286ea8efc\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.370325 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab453a1b-a691-409f-8826-ef5286ea8efc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ab453a1b-a691-409f-8826-ef5286ea8efc\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.370551 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab453a1b-a691-409f-8826-ef5286ea8efc-logs\") pod \"glance-default-internal-api-0\" (UID: \"ab453a1b-a691-409f-8826-ef5286ea8efc\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.370817 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ab453a1b-a691-409f-8826-ef5286ea8efc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ab453a1b-a691-409f-8826-ef5286ea8efc\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.370853 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab453a1b-a691-409f-8826-ef5286ea8efc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ab453a1b-a691-409f-8826-ef5286ea8efc\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.371090 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ab453a1b-a691-409f-8826-ef5286ea8efc-ceph\") pod \"glance-default-internal-api-0\" (UID: \"ab453a1b-a691-409f-8826-ef5286ea8efc\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.371430 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvxj2\" (UniqueName: \"kubernetes.io/projected/ab453a1b-a691-409f-8826-ef5286ea8efc-kube-api-access-lvxj2\") pod \"glance-default-internal-api-0\" (UID: \"ab453a1b-a691-409f-8826-ef5286ea8efc\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.472947 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab453a1b-a691-409f-8826-ef5286ea8efc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ab453a1b-a691-409f-8826-ef5286ea8efc\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.473064 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab453a1b-a691-409f-8826-ef5286ea8efc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ab453a1b-a691-409f-8826-ef5286ea8efc\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.473116 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab453a1b-a691-409f-8826-ef5286ea8efc-logs\") pod \"glance-default-internal-api-0\" (UID: \"ab453a1b-a691-409f-8826-ef5286ea8efc\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.473157 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ab453a1b-a691-409f-8826-ef5286ea8efc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ab453a1b-a691-409f-8826-ef5286ea8efc\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.473179 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab453a1b-a691-409f-8826-ef5286ea8efc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ab453a1b-a691-409f-8826-ef5286ea8efc\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.473212 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ab453a1b-a691-409f-8826-ef5286ea8efc-ceph\") pod \"glance-default-internal-api-0\" (UID: \"ab453a1b-a691-409f-8826-ef5286ea8efc\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.473270 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvxj2\" (UniqueName: \"kubernetes.io/projected/ab453a1b-a691-409f-8826-ef5286ea8efc-kube-api-access-lvxj2\") pod \"glance-default-internal-api-0\" (UID: \"ab453a1b-a691-409f-8826-ef5286ea8efc\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.473802 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ab453a1b-a691-409f-8826-ef5286ea8efc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ab453a1b-a691-409f-8826-ef5286ea8efc\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.473857 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab453a1b-a691-409f-8826-ef5286ea8efc-logs\") pod \"glance-default-internal-api-0\" (UID: \"ab453a1b-a691-409f-8826-ef5286ea8efc\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.477752 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab453a1b-a691-409f-8826-ef5286ea8efc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ab453a1b-a691-409f-8826-ef5286ea8efc\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.478198 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ab453a1b-a691-409f-8826-ef5286ea8efc-ceph\") pod \"glance-default-internal-api-0\" (UID: \"ab453a1b-a691-409f-8826-ef5286ea8efc\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.478398 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab453a1b-a691-409f-8826-ef5286ea8efc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ab453a1b-a691-409f-8826-ef5286ea8efc\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.478794 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab453a1b-a691-409f-8826-ef5286ea8efc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ab453a1b-a691-409f-8826-ef5286ea8efc\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.495205 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvxj2\" (UniqueName: \"kubernetes.io/projected/ab453a1b-a691-409f-8826-ef5286ea8efc-kube-api-access-lvxj2\") pod \"glance-default-internal-api-0\" (UID: \"ab453a1b-a691-409f-8826-ef5286ea8efc\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:25:01 crc kubenswrapper[4962]: I1003 14:25:01.579059 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 03 14:25:02 crc kubenswrapper[4962]: I1003 14:25:02.106527 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 03 14:25:02 crc kubenswrapper[4962]: I1003 14:25:02.153575 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ab453a1b-a691-409f-8826-ef5286ea8efc","Type":"ContainerStarted","Data":"7d1e0c4aedb956cc03f296b66d1f870ddfc8e8e039a92729d225484c5b348c64"}
Oct 03 14:25:02 crc kubenswrapper[4962]: I1003 14:25:02.158313 4962 generic.go:334] "Generic (PLEG): container finished" podID="3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a" containerID="4097cde9893631e77e986e5646dc6cd8ced78baba4c10a5fb4b01075c45468e9" exitCode=0
Oct 03 14:25:02 crc kubenswrapper[4962]: I1003 14:25:02.158409 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-np2k7" event={"ID":"3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a","Type":"ContainerDied","Data":"4097cde9893631e77e986e5646dc6cd8ced78baba4c10a5fb4b01075c45468e9"}
Oct 03 14:25:02 crc kubenswrapper[4962]: I1003 14:25:02.239513 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1beb556-4304-426c-abe8-0cdd5982362f" path="/var/lib/kubelet/pods/a1beb556-4304-426c-abe8-0cdd5982362f/volumes"
Oct 03 14:25:03 crc kubenswrapper[4962]: I1003 14:25:03.167493 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ab453a1b-a691-409f-8826-ef5286ea8efc","Type":"ContainerStarted","Data":"32a933e95b7bb998f66c1c9e494eb984df233d3076b47477db1578faec0062d0"}
Oct 03 14:25:03 crc kubenswrapper[4962]: I1003 14:25:03.168029 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ab453a1b-a691-409f-8826-ef5286ea8efc","Type":"ContainerStarted","Data":"01694b5bb759b57df173cde34bcd5520ef6b98929175f03f6ea30b50526f8ddb"}
Oct 03 14:25:03 crc kubenswrapper[4962]: I1003 14:25:03.170898 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-np2k7" event={"ID":"3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a","Type":"ContainerStarted","Data":"8e20cef35306e1c1e10440df498986656752c8ba398aee3675829e31937e1d76"}
Oct 03 14:25:03 crc kubenswrapper[4962]: I1003 14:25:03.192893 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.192870967 podStartE2EDuration="2.192870967s" podCreationTimestamp="2025-10-03 14:25:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:25:03.187424165 +0000 UTC m=+5711.591322000" watchObservedRunningTime="2025-10-03 14:25:03.192870967 +0000 UTC m=+5711.596768802"
Oct 03 14:25:03 crc kubenswrapper[4962]: I1003 14:25:03.212402 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-np2k7" podStartSLOduration=2.688377632 podStartE2EDuration="5.212383047s" podCreationTimestamp="2025-10-03 14:24:58 +0000 UTC" firstStartedPulling="2025-10-03 14:25:00.125912415 +0000 UTC m=+5708.529810250" lastFinishedPulling="2025-10-03 14:25:02.64991783 +0000 UTC m=+5711.053815665" observedRunningTime="2025-10-03 14:25:03.203373502 +0000 UTC m=+5711.607271347" watchObservedRunningTime="2025-10-03 14:25:03.212383047 +0000 UTC m=+5711.616280882"
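pod_startup_latency_tracker emits one summary line per started pod. For the two glance pods both pulling timestamps are Go's zero time, 0001-01-01 00:00:00, which indicates no image pull was recorded before start; community-operators-np2k7 shows a real pull window of roughly 2.5 s inside its 5.2 s end-to-end startup. A stdlib-only sketch that collects the end-to-end figure per pod:

    import re

    DURATION = re.compile(r'"Observed pod startup duration" pod="(?P<pod>[^"]+)"'
                          r'.*?podStartE2EDuration="(?P<e2e>[^"]+)"')

    def startup_durations(lines):
        # pod -> podStartE2EDuration string, e.g. '5.212383047s'
        return {m.group('pod'): m.group('e2e')
                for m in map(DURATION.search, lines) if m}

    # Applied to the lines above this yields
    # {'openstack/glance-default-internal-api-0': '2.192870967s',
    #  'openshift-marketplace/community-operators-np2k7': '5.212383047s'}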
Oct 03 14:25:04 crc kubenswrapper[4962]: I1003 14:25:04.789843 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6f5596d8dc-rzq76"
Oct 03 14:25:04 crc kubenswrapper[4962]: I1003 14:25:04.864070 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-646c67b865-cs96q"]
Oct 03 14:25:04 crc kubenswrapper[4962]: I1003 14:25:04.864419 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-646c67b865-cs96q" podUID="66f387f8-5749-4493-9748-a9f0bb6352c1" containerName="dnsmasq-dns" containerID="cri-o://df56b8c4b474556c28208370edde8c85271a606b06ed92f520673dce62c32042" gracePeriod=10
Oct 03 14:25:05 crc kubenswrapper[4962]: I1003 14:25:05.207201 4962 generic.go:334] "Generic (PLEG): container finished" podID="66f387f8-5749-4493-9748-a9f0bb6352c1" containerID="df56b8c4b474556c28208370edde8c85271a606b06ed92f520673dce62c32042" exitCode=0
Oct 03 14:25:05 crc kubenswrapper[4962]: I1003 14:25:05.207244 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-646c67b865-cs96q" event={"ID":"66f387f8-5749-4493-9748-a9f0bb6352c1","Type":"ContainerDied","Data":"df56b8c4b474556c28208370edde8c85271a606b06ed92f520673dce62c32042"}
Oct 03 14:25:05 crc kubenswrapper[4962]: I1003 14:25:05.380143 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-646c67b865-cs96q"
Oct 03 14:25:05 crc kubenswrapper[4962]: I1003 14:25:05.449450 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66f387f8-5749-4493-9748-a9f0bb6352c1-ovsdbserver-nb\") pod \"66f387f8-5749-4493-9748-a9f0bb6352c1\" (UID: \"66f387f8-5749-4493-9748-a9f0bb6352c1\") "
Oct 03 14:25:05 crc kubenswrapper[4962]: I1003 14:25:05.449540 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vx6pf\" (UniqueName: \"kubernetes.io/projected/66f387f8-5749-4493-9748-a9f0bb6352c1-kube-api-access-vx6pf\") pod \"66f387f8-5749-4493-9748-a9f0bb6352c1\" (UID: \"66f387f8-5749-4493-9748-a9f0bb6352c1\") "
Oct 03 14:25:05 crc kubenswrapper[4962]: I1003 14:25:05.449707 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66f387f8-5749-4493-9748-a9f0bb6352c1-ovsdbserver-sb\") pod \"66f387f8-5749-4493-9748-a9f0bb6352c1\" (UID: \"66f387f8-5749-4493-9748-a9f0bb6352c1\") "
Oct 03 14:25:05 crc kubenswrapper[4962]: I1003 14:25:05.449740 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66f387f8-5749-4493-9748-a9f0bb6352c1-config\") pod \"66f387f8-5749-4493-9748-a9f0bb6352c1\" (UID: \"66f387f8-5749-4493-9748-a9f0bb6352c1\") "
Oct 03 14:25:05 crc kubenswrapper[4962]: I1003 14:25:05.449775 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66f387f8-5749-4493-9748-a9f0bb6352c1-dns-svc\") pod \"66f387f8-5749-4493-9748-a9f0bb6352c1\" (UID: \"66f387f8-5749-4493-9748-a9f0bb6352c1\") "
Oct 03 14:25:05 crc kubenswrapper[4962]: I1003 14:25:05.456612 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66f387f8-5749-4493-9748-a9f0bb6352c1-kube-api-access-vx6pf" (OuterVolumeSpecName: "kube-api-access-vx6pf") pod "66f387f8-5749-4493-9748-a9f0bb6352c1" (UID: "66f387f8-5749-4493-9748-a9f0bb6352c1"). InnerVolumeSpecName "kube-api-access-vx6pf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:25:05 crc kubenswrapper[4962]: I1003 14:25:05.494818 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66f387f8-5749-4493-9748-a9f0bb6352c1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "66f387f8-5749-4493-9748-a9f0bb6352c1" (UID: "66f387f8-5749-4493-9748-a9f0bb6352c1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:25:05 crc kubenswrapper[4962]: I1003 14:25:05.500622 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66f387f8-5749-4493-9748-a9f0bb6352c1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "66f387f8-5749-4493-9748-a9f0bb6352c1" (UID: "66f387f8-5749-4493-9748-a9f0bb6352c1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:25:05 crc kubenswrapper[4962]: I1003 14:25:05.504214 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66f387f8-5749-4493-9748-a9f0bb6352c1-config" (OuterVolumeSpecName: "config") pod "66f387f8-5749-4493-9748-a9f0bb6352c1" (UID: "66f387f8-5749-4493-9748-a9f0bb6352c1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:25:05 crc kubenswrapper[4962]: I1003 14:25:05.508075 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66f387f8-5749-4493-9748-a9f0bb6352c1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "66f387f8-5749-4493-9748-a9f0bb6352c1" (UID: "66f387f8-5749-4493-9748-a9f0bb6352c1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:25:05 crc kubenswrapper[4962]: I1003 14:25:05.552677 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66f387f8-5749-4493-9748-a9f0bb6352c1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 03 14:25:05 crc kubenswrapper[4962]: I1003 14:25:05.552762 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vx6pf\" (UniqueName: \"kubernetes.io/projected/66f387f8-5749-4493-9748-a9f0bb6352c1-kube-api-access-vx6pf\") on node \"crc\" DevicePath \"\""
Oct 03 14:25:05 crc kubenswrapper[4962]: I1003 14:25:05.552778 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66f387f8-5749-4493-9748-a9f0bb6352c1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 03 14:25:05 crc kubenswrapper[4962]: I1003 14:25:05.552791 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66f387f8-5749-4493-9748-a9f0bb6352c1-config\") on node \"crc\" DevicePath \"\""
Oct 03 14:25:05 crc kubenswrapper[4962]: I1003 14:25:05.552801 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66f387f8-5749-4493-9748-a9f0bb6352c1-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 03 14:25:06 crc kubenswrapper[4962]: I1003 14:25:06.217007 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-646c67b865-cs96q" event={"ID":"66f387f8-5749-4493-9748-a9f0bb6352c1","Type":"ContainerDied","Data":"06529b70d0ece10f5cb79fd0a25b594045ca7e9e4f7817f2a03374f5c320b6a9"}
Oct 03 14:25:06 crc kubenswrapper[4962]: I1003 14:25:06.217069 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-646c67b865-cs96q"
Oct 03 14:25:06 crc kubenswrapper[4962]: I1003 14:25:06.217072 4962 scope.go:117] "RemoveContainer" containerID="df56b8c4b474556c28208370edde8c85271a606b06ed92f520673dce62c32042"
Oct 03 14:25:06 crc kubenswrapper[4962]: I1003 14:25:06.243131 4962 scope.go:117] "RemoveContainer" containerID="6c9eb6dbc628a16b1eacd5760adbced702cc1745bf43b5743d6a45db956a4918"
Oct 03 14:25:06 crc kubenswrapper[4962]: I1003 14:25:06.258819 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-646c67b865-cs96q"]
Oct 03 14:25:06 crc kubenswrapper[4962]: I1003 14:25:06.265107 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-646c67b865-cs96q"]
Oct 03 14:25:08 crc kubenswrapper[4962]: I1003 14:25:08.239327 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66f387f8-5749-4493-9748-a9f0bb6352c1" path="/var/lib/kubelet/pods/66f387f8-5749-4493-9748-a9f0bb6352c1/volumes"
Oct 03 14:25:08 crc kubenswrapper[4962]: I1003 14:25:08.772090 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Oct 03 14:25:08 crc kubenswrapper[4962]: I1003 14:25:08.772141 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Oct 03 14:25:08 crc kubenswrapper[4962]: I1003 14:25:08.799070 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Oct 03 14:25:08 crc kubenswrapper[4962]: I1003 14:25:08.814045 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Oct 03 14:25:09 crc kubenswrapper[4962]: I1003 14:25:09.086136 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-np2k7"
Oct 03 14:25:09 crc kubenswrapper[4962]: I1003 14:25:09.086190 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-np2k7"
Oct 03 14:25:09 crc kubenswrapper[4962]: I1003 14:25:09.141074 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-np2k7"
Oct 03 14:25:09 crc kubenswrapper[4962]: I1003 14:25:09.245067 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Oct 03 14:25:09 crc kubenswrapper[4962]: I1003 14:25:09.245107 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Oct 03 14:25:09 crc kubenswrapper[4962]: I1003 14:25:09.290101 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-np2k7"
Oct 03 14:25:09 crc kubenswrapper[4962]: I1003 14:25:09.381208 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-np2k7"]
Oct 03 14:25:11 crc kubenswrapper[4962]: I1003 14:25:11.214089 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Oct 03 14:25:11 crc kubenswrapper[4962]: I1003 14:25:11.251463 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
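The SyncLoop (probe) lines encode probe state changes: startup flips from unhealthy to started, and readiness from the empty string (not ready) to ready; the doubled lines presumably reflect more than one probed container per pod. A stdlib-only sketch that reduces them to a transition history per pod and probe:

    import re
    from collections import defaultdict

    PROBE = re.compile(r'"SyncLoop \(probe\)" probe="(?P<probe>\w+)" '
                       r'status="(?P<status>[^"]*)" pod="(?P<pod>[^"]+)"')

    def probe_history(lines):
        history = defaultdict(list)              # (pod, probe) -> statuses
        for line in lines:
            m = PROBE.search(line)
            if m:
                key = (m.group('pod'), m.group('probe'))
                if not history[key] or history[key][-1] != m.group('status'):
                    history[key].append(m.group('status'))   # transitions only
        return dict(history)

    # For glance-default-external-api-0 above this yields
    # ('openstack/glance-default-external-api-0', 'startup'):   ['unhealthy', 'started']
    # ('openstack/glance-default-external-api-0', 'readiness'): ['', 'ready']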
pod="openshift-marketplace/community-operators-np2k7" podUID="3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a" containerName="registry-server" containerID="cri-o://8e20cef35306e1c1e10440df498986656752c8ba398aee3675829e31937e1d76" gracePeriod=2 Oct 03 14:25:11 crc kubenswrapper[4962]: I1003 14:25:11.580316 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 03 14:25:11 crc kubenswrapper[4962]: I1003 14:25:11.580374 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 03 14:25:11 crc kubenswrapper[4962]: I1003 14:25:11.619043 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 03 14:25:11 crc kubenswrapper[4962]: I1003 14:25:11.619224 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 03 14:25:12 crc kubenswrapper[4962]: I1003 14:25:12.189268 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-np2k7" Oct 03 14:25:12 crc kubenswrapper[4962]: I1003 14:25:12.266976 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w68h8\" (UniqueName: \"kubernetes.io/projected/3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a-kube-api-access-w68h8\") pod \"3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a\" (UID: \"3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a\") " Oct 03 14:25:12 crc kubenswrapper[4962]: I1003 14:25:12.267103 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a-catalog-content\") pod \"3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a\" (UID: \"3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a\") " Oct 03 14:25:12 crc kubenswrapper[4962]: I1003 14:25:12.267160 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a-utilities\") pod \"3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a\" (UID: \"3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a\") " Oct 03 14:25:12 crc kubenswrapper[4962]: I1003 14:25:12.268210 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a-utilities" (OuterVolumeSpecName: "utilities") pod "3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a" (UID: "3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:25:12 crc kubenswrapper[4962]: I1003 14:25:12.272665 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a-kube-api-access-w68h8" (OuterVolumeSpecName: "kube-api-access-w68h8") pod "3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a" (UID: "3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a"). InnerVolumeSpecName "kube-api-access-w68h8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:25:12 crc kubenswrapper[4962]: I1003 14:25:12.299370 4962 generic.go:334] "Generic (PLEG): container finished" podID="3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a" containerID="8e20cef35306e1c1e10440df498986656752c8ba398aee3675829e31937e1d76" exitCode=0 Oct 03 14:25:12 crc kubenswrapper[4962]: I1003 14:25:12.299421 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-np2k7" Oct 03 14:25:12 crc kubenswrapper[4962]: I1003 14:25:12.299420 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-np2k7" event={"ID":"3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a","Type":"ContainerDied","Data":"8e20cef35306e1c1e10440df498986656752c8ba398aee3675829e31937e1d76"} Oct 03 14:25:12 crc kubenswrapper[4962]: I1003 14:25:12.299487 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-np2k7" event={"ID":"3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a","Type":"ContainerDied","Data":"40a9f67956718a2b6a42a165da545f13b3b0378efbd0eb77978c8f0f0c79ce3b"} Oct 03 14:25:12 crc kubenswrapper[4962]: I1003 14:25:12.299505 4962 scope.go:117] "RemoveContainer" containerID="8e20cef35306e1c1e10440df498986656752c8ba398aee3675829e31937e1d76" Oct 03 14:25:12 crc kubenswrapper[4962]: I1003 14:25:12.299975 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 03 14:25:12 crc kubenswrapper[4962]: I1003 14:25:12.299995 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 03 14:25:12 crc kubenswrapper[4962]: I1003 14:25:12.322391 4962 scope.go:117] "RemoveContainer" containerID="4097cde9893631e77e986e5646dc6cd8ced78baba4c10a5fb4b01075c45468e9" Oct 03 14:25:12 crc kubenswrapper[4962]: I1003 14:25:12.345984 4962 scope.go:117] "RemoveContainer" containerID="271c584d5c1319fa999b3aab62f12f653174006fa309a1d3edabafc1c3451eb1" Oct 03 14:25:12 crc kubenswrapper[4962]: I1003 14:25:12.369572 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w68h8\" (UniqueName: \"kubernetes.io/projected/3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a-kube-api-access-w68h8\") on node \"crc\" DevicePath \"\"" Oct 03 14:25:12 crc kubenswrapper[4962]: I1003 14:25:12.369608 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 14:25:12 crc kubenswrapper[4962]: I1003 14:25:12.380237 4962 scope.go:117] "RemoveContainer" containerID="8e20cef35306e1c1e10440df498986656752c8ba398aee3675829e31937e1d76" Oct 03 14:25:12 crc kubenswrapper[4962]: E1003 14:25:12.380772 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e20cef35306e1c1e10440df498986656752c8ba398aee3675829e31937e1d76\": container with ID starting with 8e20cef35306e1c1e10440df498986656752c8ba398aee3675829e31937e1d76 not found: ID does not exist" containerID="8e20cef35306e1c1e10440df498986656752c8ba398aee3675829e31937e1d76" Oct 03 14:25:12 crc kubenswrapper[4962]: I1003 14:25:12.380814 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e20cef35306e1c1e10440df498986656752c8ba398aee3675829e31937e1d76"} err="failed to get container status \"8e20cef35306e1c1e10440df498986656752c8ba398aee3675829e31937e1d76\": rpc error: code = NotFound desc = could not find container \"8e20cef35306e1c1e10440df498986656752c8ba398aee3675829e31937e1d76\": container with ID starting with 8e20cef35306e1c1e10440df498986656752c8ba398aee3675829e31937e1d76 not found: ID does not exist" Oct 03 14:25:12 crc kubenswrapper[4962]: I1003 14:25:12.380840 4962 scope.go:117] "RemoveContainer" 
containerID="4097cde9893631e77e986e5646dc6cd8ced78baba4c10a5fb4b01075c45468e9" Oct 03 14:25:12 crc kubenswrapper[4962]: E1003 14:25:12.381236 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4097cde9893631e77e986e5646dc6cd8ced78baba4c10a5fb4b01075c45468e9\": container with ID starting with 4097cde9893631e77e986e5646dc6cd8ced78baba4c10a5fb4b01075c45468e9 not found: ID does not exist" containerID="4097cde9893631e77e986e5646dc6cd8ced78baba4c10a5fb4b01075c45468e9" Oct 03 14:25:12 crc kubenswrapper[4962]: I1003 14:25:12.381283 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4097cde9893631e77e986e5646dc6cd8ced78baba4c10a5fb4b01075c45468e9"} err="failed to get container status \"4097cde9893631e77e986e5646dc6cd8ced78baba4c10a5fb4b01075c45468e9\": rpc error: code = NotFound desc = could not find container \"4097cde9893631e77e986e5646dc6cd8ced78baba4c10a5fb4b01075c45468e9\": container with ID starting with 4097cde9893631e77e986e5646dc6cd8ced78baba4c10a5fb4b01075c45468e9 not found: ID does not exist" Oct 03 14:25:12 crc kubenswrapper[4962]: I1003 14:25:12.381311 4962 scope.go:117] "RemoveContainer" containerID="271c584d5c1319fa999b3aab62f12f653174006fa309a1d3edabafc1c3451eb1" Oct 03 14:25:12 crc kubenswrapper[4962]: E1003 14:25:12.381611 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"271c584d5c1319fa999b3aab62f12f653174006fa309a1d3edabafc1c3451eb1\": container with ID starting with 271c584d5c1319fa999b3aab62f12f653174006fa309a1d3edabafc1c3451eb1 not found: ID does not exist" containerID="271c584d5c1319fa999b3aab62f12f653174006fa309a1d3edabafc1c3451eb1" Oct 03 14:25:12 crc kubenswrapper[4962]: I1003 14:25:12.381659 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"271c584d5c1319fa999b3aab62f12f653174006fa309a1d3edabafc1c3451eb1"} err="failed to get container status \"271c584d5c1319fa999b3aab62f12f653174006fa309a1d3edabafc1c3451eb1\": rpc error: code = NotFound desc = could not find container \"271c584d5c1319fa999b3aab62f12f653174006fa309a1d3edabafc1c3451eb1\": container with ID starting with 271c584d5c1319fa999b3aab62f12f653174006fa309a1d3edabafc1c3451eb1 not found: ID does not exist" Oct 03 14:25:12 crc kubenswrapper[4962]: I1003 14:25:12.655030 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a" (UID: "3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:25:12 crc kubenswrapper[4962]: I1003 14:25:12.674021 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 14:25:12 crc kubenswrapper[4962]: I1003 14:25:12.931137 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-np2k7"] Oct 03 14:25:12 crc kubenswrapper[4962]: I1003 14:25:12.938762 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-np2k7"] Oct 03 14:25:14 crc kubenswrapper[4962]: I1003 14:25:14.238366 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a" path="/var/lib/kubelet/pods/3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a/volumes" Oct 03 14:25:14 crc kubenswrapper[4962]: I1003 14:25:14.434933 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 03 14:25:14 crc kubenswrapper[4962]: I1003 14:25:14.435342 4962 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 14:25:14 crc kubenswrapper[4962]: I1003 14:25:14.437256 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 03 14:25:20 crc kubenswrapper[4962]: I1003 14:25:20.571741 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-mlb24"] Oct 03 14:25:20 crc kubenswrapper[4962]: E1003 14:25:20.572504 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a" containerName="registry-server" Oct 03 14:25:20 crc kubenswrapper[4962]: I1003 14:25:20.572523 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a" containerName="registry-server" Oct 03 14:25:20 crc kubenswrapper[4962]: E1003 14:25:20.572554 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66f387f8-5749-4493-9748-a9f0bb6352c1" containerName="dnsmasq-dns" Oct 03 14:25:20 crc kubenswrapper[4962]: I1003 14:25:20.572562 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="66f387f8-5749-4493-9748-a9f0bb6352c1" containerName="dnsmasq-dns" Oct 03 14:25:20 crc kubenswrapper[4962]: E1003 14:25:20.572573 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a" containerName="extract-utilities" Oct 03 14:25:20 crc kubenswrapper[4962]: I1003 14:25:20.572582 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a" containerName="extract-utilities" Oct 03 14:25:20 crc kubenswrapper[4962]: E1003 14:25:20.572606 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66f387f8-5749-4493-9748-a9f0bb6352c1" containerName="init" Oct 03 14:25:20 crc kubenswrapper[4962]: I1003 14:25:20.572615 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="66f387f8-5749-4493-9748-a9f0bb6352c1" containerName="init" Oct 03 14:25:20 crc kubenswrapper[4962]: E1003 14:25:20.572631 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a" containerName="extract-content" Oct 03 14:25:20 crc kubenswrapper[4962]: I1003 14:25:20.572655 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a" containerName="extract-content" Oct 03 
14:25:20 crc kubenswrapper[4962]: I1003 14:25:20.572853 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="66f387f8-5749-4493-9748-a9f0bb6352c1" containerName="dnsmasq-dns" Oct 03 14:25:20 crc kubenswrapper[4962]: I1003 14:25:20.572883 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d8a1c23-fc60-4eb3-b9a9-289ddf603d1a" containerName="registry-server" Oct 03 14:25:20 crc kubenswrapper[4962]: I1003 14:25:20.573814 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-mlb24" Oct 03 14:25:20 crc kubenswrapper[4962]: I1003 14:25:20.587293 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-mlb24"] Oct 03 14:25:20 crc kubenswrapper[4962]: I1003 14:25:20.703829 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48lzc\" (UniqueName: \"kubernetes.io/projected/17e3b8ae-be38-4566-914b-dfc9776e2218-kube-api-access-48lzc\") pod \"placement-db-create-mlb24\" (UID: \"17e3b8ae-be38-4566-914b-dfc9776e2218\") " pod="openstack/placement-db-create-mlb24" Oct 03 14:25:20 crc kubenswrapper[4962]: I1003 14:25:20.805451 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48lzc\" (UniqueName: \"kubernetes.io/projected/17e3b8ae-be38-4566-914b-dfc9776e2218-kube-api-access-48lzc\") pod \"placement-db-create-mlb24\" (UID: \"17e3b8ae-be38-4566-914b-dfc9776e2218\") " pod="openstack/placement-db-create-mlb24" Oct 03 14:25:20 crc kubenswrapper[4962]: I1003 14:25:20.830323 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48lzc\" (UniqueName: \"kubernetes.io/projected/17e3b8ae-be38-4566-914b-dfc9776e2218-kube-api-access-48lzc\") pod \"placement-db-create-mlb24\" (UID: \"17e3b8ae-be38-4566-914b-dfc9776e2218\") " pod="openstack/placement-db-create-mlb24" Oct 03 14:25:20 crc kubenswrapper[4962]: I1003 14:25:20.895497 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-mlb24" Oct 03 14:25:21 crc kubenswrapper[4962]: I1003 14:25:21.368116 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-mlb24"] Oct 03 14:25:21 crc kubenswrapper[4962]: W1003 14:25:21.379402 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17e3b8ae_be38_4566_914b_dfc9776e2218.slice/crio-9ee2747e25942968cc6dc7698eda5b8411d05b46d50a0f51ad59e548f2e0f457 WatchSource:0}: Error finding container 9ee2747e25942968cc6dc7698eda5b8411d05b46d50a0f51ad59e548f2e0f457: Status 404 returned error can't find the container with id 9ee2747e25942968cc6dc7698eda5b8411d05b46d50a0f51ad59e548f2e0f457 Oct 03 14:25:22 crc kubenswrapper[4962]: I1003 14:25:22.382741 4962 generic.go:334] "Generic (PLEG): container finished" podID="17e3b8ae-be38-4566-914b-dfc9776e2218" containerID="a329031bb0d35a598b4a8b1d68885bd88e39d01f60a838603e6c6659120ac3b2" exitCode=0 Oct 03 14:25:22 crc kubenswrapper[4962]: I1003 14:25:22.382878 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-mlb24" event={"ID":"17e3b8ae-be38-4566-914b-dfc9776e2218","Type":"ContainerDied","Data":"a329031bb0d35a598b4a8b1d68885bd88e39d01f60a838603e6c6659120ac3b2"} Oct 03 14:25:22 crc kubenswrapper[4962]: I1003 14:25:22.383080 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-mlb24" event={"ID":"17e3b8ae-be38-4566-914b-dfc9776e2218","Type":"ContainerStarted","Data":"9ee2747e25942968cc6dc7698eda5b8411d05b46d50a0f51ad59e548f2e0f457"} Oct 03 14:25:23 crc kubenswrapper[4962]: I1003 14:25:23.753581 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-mlb24" Oct 03 14:25:23 crc kubenswrapper[4962]: I1003 14:25:23.856416 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48lzc\" (UniqueName: \"kubernetes.io/projected/17e3b8ae-be38-4566-914b-dfc9776e2218-kube-api-access-48lzc\") pod \"17e3b8ae-be38-4566-914b-dfc9776e2218\" (UID: \"17e3b8ae-be38-4566-914b-dfc9776e2218\") " Oct 03 14:25:23 crc kubenswrapper[4962]: I1003 14:25:23.868074 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17e3b8ae-be38-4566-914b-dfc9776e2218-kube-api-access-48lzc" (OuterVolumeSpecName: "kube-api-access-48lzc") pod "17e3b8ae-be38-4566-914b-dfc9776e2218" (UID: "17e3b8ae-be38-4566-914b-dfc9776e2218"). InnerVolumeSpecName "kube-api-access-48lzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:25:23 crc kubenswrapper[4962]: I1003 14:25:23.957984 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48lzc\" (UniqueName: \"kubernetes.io/projected/17e3b8ae-be38-4566-914b-dfc9776e2218-kube-api-access-48lzc\") on node \"crc\" DevicePath \"\"" Oct 03 14:25:24 crc kubenswrapper[4962]: I1003 14:25:24.400617 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-mlb24" Oct 03 14:25:24 crc kubenswrapper[4962]: I1003 14:25:24.400622 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-mlb24" event={"ID":"17e3b8ae-be38-4566-914b-dfc9776e2218","Type":"ContainerDied","Data":"9ee2747e25942968cc6dc7698eda5b8411d05b46d50a0f51ad59e548f2e0f457"} Oct 03 14:25:24 crc kubenswrapper[4962]: I1003 14:25:24.400713 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ee2747e25942968cc6dc7698eda5b8411d05b46d50a0f51ad59e548f2e0f457" Oct 03 14:25:24 crc kubenswrapper[4962]: I1003 14:25:24.659768 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:25:24 crc kubenswrapper[4962]: I1003 14:25:24.660199 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:25:30 crc kubenswrapper[4962]: I1003 14:25:30.661758 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-ec53-account-create-q5vph"] Oct 03 14:25:30 crc kubenswrapper[4962]: E1003 14:25:30.662385 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17e3b8ae-be38-4566-914b-dfc9776e2218" containerName="mariadb-database-create" Oct 03 14:25:30 crc kubenswrapper[4962]: I1003 14:25:30.662396 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="17e3b8ae-be38-4566-914b-dfc9776e2218" containerName="mariadb-database-create" Oct 03 14:25:30 crc kubenswrapper[4962]: I1003 14:25:30.662563 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="17e3b8ae-be38-4566-914b-dfc9776e2218" containerName="mariadb-database-create" Oct 03 14:25:30 crc kubenswrapper[4962]: I1003 14:25:30.663210 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-ec53-account-create-q5vph" Oct 03 14:25:30 crc kubenswrapper[4962]: I1003 14:25:30.664917 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 03 14:25:30 crc kubenswrapper[4962]: I1003 14:25:30.674241 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-ec53-account-create-q5vph"] Oct 03 14:25:30 crc kubenswrapper[4962]: I1003 14:25:30.779891 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vzs9\" (UniqueName: \"kubernetes.io/projected/8ec444c0-1410-4946-908d-60384d2b766f-kube-api-access-7vzs9\") pod \"placement-ec53-account-create-q5vph\" (UID: \"8ec444c0-1410-4946-908d-60384d2b766f\") " pod="openstack/placement-ec53-account-create-q5vph" Oct 03 14:25:30 crc kubenswrapper[4962]: I1003 14:25:30.881241 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vzs9\" (UniqueName: \"kubernetes.io/projected/8ec444c0-1410-4946-908d-60384d2b766f-kube-api-access-7vzs9\") pod \"placement-ec53-account-create-q5vph\" (UID: \"8ec444c0-1410-4946-908d-60384d2b766f\") " pod="openstack/placement-ec53-account-create-q5vph" Oct 03 14:25:30 crc kubenswrapper[4962]: I1003 14:25:30.901758 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vzs9\" (UniqueName: \"kubernetes.io/projected/8ec444c0-1410-4946-908d-60384d2b766f-kube-api-access-7vzs9\") pod \"placement-ec53-account-create-q5vph\" (UID: \"8ec444c0-1410-4946-908d-60384d2b766f\") " pod="openstack/placement-ec53-account-create-q5vph" Oct 03 14:25:31 crc kubenswrapper[4962]: I1003 14:25:31.031880 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-ec53-account-create-q5vph" Oct 03 14:25:31 crc kubenswrapper[4962]: I1003 14:25:31.451595 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-ec53-account-create-q5vph"] Oct 03 14:25:32 crc kubenswrapper[4962]: I1003 14:25:32.466950 4962 generic.go:334] "Generic (PLEG): container finished" podID="8ec444c0-1410-4946-908d-60384d2b766f" containerID="99801711faa9ecf8a74d9a4a5e1a95dda8c79c466fffc9d77f533103e27747ad" exitCode=0 Oct 03 14:25:32 crc kubenswrapper[4962]: I1003 14:25:32.466989 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-ec53-account-create-q5vph" event={"ID":"8ec444c0-1410-4946-908d-60384d2b766f","Type":"ContainerDied","Data":"99801711faa9ecf8a74d9a4a5e1a95dda8c79c466fffc9d77f533103e27747ad"} Oct 03 14:25:32 crc kubenswrapper[4962]: I1003 14:25:32.467013 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-ec53-account-create-q5vph" event={"ID":"8ec444c0-1410-4946-908d-60384d2b766f","Type":"ContainerStarted","Data":"3f65ec71f62b19c945042caeaaa526ca90c6ba4996c4d52bc0d15be6da854db4"} Oct 03 14:25:33 crc kubenswrapper[4962]: I1003 14:25:33.787967 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-ec53-account-create-q5vph" Oct 03 14:25:33 crc kubenswrapper[4962]: I1003 14:25:33.837556 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vzs9\" (UniqueName: \"kubernetes.io/projected/8ec444c0-1410-4946-908d-60384d2b766f-kube-api-access-7vzs9\") pod \"8ec444c0-1410-4946-908d-60384d2b766f\" (UID: \"8ec444c0-1410-4946-908d-60384d2b766f\") " Oct 03 14:25:33 crc kubenswrapper[4962]: I1003 14:25:33.844514 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ec444c0-1410-4946-908d-60384d2b766f-kube-api-access-7vzs9" (OuterVolumeSpecName: "kube-api-access-7vzs9") pod "8ec444c0-1410-4946-908d-60384d2b766f" (UID: "8ec444c0-1410-4946-908d-60384d2b766f"). InnerVolumeSpecName "kube-api-access-7vzs9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:25:33 crc kubenswrapper[4962]: I1003 14:25:33.940439 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vzs9\" (UniqueName: \"kubernetes.io/projected/8ec444c0-1410-4946-908d-60384d2b766f-kube-api-access-7vzs9\") on node \"crc\" DevicePath \"\"" Oct 03 14:25:34 crc kubenswrapper[4962]: I1003 14:25:34.486214 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-ec53-account-create-q5vph" Oct 03 14:25:34 crc kubenswrapper[4962]: I1003 14:25:34.486709 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-ec53-account-create-q5vph" event={"ID":"8ec444c0-1410-4946-908d-60384d2b766f","Type":"ContainerDied","Data":"3f65ec71f62b19c945042caeaaa526ca90c6ba4996c4d52bc0d15be6da854db4"} Oct 03 14:25:34 crc kubenswrapper[4962]: I1003 14:25:34.486889 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f65ec71f62b19c945042caeaaa526ca90c6ba4996c4d52bc0d15be6da854db4" Oct 03 14:25:35 crc kubenswrapper[4962]: I1003 14:25:35.912754 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57665d4b55-knl2t"] Oct 03 14:25:35 crc kubenswrapper[4962]: E1003 14:25:35.913458 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ec444c0-1410-4946-908d-60384d2b766f" containerName="mariadb-account-create" Oct 03 14:25:35 crc kubenswrapper[4962]: I1003 14:25:35.913477 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ec444c0-1410-4946-908d-60384d2b766f" containerName="mariadb-account-create" Oct 03 14:25:35 crc kubenswrapper[4962]: I1003 14:25:35.920100 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ec444c0-1410-4946-908d-60384d2b766f" containerName="mariadb-account-create" Oct 03 14:25:35 crc kubenswrapper[4962]: I1003 14:25:35.921151 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57665d4b55-knl2t" Oct 03 14:25:35 crc kubenswrapper[4962]: I1003 14:25:35.924670 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57665d4b55-knl2t"] Oct 03 14:25:35 crc kubenswrapper[4962]: I1003 14:25:35.939561 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-2fjvl"] Oct 03 14:25:35 crc kubenswrapper[4962]: I1003 14:25:35.941059 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-2fjvl" Oct 03 14:25:35 crc kubenswrapper[4962]: I1003 14:25:35.943312 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 03 14:25:35 crc kubenswrapper[4962]: I1003 14:25:35.943324 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-whb5g" Oct 03 14:25:35 crc kubenswrapper[4962]: I1003 14:25:35.944092 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 03 14:25:35 crc kubenswrapper[4962]: I1003 14:25:35.951876 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-2fjvl"] Oct 03 14:25:35 crc kubenswrapper[4962]: I1003 14:25:35.976003 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/766f241c-412b-470d-938c-8785be7fe7ab-config\") pod \"dnsmasq-dns-57665d4b55-knl2t\" (UID: \"766f241c-412b-470d-938c-8785be7fe7ab\") " pod="openstack/dnsmasq-dns-57665d4b55-knl2t" Oct 03 14:25:35 crc kubenswrapper[4962]: I1003 14:25:35.977660 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/766f241c-412b-470d-938c-8785be7fe7ab-dns-svc\") pod \"dnsmasq-dns-57665d4b55-knl2t\" (UID: \"766f241c-412b-470d-938c-8785be7fe7ab\") " pod="openstack/dnsmasq-dns-57665d4b55-knl2t" Oct 03 14:25:35 crc kubenswrapper[4962]: I1003 14:25:35.977784 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/766f241c-412b-470d-938c-8785be7fe7ab-ovsdbserver-nb\") pod \"dnsmasq-dns-57665d4b55-knl2t\" (UID: \"766f241c-412b-470d-938c-8785be7fe7ab\") " pod="openstack/dnsmasq-dns-57665d4b55-knl2t" Oct 03 14:25:35 crc kubenswrapper[4962]: I1003 14:25:35.977835 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/766f241c-412b-470d-938c-8785be7fe7ab-ovsdbserver-sb\") pod \"dnsmasq-dns-57665d4b55-knl2t\" (UID: \"766f241c-412b-470d-938c-8785be7fe7ab\") " pod="openstack/dnsmasq-dns-57665d4b55-knl2t" Oct 03 14:25:36 crc kubenswrapper[4962]: I1003 14:25:36.080177 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/766f241c-412b-470d-938c-8785be7fe7ab-dns-svc\") pod \"dnsmasq-dns-57665d4b55-knl2t\" (UID: \"766f241c-412b-470d-938c-8785be7fe7ab\") " pod="openstack/dnsmasq-dns-57665d4b55-knl2t" Oct 03 14:25:36 crc kubenswrapper[4962]: I1003 14:25:36.080318 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmgdj\" (UniqueName: \"kubernetes.io/projected/b2d090d8-3ccd-4f67-91f5-8737048283dc-kube-api-access-lmgdj\") pod \"placement-db-sync-2fjvl\" (UID: \"b2d090d8-3ccd-4f67-91f5-8737048283dc\") " pod="openstack/placement-db-sync-2fjvl" Oct 03 14:25:36 crc kubenswrapper[4962]: I1003 14:25:36.080372 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/766f241c-412b-470d-938c-8785be7fe7ab-ovsdbserver-nb\") pod \"dnsmasq-dns-57665d4b55-knl2t\" (UID: \"766f241c-412b-470d-938c-8785be7fe7ab\") " pod="openstack/dnsmasq-dns-57665d4b55-knl2t" Oct 03 14:25:36 crc kubenswrapper[4962]: 
I1003 14:25:36.080411 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/766f241c-412b-470d-938c-8785be7fe7ab-ovsdbserver-sb\") pod \"dnsmasq-dns-57665d4b55-knl2t\" (UID: \"766f241c-412b-470d-938c-8785be7fe7ab\") " pod="openstack/dnsmasq-dns-57665d4b55-knl2t" Oct 03 14:25:36 crc kubenswrapper[4962]: I1003 14:25:36.080463 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2d090d8-3ccd-4f67-91f5-8737048283dc-scripts\") pod \"placement-db-sync-2fjvl\" (UID: \"b2d090d8-3ccd-4f67-91f5-8737048283dc\") " pod="openstack/placement-db-sync-2fjvl" Oct 03 14:25:36 crc kubenswrapper[4962]: I1003 14:25:36.080495 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2d090d8-3ccd-4f67-91f5-8737048283dc-config-data\") pod \"placement-db-sync-2fjvl\" (UID: \"b2d090d8-3ccd-4f67-91f5-8737048283dc\") " pod="openstack/placement-db-sync-2fjvl" Oct 03 14:25:36 crc kubenswrapper[4962]: I1003 14:25:36.080524 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2d090d8-3ccd-4f67-91f5-8737048283dc-logs\") pod \"placement-db-sync-2fjvl\" (UID: \"b2d090d8-3ccd-4f67-91f5-8737048283dc\") " pod="openstack/placement-db-sync-2fjvl" Oct 03 14:25:36 crc kubenswrapper[4962]: I1003 14:25:36.080617 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2d090d8-3ccd-4f67-91f5-8737048283dc-combined-ca-bundle\") pod \"placement-db-sync-2fjvl\" (UID: \"b2d090d8-3ccd-4f67-91f5-8737048283dc\") " pod="openstack/placement-db-sync-2fjvl" Oct 03 14:25:36 crc kubenswrapper[4962]: I1003 14:25:36.080847 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5k7m\" (UniqueName: \"kubernetes.io/projected/766f241c-412b-470d-938c-8785be7fe7ab-kube-api-access-j5k7m\") pod \"dnsmasq-dns-57665d4b55-knl2t\" (UID: \"766f241c-412b-470d-938c-8785be7fe7ab\") " pod="openstack/dnsmasq-dns-57665d4b55-knl2t" Oct 03 14:25:36 crc kubenswrapper[4962]: I1003 14:25:36.080983 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/766f241c-412b-470d-938c-8785be7fe7ab-config\") pod \"dnsmasq-dns-57665d4b55-knl2t\" (UID: \"766f241c-412b-470d-938c-8785be7fe7ab\") " pod="openstack/dnsmasq-dns-57665d4b55-knl2t" Oct 03 14:25:36 crc kubenswrapper[4962]: I1003 14:25:36.081535 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/766f241c-412b-470d-938c-8785be7fe7ab-dns-svc\") pod \"dnsmasq-dns-57665d4b55-knl2t\" (UID: \"766f241c-412b-470d-938c-8785be7fe7ab\") " pod="openstack/dnsmasq-dns-57665d4b55-knl2t" Oct 03 14:25:36 crc kubenswrapper[4962]: I1003 14:25:36.081903 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/766f241c-412b-470d-938c-8785be7fe7ab-ovsdbserver-nb\") pod \"dnsmasq-dns-57665d4b55-knl2t\" (UID: \"766f241c-412b-470d-938c-8785be7fe7ab\") " pod="openstack/dnsmasq-dns-57665d4b55-knl2t" Oct 03 14:25:36 crc kubenswrapper[4962]: I1003 14:25:36.081914 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/766f241c-412b-470d-938c-8785be7fe7ab-config\") pod \"dnsmasq-dns-57665d4b55-knl2t\" (UID: \"766f241c-412b-470d-938c-8785be7fe7ab\") " pod="openstack/dnsmasq-dns-57665d4b55-knl2t" Oct 03 14:25:36 crc kubenswrapper[4962]: I1003 14:25:36.082000 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/766f241c-412b-470d-938c-8785be7fe7ab-ovsdbserver-sb\") pod \"dnsmasq-dns-57665d4b55-knl2t\" (UID: \"766f241c-412b-470d-938c-8785be7fe7ab\") " pod="openstack/dnsmasq-dns-57665d4b55-knl2t" Oct 03 14:25:36 crc kubenswrapper[4962]: I1003 14:25:36.182136 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2d090d8-3ccd-4f67-91f5-8737048283dc-scripts\") pod \"placement-db-sync-2fjvl\" (UID: \"b2d090d8-3ccd-4f67-91f5-8737048283dc\") " pod="openstack/placement-db-sync-2fjvl" Oct 03 14:25:36 crc kubenswrapper[4962]: I1003 14:25:36.182257 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2d090d8-3ccd-4f67-91f5-8737048283dc-config-data\") pod \"placement-db-sync-2fjvl\" (UID: \"b2d090d8-3ccd-4f67-91f5-8737048283dc\") " pod="openstack/placement-db-sync-2fjvl" Oct 03 14:25:36 crc kubenswrapper[4962]: I1003 14:25:36.182292 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2d090d8-3ccd-4f67-91f5-8737048283dc-logs\") pod \"placement-db-sync-2fjvl\" (UID: \"b2d090d8-3ccd-4f67-91f5-8737048283dc\") " pod="openstack/placement-db-sync-2fjvl" Oct 03 14:25:36 crc kubenswrapper[4962]: I1003 14:25:36.182311 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2d090d8-3ccd-4f67-91f5-8737048283dc-combined-ca-bundle\") pod \"placement-db-sync-2fjvl\" (UID: \"b2d090d8-3ccd-4f67-91f5-8737048283dc\") " pod="openstack/placement-db-sync-2fjvl" Oct 03 14:25:36 crc kubenswrapper[4962]: I1003 14:25:36.182344 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5k7m\" (UniqueName: \"kubernetes.io/projected/766f241c-412b-470d-938c-8785be7fe7ab-kube-api-access-j5k7m\") pod \"dnsmasq-dns-57665d4b55-knl2t\" (UID: \"766f241c-412b-470d-938c-8785be7fe7ab\") " pod="openstack/dnsmasq-dns-57665d4b55-knl2t" Oct 03 14:25:36 crc kubenswrapper[4962]: I1003 14:25:36.182415 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmgdj\" (UniqueName: \"kubernetes.io/projected/b2d090d8-3ccd-4f67-91f5-8737048283dc-kube-api-access-lmgdj\") pod \"placement-db-sync-2fjvl\" (UID: \"b2d090d8-3ccd-4f67-91f5-8737048283dc\") " pod="openstack/placement-db-sync-2fjvl" Oct 03 14:25:36 crc kubenswrapper[4962]: I1003 14:25:36.182825 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2d090d8-3ccd-4f67-91f5-8737048283dc-logs\") pod \"placement-db-sync-2fjvl\" (UID: \"b2d090d8-3ccd-4f67-91f5-8737048283dc\") " pod="openstack/placement-db-sync-2fjvl" Oct 03 14:25:36 crc kubenswrapper[4962]: I1003 14:25:36.188236 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2d090d8-3ccd-4f67-91f5-8737048283dc-combined-ca-bundle\") pod 
\"placement-db-sync-2fjvl\" (UID: \"b2d090d8-3ccd-4f67-91f5-8737048283dc\") " pod="openstack/placement-db-sync-2fjvl" Oct 03 14:25:36 crc kubenswrapper[4962]: I1003 14:25:36.188782 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2d090d8-3ccd-4f67-91f5-8737048283dc-scripts\") pod \"placement-db-sync-2fjvl\" (UID: \"b2d090d8-3ccd-4f67-91f5-8737048283dc\") " pod="openstack/placement-db-sync-2fjvl" Oct 03 14:25:36 crc kubenswrapper[4962]: I1003 14:25:36.188920 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2d090d8-3ccd-4f67-91f5-8737048283dc-config-data\") pod \"placement-db-sync-2fjvl\" (UID: \"b2d090d8-3ccd-4f67-91f5-8737048283dc\") " pod="openstack/placement-db-sync-2fjvl" Oct 03 14:25:36 crc kubenswrapper[4962]: I1003 14:25:36.210213 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5k7m\" (UniqueName: \"kubernetes.io/projected/766f241c-412b-470d-938c-8785be7fe7ab-kube-api-access-j5k7m\") pod \"dnsmasq-dns-57665d4b55-knl2t\" (UID: \"766f241c-412b-470d-938c-8785be7fe7ab\") " pod="openstack/dnsmasq-dns-57665d4b55-knl2t" Oct 03 14:25:36 crc kubenswrapper[4962]: I1003 14:25:36.210974 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmgdj\" (UniqueName: \"kubernetes.io/projected/b2d090d8-3ccd-4f67-91f5-8737048283dc-kube-api-access-lmgdj\") pod \"placement-db-sync-2fjvl\" (UID: \"b2d090d8-3ccd-4f67-91f5-8737048283dc\") " pod="openstack/placement-db-sync-2fjvl" Oct 03 14:25:36 crc kubenswrapper[4962]: I1003 14:25:36.259543 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57665d4b55-knl2t" Oct 03 14:25:36 crc kubenswrapper[4962]: I1003 14:25:36.271610 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-2fjvl" Oct 03 14:25:36 crc kubenswrapper[4962]: I1003 14:25:36.665805 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-2fjvl"] Oct 03 14:25:36 crc kubenswrapper[4962]: W1003 14:25:36.944559 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod766f241c_412b_470d_938c_8785be7fe7ab.slice/crio-c12f1043c918273a5ce735840ce268977e37a40051b5bebeb7e09d04413c6aa0 WatchSource:0}: Error finding container c12f1043c918273a5ce735840ce268977e37a40051b5bebeb7e09d04413c6aa0: Status 404 returned error can't find the container with id c12f1043c918273a5ce735840ce268977e37a40051b5bebeb7e09d04413c6aa0 Oct 03 14:25:36 crc kubenswrapper[4962]: I1003 14:25:36.949782 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57665d4b55-knl2t"] Oct 03 14:25:37 crc kubenswrapper[4962]: I1003 14:25:37.538617 4962 generic.go:334] "Generic (PLEG): container finished" podID="766f241c-412b-470d-938c-8785be7fe7ab" containerID="c46c4ea63ed49dcc7e3c6542f0c95e81f46d070561af577146f41f5fd7bc7141" exitCode=0 Oct 03 14:25:37 crc kubenswrapper[4962]: I1003 14:25:37.538685 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57665d4b55-knl2t" event={"ID":"766f241c-412b-470d-938c-8785be7fe7ab","Type":"ContainerDied","Data":"c46c4ea63ed49dcc7e3c6542f0c95e81f46d070561af577146f41f5fd7bc7141"} Oct 03 14:25:37 crc kubenswrapper[4962]: I1003 14:25:37.538995 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57665d4b55-knl2t" event={"ID":"766f241c-412b-470d-938c-8785be7fe7ab","Type":"ContainerStarted","Data":"c12f1043c918273a5ce735840ce268977e37a40051b5bebeb7e09d04413c6aa0"} Oct 03 14:25:37 crc kubenswrapper[4962]: I1003 14:25:37.543218 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2fjvl" event={"ID":"b2d090d8-3ccd-4f67-91f5-8737048283dc","Type":"ContainerStarted","Data":"d31bf978cdfb5b39c756569726205c7d17011ee2009ce1800b0242ebce14a3fb"} Oct 03 14:25:37 crc kubenswrapper[4962]: I1003 14:25:37.543243 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2fjvl" event={"ID":"b2d090d8-3ccd-4f67-91f5-8737048283dc","Type":"ContainerStarted","Data":"6e1ccdd140430f164c20e6ab69e173a78818bc55964f09b42d23fc830e4e72e1"} Oct 03 14:25:37 crc kubenswrapper[4962]: I1003 14:25:37.586352 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-2fjvl" podStartSLOduration=2.586335068 podStartE2EDuration="2.586335068s" podCreationTimestamp="2025-10-03 14:25:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:25:37.576168742 +0000 UTC m=+5745.980066577" watchObservedRunningTime="2025-10-03 14:25:37.586335068 +0000 UTC m=+5745.990232893" Oct 03 14:25:38 crc kubenswrapper[4962]: I1003 14:25:38.554506 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57665d4b55-knl2t" event={"ID":"766f241c-412b-470d-938c-8785be7fe7ab","Type":"ContainerStarted","Data":"c49cbfe3abd20040f4005810cb1e8438c9700d23f46975313c122b4ae32695f1"} Oct 03 14:25:38 crc kubenswrapper[4962]: I1003 14:25:38.578240 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57665d4b55-knl2t" podStartSLOduration=3.578220122 
Oct 03 14:25:39 crc kubenswrapper[4962]: I1003 14:25:39.565172 4962 generic.go:334] "Generic (PLEG): container finished" podID="b2d090d8-3ccd-4f67-91f5-8737048283dc" containerID="d31bf978cdfb5b39c756569726205c7d17011ee2009ce1800b0242ebce14a3fb" exitCode=0
Oct 03 14:25:39 crc kubenswrapper[4962]: I1003 14:25:39.565754 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2fjvl" event={"ID":"b2d090d8-3ccd-4f67-91f5-8737048283dc","Type":"ContainerDied","Data":"d31bf978cdfb5b39c756569726205c7d17011ee2009ce1800b0242ebce14a3fb"}
Oct 03 14:25:39 crc kubenswrapper[4962]: I1003 14:25:39.565862 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57665d4b55-knl2t"
Oct 03 14:25:40 crc kubenswrapper[4962]: I1003 14:25:40.867370 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-2fjvl"
Oct 03 14:25:40 crc kubenswrapper[4962]: I1003 14:25:40.979038 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmgdj\" (UniqueName: \"kubernetes.io/projected/b2d090d8-3ccd-4f67-91f5-8737048283dc-kube-api-access-lmgdj\") pod \"b2d090d8-3ccd-4f67-91f5-8737048283dc\" (UID: \"b2d090d8-3ccd-4f67-91f5-8737048283dc\") "
Oct 03 14:25:40 crc kubenswrapper[4962]: I1003 14:25:40.979107 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2d090d8-3ccd-4f67-91f5-8737048283dc-combined-ca-bundle\") pod \"b2d090d8-3ccd-4f67-91f5-8737048283dc\" (UID: \"b2d090d8-3ccd-4f67-91f5-8737048283dc\") "
Oct 03 14:25:40 crc kubenswrapper[4962]: I1003 14:25:40.979183 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2d090d8-3ccd-4f67-91f5-8737048283dc-logs\") pod \"b2d090d8-3ccd-4f67-91f5-8737048283dc\" (UID: \"b2d090d8-3ccd-4f67-91f5-8737048283dc\") "
Oct 03 14:25:40 crc kubenswrapper[4962]: I1003 14:25:40.979329 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2d090d8-3ccd-4f67-91f5-8737048283dc-config-data\") pod \"b2d090d8-3ccd-4f67-91f5-8737048283dc\" (UID: \"b2d090d8-3ccd-4f67-91f5-8737048283dc\") "
Oct 03 14:25:40 crc kubenswrapper[4962]: I1003 14:25:40.979383 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2d090d8-3ccd-4f67-91f5-8737048283dc-scripts\") pod \"b2d090d8-3ccd-4f67-91f5-8737048283dc\" (UID: \"b2d090d8-3ccd-4f67-91f5-8737048283dc\") "
Oct 03 14:25:40 crc kubenswrapper[4962]: I1003 14:25:40.979961 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2d090d8-3ccd-4f67-91f5-8737048283dc-logs" (OuterVolumeSpecName: "logs") pod "b2d090d8-3ccd-4f67-91f5-8737048283dc" (UID: "b2d090d8-3ccd-4f67-91f5-8737048283dc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:25:40 crc kubenswrapper[4962]: I1003 14:25:40.980527 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2d090d8-3ccd-4f67-91f5-8737048283dc-logs\") on node \"crc\" DevicePath \"\""
Oct 03 14:25:40 crc kubenswrapper[4962]: I1003 14:25:40.984762 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2d090d8-3ccd-4f67-91f5-8737048283dc-scripts" (OuterVolumeSpecName: "scripts") pod "b2d090d8-3ccd-4f67-91f5-8737048283dc" (UID: "b2d090d8-3ccd-4f67-91f5-8737048283dc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:25:40 crc kubenswrapper[4962]: I1003 14:25:40.985047 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2d090d8-3ccd-4f67-91f5-8737048283dc-kube-api-access-lmgdj" (OuterVolumeSpecName: "kube-api-access-lmgdj") pod "b2d090d8-3ccd-4f67-91f5-8737048283dc" (UID: "b2d090d8-3ccd-4f67-91f5-8737048283dc"). InnerVolumeSpecName "kube-api-access-lmgdj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:25:41 crc kubenswrapper[4962]: I1003 14:25:41.007452 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2d090d8-3ccd-4f67-91f5-8737048283dc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b2d090d8-3ccd-4f67-91f5-8737048283dc" (UID: "b2d090d8-3ccd-4f67-91f5-8737048283dc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:25:41 crc kubenswrapper[4962]: I1003 14:25:41.019880 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2d090d8-3ccd-4f67-91f5-8737048283dc-config-data" (OuterVolumeSpecName: "config-data") pod "b2d090d8-3ccd-4f67-91f5-8737048283dc" (UID: "b2d090d8-3ccd-4f67-91f5-8737048283dc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:25:41 crc kubenswrapper[4962]: I1003 14:25:41.082090 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmgdj\" (UniqueName: \"kubernetes.io/projected/b2d090d8-3ccd-4f67-91f5-8737048283dc-kube-api-access-lmgdj\") on node \"crc\" DevicePath \"\""
Oct 03 14:25:41 crc kubenswrapper[4962]: I1003 14:25:41.082129 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2d090d8-3ccd-4f67-91f5-8737048283dc-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 14:25:41 crc kubenswrapper[4962]: I1003 14:25:41.082141 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2d090d8-3ccd-4f67-91f5-8737048283dc-config-data\") on node \"crc\" DevicePath \"\""
Oct 03 14:25:41 crc kubenswrapper[4962]: I1003 14:25:41.082150 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2d090d8-3ccd-4f67-91f5-8737048283dc-scripts\") on node \"crc\" DevicePath \"\""
Oct 03 14:25:41 crc kubenswrapper[4962]: I1003 14:25:41.580750 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2fjvl" event={"ID":"b2d090d8-3ccd-4f67-91f5-8737048283dc","Type":"ContainerDied","Data":"6e1ccdd140430f164c20e6ab69e173a78818bc55964f09b42d23fc830e4e72e1"}
Oct 03 14:25:41 crc kubenswrapper[4962]: I1003 14:25:41.580799 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e1ccdd140430f164c20e6ab69e173a78818bc55964f09b42d23fc830e4e72e1"
Oct 03 14:25:41 crc kubenswrapper[4962]: I1003 14:25:41.580820 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-2fjvl"
Oct 03 14:25:41 crc kubenswrapper[4962]: I1003 14:25:41.959416 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-58c9cc65fb-rvbds"]
Oct 03 14:25:41 crc kubenswrapper[4962]: E1003 14:25:41.959793 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2d090d8-3ccd-4f67-91f5-8737048283dc" containerName="placement-db-sync"
Oct 03 14:25:41 crc kubenswrapper[4962]: I1003 14:25:41.959807 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2d090d8-3ccd-4f67-91f5-8737048283dc" containerName="placement-db-sync"
Oct 03 14:25:41 crc kubenswrapper[4962]: I1003 14:25:41.960078 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2d090d8-3ccd-4f67-91f5-8737048283dc" containerName="placement-db-sync"
Oct 03 14:25:41 crc kubenswrapper[4962]: I1003 14:25:41.961164 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-58c9cc65fb-rvbds"
Oct 03 14:25:41 crc kubenswrapper[4962]: I1003 14:25:41.965415 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Oct 03 14:25:41 crc kubenswrapper[4962]: I1003 14:25:41.966053 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Oct 03 14:25:41 crc kubenswrapper[4962]: I1003 14:25:41.967809 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-whb5g"
Oct 03 14:25:41 crc kubenswrapper[4962]: I1003 14:25:41.974047 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-58c9cc65fb-rvbds"]
Oct 03 14:25:42 crc kubenswrapper[4962]: I1003 14:25:42.101479 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c9b090e-aaa5-4d35-a951-6f1628b6c018-scripts\") pod \"placement-58c9cc65fb-rvbds\" (UID: \"6c9b090e-aaa5-4d35-a951-6f1628b6c018\") " pod="openstack/placement-58c9cc65fb-rvbds"
Oct 03 14:25:42 crc kubenswrapper[4962]: I1003 14:25:42.101892 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c9b090e-aaa5-4d35-a951-6f1628b6c018-logs\") pod \"placement-58c9cc65fb-rvbds\" (UID: \"6c9b090e-aaa5-4d35-a951-6f1628b6c018\") " pod="openstack/placement-58c9cc65fb-rvbds"
Oct 03 14:25:42 crc kubenswrapper[4962]: I1003 14:25:42.102040 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c9b090e-aaa5-4d35-a951-6f1628b6c018-config-data\") pod \"placement-58c9cc65fb-rvbds\" (UID: \"6c9b090e-aaa5-4d35-a951-6f1628b6c018\") " pod="openstack/placement-58c9cc65fb-rvbds"
Oct 03 14:25:42 crc kubenswrapper[4962]: I1003 14:25:42.102173 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c9b090e-aaa5-4d35-a951-6f1628b6c018-combined-ca-bundle\") pod \"placement-58c9cc65fb-rvbds\" (UID: \"6c9b090e-aaa5-4d35-a951-6f1628b6c018\") " pod="openstack/placement-58c9cc65fb-rvbds"
Oct 03 14:25:42 crc kubenswrapper[4962]: I1003 14:25:42.102281 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s46gd\" (UniqueName: \"kubernetes.io/projected/6c9b090e-aaa5-4d35-a951-6f1628b6c018-kube-api-access-s46gd\") pod \"placement-58c9cc65fb-rvbds\" (UID: \"6c9b090e-aaa5-4d35-a951-6f1628b6c018\") " pod="openstack/placement-58c9cc65fb-rvbds"
Oct 03 14:25:42 crc kubenswrapper[4962]: I1003 14:25:42.203356 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c9b090e-aaa5-4d35-a951-6f1628b6c018-scripts\") pod \"placement-58c9cc65fb-rvbds\" (UID: \"6c9b090e-aaa5-4d35-a951-6f1628b6c018\") " pod="openstack/placement-58c9cc65fb-rvbds"
Oct 03 14:25:42 crc kubenswrapper[4962]: I1003 14:25:42.203464 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c9b090e-aaa5-4d35-a951-6f1628b6c018-logs\") pod \"placement-58c9cc65fb-rvbds\" (UID: \"6c9b090e-aaa5-4d35-a951-6f1628b6c018\") " pod="openstack/placement-58c9cc65fb-rvbds"
Oct 03 14:25:42 crc kubenswrapper[4962]: I1003 14:25:42.203497 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c9b090e-aaa5-4d35-a951-6f1628b6c018-config-data\") pod \"placement-58c9cc65fb-rvbds\" (UID: \"6c9b090e-aaa5-4d35-a951-6f1628b6c018\") " pod="openstack/placement-58c9cc65fb-rvbds"
Oct 03 14:25:42 crc kubenswrapper[4962]: I1003 14:25:42.203520 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c9b090e-aaa5-4d35-a951-6f1628b6c018-combined-ca-bundle\") pod \"placement-58c9cc65fb-rvbds\" (UID: \"6c9b090e-aaa5-4d35-a951-6f1628b6c018\") " pod="openstack/placement-58c9cc65fb-rvbds"
Oct 03 14:25:42 crc kubenswrapper[4962]: I1003 14:25:42.203547 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s46gd\" (UniqueName: \"kubernetes.io/projected/6c9b090e-aaa5-4d35-a951-6f1628b6c018-kube-api-access-s46gd\") pod \"placement-58c9cc65fb-rvbds\" (UID: \"6c9b090e-aaa5-4d35-a951-6f1628b6c018\") " pod="openstack/placement-58c9cc65fb-rvbds"
Oct 03 14:25:42 crc kubenswrapper[4962]: I1003 14:25:42.203971 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c9b090e-aaa5-4d35-a951-6f1628b6c018-logs\") pod \"placement-58c9cc65fb-rvbds\" (UID: \"6c9b090e-aaa5-4d35-a951-6f1628b6c018\") " pod="openstack/placement-58c9cc65fb-rvbds"
Oct 03 14:25:42 crc kubenswrapper[4962]: I1003 14:25:42.207520 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c9b090e-aaa5-4d35-a951-6f1628b6c018-combined-ca-bundle\") pod \"placement-58c9cc65fb-rvbds\" (UID: \"6c9b090e-aaa5-4d35-a951-6f1628b6c018\") " pod="openstack/placement-58c9cc65fb-rvbds"
Oct 03 14:25:42 crc kubenswrapper[4962]: I1003 14:25:42.208064 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c9b090e-aaa5-4d35-a951-6f1628b6c018-scripts\") pod \"placement-58c9cc65fb-rvbds\" (UID: \"6c9b090e-aaa5-4d35-a951-6f1628b6c018\") " pod="openstack/placement-58c9cc65fb-rvbds"
Oct 03 14:25:42 crc kubenswrapper[4962]: I1003 14:25:42.209103 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c9b090e-aaa5-4d35-a951-6f1628b6c018-config-data\") pod \"placement-58c9cc65fb-rvbds\" (UID: \"6c9b090e-aaa5-4d35-a951-6f1628b6c018\") " pod="openstack/placement-58c9cc65fb-rvbds"
Oct 03 14:25:42 crc kubenswrapper[4962]: I1003 14:25:42.225850 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s46gd\" (UniqueName: \"kubernetes.io/projected/6c9b090e-aaa5-4d35-a951-6f1628b6c018-kube-api-access-s46gd\") pod \"placement-58c9cc65fb-rvbds\" (UID: \"6c9b090e-aaa5-4d35-a951-6f1628b6c018\") " pod="openstack/placement-58c9cc65fb-rvbds"
Oct 03 14:25:42 crc kubenswrapper[4962]: I1003 14:25:42.276550 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-58c9cc65fb-rvbds"
Oct 03 14:25:42 crc kubenswrapper[4962]: I1003 14:25:42.756056 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-58c9cc65fb-rvbds"]
Oct 03 14:25:43 crc kubenswrapper[4962]: I1003 14:25:43.597263 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-58c9cc65fb-rvbds" event={"ID":"6c9b090e-aaa5-4d35-a951-6f1628b6c018","Type":"ContainerStarted","Data":"e6f909b1c850e3dd20468e6e2bc40bdd2f2622a67ccbd5a9555896787d89a502"}
Oct 03 14:25:43 crc kubenswrapper[4962]: I1003 14:25:43.597655 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-58c9cc65fb-rvbds" event={"ID":"6c9b090e-aaa5-4d35-a951-6f1628b6c018","Type":"ContainerStarted","Data":"c2f4b0ab28650ebbf0da4edbb258e360b635d607aed5daf3fd1cc4a10f5faf08"}
Oct 03 14:25:43 crc kubenswrapper[4962]: I1003 14:25:43.597679 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-58c9cc65fb-rvbds"
Oct 03 14:25:43 crc kubenswrapper[4962]: I1003 14:25:43.597693 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-58c9cc65fb-rvbds" event={"ID":"6c9b090e-aaa5-4d35-a951-6f1628b6c018","Type":"ContainerStarted","Data":"744a74c31ee2cbf8f38de3ce8a6ddc975d10f8b229baed448c3fceecf0ffcfb8"}
Oct 03 14:25:43 crc kubenswrapper[4962]: I1003 14:25:43.597710 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-58c9cc65fb-rvbds"
Oct 03 14:25:43 crc kubenswrapper[4962]: I1003 14:25:43.616850 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-58c9cc65fb-rvbds" podStartSLOduration=2.6168337189999997 podStartE2EDuration="2.616833719s" podCreationTimestamp="2025-10-03 14:25:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:25:43.612222388 +0000 UTC m=+5752.016120223" watchObservedRunningTime="2025-10-03 14:25:43.616833719 +0000 UTC m=+5752.020731554"
Oct 03 14:25:46 crc kubenswrapper[4962]: I1003 14:25:46.262291 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57665d4b55-knl2t"
Oct 03 14:25:46 crc kubenswrapper[4962]: I1003 14:25:46.322051 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f5596d8dc-rzq76"]
Oct 03 14:25:46 crc kubenswrapper[4962]: I1003 14:25:46.322329 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6f5596d8dc-rzq76" podUID="7034962f-7ee1-44f1-8dba-bae0bdb8911d" containerName="dnsmasq-dns" containerID="cri-o://386b0e4df84705e33b62dcd6c26a4882c64afa61356cf7ec04674a1990d9aa7d" gracePeriod=10
Oct 03 14:25:46 crc kubenswrapper[4962]: I1003 14:25:46.633363 4962 generic.go:334] "Generic (PLEG): container finished" podID="7034962f-7ee1-44f1-8dba-bae0bdb8911d" containerID="386b0e4df84705e33b62dcd6c26a4882c64afa61356cf7ec04674a1990d9aa7d" exitCode=0
Oct 03 14:25:46 crc kubenswrapper[4962]: I1003 14:25:46.633407 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f5596d8dc-rzq76" event={"ID":"7034962f-7ee1-44f1-8dba-bae0bdb8911d","Type":"ContainerDied","Data":"386b0e4df84705e33b62dcd6c26a4882c64afa61356cf7ec04674a1990d9aa7d"}
Oct 03 14:25:46 crc kubenswrapper[4962]: I1003 14:25:46.783111 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f5596d8dc-rzq76"
Oct 03 14:25:46 crc kubenswrapper[4962]: I1003 14:25:46.899195 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7034962f-7ee1-44f1-8dba-bae0bdb8911d-ovsdbserver-sb\") pod \"7034962f-7ee1-44f1-8dba-bae0bdb8911d\" (UID: \"7034962f-7ee1-44f1-8dba-bae0bdb8911d\") "
Oct 03 14:25:46 crc kubenswrapper[4962]: I1003 14:25:46.899258 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7034962f-7ee1-44f1-8dba-bae0bdb8911d-ovsdbserver-nb\") pod \"7034962f-7ee1-44f1-8dba-bae0bdb8911d\" (UID: \"7034962f-7ee1-44f1-8dba-bae0bdb8911d\") "
Oct 03 14:25:46 crc kubenswrapper[4962]: I1003 14:25:46.899295 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7034962f-7ee1-44f1-8dba-bae0bdb8911d-config\") pod \"7034962f-7ee1-44f1-8dba-bae0bdb8911d\" (UID: \"7034962f-7ee1-44f1-8dba-bae0bdb8911d\") "
Oct 03 14:25:46 crc kubenswrapper[4962]: I1003 14:25:46.899400 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7034962f-7ee1-44f1-8dba-bae0bdb8911d-dns-svc\") pod \"7034962f-7ee1-44f1-8dba-bae0bdb8911d\" (UID: \"7034962f-7ee1-44f1-8dba-bae0bdb8911d\") "
Oct 03 14:25:46 crc kubenswrapper[4962]: I1003 14:25:46.899502 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmg4d\" (UniqueName: \"kubernetes.io/projected/7034962f-7ee1-44f1-8dba-bae0bdb8911d-kube-api-access-bmg4d\") pod \"7034962f-7ee1-44f1-8dba-bae0bdb8911d\" (UID: \"7034962f-7ee1-44f1-8dba-bae0bdb8911d\") "
Oct 03 14:25:46 crc kubenswrapper[4962]: I1003 14:25:46.917578 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7034962f-7ee1-44f1-8dba-bae0bdb8911d-kube-api-access-bmg4d" (OuterVolumeSpecName: "kube-api-access-bmg4d") pod "7034962f-7ee1-44f1-8dba-bae0bdb8911d" (UID: "7034962f-7ee1-44f1-8dba-bae0bdb8911d"). InnerVolumeSpecName "kube-api-access-bmg4d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:25:46 crc kubenswrapper[4962]: I1003 14:25:46.945509 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7034962f-7ee1-44f1-8dba-bae0bdb8911d-config" (OuterVolumeSpecName: "config") pod "7034962f-7ee1-44f1-8dba-bae0bdb8911d" (UID: "7034962f-7ee1-44f1-8dba-bae0bdb8911d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:25:46 crc kubenswrapper[4962]: I1003 14:25:46.947254 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7034962f-7ee1-44f1-8dba-bae0bdb8911d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7034962f-7ee1-44f1-8dba-bae0bdb8911d" (UID: "7034962f-7ee1-44f1-8dba-bae0bdb8911d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:25:46 crc kubenswrapper[4962]: I1003 14:25:46.958053 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7034962f-7ee1-44f1-8dba-bae0bdb8911d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7034962f-7ee1-44f1-8dba-bae0bdb8911d" (UID: "7034962f-7ee1-44f1-8dba-bae0bdb8911d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:25:46 crc kubenswrapper[4962]: I1003 14:25:46.961335 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7034962f-7ee1-44f1-8dba-bae0bdb8911d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7034962f-7ee1-44f1-8dba-bae0bdb8911d" (UID: "7034962f-7ee1-44f1-8dba-bae0bdb8911d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:25:47 crc kubenswrapper[4962]: I1003 14:25:47.002387 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7034962f-7ee1-44f1-8dba-bae0bdb8911d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 03 14:25:47 crc kubenswrapper[4962]: I1003 14:25:47.003771 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7034962f-7ee1-44f1-8dba-bae0bdb8911d-config\") on node \"crc\" DevicePath \"\""
Oct 03 14:25:47 crc kubenswrapper[4962]: I1003 14:25:47.003800 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7034962f-7ee1-44f1-8dba-bae0bdb8911d-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 03 14:25:47 crc kubenswrapper[4962]: I1003 14:25:47.003814 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmg4d\" (UniqueName: \"kubernetes.io/projected/7034962f-7ee1-44f1-8dba-bae0bdb8911d-kube-api-access-bmg4d\") on node \"crc\" DevicePath \"\""
Oct 03 14:25:47 crc kubenswrapper[4962]: I1003 14:25:47.003828 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7034962f-7ee1-44f1-8dba-bae0bdb8911d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 03 14:25:47 crc kubenswrapper[4962]: I1003 14:25:47.644664 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f5596d8dc-rzq76" event={"ID":"7034962f-7ee1-44f1-8dba-bae0bdb8911d","Type":"ContainerDied","Data":"4b96b8ccd5a25f0e6f222d354d4f49e0d79e29ea0a9c61aa76688a3ccfc1ed13"}
Oct 03 14:25:47 crc kubenswrapper[4962]: I1003 14:25:47.644717 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f5596d8dc-rzq76"
Oct 03 14:25:47 crc kubenswrapper[4962]: I1003 14:25:47.644729 4962 scope.go:117] "RemoveContainer" containerID="386b0e4df84705e33b62dcd6c26a4882c64afa61356cf7ec04674a1990d9aa7d"
Oct 03 14:25:47 crc kubenswrapper[4962]: I1003 14:25:47.664728 4962 scope.go:117] "RemoveContainer" containerID="12f3d9673259981e6776e51544fe5efb5ce1da57a8056cee9f5b866b2fef856a"
Oct 03 14:25:47 crc kubenswrapper[4962]: I1003 14:25:47.682342 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f5596d8dc-rzq76"]
Oct 03 14:25:47 crc kubenswrapper[4962]: I1003 14:25:47.689158 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f5596d8dc-rzq76"]
Oct 03 14:25:48 crc kubenswrapper[4962]: I1003 14:25:48.239680 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7034962f-7ee1-44f1-8dba-bae0bdb8911d" path="/var/lib/kubelet/pods/7034962f-7ee1-44f1-8dba-bae0bdb8911d/volumes"
Oct 03 14:25:54 crc kubenswrapper[4962]: I1003 14:25:54.660034 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 14:25:54 crc kubenswrapper[4962]: I1003 14:25:54.660651 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 14:26:13 crc kubenswrapper[4962]: I1003 14:26:13.273747 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-58c9cc65fb-rvbds"
Oct 03 14:26:13 crc kubenswrapper[4962]: I1003 14:26:13.275837 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-58c9cc65fb-rvbds"
Oct 03 14:26:24 crc kubenswrapper[4962]: I1003 14:26:24.659445 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 14:26:24 crc kubenswrapper[4962]: I1003 14:26:24.659986 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 14:26:24 crc kubenswrapper[4962]: I1003 14:26:24.660040 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-46vck"
Oct 03 14:26:24 crc kubenswrapper[4962]: I1003 14:26:24.660874 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d7dae192729a16d2e0ed7a375c7ee4990d673d8d62f560c5335516db0b1730db"} pod="openshift-machine-config-operator/machine-config-daemon-46vck" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
14:26:24.660936 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" containerID="cri-o://d7dae192729a16d2e0ed7a375c7ee4990d673d8d62f560c5335516db0b1730db" gracePeriod=600 Oct 03 14:26:25 crc kubenswrapper[4962]: E1003 14:26:25.333897 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:26:25 crc kubenswrapper[4962]: I1003 14:26:25.961120 4962 generic.go:334] "Generic (PLEG): container finished" podID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerID="d7dae192729a16d2e0ed7a375c7ee4990d673d8d62f560c5335516db0b1730db" exitCode=0 Oct 03 14:26:25 crc kubenswrapper[4962]: I1003 14:26:25.961434 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerDied","Data":"d7dae192729a16d2e0ed7a375c7ee4990d673d8d62f560c5335516db0b1730db"} Oct 03 14:26:25 crc kubenswrapper[4962]: I1003 14:26:25.961468 4962 scope.go:117] "RemoveContainer" containerID="66e15d378b389fff3661c27f240d5e7dda70a518dfc3f92072db88df889fc781" Oct 03 14:26:25 crc kubenswrapper[4962]: I1003 14:26:25.962069 4962 scope.go:117] "RemoveContainer" containerID="d7dae192729a16d2e0ed7a375c7ee4990d673d8d62f560c5335516db0b1730db" Oct 03 14:26:25 crc kubenswrapper[4962]: E1003 14:26:25.962279 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:26:33 crc kubenswrapper[4962]: I1003 14:26:33.976966 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-fvgd9"] Oct 03 14:26:33 crc kubenswrapper[4962]: E1003 14:26:33.978537 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7034962f-7ee1-44f1-8dba-bae0bdb8911d" containerName="dnsmasq-dns" Oct 03 14:26:33 crc kubenswrapper[4962]: I1003 14:26:33.978575 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="7034962f-7ee1-44f1-8dba-bae0bdb8911d" containerName="dnsmasq-dns" Oct 03 14:26:33 crc kubenswrapper[4962]: E1003 14:26:33.978637 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7034962f-7ee1-44f1-8dba-bae0bdb8911d" containerName="init" Oct 03 14:26:33 crc kubenswrapper[4962]: I1003 14:26:33.978660 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="7034962f-7ee1-44f1-8dba-bae0bdb8911d" containerName="init" Oct 03 14:26:33 crc kubenswrapper[4962]: I1003 14:26:33.978874 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="7034962f-7ee1-44f1-8dba-bae0bdb8911d" containerName="dnsmasq-dns" Oct 03 14:26:33 crc kubenswrapper[4962]: I1003 14:26:33.979614 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-fvgd9" Oct 03 14:26:33 crc kubenswrapper[4962]: I1003 14:26:33.993598 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-fvgd9"] Oct 03 14:26:34 crc kubenswrapper[4962]: I1003 14:26:34.075271 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-wrfxq"] Oct 03 14:26:34 crc kubenswrapper[4962]: I1003 14:26:34.076990 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-wrfxq" Oct 03 14:26:34 crc kubenswrapper[4962]: I1003 14:26:34.085330 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ntl5\" (UniqueName: \"kubernetes.io/projected/10ad472c-890f-4720-bd48-5b637f821344-kube-api-access-4ntl5\") pod \"nova-api-db-create-fvgd9\" (UID: \"10ad472c-890f-4720-bd48-5b637f821344\") " pod="openstack/nova-api-db-create-fvgd9" Oct 03 14:26:34 crc kubenswrapper[4962]: I1003 14:26:34.100564 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-wrfxq"] Oct 03 14:26:34 crc kubenswrapper[4962]: I1003 14:26:34.169077 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-dmq9v"] Oct 03 14:26:34 crc kubenswrapper[4962]: I1003 14:26:34.170733 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-dmq9v" Oct 03 14:26:34 crc kubenswrapper[4962]: I1003 14:26:34.178692 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-dmq9v"] Oct 03 14:26:34 crc kubenswrapper[4962]: I1003 14:26:34.186787 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx7f6\" (UniqueName: \"kubernetes.io/projected/6b03ca85-b639-4480-b117-ba6dce52030f-kube-api-access-tx7f6\") pod \"nova-cell0-db-create-wrfxq\" (UID: \"6b03ca85-b639-4480-b117-ba6dce52030f\") " pod="openstack/nova-cell0-db-create-wrfxq" Oct 03 14:26:34 crc kubenswrapper[4962]: I1003 14:26:34.186868 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ntl5\" (UniqueName: \"kubernetes.io/projected/10ad472c-890f-4720-bd48-5b637f821344-kube-api-access-4ntl5\") pod \"nova-api-db-create-fvgd9\" (UID: \"10ad472c-890f-4720-bd48-5b637f821344\") " pod="openstack/nova-api-db-create-fvgd9" Oct 03 14:26:34 crc kubenswrapper[4962]: I1003 14:26:34.220178 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ntl5\" (UniqueName: \"kubernetes.io/projected/10ad472c-890f-4720-bd48-5b637f821344-kube-api-access-4ntl5\") pod \"nova-api-db-create-fvgd9\" (UID: \"10ad472c-890f-4720-bd48-5b637f821344\") " pod="openstack/nova-api-db-create-fvgd9" Oct 03 14:26:34 crc kubenswrapper[4962]: I1003 14:26:34.289506 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxl29\" (UniqueName: \"kubernetes.io/projected/b590b3c6-3b78-42fa-af59-6c7379135360-kube-api-access-rxl29\") pod \"nova-cell1-db-create-dmq9v\" (UID: \"b590b3c6-3b78-42fa-af59-6c7379135360\") " pod="openstack/nova-cell1-db-create-dmq9v" Oct 03 14:26:34 crc kubenswrapper[4962]: I1003 14:26:34.289576 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tx7f6\" (UniqueName: \"kubernetes.io/projected/6b03ca85-b639-4480-b117-ba6dce52030f-kube-api-access-tx7f6\") pod 
\"nova-cell0-db-create-wrfxq\" (UID: \"6b03ca85-b639-4480-b117-ba6dce52030f\") " pod="openstack/nova-cell0-db-create-wrfxq" Oct 03 14:26:34 crc kubenswrapper[4962]: I1003 14:26:34.301466 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-fvgd9" Oct 03 14:26:34 crc kubenswrapper[4962]: I1003 14:26:34.320623 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx7f6\" (UniqueName: \"kubernetes.io/projected/6b03ca85-b639-4480-b117-ba6dce52030f-kube-api-access-tx7f6\") pod \"nova-cell0-db-create-wrfxq\" (UID: \"6b03ca85-b639-4480-b117-ba6dce52030f\") " pod="openstack/nova-cell0-db-create-wrfxq" Oct 03 14:26:34 crc kubenswrapper[4962]: I1003 14:26:34.392266 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxl29\" (UniqueName: \"kubernetes.io/projected/b590b3c6-3b78-42fa-af59-6c7379135360-kube-api-access-rxl29\") pod \"nova-cell1-db-create-dmq9v\" (UID: \"b590b3c6-3b78-42fa-af59-6c7379135360\") " pod="openstack/nova-cell1-db-create-dmq9v" Oct 03 14:26:34 crc kubenswrapper[4962]: I1003 14:26:34.398184 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-wrfxq" Oct 03 14:26:34 crc kubenswrapper[4962]: I1003 14:26:34.413405 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxl29\" (UniqueName: \"kubernetes.io/projected/b590b3c6-3b78-42fa-af59-6c7379135360-kube-api-access-rxl29\") pod \"nova-cell1-db-create-dmq9v\" (UID: \"b590b3c6-3b78-42fa-af59-6c7379135360\") " pod="openstack/nova-cell1-db-create-dmq9v" Oct 03 14:26:34 crc kubenswrapper[4962]: I1003 14:26:34.494108 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-dmq9v" Oct 03 14:26:34 crc kubenswrapper[4962]: I1003 14:26:34.833018 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-fvgd9"] Oct 03 14:26:35 crc kubenswrapper[4962]: W1003 14:26:34.846885 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10ad472c_890f_4720_bd48_5b637f821344.slice/crio-16f296e5ba988ff2d055f48d5baf6273ca7ecb8b3699ceee954805c77f7a1c83 WatchSource:0}: Error finding container 16f296e5ba988ff2d055f48d5baf6273ca7ecb8b3699ceee954805c77f7a1c83: Status 404 returned error can't find the container with id 16f296e5ba988ff2d055f48d5baf6273ca7ecb8b3699ceee954805c77f7a1c83 Oct 03 14:26:35 crc kubenswrapper[4962]: I1003 14:26:34.976683 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-wrfxq"] Oct 03 14:26:35 crc kubenswrapper[4962]: I1003 14:26:35.064872 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-fvgd9" event={"ID":"10ad472c-890f-4720-bd48-5b637f821344","Type":"ContainerStarted","Data":"16f296e5ba988ff2d055f48d5baf6273ca7ecb8b3699ceee954805c77f7a1c83"} Oct 03 14:26:35 crc kubenswrapper[4962]: I1003 14:26:35.066218 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-dmq9v"] Oct 03 14:26:35 crc kubenswrapper[4962]: I1003 14:26:35.067295 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-wrfxq" event={"ID":"6b03ca85-b639-4480-b117-ba6dce52030f","Type":"ContainerStarted","Data":"f60a514458d70e140bf7c995ed8dcd786f151dc6a07e39e3efcdb0b086bf7109"} Oct 03 14:26:36 crc kubenswrapper[4962]: I1003 14:26:36.078511 4962 generic.go:334] "Generic (PLEG): container finished" podID="10ad472c-890f-4720-bd48-5b637f821344" containerID="c702f3912c0c0cff4948df79d888ae7553e08c96d52bdd60e2342cfbe37f4d9a" exitCode=0 Oct 03 14:26:36 crc kubenswrapper[4962]: I1003 14:26:36.078616 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-fvgd9" event={"ID":"10ad472c-890f-4720-bd48-5b637f821344","Type":"ContainerDied","Data":"c702f3912c0c0cff4948df79d888ae7553e08c96d52bdd60e2342cfbe37f4d9a"} Oct 03 14:26:36 crc kubenswrapper[4962]: I1003 14:26:36.081295 4962 generic.go:334] "Generic (PLEG): container finished" podID="6b03ca85-b639-4480-b117-ba6dce52030f" containerID="f554efcb536bb4eceaaf716ce4565fc501591bb77fa5cc344c79e6d980d9d6ee" exitCode=0 Oct 03 14:26:36 crc kubenswrapper[4962]: I1003 14:26:36.081381 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-wrfxq" event={"ID":"6b03ca85-b639-4480-b117-ba6dce52030f","Type":"ContainerDied","Data":"f554efcb536bb4eceaaf716ce4565fc501591bb77fa5cc344c79e6d980d9d6ee"} Oct 03 14:26:36 crc kubenswrapper[4962]: I1003 14:26:36.083785 4962 generic.go:334] "Generic (PLEG): container finished" podID="b590b3c6-3b78-42fa-af59-6c7379135360" containerID="16779953fa726f8ce1e3ad8375f5fcbe2341d0d825b43a044b374ccba5cd87c0" exitCode=0 Oct 03 14:26:36 crc kubenswrapper[4962]: I1003 14:26:36.083830 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-dmq9v" event={"ID":"b590b3c6-3b78-42fa-af59-6c7379135360","Type":"ContainerDied","Data":"16779953fa726f8ce1e3ad8375f5fcbe2341d0d825b43a044b374ccba5cd87c0"} Oct 03 14:26:36 crc kubenswrapper[4962]: I1003 14:26:36.083857 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-db-create-dmq9v" event={"ID":"b590b3c6-3b78-42fa-af59-6c7379135360","Type":"ContainerStarted","Data":"9818a6403e94c463f38c713fbf8c723f7543f434cea498660015cd6dba3f259b"} Oct 03 14:26:37 crc kubenswrapper[4962]: I1003 14:26:37.521002 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-fvgd9" Oct 03 14:26:37 crc kubenswrapper[4962]: I1003 14:26:37.536494 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-wrfxq" Oct 03 14:26:37 crc kubenswrapper[4962]: I1003 14:26:37.550911 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-dmq9v" Oct 03 14:26:37 crc kubenswrapper[4962]: I1003 14:26:37.658059 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxl29\" (UniqueName: \"kubernetes.io/projected/b590b3c6-3b78-42fa-af59-6c7379135360-kube-api-access-rxl29\") pod \"b590b3c6-3b78-42fa-af59-6c7379135360\" (UID: \"b590b3c6-3b78-42fa-af59-6c7379135360\") " Oct 03 14:26:37 crc kubenswrapper[4962]: I1003 14:26:37.658203 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tx7f6\" (UniqueName: \"kubernetes.io/projected/6b03ca85-b639-4480-b117-ba6dce52030f-kube-api-access-tx7f6\") pod \"6b03ca85-b639-4480-b117-ba6dce52030f\" (UID: \"6b03ca85-b639-4480-b117-ba6dce52030f\") " Oct 03 14:26:37 crc kubenswrapper[4962]: I1003 14:26:37.658368 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ntl5\" (UniqueName: \"kubernetes.io/projected/10ad472c-890f-4720-bd48-5b637f821344-kube-api-access-4ntl5\") pod \"10ad472c-890f-4720-bd48-5b637f821344\" (UID: \"10ad472c-890f-4720-bd48-5b637f821344\") " Oct 03 14:26:37 crc kubenswrapper[4962]: I1003 14:26:37.664457 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10ad472c-890f-4720-bd48-5b637f821344-kube-api-access-4ntl5" (OuterVolumeSpecName: "kube-api-access-4ntl5") pod "10ad472c-890f-4720-bd48-5b637f821344" (UID: "10ad472c-890f-4720-bd48-5b637f821344"). InnerVolumeSpecName "kube-api-access-4ntl5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:26:37 crc kubenswrapper[4962]: I1003 14:26:37.664539 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b03ca85-b639-4480-b117-ba6dce52030f-kube-api-access-tx7f6" (OuterVolumeSpecName: "kube-api-access-tx7f6") pod "6b03ca85-b639-4480-b117-ba6dce52030f" (UID: "6b03ca85-b639-4480-b117-ba6dce52030f"). InnerVolumeSpecName "kube-api-access-tx7f6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:26:37 crc kubenswrapper[4962]: I1003 14:26:37.664799 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b590b3c6-3b78-42fa-af59-6c7379135360-kube-api-access-rxl29" (OuterVolumeSpecName: "kube-api-access-rxl29") pod "b590b3c6-3b78-42fa-af59-6c7379135360" (UID: "b590b3c6-3b78-42fa-af59-6c7379135360"). InnerVolumeSpecName "kube-api-access-rxl29". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:26:37 crc kubenswrapper[4962]: I1003 14:26:37.760495 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tx7f6\" (UniqueName: \"kubernetes.io/projected/6b03ca85-b639-4480-b117-ba6dce52030f-kube-api-access-tx7f6\") on node \"crc\" DevicePath \"\"" Oct 03 14:26:37 crc kubenswrapper[4962]: I1003 14:26:37.760540 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ntl5\" (UniqueName: \"kubernetes.io/projected/10ad472c-890f-4720-bd48-5b637f821344-kube-api-access-4ntl5\") on node \"crc\" DevicePath \"\"" Oct 03 14:26:37 crc kubenswrapper[4962]: I1003 14:26:37.760550 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxl29\" (UniqueName: \"kubernetes.io/projected/b590b3c6-3b78-42fa-af59-6c7379135360-kube-api-access-rxl29\") on node \"crc\" DevicePath \"\"" Oct 03 14:26:38 crc kubenswrapper[4962]: I1003 14:26:38.105891 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-fvgd9" event={"ID":"10ad472c-890f-4720-bd48-5b637f821344","Type":"ContainerDied","Data":"16f296e5ba988ff2d055f48d5baf6273ca7ecb8b3699ceee954805c77f7a1c83"} Oct 03 14:26:38 crc kubenswrapper[4962]: I1003 14:26:38.105951 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16f296e5ba988ff2d055f48d5baf6273ca7ecb8b3699ceee954805c77f7a1c83" Oct 03 14:26:38 crc kubenswrapper[4962]: I1003 14:26:38.106010 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-fvgd9" Oct 03 14:26:38 crc kubenswrapper[4962]: I1003 14:26:38.108676 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-wrfxq" event={"ID":"6b03ca85-b639-4480-b117-ba6dce52030f","Type":"ContainerDied","Data":"f60a514458d70e140bf7c995ed8dcd786f151dc6a07e39e3efcdb0b086bf7109"} Oct 03 14:26:38 crc kubenswrapper[4962]: I1003 14:26:38.108720 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f60a514458d70e140bf7c995ed8dcd786f151dc6a07e39e3efcdb0b086bf7109" Oct 03 14:26:38 crc kubenswrapper[4962]: I1003 14:26:38.108724 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-wrfxq" Oct 03 14:26:38 crc kubenswrapper[4962]: I1003 14:26:38.110266 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-dmq9v" event={"ID":"b590b3c6-3b78-42fa-af59-6c7379135360","Type":"ContainerDied","Data":"9818a6403e94c463f38c713fbf8c723f7543f434cea498660015cd6dba3f259b"} Oct 03 14:26:38 crc kubenswrapper[4962]: I1003 14:26:38.110302 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9818a6403e94c463f38c713fbf8c723f7543f434cea498660015cd6dba3f259b" Oct 03 14:26:38 crc kubenswrapper[4962]: I1003 14:26:38.110315 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-dmq9v" Oct 03 14:26:39 crc kubenswrapper[4962]: I1003 14:26:39.227906 4962 scope.go:117] "RemoveContainer" containerID="d7dae192729a16d2e0ed7a375c7ee4990d673d8d62f560c5335516db0b1730db" Oct 03 14:26:39 crc kubenswrapper[4962]: E1003 14:26:39.228718 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:26:44 crc kubenswrapper[4962]: I1003 14:26:44.214187 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-4c80-account-create-pjt4h"] Oct 03 14:26:44 crc kubenswrapper[4962]: E1003 14:26:44.215454 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b590b3c6-3b78-42fa-af59-6c7379135360" containerName="mariadb-database-create" Oct 03 14:26:44 crc kubenswrapper[4962]: I1003 14:26:44.215474 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b590b3c6-3b78-42fa-af59-6c7379135360" containerName="mariadb-database-create" Oct 03 14:26:44 crc kubenswrapper[4962]: E1003 14:26:44.215496 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b03ca85-b639-4480-b117-ba6dce52030f" containerName="mariadb-database-create" Oct 03 14:26:44 crc kubenswrapper[4962]: I1003 14:26:44.215506 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b03ca85-b639-4480-b117-ba6dce52030f" containerName="mariadb-database-create" Oct 03 14:26:44 crc kubenswrapper[4962]: E1003 14:26:44.215543 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10ad472c-890f-4720-bd48-5b637f821344" containerName="mariadb-database-create" Oct 03 14:26:44 crc kubenswrapper[4962]: I1003 14:26:44.215553 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="10ad472c-890f-4720-bd48-5b637f821344" containerName="mariadb-database-create" Oct 03 14:26:44 crc kubenswrapper[4962]: I1003 14:26:44.215899 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="10ad472c-890f-4720-bd48-5b637f821344" containerName="mariadb-database-create" Oct 03 14:26:44 crc kubenswrapper[4962]: I1003 14:26:44.215930 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b590b3c6-3b78-42fa-af59-6c7379135360" containerName="mariadb-database-create" Oct 03 14:26:44 crc kubenswrapper[4962]: I1003 14:26:44.215956 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b03ca85-b639-4480-b117-ba6dce52030f" containerName="mariadb-database-create" Oct 03 14:26:44 crc kubenswrapper[4962]: I1003 14:26:44.219169 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-4c80-account-create-pjt4h" Oct 03 14:26:44 crc kubenswrapper[4962]: I1003 14:26:44.222628 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 03 14:26:44 crc kubenswrapper[4962]: I1003 14:26:44.246029 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-4c80-account-create-pjt4h"] Oct 03 14:26:44 crc kubenswrapper[4962]: I1003 14:26:44.320792 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5pwd\" (UniqueName: \"kubernetes.io/projected/559da102-fa5c-4c60-89e9-e54596f5172a-kube-api-access-p5pwd\") pod \"nova-api-4c80-account-create-pjt4h\" (UID: \"559da102-fa5c-4c60-89e9-e54596f5172a\") " pod="openstack/nova-api-4c80-account-create-pjt4h" Oct 03 14:26:44 crc kubenswrapper[4962]: I1003 14:26:44.399914 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-aaaf-account-create-vzmss"] Oct 03 14:26:44 crc kubenswrapper[4962]: I1003 14:26:44.401046 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-aaaf-account-create-vzmss" Oct 03 14:26:44 crc kubenswrapper[4962]: I1003 14:26:44.407170 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 03 14:26:44 crc kubenswrapper[4962]: I1003 14:26:44.414348 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-aaaf-account-create-vzmss"] Oct 03 14:26:44 crc kubenswrapper[4962]: I1003 14:26:44.426459 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5pwd\" (UniqueName: \"kubernetes.io/projected/559da102-fa5c-4c60-89e9-e54596f5172a-kube-api-access-p5pwd\") pod \"nova-api-4c80-account-create-pjt4h\" (UID: \"559da102-fa5c-4c60-89e9-e54596f5172a\") " pod="openstack/nova-api-4c80-account-create-pjt4h" Oct 03 14:26:44 crc kubenswrapper[4962]: I1003 14:26:44.448172 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5pwd\" (UniqueName: \"kubernetes.io/projected/559da102-fa5c-4c60-89e9-e54596f5172a-kube-api-access-p5pwd\") pod \"nova-api-4c80-account-create-pjt4h\" (UID: \"559da102-fa5c-4c60-89e9-e54596f5172a\") " pod="openstack/nova-api-4c80-account-create-pjt4h" Oct 03 14:26:44 crc kubenswrapper[4962]: I1003 14:26:44.499956 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-832c-account-create-7q5x2"] Oct 03 14:26:44 crc kubenswrapper[4962]: I1003 14:26:44.501900 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-832c-account-create-7q5x2" Oct 03 14:26:44 crc kubenswrapper[4962]: I1003 14:26:44.504614 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 03 14:26:44 crc kubenswrapper[4962]: I1003 14:26:44.521159 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-832c-account-create-7q5x2"] Oct 03 14:26:44 crc kubenswrapper[4962]: I1003 14:26:44.528815 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzjc8\" (UniqueName: \"kubernetes.io/projected/cc15827a-60c3-4aba-bdbe-234d52d256b1-kube-api-access-wzjc8\") pod \"nova-cell0-aaaf-account-create-vzmss\" (UID: \"cc15827a-60c3-4aba-bdbe-234d52d256b1\") " pod="openstack/nova-cell0-aaaf-account-create-vzmss" Oct 03 14:26:44 crc kubenswrapper[4962]: I1003 14:26:44.551345 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-4c80-account-create-pjt4h" Oct 03 14:26:44 crc kubenswrapper[4962]: I1003 14:26:44.631873 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4scd\" (UniqueName: \"kubernetes.io/projected/5989ecf4-c65d-4310-940c-468b5d2fa698-kube-api-access-m4scd\") pod \"nova-cell1-832c-account-create-7q5x2\" (UID: \"5989ecf4-c65d-4310-940c-468b5d2fa698\") " pod="openstack/nova-cell1-832c-account-create-7q5x2" Oct 03 14:26:44 crc kubenswrapper[4962]: I1003 14:26:44.632459 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzjc8\" (UniqueName: \"kubernetes.io/projected/cc15827a-60c3-4aba-bdbe-234d52d256b1-kube-api-access-wzjc8\") pod \"nova-cell0-aaaf-account-create-vzmss\" (UID: \"cc15827a-60c3-4aba-bdbe-234d52d256b1\") " pod="openstack/nova-cell0-aaaf-account-create-vzmss" Oct 03 14:26:44 crc kubenswrapper[4962]: I1003 14:26:44.650021 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzjc8\" (UniqueName: \"kubernetes.io/projected/cc15827a-60c3-4aba-bdbe-234d52d256b1-kube-api-access-wzjc8\") pod \"nova-cell0-aaaf-account-create-vzmss\" (UID: \"cc15827a-60c3-4aba-bdbe-234d52d256b1\") " pod="openstack/nova-cell0-aaaf-account-create-vzmss" Oct 03 14:26:44 crc kubenswrapper[4962]: I1003 14:26:44.725842 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-aaaf-account-create-vzmss" Oct 03 14:26:44 crc kubenswrapper[4962]: I1003 14:26:44.734936 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4scd\" (UniqueName: \"kubernetes.io/projected/5989ecf4-c65d-4310-940c-468b5d2fa698-kube-api-access-m4scd\") pod \"nova-cell1-832c-account-create-7q5x2\" (UID: \"5989ecf4-c65d-4310-940c-468b5d2fa698\") " pod="openstack/nova-cell1-832c-account-create-7q5x2" Oct 03 14:26:44 crc kubenswrapper[4962]: I1003 14:26:44.753153 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4scd\" (UniqueName: \"kubernetes.io/projected/5989ecf4-c65d-4310-940c-468b5d2fa698-kube-api-access-m4scd\") pod \"nova-cell1-832c-account-create-7q5x2\" (UID: \"5989ecf4-c65d-4310-940c-468b5d2fa698\") " pod="openstack/nova-cell1-832c-account-create-7q5x2" Oct 03 14:26:44 crc kubenswrapper[4962]: I1003 14:26:44.823046 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-832c-account-create-7q5x2" Oct 03 14:26:45 crc kubenswrapper[4962]: I1003 14:26:45.008428 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-4c80-account-create-pjt4h"] Oct 03 14:26:45 crc kubenswrapper[4962]: I1003 14:26:45.187452 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-4c80-account-create-pjt4h" event={"ID":"559da102-fa5c-4c60-89e9-e54596f5172a","Type":"ContainerStarted","Data":"d206c3a30b41869a30ce5703d15f0431d1b231f66a4b3ce033ac7d69f4a73584"} Oct 03 14:26:45 crc kubenswrapper[4962]: I1003 14:26:45.213479 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-aaaf-account-create-vzmss"] Oct 03 14:26:45 crc kubenswrapper[4962]: I1003 14:26:45.260261 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-832c-account-create-7q5x2"] Oct 03 14:26:45 crc kubenswrapper[4962]: W1003 14:26:45.276967 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5989ecf4_c65d_4310_940c_468b5d2fa698.slice/crio-878222d6991796f834b3841509bc69b4ced8d693824107a662875376ba86751e WatchSource:0}: Error finding container 878222d6991796f834b3841509bc69b4ced8d693824107a662875376ba86751e: Status 404 returned error can't find the container with id 878222d6991796f834b3841509bc69b4ced8d693824107a662875376ba86751e Oct 03 14:26:46 crc kubenswrapper[4962]: I1003 14:26:46.201484 4962 generic.go:334] "Generic (PLEG): container finished" podID="cc15827a-60c3-4aba-bdbe-234d52d256b1" containerID="53f5ba95097b5b29609cca88f6f3efa13dfa1340cfb71dc5f503acaa90d3118b" exitCode=0 Oct 03 14:26:46 crc kubenswrapper[4962]: I1003 14:26:46.201713 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-aaaf-account-create-vzmss" event={"ID":"cc15827a-60c3-4aba-bdbe-234d52d256b1","Type":"ContainerDied","Data":"53f5ba95097b5b29609cca88f6f3efa13dfa1340cfb71dc5f503acaa90d3118b"} Oct 03 14:26:46 crc kubenswrapper[4962]: I1003 14:26:46.201941 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-aaaf-account-create-vzmss" event={"ID":"cc15827a-60c3-4aba-bdbe-234d52d256b1","Type":"ContainerStarted","Data":"8cb54393b593e9a42cbf029c31680e07897a32c692eed4a671be85b90d1cca48"} Oct 03 14:26:46 crc kubenswrapper[4962]: I1003 14:26:46.204052 4962 generic.go:334] "Generic (PLEG): container finished" podID="5989ecf4-c65d-4310-940c-468b5d2fa698" containerID="15b6ba7acc08c937f4afe7c75cba7bb92208c523af552bffd7588150c9a439e1" exitCode=0 Oct 03 14:26:46 crc kubenswrapper[4962]: I1003 14:26:46.204105 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-832c-account-create-7q5x2" event={"ID":"5989ecf4-c65d-4310-940c-468b5d2fa698","Type":"ContainerDied","Data":"15b6ba7acc08c937f4afe7c75cba7bb92208c523af552bffd7588150c9a439e1"} Oct 03 14:26:46 crc kubenswrapper[4962]: I1003 14:26:46.204125 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-832c-account-create-7q5x2" event={"ID":"5989ecf4-c65d-4310-940c-468b5d2fa698","Type":"ContainerStarted","Data":"878222d6991796f834b3841509bc69b4ced8d693824107a662875376ba86751e"} Oct 03 14:26:46 crc kubenswrapper[4962]: I1003 14:26:46.207050 4962 generic.go:334] "Generic (PLEG): container finished" podID="559da102-fa5c-4c60-89e9-e54596f5172a" containerID="2c8060d6ad024cbfc6ed48bdca3d180d8f2b87f439a1208004eeddff9c7aa97c" exitCode=0 Oct 03 14:26:46 crc kubenswrapper[4962]: I1003 
14:26:46.207117 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-4c80-account-create-pjt4h" event={"ID":"559da102-fa5c-4c60-89e9-e54596f5172a","Type":"ContainerDied","Data":"2c8060d6ad024cbfc6ed48bdca3d180d8f2b87f439a1208004eeddff9c7aa97c"} Oct 03 14:26:47 crc kubenswrapper[4962]: I1003 14:26:47.672278 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-aaaf-account-create-vzmss" Oct 03 14:26:47 crc kubenswrapper[4962]: I1003 14:26:47.677886 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-832c-account-create-7q5x2" Oct 03 14:26:47 crc kubenswrapper[4962]: I1003 14:26:47.688782 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-4c80-account-create-pjt4h" Oct 03 14:26:47 crc kubenswrapper[4962]: I1003 14:26:47.794524 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5pwd\" (UniqueName: \"kubernetes.io/projected/559da102-fa5c-4c60-89e9-e54596f5172a-kube-api-access-p5pwd\") pod \"559da102-fa5c-4c60-89e9-e54596f5172a\" (UID: \"559da102-fa5c-4c60-89e9-e54596f5172a\") " Oct 03 14:26:47 crc kubenswrapper[4962]: I1003 14:26:47.794608 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4scd\" (UniqueName: \"kubernetes.io/projected/5989ecf4-c65d-4310-940c-468b5d2fa698-kube-api-access-m4scd\") pod \"5989ecf4-c65d-4310-940c-468b5d2fa698\" (UID: \"5989ecf4-c65d-4310-940c-468b5d2fa698\") " Oct 03 14:26:47 crc kubenswrapper[4962]: I1003 14:26:47.794714 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzjc8\" (UniqueName: \"kubernetes.io/projected/cc15827a-60c3-4aba-bdbe-234d52d256b1-kube-api-access-wzjc8\") pod \"cc15827a-60c3-4aba-bdbe-234d52d256b1\" (UID: \"cc15827a-60c3-4aba-bdbe-234d52d256b1\") " Oct 03 14:26:47 crc kubenswrapper[4962]: I1003 14:26:47.800208 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc15827a-60c3-4aba-bdbe-234d52d256b1-kube-api-access-wzjc8" (OuterVolumeSpecName: "kube-api-access-wzjc8") pod "cc15827a-60c3-4aba-bdbe-234d52d256b1" (UID: "cc15827a-60c3-4aba-bdbe-234d52d256b1"). InnerVolumeSpecName "kube-api-access-wzjc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:26:47 crc kubenswrapper[4962]: I1003 14:26:47.801562 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/559da102-fa5c-4c60-89e9-e54596f5172a-kube-api-access-p5pwd" (OuterVolumeSpecName: "kube-api-access-p5pwd") pod "559da102-fa5c-4c60-89e9-e54596f5172a" (UID: "559da102-fa5c-4c60-89e9-e54596f5172a"). InnerVolumeSpecName "kube-api-access-p5pwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:26:47 crc kubenswrapper[4962]: I1003 14:26:47.820830 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5989ecf4-c65d-4310-940c-468b5d2fa698-kube-api-access-m4scd" (OuterVolumeSpecName: "kube-api-access-m4scd") pod "5989ecf4-c65d-4310-940c-468b5d2fa698" (UID: "5989ecf4-c65d-4310-940c-468b5d2fa698"). InnerVolumeSpecName "kube-api-access-m4scd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:26:47 crc kubenswrapper[4962]: I1003 14:26:47.896572 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5pwd\" (UniqueName: \"kubernetes.io/projected/559da102-fa5c-4c60-89e9-e54596f5172a-kube-api-access-p5pwd\") on node \"crc\" DevicePath \"\"" Oct 03 14:26:47 crc kubenswrapper[4962]: I1003 14:26:47.896609 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4scd\" (UniqueName: \"kubernetes.io/projected/5989ecf4-c65d-4310-940c-468b5d2fa698-kube-api-access-m4scd\") on node \"crc\" DevicePath \"\"" Oct 03 14:26:47 crc kubenswrapper[4962]: I1003 14:26:47.896618 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzjc8\" (UniqueName: \"kubernetes.io/projected/cc15827a-60c3-4aba-bdbe-234d52d256b1-kube-api-access-wzjc8\") on node \"crc\" DevicePath \"\"" Oct 03 14:26:48 crc kubenswrapper[4962]: I1003 14:26:48.238629 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-832c-account-create-7q5x2" Oct 03 14:26:48 crc kubenswrapper[4962]: I1003 14:26:48.241203 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-4c80-account-create-pjt4h" Oct 03 14:26:48 crc kubenswrapper[4962]: I1003 14:26:48.244263 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-832c-account-create-7q5x2" event={"ID":"5989ecf4-c65d-4310-940c-468b5d2fa698","Type":"ContainerDied","Data":"878222d6991796f834b3841509bc69b4ced8d693824107a662875376ba86751e"} Oct 03 14:26:48 crc kubenswrapper[4962]: I1003 14:26:48.244335 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="878222d6991796f834b3841509bc69b4ced8d693824107a662875376ba86751e" Oct 03 14:26:48 crc kubenswrapper[4962]: I1003 14:26:48.244358 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-4c80-account-create-pjt4h" event={"ID":"559da102-fa5c-4c60-89e9-e54596f5172a","Type":"ContainerDied","Data":"d206c3a30b41869a30ce5703d15f0431d1b231f66a4b3ce033ac7d69f4a73584"} Oct 03 14:26:48 crc kubenswrapper[4962]: I1003 14:26:48.244380 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d206c3a30b41869a30ce5703d15f0431d1b231f66a4b3ce033ac7d69f4a73584" Oct 03 14:26:48 crc kubenswrapper[4962]: I1003 14:26:48.244469 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-aaaf-account-create-vzmss" event={"ID":"cc15827a-60c3-4aba-bdbe-234d52d256b1","Type":"ContainerDied","Data":"8cb54393b593e9a42cbf029c31680e07897a32c692eed4a671be85b90d1cca48"} Oct 03 14:26:48 crc kubenswrapper[4962]: I1003 14:26:48.244556 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8cb54393b593e9a42cbf029c31680e07897a32c692eed4a671be85b90d1cca48" Oct 03 14:26:48 crc kubenswrapper[4962]: I1003 14:26:48.244599 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-aaaf-account-create-vzmss" Oct 03 14:26:49 crc kubenswrapper[4962]: I1003 14:26:49.794464 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qc5nh"] Oct 03 14:26:49 crc kubenswrapper[4962]: E1003 14:26:49.795165 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5989ecf4-c65d-4310-940c-468b5d2fa698" containerName="mariadb-account-create" Oct 03 14:26:49 crc kubenswrapper[4962]: I1003 14:26:49.795185 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="5989ecf4-c65d-4310-940c-468b5d2fa698" containerName="mariadb-account-create" Oct 03 14:26:49 crc kubenswrapper[4962]: E1003 14:26:49.795205 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc15827a-60c3-4aba-bdbe-234d52d256b1" containerName="mariadb-account-create" Oct 03 14:26:49 crc kubenswrapper[4962]: I1003 14:26:49.795213 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc15827a-60c3-4aba-bdbe-234d52d256b1" containerName="mariadb-account-create" Oct 03 14:26:49 crc kubenswrapper[4962]: E1003 14:26:49.795239 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="559da102-fa5c-4c60-89e9-e54596f5172a" containerName="mariadb-account-create" Oct 03 14:26:49 crc kubenswrapper[4962]: I1003 14:26:49.795250 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="559da102-fa5c-4c60-89e9-e54596f5172a" containerName="mariadb-account-create" Oct 03 14:26:49 crc kubenswrapper[4962]: I1003 14:26:49.795446 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="5989ecf4-c65d-4310-940c-468b5d2fa698" containerName="mariadb-account-create" Oct 03 14:26:49 crc kubenswrapper[4962]: I1003 14:26:49.795463 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="559da102-fa5c-4c60-89e9-e54596f5172a" containerName="mariadb-account-create" Oct 03 14:26:49 crc kubenswrapper[4962]: I1003 14:26:49.795477 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc15827a-60c3-4aba-bdbe-234d52d256b1" containerName="mariadb-account-create" Oct 03 14:26:49 crc kubenswrapper[4962]: I1003 14:26:49.796177 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qc5nh" Oct 03 14:26:49 crc kubenswrapper[4962]: I1003 14:26:49.798625 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 03 14:26:49 crc kubenswrapper[4962]: I1003 14:26:49.798835 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 03 14:26:49 crc kubenswrapper[4962]: I1003 14:26:49.799607 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-spl5x" Oct 03 14:26:49 crc kubenswrapper[4962]: I1003 14:26:49.820945 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qc5nh"] Oct 03 14:26:49 crc kubenswrapper[4962]: I1003 14:26:49.940485 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d13977e9-fabe-4c69-a8dc-18f841d73e6c-config-data\") pod \"nova-cell0-conductor-db-sync-qc5nh\" (UID: \"d13977e9-fabe-4c69-a8dc-18f841d73e6c\") " pod="openstack/nova-cell0-conductor-db-sync-qc5nh" Oct 03 14:26:49 crc kubenswrapper[4962]: I1003 14:26:49.941168 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh4m5\" (UniqueName: \"kubernetes.io/projected/d13977e9-fabe-4c69-a8dc-18f841d73e6c-kube-api-access-fh4m5\") pod \"nova-cell0-conductor-db-sync-qc5nh\" (UID: \"d13977e9-fabe-4c69-a8dc-18f841d73e6c\") " pod="openstack/nova-cell0-conductor-db-sync-qc5nh" Oct 03 14:26:49 crc kubenswrapper[4962]: I1003 14:26:49.941357 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d13977e9-fabe-4c69-a8dc-18f841d73e6c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qc5nh\" (UID: \"d13977e9-fabe-4c69-a8dc-18f841d73e6c\") " pod="openstack/nova-cell0-conductor-db-sync-qc5nh" Oct 03 14:26:49 crc kubenswrapper[4962]: I1003 14:26:49.941632 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d13977e9-fabe-4c69-a8dc-18f841d73e6c-scripts\") pod \"nova-cell0-conductor-db-sync-qc5nh\" (UID: \"d13977e9-fabe-4c69-a8dc-18f841d73e6c\") " pod="openstack/nova-cell0-conductor-db-sync-qc5nh" Oct 03 14:26:50 crc kubenswrapper[4962]: I1003 14:26:50.044385 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d13977e9-fabe-4c69-a8dc-18f841d73e6c-scripts\") pod \"nova-cell0-conductor-db-sync-qc5nh\" (UID: \"d13977e9-fabe-4c69-a8dc-18f841d73e6c\") " pod="openstack/nova-cell0-conductor-db-sync-qc5nh" Oct 03 14:26:50 crc kubenswrapper[4962]: I1003 14:26:50.044488 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d13977e9-fabe-4c69-a8dc-18f841d73e6c-config-data\") pod \"nova-cell0-conductor-db-sync-qc5nh\" (UID: \"d13977e9-fabe-4c69-a8dc-18f841d73e6c\") " pod="openstack/nova-cell0-conductor-db-sync-qc5nh" Oct 03 14:26:50 crc kubenswrapper[4962]: I1003 14:26:50.044541 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh4m5\" (UniqueName: \"kubernetes.io/projected/d13977e9-fabe-4c69-a8dc-18f841d73e6c-kube-api-access-fh4m5\") pod \"nova-cell0-conductor-db-sync-qc5nh\" (UID: 
\"d13977e9-fabe-4c69-a8dc-18f841d73e6c\") " pod="openstack/nova-cell0-conductor-db-sync-qc5nh" Oct 03 14:26:50 crc kubenswrapper[4962]: I1003 14:26:50.044602 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d13977e9-fabe-4c69-a8dc-18f841d73e6c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qc5nh\" (UID: \"d13977e9-fabe-4c69-a8dc-18f841d73e6c\") " pod="openstack/nova-cell0-conductor-db-sync-qc5nh" Oct 03 14:26:50 crc kubenswrapper[4962]: I1003 14:26:50.050080 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d13977e9-fabe-4c69-a8dc-18f841d73e6c-config-data\") pod \"nova-cell0-conductor-db-sync-qc5nh\" (UID: \"d13977e9-fabe-4c69-a8dc-18f841d73e6c\") " pod="openstack/nova-cell0-conductor-db-sync-qc5nh" Oct 03 14:26:50 crc kubenswrapper[4962]: I1003 14:26:50.050085 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d13977e9-fabe-4c69-a8dc-18f841d73e6c-scripts\") pod \"nova-cell0-conductor-db-sync-qc5nh\" (UID: \"d13977e9-fabe-4c69-a8dc-18f841d73e6c\") " pod="openstack/nova-cell0-conductor-db-sync-qc5nh" Oct 03 14:26:50 crc kubenswrapper[4962]: I1003 14:26:50.050173 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d13977e9-fabe-4c69-a8dc-18f841d73e6c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qc5nh\" (UID: \"d13977e9-fabe-4c69-a8dc-18f841d73e6c\") " pod="openstack/nova-cell0-conductor-db-sync-qc5nh" Oct 03 14:26:50 crc kubenswrapper[4962]: I1003 14:26:50.062117 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh4m5\" (UniqueName: \"kubernetes.io/projected/d13977e9-fabe-4c69-a8dc-18f841d73e6c-kube-api-access-fh4m5\") pod \"nova-cell0-conductor-db-sync-qc5nh\" (UID: \"d13977e9-fabe-4c69-a8dc-18f841d73e6c\") " pod="openstack/nova-cell0-conductor-db-sync-qc5nh" Oct 03 14:26:50 crc kubenswrapper[4962]: I1003 14:26:50.128851 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qc5nh" Oct 03 14:26:50 crc kubenswrapper[4962]: I1003 14:26:50.643972 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qc5nh"] Oct 03 14:26:51 crc kubenswrapper[4962]: I1003 14:26:51.281570 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qc5nh" event={"ID":"d13977e9-fabe-4c69-a8dc-18f841d73e6c","Type":"ContainerStarted","Data":"06f8a0e8d62bca357dedef5d867dfbe054c77020f2a02096f7d8a89930584f77"} Oct 03 14:26:51 crc kubenswrapper[4962]: I1003 14:26:51.282141 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qc5nh" event={"ID":"d13977e9-fabe-4c69-a8dc-18f841d73e6c","Type":"ContainerStarted","Data":"3035b616f5d95dad3b478374f88a60735019c55f1986885a7e7a9078ac6dc347"} Oct 03 14:26:51 crc kubenswrapper[4962]: I1003 14:26:51.299520 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-qc5nh" podStartSLOduration=2.299505422 podStartE2EDuration="2.299505422s" podCreationTimestamp="2025-10-03 14:26:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:26:51.296851343 +0000 UTC m=+5819.700749178" watchObservedRunningTime="2025-10-03 14:26:51.299505422 +0000 UTC m=+5819.703403257" Oct 03 14:26:53 crc kubenswrapper[4962]: I1003 14:26:53.227837 4962 scope.go:117] "RemoveContainer" containerID="d7dae192729a16d2e0ed7a375c7ee4990d673d8d62f560c5335516db0b1730db" Oct 03 14:26:53 crc kubenswrapper[4962]: E1003 14:26:53.228821 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:26:56 crc kubenswrapper[4962]: I1003 14:26:56.322535 4962 generic.go:334] "Generic (PLEG): container finished" podID="d13977e9-fabe-4c69-a8dc-18f841d73e6c" containerID="06f8a0e8d62bca357dedef5d867dfbe054c77020f2a02096f7d8a89930584f77" exitCode=0 Oct 03 14:26:56 crc kubenswrapper[4962]: I1003 14:26:56.322625 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qc5nh" event={"ID":"d13977e9-fabe-4c69-a8dc-18f841d73e6c","Type":"ContainerDied","Data":"06f8a0e8d62bca357dedef5d867dfbe054c77020f2a02096f7d8a89930584f77"} Oct 03 14:26:57 crc kubenswrapper[4962]: I1003 14:26:57.615726 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qc5nh" Oct 03 14:26:57 crc kubenswrapper[4962]: I1003 14:26:57.692242 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fh4m5\" (UniqueName: \"kubernetes.io/projected/d13977e9-fabe-4c69-a8dc-18f841d73e6c-kube-api-access-fh4m5\") pod \"d13977e9-fabe-4c69-a8dc-18f841d73e6c\" (UID: \"d13977e9-fabe-4c69-a8dc-18f841d73e6c\") " Oct 03 14:26:57 crc kubenswrapper[4962]: I1003 14:26:57.692386 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d13977e9-fabe-4c69-a8dc-18f841d73e6c-combined-ca-bundle\") pod \"d13977e9-fabe-4c69-a8dc-18f841d73e6c\" (UID: \"d13977e9-fabe-4c69-a8dc-18f841d73e6c\") " Oct 03 14:26:57 crc kubenswrapper[4962]: I1003 14:26:57.692561 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d13977e9-fabe-4c69-a8dc-18f841d73e6c-config-data\") pod \"d13977e9-fabe-4c69-a8dc-18f841d73e6c\" (UID: \"d13977e9-fabe-4c69-a8dc-18f841d73e6c\") " Oct 03 14:26:57 crc kubenswrapper[4962]: I1003 14:26:57.692615 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d13977e9-fabe-4c69-a8dc-18f841d73e6c-scripts\") pod \"d13977e9-fabe-4c69-a8dc-18f841d73e6c\" (UID: \"d13977e9-fabe-4c69-a8dc-18f841d73e6c\") " Oct 03 14:26:57 crc kubenswrapper[4962]: I1003 14:26:57.699602 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d13977e9-fabe-4c69-a8dc-18f841d73e6c-kube-api-access-fh4m5" (OuterVolumeSpecName: "kube-api-access-fh4m5") pod "d13977e9-fabe-4c69-a8dc-18f841d73e6c" (UID: "d13977e9-fabe-4c69-a8dc-18f841d73e6c"). InnerVolumeSpecName "kube-api-access-fh4m5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:26:57 crc kubenswrapper[4962]: I1003 14:26:57.700017 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d13977e9-fabe-4c69-a8dc-18f841d73e6c-scripts" (OuterVolumeSpecName: "scripts") pod "d13977e9-fabe-4c69-a8dc-18f841d73e6c" (UID: "d13977e9-fabe-4c69-a8dc-18f841d73e6c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:26:57 crc kubenswrapper[4962]: I1003 14:26:57.718001 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d13977e9-fabe-4c69-a8dc-18f841d73e6c-config-data" (OuterVolumeSpecName: "config-data") pod "d13977e9-fabe-4c69-a8dc-18f841d73e6c" (UID: "d13977e9-fabe-4c69-a8dc-18f841d73e6c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:26:57 crc kubenswrapper[4962]: I1003 14:26:57.723361 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d13977e9-fabe-4c69-a8dc-18f841d73e6c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d13977e9-fabe-4c69-a8dc-18f841d73e6c" (UID: "d13977e9-fabe-4c69-a8dc-18f841d73e6c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:26:57 crc kubenswrapper[4962]: I1003 14:26:57.794868 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d13977e9-fabe-4c69-a8dc-18f841d73e6c-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:26:57 crc kubenswrapper[4962]: I1003 14:26:57.794901 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d13977e9-fabe-4c69-a8dc-18f841d73e6c-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 14:26:57 crc kubenswrapper[4962]: I1003 14:26:57.794912 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fh4m5\" (UniqueName: \"kubernetes.io/projected/d13977e9-fabe-4c69-a8dc-18f841d73e6c-kube-api-access-fh4m5\") on node \"crc\" DevicePath \"\"" Oct 03 14:26:57 crc kubenswrapper[4962]: I1003 14:26:57.794922 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d13977e9-fabe-4c69-a8dc-18f841d73e6c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:26:58 crc kubenswrapper[4962]: I1003 14:26:58.339589 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qc5nh" event={"ID":"d13977e9-fabe-4c69-a8dc-18f841d73e6c","Type":"ContainerDied","Data":"3035b616f5d95dad3b478374f88a60735019c55f1986885a7e7a9078ac6dc347"} Oct 03 14:26:58 crc kubenswrapper[4962]: I1003 14:26:58.339627 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3035b616f5d95dad3b478374f88a60735019c55f1986885a7e7a9078ac6dc347" Oct 03 14:26:58 crc kubenswrapper[4962]: I1003 14:26:58.339773 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qc5nh" Oct 03 14:26:58 crc kubenswrapper[4962]: I1003 14:26:58.408620 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 14:26:58 crc kubenswrapper[4962]: E1003 14:26:58.409112 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d13977e9-fabe-4c69-a8dc-18f841d73e6c" containerName="nova-cell0-conductor-db-sync" Oct 03 14:26:58 crc kubenswrapper[4962]: I1003 14:26:58.409137 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="d13977e9-fabe-4c69-a8dc-18f841d73e6c" containerName="nova-cell0-conductor-db-sync" Oct 03 14:26:58 crc kubenswrapper[4962]: I1003 14:26:58.409305 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="d13977e9-fabe-4c69-a8dc-18f841d73e6c" containerName="nova-cell0-conductor-db-sync" Oct 03 14:26:58 crc kubenswrapper[4962]: I1003 14:26:58.409922 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 03 14:26:58 crc kubenswrapper[4962]: I1003 14:26:58.411681 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 03 14:26:58 crc kubenswrapper[4962]: I1003 14:26:58.412063 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-spl5x" Oct 03 14:26:58 crc kubenswrapper[4962]: I1003 14:26:58.416286 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 14:26:58 crc kubenswrapper[4962]: I1003 14:26:58.507609 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8361c760-4e3d-46d1-b1ad-73826855e693-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"8361c760-4e3d-46d1-b1ad-73826855e693\") " pod="openstack/nova-cell0-conductor-0" Oct 03 14:26:58 crc kubenswrapper[4962]: I1003 14:26:58.508074 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8361c760-4e3d-46d1-b1ad-73826855e693-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"8361c760-4e3d-46d1-b1ad-73826855e693\") " pod="openstack/nova-cell0-conductor-0" Oct 03 14:26:58 crc kubenswrapper[4962]: I1003 14:26:58.508239 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5b9jg\" (UniqueName: \"kubernetes.io/projected/8361c760-4e3d-46d1-b1ad-73826855e693-kube-api-access-5b9jg\") pod \"nova-cell0-conductor-0\" (UID: \"8361c760-4e3d-46d1-b1ad-73826855e693\") " pod="openstack/nova-cell0-conductor-0" Oct 03 14:26:58 crc kubenswrapper[4962]: I1003 14:26:58.610812 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8361c760-4e3d-46d1-b1ad-73826855e693-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"8361c760-4e3d-46d1-b1ad-73826855e693\") " pod="openstack/nova-cell0-conductor-0" Oct 03 14:26:58 crc kubenswrapper[4962]: I1003 14:26:58.611155 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8361c760-4e3d-46d1-b1ad-73826855e693-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"8361c760-4e3d-46d1-b1ad-73826855e693\") " pod="openstack/nova-cell0-conductor-0" Oct 03 14:26:58 crc kubenswrapper[4962]: I1003 14:26:58.611325 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5b9jg\" (UniqueName: \"kubernetes.io/projected/8361c760-4e3d-46d1-b1ad-73826855e693-kube-api-access-5b9jg\") pod \"nova-cell0-conductor-0\" (UID: \"8361c760-4e3d-46d1-b1ad-73826855e693\") " pod="openstack/nova-cell0-conductor-0" Oct 03 14:26:58 crc kubenswrapper[4962]: I1003 14:26:58.615348 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8361c760-4e3d-46d1-b1ad-73826855e693-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"8361c760-4e3d-46d1-b1ad-73826855e693\") " pod="openstack/nova-cell0-conductor-0" Oct 03 14:26:58 crc kubenswrapper[4962]: I1003 14:26:58.622119 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8361c760-4e3d-46d1-b1ad-73826855e693-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" 
(UID: \"8361c760-4e3d-46d1-b1ad-73826855e693\") " pod="openstack/nova-cell0-conductor-0" Oct 03 14:26:58 crc kubenswrapper[4962]: I1003 14:26:58.628279 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5b9jg\" (UniqueName: \"kubernetes.io/projected/8361c760-4e3d-46d1-b1ad-73826855e693-kube-api-access-5b9jg\") pod \"nova-cell0-conductor-0\" (UID: \"8361c760-4e3d-46d1-b1ad-73826855e693\") " pod="openstack/nova-cell0-conductor-0" Oct 03 14:26:58 crc kubenswrapper[4962]: I1003 14:26:58.734285 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 03 14:26:59 crc kubenswrapper[4962]: I1003 14:26:59.149413 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 14:26:59 crc kubenswrapper[4962]: I1003 14:26:59.349905 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"8361c760-4e3d-46d1-b1ad-73826855e693","Type":"ContainerStarted","Data":"a392e7f454efaf115628bada22dcfbb70d683b083922bfef80689ff4bec558f7"} Oct 03 14:26:59 crc kubenswrapper[4962]: I1003 14:26:59.350365 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"8361c760-4e3d-46d1-b1ad-73826855e693","Type":"ContainerStarted","Data":"461e5f8c9f9577b5cd331f0dbb825914569f58c56f394cfa53f8ede1f0ec3137"} Oct 03 14:26:59 crc kubenswrapper[4962]: I1003 14:26:59.350430 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 03 14:26:59 crc kubenswrapper[4962]: I1003 14:26:59.363582 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=1.363565266 podStartE2EDuration="1.363565266s" podCreationTimestamp="2025-10-03 14:26:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:26:59.361290667 +0000 UTC m=+5827.765188512" watchObservedRunningTime="2025-10-03 14:26:59.363565266 +0000 UTC m=+5827.767463101" Oct 03 14:27:01 crc kubenswrapper[4962]: I1003 14:27:01.031511 4962 scope.go:117] "RemoveContainer" containerID="2d3d3b215770e96653eccb58336c037e1515850ead95ae01bcff3479a9a85113" Oct 03 14:27:08 crc kubenswrapper[4962]: I1003 14:27:08.226911 4962 scope.go:117] "RemoveContainer" containerID="d7dae192729a16d2e0ed7a375c7ee4990d673d8d62f560c5335516db0b1730db" Oct 03 14:27:08 crc kubenswrapper[4962]: E1003 14:27:08.227398 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:27:08 crc kubenswrapper[4962]: I1003 14:27:08.758243 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.314264 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-bvvxc"] Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.316469 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-bvvxc" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.318805 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.319174 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.322109 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-bvvxc"] Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.396923 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8608a2df-334e-4b2c-a93d-05276e2afe0f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-bvvxc\" (UID: \"8608a2df-334e-4b2c-a93d-05276e2afe0f\") " pod="openstack/nova-cell0-cell-mapping-bvvxc" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.396986 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8608a2df-334e-4b2c-a93d-05276e2afe0f-config-data\") pod \"nova-cell0-cell-mapping-bvvxc\" (UID: \"8608a2df-334e-4b2c-a93d-05276e2afe0f\") " pod="openstack/nova-cell0-cell-mapping-bvvxc" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.397038 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbckj\" (UniqueName: \"kubernetes.io/projected/8608a2df-334e-4b2c-a93d-05276e2afe0f-kube-api-access-jbckj\") pod \"nova-cell0-cell-mapping-bvvxc\" (UID: \"8608a2df-334e-4b2c-a93d-05276e2afe0f\") " pod="openstack/nova-cell0-cell-mapping-bvvxc" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.397078 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8608a2df-334e-4b2c-a93d-05276e2afe0f-scripts\") pod \"nova-cell0-cell-mapping-bvvxc\" (UID: \"8608a2df-334e-4b2c-a93d-05276e2afe0f\") " pod="openstack/nova-cell0-cell-mapping-bvvxc" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.441095 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.442860 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.446172 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.465468 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.498830 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8608a2df-334e-4b2c-a93d-05276e2afe0f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-bvvxc\" (UID: \"8608a2df-334e-4b2c-a93d-05276e2afe0f\") " pod="openstack/nova-cell0-cell-mapping-bvvxc" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.498887 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8608a2df-334e-4b2c-a93d-05276e2afe0f-config-data\") pod \"nova-cell0-cell-mapping-bvvxc\" (UID: \"8608a2df-334e-4b2c-a93d-05276e2afe0f\") " pod="openstack/nova-cell0-cell-mapping-bvvxc" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.498916 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb7089a7-df54-44fd-b9b0-59182a38fbd6-config-data\") pod \"nova-scheduler-0\" (UID: \"fb7089a7-df54-44fd-b9b0-59182a38fbd6\") " pod="openstack/nova-scheduler-0" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.498944 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbckj\" (UniqueName: \"kubernetes.io/projected/8608a2df-334e-4b2c-a93d-05276e2afe0f-kube-api-access-jbckj\") pod \"nova-cell0-cell-mapping-bvvxc\" (UID: \"8608a2df-334e-4b2c-a93d-05276e2afe0f\") " pod="openstack/nova-cell0-cell-mapping-bvvxc" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.498987 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb7089a7-df54-44fd-b9b0-59182a38fbd6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fb7089a7-df54-44fd-b9b0-59182a38fbd6\") " pod="openstack/nova-scheduler-0" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.499010 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8608a2df-334e-4b2c-a93d-05276e2afe0f-scripts\") pod \"nova-cell0-cell-mapping-bvvxc\" (UID: \"8608a2df-334e-4b2c-a93d-05276e2afe0f\") " pod="openstack/nova-cell0-cell-mapping-bvvxc" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.499042 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dr7g\" (UniqueName: \"kubernetes.io/projected/fb7089a7-df54-44fd-b9b0-59182a38fbd6-kube-api-access-7dr7g\") pod \"nova-scheduler-0\" (UID: \"fb7089a7-df54-44fd-b9b0-59182a38fbd6\") " pod="openstack/nova-scheduler-0" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.506187 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8608a2df-334e-4b2c-a93d-05276e2afe0f-scripts\") pod \"nova-cell0-cell-mapping-bvvxc\" (UID: \"8608a2df-334e-4b2c-a93d-05276e2afe0f\") " pod="openstack/nova-cell0-cell-mapping-bvvxc" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.508420 4962 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8608a2df-334e-4b2c-a93d-05276e2afe0f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-bvvxc\" (UID: \"8608a2df-334e-4b2c-a93d-05276e2afe0f\") " pod="openstack/nova-cell0-cell-mapping-bvvxc" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.510531 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8608a2df-334e-4b2c-a93d-05276e2afe0f-config-data\") pod \"nova-cell0-cell-mapping-bvvxc\" (UID: \"8608a2df-334e-4b2c-a93d-05276e2afe0f\") " pod="openstack/nova-cell0-cell-mapping-bvvxc" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.544340 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbckj\" (UniqueName: \"kubernetes.io/projected/8608a2df-334e-4b2c-a93d-05276e2afe0f-kube-api-access-jbckj\") pod \"nova-cell0-cell-mapping-bvvxc\" (UID: \"8608a2df-334e-4b2c-a93d-05276e2afe0f\") " pod="openstack/nova-cell0-cell-mapping-bvvxc" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.585771 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.587187 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.595292 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.600491 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb7089a7-df54-44fd-b9b0-59182a38fbd6-config-data\") pod \"nova-scheduler-0\" (UID: \"fb7089a7-df54-44fd-b9b0-59182a38fbd6\") " pod="openstack/nova-scheduler-0" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.601196 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb7089a7-df54-44fd-b9b0-59182a38fbd6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fb7089a7-df54-44fd-b9b0-59182a38fbd6\") " pod="openstack/nova-scheduler-0" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.601322 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dr7g\" (UniqueName: \"kubernetes.io/projected/fb7089a7-df54-44fd-b9b0-59182a38fbd6-kube-api-access-7dr7g\") pod \"nova-scheduler-0\" (UID: \"fb7089a7-df54-44fd-b9b0-59182a38fbd6\") " pod="openstack/nova-scheduler-0" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.607850 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb7089a7-df54-44fd-b9b0-59182a38fbd6-config-data\") pod \"nova-scheduler-0\" (UID: \"fb7089a7-df54-44fd-b9b0-59182a38fbd6\") " pod="openstack/nova-scheduler-0" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.617791 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.628105 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb7089a7-df54-44fd-b9b0-59182a38fbd6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fb7089a7-df54-44fd-b9b0-59182a38fbd6\") " pod="openstack/nova-scheduler-0" Oct 03 
14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.634995 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dr7g\" (UniqueName: \"kubernetes.io/projected/fb7089a7-df54-44fd-b9b0-59182a38fbd6-kube-api-access-7dr7g\") pod \"nova-scheduler-0\" (UID: \"fb7089a7-df54-44fd-b9b0-59182a38fbd6\") " pod="openstack/nova-scheduler-0" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.642863 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-bvvxc" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.650884 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.652119 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.662152 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.682760 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.705135 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/701eee88-43d7-4337-b9fc-ebf93df71fd6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"701eee88-43d7-4337-b9fc-ebf93df71fd6\") " pod="openstack/nova-metadata-0" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.705187 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/701eee88-43d7-4337-b9fc-ebf93df71fd6-logs\") pod \"nova-metadata-0\" (UID: \"701eee88-43d7-4337-b9fc-ebf93df71fd6\") " pod="openstack/nova-metadata-0" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.705221 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfmzj\" (UniqueName: \"kubernetes.io/projected/2c6c0e02-3da8-4bf8-9b04-8684a07876fa-kube-api-access-tfmzj\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c6c0e02-3da8-4bf8-9b04-8684a07876fa\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.705297 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjttd\" (UniqueName: \"kubernetes.io/projected/701eee88-43d7-4337-b9fc-ebf93df71fd6-kube-api-access-hjttd\") pod \"nova-metadata-0\" (UID: \"701eee88-43d7-4337-b9fc-ebf93df71fd6\") " pod="openstack/nova-metadata-0" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.705321 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/701eee88-43d7-4337-b9fc-ebf93df71fd6-config-data\") pod \"nova-metadata-0\" (UID: \"701eee88-43d7-4337-b9fc-ebf93df71fd6\") " pod="openstack/nova-metadata-0" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.705365 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c6c0e02-3da8-4bf8-9b04-8684a07876fa-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c6c0e02-3da8-4bf8-9b04-8684a07876fa\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 
14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.705390 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c6c0e02-3da8-4bf8-9b04-8684a07876fa-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c6c0e02-3da8-4bf8-9b04-8684a07876fa\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.757930 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.759710 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.780759 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-fc6d855cf-g8lbz"] Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.782305 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fc6d855cf-g8lbz" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.783998 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.789714 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.795738 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.808192 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c6c0e02-3da8-4bf8-9b04-8684a07876fa-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c6c0e02-3da8-4bf8-9b04-8684a07876fa\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.808252 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c6c0e02-3da8-4bf8-9b04-8684a07876fa-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c6c0e02-3da8-4bf8-9b04-8684a07876fa\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.808320 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzp7q\" (UniqueName: \"kubernetes.io/projected/bb094c37-5f75-414d-816c-0a4af71943a5-kube-api-access-vzp7q\") pod \"nova-api-0\" (UID: \"bb094c37-5f75-414d-816c-0a4af71943a5\") " pod="openstack/nova-api-0" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.808344 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/701eee88-43d7-4337-b9fc-ebf93df71fd6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"701eee88-43d7-4337-b9fc-ebf93df71fd6\") " pod="openstack/nova-metadata-0" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.808368 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/701eee88-43d7-4337-b9fc-ebf93df71fd6-logs\") pod \"nova-metadata-0\" (UID: \"701eee88-43d7-4337-b9fc-ebf93df71fd6\") " pod="openstack/nova-metadata-0" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.808419 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb094c37-5f75-414d-816c-0a4af71943a5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bb094c37-5f75-414d-816c-0a4af71943a5\") " pod="openstack/nova-api-0" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.808448 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfmzj\" (UniqueName: \"kubernetes.io/projected/2c6c0e02-3da8-4bf8-9b04-8684a07876fa-kube-api-access-tfmzj\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c6c0e02-3da8-4bf8-9b04-8684a07876fa\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.808493 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb094c37-5f75-414d-816c-0a4af71943a5-config-data\") pod \"nova-api-0\" (UID: \"bb094c37-5f75-414d-816c-0a4af71943a5\") " pod="openstack/nova-api-0" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.808518 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb094c37-5f75-414d-816c-0a4af71943a5-logs\") pod \"nova-api-0\" (UID: \"bb094c37-5f75-414d-816c-0a4af71943a5\") " pod="openstack/nova-api-0" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.808543 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjttd\" (UniqueName: \"kubernetes.io/projected/701eee88-43d7-4337-b9fc-ebf93df71fd6-kube-api-access-hjttd\") pod \"nova-metadata-0\" (UID: \"701eee88-43d7-4337-b9fc-ebf93df71fd6\") " pod="openstack/nova-metadata-0" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.808573 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/701eee88-43d7-4337-b9fc-ebf93df71fd6-config-data\") pod \"nova-metadata-0\" (UID: \"701eee88-43d7-4337-b9fc-ebf93df71fd6\") " pod="openstack/nova-metadata-0" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.809574 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/701eee88-43d7-4337-b9fc-ebf93df71fd6-logs\") pod \"nova-metadata-0\" (UID: \"701eee88-43d7-4337-b9fc-ebf93df71fd6\") " pod="openstack/nova-metadata-0" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.823687 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/701eee88-43d7-4337-b9fc-ebf93df71fd6-config-data\") pod \"nova-metadata-0\" (UID: \"701eee88-43d7-4337-b9fc-ebf93df71fd6\") " pod="openstack/nova-metadata-0" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.824396 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/701eee88-43d7-4337-b9fc-ebf93df71fd6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"701eee88-43d7-4337-b9fc-ebf93df71fd6\") " pod="openstack/nova-metadata-0" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.835690 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fc6d855cf-g8lbz"] Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.837876 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c6c0e02-3da8-4bf8-9b04-8684a07876fa-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"2c6c0e02-3da8-4bf8-9b04-8684a07876fa\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.838431 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c6c0e02-3da8-4bf8-9b04-8684a07876fa-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c6c0e02-3da8-4bf8-9b04-8684a07876fa\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.857518 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjttd\" (UniqueName: \"kubernetes.io/projected/701eee88-43d7-4337-b9fc-ebf93df71fd6-kube-api-access-hjttd\") pod \"nova-metadata-0\" (UID: \"701eee88-43d7-4337-b9fc-ebf93df71fd6\") " pod="openstack/nova-metadata-0" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.882735 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfmzj\" (UniqueName: \"kubernetes.io/projected/2c6c0e02-3da8-4bf8-9b04-8684a07876fa-kube-api-access-tfmzj\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c6c0e02-3da8-4bf8-9b04-8684a07876fa\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.917576 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3077b379-ec45-4f83-9918-da7de38b871c-ovsdbserver-nb\") pod \"dnsmasq-dns-fc6d855cf-g8lbz\" (UID: \"3077b379-ec45-4f83-9918-da7de38b871c\") " pod="openstack/dnsmasq-dns-fc6d855cf-g8lbz" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.917645 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb094c37-5f75-414d-816c-0a4af71943a5-config-data\") pod \"nova-api-0\" (UID: \"bb094c37-5f75-414d-816c-0a4af71943a5\") " pod="openstack/nova-api-0" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.917669 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb094c37-5f75-414d-816c-0a4af71943a5-logs\") pod \"nova-api-0\" (UID: \"bb094c37-5f75-414d-816c-0a4af71943a5\") " pod="openstack/nova-api-0" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.917715 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3077b379-ec45-4f83-9918-da7de38b871c-config\") pod \"dnsmasq-dns-fc6d855cf-g8lbz\" (UID: \"3077b379-ec45-4f83-9918-da7de38b871c\") " pod="openstack/dnsmasq-dns-fc6d855cf-g8lbz" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.917790 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3077b379-ec45-4f83-9918-da7de38b871c-dns-svc\") pod \"dnsmasq-dns-fc6d855cf-g8lbz\" (UID: \"3077b379-ec45-4f83-9918-da7de38b871c\") " pod="openstack/dnsmasq-dns-fc6d855cf-g8lbz" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.917810 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzp7q\" (UniqueName: \"kubernetes.io/projected/bb094c37-5f75-414d-816c-0a4af71943a5-kube-api-access-vzp7q\") pod \"nova-api-0\" (UID: \"bb094c37-5f75-414d-816c-0a4af71943a5\") " pod="openstack/nova-api-0" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.917830 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3077b379-ec45-4f83-9918-da7de38b871c-ovsdbserver-sb\") pod \"dnsmasq-dns-fc6d855cf-g8lbz\" (UID: \"3077b379-ec45-4f83-9918-da7de38b871c\") " pod="openstack/dnsmasq-dns-fc6d855cf-g8lbz" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.917851 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2tzc\" (UniqueName: \"kubernetes.io/projected/3077b379-ec45-4f83-9918-da7de38b871c-kube-api-access-c2tzc\") pod \"dnsmasq-dns-fc6d855cf-g8lbz\" (UID: \"3077b379-ec45-4f83-9918-da7de38b871c\") " pod="openstack/dnsmasq-dns-fc6d855cf-g8lbz" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.917883 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb094c37-5f75-414d-816c-0a4af71943a5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bb094c37-5f75-414d-816c-0a4af71943a5\") " pod="openstack/nova-api-0" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.919276 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb094c37-5f75-414d-816c-0a4af71943a5-logs\") pod \"nova-api-0\" (UID: \"bb094c37-5f75-414d-816c-0a4af71943a5\") " pod="openstack/nova-api-0" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.923410 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb094c37-5f75-414d-816c-0a4af71943a5-config-data\") pod \"nova-api-0\" (UID: \"bb094c37-5f75-414d-816c-0a4af71943a5\") " pod="openstack/nova-api-0" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.943498 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb094c37-5f75-414d-816c-0a4af71943a5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bb094c37-5f75-414d-816c-0a4af71943a5\") " pod="openstack/nova-api-0" Oct 03 14:27:09 crc kubenswrapper[4962]: I1003 14:27:09.972260 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzp7q\" (UniqueName: \"kubernetes.io/projected/bb094c37-5f75-414d-816c-0a4af71943a5-kube-api-access-vzp7q\") pod \"nova-api-0\" (UID: \"bb094c37-5f75-414d-816c-0a4af71943a5\") " pod="openstack/nova-api-0" Oct 03 14:27:10 crc kubenswrapper[4962]: I1003 14:27:10.021378 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3077b379-ec45-4f83-9918-da7de38b871c-dns-svc\") pod \"dnsmasq-dns-fc6d855cf-g8lbz\" (UID: \"3077b379-ec45-4f83-9918-da7de38b871c\") " pod="openstack/dnsmasq-dns-fc6d855cf-g8lbz" Oct 03 14:27:10 crc kubenswrapper[4962]: I1003 14:27:10.021430 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3077b379-ec45-4f83-9918-da7de38b871c-ovsdbserver-sb\") pod \"dnsmasq-dns-fc6d855cf-g8lbz\" (UID: \"3077b379-ec45-4f83-9918-da7de38b871c\") " pod="openstack/dnsmasq-dns-fc6d855cf-g8lbz" Oct 03 14:27:10 crc kubenswrapper[4962]: I1003 14:27:10.021453 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2tzc\" (UniqueName: \"kubernetes.io/projected/3077b379-ec45-4f83-9918-da7de38b871c-kube-api-access-c2tzc\") pod \"dnsmasq-dns-fc6d855cf-g8lbz\" (UID: 
\"3077b379-ec45-4f83-9918-da7de38b871c\") " pod="openstack/dnsmasq-dns-fc6d855cf-g8lbz" Oct 03 14:27:10 crc kubenswrapper[4962]: I1003 14:27:10.021521 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3077b379-ec45-4f83-9918-da7de38b871c-ovsdbserver-nb\") pod \"dnsmasq-dns-fc6d855cf-g8lbz\" (UID: \"3077b379-ec45-4f83-9918-da7de38b871c\") " pod="openstack/dnsmasq-dns-fc6d855cf-g8lbz" Oct 03 14:27:10 crc kubenswrapper[4962]: I1003 14:27:10.021580 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3077b379-ec45-4f83-9918-da7de38b871c-config\") pod \"dnsmasq-dns-fc6d855cf-g8lbz\" (UID: \"3077b379-ec45-4f83-9918-da7de38b871c\") " pod="openstack/dnsmasq-dns-fc6d855cf-g8lbz" Oct 03 14:27:10 crc kubenswrapper[4962]: I1003 14:27:10.025723 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3077b379-ec45-4f83-9918-da7de38b871c-config\") pod \"dnsmasq-dns-fc6d855cf-g8lbz\" (UID: \"3077b379-ec45-4f83-9918-da7de38b871c\") " pod="openstack/dnsmasq-dns-fc6d855cf-g8lbz" Oct 03 14:27:10 crc kubenswrapper[4962]: I1003 14:27:10.028759 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3077b379-ec45-4f83-9918-da7de38b871c-ovsdbserver-sb\") pod \"dnsmasq-dns-fc6d855cf-g8lbz\" (UID: \"3077b379-ec45-4f83-9918-da7de38b871c\") " pod="openstack/dnsmasq-dns-fc6d855cf-g8lbz" Oct 03 14:27:10 crc kubenswrapper[4962]: I1003 14:27:10.029276 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3077b379-ec45-4f83-9918-da7de38b871c-ovsdbserver-nb\") pod \"dnsmasq-dns-fc6d855cf-g8lbz\" (UID: \"3077b379-ec45-4f83-9918-da7de38b871c\") " pod="openstack/dnsmasq-dns-fc6d855cf-g8lbz" Oct 03 14:27:10 crc kubenswrapper[4962]: I1003 14:27:10.032536 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3077b379-ec45-4f83-9918-da7de38b871c-dns-svc\") pod \"dnsmasq-dns-fc6d855cf-g8lbz\" (UID: \"3077b379-ec45-4f83-9918-da7de38b871c\") " pod="openstack/dnsmasq-dns-fc6d855cf-g8lbz" Oct 03 14:27:10 crc kubenswrapper[4962]: I1003 14:27:10.050228 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2tzc\" (UniqueName: \"kubernetes.io/projected/3077b379-ec45-4f83-9918-da7de38b871c-kube-api-access-c2tzc\") pod \"dnsmasq-dns-fc6d855cf-g8lbz\" (UID: \"3077b379-ec45-4f83-9918-da7de38b871c\") " pod="openstack/dnsmasq-dns-fc6d855cf-g8lbz" Oct 03 14:27:10 crc kubenswrapper[4962]: I1003 14:27:10.062322 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 14:27:10 crc kubenswrapper[4962]: I1003 14:27:10.101000 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:27:10 crc kubenswrapper[4962]: I1003 14:27:10.140374 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 14:27:10 crc kubenswrapper[4962]: I1003 14:27:10.183475 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fc6d855cf-g8lbz" Oct 03 14:27:10 crc kubenswrapper[4962]: I1003 14:27:10.220426 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 14:27:10 crc kubenswrapper[4962]: I1003 14:27:10.385964 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-bvvxc"] Oct 03 14:27:10 crc kubenswrapper[4962]: I1003 14:27:10.445864 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fb7089a7-df54-44fd-b9b0-59182a38fbd6","Type":"ContainerStarted","Data":"af1e8682c41a60a1533b861ddf7712d307193c6119f6392d22019fd467c44380"} Oct 03 14:27:10 crc kubenswrapper[4962]: I1003 14:27:10.447554 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-bvvxc" event={"ID":"8608a2df-334e-4b2c-a93d-05276e2afe0f","Type":"ContainerStarted","Data":"2d92ed08e7fb783c2f0f24a42efb3ad8f0f606d1648ac2220054c4ba471ba438"} Oct 03 14:27:10 crc kubenswrapper[4962]: I1003 14:27:10.649277 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 14:27:10 crc kubenswrapper[4962]: W1003 14:27:10.652774 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c6c0e02_3da8_4bf8_9b04_8684a07876fa.slice/crio-d707341acfac482d064fc849633df2b1e94a25c2e2b2f925fa9710924931bcc6 WatchSource:0}: Error finding container d707341acfac482d064fc849633df2b1e94a25c2e2b2f925fa9710924931bcc6: Status 404 returned error can't find the container with id d707341acfac482d064fc849633df2b1e94a25c2e2b2f925fa9710924931bcc6 Oct 03 14:27:10 crc kubenswrapper[4962]: I1003 14:27:10.728610 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 14:27:10 crc kubenswrapper[4962]: W1003 14:27:10.730715 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod701eee88_43d7_4337_b9fc_ebf93df71fd6.slice/crio-f57e2b970a6654b50549ca7402e0158d60aa591417d1e3b9833095d6a37fcf64 WatchSource:0}: Error finding container f57e2b970a6654b50549ca7402e0158d60aa591417d1e3b9833095d6a37fcf64: Status 404 returned error can't find the container with id f57e2b970a6654b50549ca7402e0158d60aa591417d1e3b9833095d6a37fcf64 Oct 03 14:27:10 crc kubenswrapper[4962]: I1003 14:27:10.789412 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 14:27:10 crc kubenswrapper[4962]: I1003 14:27:10.801710 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fc6d855cf-g8lbz"] Oct 03 14:27:10 crc kubenswrapper[4962]: I1003 14:27:10.907537 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9d2p8"] Oct 03 14:27:10 crc kubenswrapper[4962]: I1003 14:27:10.910889 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-9d2p8" Oct 03 14:27:10 crc kubenswrapper[4962]: I1003 14:27:10.913408 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 03 14:27:10 crc kubenswrapper[4962]: I1003 14:27:10.914278 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 03 14:27:10 crc kubenswrapper[4962]: I1003 14:27:10.920266 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9d2p8"] Oct 03 14:27:10 crc kubenswrapper[4962]: I1003 14:27:10.956424 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eccc4804-3bd4-428b-9ff6-cd364d7f61b9-scripts\") pod \"nova-cell1-conductor-db-sync-9d2p8\" (UID: \"eccc4804-3bd4-428b-9ff6-cd364d7f61b9\") " pod="openstack/nova-cell1-conductor-db-sync-9d2p8" Oct 03 14:27:10 crc kubenswrapper[4962]: I1003 14:27:10.965549 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bg4kf\" (UniqueName: \"kubernetes.io/projected/eccc4804-3bd4-428b-9ff6-cd364d7f61b9-kube-api-access-bg4kf\") pod \"nova-cell1-conductor-db-sync-9d2p8\" (UID: \"eccc4804-3bd4-428b-9ff6-cd364d7f61b9\") " pod="openstack/nova-cell1-conductor-db-sync-9d2p8" Oct 03 14:27:10 crc kubenswrapper[4962]: I1003 14:27:10.965604 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eccc4804-3bd4-428b-9ff6-cd364d7f61b9-config-data\") pod \"nova-cell1-conductor-db-sync-9d2p8\" (UID: \"eccc4804-3bd4-428b-9ff6-cd364d7f61b9\") " pod="openstack/nova-cell1-conductor-db-sync-9d2p8" Oct 03 14:27:10 crc kubenswrapper[4962]: I1003 14:27:10.965821 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eccc4804-3bd4-428b-9ff6-cd364d7f61b9-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-9d2p8\" (UID: \"eccc4804-3bd4-428b-9ff6-cd364d7f61b9\") " pod="openstack/nova-cell1-conductor-db-sync-9d2p8" Oct 03 14:27:11 crc kubenswrapper[4962]: I1003 14:27:11.067467 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eccc4804-3bd4-428b-9ff6-cd364d7f61b9-scripts\") pod \"nova-cell1-conductor-db-sync-9d2p8\" (UID: \"eccc4804-3bd4-428b-9ff6-cd364d7f61b9\") " pod="openstack/nova-cell1-conductor-db-sync-9d2p8" Oct 03 14:27:11 crc kubenswrapper[4962]: I1003 14:27:11.067568 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bg4kf\" (UniqueName: \"kubernetes.io/projected/eccc4804-3bd4-428b-9ff6-cd364d7f61b9-kube-api-access-bg4kf\") pod \"nova-cell1-conductor-db-sync-9d2p8\" (UID: \"eccc4804-3bd4-428b-9ff6-cd364d7f61b9\") " pod="openstack/nova-cell1-conductor-db-sync-9d2p8" Oct 03 14:27:11 crc kubenswrapper[4962]: I1003 14:27:11.067598 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eccc4804-3bd4-428b-9ff6-cd364d7f61b9-config-data\") pod \"nova-cell1-conductor-db-sync-9d2p8\" (UID: \"eccc4804-3bd4-428b-9ff6-cd364d7f61b9\") " pod="openstack/nova-cell1-conductor-db-sync-9d2p8" Oct 03 14:27:11 crc kubenswrapper[4962]: I1003 14:27:11.067629 4962 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eccc4804-3bd4-428b-9ff6-cd364d7f61b9-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-9d2p8\" (UID: \"eccc4804-3bd4-428b-9ff6-cd364d7f61b9\") " pod="openstack/nova-cell1-conductor-db-sync-9d2p8" Oct 03 14:27:11 crc kubenswrapper[4962]: I1003 14:27:11.071346 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eccc4804-3bd4-428b-9ff6-cd364d7f61b9-scripts\") pod \"nova-cell1-conductor-db-sync-9d2p8\" (UID: \"eccc4804-3bd4-428b-9ff6-cd364d7f61b9\") " pod="openstack/nova-cell1-conductor-db-sync-9d2p8" Oct 03 14:27:11 crc kubenswrapper[4962]: I1003 14:27:11.076153 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eccc4804-3bd4-428b-9ff6-cd364d7f61b9-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-9d2p8\" (UID: \"eccc4804-3bd4-428b-9ff6-cd364d7f61b9\") " pod="openstack/nova-cell1-conductor-db-sync-9d2p8" Oct 03 14:27:11 crc kubenswrapper[4962]: I1003 14:27:11.076615 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eccc4804-3bd4-428b-9ff6-cd364d7f61b9-config-data\") pod \"nova-cell1-conductor-db-sync-9d2p8\" (UID: \"eccc4804-3bd4-428b-9ff6-cd364d7f61b9\") " pod="openstack/nova-cell1-conductor-db-sync-9d2p8" Oct 03 14:27:11 crc kubenswrapper[4962]: I1003 14:27:11.089123 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg4kf\" (UniqueName: \"kubernetes.io/projected/eccc4804-3bd4-428b-9ff6-cd364d7f61b9-kube-api-access-bg4kf\") pod \"nova-cell1-conductor-db-sync-9d2p8\" (UID: \"eccc4804-3bd4-428b-9ff6-cd364d7f61b9\") " pod="openstack/nova-cell1-conductor-db-sync-9d2p8" Oct 03 14:27:11 crc kubenswrapper[4962]: I1003 14:27:11.250169 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-9d2p8" Oct 03 14:27:11 crc kubenswrapper[4962]: I1003 14:27:11.463044 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bb094c37-5f75-414d-816c-0a4af71943a5","Type":"ContainerStarted","Data":"45aed1c604b29ed305a6f9ce025334cf683fa5684a98f2628409955502d2af58"} Oct 03 14:27:11 crc kubenswrapper[4962]: I1003 14:27:11.463440 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bb094c37-5f75-414d-816c-0a4af71943a5","Type":"ContainerStarted","Data":"c70b4483a539dec38035313b3b6bdf11d5121661b2033feb1573ad044b36a4d4"} Oct 03 14:27:11 crc kubenswrapper[4962]: I1003 14:27:11.467354 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2c6c0e02-3da8-4bf8-9b04-8684a07876fa","Type":"ContainerStarted","Data":"b741ee8462471ff407f58668c709daae8948307c3812af0158ad8c2880da2eca"} Oct 03 14:27:11 crc kubenswrapper[4962]: I1003 14:27:11.467410 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2c6c0e02-3da8-4bf8-9b04-8684a07876fa","Type":"ContainerStarted","Data":"d707341acfac482d064fc849633df2b1e94a25c2e2b2f925fa9710924931bcc6"} Oct 03 14:27:11 crc kubenswrapper[4962]: I1003 14:27:11.482167 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fb7089a7-df54-44fd-b9b0-59182a38fbd6","Type":"ContainerStarted","Data":"cb42a8981803b840795f696d1cbd0885c56d902616d9db268be5fe92e264ab80"} Oct 03 14:27:11 crc kubenswrapper[4962]: I1003 14:27:11.489552 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-bvvxc" event={"ID":"8608a2df-334e-4b2c-a93d-05276e2afe0f","Type":"ContainerStarted","Data":"dd33081fcdc467722740d05790b9837884bd6833fc620dffc078559f01138d65"} Oct 03 14:27:11 crc kubenswrapper[4962]: I1003 14:27:11.500282 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.500258348 podStartE2EDuration="2.500258348s" podCreationTimestamp="2025-10-03 14:27:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:27:11.488708606 +0000 UTC m=+5839.892606441" watchObservedRunningTime="2025-10-03 14:27:11.500258348 +0000 UTC m=+5839.904156183" Oct 03 14:27:11 crc kubenswrapper[4962]: I1003 14:27:11.521589 4962 generic.go:334] "Generic (PLEG): container finished" podID="3077b379-ec45-4f83-9918-da7de38b871c" containerID="d2f4e3c9ce8ee87de1cc861cb2364de1ca8e2f046ac7918dd5152574b4f1de2b" exitCode=0 Oct 03 14:27:11 crc kubenswrapper[4962]: I1003 14:27:11.521714 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fc6d855cf-g8lbz" event={"ID":"3077b379-ec45-4f83-9918-da7de38b871c","Type":"ContainerDied","Data":"d2f4e3c9ce8ee87de1cc861cb2364de1ca8e2f046ac7918dd5152574b4f1de2b"} Oct 03 14:27:11 crc kubenswrapper[4962]: I1003 14:27:11.521761 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fc6d855cf-g8lbz" event={"ID":"3077b379-ec45-4f83-9918-da7de38b871c","Type":"ContainerStarted","Data":"753bc90a72db96eb7a32068ba80b3d271355b16683286ba45debc165173d320b"} Oct 03 14:27:11 crc kubenswrapper[4962]: I1003 14:27:11.541628 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"701eee88-43d7-4337-b9fc-ebf93df71fd6","Type":"ContainerStarted","Data":"82844723fd70bc7db233a3e828f103af859638223f267b807b223699ec271808"} Oct 03 14:27:11 crc kubenswrapper[4962]: I1003 14:27:11.541703 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"701eee88-43d7-4337-b9fc-ebf93df71fd6","Type":"ContainerStarted","Data":"f57e2b970a6654b50549ca7402e0158d60aa591417d1e3b9833095d6a37fcf64"} Oct 03 14:27:11 crc kubenswrapper[4962]: I1003 14:27:11.558850 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.5588312589999997 podStartE2EDuration="2.558831259s" podCreationTimestamp="2025-10-03 14:27:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:27:11.506311036 +0000 UTC m=+5839.910208871" watchObservedRunningTime="2025-10-03 14:27:11.558831259 +0000 UTC m=+5839.962729084" Oct 03 14:27:11 crc kubenswrapper[4962]: I1003 14:27:11.576406 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-bvvxc" podStartSLOduration=2.576385138 podStartE2EDuration="2.576385138s" podCreationTimestamp="2025-10-03 14:27:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:27:11.533314802 +0000 UTC m=+5839.937212637" watchObservedRunningTime="2025-10-03 14:27:11.576385138 +0000 UTC m=+5839.980282963" Oct 03 14:27:11 crc kubenswrapper[4962]: I1003 14:27:11.725797 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9d2p8"] Oct 03 14:27:12 crc kubenswrapper[4962]: I1003 14:27:12.556843 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-9d2p8" event={"ID":"eccc4804-3bd4-428b-9ff6-cd364d7f61b9","Type":"ContainerStarted","Data":"9e997fc2cdcfe42eb6d1e97272d49f7ecb908656cee0906191ba94b1cff9f8b4"} Oct 03 14:27:12 crc kubenswrapper[4962]: I1003 14:27:12.557218 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-9d2p8" event={"ID":"eccc4804-3bd4-428b-9ff6-cd364d7f61b9","Type":"ContainerStarted","Data":"5bfe30d42fb9556272170544d99a9aa03647d18e629121a3bb8cec505dba10e0"} Oct 03 14:27:12 crc kubenswrapper[4962]: I1003 14:27:12.566061 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fc6d855cf-g8lbz" event={"ID":"3077b379-ec45-4f83-9918-da7de38b871c","Type":"ContainerStarted","Data":"c776e40f624375deb15523775e764ffd7088a4fff687e80a560aa19c5722fbd4"} Oct 03 14:27:12 crc kubenswrapper[4962]: I1003 14:27:12.566259 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-fc6d855cf-g8lbz" Oct 03 14:27:12 crc kubenswrapper[4962]: I1003 14:27:12.572571 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"701eee88-43d7-4337-b9fc-ebf93df71fd6","Type":"ContainerStarted","Data":"b41b6ead49c2c8f9ce8d543666553761dcad9a79da295ea7bf9e55d682d86bf4"} Oct 03 14:27:12 crc kubenswrapper[4962]: I1003 14:27:12.579795 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bb094c37-5f75-414d-816c-0a4af71943a5","Type":"ContainerStarted","Data":"f07c6d76beeae6f98ae1eefa067e80542838b50ea966db8e4ed8672233198ad4"} Oct 03 14:27:12 crc kubenswrapper[4962]: I1003 14:27:12.584869 
4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-9d2p8" podStartSLOduration=2.584841557 podStartE2EDuration="2.584841557s" podCreationTimestamp="2025-10-03 14:27:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:27:12.57886205 +0000 UTC m=+5840.982759925" watchObservedRunningTime="2025-10-03 14:27:12.584841557 +0000 UTC m=+5840.988739392" Oct 03 14:27:12 crc kubenswrapper[4962]: I1003 14:27:12.612572 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.612553741 podStartE2EDuration="3.612553741s" podCreationTimestamp="2025-10-03 14:27:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:27:12.599263124 +0000 UTC m=+5841.003160969" watchObservedRunningTime="2025-10-03 14:27:12.612553741 +0000 UTC m=+5841.016451576" Oct 03 14:27:12 crc kubenswrapper[4962]: I1003 14:27:12.622147 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-fc6d855cf-g8lbz" podStartSLOduration=3.622129042 podStartE2EDuration="3.622129042s" podCreationTimestamp="2025-10-03 14:27:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:27:12.62054564 +0000 UTC m=+5841.024443475" watchObservedRunningTime="2025-10-03 14:27:12.622129042 +0000 UTC m=+5841.026026877" Oct 03 14:27:12 crc kubenswrapper[4962]: I1003 14:27:12.642878 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.642856604 podStartE2EDuration="3.642856604s" podCreationTimestamp="2025-10-03 14:27:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:27:12.637245707 +0000 UTC m=+5841.041143562" watchObservedRunningTime="2025-10-03 14:27:12.642856604 +0000 UTC m=+5841.046754439" Oct 03 14:27:14 crc kubenswrapper[4962]: I1003 14:27:14.603213 4962 generic.go:334] "Generic (PLEG): container finished" podID="eccc4804-3bd4-428b-9ff6-cd364d7f61b9" containerID="9e997fc2cdcfe42eb6d1e97272d49f7ecb908656cee0906191ba94b1cff9f8b4" exitCode=0 Oct 03 14:27:14 crc kubenswrapper[4962]: I1003 14:27:14.603303 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-9d2p8" event={"ID":"eccc4804-3bd4-428b-9ff6-cd364d7f61b9","Type":"ContainerDied","Data":"9e997fc2cdcfe42eb6d1e97272d49f7ecb908656cee0906191ba94b1cff9f8b4"} Oct 03 14:27:14 crc kubenswrapper[4962]: I1003 14:27:14.790885 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 03 14:27:15 crc kubenswrapper[4962]: I1003 14:27:15.063629 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 14:27:15 crc kubenswrapper[4962]: I1003 14:27:15.064017 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 14:27:15 crc kubenswrapper[4962]: I1003 14:27:15.101880 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:27:15 crc kubenswrapper[4962]: I1003 14:27:15.614420 4962 generic.go:334] "Generic (PLEG): container finished" 
podID="8608a2df-334e-4b2c-a93d-05276e2afe0f" containerID="dd33081fcdc467722740d05790b9837884bd6833fc620dffc078559f01138d65" exitCode=0 Oct 03 14:27:15 crc kubenswrapper[4962]: I1003 14:27:15.614482 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-bvvxc" event={"ID":"8608a2df-334e-4b2c-a93d-05276e2afe0f","Type":"ContainerDied","Data":"dd33081fcdc467722740d05790b9837884bd6833fc620dffc078559f01138d65"} Oct 03 14:27:15 crc kubenswrapper[4962]: I1003 14:27:15.962221 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-9d2p8" Oct 03 14:27:16 crc kubenswrapper[4962]: I1003 14:27:16.073745 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eccc4804-3bd4-428b-9ff6-cd364d7f61b9-combined-ca-bundle\") pod \"eccc4804-3bd4-428b-9ff6-cd364d7f61b9\" (UID: \"eccc4804-3bd4-428b-9ff6-cd364d7f61b9\") " Oct 03 14:27:16 crc kubenswrapper[4962]: I1003 14:27:16.073847 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eccc4804-3bd4-428b-9ff6-cd364d7f61b9-config-data\") pod \"eccc4804-3bd4-428b-9ff6-cd364d7f61b9\" (UID: \"eccc4804-3bd4-428b-9ff6-cd364d7f61b9\") " Oct 03 14:27:16 crc kubenswrapper[4962]: I1003 14:27:16.073917 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bg4kf\" (UniqueName: \"kubernetes.io/projected/eccc4804-3bd4-428b-9ff6-cd364d7f61b9-kube-api-access-bg4kf\") pod \"eccc4804-3bd4-428b-9ff6-cd364d7f61b9\" (UID: \"eccc4804-3bd4-428b-9ff6-cd364d7f61b9\") " Oct 03 14:27:16 crc kubenswrapper[4962]: I1003 14:27:16.074096 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eccc4804-3bd4-428b-9ff6-cd364d7f61b9-scripts\") pod \"eccc4804-3bd4-428b-9ff6-cd364d7f61b9\" (UID: \"eccc4804-3bd4-428b-9ff6-cd364d7f61b9\") " Oct 03 14:27:16 crc kubenswrapper[4962]: I1003 14:27:16.079226 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eccc4804-3bd4-428b-9ff6-cd364d7f61b9-kube-api-access-bg4kf" (OuterVolumeSpecName: "kube-api-access-bg4kf") pod "eccc4804-3bd4-428b-9ff6-cd364d7f61b9" (UID: "eccc4804-3bd4-428b-9ff6-cd364d7f61b9"). InnerVolumeSpecName "kube-api-access-bg4kf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:27:16 crc kubenswrapper[4962]: I1003 14:27:16.079231 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eccc4804-3bd4-428b-9ff6-cd364d7f61b9-scripts" (OuterVolumeSpecName: "scripts") pod "eccc4804-3bd4-428b-9ff6-cd364d7f61b9" (UID: "eccc4804-3bd4-428b-9ff6-cd364d7f61b9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:27:16 crc kubenswrapper[4962]: I1003 14:27:16.101599 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eccc4804-3bd4-428b-9ff6-cd364d7f61b9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eccc4804-3bd4-428b-9ff6-cd364d7f61b9" (UID: "eccc4804-3bd4-428b-9ff6-cd364d7f61b9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:27:16 crc kubenswrapper[4962]: I1003 14:27:16.108833 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eccc4804-3bd4-428b-9ff6-cd364d7f61b9-config-data" (OuterVolumeSpecName: "config-data") pod "eccc4804-3bd4-428b-9ff6-cd364d7f61b9" (UID: "eccc4804-3bd4-428b-9ff6-cd364d7f61b9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:27:16 crc kubenswrapper[4962]: I1003 14:27:16.175751 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bg4kf\" (UniqueName: \"kubernetes.io/projected/eccc4804-3bd4-428b-9ff6-cd364d7f61b9-kube-api-access-bg4kf\") on node \"crc\" DevicePath \"\"" Oct 03 14:27:16 crc kubenswrapper[4962]: I1003 14:27:16.175785 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eccc4804-3bd4-428b-9ff6-cd364d7f61b9-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 14:27:16 crc kubenswrapper[4962]: I1003 14:27:16.175794 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eccc4804-3bd4-428b-9ff6-cd364d7f61b9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:27:16 crc kubenswrapper[4962]: I1003 14:27:16.175803 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eccc4804-3bd4-428b-9ff6-cd364d7f61b9-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:27:16 crc kubenswrapper[4962]: I1003 14:27:16.625658 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-9d2p8" event={"ID":"eccc4804-3bd4-428b-9ff6-cd364d7f61b9","Type":"ContainerDied","Data":"5bfe30d42fb9556272170544d99a9aa03647d18e629121a3bb8cec505dba10e0"} Oct 03 14:27:16 crc kubenswrapper[4962]: I1003 14:27:16.625686 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-9d2p8" Oct 03 14:27:16 crc kubenswrapper[4962]: I1003 14:27:16.625701 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5bfe30d42fb9556272170544d99a9aa03647d18e629121a3bb8cec505dba10e0" Oct 03 14:27:16 crc kubenswrapper[4962]: I1003 14:27:16.694715 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 14:27:16 crc kubenswrapper[4962]: E1003 14:27:16.695172 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eccc4804-3bd4-428b-9ff6-cd364d7f61b9" containerName="nova-cell1-conductor-db-sync" Oct 03 14:27:16 crc kubenswrapper[4962]: I1003 14:27:16.695201 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="eccc4804-3bd4-428b-9ff6-cd364d7f61b9" containerName="nova-cell1-conductor-db-sync" Oct 03 14:27:16 crc kubenswrapper[4962]: I1003 14:27:16.695390 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="eccc4804-3bd4-428b-9ff6-cd364d7f61b9" containerName="nova-cell1-conductor-db-sync" Oct 03 14:27:16 crc kubenswrapper[4962]: I1003 14:27:16.696055 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 03 14:27:16 crc kubenswrapper[4962]: I1003 14:27:16.698170 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 03 14:27:16 crc kubenswrapper[4962]: I1003 14:27:16.703277 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 14:27:16 crc kubenswrapper[4962]: I1003 14:27:16.787665 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aed19d7b-0bc6-42cb-8c2c-8df3cb502c64-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"aed19d7b-0bc6-42cb-8c2c-8df3cb502c64\") " pod="openstack/nova-cell1-conductor-0" Oct 03 14:27:16 crc kubenswrapper[4962]: I1003 14:27:16.787754 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aed19d7b-0bc6-42cb-8c2c-8df3cb502c64-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"aed19d7b-0bc6-42cb-8c2c-8df3cb502c64\") " pod="openstack/nova-cell1-conductor-0" Oct 03 14:27:16 crc kubenswrapper[4962]: I1003 14:27:16.787816 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8p6v\" (UniqueName: \"kubernetes.io/projected/aed19d7b-0bc6-42cb-8c2c-8df3cb502c64-kube-api-access-f8p6v\") pod \"nova-cell1-conductor-0\" (UID: \"aed19d7b-0bc6-42cb-8c2c-8df3cb502c64\") " pod="openstack/nova-cell1-conductor-0" Oct 03 14:27:16 crc kubenswrapper[4962]: I1003 14:27:16.889400 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aed19d7b-0bc6-42cb-8c2c-8df3cb502c64-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"aed19d7b-0bc6-42cb-8c2c-8df3cb502c64\") " pod="openstack/nova-cell1-conductor-0" Oct 03 14:27:16 crc kubenswrapper[4962]: I1003 14:27:16.889800 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aed19d7b-0bc6-42cb-8c2c-8df3cb502c64-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"aed19d7b-0bc6-42cb-8c2c-8df3cb502c64\") " pod="openstack/nova-cell1-conductor-0" Oct 03 14:27:16 crc kubenswrapper[4962]: I1003 14:27:16.889885 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8p6v\" (UniqueName: \"kubernetes.io/projected/aed19d7b-0bc6-42cb-8c2c-8df3cb502c64-kube-api-access-f8p6v\") pod \"nova-cell1-conductor-0\" (UID: \"aed19d7b-0bc6-42cb-8c2c-8df3cb502c64\") " pod="openstack/nova-cell1-conductor-0" Oct 03 14:27:16 crc kubenswrapper[4962]: I1003 14:27:16.906883 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aed19d7b-0bc6-42cb-8c2c-8df3cb502c64-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"aed19d7b-0bc6-42cb-8c2c-8df3cb502c64\") " pod="openstack/nova-cell1-conductor-0" Oct 03 14:27:16 crc kubenswrapper[4962]: I1003 14:27:16.906999 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aed19d7b-0bc6-42cb-8c2c-8df3cb502c64-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"aed19d7b-0bc6-42cb-8c2c-8df3cb502c64\") " pod="openstack/nova-cell1-conductor-0" Oct 03 14:27:16 crc kubenswrapper[4962]: I1003 14:27:16.908855 4962 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8p6v\" (UniqueName: \"kubernetes.io/projected/aed19d7b-0bc6-42cb-8c2c-8df3cb502c64-kube-api-access-f8p6v\") pod \"nova-cell1-conductor-0\" (UID: \"aed19d7b-0bc6-42cb-8c2c-8df3cb502c64\") " pod="openstack/nova-cell1-conductor-0" Oct 03 14:27:16 crc kubenswrapper[4962]: I1003 14:27:16.985089 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-bvvxc" Oct 03 14:27:17 crc kubenswrapper[4962]: I1003 14:27:17.016377 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 03 14:27:17 crc kubenswrapper[4962]: I1003 14:27:17.095816 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8608a2df-334e-4b2c-a93d-05276e2afe0f-scripts\") pod \"8608a2df-334e-4b2c-a93d-05276e2afe0f\" (UID: \"8608a2df-334e-4b2c-a93d-05276e2afe0f\") " Oct 03 14:27:17 crc kubenswrapper[4962]: I1003 14:27:17.095902 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8608a2df-334e-4b2c-a93d-05276e2afe0f-config-data\") pod \"8608a2df-334e-4b2c-a93d-05276e2afe0f\" (UID: \"8608a2df-334e-4b2c-a93d-05276e2afe0f\") " Oct 03 14:27:17 crc kubenswrapper[4962]: I1003 14:27:17.096184 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbckj\" (UniqueName: \"kubernetes.io/projected/8608a2df-334e-4b2c-a93d-05276e2afe0f-kube-api-access-jbckj\") pod \"8608a2df-334e-4b2c-a93d-05276e2afe0f\" (UID: \"8608a2df-334e-4b2c-a93d-05276e2afe0f\") " Oct 03 14:27:17 crc kubenswrapper[4962]: I1003 14:27:17.096219 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8608a2df-334e-4b2c-a93d-05276e2afe0f-combined-ca-bundle\") pod \"8608a2df-334e-4b2c-a93d-05276e2afe0f\" (UID: \"8608a2df-334e-4b2c-a93d-05276e2afe0f\") " Oct 03 14:27:17 crc kubenswrapper[4962]: I1003 14:27:17.100112 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8608a2df-334e-4b2c-a93d-05276e2afe0f-kube-api-access-jbckj" (OuterVolumeSpecName: "kube-api-access-jbckj") pod "8608a2df-334e-4b2c-a93d-05276e2afe0f" (UID: "8608a2df-334e-4b2c-a93d-05276e2afe0f"). InnerVolumeSpecName "kube-api-access-jbckj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:27:17 crc kubenswrapper[4962]: I1003 14:27:17.100307 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8608a2df-334e-4b2c-a93d-05276e2afe0f-scripts" (OuterVolumeSpecName: "scripts") pod "8608a2df-334e-4b2c-a93d-05276e2afe0f" (UID: "8608a2df-334e-4b2c-a93d-05276e2afe0f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:27:17 crc kubenswrapper[4962]: I1003 14:27:17.124866 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8608a2df-334e-4b2c-a93d-05276e2afe0f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8608a2df-334e-4b2c-a93d-05276e2afe0f" (UID: "8608a2df-334e-4b2c-a93d-05276e2afe0f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:27:17 crc kubenswrapper[4962]: I1003 14:27:17.124938 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8608a2df-334e-4b2c-a93d-05276e2afe0f-config-data" (OuterVolumeSpecName: "config-data") pod "8608a2df-334e-4b2c-a93d-05276e2afe0f" (UID: "8608a2df-334e-4b2c-a93d-05276e2afe0f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:27:17 crc kubenswrapper[4962]: I1003 14:27:17.199748 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8608a2df-334e-4b2c-a93d-05276e2afe0f-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 14:27:17 crc kubenswrapper[4962]: I1003 14:27:17.200073 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8608a2df-334e-4b2c-a93d-05276e2afe0f-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:27:17 crc kubenswrapper[4962]: I1003 14:27:17.200090 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbckj\" (UniqueName: \"kubernetes.io/projected/8608a2df-334e-4b2c-a93d-05276e2afe0f-kube-api-access-jbckj\") on node \"crc\" DevicePath \"\"" Oct 03 14:27:17 crc kubenswrapper[4962]: I1003 14:27:17.200102 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8608a2df-334e-4b2c-a93d-05276e2afe0f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:27:17 crc kubenswrapper[4962]: W1003 14:27:17.440212 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaed19d7b_0bc6_42cb_8c2c_8df3cb502c64.slice/crio-30bfbe37ae073421ddec9861ea9d381fc201ce17ffec4af74ecdb929b5ce6609 WatchSource:0}: Error finding container 30bfbe37ae073421ddec9861ea9d381fc201ce17ffec4af74ecdb929b5ce6609: Status 404 returned error can't find the container with id 30bfbe37ae073421ddec9861ea9d381fc201ce17ffec4af74ecdb929b5ce6609 Oct 03 14:27:17 crc kubenswrapper[4962]: I1003 14:27:17.443128 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 14:27:17 crc kubenswrapper[4962]: I1003 14:27:17.635809 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-bvvxc" event={"ID":"8608a2df-334e-4b2c-a93d-05276e2afe0f","Type":"ContainerDied","Data":"2d92ed08e7fb783c2f0f24a42efb3ad8f0f606d1648ac2220054c4ba471ba438"} Oct 03 14:27:17 crc kubenswrapper[4962]: I1003 14:27:17.635856 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d92ed08e7fb783c2f0f24a42efb3ad8f0f606d1648ac2220054c4ba471ba438" Oct 03 14:27:17 crc kubenswrapper[4962]: I1003 14:27:17.635987 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-bvvxc" Oct 03 14:27:17 crc kubenswrapper[4962]: I1003 14:27:17.638099 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"aed19d7b-0bc6-42cb-8c2c-8df3cb502c64","Type":"ContainerStarted","Data":"ff3c60e5e06c811e32286f5333fe38257dc28823c9a89015a06bfe3d2196ca93"} Oct 03 14:27:17 crc kubenswrapper[4962]: I1003 14:27:17.638250 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 03 14:27:17 crc kubenswrapper[4962]: I1003 14:27:17.638268 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"aed19d7b-0bc6-42cb-8c2c-8df3cb502c64","Type":"ContainerStarted","Data":"30bfbe37ae073421ddec9861ea9d381fc201ce17ffec4af74ecdb929b5ce6609"} Oct 03 14:27:17 crc kubenswrapper[4962]: I1003 14:27:17.668988 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=1.668963473 podStartE2EDuration="1.668963473s" podCreationTimestamp="2025-10-03 14:27:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:27:17.652697198 +0000 UTC m=+5846.056595053" watchObservedRunningTime="2025-10-03 14:27:17.668963473 +0000 UTC m=+5846.072861328" Oct 03 14:27:17 crc kubenswrapper[4962]: I1003 14:27:17.807571 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 14:27:17 crc kubenswrapper[4962]: I1003 14:27:17.807929 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="bb094c37-5f75-414d-816c-0a4af71943a5" containerName="nova-api-log" containerID="cri-o://45aed1c604b29ed305a6f9ce025334cf683fa5684a98f2628409955502d2af58" gracePeriod=30 Oct 03 14:27:17 crc kubenswrapper[4962]: I1003 14:27:17.808014 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="bb094c37-5f75-414d-816c-0a4af71943a5" containerName="nova-api-api" containerID="cri-o://f07c6d76beeae6f98ae1eefa067e80542838b50ea966db8e4ed8672233198ad4" gracePeriod=30 Oct 03 14:27:17 crc kubenswrapper[4962]: I1003 14:27:17.816822 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 14:27:17 crc kubenswrapper[4962]: I1003 14:27:17.817026 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="fb7089a7-df54-44fd-b9b0-59182a38fbd6" containerName="nova-scheduler-scheduler" containerID="cri-o://cb42a8981803b840795f696d1cbd0885c56d902616d9db268be5fe92e264ab80" gracePeriod=30 Oct 03 14:27:17 crc kubenswrapper[4962]: I1003 14:27:17.902577 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 14:27:17 crc kubenswrapper[4962]: I1003 14:27:17.902879 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="701eee88-43d7-4337-b9fc-ebf93df71fd6" containerName="nova-metadata-log" containerID="cri-o://82844723fd70bc7db233a3e828f103af859638223f267b807b223699ec271808" gracePeriod=30 Oct 03 14:27:17 crc kubenswrapper[4962]: I1003 14:27:17.903040 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="701eee88-43d7-4337-b9fc-ebf93df71fd6" containerName="nova-metadata-metadata" 
containerID="cri-o://b41b6ead49c2c8f9ce8d543666553761dcad9a79da295ea7bf9e55d682d86bf4" gracePeriod=30 Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.336628 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.408115 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.428354 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb094c37-5f75-414d-816c-0a4af71943a5-logs\") pod \"bb094c37-5f75-414d-816c-0a4af71943a5\" (UID: \"bb094c37-5f75-414d-816c-0a4af71943a5\") " Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.428424 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzp7q\" (UniqueName: \"kubernetes.io/projected/bb094c37-5f75-414d-816c-0a4af71943a5-kube-api-access-vzp7q\") pod \"bb094c37-5f75-414d-816c-0a4af71943a5\" (UID: \"bb094c37-5f75-414d-816c-0a4af71943a5\") " Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.428473 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb094c37-5f75-414d-816c-0a4af71943a5-config-data\") pod \"bb094c37-5f75-414d-816c-0a4af71943a5\" (UID: \"bb094c37-5f75-414d-816c-0a4af71943a5\") " Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.428653 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb094c37-5f75-414d-816c-0a4af71943a5-combined-ca-bundle\") pod \"bb094c37-5f75-414d-816c-0a4af71943a5\" (UID: \"bb094c37-5f75-414d-816c-0a4af71943a5\") " Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.428711 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb094c37-5f75-414d-816c-0a4af71943a5-logs" (OuterVolumeSpecName: "logs") pod "bb094c37-5f75-414d-816c-0a4af71943a5" (UID: "bb094c37-5f75-414d-816c-0a4af71943a5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.429004 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb094c37-5f75-414d-816c-0a4af71943a5-logs\") on node \"crc\" DevicePath \"\"" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.434505 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb094c37-5f75-414d-816c-0a4af71943a5-kube-api-access-vzp7q" (OuterVolumeSpecName: "kube-api-access-vzp7q") pod "bb094c37-5f75-414d-816c-0a4af71943a5" (UID: "bb094c37-5f75-414d-816c-0a4af71943a5"). InnerVolumeSpecName "kube-api-access-vzp7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.460288 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb094c37-5f75-414d-816c-0a4af71943a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb094c37-5f75-414d-816c-0a4af71943a5" (UID: "bb094c37-5f75-414d-816c-0a4af71943a5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.466378 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb094c37-5f75-414d-816c-0a4af71943a5-config-data" (OuterVolumeSpecName: "config-data") pod "bb094c37-5f75-414d-816c-0a4af71943a5" (UID: "bb094c37-5f75-414d-816c-0a4af71943a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.529749 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjttd\" (UniqueName: \"kubernetes.io/projected/701eee88-43d7-4337-b9fc-ebf93df71fd6-kube-api-access-hjttd\") pod \"701eee88-43d7-4337-b9fc-ebf93df71fd6\" (UID: \"701eee88-43d7-4337-b9fc-ebf93df71fd6\") " Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.529835 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/701eee88-43d7-4337-b9fc-ebf93df71fd6-config-data\") pod \"701eee88-43d7-4337-b9fc-ebf93df71fd6\" (UID: \"701eee88-43d7-4337-b9fc-ebf93df71fd6\") " Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.529909 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/701eee88-43d7-4337-b9fc-ebf93df71fd6-logs\") pod \"701eee88-43d7-4337-b9fc-ebf93df71fd6\" (UID: \"701eee88-43d7-4337-b9fc-ebf93df71fd6\") " Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.530058 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/701eee88-43d7-4337-b9fc-ebf93df71fd6-combined-ca-bundle\") pod \"701eee88-43d7-4337-b9fc-ebf93df71fd6\" (UID: \"701eee88-43d7-4337-b9fc-ebf93df71fd6\") " Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.530433 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb094c37-5f75-414d-816c-0a4af71943a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.530453 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzp7q\" (UniqueName: \"kubernetes.io/projected/bb094c37-5f75-414d-816c-0a4af71943a5-kube-api-access-vzp7q\") on node \"crc\" DevicePath \"\"" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.530441 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/701eee88-43d7-4337-b9fc-ebf93df71fd6-logs" (OuterVolumeSpecName: "logs") pod "701eee88-43d7-4337-b9fc-ebf93df71fd6" (UID: "701eee88-43d7-4337-b9fc-ebf93df71fd6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.530465 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb094c37-5f75-414d-816c-0a4af71943a5-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.532500 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/701eee88-43d7-4337-b9fc-ebf93df71fd6-kube-api-access-hjttd" (OuterVolumeSpecName: "kube-api-access-hjttd") pod "701eee88-43d7-4337-b9fc-ebf93df71fd6" (UID: "701eee88-43d7-4337-b9fc-ebf93df71fd6"). InnerVolumeSpecName "kube-api-access-hjttd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.553090 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/701eee88-43d7-4337-b9fc-ebf93df71fd6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "701eee88-43d7-4337-b9fc-ebf93df71fd6" (UID: "701eee88-43d7-4337-b9fc-ebf93df71fd6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.553246 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/701eee88-43d7-4337-b9fc-ebf93df71fd6-config-data" (OuterVolumeSpecName: "config-data") pod "701eee88-43d7-4337-b9fc-ebf93df71fd6" (UID: "701eee88-43d7-4337-b9fc-ebf93df71fd6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.632421 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/701eee88-43d7-4337-b9fc-ebf93df71fd6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.632460 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjttd\" (UniqueName: \"kubernetes.io/projected/701eee88-43d7-4337-b9fc-ebf93df71fd6-kube-api-access-hjttd\") on node \"crc\" DevicePath \"\"" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.632474 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/701eee88-43d7-4337-b9fc-ebf93df71fd6-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.632485 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/701eee88-43d7-4337-b9fc-ebf93df71fd6-logs\") on node \"crc\" DevicePath \"\"" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.646350 4962 generic.go:334] "Generic (PLEG): container finished" podID="701eee88-43d7-4337-b9fc-ebf93df71fd6" containerID="b41b6ead49c2c8f9ce8d543666553761dcad9a79da295ea7bf9e55d682d86bf4" exitCode=0 Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.646391 4962 generic.go:334] "Generic (PLEG): container finished" podID="701eee88-43d7-4337-b9fc-ebf93df71fd6" containerID="82844723fd70bc7db233a3e828f103af859638223f267b807b223699ec271808" exitCode=143 Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.646442 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"701eee88-43d7-4337-b9fc-ebf93df71fd6","Type":"ContainerDied","Data":"b41b6ead49c2c8f9ce8d543666553761dcad9a79da295ea7bf9e55d682d86bf4"} Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.646475 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"701eee88-43d7-4337-b9fc-ebf93df71fd6","Type":"ContainerDied","Data":"82844723fd70bc7db233a3e828f103af859638223f267b807b223699ec271808"} Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.646491 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"701eee88-43d7-4337-b9fc-ebf93df71fd6","Type":"ContainerDied","Data":"f57e2b970a6654b50549ca7402e0158d60aa591417d1e3b9833095d6a37fcf64"} Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.646512 4962 scope.go:117] "RemoveContainer" 
containerID="b41b6ead49c2c8f9ce8d543666553761dcad9a79da295ea7bf9e55d682d86bf4" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.646696 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.649191 4962 generic.go:334] "Generic (PLEG): container finished" podID="bb094c37-5f75-414d-816c-0a4af71943a5" containerID="f07c6d76beeae6f98ae1eefa067e80542838b50ea966db8e4ed8672233198ad4" exitCode=0 Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.649236 4962 generic.go:334] "Generic (PLEG): container finished" podID="bb094c37-5f75-414d-816c-0a4af71943a5" containerID="45aed1c604b29ed305a6f9ce025334cf683fa5684a98f2628409955502d2af58" exitCode=143 Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.649240 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bb094c37-5f75-414d-816c-0a4af71943a5","Type":"ContainerDied","Data":"f07c6d76beeae6f98ae1eefa067e80542838b50ea966db8e4ed8672233198ad4"} Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.649279 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bb094c37-5f75-414d-816c-0a4af71943a5","Type":"ContainerDied","Data":"45aed1c604b29ed305a6f9ce025334cf683fa5684a98f2628409955502d2af58"} Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.649287 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.649294 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bb094c37-5f75-414d-816c-0a4af71943a5","Type":"ContainerDied","Data":"c70b4483a539dec38035313b3b6bdf11d5121661b2033feb1573ad044b36a4d4"} Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.681262 4962 scope.go:117] "RemoveContainer" containerID="82844723fd70bc7db233a3e828f103af859638223f267b807b223699ec271808" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.686800 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.697611 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.706648 4962 scope.go:117] "RemoveContainer" containerID="b41b6ead49c2c8f9ce8d543666553761dcad9a79da295ea7bf9e55d682d86bf4" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.706733 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 03 14:27:18 crc kubenswrapper[4962]: E1003 14:27:18.707110 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="701eee88-43d7-4337-b9fc-ebf93df71fd6" containerName="nova-metadata-log" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.707125 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="701eee88-43d7-4337-b9fc-ebf93df71fd6" containerName="nova-metadata-log" Oct 03 14:27:18 crc kubenswrapper[4962]: E1003 14:27:18.707141 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8608a2df-334e-4b2c-a93d-05276e2afe0f" containerName="nova-manage" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.707148 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8608a2df-334e-4b2c-a93d-05276e2afe0f" containerName="nova-manage" Oct 03 14:27:18 crc kubenswrapper[4962]: E1003 14:27:18.707161 4962 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bb094c37-5f75-414d-816c-0a4af71943a5" containerName="nova-api-api" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.707166 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb094c37-5f75-414d-816c-0a4af71943a5" containerName="nova-api-api" Oct 03 14:27:18 crc kubenswrapper[4962]: E1003 14:27:18.707177 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="701eee88-43d7-4337-b9fc-ebf93df71fd6" containerName="nova-metadata-metadata" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.707183 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="701eee88-43d7-4337-b9fc-ebf93df71fd6" containerName="nova-metadata-metadata" Oct 03 14:27:18 crc kubenswrapper[4962]: E1003 14:27:18.707200 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb094c37-5f75-414d-816c-0a4af71943a5" containerName="nova-api-log" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.707208 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb094c37-5f75-414d-816c-0a4af71943a5" containerName="nova-api-log" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.707374 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="701eee88-43d7-4337-b9fc-ebf93df71fd6" containerName="nova-metadata-log" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.707389 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="8608a2df-334e-4b2c-a93d-05276e2afe0f" containerName="nova-manage" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.707401 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb094c37-5f75-414d-816c-0a4af71943a5" containerName="nova-api-log" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.707414 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="701eee88-43d7-4337-b9fc-ebf93df71fd6" containerName="nova-metadata-metadata" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.707427 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb094c37-5f75-414d-816c-0a4af71943a5" containerName="nova-api-api" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.708346 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.713579 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 03 14:27:18 crc kubenswrapper[4962]: E1003 14:27:18.713881 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b41b6ead49c2c8f9ce8d543666553761dcad9a79da295ea7bf9e55d682d86bf4\": container with ID starting with b41b6ead49c2c8f9ce8d543666553761dcad9a79da295ea7bf9e55d682d86bf4 not found: ID does not exist" containerID="b41b6ead49c2c8f9ce8d543666553761dcad9a79da295ea7bf9e55d682d86bf4" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.713921 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b41b6ead49c2c8f9ce8d543666553761dcad9a79da295ea7bf9e55d682d86bf4"} err="failed to get container status \"b41b6ead49c2c8f9ce8d543666553761dcad9a79da295ea7bf9e55d682d86bf4\": rpc error: code = NotFound desc = could not find container \"b41b6ead49c2c8f9ce8d543666553761dcad9a79da295ea7bf9e55d682d86bf4\": container with ID starting with b41b6ead49c2c8f9ce8d543666553761dcad9a79da295ea7bf9e55d682d86bf4 not found: ID does not exist" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.713956 4962 scope.go:117] "RemoveContainer" containerID="82844723fd70bc7db233a3e828f103af859638223f267b807b223699ec271808" Oct 03 14:27:18 crc kubenswrapper[4962]: E1003 14:27:18.714738 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82844723fd70bc7db233a3e828f103af859638223f267b807b223699ec271808\": container with ID starting with 82844723fd70bc7db233a3e828f103af859638223f267b807b223699ec271808 not found: ID does not exist" containerID="82844723fd70bc7db233a3e828f103af859638223f267b807b223699ec271808" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.714767 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82844723fd70bc7db233a3e828f103af859638223f267b807b223699ec271808"} err="failed to get container status \"82844723fd70bc7db233a3e828f103af859638223f267b807b223699ec271808\": rpc error: code = NotFound desc = could not find container \"82844723fd70bc7db233a3e828f103af859638223f267b807b223699ec271808\": container with ID starting with 82844723fd70bc7db233a3e828f103af859638223f267b807b223699ec271808 not found: ID does not exist" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.714782 4962 scope.go:117] "RemoveContainer" containerID="b41b6ead49c2c8f9ce8d543666553761dcad9a79da295ea7bf9e55d682d86bf4" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.715406 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b41b6ead49c2c8f9ce8d543666553761dcad9a79da295ea7bf9e55d682d86bf4"} err="failed to get container status \"b41b6ead49c2c8f9ce8d543666553761dcad9a79da295ea7bf9e55d682d86bf4\": rpc error: code = NotFound desc = could not find container \"b41b6ead49c2c8f9ce8d543666553761dcad9a79da295ea7bf9e55d682d86bf4\": container with ID starting with b41b6ead49c2c8f9ce8d543666553761dcad9a79da295ea7bf9e55d682d86bf4 not found: ID does not exist" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.715452 4962 scope.go:117] "RemoveContainer" containerID="82844723fd70bc7db233a3e828f103af859638223f267b807b223699ec271808" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.716113 4962 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82844723fd70bc7db233a3e828f103af859638223f267b807b223699ec271808"} err="failed to get container status \"82844723fd70bc7db233a3e828f103af859638223f267b807b223699ec271808\": rpc error: code = NotFound desc = could not find container \"82844723fd70bc7db233a3e828f103af859638223f267b807b223699ec271808\": container with ID starting with 82844723fd70bc7db233a3e828f103af859638223f267b807b223699ec271808 not found: ID does not exist" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.716140 4962 scope.go:117] "RemoveContainer" containerID="f07c6d76beeae6f98ae1eefa067e80542838b50ea966db8e4ed8672233198ad4" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.718243 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.729878 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.735329 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21930e51-bd18-4f6f-bb70-8277dca3dd05-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"21930e51-bd18-4f6f-bb70-8277dca3dd05\") " pod="openstack/nova-metadata-0" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.735388 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cxdv\" (UniqueName: \"kubernetes.io/projected/21930e51-bd18-4f6f-bb70-8277dca3dd05-kube-api-access-7cxdv\") pod \"nova-metadata-0\" (UID: \"21930e51-bd18-4f6f-bb70-8277dca3dd05\") " pod="openstack/nova-metadata-0" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.735524 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21930e51-bd18-4f6f-bb70-8277dca3dd05-config-data\") pod \"nova-metadata-0\" (UID: \"21930e51-bd18-4f6f-bb70-8277dca3dd05\") " pod="openstack/nova-metadata-0" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.735564 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21930e51-bd18-4f6f-bb70-8277dca3dd05-logs\") pod \"nova-metadata-0\" (UID: \"21930e51-bd18-4f6f-bb70-8277dca3dd05\") " pod="openstack/nova-metadata-0" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.751552 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.758688 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.760368 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.764168 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.772243 4962 scope.go:117] "RemoveContainer" containerID="45aed1c604b29ed305a6f9ce025334cf683fa5684a98f2628409955502d2af58" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.783008 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.792050 4962 scope.go:117] "RemoveContainer" containerID="f07c6d76beeae6f98ae1eefa067e80542838b50ea966db8e4ed8672233198ad4" Oct 03 14:27:18 crc kubenswrapper[4962]: E1003 14:27:18.792465 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f07c6d76beeae6f98ae1eefa067e80542838b50ea966db8e4ed8672233198ad4\": container with ID starting with f07c6d76beeae6f98ae1eefa067e80542838b50ea966db8e4ed8672233198ad4 not found: ID does not exist" containerID="f07c6d76beeae6f98ae1eefa067e80542838b50ea966db8e4ed8672233198ad4" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.792536 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f07c6d76beeae6f98ae1eefa067e80542838b50ea966db8e4ed8672233198ad4"} err="failed to get container status \"f07c6d76beeae6f98ae1eefa067e80542838b50ea966db8e4ed8672233198ad4\": rpc error: code = NotFound desc = could not find container \"f07c6d76beeae6f98ae1eefa067e80542838b50ea966db8e4ed8672233198ad4\": container with ID starting with f07c6d76beeae6f98ae1eefa067e80542838b50ea966db8e4ed8672233198ad4 not found: ID does not exist" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.792557 4962 scope.go:117] "RemoveContainer" containerID="45aed1c604b29ed305a6f9ce025334cf683fa5684a98f2628409955502d2af58" Oct 03 14:27:18 crc kubenswrapper[4962]: E1003 14:27:18.792941 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45aed1c604b29ed305a6f9ce025334cf683fa5684a98f2628409955502d2af58\": container with ID starting with 45aed1c604b29ed305a6f9ce025334cf683fa5684a98f2628409955502d2af58 not found: ID does not exist" containerID="45aed1c604b29ed305a6f9ce025334cf683fa5684a98f2628409955502d2af58" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.792973 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45aed1c604b29ed305a6f9ce025334cf683fa5684a98f2628409955502d2af58"} err="failed to get container status \"45aed1c604b29ed305a6f9ce025334cf683fa5684a98f2628409955502d2af58\": rpc error: code = NotFound desc = could not find container \"45aed1c604b29ed305a6f9ce025334cf683fa5684a98f2628409955502d2af58\": container with ID starting with 45aed1c604b29ed305a6f9ce025334cf683fa5684a98f2628409955502d2af58 not found: ID does not exist" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.792990 4962 scope.go:117] "RemoveContainer" containerID="f07c6d76beeae6f98ae1eefa067e80542838b50ea966db8e4ed8672233198ad4" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.793174 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f07c6d76beeae6f98ae1eefa067e80542838b50ea966db8e4ed8672233198ad4"} err="failed to get container status \"f07c6d76beeae6f98ae1eefa067e80542838b50ea966db8e4ed8672233198ad4\": rpc error: code = 
NotFound desc = could not find container \"f07c6d76beeae6f98ae1eefa067e80542838b50ea966db8e4ed8672233198ad4\": container with ID starting with f07c6d76beeae6f98ae1eefa067e80542838b50ea966db8e4ed8672233198ad4 not found: ID does not exist" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.793204 4962 scope.go:117] "RemoveContainer" containerID="45aed1c604b29ed305a6f9ce025334cf683fa5684a98f2628409955502d2af58" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.793604 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45aed1c604b29ed305a6f9ce025334cf683fa5684a98f2628409955502d2af58"} err="failed to get container status \"45aed1c604b29ed305a6f9ce025334cf683fa5684a98f2628409955502d2af58\": rpc error: code = NotFound desc = could not find container \"45aed1c604b29ed305a6f9ce025334cf683fa5684a98f2628409955502d2af58\": container with ID starting with 45aed1c604b29ed305a6f9ce025334cf683fa5684a98f2628409955502d2af58 not found: ID does not exist" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.836784 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7tm9\" (UniqueName: \"kubernetes.io/projected/5d67733b-acbd-4d3c-9443-6ba06c5825e4-kube-api-access-b7tm9\") pod \"nova-api-0\" (UID: \"5d67733b-acbd-4d3c-9443-6ba06c5825e4\") " pod="openstack/nova-api-0" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.836827 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21930e51-bd18-4f6f-bb70-8277dca3dd05-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"21930e51-bd18-4f6f-bb70-8277dca3dd05\") " pod="openstack/nova-metadata-0" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.836852 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cxdv\" (UniqueName: \"kubernetes.io/projected/21930e51-bd18-4f6f-bb70-8277dca3dd05-kube-api-access-7cxdv\") pod \"nova-metadata-0\" (UID: \"21930e51-bd18-4f6f-bb70-8277dca3dd05\") " pod="openstack/nova-metadata-0" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.836896 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d67733b-acbd-4d3c-9443-6ba06c5825e4-config-data\") pod \"nova-api-0\" (UID: \"5d67733b-acbd-4d3c-9443-6ba06c5825e4\") " pod="openstack/nova-api-0" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.836959 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21930e51-bd18-4f6f-bb70-8277dca3dd05-config-data\") pod \"nova-metadata-0\" (UID: \"21930e51-bd18-4f6f-bb70-8277dca3dd05\") " pod="openstack/nova-metadata-0" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.836978 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d67733b-acbd-4d3c-9443-6ba06c5825e4-logs\") pod \"nova-api-0\" (UID: \"5d67733b-acbd-4d3c-9443-6ba06c5825e4\") " pod="openstack/nova-api-0" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.837000 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21930e51-bd18-4f6f-bb70-8277dca3dd05-logs\") pod \"nova-metadata-0\" (UID: \"21930e51-bd18-4f6f-bb70-8277dca3dd05\") " pod="openstack/nova-metadata-0" 
Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.837019 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d67733b-acbd-4d3c-9443-6ba06c5825e4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5d67733b-acbd-4d3c-9443-6ba06c5825e4\") " pod="openstack/nova-api-0" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.837841 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21930e51-bd18-4f6f-bb70-8277dca3dd05-logs\") pod \"nova-metadata-0\" (UID: \"21930e51-bd18-4f6f-bb70-8277dca3dd05\") " pod="openstack/nova-metadata-0" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.839818 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21930e51-bd18-4f6f-bb70-8277dca3dd05-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"21930e51-bd18-4f6f-bb70-8277dca3dd05\") " pod="openstack/nova-metadata-0" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.840295 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21930e51-bd18-4f6f-bb70-8277dca3dd05-config-data\") pod \"nova-metadata-0\" (UID: \"21930e51-bd18-4f6f-bb70-8277dca3dd05\") " pod="openstack/nova-metadata-0" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.858540 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cxdv\" (UniqueName: \"kubernetes.io/projected/21930e51-bd18-4f6f-bb70-8277dca3dd05-kube-api-access-7cxdv\") pod \"nova-metadata-0\" (UID: \"21930e51-bd18-4f6f-bb70-8277dca3dd05\") " pod="openstack/nova-metadata-0" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.938934 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7tm9\" (UniqueName: \"kubernetes.io/projected/5d67733b-acbd-4d3c-9443-6ba06c5825e4-kube-api-access-b7tm9\") pod \"nova-api-0\" (UID: \"5d67733b-acbd-4d3c-9443-6ba06c5825e4\") " pod="openstack/nova-api-0" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.939012 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d67733b-acbd-4d3c-9443-6ba06c5825e4-config-data\") pod \"nova-api-0\" (UID: \"5d67733b-acbd-4d3c-9443-6ba06c5825e4\") " pod="openstack/nova-api-0" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.939075 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d67733b-acbd-4d3c-9443-6ba06c5825e4-logs\") pod \"nova-api-0\" (UID: \"5d67733b-acbd-4d3c-9443-6ba06c5825e4\") " pod="openstack/nova-api-0" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.939099 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d67733b-acbd-4d3c-9443-6ba06c5825e4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5d67733b-acbd-4d3c-9443-6ba06c5825e4\") " pod="openstack/nova-api-0" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.941015 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d67733b-acbd-4d3c-9443-6ba06c5825e4-logs\") pod \"nova-api-0\" (UID: \"5d67733b-acbd-4d3c-9443-6ba06c5825e4\") " pod="openstack/nova-api-0" Oct 03 14:27:18 crc kubenswrapper[4962]: 
I1003 14:27:18.943065 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d67733b-acbd-4d3c-9443-6ba06c5825e4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5d67733b-acbd-4d3c-9443-6ba06c5825e4\") " pod="openstack/nova-api-0" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.943440 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d67733b-acbd-4d3c-9443-6ba06c5825e4-config-data\") pod \"nova-api-0\" (UID: \"5d67733b-acbd-4d3c-9443-6ba06c5825e4\") " pod="openstack/nova-api-0" Oct 03 14:27:18 crc kubenswrapper[4962]: I1003 14:27:18.958516 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7tm9\" (UniqueName: \"kubernetes.io/projected/5d67733b-acbd-4d3c-9443-6ba06c5825e4-kube-api-access-b7tm9\") pod \"nova-api-0\" (UID: \"5d67733b-acbd-4d3c-9443-6ba06c5825e4\") " pod="openstack/nova-api-0" Oct 03 14:27:19 crc kubenswrapper[4962]: I1003 14:27:19.051219 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 14:27:19 crc kubenswrapper[4962]: I1003 14:27:19.078871 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 14:27:19 crc kubenswrapper[4962]: I1003 14:27:19.486355 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 14:27:19 crc kubenswrapper[4962]: W1003 14:27:19.492791 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21930e51_bd18_4f6f_bb70_8277dca3dd05.slice/crio-19691f4444d2156ad4b999536538a94ff91a096e648b5fbf158648f8456fe33f WatchSource:0}: Error finding container 19691f4444d2156ad4b999536538a94ff91a096e648b5fbf158648f8456fe33f: Status 404 returned error can't find the container with id 19691f4444d2156ad4b999536538a94ff91a096e648b5fbf158648f8456fe33f Oct 03 14:27:19 crc kubenswrapper[4962]: I1003 14:27:19.570767 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 14:27:19 crc kubenswrapper[4962]: W1003 14:27:19.575591 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d67733b_acbd_4d3c_9443_6ba06c5825e4.slice/crio-f8525a991a689b906696b321cf21d14bec97b246fcb2a8a31fa9fa50fb94d061 WatchSource:0}: Error finding container f8525a991a689b906696b321cf21d14bec97b246fcb2a8a31fa9fa50fb94d061: Status 404 returned error can't find the container with id f8525a991a689b906696b321cf21d14bec97b246fcb2a8a31fa9fa50fb94d061 Oct 03 14:27:19 crc kubenswrapper[4962]: I1003 14:27:19.660406 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"21930e51-bd18-4f6f-bb70-8277dca3dd05","Type":"ContainerStarted","Data":"343b73d5fb711ce9f86b08b731accf97b35424d845f434a4e746539315a0e101"} Oct 03 14:27:19 crc kubenswrapper[4962]: I1003 14:27:19.660868 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"21930e51-bd18-4f6f-bb70-8277dca3dd05","Type":"ContainerStarted","Data":"19691f4444d2156ad4b999536538a94ff91a096e648b5fbf158648f8456fe33f"} Oct 03 14:27:19 crc kubenswrapper[4962]: I1003 14:27:19.661461 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"5d67733b-acbd-4d3c-9443-6ba06c5825e4","Type":"ContainerStarted","Data":"f8525a991a689b906696b321cf21d14bec97b246fcb2a8a31fa9fa50fb94d061"} Oct 03 14:27:20 crc kubenswrapper[4962]: I1003 14:27:20.101960 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:27:20 crc kubenswrapper[4962]: I1003 14:27:20.110769 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:27:20 crc kubenswrapper[4962]: I1003 14:27:20.184947 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-fc6d855cf-g8lbz" Oct 03 14:27:20 crc kubenswrapper[4962]: I1003 14:27:20.227606 4962 scope.go:117] "RemoveContainer" containerID="d7dae192729a16d2e0ed7a375c7ee4990d673d8d62f560c5335516db0b1730db" Oct 03 14:27:20 crc kubenswrapper[4962]: E1003 14:27:20.227880 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:27:20 crc kubenswrapper[4962]: I1003 14:27:20.250300 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="701eee88-43d7-4337-b9fc-ebf93df71fd6" path="/var/lib/kubelet/pods/701eee88-43d7-4337-b9fc-ebf93df71fd6/volumes" Oct 03 14:27:20 crc kubenswrapper[4962]: I1003 14:27:20.257497 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb094c37-5f75-414d-816c-0a4af71943a5" path="/var/lib/kubelet/pods/bb094c37-5f75-414d-816c-0a4af71943a5/volumes" Oct 03 14:27:20 crc kubenswrapper[4962]: I1003 14:27:20.258119 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57665d4b55-knl2t"] Oct 03 14:27:20 crc kubenswrapper[4962]: I1003 14:27:20.258344 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57665d4b55-knl2t" podUID="766f241c-412b-470d-938c-8785be7fe7ab" containerName="dnsmasq-dns" containerID="cri-o://c49cbfe3abd20040f4005810cb1e8438c9700d23f46975313c122b4ae32695f1" gracePeriod=10 Oct 03 14:27:20 crc kubenswrapper[4962]: I1003 14:27:20.673794 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"21930e51-bd18-4f6f-bb70-8277dca3dd05","Type":"ContainerStarted","Data":"5b80f5ce6c656b313ed8d263f968936ca7110679e272893263575ec6cc5475f7"} Oct 03 14:27:20 crc kubenswrapper[4962]: I1003 14:27:20.678408 4962 generic.go:334] "Generic (PLEG): container finished" podID="766f241c-412b-470d-938c-8785be7fe7ab" containerID="c49cbfe3abd20040f4005810cb1e8438c9700d23f46975313c122b4ae32695f1" exitCode=0 Oct 03 14:27:20 crc kubenswrapper[4962]: I1003 14:27:20.678479 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57665d4b55-knl2t" event={"ID":"766f241c-412b-470d-938c-8785be7fe7ab","Type":"ContainerDied","Data":"c49cbfe3abd20040f4005810cb1e8438c9700d23f46975313c122b4ae32695f1"} Oct 03 14:27:20 crc kubenswrapper[4962]: I1003 14:27:20.678515 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57665d4b55-knl2t" 
event={"ID":"766f241c-412b-470d-938c-8785be7fe7ab","Type":"ContainerDied","Data":"c12f1043c918273a5ce735840ce268977e37a40051b5bebeb7e09d04413c6aa0"} Oct 03 14:27:20 crc kubenswrapper[4962]: I1003 14:27:20.678525 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c12f1043c918273a5ce735840ce268977e37a40051b5bebeb7e09d04413c6aa0" Oct 03 14:27:20 crc kubenswrapper[4962]: I1003 14:27:20.682024 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5d67733b-acbd-4d3c-9443-6ba06c5825e4","Type":"ContainerStarted","Data":"7ad35815a4a1dbc0652a689021f72d85864f21ac5862414d8ee6d49bffc2f9bd"} Oct 03 14:27:20 crc kubenswrapper[4962]: I1003 14:27:20.682084 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5d67733b-acbd-4d3c-9443-6ba06c5825e4","Type":"ContainerStarted","Data":"300ad0d7d6280c24fa1eb14967ae551766141bb1d1f90fce5738ae77ded00b75"} Oct 03 14:27:20 crc kubenswrapper[4962]: I1003 14:27:20.694904 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:27:20 crc kubenswrapper[4962]: I1003 14:27:20.703459 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.703436236 podStartE2EDuration="2.703436236s" podCreationTimestamp="2025-10-03 14:27:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:27:20.694523273 +0000 UTC m=+5849.098421108" watchObservedRunningTime="2025-10-03 14:27:20.703436236 +0000 UTC m=+5849.107334091" Oct 03 14:27:20 crc kubenswrapper[4962]: I1003 14:27:20.719428 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.719410334 podStartE2EDuration="2.719410334s" podCreationTimestamp="2025-10-03 14:27:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:27:20.716296402 +0000 UTC m=+5849.120194257" watchObservedRunningTime="2025-10-03 14:27:20.719410334 +0000 UTC m=+5849.123308169" Oct 03 14:27:20 crc kubenswrapper[4962]: I1003 14:27:20.756156 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57665d4b55-knl2t" Oct 03 14:27:20 crc kubenswrapper[4962]: I1003 14:27:20.875933 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5k7m\" (UniqueName: \"kubernetes.io/projected/766f241c-412b-470d-938c-8785be7fe7ab-kube-api-access-j5k7m\") pod \"766f241c-412b-470d-938c-8785be7fe7ab\" (UID: \"766f241c-412b-470d-938c-8785be7fe7ab\") " Oct 03 14:27:20 crc kubenswrapper[4962]: I1003 14:27:20.875986 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/766f241c-412b-470d-938c-8785be7fe7ab-config\") pod \"766f241c-412b-470d-938c-8785be7fe7ab\" (UID: \"766f241c-412b-470d-938c-8785be7fe7ab\") " Oct 03 14:27:20 crc kubenswrapper[4962]: I1003 14:27:20.876027 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/766f241c-412b-470d-938c-8785be7fe7ab-ovsdbserver-nb\") pod \"766f241c-412b-470d-938c-8785be7fe7ab\" (UID: \"766f241c-412b-470d-938c-8785be7fe7ab\") " Oct 03 14:27:20 crc kubenswrapper[4962]: I1003 14:27:20.876049 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/766f241c-412b-470d-938c-8785be7fe7ab-ovsdbserver-sb\") pod \"766f241c-412b-470d-938c-8785be7fe7ab\" (UID: \"766f241c-412b-470d-938c-8785be7fe7ab\") " Oct 03 14:27:20 crc kubenswrapper[4962]: I1003 14:27:20.876306 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/766f241c-412b-470d-938c-8785be7fe7ab-dns-svc\") pod \"766f241c-412b-470d-938c-8785be7fe7ab\" (UID: \"766f241c-412b-470d-938c-8785be7fe7ab\") " Oct 03 14:27:20 crc kubenswrapper[4962]: I1003 14:27:20.890879 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/766f241c-412b-470d-938c-8785be7fe7ab-kube-api-access-j5k7m" (OuterVolumeSpecName: "kube-api-access-j5k7m") pod "766f241c-412b-470d-938c-8785be7fe7ab" (UID: "766f241c-412b-470d-938c-8785be7fe7ab"). InnerVolumeSpecName "kube-api-access-j5k7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:27:20 crc kubenswrapper[4962]: I1003 14:27:20.924509 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/766f241c-412b-470d-938c-8785be7fe7ab-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "766f241c-412b-470d-938c-8785be7fe7ab" (UID: "766f241c-412b-470d-938c-8785be7fe7ab"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:27:20 crc kubenswrapper[4962]: I1003 14:27:20.927820 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/766f241c-412b-470d-938c-8785be7fe7ab-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "766f241c-412b-470d-938c-8785be7fe7ab" (UID: "766f241c-412b-470d-938c-8785be7fe7ab"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:27:20 crc kubenswrapper[4962]: I1003 14:27:20.930083 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/766f241c-412b-470d-938c-8785be7fe7ab-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "766f241c-412b-470d-938c-8785be7fe7ab" (UID: "766f241c-412b-470d-938c-8785be7fe7ab"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:27:20 crc kubenswrapper[4962]: I1003 14:27:20.931538 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/766f241c-412b-470d-938c-8785be7fe7ab-config" (OuterVolumeSpecName: "config") pod "766f241c-412b-470d-938c-8785be7fe7ab" (UID: "766f241c-412b-470d-938c-8785be7fe7ab"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:27:20 crc kubenswrapper[4962]: I1003 14:27:20.978391 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/766f241c-412b-470d-938c-8785be7fe7ab-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 14:27:20 crc kubenswrapper[4962]: I1003 14:27:20.978431 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5k7m\" (UniqueName: \"kubernetes.io/projected/766f241c-412b-470d-938c-8785be7fe7ab-kube-api-access-j5k7m\") on node \"crc\" DevicePath \"\"" Oct 03 14:27:20 crc kubenswrapper[4962]: I1003 14:27:20.978452 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/766f241c-412b-470d-938c-8785be7fe7ab-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:27:20 crc kubenswrapper[4962]: I1003 14:27:20.978465 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/766f241c-412b-470d-938c-8785be7fe7ab-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 14:27:20 crc kubenswrapper[4962]: I1003 14:27:20.978476 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/766f241c-412b-470d-938c-8785be7fe7ab-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 14:27:21 crc kubenswrapper[4962]: I1003 14:27:21.692081 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57665d4b55-knl2t" Oct 03 14:27:21 crc kubenswrapper[4962]: I1003 14:27:21.739659 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57665d4b55-knl2t"] Oct 03 14:27:21 crc kubenswrapper[4962]: I1003 14:27:21.749077 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57665d4b55-knl2t"] Oct 03 14:27:22 crc kubenswrapper[4962]: I1003 14:27:22.045545 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 03 14:27:22 crc kubenswrapper[4962]: I1003 14:27:22.240466 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="766f241c-412b-470d-938c-8785be7fe7ab" path="/var/lib/kubelet/pods/766f241c-412b-470d-938c-8785be7fe7ab/volumes" Oct 03 14:27:22 crc kubenswrapper[4962]: I1003 14:27:22.600278 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-lnpxv"] Oct 03 14:27:22 crc kubenswrapper[4962]: E1003 14:27:22.601184 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="766f241c-412b-470d-938c-8785be7fe7ab" containerName="dnsmasq-dns" Oct 03 14:27:22 crc kubenswrapper[4962]: I1003 14:27:22.601209 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="766f241c-412b-470d-938c-8785be7fe7ab" containerName="dnsmasq-dns" Oct 03 14:27:22 crc kubenswrapper[4962]: E1003 14:27:22.601274 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="766f241c-412b-470d-938c-8785be7fe7ab" containerName="init" Oct 03 14:27:22 crc kubenswrapper[4962]: I1003 14:27:22.601284 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="766f241c-412b-470d-938c-8785be7fe7ab" containerName="init" Oct 03 14:27:22 crc kubenswrapper[4962]: I1003 14:27:22.601547 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="766f241c-412b-470d-938c-8785be7fe7ab" containerName="dnsmasq-dns" Oct 03 14:27:22 crc kubenswrapper[4962]: I1003 14:27:22.602344 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-lnpxv" Oct 03 14:27:22 crc kubenswrapper[4962]: I1003 14:27:22.603864 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 14:27:22 crc kubenswrapper[4962]: I1003 14:27:22.616373 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-lnpxv"] Oct 03 14:27:22 crc kubenswrapper[4962]: I1003 14:27:22.620263 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 03 14:27:22 crc kubenswrapper[4962]: I1003 14:27:22.620682 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 03 14:27:22 crc kubenswrapper[4962]: I1003 14:27:22.704355 4962 generic.go:334] "Generic (PLEG): container finished" podID="fb7089a7-df54-44fd-b9b0-59182a38fbd6" containerID="cb42a8981803b840795f696d1cbd0885c56d902616d9db268be5fe92e264ab80" exitCode=0 Oct 03 14:27:22 crc kubenswrapper[4962]: I1003 14:27:22.704413 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fb7089a7-df54-44fd-b9b0-59182a38fbd6","Type":"ContainerDied","Data":"cb42a8981803b840795f696d1cbd0885c56d902616d9db268be5fe92e264ab80"} Oct 03 14:27:22 crc kubenswrapper[4962]: I1003 14:27:22.704472 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fb7089a7-df54-44fd-b9b0-59182a38fbd6","Type":"ContainerDied","Data":"af1e8682c41a60a1533b861ddf7712d307193c6119f6392d22019fd467c44380"} Oct 03 14:27:22 crc kubenswrapper[4962]: I1003 14:27:22.704504 4962 scope.go:117] "RemoveContainer" containerID="cb42a8981803b840795f696d1cbd0885c56d902616d9db268be5fe92e264ab80" Oct 03 14:27:22 crc kubenswrapper[4962]: I1003 14:27:22.704505 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 14:27:22 crc kubenswrapper[4962]: I1003 14:27:22.709130 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb7089a7-df54-44fd-b9b0-59182a38fbd6-combined-ca-bundle\") pod \"fb7089a7-df54-44fd-b9b0-59182a38fbd6\" (UID: \"fb7089a7-df54-44fd-b9b0-59182a38fbd6\") " Oct 03 14:27:22 crc kubenswrapper[4962]: I1003 14:27:22.709288 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb7089a7-df54-44fd-b9b0-59182a38fbd6-config-data\") pod \"fb7089a7-df54-44fd-b9b0-59182a38fbd6\" (UID: \"fb7089a7-df54-44fd-b9b0-59182a38fbd6\") " Oct 03 14:27:22 crc kubenswrapper[4962]: I1003 14:27:22.709370 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dr7g\" (UniqueName: \"kubernetes.io/projected/fb7089a7-df54-44fd-b9b0-59182a38fbd6-kube-api-access-7dr7g\") pod \"fb7089a7-df54-44fd-b9b0-59182a38fbd6\" (UID: \"fb7089a7-df54-44fd-b9b0-59182a38fbd6\") " Oct 03 14:27:22 crc kubenswrapper[4962]: I1003 14:27:22.709964 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8dnm\" (UniqueName: \"kubernetes.io/projected/0c1b2832-4711-4253-98c5-f8b543b55c80-kube-api-access-t8dnm\") pod \"nova-cell1-cell-mapping-lnpxv\" (UID: \"0c1b2832-4711-4253-98c5-f8b543b55c80\") " pod="openstack/nova-cell1-cell-mapping-lnpxv" Oct 03 14:27:22 crc kubenswrapper[4962]: I1003 14:27:22.710033 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c1b2832-4711-4253-98c5-f8b543b55c80-config-data\") pod 
\"nova-cell1-cell-mapping-lnpxv\" (UID: \"0c1b2832-4711-4253-98c5-f8b543b55c80\") " pod="openstack/nova-cell1-cell-mapping-lnpxv" Oct 03 14:27:22 crc kubenswrapper[4962]: I1003 14:27:22.710504 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c1b2832-4711-4253-98c5-f8b543b55c80-scripts\") pod \"nova-cell1-cell-mapping-lnpxv\" (UID: \"0c1b2832-4711-4253-98c5-f8b543b55c80\") " pod="openstack/nova-cell1-cell-mapping-lnpxv" Oct 03 14:27:22 crc kubenswrapper[4962]: I1003 14:27:22.710738 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c1b2832-4711-4253-98c5-f8b543b55c80-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-lnpxv\" (UID: \"0c1b2832-4711-4253-98c5-f8b543b55c80\") " pod="openstack/nova-cell1-cell-mapping-lnpxv" Oct 03 14:27:22 crc kubenswrapper[4962]: I1003 14:27:22.727019 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb7089a7-df54-44fd-b9b0-59182a38fbd6-kube-api-access-7dr7g" (OuterVolumeSpecName: "kube-api-access-7dr7g") pod "fb7089a7-df54-44fd-b9b0-59182a38fbd6" (UID: "fb7089a7-df54-44fd-b9b0-59182a38fbd6"). InnerVolumeSpecName "kube-api-access-7dr7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:27:22 crc kubenswrapper[4962]: I1003 14:27:22.736350 4962 scope.go:117] "RemoveContainer" containerID="cb42a8981803b840795f696d1cbd0885c56d902616d9db268be5fe92e264ab80" Oct 03 14:27:22 crc kubenswrapper[4962]: E1003 14:27:22.736843 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb42a8981803b840795f696d1cbd0885c56d902616d9db268be5fe92e264ab80\": container with ID starting with cb42a8981803b840795f696d1cbd0885c56d902616d9db268be5fe92e264ab80 not found: ID does not exist" containerID="cb42a8981803b840795f696d1cbd0885c56d902616d9db268be5fe92e264ab80" Oct 03 14:27:22 crc kubenswrapper[4962]: I1003 14:27:22.736891 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb42a8981803b840795f696d1cbd0885c56d902616d9db268be5fe92e264ab80"} err="failed to get container status \"cb42a8981803b840795f696d1cbd0885c56d902616d9db268be5fe92e264ab80\": rpc error: code = NotFound desc = could not find container \"cb42a8981803b840795f696d1cbd0885c56d902616d9db268be5fe92e264ab80\": container with ID starting with cb42a8981803b840795f696d1cbd0885c56d902616d9db268be5fe92e264ab80 not found: ID does not exist" Oct 03 14:27:22 crc kubenswrapper[4962]: I1003 14:27:22.746563 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb7089a7-df54-44fd-b9b0-59182a38fbd6-config-data" (OuterVolumeSpecName: "config-data") pod "fb7089a7-df54-44fd-b9b0-59182a38fbd6" (UID: "fb7089a7-df54-44fd-b9b0-59182a38fbd6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:27:22 crc kubenswrapper[4962]: I1003 14:27:22.746920 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb7089a7-df54-44fd-b9b0-59182a38fbd6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb7089a7-df54-44fd-b9b0-59182a38fbd6" (UID: "fb7089a7-df54-44fd-b9b0-59182a38fbd6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:27:22 crc kubenswrapper[4962]: I1003 14:27:22.813314 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8dnm\" (UniqueName: \"kubernetes.io/projected/0c1b2832-4711-4253-98c5-f8b543b55c80-kube-api-access-t8dnm\") pod \"nova-cell1-cell-mapping-lnpxv\" (UID: \"0c1b2832-4711-4253-98c5-f8b543b55c80\") " pod="openstack/nova-cell1-cell-mapping-lnpxv" Oct 03 14:27:22 crc kubenswrapper[4962]: I1003 14:27:22.813407 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c1b2832-4711-4253-98c5-f8b543b55c80-config-data\") pod \"nova-cell1-cell-mapping-lnpxv\" (UID: \"0c1b2832-4711-4253-98c5-f8b543b55c80\") " pod="openstack/nova-cell1-cell-mapping-lnpxv" Oct 03 14:27:22 crc kubenswrapper[4962]: I1003 14:27:22.813528 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c1b2832-4711-4253-98c5-f8b543b55c80-scripts\") pod \"nova-cell1-cell-mapping-lnpxv\" (UID: \"0c1b2832-4711-4253-98c5-f8b543b55c80\") " pod="openstack/nova-cell1-cell-mapping-lnpxv" Oct 03 14:27:22 crc kubenswrapper[4962]: I1003 14:27:22.813597 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c1b2832-4711-4253-98c5-f8b543b55c80-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-lnpxv\" (UID: \"0c1b2832-4711-4253-98c5-f8b543b55c80\") " pod="openstack/nova-cell1-cell-mapping-lnpxv" Oct 03 14:27:22 crc kubenswrapper[4962]: I1003 14:27:22.813684 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb7089a7-df54-44fd-b9b0-59182a38fbd6-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:27:22 crc kubenswrapper[4962]: I1003 14:27:22.815372 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dr7g\" (UniqueName: \"kubernetes.io/projected/fb7089a7-df54-44fd-b9b0-59182a38fbd6-kube-api-access-7dr7g\") on node \"crc\" DevicePath \"\"" Oct 03 14:27:22 crc kubenswrapper[4962]: I1003 14:27:22.815408 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb7089a7-df54-44fd-b9b0-59182a38fbd6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:27:22 crc kubenswrapper[4962]: I1003 14:27:22.819679 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c1b2832-4711-4253-98c5-f8b543b55c80-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-lnpxv\" (UID: \"0c1b2832-4711-4253-98c5-f8b543b55c80\") " pod="openstack/nova-cell1-cell-mapping-lnpxv" Oct 03 14:27:22 crc kubenswrapper[4962]: I1003 14:27:22.820339 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c1b2832-4711-4253-98c5-f8b543b55c80-config-data\") pod \"nova-cell1-cell-mapping-lnpxv\" (UID: \"0c1b2832-4711-4253-98c5-f8b543b55c80\") " pod="openstack/nova-cell1-cell-mapping-lnpxv" Oct 03 14:27:22 crc kubenswrapper[4962]: I1003 14:27:22.823563 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c1b2832-4711-4253-98c5-f8b543b55c80-scripts\") pod \"nova-cell1-cell-mapping-lnpxv\" (UID: \"0c1b2832-4711-4253-98c5-f8b543b55c80\") " pod="openstack/nova-cell1-cell-mapping-lnpxv" 
Oct 03 14:27:22 crc kubenswrapper[4962]: I1003 14:27:22.836485 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8dnm\" (UniqueName: \"kubernetes.io/projected/0c1b2832-4711-4253-98c5-f8b543b55c80-kube-api-access-t8dnm\") pod \"nova-cell1-cell-mapping-lnpxv\" (UID: \"0c1b2832-4711-4253-98c5-f8b543b55c80\") " pod="openstack/nova-cell1-cell-mapping-lnpxv"
Oct 03 14:27:22 crc kubenswrapper[4962]: I1003 14:27:22.935139 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-lnpxv"
Oct 03 14:27:23 crc kubenswrapper[4962]: I1003 14:27:23.073000 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 03 14:27:23 crc kubenswrapper[4962]: I1003 14:27:23.103727 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 03 14:27:23 crc kubenswrapper[4962]: I1003 14:27:23.113100 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Oct 03 14:27:23 crc kubenswrapper[4962]: E1003 14:27:23.113779 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb7089a7-df54-44fd-b9b0-59182a38fbd6" containerName="nova-scheduler-scheduler"
Oct 03 14:27:23 crc kubenswrapper[4962]: I1003 14:27:23.113797 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb7089a7-df54-44fd-b9b0-59182a38fbd6" containerName="nova-scheduler-scheduler"
Oct 03 14:27:23 crc kubenswrapper[4962]: I1003 14:27:23.114038 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb7089a7-df54-44fd-b9b0-59182a38fbd6" containerName="nova-scheduler-scheduler"
Oct 03 14:27:23 crc kubenswrapper[4962]: I1003 14:27:23.115007 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 03 14:27:23 crc kubenswrapper[4962]: I1003 14:27:23.122018 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Oct 03 14:27:23 crc kubenswrapper[4962]: I1003 14:27:23.123828 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 03 14:27:23 crc kubenswrapper[4962]: I1003 14:27:23.223858 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n8zx\" (UniqueName: \"kubernetes.io/projected/30e600e2-b5dc-4982-9021-bc28e2403ef0-kube-api-access-8n8zx\") pod \"nova-scheduler-0\" (UID: \"30e600e2-b5dc-4982-9021-bc28e2403ef0\") " pod="openstack/nova-scheduler-0"
Oct 03 14:27:23 crc kubenswrapper[4962]: I1003 14:27:23.224051 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30e600e2-b5dc-4982-9021-bc28e2403ef0-config-data\") pod \"nova-scheduler-0\" (UID: \"30e600e2-b5dc-4982-9021-bc28e2403ef0\") " pod="openstack/nova-scheduler-0"
Oct 03 14:27:23 crc kubenswrapper[4962]: I1003 14:27:23.224095 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30e600e2-b5dc-4982-9021-bc28e2403ef0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"30e600e2-b5dc-4982-9021-bc28e2403ef0\") " pod="openstack/nova-scheduler-0"
Oct 03 14:27:23 crc kubenswrapper[4962]: I1003 14:27:23.326738 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30e600e2-b5dc-4982-9021-bc28e2403ef0-config-data\") pod \"nova-scheduler-0\" (UID: \"30e600e2-b5dc-4982-9021-bc28e2403ef0\") " pod="openstack/nova-scheduler-0"
Oct 03 14:27:23 crc kubenswrapper[4962]: I1003 14:27:23.326870 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30e600e2-b5dc-4982-9021-bc28e2403ef0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"30e600e2-b5dc-4982-9021-bc28e2403ef0\") " pod="openstack/nova-scheduler-0"
Oct 03 14:27:23 crc kubenswrapper[4962]: I1003 14:27:23.326937 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8n8zx\" (UniqueName: \"kubernetes.io/projected/30e600e2-b5dc-4982-9021-bc28e2403ef0-kube-api-access-8n8zx\") pod \"nova-scheduler-0\" (UID: \"30e600e2-b5dc-4982-9021-bc28e2403ef0\") " pod="openstack/nova-scheduler-0"
Oct 03 14:27:23 crc kubenswrapper[4962]: I1003 14:27:23.337505 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30e600e2-b5dc-4982-9021-bc28e2403ef0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"30e600e2-b5dc-4982-9021-bc28e2403ef0\") " pod="openstack/nova-scheduler-0"
Oct 03 14:27:23 crc kubenswrapper[4962]: I1003 14:27:23.338018 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30e600e2-b5dc-4982-9021-bc28e2403ef0-config-data\") pod \"nova-scheduler-0\" (UID: \"30e600e2-b5dc-4982-9021-bc28e2403ef0\") " pod="openstack/nova-scheduler-0"
Oct 03 14:27:23 crc kubenswrapper[4962]: I1003 14:27:23.345251 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n8zx\" (UniqueName: \"kubernetes.io/projected/30e600e2-b5dc-4982-9021-bc28e2403ef0-kube-api-access-8n8zx\") pod \"nova-scheduler-0\" (UID: \"30e600e2-b5dc-4982-9021-bc28e2403ef0\") " pod="openstack/nova-scheduler-0"
Oct 03 14:27:23 crc kubenswrapper[4962]: I1003 14:27:23.448195 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 03 14:27:23 crc kubenswrapper[4962]: I1003 14:27:23.476381 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-lnpxv"]
Oct 03 14:27:23 crc kubenswrapper[4962]: I1003 14:27:23.718538 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-lnpxv" event={"ID":"0c1b2832-4711-4253-98c5-f8b543b55c80","Type":"ContainerStarted","Data":"4030968e65f60914b5a439d72cf60169b10d152c1d3f9d5de04ca20f32f9b0ed"}
Oct 03 14:27:23 crc kubenswrapper[4962]: I1003 14:27:23.718602 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-lnpxv" event={"ID":"0c1b2832-4711-4253-98c5-f8b543b55c80","Type":"ContainerStarted","Data":"30350ba7467eeb11e365d3ed3d70760105ebff7aacfb088ed44476c9a7b2f15b"}
Oct 03 14:27:23 crc kubenswrapper[4962]: I1003 14:27:23.738388 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-lnpxv" podStartSLOduration=1.738369453 podStartE2EDuration="1.738369453s" podCreationTimestamp="2025-10-03 14:27:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:27:23.733320201 +0000 UTC m=+5852.137218036" watchObservedRunningTime="2025-10-03 14:27:23.738369453 +0000 UTC m=+5852.142267288"
Oct 03 14:27:23 crc kubenswrapper[4962]: W1003 14:27:23.900001 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30e600e2_b5dc_4982_9021_bc28e2403ef0.slice/crio-3f877675d33c6570ae620cd383fd352c7237172e18bf200ed6b7813490ef3ee2 WatchSource:0}: Error finding container 3f877675d33c6570ae620cd383fd352c7237172e18bf200ed6b7813490ef3ee2: Status 404 returned error can't find the container with id 3f877675d33c6570ae620cd383fd352c7237172e18bf200ed6b7813490ef3ee2
Oct 03 14:27:23 crc kubenswrapper[4962]: I1003 14:27:23.901714 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 03 14:27:24 crc kubenswrapper[4962]: I1003 14:27:24.051923 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Oct 03 14:27:24 crc kubenswrapper[4962]: I1003 14:27:24.051977 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Oct 03 14:27:24 crc kubenswrapper[4962]: I1003 14:27:24.247120 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb7089a7-df54-44fd-b9b0-59182a38fbd6" path="/var/lib/kubelet/pods/fb7089a7-df54-44fd-b9b0-59182a38fbd6/volumes"
Oct 03 14:27:24 crc kubenswrapper[4962]: I1003 14:27:24.750786 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"30e600e2-b5dc-4982-9021-bc28e2403ef0","Type":"ContainerStarted","Data":"b4d06888d9dcdad2cb8e1089e6f6e3360f678607c15ea3c71503da5d8d6261a0"}
Oct 03 14:27:24 crc kubenswrapper[4962]: I1003 14:27:24.751083 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"30e600e2-b5dc-4982-9021-bc28e2403ef0","Type":"ContainerStarted","Data":"3f877675d33c6570ae620cd383fd352c7237172e18bf200ed6b7813490ef3ee2"}
Oct 03 14:27:24 crc kubenswrapper[4962]: I1003 14:27:24.776821 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.7767997260000001 podStartE2EDuration="1.776799726s" podCreationTimestamp="2025-10-03 14:27:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:27:24.772176905 +0000 UTC m=+5853.176074750" watchObservedRunningTime="2025-10-03 14:27:24.776799726 +0000 UTC m=+5853.180697561"
Oct 03 14:27:28 crc kubenswrapper[4962]: I1003 14:27:28.448534 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Oct 03 14:27:28 crc kubenswrapper[4962]: I1003 14:27:28.786295 4962 generic.go:334] "Generic (PLEG): container finished" podID="0c1b2832-4711-4253-98c5-f8b543b55c80" containerID="4030968e65f60914b5a439d72cf60169b10d152c1d3f9d5de04ca20f32f9b0ed" exitCode=0
Oct 03 14:27:28 crc kubenswrapper[4962]: I1003 14:27:28.786337 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-lnpxv" event={"ID":"0c1b2832-4711-4253-98c5-f8b543b55c80","Type":"ContainerDied","Data":"4030968e65f60914b5a439d72cf60169b10d152c1d3f9d5de04ca20f32f9b0ed"}
Oct 03 14:27:29 crc kubenswrapper[4962]: I1003 14:27:29.052512 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Oct 03 14:27:29 crc kubenswrapper[4962]: I1003 14:27:29.052796 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Oct 03 14:27:29 crc kubenswrapper[4962]: I1003 14:27:29.079752 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 03 14:27:29 crc kubenswrapper[4962]: I1003 14:27:29.079907 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 03 14:27:30 crc kubenswrapper[4962]: I1003 14:27:30.129240 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-lnpxv"
Oct 03 14:27:30 crc kubenswrapper[4962]: I1003 14:27:30.134844 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="21930e51-bd18-4f6f-bb70-8277dca3dd05" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.66:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 03 14:27:30 crc kubenswrapper[4962]: I1003 14:27:30.218829 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="21930e51-bd18-4f6f-bb70-8277dca3dd05" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.66:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 03 14:27:30 crc kubenswrapper[4962]: I1003 14:27:30.219013 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5d67733b-acbd-4d3c-9443-6ba06c5825e4" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.67:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 03 14:27:30 crc kubenswrapper[4962]: I1003 14:27:30.218868 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5d67733b-acbd-4d3c-9443-6ba06c5825e4" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.67:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 03 14:27:30 crc kubenswrapper[4962]: I1003 14:27:30.261099 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c1b2832-4711-4253-98c5-f8b543b55c80-combined-ca-bundle\") pod \"0c1b2832-4711-4253-98c5-f8b543b55c80\" (UID: \"0c1b2832-4711-4253-98c5-f8b543b55c80\") "
Oct 03 14:27:30 crc kubenswrapper[4962]: I1003 14:27:30.261192 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c1b2832-4711-4253-98c5-f8b543b55c80-config-data\") pod \"0c1b2832-4711-4253-98c5-f8b543b55c80\" (UID: \"0c1b2832-4711-4253-98c5-f8b543b55c80\") "
Oct 03 14:27:30 crc kubenswrapper[4962]: I1003 14:27:30.261369 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8dnm\" (UniqueName: \"kubernetes.io/projected/0c1b2832-4711-4253-98c5-f8b543b55c80-kube-api-access-t8dnm\") pod \"0c1b2832-4711-4253-98c5-f8b543b55c80\" (UID: \"0c1b2832-4711-4253-98c5-f8b543b55c80\") "
Oct 03 14:27:30 crc kubenswrapper[4962]: I1003 14:27:30.261398 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c1b2832-4711-4253-98c5-f8b543b55c80-scripts\") pod \"0c1b2832-4711-4253-98c5-f8b543b55c80\" (UID: \"0c1b2832-4711-4253-98c5-f8b543b55c80\") "
Oct 03 14:27:30 crc kubenswrapper[4962]: I1003 14:27:30.279682 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c1b2832-4711-4253-98c5-f8b543b55c80-scripts" (OuterVolumeSpecName: "scripts") pod "0c1b2832-4711-4253-98c5-f8b543b55c80" (UID: "0c1b2832-4711-4253-98c5-f8b543b55c80"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:27:30 crc kubenswrapper[4962]: I1003 14:27:30.279734 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c1b2832-4711-4253-98c5-f8b543b55c80-kube-api-access-t8dnm" (OuterVolumeSpecName: "kube-api-access-t8dnm") pod "0c1b2832-4711-4253-98c5-f8b543b55c80" (UID: "0c1b2832-4711-4253-98c5-f8b543b55c80"). InnerVolumeSpecName "kube-api-access-t8dnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:27:30 crc kubenswrapper[4962]: I1003 14:27:30.291763 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c1b2832-4711-4253-98c5-f8b543b55c80-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c1b2832-4711-4253-98c5-f8b543b55c80" (UID: "0c1b2832-4711-4253-98c5-f8b543b55c80"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:27:30 crc kubenswrapper[4962]: I1003 14:27:30.293773 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c1b2832-4711-4253-98c5-f8b543b55c80-config-data" (OuterVolumeSpecName: "config-data") pod "0c1b2832-4711-4253-98c5-f8b543b55c80" (UID: "0c1b2832-4711-4253-98c5-f8b543b55c80"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:27:30 crc kubenswrapper[4962]: I1003 14:27:30.364065 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8dnm\" (UniqueName: \"kubernetes.io/projected/0c1b2832-4711-4253-98c5-f8b543b55c80-kube-api-access-t8dnm\") on node \"crc\" DevicePath \"\"" Oct 03 14:27:30 crc kubenswrapper[4962]: I1003 14:27:30.364107 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c1b2832-4711-4253-98c5-f8b543b55c80-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 14:27:30 crc kubenswrapper[4962]: I1003 14:27:30.364117 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c1b2832-4711-4253-98c5-f8b543b55c80-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:27:30 crc kubenswrapper[4962]: I1003 14:27:30.364126 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c1b2832-4711-4253-98c5-f8b543b55c80-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:27:30 crc kubenswrapper[4962]: I1003 14:27:30.803170 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-lnpxv" event={"ID":"0c1b2832-4711-4253-98c5-f8b543b55c80","Type":"ContainerDied","Data":"30350ba7467eeb11e365d3ed3d70760105ebff7aacfb088ed44476c9a7b2f15b"} Oct 03 14:27:30 crc kubenswrapper[4962]: I1003 14:27:30.803209 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30350ba7467eeb11e365d3ed3d70760105ebff7aacfb088ed44476c9a7b2f15b" Oct 03 14:27:30 crc kubenswrapper[4962]: I1003 14:27:30.803215 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-lnpxv" Oct 03 14:27:30 crc kubenswrapper[4962]: I1003 14:27:30.985439 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 14:27:30 crc kubenswrapper[4962]: I1003 14:27:30.985708 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="30e600e2-b5dc-4982-9021-bc28e2403ef0" containerName="nova-scheduler-scheduler" containerID="cri-o://b4d06888d9dcdad2cb8e1089e6f6e3360f678607c15ea3c71503da5d8d6261a0" gracePeriod=30 Oct 03 14:27:30 crc kubenswrapper[4962]: I1003 14:27:30.998190 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 14:27:30 crc kubenswrapper[4962]: I1003 14:27:30.998426 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5d67733b-acbd-4d3c-9443-6ba06c5825e4" containerName="nova-api-log" containerID="cri-o://300ad0d7d6280c24fa1eb14967ae551766141bb1d1f90fce5738ae77ded00b75" gracePeriod=30 Oct 03 14:27:30 crc kubenswrapper[4962]: I1003 14:27:30.998551 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5d67733b-acbd-4d3c-9443-6ba06c5825e4" containerName="nova-api-api" containerID="cri-o://7ad35815a4a1dbc0652a689021f72d85864f21ac5862414d8ee6d49bffc2f9bd" gracePeriod=30 Oct 03 14:27:31 crc kubenswrapper[4962]: I1003 14:27:31.014835 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 14:27:31 crc kubenswrapper[4962]: I1003 14:27:31.015240 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="21930e51-bd18-4f6f-bb70-8277dca3dd05" containerName="nova-metadata-log" containerID="cri-o://343b73d5fb711ce9f86b08b731accf97b35424d845f434a4e746539315a0e101" gracePeriod=30 Oct 03 14:27:31 crc kubenswrapper[4962]: I1003 14:27:31.015308 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="21930e51-bd18-4f6f-bb70-8277dca3dd05" containerName="nova-metadata-metadata" containerID="cri-o://5b80f5ce6c656b313ed8d263f968936ca7110679e272893263575ec6cc5475f7" gracePeriod=30 Oct 03 14:27:31 crc kubenswrapper[4962]: I1003 14:27:31.814895 4962 generic.go:334] "Generic (PLEG): container finished" podID="5d67733b-acbd-4d3c-9443-6ba06c5825e4" containerID="300ad0d7d6280c24fa1eb14967ae551766141bb1d1f90fce5738ae77ded00b75" exitCode=143 Oct 03 14:27:31 crc kubenswrapper[4962]: I1003 14:27:31.814974 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5d67733b-acbd-4d3c-9443-6ba06c5825e4","Type":"ContainerDied","Data":"300ad0d7d6280c24fa1eb14967ae551766141bb1d1f90fce5738ae77ded00b75"} Oct 03 14:27:31 crc kubenswrapper[4962]: I1003 14:27:31.818466 4962 generic.go:334] "Generic (PLEG): container finished" podID="21930e51-bd18-4f6f-bb70-8277dca3dd05" containerID="343b73d5fb711ce9f86b08b731accf97b35424d845f434a4e746539315a0e101" exitCode=143 Oct 03 14:27:31 crc kubenswrapper[4962]: I1003 14:27:31.818497 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"21930e51-bd18-4f6f-bb70-8277dca3dd05","Type":"ContainerDied","Data":"343b73d5fb711ce9f86b08b731accf97b35424d845f434a4e746539315a0e101"} Oct 03 14:27:34 crc kubenswrapper[4962]: I1003 14:27:34.843620 4962 generic.go:334] "Generic (PLEG): container finished" podID="21930e51-bd18-4f6f-bb70-8277dca3dd05" 
containerID="5b80f5ce6c656b313ed8d263f968936ca7110679e272893263575ec6cc5475f7" exitCode=0 Oct 03 14:27:34 crc kubenswrapper[4962]: I1003 14:27:34.843826 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"21930e51-bd18-4f6f-bb70-8277dca3dd05","Type":"ContainerDied","Data":"5b80f5ce6c656b313ed8d263f968936ca7110679e272893263575ec6cc5475f7"} Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.227564 4962 scope.go:117] "RemoveContainer" containerID="d7dae192729a16d2e0ed7a375c7ee4990d673d8d62f560c5335516db0b1730db" Oct 03 14:27:35 crc kubenswrapper[4962]: E1003 14:27:35.228240 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.525766 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.648676 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.666274 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21930e51-bd18-4f6f-bb70-8277dca3dd05-logs\") pod \"21930e51-bd18-4f6f-bb70-8277dca3dd05\" (UID: \"21930e51-bd18-4f6f-bb70-8277dca3dd05\") " Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.666489 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21930e51-bd18-4f6f-bb70-8277dca3dd05-combined-ca-bundle\") pod \"21930e51-bd18-4f6f-bb70-8277dca3dd05\" (UID: \"21930e51-bd18-4f6f-bb70-8277dca3dd05\") " Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.666605 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21930e51-bd18-4f6f-bb70-8277dca3dd05-config-data\") pod \"21930e51-bd18-4f6f-bb70-8277dca3dd05\" (UID: \"21930e51-bd18-4f6f-bb70-8277dca3dd05\") " Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.666810 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cxdv\" (UniqueName: \"kubernetes.io/projected/21930e51-bd18-4f6f-bb70-8277dca3dd05-kube-api-access-7cxdv\") pod \"21930e51-bd18-4f6f-bb70-8277dca3dd05\" (UID: \"21930e51-bd18-4f6f-bb70-8277dca3dd05\") " Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.669341 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21930e51-bd18-4f6f-bb70-8277dca3dd05-logs" (OuterVolumeSpecName: "logs") pod "21930e51-bd18-4f6f-bb70-8277dca3dd05" (UID: "21930e51-bd18-4f6f-bb70-8277dca3dd05"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.679154 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21930e51-bd18-4f6f-bb70-8277dca3dd05-kube-api-access-7cxdv" (OuterVolumeSpecName: "kube-api-access-7cxdv") pod "21930e51-bd18-4f6f-bb70-8277dca3dd05" (UID: "21930e51-bd18-4f6f-bb70-8277dca3dd05"). InnerVolumeSpecName "kube-api-access-7cxdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.701761 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21930e51-bd18-4f6f-bb70-8277dca3dd05-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "21930e51-bd18-4f6f-bb70-8277dca3dd05" (UID: "21930e51-bd18-4f6f-bb70-8277dca3dd05"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.704655 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21930e51-bd18-4f6f-bb70-8277dca3dd05-config-data" (OuterVolumeSpecName: "config-data") pod "21930e51-bd18-4f6f-bb70-8277dca3dd05" (UID: "21930e51-bd18-4f6f-bb70-8277dca3dd05"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.769592 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8n8zx\" (UniqueName: \"kubernetes.io/projected/30e600e2-b5dc-4982-9021-bc28e2403ef0-kube-api-access-8n8zx\") pod \"30e600e2-b5dc-4982-9021-bc28e2403ef0\" (UID: \"30e600e2-b5dc-4982-9021-bc28e2403ef0\") " Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.769709 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30e600e2-b5dc-4982-9021-bc28e2403ef0-config-data\") pod \"30e600e2-b5dc-4982-9021-bc28e2403ef0\" (UID: \"30e600e2-b5dc-4982-9021-bc28e2403ef0\") " Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.769785 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30e600e2-b5dc-4982-9021-bc28e2403ef0-combined-ca-bundle\") pod \"30e600e2-b5dc-4982-9021-bc28e2403ef0\" (UID: \"30e600e2-b5dc-4982-9021-bc28e2403ef0\") " Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.770095 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cxdv\" (UniqueName: \"kubernetes.io/projected/21930e51-bd18-4f6f-bb70-8277dca3dd05-kube-api-access-7cxdv\") on node \"crc\" DevicePath \"\"" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.770115 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21930e51-bd18-4f6f-bb70-8277dca3dd05-logs\") on node \"crc\" DevicePath \"\"" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.770124 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21930e51-bd18-4f6f-bb70-8277dca3dd05-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.770133 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21930e51-bd18-4f6f-bb70-8277dca3dd05-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:27:35 crc 
kubenswrapper[4962]: I1003 14:27:35.772622 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30e600e2-b5dc-4982-9021-bc28e2403ef0-kube-api-access-8n8zx" (OuterVolumeSpecName: "kube-api-access-8n8zx") pod "30e600e2-b5dc-4982-9021-bc28e2403ef0" (UID: "30e600e2-b5dc-4982-9021-bc28e2403ef0"). InnerVolumeSpecName "kube-api-access-8n8zx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.778076 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.802830 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30e600e2-b5dc-4982-9021-bc28e2403ef0-config-data" (OuterVolumeSpecName: "config-data") pod "30e600e2-b5dc-4982-9021-bc28e2403ef0" (UID: "30e600e2-b5dc-4982-9021-bc28e2403ef0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.807687 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30e600e2-b5dc-4982-9021-bc28e2403ef0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30e600e2-b5dc-4982-9021-bc28e2403ef0" (UID: "30e600e2-b5dc-4982-9021-bc28e2403ef0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.853590 4962 generic.go:334] "Generic (PLEG): container finished" podID="30e600e2-b5dc-4982-9021-bc28e2403ef0" containerID="b4d06888d9dcdad2cb8e1089e6f6e3360f678607c15ea3c71503da5d8d6261a0" exitCode=0 Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.853665 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.853664 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"30e600e2-b5dc-4982-9021-bc28e2403ef0","Type":"ContainerDied","Data":"b4d06888d9dcdad2cb8e1089e6f6e3360f678607c15ea3c71503da5d8d6261a0"} Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.854217 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"30e600e2-b5dc-4982-9021-bc28e2403ef0","Type":"ContainerDied","Data":"3f877675d33c6570ae620cd383fd352c7237172e18bf200ed6b7813490ef3ee2"} Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.854255 4962 scope.go:117] "RemoveContainer" containerID="b4d06888d9dcdad2cb8e1089e6f6e3360f678607c15ea3c71503da5d8d6261a0" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.859376 4962 generic.go:334] "Generic (PLEG): container finished" podID="5d67733b-acbd-4d3c-9443-6ba06c5825e4" containerID="7ad35815a4a1dbc0652a689021f72d85864f21ac5862414d8ee6d49bffc2f9bd" exitCode=0 Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.859444 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5d67733b-acbd-4d3c-9443-6ba06c5825e4","Type":"ContainerDied","Data":"7ad35815a4a1dbc0652a689021f72d85864f21ac5862414d8ee6d49bffc2f9bd"} Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.859479 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5d67733b-acbd-4d3c-9443-6ba06c5825e4","Type":"ContainerDied","Data":"f8525a991a689b906696b321cf21d14bec97b246fcb2a8a31fa9fa50fb94d061"} Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.859556 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.865464 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"21930e51-bd18-4f6f-bb70-8277dca3dd05","Type":"ContainerDied","Data":"19691f4444d2156ad4b999536538a94ff91a096e648b5fbf158648f8456fe33f"} Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.865564 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.871995 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d67733b-acbd-4d3c-9443-6ba06c5825e4-config-data\") pod \"5d67733b-acbd-4d3c-9443-6ba06c5825e4\" (UID: \"5d67733b-acbd-4d3c-9443-6ba06c5825e4\") " Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.872102 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d67733b-acbd-4d3c-9443-6ba06c5825e4-logs\") pod \"5d67733b-acbd-4d3c-9443-6ba06c5825e4\" (UID: \"5d67733b-acbd-4d3c-9443-6ba06c5825e4\") " Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.872158 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d67733b-acbd-4d3c-9443-6ba06c5825e4-combined-ca-bundle\") pod \"5d67733b-acbd-4d3c-9443-6ba06c5825e4\" (UID: \"5d67733b-acbd-4d3c-9443-6ba06c5825e4\") " Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.872389 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7tm9\" (UniqueName: \"kubernetes.io/projected/5d67733b-acbd-4d3c-9443-6ba06c5825e4-kube-api-access-b7tm9\") pod \"5d67733b-acbd-4d3c-9443-6ba06c5825e4\" (UID: \"5d67733b-acbd-4d3c-9443-6ba06c5825e4\") " Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.872884 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30e600e2-b5dc-4982-9021-bc28e2403ef0-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.872905 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30e600e2-b5dc-4982-9021-bc28e2403ef0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.872922 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8n8zx\" (UniqueName: \"kubernetes.io/projected/30e600e2-b5dc-4982-9021-bc28e2403ef0-kube-api-access-8n8zx\") on node \"crc\" DevicePath \"\"" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.873088 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d67733b-acbd-4d3c-9443-6ba06c5825e4-logs" (OuterVolumeSpecName: "logs") pod "5d67733b-acbd-4d3c-9443-6ba06c5825e4" (UID: "5d67733b-acbd-4d3c-9443-6ba06c5825e4"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.879282 4962 scope.go:117] "RemoveContainer" containerID="b4d06888d9dcdad2cb8e1089e6f6e3360f678607c15ea3c71503da5d8d6261a0" Oct 03 14:27:35 crc kubenswrapper[4962]: E1003 14:27:35.884621 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4d06888d9dcdad2cb8e1089e6f6e3360f678607c15ea3c71503da5d8d6261a0\": container with ID starting with b4d06888d9dcdad2cb8e1089e6f6e3360f678607c15ea3c71503da5d8d6261a0 not found: ID does not exist" containerID="b4d06888d9dcdad2cb8e1089e6f6e3360f678607c15ea3c71503da5d8d6261a0" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.884682 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4d06888d9dcdad2cb8e1089e6f6e3360f678607c15ea3c71503da5d8d6261a0"} err="failed to get container status \"b4d06888d9dcdad2cb8e1089e6f6e3360f678607c15ea3c71503da5d8d6261a0\": rpc error: code = NotFound desc = could not find container \"b4d06888d9dcdad2cb8e1089e6f6e3360f678607c15ea3c71503da5d8d6261a0\": container with ID starting with b4d06888d9dcdad2cb8e1089e6f6e3360f678607c15ea3c71503da5d8d6261a0 not found: ID does not exist" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.884712 4962 scope.go:117] "RemoveContainer" containerID="7ad35815a4a1dbc0652a689021f72d85864f21ac5862414d8ee6d49bffc2f9bd" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.890093 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d67733b-acbd-4d3c-9443-6ba06c5825e4-kube-api-access-b7tm9" (OuterVolumeSpecName: "kube-api-access-b7tm9") pod "5d67733b-acbd-4d3c-9443-6ba06c5825e4" (UID: "5d67733b-acbd-4d3c-9443-6ba06c5825e4"). InnerVolumeSpecName "kube-api-access-b7tm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.899740 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.918625 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d67733b-acbd-4d3c-9443-6ba06c5825e4-config-data" (OuterVolumeSpecName: "config-data") pod "5d67733b-acbd-4d3c-9443-6ba06c5825e4" (UID: "5d67733b-acbd-4d3c-9443-6ba06c5825e4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.919356 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.940086 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d67733b-acbd-4d3c-9443-6ba06c5825e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d67733b-acbd-4d3c-9443-6ba06c5825e4" (UID: "5d67733b-acbd-4d3c-9443-6ba06c5825e4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.944200 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 14:27:35 crc kubenswrapper[4962]: E1003 14:27:35.945155 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21930e51-bd18-4f6f-bb70-8277dca3dd05" containerName="nova-metadata-log" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.945175 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="21930e51-bd18-4f6f-bb70-8277dca3dd05" containerName="nova-metadata-log" Oct 03 14:27:35 crc kubenswrapper[4962]: E1003 14:27:35.945202 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d67733b-acbd-4d3c-9443-6ba06c5825e4" containerName="nova-api-log" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.945208 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d67733b-acbd-4d3c-9443-6ba06c5825e4" containerName="nova-api-log" Oct 03 14:27:35 crc kubenswrapper[4962]: E1003 14:27:35.945222 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30e600e2-b5dc-4982-9021-bc28e2403ef0" containerName="nova-scheduler-scheduler" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.945229 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="30e600e2-b5dc-4982-9021-bc28e2403ef0" containerName="nova-scheduler-scheduler" Oct 03 14:27:35 crc kubenswrapper[4962]: E1003 14:27:35.945258 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21930e51-bd18-4f6f-bb70-8277dca3dd05" containerName="nova-metadata-metadata" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.945264 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="21930e51-bd18-4f6f-bb70-8277dca3dd05" containerName="nova-metadata-metadata" Oct 03 14:27:35 crc kubenswrapper[4962]: E1003 14:27:35.945274 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c1b2832-4711-4253-98c5-f8b543b55c80" containerName="nova-manage" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.945281 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c1b2832-4711-4253-98c5-f8b543b55c80" containerName="nova-manage" Oct 03 14:27:35 crc kubenswrapper[4962]: E1003 14:27:35.945295 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d67733b-acbd-4d3c-9443-6ba06c5825e4" containerName="nova-api-api" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.945303 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d67733b-acbd-4d3c-9443-6ba06c5825e4" containerName="nova-api-api" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.945490 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="21930e51-bd18-4f6f-bb70-8277dca3dd05" containerName="nova-metadata-metadata" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.945536 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="30e600e2-b5dc-4982-9021-bc28e2403ef0" containerName="nova-scheduler-scheduler" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.945553 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d67733b-acbd-4d3c-9443-6ba06c5825e4" containerName="nova-api-api" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.945563 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="21930e51-bd18-4f6f-bb70-8277dca3dd05" containerName="nova-metadata-log" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.945583 4962 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="0c1b2832-4711-4253-98c5-f8b543b55c80" containerName="nova-manage" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.945590 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d67733b-acbd-4d3c-9443-6ba06c5825e4" containerName="nova-api-log" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.947995 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.950152 4962 scope.go:117] "RemoveContainer" containerID="300ad0d7d6280c24fa1eb14967ae551766141bb1d1f90fce5738ae77ded00b75" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.958724 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.961085 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.973989 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.976015 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d67733b-acbd-4d3c-9443-6ba06c5825e4-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.976064 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d67733b-acbd-4d3c-9443-6ba06c5825e4-logs\") on node \"crc\" DevicePath \"\"" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.976078 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d67733b-acbd-4d3c-9443-6ba06c5825e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.976091 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7tm9\" (UniqueName: \"kubernetes.io/projected/5d67733b-acbd-4d3c-9443-6ba06c5825e4-kube-api-access-b7tm9\") on node \"crc\" DevicePath \"\"" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.979576 4962 scope.go:117] "RemoveContainer" containerID="7ad35815a4a1dbc0652a689021f72d85864f21ac5862414d8ee6d49bffc2f9bd" Oct 03 14:27:35 crc kubenswrapper[4962]: E1003 14:27:35.980308 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ad35815a4a1dbc0652a689021f72d85864f21ac5862414d8ee6d49bffc2f9bd\": container with ID starting with 7ad35815a4a1dbc0652a689021f72d85864f21ac5862414d8ee6d49bffc2f9bd not found: ID does not exist" containerID="7ad35815a4a1dbc0652a689021f72d85864f21ac5862414d8ee6d49bffc2f9bd" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.980381 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ad35815a4a1dbc0652a689021f72d85864f21ac5862414d8ee6d49bffc2f9bd"} err="failed to get container status \"7ad35815a4a1dbc0652a689021f72d85864f21ac5862414d8ee6d49bffc2f9bd\": rpc error: code = NotFound desc = could not find container \"7ad35815a4a1dbc0652a689021f72d85864f21ac5862414d8ee6d49bffc2f9bd\": container with ID starting with 7ad35815a4a1dbc0652a689021f72d85864f21ac5862414d8ee6d49bffc2f9bd not found: ID does not exist" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.980416 4962 scope.go:117] "RemoveContainer" 
containerID="300ad0d7d6280c24fa1eb14967ae551766141bb1d1f90fce5738ae77ded00b75" Oct 03 14:27:35 crc kubenswrapper[4962]: E1003 14:27:35.980883 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"300ad0d7d6280c24fa1eb14967ae551766141bb1d1f90fce5738ae77ded00b75\": container with ID starting with 300ad0d7d6280c24fa1eb14967ae551766141bb1d1f90fce5738ae77ded00b75 not found: ID does not exist" containerID="300ad0d7d6280c24fa1eb14967ae551766141bb1d1f90fce5738ae77ded00b75" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.980923 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"300ad0d7d6280c24fa1eb14967ae551766141bb1d1f90fce5738ae77ded00b75"} err="failed to get container status \"300ad0d7d6280c24fa1eb14967ae551766141bb1d1f90fce5738ae77ded00b75\": rpc error: code = NotFound desc = could not find container \"300ad0d7d6280c24fa1eb14967ae551766141bb1d1f90fce5738ae77ded00b75\": container with ID starting with 300ad0d7d6280c24fa1eb14967ae551766141bb1d1f90fce5738ae77ded00b75 not found: ID does not exist" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.980957 4962 scope.go:117] "RemoveContainer" containerID="5b80f5ce6c656b313ed8d263f968936ca7110679e272893263575ec6cc5475f7" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.984853 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.992997 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.994993 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 14:27:35 crc kubenswrapper[4962]: I1003 14:27:35.997079 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 03 14:27:36 crc kubenswrapper[4962]: I1003 14:27:36.012729 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 14:27:36 crc kubenswrapper[4962]: I1003 14:27:36.013304 4962 scope.go:117] "RemoveContainer" containerID="343b73d5fb711ce9f86b08b731accf97b35424d845f434a4e746539315a0e101" Oct 03 14:27:36 crc kubenswrapper[4962]: I1003 14:27:36.077151 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0d11367-1fa5-472a-8046-6e1719195a2f-logs\") pod \"nova-metadata-0\" (UID: \"a0d11367-1fa5-472a-8046-6e1719195a2f\") " pod="openstack/nova-metadata-0" Oct 03 14:27:36 crc kubenswrapper[4962]: I1003 14:27:36.077198 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b5ed01-2476-408b-b9d8-ff1ef2ad5923-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"04b5ed01-2476-408b-b9d8-ff1ef2ad5923\") " pod="openstack/nova-scheduler-0" Oct 03 14:27:36 crc kubenswrapper[4962]: I1003 14:27:36.077243 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04b5ed01-2476-408b-b9d8-ff1ef2ad5923-config-data\") pod \"nova-scheduler-0\" (UID: \"04b5ed01-2476-408b-b9d8-ff1ef2ad5923\") " pod="openstack/nova-scheduler-0" Oct 03 14:27:36 crc kubenswrapper[4962]: I1003 14:27:36.077293 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0d11367-1fa5-472a-8046-6e1719195a2f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a0d11367-1fa5-472a-8046-6e1719195a2f\") " pod="openstack/nova-metadata-0" Oct 03 14:27:36 crc kubenswrapper[4962]: I1003 14:27:36.077318 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl298\" (UniqueName: \"kubernetes.io/projected/04b5ed01-2476-408b-b9d8-ff1ef2ad5923-kube-api-access-pl298\") pod \"nova-scheduler-0\" (UID: \"04b5ed01-2476-408b-b9d8-ff1ef2ad5923\") " pod="openstack/nova-scheduler-0" Oct 03 14:27:36 crc kubenswrapper[4962]: I1003 14:27:36.077459 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0d11367-1fa5-472a-8046-6e1719195a2f-config-data\") pod \"nova-metadata-0\" (UID: \"a0d11367-1fa5-472a-8046-6e1719195a2f\") " pod="openstack/nova-metadata-0" Oct 03 14:27:36 crc kubenswrapper[4962]: I1003 14:27:36.077477 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfzzp\" (UniqueName: \"kubernetes.io/projected/a0d11367-1fa5-472a-8046-6e1719195a2f-kube-api-access-qfzzp\") pod \"nova-metadata-0\" (UID: \"a0d11367-1fa5-472a-8046-6e1719195a2f\") " pod="openstack/nova-metadata-0" Oct 03 14:27:36 crc kubenswrapper[4962]: I1003 14:27:36.179461 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04b5ed01-2476-408b-b9d8-ff1ef2ad5923-config-data\") pod \"nova-scheduler-0\" (UID: \"04b5ed01-2476-408b-b9d8-ff1ef2ad5923\") " pod="openstack/nova-scheduler-0" Oct 03 14:27:36 crc kubenswrapper[4962]: I1003 14:27:36.179524 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0d11367-1fa5-472a-8046-6e1719195a2f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a0d11367-1fa5-472a-8046-6e1719195a2f\") " pod="openstack/nova-metadata-0" Oct 03 14:27:36 crc kubenswrapper[4962]: I1003 14:27:36.179561 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl298\" (UniqueName: \"kubernetes.io/projected/04b5ed01-2476-408b-b9d8-ff1ef2ad5923-kube-api-access-pl298\") pod \"nova-scheduler-0\" (UID: \"04b5ed01-2476-408b-b9d8-ff1ef2ad5923\") " pod="openstack/nova-scheduler-0" Oct 03 14:27:36 crc kubenswrapper[4962]: I1003 14:27:36.179606 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0d11367-1fa5-472a-8046-6e1719195a2f-config-data\") pod \"nova-metadata-0\" (UID: \"a0d11367-1fa5-472a-8046-6e1719195a2f\") " pod="openstack/nova-metadata-0" Oct 03 14:27:36 crc kubenswrapper[4962]: I1003 14:27:36.179627 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfzzp\" (UniqueName: \"kubernetes.io/projected/a0d11367-1fa5-472a-8046-6e1719195a2f-kube-api-access-qfzzp\") pod \"nova-metadata-0\" (UID: \"a0d11367-1fa5-472a-8046-6e1719195a2f\") " pod="openstack/nova-metadata-0" Oct 03 14:27:36 crc kubenswrapper[4962]: I1003 14:27:36.179712 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0d11367-1fa5-472a-8046-6e1719195a2f-logs\") pod \"nova-metadata-0\" 
(UID: \"a0d11367-1fa5-472a-8046-6e1719195a2f\") " pod="openstack/nova-metadata-0" Oct 03 14:27:36 crc kubenswrapper[4962]: I1003 14:27:36.179729 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b5ed01-2476-408b-b9d8-ff1ef2ad5923-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"04b5ed01-2476-408b-b9d8-ff1ef2ad5923\") " pod="openstack/nova-scheduler-0" Oct 03 14:27:36 crc kubenswrapper[4962]: I1003 14:27:36.180173 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0d11367-1fa5-472a-8046-6e1719195a2f-logs\") pod \"nova-metadata-0\" (UID: \"a0d11367-1fa5-472a-8046-6e1719195a2f\") " pod="openstack/nova-metadata-0" Oct 03 14:27:36 crc kubenswrapper[4962]: I1003 14:27:36.184261 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04b5ed01-2476-408b-b9d8-ff1ef2ad5923-config-data\") pod \"nova-scheduler-0\" (UID: \"04b5ed01-2476-408b-b9d8-ff1ef2ad5923\") " pod="openstack/nova-scheduler-0" Oct 03 14:27:36 crc kubenswrapper[4962]: I1003 14:27:36.184730 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0d11367-1fa5-472a-8046-6e1719195a2f-config-data\") pod \"nova-metadata-0\" (UID: \"a0d11367-1fa5-472a-8046-6e1719195a2f\") " pod="openstack/nova-metadata-0" Oct 03 14:27:36 crc kubenswrapper[4962]: I1003 14:27:36.185360 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b5ed01-2476-408b-b9d8-ff1ef2ad5923-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"04b5ed01-2476-408b-b9d8-ff1ef2ad5923\") " pod="openstack/nova-scheduler-0" Oct 03 14:27:36 crc kubenswrapper[4962]: I1003 14:27:36.185360 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0d11367-1fa5-472a-8046-6e1719195a2f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a0d11367-1fa5-472a-8046-6e1719195a2f\") " pod="openstack/nova-metadata-0" Oct 03 14:27:36 crc kubenswrapper[4962]: I1003 14:27:36.197270 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl298\" (UniqueName: \"kubernetes.io/projected/04b5ed01-2476-408b-b9d8-ff1ef2ad5923-kube-api-access-pl298\") pod \"nova-scheduler-0\" (UID: \"04b5ed01-2476-408b-b9d8-ff1ef2ad5923\") " pod="openstack/nova-scheduler-0" Oct 03 14:27:36 crc kubenswrapper[4962]: I1003 14:27:36.199784 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfzzp\" (UniqueName: \"kubernetes.io/projected/a0d11367-1fa5-472a-8046-6e1719195a2f-kube-api-access-qfzzp\") pod \"nova-metadata-0\" (UID: \"a0d11367-1fa5-472a-8046-6e1719195a2f\") " pod="openstack/nova-metadata-0" Oct 03 14:27:36 crc kubenswrapper[4962]: I1003 14:27:36.211019 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 14:27:36 crc kubenswrapper[4962]: I1003 14:27:36.222927 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 03 14:27:36 crc kubenswrapper[4962]: I1003 14:27:36.238679 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21930e51-bd18-4f6f-bb70-8277dca3dd05" path="/var/lib/kubelet/pods/21930e51-bd18-4f6f-bb70-8277dca3dd05/volumes" Oct 03 14:27:36 crc kubenswrapper[4962]: I1003 14:27:36.239727 4962 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30e600e2-b5dc-4982-9021-bc28e2403ef0" path="/var/lib/kubelet/pods/30e600e2-b5dc-4982-9021-bc28e2403ef0/volumes" Oct 03 14:27:36 crc kubenswrapper[4962]: I1003 14:27:36.240356 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d67733b-acbd-4d3c-9443-6ba06c5825e4" path="/var/lib/kubelet/pods/5d67733b-acbd-4d3c-9443-6ba06c5825e4/volumes" Oct 03 14:27:36 crc kubenswrapper[4962]: I1003 14:27:36.252863 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 03 14:27:36 crc kubenswrapper[4962]: I1003 14:27:36.254335 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 14:27:36 crc kubenswrapper[4962]: I1003 14:27:36.254426 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 14:27:36 crc kubenswrapper[4962]: I1003 14:27:36.256379 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 03 14:27:36 crc kubenswrapper[4962]: I1003 14:27:36.280651 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 14:27:36 crc kubenswrapper[4962]: I1003 14:27:36.318273 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 14:27:36 crc kubenswrapper[4962]: I1003 14:27:36.383384 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f01f82f4-f27f-4da3-83f7-ac88ca54d880-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f01f82f4-f27f-4da3-83f7-ac88ca54d880\") " pod="openstack/nova-api-0" Oct 03 14:27:36 crc kubenswrapper[4962]: I1003 14:27:36.384400 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f01f82f4-f27f-4da3-83f7-ac88ca54d880-logs\") pod \"nova-api-0\" (UID: \"f01f82f4-f27f-4da3-83f7-ac88ca54d880\") " pod="openstack/nova-api-0" Oct 03 14:27:36 crc kubenswrapper[4962]: I1003 14:27:36.384437 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9chd\" (UniqueName: \"kubernetes.io/projected/f01f82f4-f27f-4da3-83f7-ac88ca54d880-kube-api-access-q9chd\") pod \"nova-api-0\" (UID: \"f01f82f4-f27f-4da3-83f7-ac88ca54d880\") " pod="openstack/nova-api-0" Oct 03 14:27:36 crc kubenswrapper[4962]: I1003 14:27:36.384504 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f01f82f4-f27f-4da3-83f7-ac88ca54d880-config-data\") pod \"nova-api-0\" (UID: \"f01f82f4-f27f-4da3-83f7-ac88ca54d880\") " pod="openstack/nova-api-0" Oct 03 14:27:36 crc kubenswrapper[4962]: I1003 14:27:36.486173 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f01f82f4-f27f-4da3-83f7-ac88ca54d880-logs\") pod \"nova-api-0\" (UID: \"f01f82f4-f27f-4da3-83f7-ac88ca54d880\") " pod="openstack/nova-api-0" Oct 03 14:27:36 crc kubenswrapper[4962]: I1003 14:27:36.486524 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9chd\" (UniqueName: \"kubernetes.io/projected/f01f82f4-f27f-4da3-83f7-ac88ca54d880-kube-api-access-q9chd\") pod \"nova-api-0\" (UID: \"f01f82f4-f27f-4da3-83f7-ac88ca54d880\") " 
pod="openstack/nova-api-0" Oct 03 14:27:36 crc kubenswrapper[4962]: I1003 14:27:36.486570 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f01f82f4-f27f-4da3-83f7-ac88ca54d880-config-data\") pod \"nova-api-0\" (UID: \"f01f82f4-f27f-4da3-83f7-ac88ca54d880\") " pod="openstack/nova-api-0" Oct 03 14:27:36 crc kubenswrapper[4962]: I1003 14:27:36.486680 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f01f82f4-f27f-4da3-83f7-ac88ca54d880-logs\") pod \"nova-api-0\" (UID: \"f01f82f4-f27f-4da3-83f7-ac88ca54d880\") " pod="openstack/nova-api-0" Oct 03 14:27:36 crc kubenswrapper[4962]: I1003 14:27:36.486763 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f01f82f4-f27f-4da3-83f7-ac88ca54d880-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f01f82f4-f27f-4da3-83f7-ac88ca54d880\") " pod="openstack/nova-api-0" Oct 03 14:27:36 crc kubenswrapper[4962]: I1003 14:27:36.492164 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f01f82f4-f27f-4da3-83f7-ac88ca54d880-config-data\") pod \"nova-api-0\" (UID: \"f01f82f4-f27f-4da3-83f7-ac88ca54d880\") " pod="openstack/nova-api-0" Oct 03 14:27:36 crc kubenswrapper[4962]: I1003 14:27:36.493199 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f01f82f4-f27f-4da3-83f7-ac88ca54d880-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f01f82f4-f27f-4da3-83f7-ac88ca54d880\") " pod="openstack/nova-api-0" Oct 03 14:27:36 crc kubenswrapper[4962]: I1003 14:27:36.510130 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9chd\" (UniqueName: \"kubernetes.io/projected/f01f82f4-f27f-4da3-83f7-ac88ca54d880-kube-api-access-q9chd\") pod \"nova-api-0\" (UID: \"f01f82f4-f27f-4da3-83f7-ac88ca54d880\") " pod="openstack/nova-api-0" Oct 03 14:27:36 crc kubenswrapper[4962]: I1003 14:27:36.570015 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 14:27:36 crc kubenswrapper[4962]: I1003 14:27:36.706290 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 14:27:36 crc kubenswrapper[4962]: W1003 14:27:36.737672 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04b5ed01_2476_408b_b9d8_ff1ef2ad5923.slice/crio-295b352c921d3fe5c2a53877ac136cced22eb277485cbc910392670a2975cdc9 WatchSource:0}: Error finding container 295b352c921d3fe5c2a53877ac136cced22eb277485cbc910392670a2975cdc9: Status 404 returned error can't find the container with id 295b352c921d3fe5c2a53877ac136cced22eb277485cbc910392670a2975cdc9 Oct 03 14:27:36 crc kubenswrapper[4962]: I1003 14:27:36.919075 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 14:27:36 crc kubenswrapper[4962]: I1003 14:27:36.944621 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"04b5ed01-2476-408b-b9d8-ff1ef2ad5923","Type":"ContainerStarted","Data":"295b352c921d3fe5c2a53877ac136cced22eb277485cbc910392670a2975cdc9"} Oct 03 14:27:37 crc kubenswrapper[4962]: W1003 14:27:37.116645 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf01f82f4_f27f_4da3_83f7_ac88ca54d880.slice/crio-b4dbb04ce8bcb292491db4ff9a8665903f8c37c05b245d78b6c6d5a839072985 WatchSource:0}: Error finding container b4dbb04ce8bcb292491db4ff9a8665903f8c37c05b245d78b6c6d5a839072985: Status 404 returned error can't find the container with id b4dbb04ce8bcb292491db4ff9a8665903f8c37c05b245d78b6c6d5a839072985 Oct 03 14:27:37 crc kubenswrapper[4962]: I1003 14:27:37.117404 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 14:27:37 crc kubenswrapper[4962]: I1003 14:27:37.955941 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"04b5ed01-2476-408b-b9d8-ff1ef2ad5923","Type":"ContainerStarted","Data":"80ae291d913375dbf692d74eacbfec5d2ee2fb4869fb8f13b79decf5019fde7e"} Oct 03 14:27:37 crc kubenswrapper[4962]: I1003 14:27:37.960253 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a0d11367-1fa5-472a-8046-6e1719195a2f","Type":"ContainerStarted","Data":"2a0a8a9f417b3e291db7d3ec635c7e770ebcb120600d244e09183e67a60b53ed"} Oct 03 14:27:37 crc kubenswrapper[4962]: I1003 14:27:37.960314 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a0d11367-1fa5-472a-8046-6e1719195a2f","Type":"ContainerStarted","Data":"b46a9e8932e8f60d3bbcf46554828a868eeaa55aadf37646ed5908c2fe339b8d"} Oct 03 14:27:37 crc kubenswrapper[4962]: I1003 14:27:37.960326 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a0d11367-1fa5-472a-8046-6e1719195a2f","Type":"ContainerStarted","Data":"61a9e2665b94e240620984b209272020cebeebe94403a767a46406a817c19322"} Oct 03 14:27:37 crc kubenswrapper[4962]: I1003 14:27:37.962338 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f01f82f4-f27f-4da3-83f7-ac88ca54d880","Type":"ContainerStarted","Data":"4b9d0922d2d307ae481566520c5ad7f9198f1b50f2a3e0ca42d2bd70b25ef450"} Oct 03 14:27:37 crc kubenswrapper[4962]: I1003 14:27:37.962368 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"f01f82f4-f27f-4da3-83f7-ac88ca54d880","Type":"ContainerStarted","Data":"65da81c5418996928420f6548ad1d29bc95105e66db97ee2295873d32175a4d0"} Oct 03 14:27:37 crc kubenswrapper[4962]: I1003 14:27:37.962378 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f01f82f4-f27f-4da3-83f7-ac88ca54d880","Type":"ContainerStarted","Data":"b4dbb04ce8bcb292491db4ff9a8665903f8c37c05b245d78b6c6d5a839072985"} Oct 03 14:27:37 crc kubenswrapper[4962]: I1003 14:27:37.980216 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.980200579 podStartE2EDuration="2.980200579s" podCreationTimestamp="2025-10-03 14:27:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:27:37.972013065 +0000 UTC m=+5866.375910900" watchObservedRunningTime="2025-10-03 14:27:37.980200579 +0000 UTC m=+5866.384098414" Oct 03 14:27:37 crc kubenswrapper[4962]: I1003 14:27:37.999907 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.9998885739999999 podStartE2EDuration="1.999888574s" podCreationTimestamp="2025-10-03 14:27:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:27:37.994545024 +0000 UTC m=+5866.398442869" watchObservedRunningTime="2025-10-03 14:27:37.999888574 +0000 UTC m=+5866.403786409" Oct 03 14:27:38 crc kubenswrapper[4962]: I1003 14:27:38.022279 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.022245228 podStartE2EDuration="3.022245228s" podCreationTimestamp="2025-10-03 14:27:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:27:38.018842919 +0000 UTC m=+5866.422740764" watchObservedRunningTime="2025-10-03 14:27:38.022245228 +0000 UTC m=+5866.426143063" Oct 03 14:27:41 crc kubenswrapper[4962]: I1003 14:27:41.281222 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 03 14:27:41 crc kubenswrapper[4962]: I1003 14:27:41.319268 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 14:27:41 crc kubenswrapper[4962]: I1003 14:27:41.319333 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 14:27:46 crc kubenswrapper[4962]: I1003 14:27:46.281126 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 03 14:27:46 crc kubenswrapper[4962]: I1003 14:27:46.307349 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 03 14:27:46 crc kubenswrapper[4962]: I1003 14:27:46.319256 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 03 14:27:46 crc kubenswrapper[4962]: I1003 14:27:46.319322 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 03 14:27:46 crc kubenswrapper[4962]: I1003 14:27:46.570901 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 14:27:46 crc kubenswrapper[4962]: I1003 14:27:46.571218 4962 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 14:27:47 crc kubenswrapper[4962]: I1003 14:27:47.074080 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 03 14:27:47 crc kubenswrapper[4962]: I1003 14:27:47.401859 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a0d11367-1fa5-472a-8046-6e1719195a2f" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.71:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 14:27:47 crc kubenswrapper[4962]: I1003 14:27:47.401865 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a0d11367-1fa5-472a-8046-6e1719195a2f" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.71:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 14:27:47 crc kubenswrapper[4962]: I1003 14:27:47.653833 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f01f82f4-f27f-4da3-83f7-ac88ca54d880" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.72:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 14:27:47 crc kubenswrapper[4962]: I1003 14:27:47.653906 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f01f82f4-f27f-4da3-83f7-ac88ca54d880" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.72:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 14:27:50 crc kubenswrapper[4962]: I1003 14:27:50.227039 4962 scope.go:117] "RemoveContainer" containerID="d7dae192729a16d2e0ed7a375c7ee4990d673d8d62f560c5335516db0b1730db" Oct 03 14:27:50 crc kubenswrapper[4962]: E1003 14:27:50.227723 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:27:56 crc kubenswrapper[4962]: I1003 14:27:56.322669 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 03 14:27:56 crc kubenswrapper[4962]: I1003 14:27:56.323816 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 03 14:27:56 crc kubenswrapper[4962]: I1003 14:27:56.329299 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 03 14:27:56 crc kubenswrapper[4962]: I1003 14:27:56.329393 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 03 14:27:56 crc kubenswrapper[4962]: I1003 14:27:56.575840 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 03 14:27:56 crc kubenswrapper[4962]: I1003 14:27:56.576615 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 03 14:27:56 crc kubenswrapper[4962]: I1003 14:27:56.581282 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 
03 14:27:56 crc kubenswrapper[4962]: I1003 14:27:56.587479 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 03 14:27:57 crc kubenswrapper[4962]: I1003 14:27:57.135854 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 03 14:27:57 crc kubenswrapper[4962]: I1003 14:27:57.139778 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 03 14:27:57 crc kubenswrapper[4962]: I1003 14:27:57.350604 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d5f667f5-54n26"] Oct 03 14:27:57 crc kubenswrapper[4962]: I1003 14:27:57.359136 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d5f667f5-54n26" Oct 03 14:27:57 crc kubenswrapper[4962]: I1003 14:27:57.369149 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d5f667f5-54n26"] Oct 03 14:27:57 crc kubenswrapper[4962]: I1003 14:27:57.530040 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79q7s\" (UniqueName: \"kubernetes.io/projected/146da477-8524-42f2-8a41-d104885147db-kube-api-access-79q7s\") pod \"dnsmasq-dns-7d5f667f5-54n26\" (UID: \"146da477-8524-42f2-8a41-d104885147db\") " pod="openstack/dnsmasq-dns-7d5f667f5-54n26" Oct 03 14:27:57 crc kubenswrapper[4962]: I1003 14:27:57.530107 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/146da477-8524-42f2-8a41-d104885147db-ovsdbserver-sb\") pod \"dnsmasq-dns-7d5f667f5-54n26\" (UID: \"146da477-8524-42f2-8a41-d104885147db\") " pod="openstack/dnsmasq-dns-7d5f667f5-54n26" Oct 03 14:27:57 crc kubenswrapper[4962]: I1003 14:27:57.530147 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/146da477-8524-42f2-8a41-d104885147db-dns-svc\") pod \"dnsmasq-dns-7d5f667f5-54n26\" (UID: \"146da477-8524-42f2-8a41-d104885147db\") " pod="openstack/dnsmasq-dns-7d5f667f5-54n26" Oct 03 14:27:57 crc kubenswrapper[4962]: I1003 14:27:57.530171 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/146da477-8524-42f2-8a41-d104885147db-ovsdbserver-nb\") pod \"dnsmasq-dns-7d5f667f5-54n26\" (UID: \"146da477-8524-42f2-8a41-d104885147db\") " pod="openstack/dnsmasq-dns-7d5f667f5-54n26" Oct 03 14:27:57 crc kubenswrapper[4962]: I1003 14:27:57.530197 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/146da477-8524-42f2-8a41-d104885147db-config\") pod \"dnsmasq-dns-7d5f667f5-54n26\" (UID: \"146da477-8524-42f2-8a41-d104885147db\") " pod="openstack/dnsmasq-dns-7d5f667f5-54n26" Oct 03 14:27:57 crc kubenswrapper[4962]: I1003 14:27:57.632197 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/146da477-8524-42f2-8a41-d104885147db-ovsdbserver-sb\") pod \"dnsmasq-dns-7d5f667f5-54n26\" (UID: \"146da477-8524-42f2-8a41-d104885147db\") " pod="openstack/dnsmasq-dns-7d5f667f5-54n26" Oct 03 14:27:57 crc kubenswrapper[4962]: I1003 14:27:57.632268 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/146da477-8524-42f2-8a41-d104885147db-dns-svc\") pod \"dnsmasq-dns-7d5f667f5-54n26\" (UID: \"146da477-8524-42f2-8a41-d104885147db\") " pod="openstack/dnsmasq-dns-7d5f667f5-54n26" Oct 03 14:27:57 crc kubenswrapper[4962]: I1003 14:27:57.632294 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/146da477-8524-42f2-8a41-d104885147db-ovsdbserver-nb\") pod \"dnsmasq-dns-7d5f667f5-54n26\" (UID: \"146da477-8524-42f2-8a41-d104885147db\") " pod="openstack/dnsmasq-dns-7d5f667f5-54n26" Oct 03 14:27:57 crc kubenswrapper[4962]: I1003 14:27:57.632326 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/146da477-8524-42f2-8a41-d104885147db-config\") pod \"dnsmasq-dns-7d5f667f5-54n26\" (UID: \"146da477-8524-42f2-8a41-d104885147db\") " pod="openstack/dnsmasq-dns-7d5f667f5-54n26" Oct 03 14:27:57 crc kubenswrapper[4962]: I1003 14:27:57.632418 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79q7s\" (UniqueName: \"kubernetes.io/projected/146da477-8524-42f2-8a41-d104885147db-kube-api-access-79q7s\") pod \"dnsmasq-dns-7d5f667f5-54n26\" (UID: \"146da477-8524-42f2-8a41-d104885147db\") " pod="openstack/dnsmasq-dns-7d5f667f5-54n26" Oct 03 14:27:57 crc kubenswrapper[4962]: I1003 14:27:57.633696 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/146da477-8524-42f2-8a41-d104885147db-dns-svc\") pod \"dnsmasq-dns-7d5f667f5-54n26\" (UID: \"146da477-8524-42f2-8a41-d104885147db\") " pod="openstack/dnsmasq-dns-7d5f667f5-54n26" Oct 03 14:27:57 crc kubenswrapper[4962]: I1003 14:27:57.633811 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/146da477-8524-42f2-8a41-d104885147db-ovsdbserver-nb\") pod \"dnsmasq-dns-7d5f667f5-54n26\" (UID: \"146da477-8524-42f2-8a41-d104885147db\") " pod="openstack/dnsmasq-dns-7d5f667f5-54n26" Oct 03 14:27:57 crc kubenswrapper[4962]: I1003 14:27:57.634018 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/146da477-8524-42f2-8a41-d104885147db-config\") pod \"dnsmasq-dns-7d5f667f5-54n26\" (UID: \"146da477-8524-42f2-8a41-d104885147db\") " pod="openstack/dnsmasq-dns-7d5f667f5-54n26" Oct 03 14:27:57 crc kubenswrapper[4962]: I1003 14:27:57.634397 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/146da477-8524-42f2-8a41-d104885147db-ovsdbserver-sb\") pod \"dnsmasq-dns-7d5f667f5-54n26\" (UID: \"146da477-8524-42f2-8a41-d104885147db\") " pod="openstack/dnsmasq-dns-7d5f667f5-54n26" Oct 03 14:27:57 crc kubenswrapper[4962]: I1003 14:27:57.655157 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79q7s\" (UniqueName: \"kubernetes.io/projected/146da477-8524-42f2-8a41-d104885147db-kube-api-access-79q7s\") pod \"dnsmasq-dns-7d5f667f5-54n26\" (UID: \"146da477-8524-42f2-8a41-d104885147db\") " pod="openstack/dnsmasq-dns-7d5f667f5-54n26" Oct 03 14:27:57 crc kubenswrapper[4962]: I1003 14:27:57.698357 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d5f667f5-54n26" Oct 03 14:27:58 crc kubenswrapper[4962]: I1003 14:27:58.242585 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d5f667f5-54n26"] Oct 03 14:27:59 crc kubenswrapper[4962]: I1003 14:27:59.155972 4962 generic.go:334] "Generic (PLEG): container finished" podID="146da477-8524-42f2-8a41-d104885147db" containerID="b2a120ce016b4e1ff1c72b31cb479e33eecde2f14859b2b5317c090a4e6be976" exitCode=0 Oct 03 14:27:59 crc kubenswrapper[4962]: I1003 14:27:59.156042 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d5f667f5-54n26" event={"ID":"146da477-8524-42f2-8a41-d104885147db","Type":"ContainerDied","Data":"b2a120ce016b4e1ff1c72b31cb479e33eecde2f14859b2b5317c090a4e6be976"} Oct 03 14:27:59 crc kubenswrapper[4962]: I1003 14:27:59.156585 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d5f667f5-54n26" event={"ID":"146da477-8524-42f2-8a41-d104885147db","Type":"ContainerStarted","Data":"c84edafe295c18357c04a10f387b10d68dfb3a46d63969b1c1cf5edc8b0a9b93"} Oct 03 14:28:00 crc kubenswrapper[4962]: I1003 14:28:00.166627 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d5f667f5-54n26" event={"ID":"146da477-8524-42f2-8a41-d104885147db","Type":"ContainerStarted","Data":"41f5a8c1e42028f6ccf01c15c51f81c2abaad0ace3913fef4c86fb929c802224"} Oct 03 14:28:00 crc kubenswrapper[4962]: I1003 14:28:00.167258 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d5f667f5-54n26" Oct 03 14:28:00 crc kubenswrapper[4962]: I1003 14:28:00.186207 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d5f667f5-54n26" podStartSLOduration=3.186188755 podStartE2EDuration="3.186188755s" podCreationTimestamp="2025-10-03 14:27:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:28:00.181168314 +0000 UTC m=+5888.585066179" watchObservedRunningTime="2025-10-03 14:28:00.186188755 +0000 UTC m=+5888.590086590" Oct 03 14:28:05 crc kubenswrapper[4962]: I1003 14:28:05.227126 4962 scope.go:117] "RemoveContainer" containerID="d7dae192729a16d2e0ed7a375c7ee4990d673d8d62f560c5335516db0b1730db" Oct 03 14:28:05 crc kubenswrapper[4962]: E1003 14:28:05.227973 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:28:07 crc kubenswrapper[4962]: I1003 14:28:07.700822 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d5f667f5-54n26" Oct 03 14:28:07 crc kubenswrapper[4962]: I1003 14:28:07.766954 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fc6d855cf-g8lbz"] Oct 03 14:28:07 crc kubenswrapper[4962]: I1003 14:28:07.767182 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-fc6d855cf-g8lbz" podUID="3077b379-ec45-4f83-9918-da7de38b871c" containerName="dnsmasq-dns" containerID="cri-o://c776e40f624375deb15523775e764ffd7088a4fff687e80a560aa19c5722fbd4" gracePeriod=10 Oct 03 
14:28:08 crc kubenswrapper[4962]: I1003 14:28:08.256790 4962 generic.go:334] "Generic (PLEG): container finished" podID="3077b379-ec45-4f83-9918-da7de38b871c" containerID="c776e40f624375deb15523775e764ffd7088a4fff687e80a560aa19c5722fbd4" exitCode=0 Oct 03 14:28:08 crc kubenswrapper[4962]: I1003 14:28:08.256842 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fc6d855cf-g8lbz" event={"ID":"3077b379-ec45-4f83-9918-da7de38b871c","Type":"ContainerDied","Data":"c776e40f624375deb15523775e764ffd7088a4fff687e80a560aa19c5722fbd4"} Oct 03 14:28:08 crc kubenswrapper[4962]: I1003 14:28:08.257140 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fc6d855cf-g8lbz" event={"ID":"3077b379-ec45-4f83-9918-da7de38b871c","Type":"ContainerDied","Data":"753bc90a72db96eb7a32068ba80b3d271355b16683286ba45debc165173d320b"} Oct 03 14:28:08 crc kubenswrapper[4962]: I1003 14:28:08.257155 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="753bc90a72db96eb7a32068ba80b3d271355b16683286ba45debc165173d320b" Oct 03 14:28:08 crc kubenswrapper[4962]: I1003 14:28:08.315578 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fc6d855cf-g8lbz" Oct 03 14:28:08 crc kubenswrapper[4962]: I1003 14:28:08.441918 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3077b379-ec45-4f83-9918-da7de38b871c-ovsdbserver-sb\") pod \"3077b379-ec45-4f83-9918-da7de38b871c\" (UID: \"3077b379-ec45-4f83-9918-da7de38b871c\") " Oct 03 14:28:08 crc kubenswrapper[4962]: I1003 14:28:08.442023 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3077b379-ec45-4f83-9918-da7de38b871c-config\") pod \"3077b379-ec45-4f83-9918-da7de38b871c\" (UID: \"3077b379-ec45-4f83-9918-da7de38b871c\") " Oct 03 14:28:08 crc kubenswrapper[4962]: I1003 14:28:08.442236 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2tzc\" (UniqueName: \"kubernetes.io/projected/3077b379-ec45-4f83-9918-da7de38b871c-kube-api-access-c2tzc\") pod \"3077b379-ec45-4f83-9918-da7de38b871c\" (UID: \"3077b379-ec45-4f83-9918-da7de38b871c\") " Oct 03 14:28:08 crc kubenswrapper[4962]: I1003 14:28:08.442361 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3077b379-ec45-4f83-9918-da7de38b871c-dns-svc\") pod \"3077b379-ec45-4f83-9918-da7de38b871c\" (UID: \"3077b379-ec45-4f83-9918-da7de38b871c\") " Oct 03 14:28:08 crc kubenswrapper[4962]: I1003 14:28:08.442434 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3077b379-ec45-4f83-9918-da7de38b871c-ovsdbserver-nb\") pod \"3077b379-ec45-4f83-9918-da7de38b871c\" (UID: \"3077b379-ec45-4f83-9918-da7de38b871c\") " Oct 03 14:28:08 crc kubenswrapper[4962]: I1003 14:28:08.450842 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3077b379-ec45-4f83-9918-da7de38b871c-kube-api-access-c2tzc" (OuterVolumeSpecName: "kube-api-access-c2tzc") pod "3077b379-ec45-4f83-9918-da7de38b871c" (UID: "3077b379-ec45-4f83-9918-da7de38b871c"). InnerVolumeSpecName "kube-api-access-c2tzc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:28:08 crc kubenswrapper[4962]: I1003 14:28:08.494280 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3077b379-ec45-4f83-9918-da7de38b871c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3077b379-ec45-4f83-9918-da7de38b871c" (UID: "3077b379-ec45-4f83-9918-da7de38b871c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:28:08 crc kubenswrapper[4962]: I1003 14:28:08.502126 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3077b379-ec45-4f83-9918-da7de38b871c-config" (OuterVolumeSpecName: "config") pod "3077b379-ec45-4f83-9918-da7de38b871c" (UID: "3077b379-ec45-4f83-9918-da7de38b871c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:28:08 crc kubenswrapper[4962]: I1003 14:28:08.504810 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3077b379-ec45-4f83-9918-da7de38b871c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3077b379-ec45-4f83-9918-da7de38b871c" (UID: "3077b379-ec45-4f83-9918-da7de38b871c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:28:08 crc kubenswrapper[4962]: I1003 14:28:08.516705 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3077b379-ec45-4f83-9918-da7de38b871c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3077b379-ec45-4f83-9918-da7de38b871c" (UID: "3077b379-ec45-4f83-9918-da7de38b871c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:28:08 crc kubenswrapper[4962]: I1003 14:28:08.544854 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3077b379-ec45-4f83-9918-da7de38b871c-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 14:28:08 crc kubenswrapper[4962]: I1003 14:28:08.544894 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3077b379-ec45-4f83-9918-da7de38b871c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 14:28:08 crc kubenswrapper[4962]: I1003 14:28:08.544909 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3077b379-ec45-4f83-9918-da7de38b871c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 14:28:08 crc kubenswrapper[4962]: I1003 14:28:08.544921 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3077b379-ec45-4f83-9918-da7de38b871c-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:28:08 crc kubenswrapper[4962]: I1003 14:28:08.544934 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2tzc\" (UniqueName: \"kubernetes.io/projected/3077b379-ec45-4f83-9918-da7de38b871c-kube-api-access-c2tzc\") on node \"crc\" DevicePath \"\"" Oct 03 14:28:09 crc kubenswrapper[4962]: I1003 14:28:09.265660 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fc6d855cf-g8lbz" Oct 03 14:28:09 crc kubenswrapper[4962]: I1003 14:28:09.304483 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fc6d855cf-g8lbz"] Oct 03 14:28:09 crc kubenswrapper[4962]: I1003 14:28:09.311567 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-fc6d855cf-g8lbz"] Oct 03 14:28:10 crc kubenswrapper[4962]: I1003 14:28:10.237878 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3077b379-ec45-4f83-9918-da7de38b871c" path="/var/lib/kubelet/pods/3077b379-ec45-4f83-9918-da7de38b871c/volumes" Oct 03 14:28:11 crc kubenswrapper[4962]: I1003 14:28:11.077987 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-6q9fz"] Oct 03 14:28:11 crc kubenswrapper[4962]: E1003 14:28:11.078678 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3077b379-ec45-4f83-9918-da7de38b871c" containerName="dnsmasq-dns" Oct 03 14:28:11 crc kubenswrapper[4962]: I1003 14:28:11.078692 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="3077b379-ec45-4f83-9918-da7de38b871c" containerName="dnsmasq-dns" Oct 03 14:28:11 crc kubenswrapper[4962]: E1003 14:28:11.078737 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3077b379-ec45-4f83-9918-da7de38b871c" containerName="init" Oct 03 14:28:11 crc kubenswrapper[4962]: I1003 14:28:11.078744 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="3077b379-ec45-4f83-9918-da7de38b871c" containerName="init" Oct 03 14:28:11 crc kubenswrapper[4962]: I1003 14:28:11.078905 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="3077b379-ec45-4f83-9918-da7de38b871c" containerName="dnsmasq-dns" Oct 03 14:28:11 crc kubenswrapper[4962]: I1003 14:28:11.079502 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-6q9fz" Oct 03 14:28:11 crc kubenswrapper[4962]: I1003 14:28:11.095265 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-6q9fz"] Oct 03 14:28:11 crc kubenswrapper[4962]: I1003 14:28:11.102541 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m62jr\" (UniqueName: \"kubernetes.io/projected/98032380-e6f6-4472-b4e5-f21afd8f78d6-kube-api-access-m62jr\") pod \"cinder-db-create-6q9fz\" (UID: \"98032380-e6f6-4472-b4e5-f21afd8f78d6\") " pod="openstack/cinder-db-create-6q9fz" Oct 03 14:28:11 crc kubenswrapper[4962]: I1003 14:28:11.204667 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m62jr\" (UniqueName: \"kubernetes.io/projected/98032380-e6f6-4472-b4e5-f21afd8f78d6-kube-api-access-m62jr\") pod \"cinder-db-create-6q9fz\" (UID: \"98032380-e6f6-4472-b4e5-f21afd8f78d6\") " pod="openstack/cinder-db-create-6q9fz" Oct 03 14:28:11 crc kubenswrapper[4962]: I1003 14:28:11.226696 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m62jr\" (UniqueName: \"kubernetes.io/projected/98032380-e6f6-4472-b4e5-f21afd8f78d6-kube-api-access-m62jr\") pod \"cinder-db-create-6q9fz\" (UID: \"98032380-e6f6-4472-b4e5-f21afd8f78d6\") " pod="openstack/cinder-db-create-6q9fz" Oct 03 14:28:11 crc kubenswrapper[4962]: I1003 14:28:11.397170 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-6q9fz" Oct 03 14:28:11 crc kubenswrapper[4962]: I1003 14:28:11.876752 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-6q9fz"] Oct 03 14:28:12 crc kubenswrapper[4962]: I1003 14:28:12.290608 4962 generic.go:334] "Generic (PLEG): container finished" podID="98032380-e6f6-4472-b4e5-f21afd8f78d6" containerID="e09dd6270db59d7ac784dd1277c11c7cfecc7d85e9ad03ce28c2d6e5fdfc80c2" exitCode=0 Oct 03 14:28:12 crc kubenswrapper[4962]: I1003 14:28:12.291003 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6q9fz" event={"ID":"98032380-e6f6-4472-b4e5-f21afd8f78d6","Type":"ContainerDied","Data":"e09dd6270db59d7ac784dd1277c11c7cfecc7d85e9ad03ce28c2d6e5fdfc80c2"} Oct 03 14:28:12 crc kubenswrapper[4962]: I1003 14:28:12.291078 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6q9fz" event={"ID":"98032380-e6f6-4472-b4e5-f21afd8f78d6","Type":"ContainerStarted","Data":"35daf27115e43a8c303543971b4d3367d4c2339904d20c295a3ee3575a2b2a23"} Oct 03 14:28:13 crc kubenswrapper[4962]: I1003 14:28:13.599928 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-6q9fz" Oct 03 14:28:13 crc kubenswrapper[4962]: I1003 14:28:13.756262 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m62jr\" (UniqueName: \"kubernetes.io/projected/98032380-e6f6-4472-b4e5-f21afd8f78d6-kube-api-access-m62jr\") pod \"98032380-e6f6-4472-b4e5-f21afd8f78d6\" (UID: \"98032380-e6f6-4472-b4e5-f21afd8f78d6\") " Oct 03 14:28:13 crc kubenswrapper[4962]: I1003 14:28:13.762088 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98032380-e6f6-4472-b4e5-f21afd8f78d6-kube-api-access-m62jr" (OuterVolumeSpecName: "kube-api-access-m62jr") pod "98032380-e6f6-4472-b4e5-f21afd8f78d6" (UID: "98032380-e6f6-4472-b4e5-f21afd8f78d6"). InnerVolumeSpecName "kube-api-access-m62jr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:28:13 crc kubenswrapper[4962]: I1003 14:28:13.858428 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m62jr\" (UniqueName: \"kubernetes.io/projected/98032380-e6f6-4472-b4e5-f21afd8f78d6-kube-api-access-m62jr\") on node \"crc\" DevicePath \"\"" Oct 03 14:28:14 crc kubenswrapper[4962]: I1003 14:28:14.316074 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6q9fz" event={"ID":"98032380-e6f6-4472-b4e5-f21afd8f78d6","Type":"ContainerDied","Data":"35daf27115e43a8c303543971b4d3367d4c2339904d20c295a3ee3575a2b2a23"} Oct 03 14:28:14 crc kubenswrapper[4962]: I1003 14:28:14.316149 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-6q9fz" Oct 03 14:28:14 crc kubenswrapper[4962]: I1003 14:28:14.316147 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35daf27115e43a8c303543971b4d3367d4c2339904d20c295a3ee3575a2b2a23" Oct 03 14:28:17 crc kubenswrapper[4962]: I1003 14:28:17.226879 4962 scope.go:117] "RemoveContainer" containerID="d7dae192729a16d2e0ed7a375c7ee4990d673d8d62f560c5335516db0b1730db" Oct 03 14:28:17 crc kubenswrapper[4962]: E1003 14:28:17.227541 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:28:21 crc kubenswrapper[4962]: I1003 14:28:21.170841 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-5c1b-account-create-ntkjx"] Oct 03 14:28:21 crc kubenswrapper[4962]: E1003 14:28:21.171729 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98032380-e6f6-4472-b4e5-f21afd8f78d6" containerName="mariadb-database-create" Oct 03 14:28:21 crc kubenswrapper[4962]: I1003 14:28:21.171748 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="98032380-e6f6-4472-b4e5-f21afd8f78d6" containerName="mariadb-database-create" Oct 03 14:28:21 crc kubenswrapper[4962]: I1003 14:28:21.171986 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="98032380-e6f6-4472-b4e5-f21afd8f78d6" containerName="mariadb-database-create" Oct 03 14:28:21 crc kubenswrapper[4962]: I1003 14:28:21.173939 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-5c1b-account-create-ntkjx" Oct 03 14:28:21 crc kubenswrapper[4962]: I1003 14:28:21.187321 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-5c1b-account-create-ntkjx"] Oct 03 14:28:21 crc kubenswrapper[4962]: I1003 14:28:21.191061 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 03 14:28:21 crc kubenswrapper[4962]: I1003 14:28:21.287460 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r7ps\" (UniqueName: \"kubernetes.io/projected/c852e8e4-8bb5-4f0a-bad3-09d0fb0bf836-kube-api-access-5r7ps\") pod \"cinder-5c1b-account-create-ntkjx\" (UID: \"c852e8e4-8bb5-4f0a-bad3-09d0fb0bf836\") " pod="openstack/cinder-5c1b-account-create-ntkjx" Oct 03 14:28:21 crc kubenswrapper[4962]: I1003 14:28:21.389161 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5r7ps\" (UniqueName: \"kubernetes.io/projected/c852e8e4-8bb5-4f0a-bad3-09d0fb0bf836-kube-api-access-5r7ps\") pod \"cinder-5c1b-account-create-ntkjx\" (UID: \"c852e8e4-8bb5-4f0a-bad3-09d0fb0bf836\") " pod="openstack/cinder-5c1b-account-create-ntkjx" Oct 03 14:28:21 crc kubenswrapper[4962]: I1003 14:28:21.409081 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5r7ps\" (UniqueName: \"kubernetes.io/projected/c852e8e4-8bb5-4f0a-bad3-09d0fb0bf836-kube-api-access-5r7ps\") pod \"cinder-5c1b-account-create-ntkjx\" (UID: \"c852e8e4-8bb5-4f0a-bad3-09d0fb0bf836\") " pod="openstack/cinder-5c1b-account-create-ntkjx" Oct 03 14:28:21 crc kubenswrapper[4962]: I1003 14:28:21.506581 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-5c1b-account-create-ntkjx" Oct 03 14:28:21 crc kubenswrapper[4962]: I1003 14:28:21.959434 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-5c1b-account-create-ntkjx"] Oct 03 14:28:22 crc kubenswrapper[4962]: I1003 14:28:22.396487 4962 generic.go:334] "Generic (PLEG): container finished" podID="c852e8e4-8bb5-4f0a-bad3-09d0fb0bf836" containerID="4bdecd72f95dab8313827b86da82e6252270febdfce63429a42e19046f358360" exitCode=0 Oct 03 14:28:22 crc kubenswrapper[4962]: I1003 14:28:22.396676 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-5c1b-account-create-ntkjx" event={"ID":"c852e8e4-8bb5-4f0a-bad3-09d0fb0bf836","Type":"ContainerDied","Data":"4bdecd72f95dab8313827b86da82e6252270febdfce63429a42e19046f358360"} Oct 03 14:28:22 crc kubenswrapper[4962]: I1003 14:28:22.396837 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-5c1b-account-create-ntkjx" event={"ID":"c852e8e4-8bb5-4f0a-bad3-09d0fb0bf836","Type":"ContainerStarted","Data":"7ccc4f2cc3acf931d8967c010dde070aa6e3fea38ac17e59e2566d1aff463a08"} Oct 03 14:28:23 crc kubenswrapper[4962]: I1003 14:28:23.729560 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-5c1b-account-create-ntkjx" Oct 03 14:28:23 crc kubenswrapper[4962]: I1003 14:28:23.832996 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5r7ps\" (UniqueName: \"kubernetes.io/projected/c852e8e4-8bb5-4f0a-bad3-09d0fb0bf836-kube-api-access-5r7ps\") pod \"c852e8e4-8bb5-4f0a-bad3-09d0fb0bf836\" (UID: \"c852e8e4-8bb5-4f0a-bad3-09d0fb0bf836\") " Oct 03 14:28:23 crc kubenswrapper[4962]: I1003 14:28:23.838506 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c852e8e4-8bb5-4f0a-bad3-09d0fb0bf836-kube-api-access-5r7ps" (OuterVolumeSpecName: "kube-api-access-5r7ps") pod "c852e8e4-8bb5-4f0a-bad3-09d0fb0bf836" (UID: "c852e8e4-8bb5-4f0a-bad3-09d0fb0bf836"). InnerVolumeSpecName "kube-api-access-5r7ps". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:28:23 crc kubenswrapper[4962]: I1003 14:28:23.935490 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5r7ps\" (UniqueName: \"kubernetes.io/projected/c852e8e4-8bb5-4f0a-bad3-09d0fb0bf836-kube-api-access-5r7ps\") on node \"crc\" DevicePath \"\"" Oct 03 14:28:24 crc kubenswrapper[4962]: I1003 14:28:24.414374 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-5c1b-account-create-ntkjx" event={"ID":"c852e8e4-8bb5-4f0a-bad3-09d0fb0bf836","Type":"ContainerDied","Data":"7ccc4f2cc3acf931d8967c010dde070aa6e3fea38ac17e59e2566d1aff463a08"} Oct 03 14:28:24 crc kubenswrapper[4962]: I1003 14:28:24.414415 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ccc4f2cc3acf931d8967c010dde070aa6e3fea38ac17e59e2566d1aff463a08" Oct 03 14:28:24 crc kubenswrapper[4962]: I1003 14:28:24.414436 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-5c1b-account-create-ntkjx" Oct 03 14:28:26 crc kubenswrapper[4962]: I1003 14:28:26.300442 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-mm8m8"] Oct 03 14:28:26 crc kubenswrapper[4962]: E1003 14:28:26.301151 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c852e8e4-8bb5-4f0a-bad3-09d0fb0bf836" containerName="mariadb-account-create" Oct 03 14:28:26 crc kubenswrapper[4962]: I1003 14:28:26.301166 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="c852e8e4-8bb5-4f0a-bad3-09d0fb0bf836" containerName="mariadb-account-create" Oct 03 14:28:26 crc kubenswrapper[4962]: I1003 14:28:26.302074 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="c852e8e4-8bb5-4f0a-bad3-09d0fb0bf836" containerName="mariadb-account-create" Oct 03 14:28:26 crc kubenswrapper[4962]: I1003 14:28:26.302757 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-mm8m8" Oct 03 14:28:26 crc kubenswrapper[4962]: I1003 14:28:26.306735 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 03 14:28:26 crc kubenswrapper[4962]: I1003 14:28:26.306778 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-hqtcq" Oct 03 14:28:26 crc kubenswrapper[4962]: I1003 14:28:26.306735 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 03 14:28:26 crc kubenswrapper[4962]: I1003 14:28:26.321691 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-mm8m8"] Oct 03 14:28:26 crc kubenswrapper[4962]: I1003 14:28:26.497456 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54553a51-c0a4-445e-9415-e3e7373c56a7-scripts\") pod \"cinder-db-sync-mm8m8\" (UID: \"54553a51-c0a4-445e-9415-e3e7373c56a7\") " pod="openstack/cinder-db-sync-mm8m8" Oct 03 14:28:26 crc kubenswrapper[4962]: I1003 14:28:26.498058 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54553a51-c0a4-445e-9415-e3e7373c56a7-config-data\") pod \"cinder-db-sync-mm8m8\" (UID: \"54553a51-c0a4-445e-9415-e3e7373c56a7\") " pod="openstack/cinder-db-sync-mm8m8" Oct 03 14:28:26 crc kubenswrapper[4962]: I1003 14:28:26.498095 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54553a51-c0a4-445e-9415-e3e7373c56a7-combined-ca-bundle\") pod \"cinder-db-sync-mm8m8\" (UID: \"54553a51-c0a4-445e-9415-e3e7373c56a7\") " pod="openstack/cinder-db-sync-mm8m8" Oct 03 14:28:26 crc kubenswrapper[4962]: I1003 14:28:26.498115 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/54553a51-c0a4-445e-9415-e3e7373c56a7-db-sync-config-data\") pod \"cinder-db-sync-mm8m8\" (UID: \"54553a51-c0a4-445e-9415-e3e7373c56a7\") " pod="openstack/cinder-db-sync-mm8m8" Oct 03 14:28:26 crc kubenswrapper[4962]: I1003 14:28:26.498135 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/54553a51-c0a4-445e-9415-e3e7373c56a7-etc-machine-id\") pod \"cinder-db-sync-mm8m8\" (UID: \"54553a51-c0a4-445e-9415-e3e7373c56a7\") " pod="openstack/cinder-db-sync-mm8m8" Oct 03 14:28:26 crc kubenswrapper[4962]: I1003 14:28:26.498162 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4fq2\" (UniqueName: \"kubernetes.io/projected/54553a51-c0a4-445e-9415-e3e7373c56a7-kube-api-access-q4fq2\") pod \"cinder-db-sync-mm8m8\" (UID: \"54553a51-c0a4-445e-9415-e3e7373c56a7\") " pod="openstack/cinder-db-sync-mm8m8" Oct 03 14:28:26 crc kubenswrapper[4962]: I1003 14:28:26.599851 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54553a51-c0a4-445e-9415-e3e7373c56a7-scripts\") pod \"cinder-db-sync-mm8m8\" (UID: \"54553a51-c0a4-445e-9415-e3e7373c56a7\") " pod="openstack/cinder-db-sync-mm8m8" Oct 03 14:28:26 crc kubenswrapper[4962]: I1003 14:28:26.599971 4962 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54553a51-c0a4-445e-9415-e3e7373c56a7-config-data\") pod \"cinder-db-sync-mm8m8\" (UID: \"54553a51-c0a4-445e-9415-e3e7373c56a7\") " pod="openstack/cinder-db-sync-mm8m8" Oct 03 14:28:26 crc kubenswrapper[4962]: I1003 14:28:26.600011 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54553a51-c0a4-445e-9415-e3e7373c56a7-combined-ca-bundle\") pod \"cinder-db-sync-mm8m8\" (UID: \"54553a51-c0a4-445e-9415-e3e7373c56a7\") " pod="openstack/cinder-db-sync-mm8m8" Oct 03 14:28:26 crc kubenswrapper[4962]: I1003 14:28:26.600038 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/54553a51-c0a4-445e-9415-e3e7373c56a7-db-sync-config-data\") pod \"cinder-db-sync-mm8m8\" (UID: \"54553a51-c0a4-445e-9415-e3e7373c56a7\") " pod="openstack/cinder-db-sync-mm8m8" Oct 03 14:28:26 crc kubenswrapper[4962]: I1003 14:28:26.600063 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/54553a51-c0a4-445e-9415-e3e7373c56a7-etc-machine-id\") pod \"cinder-db-sync-mm8m8\" (UID: \"54553a51-c0a4-445e-9415-e3e7373c56a7\") " pod="openstack/cinder-db-sync-mm8m8" Oct 03 14:28:26 crc kubenswrapper[4962]: I1003 14:28:26.600098 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4fq2\" (UniqueName: \"kubernetes.io/projected/54553a51-c0a4-445e-9415-e3e7373c56a7-kube-api-access-q4fq2\") pod \"cinder-db-sync-mm8m8\" (UID: \"54553a51-c0a4-445e-9415-e3e7373c56a7\") " pod="openstack/cinder-db-sync-mm8m8" Oct 03 14:28:26 crc kubenswrapper[4962]: I1003 14:28:26.600462 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/54553a51-c0a4-445e-9415-e3e7373c56a7-etc-machine-id\") pod \"cinder-db-sync-mm8m8\" (UID: \"54553a51-c0a4-445e-9415-e3e7373c56a7\") " pod="openstack/cinder-db-sync-mm8m8" Oct 03 14:28:26 crc kubenswrapper[4962]: I1003 14:28:26.608777 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/54553a51-c0a4-445e-9415-e3e7373c56a7-db-sync-config-data\") pod \"cinder-db-sync-mm8m8\" (UID: \"54553a51-c0a4-445e-9415-e3e7373c56a7\") " pod="openstack/cinder-db-sync-mm8m8" Oct 03 14:28:26 crc kubenswrapper[4962]: I1003 14:28:26.608822 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54553a51-c0a4-445e-9415-e3e7373c56a7-combined-ca-bundle\") pod \"cinder-db-sync-mm8m8\" (UID: \"54553a51-c0a4-445e-9415-e3e7373c56a7\") " pod="openstack/cinder-db-sync-mm8m8" Oct 03 14:28:26 crc kubenswrapper[4962]: I1003 14:28:26.613314 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54553a51-c0a4-445e-9415-e3e7373c56a7-config-data\") pod \"cinder-db-sync-mm8m8\" (UID: \"54553a51-c0a4-445e-9415-e3e7373c56a7\") " pod="openstack/cinder-db-sync-mm8m8" Oct 03 14:28:26 crc kubenswrapper[4962]: I1003 14:28:26.614069 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54553a51-c0a4-445e-9415-e3e7373c56a7-scripts\") pod \"cinder-db-sync-mm8m8\" (UID: \"54553a51-c0a4-445e-9415-e3e7373c56a7\") " 
pod="openstack/cinder-db-sync-mm8m8" Oct 03 14:28:26 crc kubenswrapper[4962]: I1003 14:28:26.617253 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4fq2\" (UniqueName: \"kubernetes.io/projected/54553a51-c0a4-445e-9415-e3e7373c56a7-kube-api-access-q4fq2\") pod \"cinder-db-sync-mm8m8\" (UID: \"54553a51-c0a4-445e-9415-e3e7373c56a7\") " pod="openstack/cinder-db-sync-mm8m8" Oct 03 14:28:26 crc kubenswrapper[4962]: I1003 14:28:26.626422 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-mm8m8" Oct 03 14:28:27 crc kubenswrapper[4962]: I1003 14:28:27.082192 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-mm8m8"] Oct 03 14:28:27 crc kubenswrapper[4962]: I1003 14:28:27.443428 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-mm8m8" event={"ID":"54553a51-c0a4-445e-9415-e3e7373c56a7","Type":"ContainerStarted","Data":"b102e513bc300ea8ac9535f1d111bccc6b36f6691cb35f5e37127d046615487e"} Oct 03 14:28:28 crc kubenswrapper[4962]: I1003 14:28:28.453947 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-mm8m8" event={"ID":"54553a51-c0a4-445e-9415-e3e7373c56a7","Type":"ContainerStarted","Data":"9067176d1c9eca3683cb7bdced5aa755b678a6cb65168410343eec3901ce1df9"} Oct 03 14:28:28 crc kubenswrapper[4962]: I1003 14:28:28.472279 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-mm8m8" podStartSLOduration=2.47225574 podStartE2EDuration="2.47225574s" podCreationTimestamp="2025-10-03 14:28:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:28:28.469036086 +0000 UTC m=+5916.872933921" watchObservedRunningTime="2025-10-03 14:28:28.47225574 +0000 UTC m=+5916.876153565" Oct 03 14:28:30 crc kubenswrapper[4962]: I1003 14:28:30.227838 4962 scope.go:117] "RemoveContainer" containerID="d7dae192729a16d2e0ed7a375c7ee4990d673d8d62f560c5335516db0b1730db" Oct 03 14:28:30 crc kubenswrapper[4962]: E1003 14:28:30.228818 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:28:30 crc kubenswrapper[4962]: I1003 14:28:30.475578 4962 generic.go:334] "Generic (PLEG): container finished" podID="54553a51-c0a4-445e-9415-e3e7373c56a7" containerID="9067176d1c9eca3683cb7bdced5aa755b678a6cb65168410343eec3901ce1df9" exitCode=0 Oct 03 14:28:30 crc kubenswrapper[4962]: I1003 14:28:30.475622 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-mm8m8" event={"ID":"54553a51-c0a4-445e-9415-e3e7373c56a7","Type":"ContainerDied","Data":"9067176d1c9eca3683cb7bdced5aa755b678a6cb65168410343eec3901ce1df9"} Oct 03 14:28:31 crc kubenswrapper[4962]: I1003 14:28:31.849106 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-mm8m8" Oct 03 14:28:32 crc kubenswrapper[4962]: I1003 14:28:31.999663 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54553a51-c0a4-445e-9415-e3e7373c56a7-config-data\") pod \"54553a51-c0a4-445e-9415-e3e7373c56a7\" (UID: \"54553a51-c0a4-445e-9415-e3e7373c56a7\") " Oct 03 14:28:32 crc kubenswrapper[4962]: I1003 14:28:32.000379 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/54553a51-c0a4-445e-9415-e3e7373c56a7-etc-machine-id\") pod \"54553a51-c0a4-445e-9415-e3e7373c56a7\" (UID: \"54553a51-c0a4-445e-9415-e3e7373c56a7\") " Oct 03 14:28:32 crc kubenswrapper[4962]: I1003 14:28:32.000411 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/54553a51-c0a4-445e-9415-e3e7373c56a7-db-sync-config-data\") pod \"54553a51-c0a4-445e-9415-e3e7373c56a7\" (UID: \"54553a51-c0a4-445e-9415-e3e7373c56a7\") " Oct 03 14:28:32 crc kubenswrapper[4962]: I1003 14:28:32.000484 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4fq2\" (UniqueName: \"kubernetes.io/projected/54553a51-c0a4-445e-9415-e3e7373c56a7-kube-api-access-q4fq2\") pod \"54553a51-c0a4-445e-9415-e3e7373c56a7\" (UID: \"54553a51-c0a4-445e-9415-e3e7373c56a7\") " Oct 03 14:28:32 crc kubenswrapper[4962]: I1003 14:28:32.000521 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54553a51-c0a4-445e-9415-e3e7373c56a7-combined-ca-bundle\") pod \"54553a51-c0a4-445e-9415-e3e7373c56a7\" (UID: \"54553a51-c0a4-445e-9415-e3e7373c56a7\") " Oct 03 14:28:32 crc kubenswrapper[4962]: I1003 14:28:32.000533 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/54553a51-c0a4-445e-9415-e3e7373c56a7-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "54553a51-c0a4-445e-9415-e3e7373c56a7" (UID: "54553a51-c0a4-445e-9415-e3e7373c56a7"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 14:28:32 crc kubenswrapper[4962]: I1003 14:28:32.000585 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54553a51-c0a4-445e-9415-e3e7373c56a7-scripts\") pod \"54553a51-c0a4-445e-9415-e3e7373c56a7\" (UID: \"54553a51-c0a4-445e-9415-e3e7373c56a7\") " Oct 03 14:28:32 crc kubenswrapper[4962]: I1003 14:28:32.001173 4962 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/54553a51-c0a4-445e-9415-e3e7373c56a7-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 03 14:28:32 crc kubenswrapper[4962]: I1003 14:28:32.028788 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54553a51-c0a4-445e-9415-e3e7373c56a7-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "54553a51-c0a4-445e-9415-e3e7373c56a7" (UID: "54553a51-c0a4-445e-9415-e3e7373c56a7"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:28:32 crc kubenswrapper[4962]: I1003 14:28:32.028967 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54553a51-c0a4-445e-9415-e3e7373c56a7-scripts" (OuterVolumeSpecName: "scripts") pod "54553a51-c0a4-445e-9415-e3e7373c56a7" (UID: "54553a51-c0a4-445e-9415-e3e7373c56a7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:28:32 crc kubenswrapper[4962]: I1003 14:28:32.041827 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54553a51-c0a4-445e-9415-e3e7373c56a7-kube-api-access-q4fq2" (OuterVolumeSpecName: "kube-api-access-q4fq2") pod "54553a51-c0a4-445e-9415-e3e7373c56a7" (UID: "54553a51-c0a4-445e-9415-e3e7373c56a7"). InnerVolumeSpecName "kube-api-access-q4fq2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:28:32 crc kubenswrapper[4962]: I1003 14:28:32.042318 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54553a51-c0a4-445e-9415-e3e7373c56a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "54553a51-c0a4-445e-9415-e3e7373c56a7" (UID: "54553a51-c0a4-445e-9415-e3e7373c56a7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:28:32 crc kubenswrapper[4962]: I1003 14:28:32.081819 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54553a51-c0a4-445e-9415-e3e7373c56a7-config-data" (OuterVolumeSpecName: "config-data") pod "54553a51-c0a4-445e-9415-e3e7373c56a7" (UID: "54553a51-c0a4-445e-9415-e3e7373c56a7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:28:32 crc kubenswrapper[4962]: I1003 14:28:32.103033 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54553a51-c0a4-445e-9415-e3e7373c56a7-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:28:32 crc kubenswrapper[4962]: I1003 14:28:32.103085 4962 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/54553a51-c0a4-445e-9415-e3e7373c56a7-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:28:32 crc kubenswrapper[4962]: I1003 14:28:32.103101 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4fq2\" (UniqueName: \"kubernetes.io/projected/54553a51-c0a4-445e-9415-e3e7373c56a7-kube-api-access-q4fq2\") on node \"crc\" DevicePath \"\"" Oct 03 14:28:32 crc kubenswrapper[4962]: I1003 14:28:32.103112 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54553a51-c0a4-445e-9415-e3e7373c56a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:28:32 crc kubenswrapper[4962]: I1003 14:28:32.103122 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54553a51-c0a4-445e-9415-e3e7373c56a7-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 14:28:32 crc kubenswrapper[4962]: I1003 14:28:32.502360 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-mm8m8" event={"ID":"54553a51-c0a4-445e-9415-e3e7373c56a7","Type":"ContainerDied","Data":"b102e513bc300ea8ac9535f1d111bccc6b36f6691cb35f5e37127d046615487e"} Oct 03 14:28:32 crc kubenswrapper[4962]: I1003 14:28:32.502405 4962 pod_container_deletor.go:80] 
"Container not found in pod's containers" containerID="b102e513bc300ea8ac9535f1d111bccc6b36f6691cb35f5e37127d046615487e" Oct 03 14:28:32 crc kubenswrapper[4962]: I1003 14:28:32.502405 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-mm8m8" Oct 03 14:28:32 crc kubenswrapper[4962]: I1003 14:28:32.828020 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-699498fcb9-wgjvx"] Oct 03 14:28:32 crc kubenswrapper[4962]: E1003 14:28:32.828777 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54553a51-c0a4-445e-9415-e3e7373c56a7" containerName="cinder-db-sync" Oct 03 14:28:32 crc kubenswrapper[4962]: I1003 14:28:32.828808 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="54553a51-c0a4-445e-9415-e3e7373c56a7" containerName="cinder-db-sync" Oct 03 14:28:32 crc kubenswrapper[4962]: I1003 14:28:32.829088 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="54553a51-c0a4-445e-9415-e3e7373c56a7" containerName="cinder-db-sync" Oct 03 14:28:32 crc kubenswrapper[4962]: I1003 14:28:32.830851 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-699498fcb9-wgjvx" Oct 03 14:28:32 crc kubenswrapper[4962]: I1003 14:28:32.857461 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-699498fcb9-wgjvx"] Oct 03 14:28:32 crc kubenswrapper[4962]: I1003 14:28:32.918529 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b7423e1-7cb7-456a-aaf3-011b6795240d-ovsdbserver-sb\") pod \"dnsmasq-dns-699498fcb9-wgjvx\" (UID: \"2b7423e1-7cb7-456a-aaf3-011b6795240d\") " pod="openstack/dnsmasq-dns-699498fcb9-wgjvx" Oct 03 14:28:32 crc kubenswrapper[4962]: I1003 14:28:32.918629 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b7423e1-7cb7-456a-aaf3-011b6795240d-ovsdbserver-nb\") pod \"dnsmasq-dns-699498fcb9-wgjvx\" (UID: \"2b7423e1-7cb7-456a-aaf3-011b6795240d\") " pod="openstack/dnsmasq-dns-699498fcb9-wgjvx" Oct 03 14:28:32 crc kubenswrapper[4962]: I1003 14:28:32.918689 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b7423e1-7cb7-456a-aaf3-011b6795240d-dns-svc\") pod \"dnsmasq-dns-699498fcb9-wgjvx\" (UID: \"2b7423e1-7cb7-456a-aaf3-011b6795240d\") " pod="openstack/dnsmasq-dns-699498fcb9-wgjvx" Oct 03 14:28:32 crc kubenswrapper[4962]: I1003 14:28:32.918774 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b7423e1-7cb7-456a-aaf3-011b6795240d-config\") pod \"dnsmasq-dns-699498fcb9-wgjvx\" (UID: \"2b7423e1-7cb7-456a-aaf3-011b6795240d\") " pod="openstack/dnsmasq-dns-699498fcb9-wgjvx" Oct 03 14:28:32 crc kubenswrapper[4962]: I1003 14:28:32.918802 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jvhd\" (UniqueName: \"kubernetes.io/projected/2b7423e1-7cb7-456a-aaf3-011b6795240d-kube-api-access-4jvhd\") pod \"dnsmasq-dns-699498fcb9-wgjvx\" (UID: \"2b7423e1-7cb7-456a-aaf3-011b6795240d\") " pod="openstack/dnsmasq-dns-699498fcb9-wgjvx" Oct 03 14:28:33 crc kubenswrapper[4962]: I1003 14:28:33.015367 4962 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/cinder-api-0"] Oct 03 14:28:33 crc kubenswrapper[4962]: I1003 14:28:33.017368 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 03 14:28:33 crc kubenswrapper[4962]: I1003 14:28:33.019421 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 03 14:28:33 crc kubenswrapper[4962]: I1003 14:28:33.019801 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 03 14:28:33 crc kubenswrapper[4962]: I1003 14:28:33.019965 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 03 14:28:33 crc kubenswrapper[4962]: I1003 14:28:33.020133 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b7423e1-7cb7-456a-aaf3-011b6795240d-ovsdbserver-nb\") pod \"dnsmasq-dns-699498fcb9-wgjvx\" (UID: \"2b7423e1-7cb7-456a-aaf3-011b6795240d\") " pod="openstack/dnsmasq-dns-699498fcb9-wgjvx" Oct 03 14:28:33 crc kubenswrapper[4962]: I1003 14:28:33.020170 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b7423e1-7cb7-456a-aaf3-011b6795240d-dns-svc\") pod \"dnsmasq-dns-699498fcb9-wgjvx\" (UID: \"2b7423e1-7cb7-456a-aaf3-011b6795240d\") " pod="openstack/dnsmasq-dns-699498fcb9-wgjvx" Oct 03 14:28:33 crc kubenswrapper[4962]: I1003 14:28:33.020227 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b7423e1-7cb7-456a-aaf3-011b6795240d-config\") pod \"dnsmasq-dns-699498fcb9-wgjvx\" (UID: \"2b7423e1-7cb7-456a-aaf3-011b6795240d\") " pod="openstack/dnsmasq-dns-699498fcb9-wgjvx" Oct 03 14:28:33 crc kubenswrapper[4962]: I1003 14:28:33.020257 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-hqtcq" Oct 03 14:28:33 crc kubenswrapper[4962]: I1003 14:28:33.020256 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jvhd\" (UniqueName: \"kubernetes.io/projected/2b7423e1-7cb7-456a-aaf3-011b6795240d-kube-api-access-4jvhd\") pod \"dnsmasq-dns-699498fcb9-wgjvx\" (UID: \"2b7423e1-7cb7-456a-aaf3-011b6795240d\") " pod="openstack/dnsmasq-dns-699498fcb9-wgjvx" Oct 03 14:28:33 crc kubenswrapper[4962]: I1003 14:28:33.020565 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b7423e1-7cb7-456a-aaf3-011b6795240d-ovsdbserver-sb\") pod \"dnsmasq-dns-699498fcb9-wgjvx\" (UID: \"2b7423e1-7cb7-456a-aaf3-011b6795240d\") " pod="openstack/dnsmasq-dns-699498fcb9-wgjvx" Oct 03 14:28:33 crc kubenswrapper[4962]: I1003 14:28:33.021774 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b7423e1-7cb7-456a-aaf3-011b6795240d-ovsdbserver-sb\") pod \"dnsmasq-dns-699498fcb9-wgjvx\" (UID: \"2b7423e1-7cb7-456a-aaf3-011b6795240d\") " pod="openstack/dnsmasq-dns-699498fcb9-wgjvx" Oct 03 14:28:33 crc kubenswrapper[4962]: I1003 14:28:33.022234 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b7423e1-7cb7-456a-aaf3-011b6795240d-dns-svc\") pod \"dnsmasq-dns-699498fcb9-wgjvx\" (UID: \"2b7423e1-7cb7-456a-aaf3-011b6795240d\") " pod="openstack/dnsmasq-dns-699498fcb9-wgjvx" Oct 03 
14:28:33 crc kubenswrapper[4962]: I1003 14:28:33.022278 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b7423e1-7cb7-456a-aaf3-011b6795240d-ovsdbserver-nb\") pod \"dnsmasq-dns-699498fcb9-wgjvx\" (UID: \"2b7423e1-7cb7-456a-aaf3-011b6795240d\") " pod="openstack/dnsmasq-dns-699498fcb9-wgjvx" Oct 03 14:28:33 crc kubenswrapper[4962]: I1003 14:28:33.022775 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b7423e1-7cb7-456a-aaf3-011b6795240d-config\") pod \"dnsmasq-dns-699498fcb9-wgjvx\" (UID: \"2b7423e1-7cb7-456a-aaf3-011b6795240d\") " pod="openstack/dnsmasq-dns-699498fcb9-wgjvx" Oct 03 14:28:33 crc kubenswrapper[4962]: I1003 14:28:33.038017 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 03 14:28:33 crc kubenswrapper[4962]: I1003 14:28:33.048223 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jvhd\" (UniqueName: \"kubernetes.io/projected/2b7423e1-7cb7-456a-aaf3-011b6795240d-kube-api-access-4jvhd\") pod \"dnsmasq-dns-699498fcb9-wgjvx\" (UID: \"2b7423e1-7cb7-456a-aaf3-011b6795240d\") " pod="openstack/dnsmasq-dns-699498fcb9-wgjvx" Oct 03 14:28:33 crc kubenswrapper[4962]: I1003 14:28:33.121750 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmmcv\" (UniqueName: \"kubernetes.io/projected/7b9af1df-7637-45ab-98ee-85241bf8826a-kube-api-access-zmmcv\") pod \"cinder-api-0\" (UID: \"7b9af1df-7637-45ab-98ee-85241bf8826a\") " pod="openstack/cinder-api-0" Oct 03 14:28:33 crc kubenswrapper[4962]: I1003 14:28:33.122146 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b9af1df-7637-45ab-98ee-85241bf8826a-config-data-custom\") pod \"cinder-api-0\" (UID: \"7b9af1df-7637-45ab-98ee-85241bf8826a\") " pod="openstack/cinder-api-0" Oct 03 14:28:33 crc kubenswrapper[4962]: I1003 14:28:33.122202 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7b9af1df-7637-45ab-98ee-85241bf8826a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7b9af1df-7637-45ab-98ee-85241bf8826a\") " pod="openstack/cinder-api-0" Oct 03 14:28:33 crc kubenswrapper[4962]: I1003 14:28:33.122228 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b9af1df-7637-45ab-98ee-85241bf8826a-logs\") pod \"cinder-api-0\" (UID: \"7b9af1df-7637-45ab-98ee-85241bf8826a\") " pod="openstack/cinder-api-0" Oct 03 14:28:33 crc kubenswrapper[4962]: I1003 14:28:33.122258 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b9af1df-7637-45ab-98ee-85241bf8826a-scripts\") pod \"cinder-api-0\" (UID: \"7b9af1df-7637-45ab-98ee-85241bf8826a\") " pod="openstack/cinder-api-0" Oct 03 14:28:33 crc kubenswrapper[4962]: I1003 14:28:33.122305 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b9af1df-7637-45ab-98ee-85241bf8826a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7b9af1df-7637-45ab-98ee-85241bf8826a\") " pod="openstack/cinder-api-0" Oct 03 14:28:33 crc 
kubenswrapper[4962]: I1003 14:28:33.122353 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b9af1df-7637-45ab-98ee-85241bf8826a-config-data\") pod \"cinder-api-0\" (UID: \"7b9af1df-7637-45ab-98ee-85241bf8826a\") " pod="openstack/cinder-api-0" Oct 03 14:28:33 crc kubenswrapper[4962]: I1003 14:28:33.158833 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-699498fcb9-wgjvx" Oct 03 14:28:33 crc kubenswrapper[4962]: I1003 14:28:33.224076 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7b9af1df-7637-45ab-98ee-85241bf8826a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7b9af1df-7637-45ab-98ee-85241bf8826a\") " pod="openstack/cinder-api-0" Oct 03 14:28:33 crc kubenswrapper[4962]: I1003 14:28:33.224132 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b9af1df-7637-45ab-98ee-85241bf8826a-logs\") pod \"cinder-api-0\" (UID: \"7b9af1df-7637-45ab-98ee-85241bf8826a\") " pod="openstack/cinder-api-0" Oct 03 14:28:33 crc kubenswrapper[4962]: I1003 14:28:33.224161 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b9af1df-7637-45ab-98ee-85241bf8826a-scripts\") pod \"cinder-api-0\" (UID: \"7b9af1df-7637-45ab-98ee-85241bf8826a\") " pod="openstack/cinder-api-0" Oct 03 14:28:33 crc kubenswrapper[4962]: I1003 14:28:33.224210 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b9af1df-7637-45ab-98ee-85241bf8826a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7b9af1df-7637-45ab-98ee-85241bf8826a\") " pod="openstack/cinder-api-0" Oct 03 14:28:33 crc kubenswrapper[4962]: I1003 14:28:33.224261 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b9af1df-7637-45ab-98ee-85241bf8826a-config-data\") pod \"cinder-api-0\" (UID: \"7b9af1df-7637-45ab-98ee-85241bf8826a\") " pod="openstack/cinder-api-0" Oct 03 14:28:33 crc kubenswrapper[4962]: I1003 14:28:33.224285 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmmcv\" (UniqueName: \"kubernetes.io/projected/7b9af1df-7637-45ab-98ee-85241bf8826a-kube-api-access-zmmcv\") pod \"cinder-api-0\" (UID: \"7b9af1df-7637-45ab-98ee-85241bf8826a\") " pod="openstack/cinder-api-0" Oct 03 14:28:33 crc kubenswrapper[4962]: I1003 14:28:33.224314 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b9af1df-7637-45ab-98ee-85241bf8826a-config-data-custom\") pod \"cinder-api-0\" (UID: \"7b9af1df-7637-45ab-98ee-85241bf8826a\") " pod="openstack/cinder-api-0" Oct 03 14:28:33 crc kubenswrapper[4962]: I1003 14:28:33.226898 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7b9af1df-7637-45ab-98ee-85241bf8826a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7b9af1df-7637-45ab-98ee-85241bf8826a\") " pod="openstack/cinder-api-0" Oct 03 14:28:33 crc kubenswrapper[4962]: I1003 14:28:33.227236 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/7b9af1df-7637-45ab-98ee-85241bf8826a-logs\") pod \"cinder-api-0\" (UID: \"7b9af1df-7637-45ab-98ee-85241bf8826a\") " pod="openstack/cinder-api-0" Oct 03 14:28:33 crc kubenswrapper[4962]: I1003 14:28:33.229797 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b9af1df-7637-45ab-98ee-85241bf8826a-scripts\") pod \"cinder-api-0\" (UID: \"7b9af1df-7637-45ab-98ee-85241bf8826a\") " pod="openstack/cinder-api-0" Oct 03 14:28:33 crc kubenswrapper[4962]: I1003 14:28:33.230711 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b9af1df-7637-45ab-98ee-85241bf8826a-config-data-custom\") pod \"cinder-api-0\" (UID: \"7b9af1df-7637-45ab-98ee-85241bf8826a\") " pod="openstack/cinder-api-0" Oct 03 14:28:33 crc kubenswrapper[4962]: I1003 14:28:33.233666 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b9af1df-7637-45ab-98ee-85241bf8826a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7b9af1df-7637-45ab-98ee-85241bf8826a\") " pod="openstack/cinder-api-0" Oct 03 14:28:33 crc kubenswrapper[4962]: I1003 14:28:33.234921 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b9af1df-7637-45ab-98ee-85241bf8826a-config-data\") pod \"cinder-api-0\" (UID: \"7b9af1df-7637-45ab-98ee-85241bf8826a\") " pod="openstack/cinder-api-0" Oct 03 14:28:33 crc kubenswrapper[4962]: I1003 14:28:33.247581 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmmcv\" (UniqueName: \"kubernetes.io/projected/7b9af1df-7637-45ab-98ee-85241bf8826a-kube-api-access-zmmcv\") pod \"cinder-api-0\" (UID: \"7b9af1df-7637-45ab-98ee-85241bf8826a\") " pod="openstack/cinder-api-0" Oct 03 14:28:33 crc kubenswrapper[4962]: I1003 14:28:33.334479 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 03 14:28:33 crc kubenswrapper[4962]: I1003 14:28:33.535440 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-699498fcb9-wgjvx"] Oct 03 14:28:34 crc kubenswrapper[4962]: I1003 14:28:34.011778 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 03 14:28:34 crc kubenswrapper[4962]: W1003 14:28:34.031116 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b9af1df_7637_45ab_98ee_85241bf8826a.slice/crio-44da2cd601e3ecb3fea59c80335d18096753ed120bf40a0ba468df22f4b230a8 WatchSource:0}: Error finding container 44da2cd601e3ecb3fea59c80335d18096753ed120bf40a0ba468df22f4b230a8: Status 404 returned error can't find the container with id 44da2cd601e3ecb3fea59c80335d18096753ed120bf40a0ba468df22f4b230a8 Oct 03 14:28:34 crc kubenswrapper[4962]: I1003 14:28:34.550955 4962 generic.go:334] "Generic (PLEG): container finished" podID="2b7423e1-7cb7-456a-aaf3-011b6795240d" containerID="bcadc39a9449b21064b23ef1501c63a320c3d5fcbea108fd322458cafb40e85d" exitCode=0 Oct 03 14:28:34 crc kubenswrapper[4962]: I1003 14:28:34.551056 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699498fcb9-wgjvx" event={"ID":"2b7423e1-7cb7-456a-aaf3-011b6795240d","Type":"ContainerDied","Data":"bcadc39a9449b21064b23ef1501c63a320c3d5fcbea108fd322458cafb40e85d"} Oct 03 14:28:34 crc kubenswrapper[4962]: I1003 14:28:34.551384 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699498fcb9-wgjvx" event={"ID":"2b7423e1-7cb7-456a-aaf3-011b6795240d","Type":"ContainerStarted","Data":"0af41f5ee0742003c6de5107c24c3f1c2113762fab195694cbd2b2221f97e14b"} Oct 03 14:28:34 crc kubenswrapper[4962]: I1003 14:28:34.554082 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7b9af1df-7637-45ab-98ee-85241bf8826a","Type":"ContainerStarted","Data":"44da2cd601e3ecb3fea59c80335d18096753ed120bf40a0ba468df22f4b230a8"} Oct 03 14:28:35 crc kubenswrapper[4962]: I1003 14:28:35.564414 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699498fcb9-wgjvx" event={"ID":"2b7423e1-7cb7-456a-aaf3-011b6795240d","Type":"ContainerStarted","Data":"fcda40a21a7008ba001fad062163bb3582f05f2d8f2afe32740fb8b3fc040fe3"} Oct 03 14:28:35 crc kubenswrapper[4962]: I1003 14:28:35.565091 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-699498fcb9-wgjvx" Oct 03 14:28:35 crc kubenswrapper[4962]: I1003 14:28:35.568201 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7b9af1df-7637-45ab-98ee-85241bf8826a","Type":"ContainerStarted","Data":"6254f9d3e13bcb639b73c43fe53353df4afebccf6434470fea00783caf514d9f"} Oct 03 14:28:35 crc kubenswrapper[4962]: I1003 14:28:35.568236 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7b9af1df-7637-45ab-98ee-85241bf8826a","Type":"ContainerStarted","Data":"d1b32c67649e71255aa65d4a73a58e17a73af88de15b90e4b50031f72ac03b0f"} Oct 03 14:28:35 crc kubenswrapper[4962]: I1003 14:28:35.568932 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 03 14:28:35 crc kubenswrapper[4962]: I1003 14:28:35.594740 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-699498fcb9-wgjvx" podStartSLOduration=3.594718323 
podStartE2EDuration="3.594718323s" podCreationTimestamp="2025-10-03 14:28:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:28:35.591366725 +0000 UTC m=+5923.995264560" watchObservedRunningTime="2025-10-03 14:28:35.594718323 +0000 UTC m=+5923.998616158" Oct 03 14:28:35 crc kubenswrapper[4962]: I1003 14:28:35.612012 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.611989324 podStartE2EDuration="3.611989324s" podCreationTimestamp="2025-10-03 14:28:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:28:35.605550116 +0000 UTC m=+5924.009447971" watchObservedRunningTime="2025-10-03 14:28:35.611989324 +0000 UTC m=+5924.015887159" Oct 03 14:28:43 crc kubenswrapper[4962]: I1003 14:28:43.161926 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-699498fcb9-wgjvx" Oct 03 14:28:43 crc kubenswrapper[4962]: I1003 14:28:43.218496 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d5f667f5-54n26"] Oct 03 14:28:43 crc kubenswrapper[4962]: I1003 14:28:43.218750 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d5f667f5-54n26" podUID="146da477-8524-42f2-8a41-d104885147db" containerName="dnsmasq-dns" containerID="cri-o://41f5a8c1e42028f6ccf01c15c51f81c2abaad0ace3913fef4c86fb929c802224" gracePeriod=10 Oct 03 14:28:43 crc kubenswrapper[4962]: I1003 14:28:43.229707 4962 scope.go:117] "RemoveContainer" containerID="d7dae192729a16d2e0ed7a375c7ee4990d673d8d62f560c5335516db0b1730db" Oct 03 14:28:43 crc kubenswrapper[4962]: E1003 14:28:43.229955 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:28:43 crc kubenswrapper[4962]: I1003 14:28:43.657022 4962 generic.go:334] "Generic (PLEG): container finished" podID="146da477-8524-42f2-8a41-d104885147db" containerID="41f5a8c1e42028f6ccf01c15c51f81c2abaad0ace3913fef4c86fb929c802224" exitCode=0 Oct 03 14:28:43 crc kubenswrapper[4962]: I1003 14:28:43.657484 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d5f667f5-54n26" event={"ID":"146da477-8524-42f2-8a41-d104885147db","Type":"ContainerDied","Data":"41f5a8c1e42028f6ccf01c15c51f81c2abaad0ace3913fef4c86fb929c802224"} Oct 03 14:28:43 crc kubenswrapper[4962]: I1003 14:28:43.657516 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d5f667f5-54n26" event={"ID":"146da477-8524-42f2-8a41-d104885147db","Type":"ContainerDied","Data":"c84edafe295c18357c04a10f387b10d68dfb3a46d63969b1c1cf5edc8b0a9b93"} Oct 03 14:28:43 crc kubenswrapper[4962]: I1003 14:28:43.657529 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c84edafe295c18357c04a10f387b10d68dfb3a46d63969b1c1cf5edc8b0a9b93" Oct 03 14:28:43 crc kubenswrapper[4962]: I1003 14:28:43.705600 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d5f667f5-54n26" Oct 03 14:28:43 crc kubenswrapper[4962]: I1003 14:28:43.859805 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79q7s\" (UniqueName: \"kubernetes.io/projected/146da477-8524-42f2-8a41-d104885147db-kube-api-access-79q7s\") pod \"146da477-8524-42f2-8a41-d104885147db\" (UID: \"146da477-8524-42f2-8a41-d104885147db\") " Oct 03 14:28:43 crc kubenswrapper[4962]: I1003 14:28:43.859864 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/146da477-8524-42f2-8a41-d104885147db-dns-svc\") pod \"146da477-8524-42f2-8a41-d104885147db\" (UID: \"146da477-8524-42f2-8a41-d104885147db\") " Oct 03 14:28:43 crc kubenswrapper[4962]: I1003 14:28:43.859893 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/146da477-8524-42f2-8a41-d104885147db-ovsdbserver-nb\") pod \"146da477-8524-42f2-8a41-d104885147db\" (UID: \"146da477-8524-42f2-8a41-d104885147db\") " Oct 03 14:28:43 crc kubenswrapper[4962]: I1003 14:28:43.859920 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/146da477-8524-42f2-8a41-d104885147db-config\") pod \"146da477-8524-42f2-8a41-d104885147db\" (UID: \"146da477-8524-42f2-8a41-d104885147db\") " Oct 03 14:28:43 crc kubenswrapper[4962]: I1003 14:28:43.859940 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/146da477-8524-42f2-8a41-d104885147db-ovsdbserver-sb\") pod \"146da477-8524-42f2-8a41-d104885147db\" (UID: \"146da477-8524-42f2-8a41-d104885147db\") " Oct 03 14:28:43 crc kubenswrapper[4962]: I1003 14:28:43.868314 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/146da477-8524-42f2-8a41-d104885147db-kube-api-access-79q7s" (OuterVolumeSpecName: "kube-api-access-79q7s") pod "146da477-8524-42f2-8a41-d104885147db" (UID: "146da477-8524-42f2-8a41-d104885147db"). InnerVolumeSpecName "kube-api-access-79q7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:28:43 crc kubenswrapper[4962]: I1003 14:28:43.907753 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/146da477-8524-42f2-8a41-d104885147db-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "146da477-8524-42f2-8a41-d104885147db" (UID: "146da477-8524-42f2-8a41-d104885147db"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:28:43 crc kubenswrapper[4962]: I1003 14:28:43.922776 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/146da477-8524-42f2-8a41-d104885147db-config" (OuterVolumeSpecName: "config") pod "146da477-8524-42f2-8a41-d104885147db" (UID: "146da477-8524-42f2-8a41-d104885147db"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:28:43 crc kubenswrapper[4962]: I1003 14:28:43.926437 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/146da477-8524-42f2-8a41-d104885147db-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "146da477-8524-42f2-8a41-d104885147db" (UID: "146da477-8524-42f2-8a41-d104885147db"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:28:43 crc kubenswrapper[4962]: I1003 14:28:43.943477 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/146da477-8524-42f2-8a41-d104885147db-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "146da477-8524-42f2-8a41-d104885147db" (UID: "146da477-8524-42f2-8a41-d104885147db"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:28:43 crc kubenswrapper[4962]: I1003 14:28:43.962705 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79q7s\" (UniqueName: \"kubernetes.io/projected/146da477-8524-42f2-8a41-d104885147db-kube-api-access-79q7s\") on node \"crc\" DevicePath \"\"" Oct 03 14:28:43 crc kubenswrapper[4962]: I1003 14:28:43.962742 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/146da477-8524-42f2-8a41-d104885147db-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 14:28:43 crc kubenswrapper[4962]: I1003 14:28:43.962752 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/146da477-8524-42f2-8a41-d104885147db-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 14:28:43 crc kubenswrapper[4962]: I1003 14:28:43.962760 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/146da477-8524-42f2-8a41-d104885147db-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:28:43 crc kubenswrapper[4962]: I1003 14:28:43.962768 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/146da477-8524-42f2-8a41-d104885147db-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 14:28:44 crc kubenswrapper[4962]: I1003 14:28:44.669097 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d5f667f5-54n26" Oct 03 14:28:44 crc kubenswrapper[4962]: I1003 14:28:44.696039 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d5f667f5-54n26"] Oct 03 14:28:44 crc kubenswrapper[4962]: I1003 14:28:44.702589 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d5f667f5-54n26"] Oct 03 14:28:45 crc kubenswrapper[4962]: I1003 14:28:45.425327 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 03 14:28:46 crc kubenswrapper[4962]: I1003 14:28:46.037818 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 14:28:46 crc kubenswrapper[4962]: I1003 14:28:46.038058 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="04b5ed01-2476-408b-b9d8-ff1ef2ad5923" containerName="nova-scheduler-scheduler" containerID="cri-o://80ae291d913375dbf692d74eacbfec5d2ee2fb4869fb8f13b79decf5019fde7e" gracePeriod=30 Oct 03 14:28:46 crc kubenswrapper[4962]: I1003 14:28:46.045478 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 14:28:46 crc kubenswrapper[4962]: I1003 14:28:46.045829 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="8361c760-4e3d-46d1-b1ad-73826855e693" containerName="nova-cell0-conductor-conductor" containerID="cri-o://a392e7f454efaf115628bada22dcfbb70d683b083922bfef80689ff4bec558f7" gracePeriod=30 Oct 03 14:28:46 crc kubenswrapper[4962]: I1003 14:28:46.057327 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 14:28:46 crc kubenswrapper[4962]: I1003 14:28:46.057557 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a0d11367-1fa5-472a-8046-6e1719195a2f" containerName="nova-metadata-log" containerID="cri-o://b46a9e8932e8f60d3bbcf46554828a868eeaa55aadf37646ed5908c2fe339b8d" gracePeriod=30 Oct 03 14:28:46 crc kubenswrapper[4962]: I1003 14:28:46.058005 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a0d11367-1fa5-472a-8046-6e1719195a2f" containerName="nova-metadata-metadata" containerID="cri-o://2a0a8a9f417b3e291db7d3ec635c7e770ebcb120600d244e09183e67a60b53ed" gracePeriod=30 Oct 03 14:28:46 crc kubenswrapper[4962]: I1003 14:28:46.068550 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 14:28:46 crc kubenswrapper[4962]: I1003 14:28:46.068810 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f01f82f4-f27f-4da3-83f7-ac88ca54d880" containerName="nova-api-log" containerID="cri-o://65da81c5418996928420f6548ad1d29bc95105e66db97ee2295873d32175a4d0" gracePeriod=30 Oct 03 14:28:46 crc kubenswrapper[4962]: I1003 14:28:46.069260 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f01f82f4-f27f-4da3-83f7-ac88ca54d880" containerName="nova-api-api" containerID="cri-o://4b9d0922d2d307ae481566520c5ad7f9198f1b50f2a3e0ca42d2bd70b25ef450" gracePeriod=30 Oct 03 14:28:46 crc kubenswrapper[4962]: I1003 14:28:46.083555 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 14:28:46 crc kubenswrapper[4962]: I1003 14:28:46.083801 4962 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="2c6c0e02-3da8-4bf8-9b04-8684a07876fa" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://b741ee8462471ff407f58668c709daae8948307c3812af0158ad8c2880da2eca" gracePeriod=30 Oct 03 14:28:46 crc kubenswrapper[4962]: I1003 14:28:46.239520 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="146da477-8524-42f2-8a41-d104885147db" path="/var/lib/kubelet/pods/146da477-8524-42f2-8a41-d104885147db/volumes" Oct 03 14:28:46 crc kubenswrapper[4962]: E1003 14:28:46.284897 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="80ae291d913375dbf692d74eacbfec5d2ee2fb4869fb8f13b79decf5019fde7e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 14:28:46 crc kubenswrapper[4962]: E1003 14:28:46.286548 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="80ae291d913375dbf692d74eacbfec5d2ee2fb4869fb8f13b79decf5019fde7e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 14:28:46 crc kubenswrapper[4962]: E1003 14:28:46.288492 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="80ae291d913375dbf692d74eacbfec5d2ee2fb4869fb8f13b79decf5019fde7e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 14:28:46 crc kubenswrapper[4962]: E1003 14:28:46.288525 4962 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="04b5ed01-2476-408b-b9d8-ff1ef2ad5923" containerName="nova-scheduler-scheduler" Oct 03 14:28:46 crc kubenswrapper[4962]: I1003 14:28:46.693343 4962 generic.go:334] "Generic (PLEG): container finished" podID="a0d11367-1fa5-472a-8046-6e1719195a2f" containerID="b46a9e8932e8f60d3bbcf46554828a868eeaa55aadf37646ed5908c2fe339b8d" exitCode=143 Oct 03 14:28:46 crc kubenswrapper[4962]: I1003 14:28:46.693442 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a0d11367-1fa5-472a-8046-6e1719195a2f","Type":"ContainerDied","Data":"b46a9e8932e8f60d3bbcf46554828a868eeaa55aadf37646ed5908c2fe339b8d"} Oct 03 14:28:46 crc kubenswrapper[4962]: I1003 14:28:46.695446 4962 generic.go:334] "Generic (PLEG): container finished" podID="f01f82f4-f27f-4da3-83f7-ac88ca54d880" containerID="65da81c5418996928420f6548ad1d29bc95105e66db97ee2295873d32175a4d0" exitCode=143 Oct 03 14:28:46 crc kubenswrapper[4962]: I1003 14:28:46.695489 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f01f82f4-f27f-4da3-83f7-ac88ca54d880","Type":"ContainerDied","Data":"65da81c5418996928420f6548ad1d29bc95105e66db97ee2295873d32175a4d0"} Oct 03 14:28:47 crc kubenswrapper[4962]: I1003 14:28:47.405088 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 03 14:28:47 crc kubenswrapper[4962]: I1003 14:28:47.542608 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfmzj\" (UniqueName: \"kubernetes.io/projected/2c6c0e02-3da8-4bf8-9b04-8684a07876fa-kube-api-access-tfmzj\") pod \"2c6c0e02-3da8-4bf8-9b04-8684a07876fa\" (UID: \"2c6c0e02-3da8-4bf8-9b04-8684a07876fa\") "
Oct 03 14:28:47 crc kubenswrapper[4962]: I1003 14:28:47.543457 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c6c0e02-3da8-4bf8-9b04-8684a07876fa-combined-ca-bundle\") pod \"2c6c0e02-3da8-4bf8-9b04-8684a07876fa\" (UID: \"2c6c0e02-3da8-4bf8-9b04-8684a07876fa\") "
Oct 03 14:28:47 crc kubenswrapper[4962]: I1003 14:28:47.543558 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c6c0e02-3da8-4bf8-9b04-8684a07876fa-config-data\") pod \"2c6c0e02-3da8-4bf8-9b04-8684a07876fa\" (UID: \"2c6c0e02-3da8-4bf8-9b04-8684a07876fa\") "
Oct 03 14:28:47 crc kubenswrapper[4962]: I1003 14:28:47.567386 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c6c0e02-3da8-4bf8-9b04-8684a07876fa-kube-api-access-tfmzj" (OuterVolumeSpecName: "kube-api-access-tfmzj") pod "2c6c0e02-3da8-4bf8-9b04-8684a07876fa" (UID: "2c6c0e02-3da8-4bf8-9b04-8684a07876fa"). InnerVolumeSpecName "kube-api-access-tfmzj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:28:47 crc kubenswrapper[4962]: I1003 14:28:47.577968 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c6c0e02-3da8-4bf8-9b04-8684a07876fa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2c6c0e02-3da8-4bf8-9b04-8684a07876fa" (UID: "2c6c0e02-3da8-4bf8-9b04-8684a07876fa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:28:47 crc kubenswrapper[4962]: I1003 14:28:47.580356 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c6c0e02-3da8-4bf8-9b04-8684a07876fa-config-data" (OuterVolumeSpecName: "config-data") pod "2c6c0e02-3da8-4bf8-9b04-8684a07876fa" (UID: "2c6c0e02-3da8-4bf8-9b04-8684a07876fa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:28:47 crc kubenswrapper[4962]: I1003 14:28:47.645840 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfmzj\" (UniqueName: \"kubernetes.io/projected/2c6c0e02-3da8-4bf8-9b04-8684a07876fa-kube-api-access-tfmzj\") on node \"crc\" DevicePath \"\""
Oct 03 14:28:47 crc kubenswrapper[4962]: I1003 14:28:47.645888 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c6c0e02-3da8-4bf8-9b04-8684a07876fa-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 14:28:47 crc kubenswrapper[4962]: I1003 14:28:47.645901 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c6c0e02-3da8-4bf8-9b04-8684a07876fa-config-data\") on node \"crc\" DevicePath \"\""
Oct 03 14:28:47 crc kubenswrapper[4962]: I1003 14:28:47.713135 4962 generic.go:334] "Generic (PLEG): container finished" podID="8361c760-4e3d-46d1-b1ad-73826855e693" containerID="a392e7f454efaf115628bada22dcfbb70d683b083922bfef80689ff4bec558f7" exitCode=0
Oct 03 14:28:47 crc kubenswrapper[4962]: I1003 14:28:47.713231 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"8361c760-4e3d-46d1-b1ad-73826855e693","Type":"ContainerDied","Data":"a392e7f454efaf115628bada22dcfbb70d683b083922bfef80689ff4bec558f7"}
Oct 03 14:28:47 crc kubenswrapper[4962]: I1003 14:28:47.715431 4962 generic.go:334] "Generic (PLEG): container finished" podID="2c6c0e02-3da8-4bf8-9b04-8684a07876fa" containerID="b741ee8462471ff407f58668c709daae8948307c3812af0158ad8c2880da2eca" exitCode=0
Oct 03 14:28:47 crc kubenswrapper[4962]: I1003 14:28:47.715482 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2c6c0e02-3da8-4bf8-9b04-8684a07876fa","Type":"ContainerDied","Data":"b741ee8462471ff407f58668c709daae8948307c3812af0158ad8c2880da2eca"}
Oct 03 14:28:47 crc kubenswrapper[4962]: I1003 14:28:47.715515 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2c6c0e02-3da8-4bf8-9b04-8684a07876fa","Type":"ContainerDied","Data":"d707341acfac482d064fc849633df2b1e94a25c2e2b2f925fa9710924931bcc6"}
Oct 03 14:28:47 crc kubenswrapper[4962]: I1003 14:28:47.715541 4962 scope.go:117] "RemoveContainer" containerID="b741ee8462471ff407f58668c709daae8948307c3812af0158ad8c2880da2eca"
Oct 03 14:28:47 crc kubenswrapper[4962]: I1003 14:28:47.715738 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 03 14:28:47 crc kubenswrapper[4962]: I1003 14:28:47.760839 4962 scope.go:117] "RemoveContainer" containerID="b741ee8462471ff407f58668c709daae8948307c3812af0158ad8c2880da2eca"
Oct 03 14:28:47 crc kubenswrapper[4962]: E1003 14:28:47.761779 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b741ee8462471ff407f58668c709daae8948307c3812af0158ad8c2880da2eca\": container with ID starting with b741ee8462471ff407f58668c709daae8948307c3812af0158ad8c2880da2eca not found: ID does not exist" containerID="b741ee8462471ff407f58668c709daae8948307c3812af0158ad8c2880da2eca"
Oct 03 14:28:47 crc kubenswrapper[4962]: I1003 14:28:47.761823 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b741ee8462471ff407f58668c709daae8948307c3812af0158ad8c2880da2eca"} err="failed to get container status \"b741ee8462471ff407f58668c709daae8948307c3812af0158ad8c2880da2eca\": rpc error: code = NotFound desc = could not find container \"b741ee8462471ff407f58668c709daae8948307c3812af0158ad8c2880da2eca\": container with ID starting with b741ee8462471ff407f58668c709daae8948307c3812af0158ad8c2880da2eca not found: ID does not exist"
Oct 03 14:28:47 crc kubenswrapper[4962]: I1003 14:28:47.787375 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 03 14:28:47 crc kubenswrapper[4962]: I1003 14:28:47.799717 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 03 14:28:47 crc kubenswrapper[4962]: I1003 14:28:47.813740 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 03 14:28:47 crc kubenswrapper[4962]: E1003 14:28:47.814446 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="146da477-8524-42f2-8a41-d104885147db" containerName="init"
Oct 03 14:28:47 crc kubenswrapper[4962]: I1003 14:28:47.814465 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="146da477-8524-42f2-8a41-d104885147db" containerName="init"
Oct 03 14:28:47 crc kubenswrapper[4962]: E1003 14:28:47.814495 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c6c0e02-3da8-4bf8-9b04-8684a07876fa" containerName="nova-cell1-novncproxy-novncproxy"
Oct 03 14:28:47 crc kubenswrapper[4962]: I1003 14:28:47.814503 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c6c0e02-3da8-4bf8-9b04-8684a07876fa" containerName="nova-cell1-novncproxy-novncproxy"
Oct 03 14:28:47 crc kubenswrapper[4962]: E1003 14:28:47.814531 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="146da477-8524-42f2-8a41-d104885147db" containerName="dnsmasq-dns"
Oct 03 14:28:47 crc kubenswrapper[4962]: I1003 14:28:47.814540 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="146da477-8524-42f2-8a41-d104885147db" containerName="dnsmasq-dns"
Oct 03 14:28:47 crc kubenswrapper[4962]: I1003 14:28:47.814774 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="146da477-8524-42f2-8a41-d104885147db" containerName="dnsmasq-dns"
Oct 03 14:28:47 crc kubenswrapper[4962]: I1003 14:28:47.814800 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c6c0e02-3da8-4bf8-9b04-8684a07876fa" containerName="nova-cell1-novncproxy-novncproxy"
Oct 03 14:28:47 crc kubenswrapper[4962]: I1003 14:28:47.815771 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 03 14:28:47 crc kubenswrapper[4962]: I1003 14:28:47.824537 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 03 14:28:47 crc kubenswrapper[4962]: I1003 14:28:47.842080 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Oct 03 14:28:47 crc kubenswrapper[4962]: I1003 14:28:47.956240 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c3dd24d-75e3-422a-ac9a-efeee6404f74-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6c3dd24d-75e3-422a-ac9a-efeee6404f74\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 03 14:28:47 crc kubenswrapper[4962]: I1003 14:28:47.956339 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c3dd24d-75e3-422a-ac9a-efeee6404f74-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6c3dd24d-75e3-422a-ac9a-efeee6404f74\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 03 14:28:47 crc kubenswrapper[4962]: I1003 14:28:47.956395 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nzwd\" (UniqueName: \"kubernetes.io/projected/6c3dd24d-75e3-422a-ac9a-efeee6404f74-kube-api-access-6nzwd\") pod \"nova-cell1-novncproxy-0\" (UID: \"6c3dd24d-75e3-422a-ac9a-efeee6404f74\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 03 14:28:48 crc kubenswrapper[4962]: I1003 14:28:48.060023 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c3dd24d-75e3-422a-ac9a-efeee6404f74-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6c3dd24d-75e3-422a-ac9a-efeee6404f74\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 03 14:28:48 crc kubenswrapper[4962]: I1003 14:28:48.060089 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c3dd24d-75e3-422a-ac9a-efeee6404f74-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6c3dd24d-75e3-422a-ac9a-efeee6404f74\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 03 14:28:48 crc kubenswrapper[4962]: I1003 14:28:48.060176 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nzwd\" (UniqueName: \"kubernetes.io/projected/6c3dd24d-75e3-422a-ac9a-efeee6404f74-kube-api-access-6nzwd\") pod \"nova-cell1-novncproxy-0\" (UID: \"6c3dd24d-75e3-422a-ac9a-efeee6404f74\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 03 14:28:48 crc kubenswrapper[4962]: I1003 14:28:48.060873 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Oct 03 14:28:48 crc kubenswrapper[4962]: I1003 14:28:48.065796 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c3dd24d-75e3-422a-ac9a-efeee6404f74-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6c3dd24d-75e3-422a-ac9a-efeee6404f74\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 03 14:28:48 crc kubenswrapper[4962]: I1003 14:28:48.067532 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c3dd24d-75e3-422a-ac9a-efeee6404f74-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6c3dd24d-75e3-422a-ac9a-efeee6404f74\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 03 14:28:48 crc kubenswrapper[4962]: I1003 14:28:48.078681 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nzwd\" (UniqueName: \"kubernetes.io/projected/6c3dd24d-75e3-422a-ac9a-efeee6404f74-kube-api-access-6nzwd\") pod \"nova-cell1-novncproxy-0\" (UID: \"6c3dd24d-75e3-422a-ac9a-efeee6404f74\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 03 14:28:48 crc kubenswrapper[4962]: I1003 14:28:48.163955 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8361c760-4e3d-46d1-b1ad-73826855e693-combined-ca-bundle\") pod \"8361c760-4e3d-46d1-b1ad-73826855e693\" (UID: \"8361c760-4e3d-46d1-b1ad-73826855e693\") "
Oct 03 14:28:48 crc kubenswrapper[4962]: I1003 14:28:48.164580 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5b9jg\" (UniqueName: \"kubernetes.io/projected/8361c760-4e3d-46d1-b1ad-73826855e693-kube-api-access-5b9jg\") pod \"8361c760-4e3d-46d1-b1ad-73826855e693\" (UID: \"8361c760-4e3d-46d1-b1ad-73826855e693\") "
Oct 03 14:28:48 crc kubenswrapper[4962]: I1003 14:28:48.164737 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8361c760-4e3d-46d1-b1ad-73826855e693-config-data\") pod \"8361c760-4e3d-46d1-b1ad-73826855e693\" (UID: \"8361c760-4e3d-46d1-b1ad-73826855e693\") "
Oct 03 14:28:48 crc kubenswrapper[4962]: I1003 14:28:48.166115 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 03 14:28:48 crc kubenswrapper[4962]: I1003 14:28:48.168218 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8361c760-4e3d-46d1-b1ad-73826855e693-kube-api-access-5b9jg" (OuterVolumeSpecName: "kube-api-access-5b9jg") pod "8361c760-4e3d-46d1-b1ad-73826855e693" (UID: "8361c760-4e3d-46d1-b1ad-73826855e693"). InnerVolumeSpecName "kube-api-access-5b9jg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:28:48 crc kubenswrapper[4962]: I1003 14:28:48.187977 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8361c760-4e3d-46d1-b1ad-73826855e693-config-data" (OuterVolumeSpecName: "config-data") pod "8361c760-4e3d-46d1-b1ad-73826855e693" (UID: "8361c760-4e3d-46d1-b1ad-73826855e693"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:28:48 crc kubenswrapper[4962]: I1003 14:28:48.212321 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8361c760-4e3d-46d1-b1ad-73826855e693-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8361c760-4e3d-46d1-b1ad-73826855e693" (UID: "8361c760-4e3d-46d1-b1ad-73826855e693"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:28:48 crc kubenswrapper[4962]: I1003 14:28:48.243214 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c6c0e02-3da8-4bf8-9b04-8684a07876fa" path="/var/lib/kubelet/pods/2c6c0e02-3da8-4bf8-9b04-8684a07876fa/volumes"
Oct 03 14:28:48 crc kubenswrapper[4962]: I1003 14:28:48.268162 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5b9jg\" (UniqueName: \"kubernetes.io/projected/8361c760-4e3d-46d1-b1ad-73826855e693-kube-api-access-5b9jg\") on node \"crc\" DevicePath \"\""
Oct 03 14:28:48 crc kubenswrapper[4962]: I1003 14:28:48.268570 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8361c760-4e3d-46d1-b1ad-73826855e693-config-data\") on node \"crc\" DevicePath \"\""
Oct 03 14:28:48 crc kubenswrapper[4962]: I1003 14:28:48.268634 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8361c760-4e3d-46d1-b1ad-73826855e693-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 14:28:48 crc kubenswrapper[4962]: I1003 14:28:48.644020 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 03 14:28:48 crc kubenswrapper[4962]: I1003 14:28:48.754817 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6c3dd24d-75e3-422a-ac9a-efeee6404f74","Type":"ContainerStarted","Data":"13eedeac1f1eace8438d5d136a5707c49b2c8406455eb908ef3ffabcb3bf1c24"}
Oct 03 14:28:48 crc kubenswrapper[4962]: I1003 14:28:48.758176 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"8361c760-4e3d-46d1-b1ad-73826855e693","Type":"ContainerDied","Data":"461e5f8c9f9577b5cd331f0dbb825914569f58c56f394cfa53f8ede1f0ec3137"}
Oct 03 14:28:48 crc kubenswrapper[4962]: I1003 14:28:48.758262 4962 scope.go:117] "RemoveContainer" containerID="a392e7f454efaf115628bada22dcfbb70d683b083922bfef80689ff4bec558f7"
Oct 03 14:28:48 crc kubenswrapper[4962]: I1003 14:28:48.758271 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Oct 03 14:28:48 crc kubenswrapper[4962]: I1003 14:28:48.838133 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Oct 03 14:28:48 crc kubenswrapper[4962]: I1003 14:28:48.846999 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Oct 03 14:28:48 crc kubenswrapper[4962]: I1003 14:28:48.869287 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Oct 03 14:28:48 crc kubenswrapper[4962]: E1003 14:28:48.869871 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8361c760-4e3d-46d1-b1ad-73826855e693" containerName="nova-cell0-conductor-conductor"
Oct 03 14:28:48 crc kubenswrapper[4962]: I1003 14:28:48.869894 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8361c760-4e3d-46d1-b1ad-73826855e693" containerName="nova-cell0-conductor-conductor"
Oct 03 14:28:48 crc kubenswrapper[4962]: I1003 14:28:48.870186 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="8361c760-4e3d-46d1-b1ad-73826855e693" containerName="nova-cell0-conductor-conductor"
Oct 03 14:28:48 crc kubenswrapper[4962]: I1003 14:28:48.871058 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Oct 03 14:28:48 crc kubenswrapper[4962]: I1003 14:28:48.878583 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Oct 03 14:28:48 crc kubenswrapper[4962]: I1003 14:28:48.884903 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Oct 03 14:28:48 crc kubenswrapper[4962]: I1003 14:28:48.987784 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq2nq\" (UniqueName: \"kubernetes.io/projected/a1914699-3c0e-42ac-b63e-7df14da9703d-kube-api-access-rq2nq\") pod \"nova-cell0-conductor-0\" (UID: \"a1914699-3c0e-42ac-b63e-7df14da9703d\") " pod="openstack/nova-cell0-conductor-0"
Oct 03 14:28:48 crc kubenswrapper[4962]: I1003 14:28:48.987843 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1914699-3c0e-42ac-b63e-7df14da9703d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a1914699-3c0e-42ac-b63e-7df14da9703d\") " pod="openstack/nova-cell0-conductor-0"
Oct 03 14:28:48 crc kubenswrapper[4962]: I1003 14:28:48.988011 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1914699-3c0e-42ac-b63e-7df14da9703d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a1914699-3c0e-42ac-b63e-7df14da9703d\") " pod="openstack/nova-cell0-conductor-0"
Oct 03 14:28:49 crc kubenswrapper[4962]: I1003 14:28:49.089892 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rq2nq\" (UniqueName: \"kubernetes.io/projected/a1914699-3c0e-42ac-b63e-7df14da9703d-kube-api-access-rq2nq\") pod \"nova-cell0-conductor-0\" (UID: \"a1914699-3c0e-42ac-b63e-7df14da9703d\") " pod="openstack/nova-cell0-conductor-0"
Oct 03 14:28:49 crc kubenswrapper[4962]: I1003 14:28:49.089954 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1914699-3c0e-42ac-b63e-7df14da9703d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a1914699-3c0e-42ac-b63e-7df14da9703d\") " pod="openstack/nova-cell0-conductor-0"
Oct 03 14:28:49 crc kubenswrapper[4962]: I1003 14:28:49.090093 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1914699-3c0e-42ac-b63e-7df14da9703d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a1914699-3c0e-42ac-b63e-7df14da9703d\") " pod="openstack/nova-cell0-conductor-0"
Oct 03 14:28:49 crc kubenswrapper[4962]: I1003 14:28:49.095692 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1914699-3c0e-42ac-b63e-7df14da9703d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a1914699-3c0e-42ac-b63e-7df14da9703d\") " pod="openstack/nova-cell0-conductor-0"
Oct 03 14:28:49 crc kubenswrapper[4962]: I1003 14:28:49.117572 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rq2nq\" (UniqueName: \"kubernetes.io/projected/a1914699-3c0e-42ac-b63e-7df14da9703d-kube-api-access-rq2nq\") pod \"nova-cell0-conductor-0\" (UID: \"a1914699-3c0e-42ac-b63e-7df14da9703d\") " pod="openstack/nova-cell0-conductor-0"
Oct 03 14:28:49 crc kubenswrapper[4962]: I1003 14:28:49.120323 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1914699-3c0e-42ac-b63e-7df14da9703d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a1914699-3c0e-42ac-b63e-7df14da9703d\") " pod="openstack/nova-cell0-conductor-0"
Oct 03 14:28:49 crc kubenswrapper[4962]: I1003 14:28:49.202040 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Oct 03 14:28:49 crc kubenswrapper[4962]: I1003 14:28:49.341914 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="a0d11367-1fa5-472a-8046-6e1719195a2f" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.71:8775/\": read tcp 10.217.0.2:53302->10.217.1.71:8775: read: connection reset by peer"
Oct 03 14:28:49 crc kubenswrapper[4962]: I1003 14:28:49.341930 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="a0d11367-1fa5-472a-8046-6e1719195a2f" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.71:8775/\": read tcp 10.217.0.2:53316->10.217.1.71:8775: read: connection reset by peer"
Oct 03 14:28:49 crc kubenswrapper[4962]: I1003 14:28:49.505053 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Oct 03 14:28:49 crc kubenswrapper[4962]: I1003 14:28:49.505305 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="aed19d7b-0bc6-42cb-8c2c-8df3cb502c64" containerName="nova-cell1-conductor-conductor" containerID="cri-o://ff3c60e5e06c811e32286f5333fe38257dc28823c9a89015a06bfe3d2196ca93" gracePeriod=30
Oct 03 14:28:49 crc kubenswrapper[4962]: I1003 14:28:49.779004 4962 generic.go:334] "Generic (PLEG): container finished" podID="a0d11367-1fa5-472a-8046-6e1719195a2f" containerID="2a0a8a9f417b3e291db7d3ec635c7e770ebcb120600d244e09183e67a60b53ed" exitCode=0
Oct 03 14:28:49 crc kubenswrapper[4962]: I1003 14:28:49.779437 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a0d11367-1fa5-472a-8046-6e1719195a2f","Type":"ContainerDied","Data":"2a0a8a9f417b3e291db7d3ec635c7e770ebcb120600d244e09183e67a60b53ed"}
Oct 03 14:28:49 crc kubenswrapper[4962]: I1003 14:28:49.788314 4962 generic.go:334] "Generic (PLEG): container finished" podID="f01f82f4-f27f-4da3-83f7-ac88ca54d880" containerID="4b9d0922d2d307ae481566520c5ad7f9198f1b50f2a3e0ca42d2bd70b25ef450" exitCode=0
Oct 03 14:28:49 crc kubenswrapper[4962]: I1003 14:28:49.788376 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f01f82f4-f27f-4da3-83f7-ac88ca54d880","Type":"ContainerDied","Data":"4b9d0922d2d307ae481566520c5ad7f9198f1b50f2a3e0ca42d2bd70b25ef450"}
Oct 03 14:28:49 crc kubenswrapper[4962]: I1003 14:28:49.797606 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Oct 03 14:28:49 crc kubenswrapper[4962]: I1003 14:28:49.802162 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6c3dd24d-75e3-422a-ac9a-efeee6404f74","Type":"ContainerStarted","Data":"4124315dd212bca480021b98122208922db4297265f2067bd61c34dddbe0c5e6"}
Oct 03 14:28:49 crc kubenswrapper[4962]: I1003 14:28:49.832406 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.832384342 podStartE2EDuration="2.832384342s" podCreationTimestamp="2025-10-03 14:28:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:28:49.819531816 +0000 UTC m=+5938.223429651" watchObservedRunningTime="2025-10-03 14:28:49.832384342 +0000 UTC m=+5938.236282177"
Oct 03 14:28:49 crc kubenswrapper[4962]: I1003 14:28:49.985520 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 03 14:28:49 crc kubenswrapper[4962]: I1003 14:28:49.992552 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.034274 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfzzp\" (UniqueName: \"kubernetes.io/projected/a0d11367-1fa5-472a-8046-6e1719195a2f-kube-api-access-qfzzp\") pod \"a0d11367-1fa5-472a-8046-6e1719195a2f\" (UID: \"a0d11367-1fa5-472a-8046-6e1719195a2f\") "
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.034788 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0d11367-1fa5-472a-8046-6e1719195a2f-config-data\") pod \"a0d11367-1fa5-472a-8046-6e1719195a2f\" (UID: \"a0d11367-1fa5-472a-8046-6e1719195a2f\") "
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.034840 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0d11367-1fa5-472a-8046-6e1719195a2f-logs\") pod \"a0d11367-1fa5-472a-8046-6e1719195a2f\" (UID: \"a0d11367-1fa5-472a-8046-6e1719195a2f\") "
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.036620 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f01f82f4-f27f-4da3-83f7-ac88ca54d880-logs\") pod \"f01f82f4-f27f-4da3-83f7-ac88ca54d880\" (UID: \"f01f82f4-f27f-4da3-83f7-ac88ca54d880\") "
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.036682 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0d11367-1fa5-472a-8046-6e1719195a2f-combined-ca-bundle\") pod \"a0d11367-1fa5-472a-8046-6e1719195a2f\" (UID: \"a0d11367-1fa5-472a-8046-6e1719195a2f\") "
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.036764 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9chd\" (UniqueName: \"kubernetes.io/projected/f01f82f4-f27f-4da3-83f7-ac88ca54d880-kube-api-access-q9chd\") pod \"f01f82f4-f27f-4da3-83f7-ac88ca54d880\" (UID: \"f01f82f4-f27f-4da3-83f7-ac88ca54d880\") "
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.036784 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f01f82f4-f27f-4da3-83f7-ac88ca54d880-combined-ca-bundle\") pod \"f01f82f4-f27f-4da3-83f7-ac88ca54d880\" (UID: \"f01f82f4-f27f-4da3-83f7-ac88ca54d880\") "
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.036823 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f01f82f4-f27f-4da3-83f7-ac88ca54d880-config-data\") pod \"f01f82f4-f27f-4da3-83f7-ac88ca54d880\" (UID: \"f01f82f4-f27f-4da3-83f7-ac88ca54d880\") "
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.040748 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f01f82f4-f27f-4da3-83f7-ac88ca54d880-logs" (OuterVolumeSpecName: "logs") pod "f01f82f4-f27f-4da3-83f7-ac88ca54d880" (UID: "f01f82f4-f27f-4da3-83f7-ac88ca54d880"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.042247 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0d11367-1fa5-472a-8046-6e1719195a2f-logs" (OuterVolumeSpecName: "logs") pod "a0d11367-1fa5-472a-8046-6e1719195a2f" (UID: "a0d11367-1fa5-472a-8046-6e1719195a2f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.042952 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0d11367-1fa5-472a-8046-6e1719195a2f-logs\") on node \"crc\" DevicePath \"\""
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.042976 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f01f82f4-f27f-4da3-83f7-ac88ca54d880-logs\") on node \"crc\" DevicePath \"\""
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.079893 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f01f82f4-f27f-4da3-83f7-ac88ca54d880-kube-api-access-q9chd" (OuterVolumeSpecName: "kube-api-access-q9chd") pod "f01f82f4-f27f-4da3-83f7-ac88ca54d880" (UID: "f01f82f4-f27f-4da3-83f7-ac88ca54d880"). InnerVolumeSpecName "kube-api-access-q9chd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.085949 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0d11367-1fa5-472a-8046-6e1719195a2f-kube-api-access-qfzzp" (OuterVolumeSpecName: "kube-api-access-qfzzp") pod "a0d11367-1fa5-472a-8046-6e1719195a2f" (UID: "a0d11367-1fa5-472a-8046-6e1719195a2f"). InnerVolumeSpecName "kube-api-access-qfzzp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.088926 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f01f82f4-f27f-4da3-83f7-ac88ca54d880-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f01f82f4-f27f-4da3-83f7-ac88ca54d880" (UID: "f01f82f4-f27f-4da3-83f7-ac88ca54d880"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.115849 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0d11367-1fa5-472a-8046-6e1719195a2f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0d11367-1fa5-472a-8046-6e1719195a2f" (UID: "a0d11367-1fa5-472a-8046-6e1719195a2f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.120349 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f01f82f4-f27f-4da3-83f7-ac88ca54d880-config-data" (OuterVolumeSpecName: "config-data") pod "f01f82f4-f27f-4da3-83f7-ac88ca54d880" (UID: "f01f82f4-f27f-4da3-83f7-ac88ca54d880"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.120384 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0d11367-1fa5-472a-8046-6e1719195a2f-config-data" (OuterVolumeSpecName: "config-data") pod "a0d11367-1fa5-472a-8046-6e1719195a2f" (UID: "a0d11367-1fa5-472a-8046-6e1719195a2f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.154067 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0d11367-1fa5-472a-8046-6e1719195a2f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.154107 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f01f82f4-f27f-4da3-83f7-ac88ca54d880-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.154117 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9chd\" (UniqueName: \"kubernetes.io/projected/f01f82f4-f27f-4da3-83f7-ac88ca54d880-kube-api-access-q9chd\") on node \"crc\" DevicePath \"\""
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.154130 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f01f82f4-f27f-4da3-83f7-ac88ca54d880-config-data\") on node \"crc\" DevicePath \"\""
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.154138 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0d11367-1fa5-472a-8046-6e1719195a2f-config-data\") on node \"crc\" DevicePath \"\""
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.154147 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfzzp\" (UniqueName: \"kubernetes.io/projected/a0d11367-1fa5-472a-8046-6e1719195a2f-kube-api-access-qfzzp\") on node \"crc\" DevicePath \"\""
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.252008 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8361c760-4e3d-46d1-b1ad-73826855e693" path="/var/lib/kubelet/pods/8361c760-4e3d-46d1-b1ad-73826855e693/volumes"
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.642686 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.793203 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04b5ed01-2476-408b-b9d8-ff1ef2ad5923-config-data\") pod \"04b5ed01-2476-408b-b9d8-ff1ef2ad5923\" (UID: \"04b5ed01-2476-408b-b9d8-ff1ef2ad5923\") "
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.793339 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b5ed01-2476-408b-b9d8-ff1ef2ad5923-combined-ca-bundle\") pod \"04b5ed01-2476-408b-b9d8-ff1ef2ad5923\" (UID: \"04b5ed01-2476-408b-b9d8-ff1ef2ad5923\") "
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.793466 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pl298\" (UniqueName: \"kubernetes.io/projected/04b5ed01-2476-408b-b9d8-ff1ef2ad5923-kube-api-access-pl298\") pod \"04b5ed01-2476-408b-b9d8-ff1ef2ad5923\" (UID: \"04b5ed01-2476-408b-b9d8-ff1ef2ad5923\") "
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.803923 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04b5ed01-2476-408b-b9d8-ff1ef2ad5923-kube-api-access-pl298" (OuterVolumeSpecName: "kube-api-access-pl298") pod "04b5ed01-2476-408b-b9d8-ff1ef2ad5923" (UID: "04b5ed01-2476-408b-b9d8-ff1ef2ad5923"). InnerVolumeSpecName "kube-api-access-pl298". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.826078 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04b5ed01-2476-408b-b9d8-ff1ef2ad5923-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04b5ed01-2476-408b-b9d8-ff1ef2ad5923" (UID: "04b5ed01-2476-408b-b9d8-ff1ef2ad5923"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.826192 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a1914699-3c0e-42ac-b63e-7df14da9703d","Type":"ContainerStarted","Data":"b2e1039f431de5efa72d15902f72d51358d9fc7a01dbac732ff020f192b8a2c0"}
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.826233 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a1914699-3c0e-42ac-b63e-7df14da9703d","Type":"ContainerStarted","Data":"afc17e8e0d104c1310f64e6429c7f9cc022adbfe0a6f04067c548101e78cb0ce"}
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.826270 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.832980 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.833743 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a0d11367-1fa5-472a-8046-6e1719195a2f","Type":"ContainerDied","Data":"61a9e2665b94e240620984b209272020cebeebe94403a767a46406a817c19322"}
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.833773 4962 scope.go:117] "RemoveContainer" containerID="2a0a8a9f417b3e291db7d3ec635c7e770ebcb120600d244e09183e67a60b53ed"
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.836796 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04b5ed01-2476-408b-b9d8-ff1ef2ad5923-config-data" (OuterVolumeSpecName: "config-data") pod "04b5ed01-2476-408b-b9d8-ff1ef2ad5923" (UID: "04b5ed01-2476-408b-b9d8-ff1ef2ad5923"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.842552 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f01f82f4-f27f-4da3-83f7-ac88ca54d880","Type":"ContainerDied","Data":"b4dbb04ce8bcb292491db4ff9a8665903f8c37c05b245d78b6c6d5a839072985"}
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.842691 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.851580 4962 generic.go:334] "Generic (PLEG): container finished" podID="04b5ed01-2476-408b-b9d8-ff1ef2ad5923" containerID="80ae291d913375dbf692d74eacbfec5d2ee2fb4869fb8f13b79decf5019fde7e" exitCode=0
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.852480 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.857556 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"04b5ed01-2476-408b-b9d8-ff1ef2ad5923","Type":"ContainerDied","Data":"80ae291d913375dbf692d74eacbfec5d2ee2fb4869fb8f13b79decf5019fde7e"}
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.857590 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"04b5ed01-2476-408b-b9d8-ff1ef2ad5923","Type":"ContainerDied","Data":"295b352c921d3fe5c2a53877ac136cced22eb277485cbc910392670a2975cdc9"}
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.857929 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.857906067 podStartE2EDuration="2.857906067s" podCreationTimestamp="2025-10-03 14:28:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:28:50.850059641 +0000 UTC m=+5939.253957496" watchObservedRunningTime="2025-10-03 14:28:50.857906067 +0000 UTC m=+5939.261803922"
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.895095 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04b5ed01-2476-408b-b9d8-ff1ef2ad5923-config-data\") on node \"crc\" DevicePath \"\""
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.895130 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b5ed01-2476-408b-b9d8-ff1ef2ad5923-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.895144 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pl298\" (UniqueName: \"kubernetes.io/projected/04b5ed01-2476-408b-b9d8-ff1ef2ad5923-kube-api-access-pl298\") on node \"crc\" DevicePath \"\""
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.909777 4962 scope.go:117] "RemoveContainer" containerID="b46a9e8932e8f60d3bbcf46554828a868eeaa55aadf37646ed5908c2fe339b8d"
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.917107 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.942143 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.964125 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.976705 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Oct 03 14:28:50 crc kubenswrapper[4962]: E1003 14:28:50.977483 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0d11367-1fa5-472a-8046-6e1719195a2f" containerName="nova-metadata-log"
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.977556 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0d11367-1fa5-472a-8046-6e1719195a2f" containerName="nova-metadata-log"
Oct 03 14:28:50 crc kubenswrapper[4962]: E1003 14:28:50.977622 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04b5ed01-2476-408b-b9d8-ff1ef2ad5923" containerName="nova-scheduler-scheduler"
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.977699 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="04b5ed01-2476-408b-b9d8-ff1ef2ad5923" containerName="nova-scheduler-scheduler"
Oct 03 14:28:50 crc kubenswrapper[4962]: E1003 14:28:50.977759 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f01f82f4-f27f-4da3-83f7-ac88ca54d880" containerName="nova-api-log"
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.977950 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f01f82f4-f27f-4da3-83f7-ac88ca54d880" containerName="nova-api-log"
Oct 03 14:28:50 crc kubenswrapper[4962]: E1003 14:28:50.978034 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0d11367-1fa5-472a-8046-6e1719195a2f" containerName="nova-metadata-metadata"
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.978087 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0d11367-1fa5-472a-8046-6e1719195a2f" containerName="nova-metadata-metadata"
Oct 03 14:28:50 crc kubenswrapper[4962]: E1003 14:28:50.978144 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f01f82f4-f27f-4da3-83f7-ac88ca54d880" containerName="nova-api-api"
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.978194 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f01f82f4-f27f-4da3-83f7-ac88ca54d880" containerName="nova-api-api"
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.978441 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f01f82f4-f27f-4da3-83f7-ac88ca54d880" containerName="nova-api-log"
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.978580 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f01f82f4-f27f-4da3-83f7-ac88ca54d880" containerName="nova-api-api"
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.978690 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0d11367-1fa5-472a-8046-6e1719195a2f" containerName="nova-metadata-metadata"
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.978752 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0d11367-1fa5-472a-8046-6e1719195a2f" containerName="nova-metadata-log"
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.978812 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="04b5ed01-2476-408b-b9d8-ff1ef2ad5923" containerName="nova-scheduler-scheduler"
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.980065 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.977910 4962 scope.go:117] "RemoveContainer" containerID="4b9d0922d2d307ae481566520c5ad7f9198f1b50f2a3e0ca42d2bd70b25ef450"
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.985224 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Oct 03 14:28:50 crc kubenswrapper[4962]: I1003 14:28:50.989897 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.008045 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.023872 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.034994 4962 scope.go:117] "RemoveContainer" containerID="65da81c5418996928420f6548ad1d29bc95105e66db97ee2295873d32175a4d0"
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.036688 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.038886 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.041696 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.051456 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.063687 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.065825 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.069225 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.075105 4962 scope.go:117] "RemoveContainer" containerID="80ae291d913375dbf692d74eacbfec5d2ee2fb4869fb8f13b79decf5019fde7e"
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.092234 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.100521 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef39afc8-f9ff-4a08-8839-4271da194934-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ef39afc8-f9ff-4a08-8839-4271da194934\") " pod="openstack/nova-api-0"
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.100609 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef39afc8-f9ff-4a08-8839-4271da194934-logs\") pod \"nova-api-0\" (UID: \"ef39afc8-f9ff-4a08-8839-4271da194934\") " pod="openstack/nova-api-0"
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.100660 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef39afc8-f9ff-4a08-8839-4271da194934-config-data\") pod \"nova-api-0\" (UID: \"ef39afc8-f9ff-4a08-8839-4271da194934\") " pod="openstack/nova-api-0"
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.100688 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rskpw\" (UniqueName: \"kubernetes.io/projected/ef39afc8-f9ff-4a08-8839-4271da194934-kube-api-access-rskpw\") pod \"nova-api-0\" (UID: \"ef39afc8-f9ff-4a08-8839-4271da194934\") " pod="openstack/nova-api-0"
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.105691 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.111409 4962 scope.go:117] "RemoveContainer" containerID="80ae291d913375dbf692d74eacbfec5d2ee2fb4869fb8f13b79decf5019fde7e"
Oct 03 14:28:51 crc kubenswrapper[4962]: E1003 14:28:51.112029 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80ae291d913375dbf692d74eacbfec5d2ee2fb4869fb8f13b79decf5019fde7e\": container with ID starting with 80ae291d913375dbf692d74eacbfec5d2ee2fb4869fb8f13b79decf5019fde7e not found: ID does not exist" containerID="80ae291d913375dbf692d74eacbfec5d2ee2fb4869fb8f13b79decf5019fde7e"
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.112078 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80ae291d913375dbf692d74eacbfec5d2ee2fb4869fb8f13b79decf5019fde7e"} err="failed to get container status \"80ae291d913375dbf692d74eacbfec5d2ee2fb4869fb8f13b79decf5019fde7e\": rpc error: code = NotFound desc = could not find container \"80ae291d913375dbf692d74eacbfec5d2ee2fb4869fb8f13b79decf5019fde7e\": container with ID starting with 80ae291d913375dbf692d74eacbfec5d2ee2fb4869fb8f13b79decf5019fde7e not found: ID does not exist"
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.203264 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef39afc8-f9ff-4a08-8839-4271da194934-config-data\") pod \"nova-api-0\" (UID: \"ef39afc8-f9ff-4a08-8839-4271da194934\") " pod="openstack/nova-api-0"
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.203558 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rskpw\" (UniqueName: \"kubernetes.io/projected/ef39afc8-f9ff-4a08-8839-4271da194934-kube-api-access-rskpw\") pod \"nova-api-0\" (UID: \"ef39afc8-f9ff-4a08-8839-4271da194934\") " pod="openstack/nova-api-0"
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.203682 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a699aa9-c62b-4255-ac30-3d9e57fa9905-logs\") pod \"nova-metadata-0\" (UID: \"3a699aa9-c62b-4255-ac30-3d9e57fa9905\") " pod="openstack/nova-metadata-0"
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.203783 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a699aa9-c62b-4255-ac30-3d9e57fa9905-config-data\") pod \"nova-metadata-0\" (UID: \"3a699aa9-c62b-4255-ac30-3d9e57fa9905\") " pod="openstack/nova-metadata-0"
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.203939 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/002ef092-a0a8-4a68-92aa-36a23acb13ee-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"002ef092-a0a8-4a68-92aa-36a23acb13ee\") " pod="openstack/nova-scheduler-0"
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.204061 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brmkh\" (UniqueName: \"kubernetes.io/projected/002ef092-a0a8-4a68-92aa-36a23acb13ee-kube-api-access-brmkh\") pod \"nova-scheduler-0\" (UID: \"002ef092-a0a8-4a68-92aa-36a23acb13ee\") " pod="openstack/nova-scheduler-0"
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.204208 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a699aa9-c62b-4255-ac30-3d9e57fa9905-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3a699aa9-c62b-4255-ac30-3d9e57fa9905\") " pod="openstack/nova-metadata-0"
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.204307 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6c6t\" (UniqueName: \"kubernetes.io/projected/3a699aa9-c62b-4255-ac30-3d9e57fa9905-kube-api-access-f6c6t\") pod \"nova-metadata-0\" (UID: \"3a699aa9-c62b-4255-ac30-3d9e57fa9905\") " pod="openstack/nova-metadata-0"
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.204425 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/002ef092-a0a8-4a68-92aa-36a23acb13ee-config-data\") pod \"nova-scheduler-0\" (UID: \"002ef092-a0a8-4a68-92aa-36a23acb13ee\") " pod="openstack/nova-scheduler-0"
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.204531 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef39afc8-f9ff-4a08-8839-4271da194934-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ef39afc8-f9ff-4a08-8839-4271da194934\") " pod="openstack/nova-api-0"
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.204626 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef39afc8-f9ff-4a08-8839-4271da194934-logs\") pod \"nova-api-0\" (UID: \"ef39afc8-f9ff-4a08-8839-4271da194934\") " pod="openstack/nova-api-0"
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.205079 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef39afc8-f9ff-4a08-8839-4271da194934-logs\") pod \"nova-api-0\" (UID: \"ef39afc8-f9ff-4a08-8839-4271da194934\") " pod="openstack/nova-api-0"
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.207329 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef39afc8-f9ff-4a08-8839-4271da194934-config-data\") pod \"nova-api-0\" (UID: \"ef39afc8-f9ff-4a08-8839-4271da194934\") " pod="openstack/nova-api-0"
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.217385 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef39afc8-f9ff-4a08-8839-4271da194934-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ef39afc8-f9ff-4a08-8839-4271da194934\") " pod="openstack/nova-api-0"
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.219528 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rskpw\" (UniqueName: \"kubernetes.io/projected/ef39afc8-f9ff-4a08-8839-4271da194934-kube-api-access-rskpw\") pod \"nova-api-0\" (UID: \"ef39afc8-f9ff-4a08-8839-4271da194934\") " pod="openstack/nova-api-0"
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.306382 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a699aa9-c62b-4255-ac30-3d9e57fa9905-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3a699aa9-c62b-4255-ac30-3d9e57fa9905\") " pod="openstack/nova-metadata-0"
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.306428 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6c6t\" (UniqueName: \"kubernetes.io/projected/3a699aa9-c62b-4255-ac30-3d9e57fa9905-kube-api-access-f6c6t\") pod \"nova-metadata-0\" (UID: \"3a699aa9-c62b-4255-ac30-3d9e57fa9905\") " pod="openstack/nova-metadata-0"
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.307006 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/002ef092-a0a8-4a68-92aa-36a23acb13ee-config-data\") pod \"nova-scheduler-0\" (UID: \"002ef092-a0a8-4a68-92aa-36a23acb13ee\") " pod="openstack/nova-scheduler-0"
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.306841 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.307070 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a699aa9-c62b-4255-ac30-3d9e57fa9905-logs\") pod \"nova-metadata-0\" (UID: \"3a699aa9-c62b-4255-ac30-3d9e57fa9905\") " pod="openstack/nova-metadata-0"
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.307106 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a699aa9-c62b-4255-ac30-3d9e57fa9905-config-data\") pod \"nova-metadata-0\" (UID: \"3a699aa9-c62b-4255-ac30-3d9e57fa9905\") " pod="openstack/nova-metadata-0"
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.307157 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/002ef092-a0a8-4a68-92aa-36a23acb13ee-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"002ef092-a0a8-4a68-92aa-36a23acb13ee\") " pod="openstack/nova-scheduler-0"
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.307195 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brmkh\" (UniqueName: \"kubernetes.io/projected/002ef092-a0a8-4a68-92aa-36a23acb13ee-kube-api-access-brmkh\") pod \"nova-scheduler-0\" (UID: \"002ef092-a0a8-4a68-92aa-36a23acb13ee\") " pod="openstack/nova-scheduler-0"
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.307772 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a699aa9-c62b-4255-ac30-3d9e57fa9905-logs\") pod \"nova-metadata-0\" (UID: \"3a699aa9-c62b-4255-ac30-3d9e57fa9905\") " pod="openstack/nova-metadata-0"
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.311734 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a699aa9-c62b-4255-ac30-3d9e57fa9905-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3a699aa9-c62b-4255-ac30-3d9e57fa9905\") " pod="openstack/nova-metadata-0"
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.312279 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/002ef092-a0a8-4a68-92aa-36a23acb13ee-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"002ef092-a0a8-4a68-92aa-36a23acb13ee\") " pod="openstack/nova-scheduler-0"
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.313072 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a699aa9-c62b-4255-ac30-3d9e57fa9905-config-data\") pod \"nova-metadata-0\" (UID: \"3a699aa9-c62b-4255-ac30-3d9e57fa9905\") " pod="openstack/nova-metadata-0"
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.313704 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/002ef092-a0a8-4a68-92aa-36a23acb13ee-config-data\") pod \"nova-scheduler-0\" (UID: \"002ef092-a0a8-4a68-92aa-36a23acb13ee\") " pod="openstack/nova-scheduler-0"
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.332790 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brmkh\" (UniqueName: \"kubernetes.io/projected/002ef092-a0a8-4a68-92aa-36a23acb13ee-kube-api-access-brmkh\") pod \"nova-scheduler-0\" (UID: \"002ef092-a0a8-4a68-92aa-36a23acb13ee\") " pod="openstack/nova-scheduler-0"
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.338224 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6c6t\" (UniqueName: \"kubernetes.io/projected/3a699aa9-c62b-4255-ac30-3d9e57fa9905-kube-api-access-f6c6t\") pod \"nova-metadata-0\" (UID: \"3a699aa9-c62b-4255-ac30-3d9e57fa9905\") " pod="openstack/nova-metadata-0"
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.358701 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.394385 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.809921 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.884314 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ef39afc8-f9ff-4a08-8839-4271da194934","Type":"ContainerStarted","Data":"fcf8c987c0413a410368e38d5f5f85f5d8f7339c92d258220aaf56403c1f6b58"}
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.913580 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 03 14:28:51 crc kubenswrapper[4962]: W1003 14:28:51.919770 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a699aa9_c62b_4255_ac30_3d9e57fa9905.slice/crio-facbbb9e5ffc11550700530a7003316e6e52473baf98ac00331b40e19472ee1f WatchSource:0}: Error finding container facbbb9e5ffc11550700530a7003316e6e52473baf98ac00331b40e19472ee1f: Status 404 returned error can't find the container with id facbbb9e5ffc11550700530a7003316e6e52473baf98ac00331b40e19472ee1f
Oct 03 14:28:51 crc kubenswrapper[4962]: W1003 14:28:51.920769 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod002ef092_a0a8_4a68_92aa_36a23acb13ee.slice/crio-e631fd24499b10f266ff8bff5bee00efa0a7d564902a4e942efa9185178987c1 WatchSource:0}: Error finding container e631fd24499b10f266ff8bff5bee00efa0a7d564902a4e942efa9185178987c1: Status 404 returned error can't find the container with id e631fd24499b10f266ff8bff5bee00efa0a7d564902a4e942efa9185178987c1
Oct 03 14:28:51 crc kubenswrapper[4962]: I1003 14:28:51.922288 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 03 14:28:52 crc kubenswrapper[4962]: E1003 14:28:52.019392 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ff3c60e5e06c811e32286f5333fe38257dc28823c9a89015a06bfe3d2196ca93" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Oct 03 14:28:52 crc kubenswrapper[4962]: E1003 14:28:52.025219 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ff3c60e5e06c811e32286f5333fe38257dc28823c9a89015a06bfe3d2196ca93" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Oct 03 14:28:52 crc kubenswrapper[4962]: E1003 14:28:52.042185 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ff3c60e5e06c811e32286f5333fe38257dc28823c9a89015a06bfe3d2196ca93" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Oct 03 14:28:52 crc kubenswrapper[4962]: E1003 14:28:52.042259 4962 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="aed19d7b-0bc6-42cb-8c2c-8df3cb502c64" containerName="nova-cell1-conductor-conductor"
Oct 03 14:28:52 crc kubenswrapper[4962]: I1003 14:28:52.250264 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04b5ed01-2476-408b-b9d8-ff1ef2ad5923" path="/var/lib/kubelet/pods/04b5ed01-2476-408b-b9d8-ff1ef2ad5923/volumes"
Oct 03 14:28:52 crc kubenswrapper[4962]: I1003 14:28:52.251610 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0d11367-1fa5-472a-8046-6e1719195a2f" path="/var/lib/kubelet/pods/a0d11367-1fa5-472a-8046-6e1719195a2f/volumes"
Oct 03 14:28:52 crc kubenswrapper[4962]: I1003 14:28:52.252452 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f01f82f4-f27f-4da3-83f7-ac88ca54d880" path="/var/lib/kubelet/pods/f01f82f4-f27f-4da3-83f7-ac88ca54d880/volumes"
Oct 03 14:28:52 crc kubenswrapper[4962]: I1003 14:28:52.894230 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ef39afc8-f9ff-4a08-8839-4271da194934","Type":"ContainerStarted","Data":"14a5949c051d1942b8a1b294453a1910e868f406e522d19161b20e815d721cfe"}
Oct 03 14:28:52 crc kubenswrapper[4962]: I1003 14:28:52.895786 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ef39afc8-f9ff-4a08-8839-4271da194934","Type":"ContainerStarted","Data":"0feeeba1547f01e99a57fb545435d6aacab2ee310b8ee881529d62501c03242b"}
Oct 03 14:28:52 crc kubenswrapper[4962]: I1003 14:28:52.897695 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"002ef092-a0a8-4a68-92aa-36a23acb13ee","Type":"ContainerStarted","Data":"d2b6968a849fef211fc8097c06e1f2602f0abbadbb39fbcdb95b0265c352ea1b"}
Oct 03 14:28:52 crc kubenswrapper[4962]: I1003 14:28:52.897751 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"002ef092-a0a8-4a68-92aa-36a23acb13ee","Type":"ContainerStarted","Data":"e631fd24499b10f266ff8bff5bee00efa0a7d564902a4e942efa9185178987c1"}
Oct 03 14:28:52 crc kubenswrapper[4962]: I1003 14:28:52.900078 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3a699aa9-c62b-4255-ac30-3d9e57fa9905","Type":"ContainerStarted","Data":"18f4a1f6e708b5702e301e528388d791ddd766472c733b960b02f5c96570efd9"}
Oct 03 14:28:52 crc kubenswrapper[4962]: I1003 14:28:52.900139 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3a699aa9-c62b-4255-ac30-3d9e57fa9905","Type":"ContainerStarted","Data":"869d45fbee166f8028e8364d6c99683daf1a1e8ebb3e745800584c6d4dbe74b6"}
Oct 03 14:28:52 crc kubenswrapper[4962]: I1003 14:28:52.900151 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3a699aa9-c62b-4255-ac30-3d9e57fa9905","Type":"ContainerStarted","Data":"facbbb9e5ffc11550700530a7003316e6e52473baf98ac00331b40e19472ee1f"}
Oct 03 14:28:52 crc kubenswrapper[4962]: I1003 14:28:52.930668 4962
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.9306264520000003 podStartE2EDuration="2.930626452s" podCreationTimestamp="2025-10-03 14:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:28:52.921515464 +0000 UTC m=+5941.325413309" watchObservedRunningTime="2025-10-03 14:28:52.930626452 +0000 UTC m=+5941.334524287" Oct 03 14:28:52 crc kubenswrapper[4962]: I1003 14:28:52.959955 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.959936318 podStartE2EDuration="2.959936318s" podCreationTimestamp="2025-10-03 14:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:28:52.951727564 +0000 UTC m=+5941.355625399" watchObservedRunningTime="2025-10-03 14:28:52.959936318 +0000 UTC m=+5941.363834153" Oct 03 14:28:52 crc kubenswrapper[4962]: I1003 14:28:52.986493 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.986462992 podStartE2EDuration="2.986462992s" podCreationTimestamp="2025-10-03 14:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:28:52.980801104 +0000 UTC m=+5941.384698949" watchObservedRunningTime="2025-10-03 14:28:52.986462992 +0000 UTC m=+5941.390360827" Oct 03 14:28:53 crc kubenswrapper[4962]: I1003 14:28:53.167451 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:28:56 crc kubenswrapper[4962]: I1003 14:28:56.359766 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 14:28:56 crc kubenswrapper[4962]: I1003 14:28:56.360366 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 14:28:56 crc kubenswrapper[4962]: I1003 14:28:56.364699 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 03 14:28:56 crc kubenswrapper[4962]: I1003 14:28:56.397235 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 03 14:28:56 crc kubenswrapper[4962]: I1003 14:28:56.490618 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aed19d7b-0bc6-42cb-8c2c-8df3cb502c64-config-data\") pod \"aed19d7b-0bc6-42cb-8c2c-8df3cb502c64\" (UID: \"aed19d7b-0bc6-42cb-8c2c-8df3cb502c64\") " Oct 03 14:28:56 crc kubenswrapper[4962]: I1003 14:28:56.490784 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aed19d7b-0bc6-42cb-8c2c-8df3cb502c64-combined-ca-bundle\") pod \"aed19d7b-0bc6-42cb-8c2c-8df3cb502c64\" (UID: \"aed19d7b-0bc6-42cb-8c2c-8df3cb502c64\") " Oct 03 14:28:56 crc kubenswrapper[4962]: I1003 14:28:56.490852 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8p6v\" (UniqueName: \"kubernetes.io/projected/aed19d7b-0bc6-42cb-8c2c-8df3cb502c64-kube-api-access-f8p6v\") pod \"aed19d7b-0bc6-42cb-8c2c-8df3cb502c64\" (UID: \"aed19d7b-0bc6-42cb-8c2c-8df3cb502c64\") " Oct 03 14:28:56 crc kubenswrapper[4962]: I1003 14:28:56.498656 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aed19d7b-0bc6-42cb-8c2c-8df3cb502c64-kube-api-access-f8p6v" (OuterVolumeSpecName: "kube-api-access-f8p6v") pod "aed19d7b-0bc6-42cb-8c2c-8df3cb502c64" (UID: "aed19d7b-0bc6-42cb-8c2c-8df3cb502c64"). InnerVolumeSpecName "kube-api-access-f8p6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:28:56 crc kubenswrapper[4962]: I1003 14:28:56.516837 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aed19d7b-0bc6-42cb-8c2c-8df3cb502c64-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aed19d7b-0bc6-42cb-8c2c-8df3cb502c64" (UID: "aed19d7b-0bc6-42cb-8c2c-8df3cb502c64"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:28:56 crc kubenswrapper[4962]: I1003 14:28:56.525315 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aed19d7b-0bc6-42cb-8c2c-8df3cb502c64-config-data" (OuterVolumeSpecName: "config-data") pod "aed19d7b-0bc6-42cb-8c2c-8df3cb502c64" (UID: "aed19d7b-0bc6-42cb-8c2c-8df3cb502c64"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:28:56 crc kubenswrapper[4962]: I1003 14:28:56.593831 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aed19d7b-0bc6-42cb-8c2c-8df3cb502c64-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:28:56 crc kubenswrapper[4962]: I1003 14:28:56.593888 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8p6v\" (UniqueName: \"kubernetes.io/projected/aed19d7b-0bc6-42cb-8c2c-8df3cb502c64-kube-api-access-f8p6v\") on node \"crc\" DevicePath \"\"" Oct 03 14:28:56 crc kubenswrapper[4962]: I1003 14:28:56.593907 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aed19d7b-0bc6-42cb-8c2c-8df3cb502c64-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:28:56 crc kubenswrapper[4962]: I1003 14:28:56.937884 4962 generic.go:334] "Generic (PLEG): container finished" podID="aed19d7b-0bc6-42cb-8c2c-8df3cb502c64" containerID="ff3c60e5e06c811e32286f5333fe38257dc28823c9a89015a06bfe3d2196ca93" exitCode=0 Oct 03 14:28:56 crc kubenswrapper[4962]: I1003 14:28:56.937955 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 03 14:28:56 crc kubenswrapper[4962]: I1003 14:28:56.937957 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"aed19d7b-0bc6-42cb-8c2c-8df3cb502c64","Type":"ContainerDied","Data":"ff3c60e5e06c811e32286f5333fe38257dc28823c9a89015a06bfe3d2196ca93"} Oct 03 14:28:56 crc kubenswrapper[4962]: I1003 14:28:56.938474 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"aed19d7b-0bc6-42cb-8c2c-8df3cb502c64","Type":"ContainerDied","Data":"30bfbe37ae073421ddec9861ea9d381fc201ce17ffec4af74ecdb929b5ce6609"} Oct 03 14:28:56 crc kubenswrapper[4962]: I1003 14:28:56.938497 4962 scope.go:117] "RemoveContainer" containerID="ff3c60e5e06c811e32286f5333fe38257dc28823c9a89015a06bfe3d2196ca93" Oct 03 14:28:56 crc kubenswrapper[4962]: I1003 14:28:56.991806 4962 scope.go:117] "RemoveContainer" containerID="ff3c60e5e06c811e32286f5333fe38257dc28823c9a89015a06bfe3d2196ca93" Oct 03 14:28:56 crc kubenswrapper[4962]: E1003 14:28:56.995777 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff3c60e5e06c811e32286f5333fe38257dc28823c9a89015a06bfe3d2196ca93\": container with ID starting with ff3c60e5e06c811e32286f5333fe38257dc28823c9a89015a06bfe3d2196ca93 not found: ID does not exist" containerID="ff3c60e5e06c811e32286f5333fe38257dc28823c9a89015a06bfe3d2196ca93" Oct 03 14:28:56 crc kubenswrapper[4962]: I1003 14:28:56.995823 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff3c60e5e06c811e32286f5333fe38257dc28823c9a89015a06bfe3d2196ca93"} err="failed to get container status \"ff3c60e5e06c811e32286f5333fe38257dc28823c9a89015a06bfe3d2196ca93\": rpc error: code = NotFound desc = could not find container \"ff3c60e5e06c811e32286f5333fe38257dc28823c9a89015a06bfe3d2196ca93\": container with ID starting with ff3c60e5e06c811e32286f5333fe38257dc28823c9a89015a06bfe3d2196ca93 not found: ID does not exist" Oct 03 14:28:57 crc kubenswrapper[4962]: I1003 14:28:57.030689 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 14:28:57 crc kubenswrapper[4962]: I1003 
14:28:57.054715 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 14:28:57 crc kubenswrapper[4962]: I1003 14:28:57.083705 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 14:28:57 crc kubenswrapper[4962]: E1003 14:28:57.084168 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aed19d7b-0bc6-42cb-8c2c-8df3cb502c64" containerName="nova-cell1-conductor-conductor" Oct 03 14:28:57 crc kubenswrapper[4962]: I1003 14:28:57.084185 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="aed19d7b-0bc6-42cb-8c2c-8df3cb502c64" containerName="nova-cell1-conductor-conductor" Oct 03 14:28:57 crc kubenswrapper[4962]: I1003 14:28:57.084374 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="aed19d7b-0bc6-42cb-8c2c-8df3cb502c64" containerName="nova-cell1-conductor-conductor" Oct 03 14:28:57 crc kubenswrapper[4962]: I1003 14:28:57.085031 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 03 14:28:57 crc kubenswrapper[4962]: I1003 14:28:57.087359 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 03 14:28:57 crc kubenswrapper[4962]: I1003 14:28:57.094289 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 14:28:57 crc kubenswrapper[4962]: I1003 14:28:57.203944 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/690ab65f-bd9a-48d4-90f7-245ddeafc5da-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"690ab65f-bd9a-48d4-90f7-245ddeafc5da\") " pod="openstack/nova-cell1-conductor-0" Oct 03 14:28:57 crc kubenswrapper[4962]: I1003 14:28:57.204541 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/690ab65f-bd9a-48d4-90f7-245ddeafc5da-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"690ab65f-bd9a-48d4-90f7-245ddeafc5da\") " pod="openstack/nova-cell1-conductor-0" Oct 03 14:28:57 crc kubenswrapper[4962]: I1003 14:28:57.204620 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjll8\" (UniqueName: \"kubernetes.io/projected/690ab65f-bd9a-48d4-90f7-245ddeafc5da-kube-api-access-qjll8\") pod \"nova-cell1-conductor-0\" (UID: \"690ab65f-bd9a-48d4-90f7-245ddeafc5da\") " pod="openstack/nova-cell1-conductor-0" Oct 03 14:28:57 crc kubenswrapper[4962]: I1003 14:28:57.228284 4962 scope.go:117] "RemoveContainer" containerID="d7dae192729a16d2e0ed7a375c7ee4990d673d8d62f560c5335516db0b1730db" Oct 03 14:28:57 crc kubenswrapper[4962]: E1003 14:28:57.228479 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:28:57 crc kubenswrapper[4962]: I1003 14:28:57.306172 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjll8\" (UniqueName: 
\"kubernetes.io/projected/690ab65f-bd9a-48d4-90f7-245ddeafc5da-kube-api-access-qjll8\") pod \"nova-cell1-conductor-0\" (UID: \"690ab65f-bd9a-48d4-90f7-245ddeafc5da\") " pod="openstack/nova-cell1-conductor-0" Oct 03 14:28:57 crc kubenswrapper[4962]: I1003 14:28:57.306267 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/690ab65f-bd9a-48d4-90f7-245ddeafc5da-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"690ab65f-bd9a-48d4-90f7-245ddeafc5da\") " pod="openstack/nova-cell1-conductor-0" Oct 03 14:28:57 crc kubenswrapper[4962]: I1003 14:28:57.306380 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/690ab65f-bd9a-48d4-90f7-245ddeafc5da-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"690ab65f-bd9a-48d4-90f7-245ddeafc5da\") " pod="openstack/nova-cell1-conductor-0" Oct 03 14:28:57 crc kubenswrapper[4962]: I1003 14:28:57.311569 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/690ab65f-bd9a-48d4-90f7-245ddeafc5da-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"690ab65f-bd9a-48d4-90f7-245ddeafc5da\") " pod="openstack/nova-cell1-conductor-0" Oct 03 14:28:57 crc kubenswrapper[4962]: I1003 14:28:57.311575 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/690ab65f-bd9a-48d4-90f7-245ddeafc5da-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"690ab65f-bd9a-48d4-90f7-245ddeafc5da\") " pod="openstack/nova-cell1-conductor-0" Oct 03 14:28:57 crc kubenswrapper[4962]: I1003 14:28:57.325418 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjll8\" (UniqueName: \"kubernetes.io/projected/690ab65f-bd9a-48d4-90f7-245ddeafc5da-kube-api-access-qjll8\") pod \"nova-cell1-conductor-0\" (UID: \"690ab65f-bd9a-48d4-90f7-245ddeafc5da\") " pod="openstack/nova-cell1-conductor-0" Oct 03 14:28:57 crc kubenswrapper[4962]: I1003 14:28:57.421803 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 03 14:28:57 crc kubenswrapper[4962]: I1003 14:28:57.853659 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 14:28:57 crc kubenswrapper[4962]: I1003 14:28:57.951889 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"690ab65f-bd9a-48d4-90f7-245ddeafc5da","Type":"ContainerStarted","Data":"2de9bb718b19213ab98b2a0eaf6e2ed57dc815b4d1747f4d806eba4ba731b823"} Oct 03 14:28:58 crc kubenswrapper[4962]: I1003 14:28:58.167183 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:28:58 crc kubenswrapper[4962]: I1003 14:28:58.178604 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:28:58 crc kubenswrapper[4962]: I1003 14:28:58.241890 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aed19d7b-0bc6-42cb-8c2c-8df3cb502c64" path="/var/lib/kubelet/pods/aed19d7b-0bc6-42cb-8c2c-8df3cb502c64/volumes" Oct 03 14:28:58 crc kubenswrapper[4962]: I1003 14:28:58.964339 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"690ab65f-bd9a-48d4-90f7-245ddeafc5da","Type":"ContainerStarted","Data":"faeaa6489d94e7052ac9b89755ff652bdbd8814d12fbe34eb0fc3ea7bc8b94c0"} Oct 03 14:28:58 crc kubenswrapper[4962]: I1003 14:28:58.987365 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:28:58 crc kubenswrapper[4962]: I1003 14:28:58.994509 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=1.9944871549999998 podStartE2EDuration="1.994487155s" podCreationTimestamp="2025-10-03 14:28:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:28:58.982157253 +0000 UTC m=+5947.386055098" watchObservedRunningTime="2025-10-03 14:28:58.994487155 +0000 UTC m=+5947.398384990" Oct 03 14:28:59 crc kubenswrapper[4962]: I1003 14:28:59.237357 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 03 14:28:59 crc kubenswrapper[4962]: I1003 14:28:59.975610 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 03 14:29:01 crc kubenswrapper[4962]: I1003 14:29:01.307746 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 14:29:01 crc kubenswrapper[4962]: I1003 14:29:01.308103 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 14:29:01 crc kubenswrapper[4962]: I1003 14:29:01.360580 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 03 14:29:01 crc kubenswrapper[4962]: I1003 14:29:01.360682 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 03 14:29:01 crc kubenswrapper[4962]: I1003 14:29:01.395483 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 03 14:29:01 crc kubenswrapper[4962]: I1003 14:29:01.442939 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-scheduler-0" Oct 03 14:29:02 crc kubenswrapper[4962]: I1003 14:29:02.028060 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 03 14:29:02 crc kubenswrapper[4962]: I1003 14:29:02.349946 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ef39afc8-f9ff-4a08-8839-4271da194934" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.81:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 14:29:02 crc kubenswrapper[4962]: I1003 14:29:02.473847 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3a699aa9-c62b-4255-ac30-3d9e57fa9905" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.82:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 14:29:02 crc kubenswrapper[4962]: I1003 14:29:02.474149 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ef39afc8-f9ff-4a08-8839-4271da194934" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.81:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 14:29:02 crc kubenswrapper[4962]: I1003 14:29:02.474166 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3a699aa9-c62b-4255-ac30-3d9e57fa9905" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.82:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 14:29:05 crc kubenswrapper[4962]: I1003 14:29:05.553311 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 14:29:05 crc kubenswrapper[4962]: I1003 14:29:05.560377 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 14:29:05 crc kubenswrapper[4962]: I1003 14:29:05.562498 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 03 14:29:05 crc kubenswrapper[4962]: I1003 14:29:05.567493 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 14:29:05 crc kubenswrapper[4962]: I1003 14:29:05.692696 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3ed11faa-654c-4ea7-8f20-f4172d05cfb0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3ed11faa-654c-4ea7-8f20-f4172d05cfb0\") " pod="openstack/cinder-scheduler-0" Oct 03 14:29:05 crc kubenswrapper[4962]: I1003 14:29:05.692770 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ed11faa-654c-4ea7-8f20-f4172d05cfb0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3ed11faa-654c-4ea7-8f20-f4172d05cfb0\") " pod="openstack/cinder-scheduler-0" Oct 03 14:29:05 crc kubenswrapper[4962]: I1003 14:29:05.692917 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ed11faa-654c-4ea7-8f20-f4172d05cfb0-config-data\") pod \"cinder-scheduler-0\" (UID: \"3ed11faa-654c-4ea7-8f20-f4172d05cfb0\") " pod="openstack/cinder-scheduler-0" Oct 03 14:29:05 crc kubenswrapper[4962]: I1003 14:29:05.693100 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ed11faa-654c-4ea7-8f20-f4172d05cfb0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3ed11faa-654c-4ea7-8f20-f4172d05cfb0\") " pod="openstack/cinder-scheduler-0" Oct 03 14:29:05 crc kubenswrapper[4962]: I1003 14:29:05.693263 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhrgg\" (UniqueName: \"kubernetes.io/projected/3ed11faa-654c-4ea7-8f20-f4172d05cfb0-kube-api-access-mhrgg\") pod \"cinder-scheduler-0\" (UID: \"3ed11faa-654c-4ea7-8f20-f4172d05cfb0\") " pod="openstack/cinder-scheduler-0" Oct 03 14:29:05 crc kubenswrapper[4962]: I1003 14:29:05.693325 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ed11faa-654c-4ea7-8f20-f4172d05cfb0-scripts\") pod \"cinder-scheduler-0\" (UID: \"3ed11faa-654c-4ea7-8f20-f4172d05cfb0\") " pod="openstack/cinder-scheduler-0" Oct 03 14:29:05 crc kubenswrapper[4962]: I1003 14:29:05.794445 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ed11faa-654c-4ea7-8f20-f4172d05cfb0-scripts\") pod \"cinder-scheduler-0\" (UID: \"3ed11faa-654c-4ea7-8f20-f4172d05cfb0\") " pod="openstack/cinder-scheduler-0" Oct 03 14:29:05 crc kubenswrapper[4962]: I1003 14:29:05.794531 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3ed11faa-654c-4ea7-8f20-f4172d05cfb0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3ed11faa-654c-4ea7-8f20-f4172d05cfb0\") " pod="openstack/cinder-scheduler-0" Oct 03 14:29:05 crc kubenswrapper[4962]: I1003 14:29:05.794585 4962 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ed11faa-654c-4ea7-8f20-f4172d05cfb0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3ed11faa-654c-4ea7-8f20-f4172d05cfb0\") " pod="openstack/cinder-scheduler-0" Oct 03 14:29:05 crc kubenswrapper[4962]: I1003 14:29:05.794603 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ed11faa-654c-4ea7-8f20-f4172d05cfb0-config-data\") pod \"cinder-scheduler-0\" (UID: \"3ed11faa-654c-4ea7-8f20-f4172d05cfb0\") " pod="openstack/cinder-scheduler-0" Oct 03 14:29:05 crc kubenswrapper[4962]: I1003 14:29:05.794630 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3ed11faa-654c-4ea7-8f20-f4172d05cfb0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3ed11faa-654c-4ea7-8f20-f4172d05cfb0\") " pod="openstack/cinder-scheduler-0" Oct 03 14:29:05 crc kubenswrapper[4962]: I1003 14:29:05.794668 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ed11faa-654c-4ea7-8f20-f4172d05cfb0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3ed11faa-654c-4ea7-8f20-f4172d05cfb0\") " pod="openstack/cinder-scheduler-0" Oct 03 14:29:05 crc kubenswrapper[4962]: I1003 14:29:05.794911 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhrgg\" (UniqueName: \"kubernetes.io/projected/3ed11faa-654c-4ea7-8f20-f4172d05cfb0-kube-api-access-mhrgg\") pod \"cinder-scheduler-0\" (UID: \"3ed11faa-654c-4ea7-8f20-f4172d05cfb0\") " pod="openstack/cinder-scheduler-0" Oct 03 14:29:05 crc kubenswrapper[4962]: I1003 14:29:05.799947 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ed11faa-654c-4ea7-8f20-f4172d05cfb0-scripts\") pod \"cinder-scheduler-0\" (UID: \"3ed11faa-654c-4ea7-8f20-f4172d05cfb0\") " pod="openstack/cinder-scheduler-0" Oct 03 14:29:05 crc kubenswrapper[4962]: I1003 14:29:05.800527 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ed11faa-654c-4ea7-8f20-f4172d05cfb0-config-data\") pod \"cinder-scheduler-0\" (UID: \"3ed11faa-654c-4ea7-8f20-f4172d05cfb0\") " pod="openstack/cinder-scheduler-0" Oct 03 14:29:05 crc kubenswrapper[4962]: I1003 14:29:05.809454 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ed11faa-654c-4ea7-8f20-f4172d05cfb0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3ed11faa-654c-4ea7-8f20-f4172d05cfb0\") " pod="openstack/cinder-scheduler-0" Oct 03 14:29:05 crc kubenswrapper[4962]: I1003 14:29:05.809515 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ed11faa-654c-4ea7-8f20-f4172d05cfb0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3ed11faa-654c-4ea7-8f20-f4172d05cfb0\") " pod="openstack/cinder-scheduler-0" Oct 03 14:29:05 crc kubenswrapper[4962]: I1003 14:29:05.811827 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhrgg\" (UniqueName: \"kubernetes.io/projected/3ed11faa-654c-4ea7-8f20-f4172d05cfb0-kube-api-access-mhrgg\") pod \"cinder-scheduler-0\" (UID: \"3ed11faa-654c-4ea7-8f20-f4172d05cfb0\") " 
pod="openstack/cinder-scheduler-0" Oct 03 14:29:05 crc kubenswrapper[4962]: I1003 14:29:05.887772 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 14:29:06 crc kubenswrapper[4962]: I1003 14:29:06.316672 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 14:29:06 crc kubenswrapper[4962]: W1003 14:29:06.319030 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ed11faa_654c_4ea7_8f20_f4172d05cfb0.slice/crio-83f3daac14960de227b0e52882a1f5c6461208661c0cc8f3e74a0c33f0e13dfc WatchSource:0}: Error finding container 83f3daac14960de227b0e52882a1f5c6461208661c0cc8f3e74a0c33f0e13dfc: Status 404 returned error can't find the container with id 83f3daac14960de227b0e52882a1f5c6461208661c0cc8f3e74a0c33f0e13dfc Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.056294 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3ed11faa-654c-4ea7-8f20-f4172d05cfb0","Type":"ContainerStarted","Data":"55405c2badae6285d40770397254c4cbf7e84f334adee158d50ff6ea4d5f199b"} Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.056832 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3ed11faa-654c-4ea7-8f20-f4172d05cfb0","Type":"ContainerStarted","Data":"83f3daac14960de227b0e52882a1f5c6461208661c0cc8f3e74a0c33f0e13dfc"} Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.069040 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.069311 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="7b9af1df-7637-45ab-98ee-85241bf8826a" containerName="cinder-api-log" containerID="cri-o://d1b32c67649e71255aa65d4a73a58e17a73af88de15b90e4b50031f72ac03b0f" gracePeriod=30 Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.069392 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="7b9af1df-7637-45ab-98ee-85241bf8826a" containerName="cinder-api" containerID="cri-o://6254f9d3e13bcb639b73c43fe53353df4afebccf6434470fea00783caf514d9f" gracePeriod=30 Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.453402 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.492494 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.494560 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.498165 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.501126 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.629289 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fef1b0d4-c55c-4ee8-8e5a-e7da5696c273-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273\") " pod="openstack/cinder-volume-volume1-0" Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.629621 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fef1b0d4-c55c-4ee8-8e5a-e7da5696c273-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273\") " pod="openstack/cinder-volume-volume1-0" Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.629736 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fef1b0d4-c55c-4ee8-8e5a-e7da5696c273-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273\") " pod="openstack/cinder-volume-volume1-0" Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.629759 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/fef1b0d4-c55c-4ee8-8e5a-e7da5696c273-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273\") " pod="openstack/cinder-volume-volume1-0" Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.629779 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/fef1b0d4-c55c-4ee8-8e5a-e7da5696c273-dev\") pod \"cinder-volume-volume1-0\" (UID: \"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273\") " pod="openstack/cinder-volume-volume1-0" Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.629811 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fef1b0d4-c55c-4ee8-8e5a-e7da5696c273-sys\") pod \"cinder-volume-volume1-0\" (UID: \"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273\") " pod="openstack/cinder-volume-volume1-0" Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.629833 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmdhf\" (UniqueName: \"kubernetes.io/projected/fef1b0d4-c55c-4ee8-8e5a-e7da5696c273-kube-api-access-dmdhf\") pod \"cinder-volume-volume1-0\" (UID: \"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273\") " pod="openstack/cinder-volume-volume1-0" Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.629852 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fef1b0d4-c55c-4ee8-8e5a-e7da5696c273-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273\") " pod="openstack/cinder-volume-volume1-0" Oct 03 14:29:07 crc 
kubenswrapper[4962]: I1003 14:29:07.629867 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/fef1b0d4-c55c-4ee8-8e5a-e7da5696c273-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273\") " pod="openstack/cinder-volume-volume1-0" Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.629891 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fef1b0d4-c55c-4ee8-8e5a-e7da5696c273-run\") pod \"cinder-volume-volume1-0\" (UID: \"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273\") " pod="openstack/cinder-volume-volume1-0" Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.629961 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/fef1b0d4-c55c-4ee8-8e5a-e7da5696c273-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273\") " pod="openstack/cinder-volume-volume1-0" Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.629987 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/fef1b0d4-c55c-4ee8-8e5a-e7da5696c273-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273\") " pod="openstack/cinder-volume-volume1-0" Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.630009 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fef1b0d4-c55c-4ee8-8e5a-e7da5696c273-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273\") " pod="openstack/cinder-volume-volume1-0" Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.630028 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fef1b0d4-c55c-4ee8-8e5a-e7da5696c273-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273\") " pod="openstack/cinder-volume-volume1-0" Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.630047 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/fef1b0d4-c55c-4ee8-8e5a-e7da5696c273-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273\") " pod="openstack/cinder-volume-volume1-0" Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.630062 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fef1b0d4-c55c-4ee8-8e5a-e7da5696c273-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273\") " pod="openstack/cinder-volume-volume1-0" Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.731314 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fef1b0d4-c55c-4ee8-8e5a-e7da5696c273-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273\") " pod="openstack/cinder-volume-volume1-0" Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.731387 4962 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fef1b0d4-c55c-4ee8-8e5a-e7da5696c273-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273\") " pod="openstack/cinder-volume-volume1-0" Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.731423 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/fef1b0d4-c55c-4ee8-8e5a-e7da5696c273-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273\") " pod="openstack/cinder-volume-volume1-0" Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.731449 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fef1b0d4-c55c-4ee8-8e5a-e7da5696c273-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273\") " pod="openstack/cinder-volume-volume1-0" Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.731512 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fef1b0d4-c55c-4ee8-8e5a-e7da5696c273-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273\") " pod="openstack/cinder-volume-volume1-0" Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.731549 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fef1b0d4-c55c-4ee8-8e5a-e7da5696c273-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273\") " pod="openstack/cinder-volume-volume1-0" Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.731762 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fef1b0d4-c55c-4ee8-8e5a-e7da5696c273-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273\") " pod="openstack/cinder-volume-volume1-0" Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.731805 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/fef1b0d4-c55c-4ee8-8e5a-e7da5696c273-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273\") " pod="openstack/cinder-volume-volume1-0" Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.731833 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/fef1b0d4-c55c-4ee8-8e5a-e7da5696c273-dev\") pod \"cinder-volume-volume1-0\" (UID: \"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273\") " pod="openstack/cinder-volume-volume1-0" Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.731877 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fef1b0d4-c55c-4ee8-8e5a-e7da5696c273-sys\") pod \"cinder-volume-volume1-0\" (UID: \"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273\") " pod="openstack/cinder-volume-volume1-0" Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.731914 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmdhf\" (UniqueName: \"kubernetes.io/projected/fef1b0d4-c55c-4ee8-8e5a-e7da5696c273-kube-api-access-dmdhf\") pod 
\"cinder-volume-volume1-0\" (UID: \"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273\") " pod="openstack/cinder-volume-volume1-0" Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.731937 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fef1b0d4-c55c-4ee8-8e5a-e7da5696c273-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273\") " pod="openstack/cinder-volume-volume1-0" Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.731964 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/fef1b0d4-c55c-4ee8-8e5a-e7da5696c273-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273\") " pod="openstack/cinder-volume-volume1-0" Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.731999 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fef1b0d4-c55c-4ee8-8e5a-e7da5696c273-run\") pod \"cinder-volume-volume1-0\" (UID: \"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273\") " pod="openstack/cinder-volume-volume1-0" Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.732060 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/fef1b0d4-c55c-4ee8-8e5a-e7da5696c273-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273\") " pod="openstack/cinder-volume-volume1-0" Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.732103 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/fef1b0d4-c55c-4ee8-8e5a-e7da5696c273-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273\") " pod="openstack/cinder-volume-volume1-0" Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.732285 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/fef1b0d4-c55c-4ee8-8e5a-e7da5696c273-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273\") " pod="openstack/cinder-volume-volume1-0" Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.732345 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/fef1b0d4-c55c-4ee8-8e5a-e7da5696c273-dev\") pod \"cinder-volume-volume1-0\" (UID: \"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273\") " pod="openstack/cinder-volume-volume1-0" Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.732386 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fef1b0d4-c55c-4ee8-8e5a-e7da5696c273-sys\") pod \"cinder-volume-volume1-0\" (UID: \"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273\") " pod="openstack/cinder-volume-volume1-0" Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.732460 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/fef1b0d4-c55c-4ee8-8e5a-e7da5696c273-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273\") " pod="openstack/cinder-volume-volume1-0" Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.733183 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" 
(UniqueName: \"kubernetes.io/host-path/fef1b0d4-c55c-4ee8-8e5a-e7da5696c273-run\") pod \"cinder-volume-volume1-0\" (UID: \"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273\") " pod="openstack/cinder-volume-volume1-0" Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.733318 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/fef1b0d4-c55c-4ee8-8e5a-e7da5696c273-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273\") " pod="openstack/cinder-volume-volume1-0" Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.733371 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fef1b0d4-c55c-4ee8-8e5a-e7da5696c273-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273\") " pod="openstack/cinder-volume-volume1-0" Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.733403 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/fef1b0d4-c55c-4ee8-8e5a-e7da5696c273-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273\") " pod="openstack/cinder-volume-volume1-0" Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.733703 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fef1b0d4-c55c-4ee8-8e5a-e7da5696c273-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273\") " pod="openstack/cinder-volume-volume1-0" Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.750495 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fef1b0d4-c55c-4ee8-8e5a-e7da5696c273-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273\") " pod="openstack/cinder-volume-volume1-0" Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.751103 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fef1b0d4-c55c-4ee8-8e5a-e7da5696c273-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273\") " pod="openstack/cinder-volume-volume1-0" Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.752055 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fef1b0d4-c55c-4ee8-8e5a-e7da5696c273-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273\") " pod="openstack/cinder-volume-volume1-0" Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.752471 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fef1b0d4-c55c-4ee8-8e5a-e7da5696c273-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273\") " pod="openstack/cinder-volume-volume1-0" Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.752908 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fef1b0d4-c55c-4ee8-8e5a-e7da5696c273-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273\") " pod="openstack/cinder-volume-volume1-0" Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.759012 
4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/fef1b0d4-c55c-4ee8-8e5a-e7da5696c273-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273\") " pod="openstack/cinder-volume-volume1-0" Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.794542 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmdhf\" (UniqueName: \"kubernetes.io/projected/fef1b0d4-c55c-4ee8-8e5a-e7da5696c273-kube-api-access-dmdhf\") pod \"cinder-volume-volume1-0\" (UID: \"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273\") " pod="openstack/cinder-volume-volume1-0" Oct 03 14:29:07 crc kubenswrapper[4962]: I1003 14:29:07.817188 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.084461 4962 generic.go:334] "Generic (PLEG): container finished" podID="7b9af1df-7637-45ab-98ee-85241bf8826a" containerID="d1b32c67649e71255aa65d4a73a58e17a73af88de15b90e4b50031f72ac03b0f" exitCode=143 Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.085058 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7b9af1df-7637-45ab-98ee-85241bf8826a","Type":"ContainerDied","Data":"d1b32c67649e71255aa65d4a73a58e17a73af88de15b90e4b50031f72ac03b0f"} Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.090445 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3ed11faa-654c-4ea7-8f20-f4172d05cfb0","Type":"ContainerStarted","Data":"5bb4201e4893460715c0f5b405900da08f0188c1f2205a29141908983b0bc176"} Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.127253 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.127229802 podStartE2EDuration="3.127229802s" podCreationTimestamp="2025-10-03 14:29:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:29:08.115250388 +0000 UTC m=+5956.519148223" watchObservedRunningTime="2025-10-03 14:29:08.127229802 +0000 UTC m=+5956.531127637" Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.177430 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.179217 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0"
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.186063 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data"
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.197413 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"]
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.245926 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/95cd6540-175f-480c-98b9-4864c792528f-lib-modules\") pod \"cinder-backup-0\" (UID: \"95cd6540-175f-480c-98b9-4864c792528f\") " pod="openstack/cinder-backup-0"
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.245985 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/95cd6540-175f-480c-98b9-4864c792528f-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"95cd6540-175f-480c-98b9-4864c792528f\") " pod="openstack/cinder-backup-0"
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.246017 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/95cd6540-175f-480c-98b9-4864c792528f-etc-nvme\") pod \"cinder-backup-0\" (UID: \"95cd6540-175f-480c-98b9-4864c792528f\") " pod="openstack/cinder-backup-0"
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.246034 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/95cd6540-175f-480c-98b9-4864c792528f-ceph\") pod \"cinder-backup-0\" (UID: \"95cd6540-175f-480c-98b9-4864c792528f\") " pod="openstack/cinder-backup-0"
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.246074 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/95cd6540-175f-480c-98b9-4864c792528f-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"95cd6540-175f-480c-98b9-4864c792528f\") " pod="openstack/cinder-backup-0"
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.246091 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95cd6540-175f-480c-98b9-4864c792528f-scripts\") pod \"cinder-backup-0\" (UID: \"95cd6540-175f-480c-98b9-4864c792528f\") " pod="openstack/cinder-backup-0"
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.246119 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95cd6540-175f-480c-98b9-4864c792528f-config-data\") pod \"cinder-backup-0\" (UID: \"95cd6540-175f-480c-98b9-4864c792528f\") " pod="openstack/cinder-backup-0"
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.246147 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95cd6540-175f-480c-98b9-4864c792528f-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"95cd6540-175f-480c-98b9-4864c792528f\") " pod="openstack/cinder-backup-0"
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.246182 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x5lv\" (UniqueName: \"kubernetes.io/projected/95cd6540-175f-480c-98b9-4864c792528f-kube-api-access-7x5lv\") pod \"cinder-backup-0\" (UID: \"95cd6540-175f-480c-98b9-4864c792528f\") " pod="openstack/cinder-backup-0"
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.246201 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/95cd6540-175f-480c-98b9-4864c792528f-run\") pod \"cinder-backup-0\" (UID: \"95cd6540-175f-480c-98b9-4864c792528f\") " pod="openstack/cinder-backup-0"
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.246227 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/95cd6540-175f-480c-98b9-4864c792528f-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"95cd6540-175f-480c-98b9-4864c792528f\") " pod="openstack/cinder-backup-0"
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.246246 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/95cd6540-175f-480c-98b9-4864c792528f-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"95cd6540-175f-480c-98b9-4864c792528f\") " pod="openstack/cinder-backup-0"
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.246272 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/95cd6540-175f-480c-98b9-4864c792528f-dev\") pod \"cinder-backup-0\" (UID: \"95cd6540-175f-480c-98b9-4864c792528f\") " pod="openstack/cinder-backup-0"
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.246294 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/95cd6540-175f-480c-98b9-4864c792528f-config-data-custom\") pod \"cinder-backup-0\" (UID: \"95cd6540-175f-480c-98b9-4864c792528f\") " pod="openstack/cinder-backup-0"
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.246311 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/95cd6540-175f-480c-98b9-4864c792528f-sys\") pod \"cinder-backup-0\" (UID: \"95cd6540-175f-480c-98b9-4864c792528f\") " pod="openstack/cinder-backup-0"
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.246328 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/95cd6540-175f-480c-98b9-4864c792528f-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"95cd6540-175f-480c-98b9-4864c792528f\") " pod="openstack/cinder-backup-0"
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.347931 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/95cd6540-175f-480c-98b9-4864c792528f-ceph\") pod \"cinder-backup-0\" (UID: \"95cd6540-175f-480c-98b9-4864c792528f\") " pod="openstack/cinder-backup-0"
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.348018 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/95cd6540-175f-480c-98b9-4864c792528f-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"95cd6540-175f-480c-98b9-4864c792528f\") " pod="openstack/cinder-backup-0"
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.348047 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95cd6540-175f-480c-98b9-4864c792528f-scripts\") pod \"cinder-backup-0\" (UID: \"95cd6540-175f-480c-98b9-4864c792528f\") " pod="openstack/cinder-backup-0"
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.348091 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95cd6540-175f-480c-98b9-4864c792528f-config-data\") pod \"cinder-backup-0\" (UID: \"95cd6540-175f-480c-98b9-4864c792528f\") " pod="openstack/cinder-backup-0"
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.348126 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95cd6540-175f-480c-98b9-4864c792528f-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"95cd6540-175f-480c-98b9-4864c792528f\") " pod="openstack/cinder-backup-0"
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.348181 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7x5lv\" (UniqueName: \"kubernetes.io/projected/95cd6540-175f-480c-98b9-4864c792528f-kube-api-access-7x5lv\") pod \"cinder-backup-0\" (UID: \"95cd6540-175f-480c-98b9-4864c792528f\") " pod="openstack/cinder-backup-0"
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.348208 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/95cd6540-175f-480c-98b9-4864c792528f-run\") pod \"cinder-backup-0\" (UID: \"95cd6540-175f-480c-98b9-4864c792528f\") " pod="openstack/cinder-backup-0"
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.348247 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/95cd6540-175f-480c-98b9-4864c792528f-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"95cd6540-175f-480c-98b9-4864c792528f\") " pod="openstack/cinder-backup-0"
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.348275 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/95cd6540-175f-480c-98b9-4864c792528f-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"95cd6540-175f-480c-98b9-4864c792528f\") " pod="openstack/cinder-backup-0"
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.348312 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/95cd6540-175f-480c-98b9-4864c792528f-dev\") pod \"cinder-backup-0\" (UID: \"95cd6540-175f-480c-98b9-4864c792528f\") " pod="openstack/cinder-backup-0"
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.348335 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/95cd6540-175f-480c-98b9-4864c792528f-config-data-custom\") pod \"cinder-backup-0\" (UID: \"95cd6540-175f-480c-98b9-4864c792528f\") " pod="openstack/cinder-backup-0"
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.348359 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/95cd6540-175f-480c-98b9-4864c792528f-sys\") pod \"cinder-backup-0\" (UID: \"95cd6540-175f-480c-98b9-4864c792528f\") " pod="openstack/cinder-backup-0"
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.348379 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/95cd6540-175f-480c-98b9-4864c792528f-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"95cd6540-175f-480c-98b9-4864c792528f\") " pod="openstack/cinder-backup-0"
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.348420 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/95cd6540-175f-480c-98b9-4864c792528f-lib-modules\") pod \"cinder-backup-0\" (UID: \"95cd6540-175f-480c-98b9-4864c792528f\") " pod="openstack/cinder-backup-0"
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.348453 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/95cd6540-175f-480c-98b9-4864c792528f-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"95cd6540-175f-480c-98b9-4864c792528f\") " pod="openstack/cinder-backup-0"
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.348492 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/95cd6540-175f-480c-98b9-4864c792528f-etc-nvme\") pod \"cinder-backup-0\" (UID: \"95cd6540-175f-480c-98b9-4864c792528f\") " pod="openstack/cinder-backup-0"
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.348714 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/95cd6540-175f-480c-98b9-4864c792528f-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"95cd6540-175f-480c-98b9-4864c792528f\") " pod="openstack/cinder-backup-0"
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.348755 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/95cd6540-175f-480c-98b9-4864c792528f-etc-nvme\") pod \"cinder-backup-0\" (UID: \"95cd6540-175f-480c-98b9-4864c792528f\") " pod="openstack/cinder-backup-0"
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.348714 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/95cd6540-175f-480c-98b9-4864c792528f-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"95cd6540-175f-480c-98b9-4864c792528f\") " pod="openstack/cinder-backup-0"
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.348821 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/95cd6540-175f-480c-98b9-4864c792528f-dev\") pod \"cinder-backup-0\" (UID: \"95cd6540-175f-480c-98b9-4864c792528f\") " pod="openstack/cinder-backup-0"
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.349245 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/95cd6540-175f-480c-98b9-4864c792528f-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"95cd6540-175f-480c-98b9-4864c792528f\") " pod="openstack/cinder-backup-0"
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.349303 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/95cd6540-175f-480c-98b9-4864c792528f-sys\") pod \"cinder-backup-0\" (UID: \"95cd6540-175f-480c-98b9-4864c792528f\") " pod="openstack/cinder-backup-0"
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.349281 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/95cd6540-175f-480c-98b9-4864c792528f-run\") pod \"cinder-backup-0\" (UID: \"95cd6540-175f-480c-98b9-4864c792528f\") " pod="openstack/cinder-backup-0"
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.349353 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/95cd6540-175f-480c-98b9-4864c792528f-lib-modules\") pod \"cinder-backup-0\" (UID: \"95cd6540-175f-480c-98b9-4864c792528f\") " pod="openstack/cinder-backup-0"
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.349314 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/95cd6540-175f-480c-98b9-4864c792528f-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"95cd6540-175f-480c-98b9-4864c792528f\") " pod="openstack/cinder-backup-0"
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.349331 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/95cd6540-175f-480c-98b9-4864c792528f-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"95cd6540-175f-480c-98b9-4864c792528f\") " pod="openstack/cinder-backup-0"
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.358202 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/95cd6540-175f-480c-98b9-4864c792528f-config-data-custom\") pod \"cinder-backup-0\" (UID: \"95cd6540-175f-480c-98b9-4864c792528f\") " pod="openstack/cinder-backup-0"
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.362036 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95cd6540-175f-480c-98b9-4864c792528f-config-data\") pod \"cinder-backup-0\" (UID: \"95cd6540-175f-480c-98b9-4864c792528f\") " pod="openstack/cinder-backup-0"
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.368076 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95cd6540-175f-480c-98b9-4864c792528f-scripts\") pod \"cinder-backup-0\" (UID: \"95cd6540-175f-480c-98b9-4864c792528f\") " pod="openstack/cinder-backup-0"
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.368430 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/95cd6540-175f-480c-98b9-4864c792528f-ceph\") pod \"cinder-backup-0\" (UID: \"95cd6540-175f-480c-98b9-4864c792528f\") " pod="openstack/cinder-backup-0"
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.374297 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x5lv\" (UniqueName: \"kubernetes.io/projected/95cd6540-175f-480c-98b9-4864c792528f-kube-api-access-7x5lv\") pod \"cinder-backup-0\" (UID: \"95cd6540-175f-480c-98b9-4864c792528f\") " pod="openstack/cinder-backup-0"
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.374393 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95cd6540-175f-480c-98b9-4864c792528f-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"95cd6540-175f-480c-98b9-4864c792528f\") " pod="openstack/cinder-backup-0"
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.412873 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"]
Oct 03 14:29:08 crc kubenswrapper[4962]: W1003 14:29:08.414952 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfef1b0d4_c55c_4ee8_8e5a_e7da5696c273.slice/crio-78c54cd73984f4a936c7b4127a59d2c5e8800883ce74a8aad09dacd520fdd0f7 WatchSource:0}: Error finding container 78c54cd73984f4a936c7b4127a59d2c5e8800883ce74a8aad09dacd520fdd0f7: Status 404 returned error can't find the container with id 78c54cd73984f4a936c7b4127a59d2c5e8800883ce74a8aad09dacd520fdd0f7
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.418474 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 03 14:29:08 crc kubenswrapper[4962]: I1003 14:29:08.564964 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0"
Oct 03 14:29:09 crc kubenswrapper[4962]: I1003 14:29:09.099065 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"]
Oct 03 14:29:09 crc kubenswrapper[4962]: W1003 14:29:09.114839 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95cd6540_175f_480c_98b9_4864c792528f.slice/crio-a45ca9cca24d18820e0f207db439161fb7078189e66fee9b3c9b7bac1afd9b3d WatchSource:0}: Error finding container a45ca9cca24d18820e0f207db439161fb7078189e66fee9b3c9b7bac1afd9b3d: Status 404 returned error can't find the container with id a45ca9cca24d18820e0f207db439161fb7078189e66fee9b3c9b7bac1afd9b3d
Oct 03 14:29:09 crc kubenswrapper[4962]: I1003 14:29:09.127309 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273","Type":"ContainerStarted","Data":"78c54cd73984f4a936c7b4127a59d2c5e8800883ce74a8aad09dacd520fdd0f7"}
Oct 03 14:29:10 crc kubenswrapper[4962]: I1003 14:29:10.160716 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"95cd6540-175f-480c-98b9-4864c792528f","Type":"ContainerStarted","Data":"16a78d7e55f0fd1b2395d053d78632a30b62e807f41984f1297e2839e498402b"}
Oct 03 14:29:10 crc kubenswrapper[4962]: I1003 14:29:10.161300 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"95cd6540-175f-480c-98b9-4864c792528f","Type":"ContainerStarted","Data":"a45ca9cca24d18820e0f207db439161fb7078189e66fee9b3c9b7bac1afd9b3d"}
Oct 03 14:29:10 crc kubenswrapper[4962]: I1003 14:29:10.177111 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273","Type":"ContainerStarted","Data":"d602ee6e6910ea2c40df574c080817fcd54c9c37206bdfb36849f2988e4029b2"}
Oct 03 14:29:10 crc kubenswrapper[4962]: I1003 14:29:10.177148 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"fef1b0d4-c55c-4ee8-8e5a-e7da5696c273","Type":"ContainerStarted","Data":"321da016a972f06aa97305f66c2b33b7b5c92d362edd6b04a5a7d3a93a211500"}
Oct 03 14:29:10 crc kubenswrapper[4962]: I1003 14:29:10.216250 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=2.472461666 podStartE2EDuration="3.216227613s" podCreationTimestamp="2025-10-03 14:29:07 +0000 UTC" firstStartedPulling="2025-10-03 14:29:08.418287122 +0000 UTC m=+5956.822184957" lastFinishedPulling="2025-10-03 14:29:09.162053069 +0000 UTC m=+5957.565950904" observedRunningTime="2025-10-03 14:29:10.209200209 +0000 UTC m=+5958.613098064" watchObservedRunningTime="2025-10-03 14:29:10.216227613 +0000 UTC m=+5958.620125438"
Oct 03 14:29:10 crc kubenswrapper[4962]: I1003 14:29:10.227845 4962 scope.go:117] "RemoveContainer" containerID="d7dae192729a16d2e0ed7a375c7ee4990d673d8d62f560c5335516db0b1730db"
Oct 03 14:29:10 crc kubenswrapper[4962]: E1003 14:29:10.228278 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 14:29:10 crc kubenswrapper[4962]: I1003 14:29:10.241357 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="7b9af1df-7637-45ab-98ee-85241bf8826a" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.1.78:8776/healthcheck\": read tcp 10.217.0.2:42256->10.217.1.78:8776: read: connection reset by peer"
Oct 03 14:29:10 crc kubenswrapper[4962]: I1003 14:29:10.586281 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Oct 03 14:29:10 crc kubenswrapper[4962]: I1003 14:29:10.702387 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b9af1df-7637-45ab-98ee-85241bf8826a-config-data-custom\") pod \"7b9af1df-7637-45ab-98ee-85241bf8826a\" (UID: \"7b9af1df-7637-45ab-98ee-85241bf8826a\") "
Oct 03 14:29:10 crc kubenswrapper[4962]: I1003 14:29:10.702574 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b9af1df-7637-45ab-98ee-85241bf8826a-logs\") pod \"7b9af1df-7637-45ab-98ee-85241bf8826a\" (UID: \"7b9af1df-7637-45ab-98ee-85241bf8826a\") "
Oct 03 14:29:10 crc kubenswrapper[4962]: I1003 14:29:10.702613 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b9af1df-7637-45ab-98ee-85241bf8826a-config-data\") pod \"7b9af1df-7637-45ab-98ee-85241bf8826a\" (UID: \"7b9af1df-7637-45ab-98ee-85241bf8826a\") "
Oct 03 14:29:10 crc kubenswrapper[4962]: I1003 14:29:10.702626 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b9af1df-7637-45ab-98ee-85241bf8826a-scripts\") pod \"7b9af1df-7637-45ab-98ee-85241bf8826a\" (UID: \"7b9af1df-7637-45ab-98ee-85241bf8826a\") "
Oct 03 14:29:10 crc kubenswrapper[4962]: I1003 14:29:10.702653 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7b9af1df-7637-45ab-98ee-85241bf8826a-etc-machine-id\") pod \"7b9af1df-7637-45ab-98ee-85241bf8826a\" (UID: \"7b9af1df-7637-45ab-98ee-85241bf8826a\") "
Oct 03 14:29:10 crc kubenswrapper[4962]: I1003 14:29:10.702744 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmmcv\" (UniqueName: \"kubernetes.io/projected/7b9af1df-7637-45ab-98ee-85241bf8826a-kube-api-access-zmmcv\") pod \"7b9af1df-7637-45ab-98ee-85241bf8826a\" (UID: \"7b9af1df-7637-45ab-98ee-85241bf8826a\") "
Oct 03 14:29:10 crc kubenswrapper[4962]: I1003 14:29:10.702781 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b9af1df-7637-45ab-98ee-85241bf8826a-combined-ca-bundle\") pod \"7b9af1df-7637-45ab-98ee-85241bf8826a\" (UID: \"7b9af1df-7637-45ab-98ee-85241bf8826a\") "
Oct 03 14:29:10 crc kubenswrapper[4962]: I1003 14:29:10.703273 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7b9af1df-7637-45ab-98ee-85241bf8826a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7b9af1df-7637-45ab-98ee-85241bf8826a" (UID: "7b9af1df-7637-45ab-98ee-85241bf8826a"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 03 14:29:10 crc kubenswrapper[4962]: I1003 14:29:10.704543 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b9af1df-7637-45ab-98ee-85241bf8826a-logs" (OuterVolumeSpecName: "logs") pod "7b9af1df-7637-45ab-98ee-85241bf8826a" (UID: "7b9af1df-7637-45ab-98ee-85241bf8826a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:29:10 crc kubenswrapper[4962]: I1003 14:29:10.714485 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b9af1df-7637-45ab-98ee-85241bf8826a-kube-api-access-zmmcv" (OuterVolumeSpecName: "kube-api-access-zmmcv") pod "7b9af1df-7637-45ab-98ee-85241bf8826a" (UID: "7b9af1df-7637-45ab-98ee-85241bf8826a"). InnerVolumeSpecName "kube-api-access-zmmcv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:29:10 crc kubenswrapper[4962]: I1003 14:29:10.716203 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b9af1df-7637-45ab-98ee-85241bf8826a-scripts" (OuterVolumeSpecName: "scripts") pod "7b9af1df-7637-45ab-98ee-85241bf8826a" (UID: "7b9af1df-7637-45ab-98ee-85241bf8826a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:29:10 crc kubenswrapper[4962]: I1003 14:29:10.717855 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b9af1df-7637-45ab-98ee-85241bf8826a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7b9af1df-7637-45ab-98ee-85241bf8826a" (UID: "7b9af1df-7637-45ab-98ee-85241bf8826a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:29:10 crc kubenswrapper[4962]: I1003 14:29:10.738754 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b9af1df-7637-45ab-98ee-85241bf8826a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b9af1df-7637-45ab-98ee-85241bf8826a" (UID: "7b9af1df-7637-45ab-98ee-85241bf8826a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:29:10 crc kubenswrapper[4962]: I1003 14:29:10.754720 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b9af1df-7637-45ab-98ee-85241bf8826a-config-data" (OuterVolumeSpecName: "config-data") pod "7b9af1df-7637-45ab-98ee-85241bf8826a" (UID: "7b9af1df-7637-45ab-98ee-85241bf8826a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:29:10 crc kubenswrapper[4962]: I1003 14:29:10.804800 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b9af1df-7637-45ab-98ee-85241bf8826a-logs\") on node \"crc\" DevicePath \"\""
Oct 03 14:29:10 crc kubenswrapper[4962]: I1003 14:29:10.804838 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b9af1df-7637-45ab-98ee-85241bf8826a-config-data\") on node \"crc\" DevicePath \"\""
Oct 03 14:29:10 crc kubenswrapper[4962]: I1003 14:29:10.804852 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b9af1df-7637-45ab-98ee-85241bf8826a-scripts\") on node \"crc\" DevicePath \"\""
Oct 03 14:29:10 crc kubenswrapper[4962]: I1003 14:29:10.804862 4962 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7b9af1df-7637-45ab-98ee-85241bf8826a-etc-machine-id\") on node \"crc\" DevicePath \"\""
Oct 03 14:29:10 crc kubenswrapper[4962]: I1003 14:29:10.804876 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmmcv\" (UniqueName: \"kubernetes.io/projected/7b9af1df-7637-45ab-98ee-85241bf8826a-kube-api-access-zmmcv\") on node \"crc\" DevicePath \"\""
Oct 03 14:29:10 crc kubenswrapper[4962]: I1003 14:29:10.804889 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b9af1df-7637-45ab-98ee-85241bf8826a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 14:29:10 crc kubenswrapper[4962]: I1003 14:29:10.804902 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b9af1df-7637-45ab-98ee-85241bf8826a-config-data-custom\") on node \"crc\" DevicePath \"\""
Oct 03 14:29:10 crc kubenswrapper[4962]: I1003 14:29:10.888780 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.190089 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"95cd6540-175f-480c-98b9-4864c792528f","Type":"ContainerStarted","Data":"57a073b375ea8a156e44fc38c3f86e948dacfa99bec15ce93eeb1142df9d4005"}
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.193213 4962 generic.go:334] "Generic (PLEG): container finished" podID="7b9af1df-7637-45ab-98ee-85241bf8826a" containerID="6254f9d3e13bcb639b73c43fe53353df4afebccf6434470fea00783caf514d9f" exitCode=0
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.193341 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7b9af1df-7637-45ab-98ee-85241bf8826a","Type":"ContainerDied","Data":"6254f9d3e13bcb639b73c43fe53353df4afebccf6434470fea00783caf514d9f"}
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.193409 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7b9af1df-7637-45ab-98ee-85241bf8826a","Type":"ContainerDied","Data":"44da2cd601e3ecb3fea59c80335d18096753ed120bf40a0ba468df22f4b230a8"}
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.193429 4962 scope.go:117] "RemoveContainer" containerID="6254f9d3e13bcb639b73c43fe53353df4afebccf6434470fea00783caf514d9f"
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.193555 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.232107 4962 scope.go:117] "RemoveContainer" containerID="d1b32c67649e71255aa65d4a73a58e17a73af88de15b90e4b50031f72ac03b0f"
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.263628 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=2.51635921 podStartE2EDuration="3.263607609s" podCreationTimestamp="2025-10-03 14:29:08 +0000 UTC" firstStartedPulling="2025-10-03 14:29:09.120077531 +0000 UTC m=+5957.523975366" lastFinishedPulling="2025-10-03 14:29:09.86732593 +0000 UTC m=+5958.271223765" observedRunningTime="2025-10-03 14:29:11.211873796 +0000 UTC m=+5959.615771631" watchObservedRunningTime="2025-10-03 14:29:11.263607609 +0000 UTC m=+5959.667505444"
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.282646 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.282922 4962 scope.go:117] "RemoveContainer" containerID="6254f9d3e13bcb639b73c43fe53353df4afebccf6434470fea00783caf514d9f"
Oct 03 14:29:11 crc kubenswrapper[4962]: E1003 14:29:11.292815 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6254f9d3e13bcb639b73c43fe53353df4afebccf6434470fea00783caf514d9f\": container with ID starting with 6254f9d3e13bcb639b73c43fe53353df4afebccf6434470fea00783caf514d9f not found: ID does not exist" containerID="6254f9d3e13bcb639b73c43fe53353df4afebccf6434470fea00783caf514d9f"
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.293087 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6254f9d3e13bcb639b73c43fe53353df4afebccf6434470fea00783caf514d9f"} err="failed to get container status \"6254f9d3e13bcb639b73c43fe53353df4afebccf6434470fea00783caf514d9f\": rpc error: code = NotFound desc = could not find container \"6254f9d3e13bcb639b73c43fe53353df4afebccf6434470fea00783caf514d9f\": container with ID starting with 6254f9d3e13bcb639b73c43fe53353df4afebccf6434470fea00783caf514d9f not found: ID does not exist"
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.293197 4962 scope.go:117] "RemoveContainer" containerID="d1b32c67649e71255aa65d4a73a58e17a73af88de15b90e4b50031f72ac03b0f"
Oct 03 14:29:11 crc kubenswrapper[4962]: E1003 14:29:11.293700 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1b32c67649e71255aa65d4a73a58e17a73af88de15b90e4b50031f72ac03b0f\": container with ID starting with d1b32c67649e71255aa65d4a73a58e17a73af88de15b90e4b50031f72ac03b0f not found: ID does not exist" containerID="d1b32c67649e71255aa65d4a73a58e17a73af88de15b90e4b50031f72ac03b0f"
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.293803 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1b32c67649e71255aa65d4a73a58e17a73af88de15b90e4b50031f72ac03b0f"} err="failed to get container status \"d1b32c67649e71255aa65d4a73a58e17a73af88de15b90e4b50031f72ac03b0f\": rpc error: code = NotFound desc = could not find container \"d1b32c67649e71255aa65d4a73a58e17a73af88de15b90e4b50031f72ac03b0f\": container with ID starting with d1b32c67649e71255aa65d4a73a58e17a73af88de15b90e4b50031f72ac03b0f not found: ID does not exist"
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.296136 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.304693 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Oct 03 14:29:11 crc kubenswrapper[4962]: E1003 14:29:11.305483 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b9af1df-7637-45ab-98ee-85241bf8826a" containerName="cinder-api-log"
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.305504 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b9af1df-7637-45ab-98ee-85241bf8826a" containerName="cinder-api-log"
Oct 03 14:29:11 crc kubenswrapper[4962]: E1003 14:29:11.305631 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b9af1df-7637-45ab-98ee-85241bf8826a" containerName="cinder-api"
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.305663 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b9af1df-7637-45ab-98ee-85241bf8826a" containerName="cinder-api"
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.306387 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b9af1df-7637-45ab-98ee-85241bf8826a" containerName="cinder-api-log"
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.306598 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b9af1df-7637-45ab-98ee-85241bf8826a" containerName="cinder-api"
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.312561 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.313012 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.314438 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.314571 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.315645 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.315771 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.316075 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.327210 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.330757 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.361985 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.371942 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.372200 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.420529 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63a4bfb1-9ada-4b7f-9ebb-a77a028ec821-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"63a4bfb1-9ada-4b7f-9ebb-a77a028ec821\") " pod="openstack/cinder-api-0"
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.420623 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/63a4bfb1-9ada-4b7f-9ebb-a77a028ec821-etc-machine-id\") pod \"cinder-api-0\" (UID: \"63a4bfb1-9ada-4b7f-9ebb-a77a028ec821\") " pod="openstack/cinder-api-0"
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.420654 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63a4bfb1-9ada-4b7f-9ebb-a77a028ec821-scripts\") pod \"cinder-api-0\" (UID: \"63a4bfb1-9ada-4b7f-9ebb-a77a028ec821\") " pod="openstack/cinder-api-0"
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.420694 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63a4bfb1-9ada-4b7f-9ebb-a77a028ec821-config-data\") pod \"cinder-api-0\" (UID: \"63a4bfb1-9ada-4b7f-9ebb-a77a028ec821\") " pod="openstack/cinder-api-0"
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.420789 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63a4bfb1-9ada-4b7f-9ebb-a77a028ec821-logs\") pod \"cinder-api-0\" (UID: \"63a4bfb1-9ada-4b7f-9ebb-a77a028ec821\") " pod="openstack/cinder-api-0"
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.420837 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57s78\" (UniqueName: \"kubernetes.io/projected/63a4bfb1-9ada-4b7f-9ebb-a77a028ec821-kube-api-access-57s78\") pod \"cinder-api-0\" (UID: \"63a4bfb1-9ada-4b7f-9ebb-a77a028ec821\") " pod="openstack/cinder-api-0"
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.420861 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/63a4bfb1-9ada-4b7f-9ebb-a77a028ec821-config-data-custom\") pod \"cinder-api-0\" (UID: \"63a4bfb1-9ada-4b7f-9ebb-a77a028ec821\") " pod="openstack/cinder-api-0"
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.525093 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/63a4bfb1-9ada-4b7f-9ebb-a77a028ec821-etc-machine-id\") pod \"cinder-api-0\" (UID: \"63a4bfb1-9ada-4b7f-9ebb-a77a028ec821\") " pod="openstack/cinder-api-0"
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.525149 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63a4bfb1-9ada-4b7f-9ebb-a77a028ec821-scripts\") pod \"cinder-api-0\" (UID: \"63a4bfb1-9ada-4b7f-9ebb-a77a028ec821\") " pod="openstack/cinder-api-0"
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.525180 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63a4bfb1-9ada-4b7f-9ebb-a77a028ec821-config-data\") pod \"cinder-api-0\" (UID: \"63a4bfb1-9ada-4b7f-9ebb-a77a028ec821\") " pod="openstack/cinder-api-0"
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.525220 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63a4bfb1-9ada-4b7f-9ebb-a77a028ec821-logs\") pod \"cinder-api-0\" (UID: \"63a4bfb1-9ada-4b7f-9ebb-a77a028ec821\") " pod="openstack/cinder-api-0"
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.525261 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57s78\" (UniqueName: \"kubernetes.io/projected/63a4bfb1-9ada-4b7f-9ebb-a77a028ec821-kube-api-access-57s78\") pod \"cinder-api-0\" (UID: \"63a4bfb1-9ada-4b7f-9ebb-a77a028ec821\") " pod="openstack/cinder-api-0"
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.525288 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/63a4bfb1-9ada-4b7f-9ebb-a77a028ec821-config-data-custom\") pod \"cinder-api-0\" (UID: \"63a4bfb1-9ada-4b7f-9ebb-a77a028ec821\") " pod="openstack/cinder-api-0"
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.525316 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63a4bfb1-9ada-4b7f-9ebb-a77a028ec821-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"63a4bfb1-9ada-4b7f-9ebb-a77a028ec821\") " pod="openstack/cinder-api-0"
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.525684 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/63a4bfb1-9ada-4b7f-9ebb-a77a028ec821-etc-machine-id\") pod \"cinder-api-0\" (UID: \"63a4bfb1-9ada-4b7f-9ebb-a77a028ec821\") " pod="openstack/cinder-api-0"
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.526201 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63a4bfb1-9ada-4b7f-9ebb-a77a028ec821-logs\") pod \"cinder-api-0\" (UID: \"63a4bfb1-9ada-4b7f-9ebb-a77a028ec821\") " pod="openstack/cinder-api-0"
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.531415 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63a4bfb1-9ada-4b7f-9ebb-a77a028ec821-scripts\") pod \"cinder-api-0\" (UID: \"63a4bfb1-9ada-4b7f-9ebb-a77a028ec821\") " pod="openstack/cinder-api-0"
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.532123 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63a4bfb1-9ada-4b7f-9ebb-a77a028ec821-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"63a4bfb1-9ada-4b7f-9ebb-a77a028ec821\") " pod="openstack/cinder-api-0"
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.532605 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/63a4bfb1-9ada-4b7f-9ebb-a77a028ec821-config-data-custom\") pod \"cinder-api-0\" (UID: \"63a4bfb1-9ada-4b7f-9ebb-a77a028ec821\") " pod="openstack/cinder-api-0"
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.545802 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63a4bfb1-9ada-4b7f-9ebb-a77a028ec821-config-data\") pod \"cinder-api-0\" (UID: \"63a4bfb1-9ada-4b7f-9ebb-a77a028ec821\") " pod="openstack/cinder-api-0"
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.552115 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57s78\" (UniqueName: \"kubernetes.io/projected/63a4bfb1-9ada-4b7f-9ebb-a77a028ec821-kube-api-access-57s78\") pod \"cinder-api-0\" (UID: \"63a4bfb1-9ada-4b7f-9ebb-a77a028ec821\") " pod="openstack/cinder-api-0"
Oct 03 14:29:11 crc kubenswrapper[4962]: I1003 14:29:11.633092 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Oct 03 14:29:12 crc kubenswrapper[4962]: I1003 14:29:12.100200 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Oct 03 14:29:12 crc kubenswrapper[4962]: I1003 14:29:12.218332 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"63a4bfb1-9ada-4b7f-9ebb-a77a028ec821","Type":"ContainerStarted","Data":"4dc829fb9a9fb61143b27e3f033ee89191ea12c640eddd637384833c99bf5e1e"}
Oct 03 14:29:12 crc kubenswrapper[4962]: I1003 14:29:12.222506 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Oct 03 14:29:12 crc kubenswrapper[4962]: I1003 14:29:12.267324 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b9af1df-7637-45ab-98ee-85241bf8826a" path="/var/lib/kubelet/pods/7b9af1df-7637-45ab-98ee-85241bf8826a/volumes"
Oct 03 14:29:12 crc kubenswrapper[4962]: I1003 14:29:12.818070 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0"
Oct 03 14:29:13 crc kubenswrapper[4962]: I1003 14:29:13.233331 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"63a4bfb1-9ada-4b7f-9ebb-a77a028ec821","Type":"ContainerStarted","Data":"a9a44258ead926712f7c9f2b995ae7d706d4afabd2d267825dc01c5ac9e73015"}
Oct 03 14:29:13 crc kubenswrapper[4962]: I1003 14:29:13.565292 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0"
Oct 03 14:29:14 crc kubenswrapper[4962]: I1003 14:29:14.241557 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"63a4bfb1-9ada-4b7f-9ebb-a77a028ec821","Type":"ContainerStarted","Data":"66f253edc55db4a29287e5ae37e193b594435a13bde89990aeff2c2318608557"}
Oct 03 14:29:14 crc kubenswrapper[4962]: I1003 14:29:14.241920 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Oct 03 14:29:16 crc kubenswrapper[4962]: I1003 14:29:16.091057 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Oct 03 14:29:16 crc kubenswrapper[4962]: I1003 14:29:16.108916 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.10888717 podStartE2EDuration="5.10888717s" podCreationTimestamp="2025-10-03 14:29:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:29:14.258392264 +0000 UTC m=+5962.662290109" watchObservedRunningTime="2025-10-03 14:29:16.10888717 +0000 UTC m=+5964.512785045"
Oct 03 14:29:16 crc kubenswrapper[4962]: I1003 14:29:16.151793 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 03 14:29:16 crc kubenswrapper[4962]: I1003 14:29:16.261557 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="3ed11faa-654c-4ea7-8f20-f4172d05cfb0" containerName="cinder-scheduler" containerID="cri-o://55405c2badae6285d40770397254c4cbf7e84f334adee158d50ff6ea4d5f199b" gracePeriod=30
Oct 03 14:29:16 crc kubenswrapper[4962]: I1003 14:29:16.261608 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="3ed11faa-654c-4ea7-8f20-f4172d05cfb0" containerName="probe" containerID="cri-o://5bb4201e4893460715c0f5b405900da08f0188c1f2205a29141908983b0bc176" gracePeriod=30
Oct 03 14:29:17 crc kubenswrapper[4962]: I1003 14:29:17.273568 4962 generic.go:334] "Generic (PLEG): container finished" podID="3ed11faa-654c-4ea7-8f20-f4172d05cfb0" containerID="5bb4201e4893460715c0f5b405900da08f0188c1f2205a29141908983b0bc176" exitCode=0
Oct 03 14:29:17 crc kubenswrapper[4962]: I1003 14:29:17.273651 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3ed11faa-654c-4ea7-8f20-f4172d05cfb0","Type":"ContainerDied","Data":"5bb4201e4893460715c0f5b405900da08f0188c1f2205a29141908983b0bc176"}
Oct 03 14:29:18 crc kubenswrapper[4962]: I1003 14:29:18.022767 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0"
Oct 03 14:29:18 crc kubenswrapper[4962]: I1003 14:29:18.285983 4962 generic.go:334] "Generic (PLEG): container finished" podID="3ed11faa-654c-4ea7-8f20-f4172d05cfb0" containerID="55405c2badae6285d40770397254c4cbf7e84f334adee158d50ff6ea4d5f199b" exitCode=0
Oct 03 14:29:18 crc kubenswrapper[4962]: I1003 14:29:18.286075 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3ed11faa-654c-4ea7-8f20-f4172d05cfb0","Type":"ContainerDied","Data":"55405c2badae6285d40770397254c4cbf7e84f334adee158d50ff6ea4d5f199b"}
Oct 03 14:29:18 crc kubenswrapper[4962]: I1003 14:29:18.576880 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Oct 03 14:29:18 crc kubenswrapper[4962]: I1003 14:29:18.669193 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ed11faa-654c-4ea7-8f20-f4172d05cfb0-config-data-custom\") pod \"3ed11faa-654c-4ea7-8f20-f4172d05cfb0\" (UID: \"3ed11faa-654c-4ea7-8f20-f4172d05cfb0\") "
Oct 03 14:29:18 crc kubenswrapper[4962]: I1003 14:29:18.670424 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhrgg\" (UniqueName: \"kubernetes.io/projected/3ed11faa-654c-4ea7-8f20-f4172d05cfb0-kube-api-access-mhrgg\") pod \"3ed11faa-654c-4ea7-8f20-f4172d05cfb0\" (UID: \"3ed11faa-654c-4ea7-8f20-f4172d05cfb0\") "
Oct 03 14:29:18 crc kubenswrapper[4962]: I1003 14:29:18.670464 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ed11faa-654c-4ea7-8f20-f4172d05cfb0-combined-ca-bundle\") pod \"3ed11faa-654c-4ea7-8f20-f4172d05cfb0\" (UID: \"3ed11faa-654c-4ea7-8f20-f4172d05cfb0\") "
Oct 03 14:29:18 crc kubenswrapper[4962]: I1003 14:29:18.670583 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ed11faa-654c-4ea7-8f20-f4172d05cfb0-scripts\") pod \"3ed11faa-654c-4ea7-8f20-f4172d05cfb0\" (UID: \"3ed11faa-654c-4ea7-8f20-f4172d05cfb0\") "
Oct 03 14:29:18 crc kubenswrapper[4962]: I1003 14:29:18.670968 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ed11faa-654c-4ea7-8f20-f4172d05cfb0-config-data\") pod \"3ed11faa-654c-4ea7-8f20-f4172d05cfb0\" (UID: \"3ed11faa-654c-4ea7-8f20-f4172d05cfb0\") "
Oct 03 14:29:18 crc kubenswrapper[4962]: I1003 14:29:18.671012 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3ed11faa-654c-4ea7-8f20-f4172d05cfb0-etc-machine-id\") pod \"3ed11faa-654c-4ea7-8f20-f4172d05cfb0\" (UID: \"3ed11faa-654c-4ea7-8f20-f4172d05cfb0\") "
Oct 03 14:29:18 crc kubenswrapper[4962]: I1003 14:29:18.671611 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3ed11faa-654c-4ea7-8f20-f4172d05cfb0-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3ed11faa-654c-4ea7-8f20-f4172d05cfb0" (UID: "3ed11faa-654c-4ea7-8f20-f4172d05cfb0"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 03 14:29:18 crc kubenswrapper[4962]: I1003 14:29:18.681862 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ed11faa-654c-4ea7-8f20-f4172d05cfb0-scripts" (OuterVolumeSpecName: "scripts") pod "3ed11faa-654c-4ea7-8f20-f4172d05cfb0" (UID: "3ed11faa-654c-4ea7-8f20-f4172d05cfb0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:29:18 crc kubenswrapper[4962]: I1003 14:29:18.681948 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ed11faa-654c-4ea7-8f20-f4172d05cfb0-kube-api-access-mhrgg" (OuterVolumeSpecName: "kube-api-access-mhrgg") pod "3ed11faa-654c-4ea7-8f20-f4172d05cfb0" (UID: "3ed11faa-654c-4ea7-8f20-f4172d05cfb0"). InnerVolumeSpecName "kube-api-access-mhrgg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:29:18 crc kubenswrapper[4962]: I1003 14:29:18.681979 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ed11faa-654c-4ea7-8f20-f4172d05cfb0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3ed11faa-654c-4ea7-8f20-f4172d05cfb0" (UID: "3ed11faa-654c-4ea7-8f20-f4172d05cfb0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:29:18 crc kubenswrapper[4962]: I1003 14:29:18.724479 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ed11faa-654c-4ea7-8f20-f4172d05cfb0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ed11faa-654c-4ea7-8f20-f4172d05cfb0" (UID: "3ed11faa-654c-4ea7-8f20-f4172d05cfb0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:29:18 crc kubenswrapper[4962]: I1003 14:29:18.774417 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ed11faa-654c-4ea7-8f20-f4172d05cfb0-scripts\") on node \"crc\" DevicePath \"\""
Oct 03 14:29:18 crc kubenswrapper[4962]: I1003 14:29:18.774453 4962 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3ed11faa-654c-4ea7-8f20-f4172d05cfb0-etc-machine-id\") on node \"crc\" DevicePath \"\""
Oct 03 14:29:18 crc kubenswrapper[4962]: I1003 14:29:18.774464 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ed11faa-654c-4ea7-8f20-f4172d05cfb0-config-data-custom\") on node \"crc\" DevicePath \"\""
Oct 03 14:29:18 crc kubenswrapper[4962]: I1003 14:29:18.774474 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhrgg\" (UniqueName: \"kubernetes.io/projected/3ed11faa-654c-4ea7-8f20-f4172d05cfb0-kube-api-access-mhrgg\") on node \"crc\" DevicePath \"\""
Oct 03 14:29:18 crc kubenswrapper[4962]: I1003 14:29:18.774483 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ed11faa-654c-4ea7-8f20-f4172d05cfb0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 14:29:18 crc kubenswrapper[4962]: I1003 14:29:18.786136 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ed11faa-654c-4ea7-8f20-f4172d05cfb0-config-data" (OuterVolumeSpecName: "config-data") pod "3ed11faa-654c-4ea7-8f20-f4172d05cfb0" (UID: "3ed11faa-654c-4ea7-8f20-f4172d05cfb0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:29:18 crc kubenswrapper[4962]: I1003 14:29:18.812471 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0"
Oct 03 14:29:18 crc kubenswrapper[4962]: I1003 14:29:18.877464 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ed11faa-654c-4ea7-8f20-f4172d05cfb0-config-data\") on node \"crc\" DevicePath \"\""
Oct 03 14:29:19 crc kubenswrapper[4962]: I1003 14:29:19.296659 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3ed11faa-654c-4ea7-8f20-f4172d05cfb0","Type":"ContainerDied","Data":"83f3daac14960de227b0e52882a1f5c6461208661c0cc8f3e74a0c33f0e13dfc"}
Oct 03 14:29:19 crc kubenswrapper[4962]: I1003 14:29:19.296724 4962 scope.go:117] "RemoveContainer" containerID="5bb4201e4893460715c0f5b405900da08f0188c1f2205a29141908983b0bc176"
Oct 03 14:29:19 crc kubenswrapper[4962]: I1003 14:29:19.296760 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Oct 03 14:29:19 crc kubenswrapper[4962]: I1003 14:29:19.327890 4962 scope.go:117] "RemoveContainer" containerID="55405c2badae6285d40770397254c4cbf7e84f334adee158d50ff6ea4d5f199b"
Oct 03 14:29:19 crc kubenswrapper[4962]: I1003 14:29:19.333891 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 03 14:29:19 crc kubenswrapper[4962]: I1003 14:29:19.348900 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 03 14:29:19 crc kubenswrapper[4962]: I1003 14:29:19.375478 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 03 14:29:19 crc kubenswrapper[4962]: E1003 14:29:19.375976 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ed11faa-654c-4ea7-8f20-f4172d05cfb0" containerName="cinder-scheduler"
Oct 03 14:29:19 crc kubenswrapper[4962]: I1003 14:29:19.375991 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ed11faa-654c-4ea7-8f20-f4172d05cfb0" containerName="cinder-scheduler"
Oct 03 14:29:19 crc kubenswrapper[4962]: E1003 14:29:19.376009 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ed11faa-654c-4ea7-8f20-f4172d05cfb0" containerName="probe"
Oct 03 14:29:19 crc kubenswrapper[4962]: I1003 14:29:19.376015 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ed11faa-654c-4ea7-8f20-f4172d05cfb0" containerName="probe"
Oct 03 14:29:19 crc kubenswrapper[4962]: I1003 14:29:19.376213 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ed11faa-654c-4ea7-8f20-f4172d05cfb0" containerName="cinder-scheduler"
Oct 03 14:29:19 crc kubenswrapper[4962]: I1003 14:29:19.376228 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ed11faa-654c-4ea7-8f20-f4172d05cfb0" containerName="probe"
Oct 03 14:29:19 crc kubenswrapper[4962]: I1003 14:29:19.377261 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Oct 03 14:29:19 crc kubenswrapper[4962]: I1003 14:29:19.379969 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Oct 03 14:29:19 crc kubenswrapper[4962]: I1003 14:29:19.384522 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 03 14:29:19 crc kubenswrapper[4962]: I1003 14:29:19.494008 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74f65f90-9944-4bb5-a7dd-4f8fd8781be1-scripts\") pod \"cinder-scheduler-0\" (UID: \"74f65f90-9944-4bb5-a7dd-4f8fd8781be1\") " pod="openstack/cinder-scheduler-0"
Oct 03 14:29:19 crc kubenswrapper[4962]: I1003 14:29:19.494371 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74f65f90-9944-4bb5-a7dd-4f8fd8781be1-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"74f65f90-9944-4bb5-a7dd-4f8fd8781be1\") " pod="openstack/cinder-scheduler-0"
Oct 03 14:29:19 crc kubenswrapper[4962]: I1003 14:29:19.494459 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/74f65f90-9944-4bb5-a7dd-4f8fd8781be1-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"74f65f90-9944-4bb5-a7dd-4f8fd8781be1\") " pod="openstack/cinder-scheduler-0"
Oct 03 14:29:19 crc kubenswrapper[4962]: I1003 14:29:19.494584 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/74f65f90-9944-4bb5-a7dd-4f8fd8781be1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"74f65f90-9944-4bb5-a7dd-4f8fd8781be1\") " pod="openstack/cinder-scheduler-0"
Oct 03 14:29:19 crc kubenswrapper[4962]: I1003 14:29:19.494765 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fjq7\" (UniqueName: \"kubernetes.io/projected/74f65f90-9944-4bb5-a7dd-4f8fd8781be1-kube-api-access-2fjq7\") pod \"cinder-scheduler-0\" (UID: \"74f65f90-9944-4bb5-a7dd-4f8fd8781be1\") " pod="openstack/cinder-scheduler-0"
Oct 03 14:29:19 crc kubenswrapper[4962]: I1003 14:29:19.495092 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74f65f90-9944-4bb5-a7dd-4f8fd8781be1-config-data\") pod \"cinder-scheduler-0\" (UID: \"74f65f90-9944-4bb5-a7dd-4f8fd8781be1\") " pod="openstack/cinder-scheduler-0"
Oct 03 14:29:19 crc kubenswrapper[4962]: I1003 14:29:19.596761 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74f65f90-9944-4bb5-a7dd-4f8fd8781be1-config-data\") pod \"cinder-scheduler-0\" (UID: \"74f65f90-9944-4bb5-a7dd-4f8fd8781be1\") " pod="openstack/cinder-scheduler-0"
Oct 03 14:29:19 crc kubenswrapper[4962]: I1003 14:29:19.596848 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74f65f90-9944-4bb5-a7dd-4f8fd8781be1-scripts\") pod \"cinder-scheduler-0\" (UID: \"74f65f90-9944-4bb5-a7dd-4f8fd8781be1\") " pod="openstack/cinder-scheduler-0"
Oct 03 14:29:19 crc kubenswrapper[4962]: I1003 14:29:19.596898 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74f65f90-9944-4bb5-a7dd-4f8fd8781be1-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"74f65f90-9944-4bb5-a7dd-4f8fd8781be1\") " pod="openstack/cinder-scheduler-0"
Oct 03 14:29:19 crc kubenswrapper[4962]: I1003 14:29:19.596924 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/74f65f90-9944-4bb5-a7dd-4f8fd8781be1-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"74f65f90-9944-4bb5-a7dd-4f8fd8781be1\") " pod="openstack/cinder-scheduler-0"
Oct 03 14:29:19 crc kubenswrapper[4962]: I1003 14:29:19.596952 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/74f65f90-9944-4bb5-a7dd-4f8fd8781be1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"74f65f90-9944-4bb5-a7dd-4f8fd8781be1\") " pod="openstack/cinder-scheduler-0"
Oct 03 14:29:19 crc kubenswrapper[4962]: I1003 14:29:19.596976 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fjq7\" (UniqueName: \"kubernetes.io/projected/74f65f90-9944-4bb5-a7dd-4f8fd8781be1-kube-api-access-2fjq7\") pod \"cinder-scheduler-0\" (UID: \"74f65f90-9944-4bb5-a7dd-4f8fd8781be1\") " pod="openstack/cinder-scheduler-0"
Oct 03 14:29:19 crc kubenswrapper[4962]: I1003 14:29:19.597047 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/74f65f90-9944-4bb5-a7dd-4f8fd8781be1-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"74f65f90-9944-4bb5-a7dd-4f8fd8781be1\") " pod="openstack/cinder-scheduler-0"
Oct 03 14:29:19 crc kubenswrapper[4962]: I1003 14:29:19.601912 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74f65f90-9944-4bb5-a7dd-4f8fd8781be1-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"74f65f90-9944-4bb5-a7dd-4f8fd8781be1\") " pod="openstack/cinder-scheduler-0"
Oct 03 14:29:19 crc kubenswrapper[4962]: I1003 14:29:19.602222 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/74f65f90-9944-4bb5-a7dd-4f8fd8781be1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"74f65f90-9944-4bb5-a7dd-4f8fd8781be1\") " pod="openstack/cinder-scheduler-0"
Oct 03 14:29:19 crc kubenswrapper[4962]: I1003 14:29:19.610854 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74f65f90-9944-4bb5-a7dd-4f8fd8781be1-config-data\") pod \"cinder-scheduler-0\" (UID: \"74f65f90-9944-4bb5-a7dd-4f8fd8781be1\") " pod="openstack/cinder-scheduler-0"
Oct 03 14:29:19 crc kubenswrapper[4962]: I1003 14:29:19.611854 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74f65f90-9944-4bb5-a7dd-4f8fd8781be1-scripts\") pod \"cinder-scheduler-0\" (UID: \"74f65f90-9944-4bb5-a7dd-4f8fd8781be1\") " pod="openstack/cinder-scheduler-0"
Oct 03 14:29:19 crc kubenswrapper[4962]: I1003 14:29:19.619225 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fjq7\" (UniqueName: \"kubernetes.io/projected/74f65f90-9944-4bb5-a7dd-4f8fd8781be1-kube-api-access-2fjq7\") pod \"cinder-scheduler-0\" (UID: \"74f65f90-9944-4bb5-a7dd-4f8fd8781be1\") " pod="openstack/cinder-scheduler-0"
Oct 03 14:29:19
crc kubenswrapper[4962]: I1003 14:29:19.702191 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 14:29:20 crc kubenswrapper[4962]: I1003 14:29:20.149335 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 14:29:20 crc kubenswrapper[4962]: I1003 14:29:20.237914 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ed11faa-654c-4ea7-8f20-f4172d05cfb0" path="/var/lib/kubelet/pods/3ed11faa-654c-4ea7-8f20-f4172d05cfb0/volumes" Oct 03 14:29:20 crc kubenswrapper[4962]: I1003 14:29:20.319935 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"74f65f90-9944-4bb5-a7dd-4f8fd8781be1","Type":"ContainerStarted","Data":"297149d6efc487e7078b650f6981497cb5ad3e620f7b103a58d3e584446f46b8"} Oct 03 14:29:21 crc kubenswrapper[4962]: I1003 14:29:21.332028 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"74f65f90-9944-4bb5-a7dd-4f8fd8781be1","Type":"ContainerStarted","Data":"f691333483a4e16cc564fbbbf080805819d2859e3cd3c0593370c4d7b15402d9"} Oct 03 14:29:21 crc kubenswrapper[4962]: I1003 14:29:21.332330 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"74f65f90-9944-4bb5-a7dd-4f8fd8781be1","Type":"ContainerStarted","Data":"b81676c0c2ef5a2a51052bf9543e8b0a89b5c08292901cd83c1c7d94b0f66eb2"} Oct 03 14:29:21 crc kubenswrapper[4962]: I1003 14:29:21.365184 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.365161979 podStartE2EDuration="2.365161979s" podCreationTimestamp="2025-10-03 14:29:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:29:21.361908034 +0000 UTC m=+5969.765805879" watchObservedRunningTime="2025-10-03 14:29:21.365161979 +0000 UTC m=+5969.769059824" Oct 03 14:29:22 crc kubenswrapper[4962]: I1003 14:29:22.239287 4962 scope.go:117] "RemoveContainer" containerID="d7dae192729a16d2e0ed7a375c7ee4990d673d8d62f560c5335516db0b1730db" Oct 03 14:29:22 crc kubenswrapper[4962]: E1003 14:29:22.239870 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:29:23 crc kubenswrapper[4962]: I1003 14:29:23.408436 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 03 14:29:24 crc kubenswrapper[4962]: I1003 14:29:24.703268 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 03 14:29:29 crc kubenswrapper[4962]: I1003 14:29:29.890923 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 03 14:29:36 crc kubenswrapper[4962]: I1003 14:29:36.227087 4962 scope.go:117] "RemoveContainer" containerID="d7dae192729a16d2e0ed7a375c7ee4990d673d8d62f560c5335516db0b1730db" Oct 03 14:29:36 crc kubenswrapper[4962]: E1003 14:29:36.227593 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:29:47 crc kubenswrapper[4962]: I1003 14:29:47.227371 4962 scope.go:117] "RemoveContainer" containerID="d7dae192729a16d2e0ed7a375c7ee4990d673d8d62f560c5335516db0b1730db" Oct 03 14:29:47 crc kubenswrapper[4962]: E1003 14:29:47.228142 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:29:53 crc kubenswrapper[4962]: I1003 14:29:53.054138 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-wkp9s"] Oct 03 14:29:53 crc kubenswrapper[4962]: I1003 14:29:53.063531 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-wkp9s"] Oct 03 14:29:54 crc kubenswrapper[4962]: I1003 14:29:54.244040 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c239b20-5824-46cc-8500-27c9e8e69e82" path="/var/lib/kubelet/pods/6c239b20-5824-46cc-8500-27c9e8e69e82/volumes" Oct 03 14:29:58 crc kubenswrapper[4962]: I1003 14:29:58.228243 4962 scope.go:117] "RemoveContainer" containerID="d7dae192729a16d2e0ed7a375c7ee4990d673d8d62f560c5335516db0b1730db" Oct 03 14:29:58 crc kubenswrapper[4962]: E1003 14:29:58.229302 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:30:00 crc kubenswrapper[4962]: I1003 14:30:00.146181 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325030-2f6gc"] Oct 03 14:30:00 crc kubenswrapper[4962]: I1003 14:30:00.148282 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325030-2f6gc" Oct 03 14:30:00 crc kubenswrapper[4962]: I1003 14:30:00.150747 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 14:30:00 crc kubenswrapper[4962]: I1003 14:30:00.151318 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 14:30:00 crc kubenswrapper[4962]: I1003 14:30:00.167834 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325030-2f6gc"] Oct 03 14:30:00 crc kubenswrapper[4962]: I1003 14:30:00.330445 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvfjf\" (UniqueName: \"kubernetes.io/projected/da16c169-d7ba-493d-a542-e9e24ff3cef3-kube-api-access-tvfjf\") pod \"collect-profiles-29325030-2f6gc\" (UID: \"da16c169-d7ba-493d-a542-e9e24ff3cef3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325030-2f6gc" Oct 03 14:30:00 crc kubenswrapper[4962]: I1003 14:30:00.330562 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da16c169-d7ba-493d-a542-e9e24ff3cef3-secret-volume\") pod \"collect-profiles-29325030-2f6gc\" (UID: \"da16c169-d7ba-493d-a542-e9e24ff3cef3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325030-2f6gc" Oct 03 14:30:00 crc kubenswrapper[4962]: I1003 14:30:00.330631 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da16c169-d7ba-493d-a542-e9e24ff3cef3-config-volume\") pod \"collect-profiles-29325030-2f6gc\" (UID: \"da16c169-d7ba-493d-a542-e9e24ff3cef3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325030-2f6gc" Oct 03 14:30:00 crc kubenswrapper[4962]: I1003 14:30:00.432681 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvfjf\" (UniqueName: \"kubernetes.io/projected/da16c169-d7ba-493d-a542-e9e24ff3cef3-kube-api-access-tvfjf\") pod \"collect-profiles-29325030-2f6gc\" (UID: \"da16c169-d7ba-493d-a542-e9e24ff3cef3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325030-2f6gc" Oct 03 14:30:00 crc kubenswrapper[4962]: I1003 14:30:00.432743 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da16c169-d7ba-493d-a542-e9e24ff3cef3-secret-volume\") pod \"collect-profiles-29325030-2f6gc\" (UID: \"da16c169-d7ba-493d-a542-e9e24ff3cef3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325030-2f6gc" Oct 03 14:30:00 crc kubenswrapper[4962]: I1003 14:30:00.432765 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da16c169-d7ba-493d-a542-e9e24ff3cef3-config-volume\") pod \"collect-profiles-29325030-2f6gc\" (UID: \"da16c169-d7ba-493d-a542-e9e24ff3cef3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325030-2f6gc" Oct 03 14:30:00 crc kubenswrapper[4962]: I1003 14:30:00.433713 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da16c169-d7ba-493d-a542-e9e24ff3cef3-config-volume\") pod 
\"collect-profiles-29325030-2f6gc\" (UID: \"da16c169-d7ba-493d-a542-e9e24ff3cef3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325030-2f6gc" Oct 03 14:30:00 crc kubenswrapper[4962]: I1003 14:30:00.444730 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da16c169-d7ba-493d-a542-e9e24ff3cef3-secret-volume\") pod \"collect-profiles-29325030-2f6gc\" (UID: \"da16c169-d7ba-493d-a542-e9e24ff3cef3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325030-2f6gc" Oct 03 14:30:00 crc kubenswrapper[4962]: I1003 14:30:00.452537 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvfjf\" (UniqueName: \"kubernetes.io/projected/da16c169-d7ba-493d-a542-e9e24ff3cef3-kube-api-access-tvfjf\") pod \"collect-profiles-29325030-2f6gc\" (UID: \"da16c169-d7ba-493d-a542-e9e24ff3cef3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325030-2f6gc" Oct 03 14:30:00 crc kubenswrapper[4962]: I1003 14:30:00.489863 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325030-2f6gc" Oct 03 14:30:00 crc kubenswrapper[4962]: I1003 14:30:00.924935 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325030-2f6gc"] Oct 03 14:30:01 crc kubenswrapper[4962]: I1003 14:30:01.526561 4962 scope.go:117] "RemoveContainer" containerID="30afdada302e7ca15b11d4303e4a1291ebd138839adb0a8f18f2f330f894e843" Oct 03 14:30:01 crc kubenswrapper[4962]: I1003 14:30:01.729384 4962 generic.go:334] "Generic (PLEG): container finished" podID="da16c169-d7ba-493d-a542-e9e24ff3cef3" containerID="0cc1b252d65cdceacd6b6f8b6d2073ad7a974e705daf3c675ea92ed058a9bb4d" exitCode=0 Oct 03 14:30:01 crc kubenswrapper[4962]: I1003 14:30:01.729476 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325030-2f6gc" event={"ID":"da16c169-d7ba-493d-a542-e9e24ff3cef3","Type":"ContainerDied","Data":"0cc1b252d65cdceacd6b6f8b6d2073ad7a974e705daf3c675ea92ed058a9bb4d"} Oct 03 14:30:01 crc kubenswrapper[4962]: I1003 14:30:01.729752 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325030-2f6gc" event={"ID":"da16c169-d7ba-493d-a542-e9e24ff3cef3","Type":"ContainerStarted","Data":"ef665da499594b5a20248c7028151b2ae7ddc7c87b06a48274e8c604b08ef7e4"} Oct 03 14:30:03 crc kubenswrapper[4962]: I1003 14:30:03.030340 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-c495-account-create-92jpw"] Oct 03 14:30:03 crc kubenswrapper[4962]: I1003 14:30:03.042283 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-c495-account-create-92jpw"] Oct 03 14:30:03 crc kubenswrapper[4962]: I1003 14:30:03.089021 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325030-2f6gc" Oct 03 14:30:03 crc kubenswrapper[4962]: I1003 14:30:03.183522 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvfjf\" (UniqueName: \"kubernetes.io/projected/da16c169-d7ba-493d-a542-e9e24ff3cef3-kube-api-access-tvfjf\") pod \"da16c169-d7ba-493d-a542-e9e24ff3cef3\" (UID: \"da16c169-d7ba-493d-a542-e9e24ff3cef3\") " Oct 03 14:30:03 crc kubenswrapper[4962]: I1003 14:30:03.183711 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da16c169-d7ba-493d-a542-e9e24ff3cef3-config-volume\") pod \"da16c169-d7ba-493d-a542-e9e24ff3cef3\" (UID: \"da16c169-d7ba-493d-a542-e9e24ff3cef3\") " Oct 03 14:30:03 crc kubenswrapper[4962]: I1003 14:30:03.183849 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da16c169-d7ba-493d-a542-e9e24ff3cef3-secret-volume\") pod \"da16c169-d7ba-493d-a542-e9e24ff3cef3\" (UID: \"da16c169-d7ba-493d-a542-e9e24ff3cef3\") " Oct 03 14:30:03 crc kubenswrapper[4962]: I1003 14:30:03.184449 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da16c169-d7ba-493d-a542-e9e24ff3cef3-config-volume" (OuterVolumeSpecName: "config-volume") pod "da16c169-d7ba-493d-a542-e9e24ff3cef3" (UID: "da16c169-d7ba-493d-a542-e9e24ff3cef3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:30:03 crc kubenswrapper[4962]: I1003 14:30:03.188749 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da16c169-d7ba-493d-a542-e9e24ff3cef3-kube-api-access-tvfjf" (OuterVolumeSpecName: "kube-api-access-tvfjf") pod "da16c169-d7ba-493d-a542-e9e24ff3cef3" (UID: "da16c169-d7ba-493d-a542-e9e24ff3cef3"). InnerVolumeSpecName "kube-api-access-tvfjf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:30:03 crc kubenswrapper[4962]: I1003 14:30:03.189045 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da16c169-d7ba-493d-a542-e9e24ff3cef3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "da16c169-d7ba-493d-a542-e9e24ff3cef3" (UID: "da16c169-d7ba-493d-a542-e9e24ff3cef3"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:30:03 crc kubenswrapper[4962]: I1003 14:30:03.285531 4962 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da16c169-d7ba-493d-a542-e9e24ff3cef3-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 14:30:03 crc kubenswrapper[4962]: I1003 14:30:03.285568 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvfjf\" (UniqueName: \"kubernetes.io/projected/da16c169-d7ba-493d-a542-e9e24ff3cef3-kube-api-access-tvfjf\") on node \"crc\" DevicePath \"\"" Oct 03 14:30:03 crc kubenswrapper[4962]: I1003 14:30:03.285577 4962 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da16c169-d7ba-493d-a542-e9e24ff3cef3-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 14:30:03 crc kubenswrapper[4962]: I1003 14:30:03.750892 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325030-2f6gc" event={"ID":"da16c169-d7ba-493d-a542-e9e24ff3cef3","Type":"ContainerDied","Data":"ef665da499594b5a20248c7028151b2ae7ddc7c87b06a48274e8c604b08ef7e4"} Oct 03 14:30:03 crc kubenswrapper[4962]: I1003 14:30:03.750945 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef665da499594b5a20248c7028151b2ae7ddc7c87b06a48274e8c604b08ef7e4" Oct 03 14:30:03 crc kubenswrapper[4962]: I1003 14:30:03.750958 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325030-2f6gc" Oct 03 14:30:04 crc kubenswrapper[4962]: I1003 14:30:04.148501 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324985-hxxwm"] Oct 03 14:30:04 crc kubenswrapper[4962]: I1003 14:30:04.156445 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324985-hxxwm"] Oct 03 14:30:04 crc kubenswrapper[4962]: I1003 14:30:04.243725 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39da5925-e1cc-43f5-b67f-5e14f3909a45" path="/var/lib/kubelet/pods/39da5925-e1cc-43f5-b67f-5e14f3909a45/volumes" Oct 03 14:30:04 crc kubenswrapper[4962]: I1003 14:30:04.244367 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac2133e1-e19a-41ea-9a20-8653fb57c519" path="/var/lib/kubelet/pods/ac2133e1-e19a-41ea-9a20-8653fb57c519/volumes" Oct 03 14:30:10 crc kubenswrapper[4962]: I1003 14:30:10.031950 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-bhvfh"] Oct 03 14:30:10 crc kubenswrapper[4962]: I1003 14:30:10.042330 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-bhvfh"] Oct 03 14:30:10 crc kubenswrapper[4962]: I1003 14:30:10.240996 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7544721a-dd02-4e36-8660-402cb244510e" path="/var/lib/kubelet/pods/7544721a-dd02-4e36-8660-402cb244510e/volumes" Oct 03 14:30:13 crc kubenswrapper[4962]: I1003 14:30:13.227552 4962 scope.go:117] "RemoveContainer" containerID="d7dae192729a16d2e0ed7a375c7ee4990d673d8d62f560c5335516db0b1730db" Oct 03 14:30:13 crc kubenswrapper[4962]: E1003 14:30:13.228314 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:30:17 crc kubenswrapper[4962]: I1003 14:30:17.058184 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rwskh"] Oct 03 14:30:17 crc kubenswrapper[4962]: E1003 14:30:17.071691 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da16c169-d7ba-493d-a542-e9e24ff3cef3" containerName="collect-profiles" Oct 03 14:30:17 crc kubenswrapper[4962]: I1003 14:30:17.071748 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="da16c169-d7ba-493d-a542-e9e24ff3cef3" containerName="collect-profiles" Oct 03 14:30:17 crc kubenswrapper[4962]: I1003 14:30:17.074696 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="da16c169-d7ba-493d-a542-e9e24ff3cef3" containerName="collect-profiles" Oct 03 14:30:17 crc kubenswrapper[4962]: I1003 14:30:17.083482 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rwskh" Oct 03 14:30:17 crc kubenswrapper[4962]: I1003 14:30:17.091776 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rwskh"] Oct 03 14:30:17 crc kubenswrapper[4962]: I1003 14:30:17.194020 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5wkk\" (UniqueName: \"kubernetes.io/projected/ee101fbd-b51a-4f3d-92d5-f0e0338507e7-kube-api-access-n5wkk\") pod \"redhat-marketplace-rwskh\" (UID: \"ee101fbd-b51a-4f3d-92d5-f0e0338507e7\") " pod="openshift-marketplace/redhat-marketplace-rwskh" Oct 03 14:30:17 crc kubenswrapper[4962]: I1003 14:30:17.194119 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee101fbd-b51a-4f3d-92d5-f0e0338507e7-catalog-content\") pod \"redhat-marketplace-rwskh\" (UID: \"ee101fbd-b51a-4f3d-92d5-f0e0338507e7\") " pod="openshift-marketplace/redhat-marketplace-rwskh" Oct 03 14:30:17 crc kubenswrapper[4962]: I1003 14:30:17.194168 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee101fbd-b51a-4f3d-92d5-f0e0338507e7-utilities\") pod \"redhat-marketplace-rwskh\" (UID: \"ee101fbd-b51a-4f3d-92d5-f0e0338507e7\") " pod="openshift-marketplace/redhat-marketplace-rwskh" Oct 03 14:30:17 crc kubenswrapper[4962]: I1003 14:30:17.295719 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee101fbd-b51a-4f3d-92d5-f0e0338507e7-utilities\") pod \"redhat-marketplace-rwskh\" (UID: \"ee101fbd-b51a-4f3d-92d5-f0e0338507e7\") " pod="openshift-marketplace/redhat-marketplace-rwskh" Oct 03 14:30:17 crc kubenswrapper[4962]: I1003 14:30:17.295915 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5wkk\" (UniqueName: \"kubernetes.io/projected/ee101fbd-b51a-4f3d-92d5-f0e0338507e7-kube-api-access-n5wkk\") pod \"redhat-marketplace-rwskh\" (UID: \"ee101fbd-b51a-4f3d-92d5-f0e0338507e7\") " pod="openshift-marketplace/redhat-marketplace-rwskh" Oct 03 14:30:17 crc kubenswrapper[4962]: I1003 14:30:17.295989 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/ee101fbd-b51a-4f3d-92d5-f0e0338507e7-catalog-content\") pod \"redhat-marketplace-rwskh\" (UID: \"ee101fbd-b51a-4f3d-92d5-f0e0338507e7\") " pod="openshift-marketplace/redhat-marketplace-rwskh" Oct 03 14:30:17 crc kubenswrapper[4962]: I1003 14:30:17.296495 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee101fbd-b51a-4f3d-92d5-f0e0338507e7-catalog-content\") pod \"redhat-marketplace-rwskh\" (UID: \"ee101fbd-b51a-4f3d-92d5-f0e0338507e7\") " pod="openshift-marketplace/redhat-marketplace-rwskh" Oct 03 14:30:17 crc kubenswrapper[4962]: I1003 14:30:17.296511 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee101fbd-b51a-4f3d-92d5-f0e0338507e7-utilities\") pod \"redhat-marketplace-rwskh\" (UID: \"ee101fbd-b51a-4f3d-92d5-f0e0338507e7\") " pod="openshift-marketplace/redhat-marketplace-rwskh" Oct 03 14:30:17 crc kubenswrapper[4962]: I1003 14:30:17.316517 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5wkk\" (UniqueName: \"kubernetes.io/projected/ee101fbd-b51a-4f3d-92d5-f0e0338507e7-kube-api-access-n5wkk\") pod \"redhat-marketplace-rwskh\" (UID: \"ee101fbd-b51a-4f3d-92d5-f0e0338507e7\") " pod="openshift-marketplace/redhat-marketplace-rwskh" Oct 03 14:30:17 crc kubenswrapper[4962]: I1003 14:30:17.414392 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rwskh" Oct 03 14:30:17 crc kubenswrapper[4962]: I1003 14:30:17.929249 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rwskh"] Oct 03 14:30:18 crc kubenswrapper[4962]: I1003 14:30:18.899511 4962 generic.go:334] "Generic (PLEG): container finished" podID="ee101fbd-b51a-4f3d-92d5-f0e0338507e7" containerID="058a9ee808904d4ed8a3844d23fa27bd53dfc171bd077b152701669d09d702da" exitCode=0 Oct 03 14:30:18 crc kubenswrapper[4962]: I1003 14:30:18.899592 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rwskh" event={"ID":"ee101fbd-b51a-4f3d-92d5-f0e0338507e7","Type":"ContainerDied","Data":"058a9ee808904d4ed8a3844d23fa27bd53dfc171bd077b152701669d09d702da"} Oct 03 14:30:18 crc kubenswrapper[4962]: I1003 14:30:18.900039 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rwskh" event={"ID":"ee101fbd-b51a-4f3d-92d5-f0e0338507e7","Type":"ContainerStarted","Data":"1f0ee1cec4eddb02fe4e381ba36e1965a20a927a66756ab2503decf11a9678da"} Oct 03 14:30:20 crc kubenswrapper[4962]: I1003 14:30:20.926803 4962 generic.go:334] "Generic (PLEG): container finished" podID="ee101fbd-b51a-4f3d-92d5-f0e0338507e7" containerID="762439cc23951b020849cbd06dbd2557f33c9e6138cd2bd5ad6c96de06a8bf42" exitCode=0 Oct 03 14:30:20 crc kubenswrapper[4962]: I1003 14:30:20.926880 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rwskh" event={"ID":"ee101fbd-b51a-4f3d-92d5-f0e0338507e7","Type":"ContainerDied","Data":"762439cc23951b020849cbd06dbd2557f33c9e6138cd2bd5ad6c96de06a8bf42"} Oct 03 14:30:21 crc kubenswrapper[4962]: I1003 14:30:21.937621 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rwskh" event={"ID":"ee101fbd-b51a-4f3d-92d5-f0e0338507e7","Type":"ContainerStarted","Data":"8e6aa45527c588d5eac441f346e29f3eb8df8bb98b0205d39b79363f84cfcd67"} 
Oct 03 14:30:21 crc kubenswrapper[4962]: I1003 14:30:21.960257 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rwskh" podStartSLOduration=2.3314690479999998 podStartE2EDuration="4.960238638s" podCreationTimestamp="2025-10-03 14:30:17 +0000 UTC" firstStartedPulling="2025-10-03 14:30:18.901741854 +0000 UTC m=+6027.305639689" lastFinishedPulling="2025-10-03 14:30:21.530511444 +0000 UTC m=+6029.934409279" observedRunningTime="2025-10-03 14:30:21.952684556 +0000 UTC m=+6030.356582411" watchObservedRunningTime="2025-10-03 14:30:21.960238638 +0000 UTC m=+6030.364136473" Oct 03 14:30:23 crc kubenswrapper[4962]: I1003 14:30:23.052876 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-gz4fx"] Oct 03 14:30:23 crc kubenswrapper[4962]: I1003 14:30:23.060995 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-gz4fx"] Oct 03 14:30:24 crc kubenswrapper[4962]: I1003 14:30:24.239965 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a67e74ba-efcb-4f16-930f-57335376321f" path="/var/lib/kubelet/pods/a67e74ba-efcb-4f16-930f-57335376321f/volumes" Oct 03 14:30:27 crc kubenswrapper[4962]: I1003 14:30:27.415227 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rwskh" Oct 03 14:30:27 crc kubenswrapper[4962]: I1003 14:30:27.415514 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rwskh" Oct 03 14:30:27 crc kubenswrapper[4962]: I1003 14:30:27.460177 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rwskh" Oct 03 14:30:28 crc kubenswrapper[4962]: I1003 14:30:28.021519 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rwskh" Oct 03 14:30:28 crc kubenswrapper[4962]: I1003 14:30:28.067948 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rwskh"] Oct 03 14:30:28 crc kubenswrapper[4962]: I1003 14:30:28.227010 4962 scope.go:117] "RemoveContainer" containerID="d7dae192729a16d2e0ed7a375c7ee4990d673d8d62f560c5335516db0b1730db" Oct 03 14:30:28 crc kubenswrapper[4962]: E1003 14:30:28.227592 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:30:29 crc kubenswrapper[4962]: I1003 14:30:29.995919 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rwskh" podUID="ee101fbd-b51a-4f3d-92d5-f0e0338507e7" containerName="registry-server" containerID="cri-o://8e6aa45527c588d5eac441f346e29f3eb8df8bb98b0205d39b79363f84cfcd67" gracePeriod=2 Oct 03 14:30:30 crc kubenswrapper[4962]: I1003 14:30:30.444917 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rwskh" Oct 03 14:30:30 crc kubenswrapper[4962]: I1003 14:30:30.545267 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee101fbd-b51a-4f3d-92d5-f0e0338507e7-utilities\") pod \"ee101fbd-b51a-4f3d-92d5-f0e0338507e7\" (UID: \"ee101fbd-b51a-4f3d-92d5-f0e0338507e7\") " Oct 03 14:30:30 crc kubenswrapper[4962]: I1003 14:30:30.545708 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee101fbd-b51a-4f3d-92d5-f0e0338507e7-catalog-content\") pod \"ee101fbd-b51a-4f3d-92d5-f0e0338507e7\" (UID: \"ee101fbd-b51a-4f3d-92d5-f0e0338507e7\") " Oct 03 14:30:30 crc kubenswrapper[4962]: I1003 14:30:30.545773 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5wkk\" (UniqueName: \"kubernetes.io/projected/ee101fbd-b51a-4f3d-92d5-f0e0338507e7-kube-api-access-n5wkk\") pod \"ee101fbd-b51a-4f3d-92d5-f0e0338507e7\" (UID: \"ee101fbd-b51a-4f3d-92d5-f0e0338507e7\") " Oct 03 14:30:30 crc kubenswrapper[4962]: I1003 14:30:30.547032 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee101fbd-b51a-4f3d-92d5-f0e0338507e7-utilities" (OuterVolumeSpecName: "utilities") pod "ee101fbd-b51a-4f3d-92d5-f0e0338507e7" (UID: "ee101fbd-b51a-4f3d-92d5-f0e0338507e7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:30:30 crc kubenswrapper[4962]: I1003 14:30:30.553356 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee101fbd-b51a-4f3d-92d5-f0e0338507e7-kube-api-access-n5wkk" (OuterVolumeSpecName: "kube-api-access-n5wkk") pod "ee101fbd-b51a-4f3d-92d5-f0e0338507e7" (UID: "ee101fbd-b51a-4f3d-92d5-f0e0338507e7"). InnerVolumeSpecName "kube-api-access-n5wkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:30:30 crc kubenswrapper[4962]: I1003 14:30:30.561828 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee101fbd-b51a-4f3d-92d5-f0e0338507e7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ee101fbd-b51a-4f3d-92d5-f0e0338507e7" (UID: "ee101fbd-b51a-4f3d-92d5-f0e0338507e7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:30:30 crc kubenswrapper[4962]: I1003 14:30:30.647922 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee101fbd-b51a-4f3d-92d5-f0e0338507e7-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 14:30:30 crc kubenswrapper[4962]: I1003 14:30:30.648192 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee101fbd-b51a-4f3d-92d5-f0e0338507e7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 14:30:30 crc kubenswrapper[4962]: I1003 14:30:30.648259 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5wkk\" (UniqueName: \"kubernetes.io/projected/ee101fbd-b51a-4f3d-92d5-f0e0338507e7-kube-api-access-n5wkk\") on node \"crc\" DevicePath \"\"" Oct 03 14:30:31 crc kubenswrapper[4962]: I1003 14:30:31.012919 4962 generic.go:334] "Generic (PLEG): container finished" podID="ee101fbd-b51a-4f3d-92d5-f0e0338507e7" containerID="8e6aa45527c588d5eac441f346e29f3eb8df8bb98b0205d39b79363f84cfcd67" exitCode=0 Oct 03 14:30:31 crc kubenswrapper[4962]: I1003 14:30:31.013050 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rwskh" Oct 03 14:30:31 crc kubenswrapper[4962]: I1003 14:30:31.013059 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rwskh" event={"ID":"ee101fbd-b51a-4f3d-92d5-f0e0338507e7","Type":"ContainerDied","Data":"8e6aa45527c588d5eac441f346e29f3eb8df8bb98b0205d39b79363f84cfcd67"} Oct 03 14:30:31 crc kubenswrapper[4962]: I1003 14:30:31.013628 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rwskh" event={"ID":"ee101fbd-b51a-4f3d-92d5-f0e0338507e7","Type":"ContainerDied","Data":"1f0ee1cec4eddb02fe4e381ba36e1965a20a927a66756ab2503decf11a9678da"} Oct 03 14:30:31 crc kubenswrapper[4962]: I1003 14:30:31.013670 4962 scope.go:117] "RemoveContainer" containerID="8e6aa45527c588d5eac441f346e29f3eb8df8bb98b0205d39b79363f84cfcd67" Oct 03 14:30:31 crc kubenswrapper[4962]: I1003 14:30:31.057088 4962 scope.go:117] "RemoveContainer" containerID="762439cc23951b020849cbd06dbd2557f33c9e6138cd2bd5ad6c96de06a8bf42" Oct 03 14:30:31 crc kubenswrapper[4962]: I1003 14:30:31.062840 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rwskh"] Oct 03 14:30:31 crc kubenswrapper[4962]: I1003 14:30:31.070344 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rwskh"] Oct 03 14:30:31 crc kubenswrapper[4962]: I1003 14:30:31.082242 4962 scope.go:117] "RemoveContainer" containerID="058a9ee808904d4ed8a3844d23fa27bd53dfc171bd077b152701669d09d702da" Oct 03 14:30:31 crc kubenswrapper[4962]: I1003 14:30:31.122555 4962 scope.go:117] "RemoveContainer" containerID="8e6aa45527c588d5eac441f346e29f3eb8df8bb98b0205d39b79363f84cfcd67" Oct 03 14:30:31 crc kubenswrapper[4962]: E1003 14:30:31.123428 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e6aa45527c588d5eac441f346e29f3eb8df8bb98b0205d39b79363f84cfcd67\": container with ID starting with 8e6aa45527c588d5eac441f346e29f3eb8df8bb98b0205d39b79363f84cfcd67 not found: ID does not exist" containerID="8e6aa45527c588d5eac441f346e29f3eb8df8bb98b0205d39b79363f84cfcd67" Oct 03 14:30:31 crc kubenswrapper[4962]: I1003 14:30:31.123536 4962 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e6aa45527c588d5eac441f346e29f3eb8df8bb98b0205d39b79363f84cfcd67"} err="failed to get container status \"8e6aa45527c588d5eac441f346e29f3eb8df8bb98b0205d39b79363f84cfcd67\": rpc error: code = NotFound desc = could not find container \"8e6aa45527c588d5eac441f346e29f3eb8df8bb98b0205d39b79363f84cfcd67\": container with ID starting with 8e6aa45527c588d5eac441f346e29f3eb8df8bb98b0205d39b79363f84cfcd67 not found: ID does not exist" Oct 03 14:30:31 crc kubenswrapper[4962]: I1003 14:30:31.123571 4962 scope.go:117] "RemoveContainer" containerID="762439cc23951b020849cbd06dbd2557f33c9e6138cd2bd5ad6c96de06a8bf42" Oct 03 14:30:31 crc kubenswrapper[4962]: E1003 14:30:31.124092 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"762439cc23951b020849cbd06dbd2557f33c9e6138cd2bd5ad6c96de06a8bf42\": container with ID starting with 762439cc23951b020849cbd06dbd2557f33c9e6138cd2bd5ad6c96de06a8bf42 not found: ID does not exist" containerID="762439cc23951b020849cbd06dbd2557f33c9e6138cd2bd5ad6c96de06a8bf42" Oct 03 14:30:31 crc kubenswrapper[4962]: I1003 14:30:31.124166 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"762439cc23951b020849cbd06dbd2557f33c9e6138cd2bd5ad6c96de06a8bf42"} err="failed to get container status \"762439cc23951b020849cbd06dbd2557f33c9e6138cd2bd5ad6c96de06a8bf42\": rpc error: code = NotFound desc = could not find container \"762439cc23951b020849cbd06dbd2557f33c9e6138cd2bd5ad6c96de06a8bf42\": container with ID starting with 762439cc23951b020849cbd06dbd2557f33c9e6138cd2bd5ad6c96de06a8bf42 not found: ID does not exist" Oct 03 14:30:31 crc kubenswrapper[4962]: I1003 14:30:31.124218 4962 scope.go:117] "RemoveContainer" containerID="058a9ee808904d4ed8a3844d23fa27bd53dfc171bd077b152701669d09d702da" Oct 03 14:30:31 crc kubenswrapper[4962]: E1003 14:30:31.124592 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"058a9ee808904d4ed8a3844d23fa27bd53dfc171bd077b152701669d09d702da\": container with ID starting with 058a9ee808904d4ed8a3844d23fa27bd53dfc171bd077b152701669d09d702da not found: ID does not exist" containerID="058a9ee808904d4ed8a3844d23fa27bd53dfc171bd077b152701669d09d702da" Oct 03 14:30:31 crc kubenswrapper[4962]: I1003 14:30:31.124626 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"058a9ee808904d4ed8a3844d23fa27bd53dfc171bd077b152701669d09d702da"} err="failed to get container status \"058a9ee808904d4ed8a3844d23fa27bd53dfc171bd077b152701669d09d702da\": rpc error: code = NotFound desc = could not find container \"058a9ee808904d4ed8a3844d23fa27bd53dfc171bd077b152701669d09d702da\": container with ID starting with 058a9ee808904d4ed8a3844d23fa27bd53dfc171bd077b152701669d09d702da not found: ID does not exist" Oct 03 14:30:32 crc kubenswrapper[4962]: I1003 14:30:32.237214 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee101fbd-b51a-4f3d-92d5-f0e0338507e7" path="/var/lib/kubelet/pods/ee101fbd-b51a-4f3d-92d5-f0e0338507e7/volumes" Oct 03 14:30:40 crc kubenswrapper[4962]: I1003 14:30:40.227402 4962 scope.go:117] "RemoveContainer" containerID="d7dae192729a16d2e0ed7a375c7ee4990d673d8d62f560c5335516db0b1730db" Oct 03 14:30:40 crc kubenswrapper[4962]: E1003 14:30:40.228205 4962 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:30:51 crc kubenswrapper[4962]: I1003 14:30:51.227089 4962 scope.go:117] "RemoveContainer" containerID="d7dae192729a16d2e0ed7a375c7ee4990d673d8d62f560c5335516db0b1730db" Oct 03 14:30:51 crc kubenswrapper[4962]: E1003 14:30:51.227845 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:31:01 crc kubenswrapper[4962]: I1003 14:31:01.628850 4962 scope.go:117] "RemoveContainer" containerID="9691264e81c813dea2599d0b28e55c130f559b47eee52175143c7b1c43887bcf" Oct 03 14:31:01 crc kubenswrapper[4962]: I1003 14:31:01.665613 4962 scope.go:117] "RemoveContainer" containerID="6679adbbce4585a7df247ce019ba5e41cc535f28d99e5ca0c55d071aa6c055c8" Oct 03 14:31:01 crc kubenswrapper[4962]: I1003 14:31:01.698274 4962 scope.go:117] "RemoveContainer" containerID="343b79e147ce221a8bf78386d77274b50b1a972c873bc997abacdb002ab5f9d1" Oct 03 14:31:01 crc kubenswrapper[4962]: I1003 14:31:01.731650 4962 scope.go:117] "RemoveContainer" containerID="b44a99512bc5d00d2cd67f149868b4f5c5f65bfc3c2413e6b9d655726c00be3d" Oct 03 14:31:05 crc kubenswrapper[4962]: I1003 14:31:05.227687 4962 scope.go:117] "RemoveContainer" containerID="d7dae192729a16d2e0ed7a375c7ee4990d673d8d62f560c5335516db0b1730db" Oct 03 14:31:05 crc kubenswrapper[4962]: E1003 14:31:05.228480 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:31:12 crc kubenswrapper[4962]: I1003 14:31:12.477188 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-bbqrf"] Oct 03 14:31:12 crc kubenswrapper[4962]: E1003 14:31:12.479160 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee101fbd-b51a-4f3d-92d5-f0e0338507e7" containerName="registry-server" Oct 03 14:31:12 crc kubenswrapper[4962]: I1003 14:31:12.479249 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee101fbd-b51a-4f3d-92d5-f0e0338507e7" containerName="registry-server" Oct 03 14:31:12 crc kubenswrapper[4962]: E1003 14:31:12.479333 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee101fbd-b51a-4f3d-92d5-f0e0338507e7" containerName="extract-content" Oct 03 14:31:12 crc kubenswrapper[4962]: I1003 14:31:12.479395 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee101fbd-b51a-4f3d-92d5-f0e0338507e7" containerName="extract-content" Oct 03 14:31:12 crc kubenswrapper[4962]: E1003 14:31:12.479461 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee101fbd-b51a-4f3d-92d5-f0e0338507e7" 
containerName="extract-utilities" Oct 03 14:31:12 crc kubenswrapper[4962]: I1003 14:31:12.479530 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee101fbd-b51a-4f3d-92d5-f0e0338507e7" containerName="extract-utilities" Oct 03 14:31:12 crc kubenswrapper[4962]: I1003 14:31:12.479817 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee101fbd-b51a-4f3d-92d5-f0e0338507e7" containerName="registry-server" Oct 03 14:31:12 crc kubenswrapper[4962]: I1003 14:31:12.480594 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-bbqrf" Oct 03 14:31:12 crc kubenswrapper[4962]: I1003 14:31:12.482431 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-7tdlv" Oct 03 14:31:12 crc kubenswrapper[4962]: I1003 14:31:12.483850 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 03 14:31:12 crc kubenswrapper[4962]: I1003 14:31:12.491628 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-bbqrf"] Oct 03 14:31:12 crc kubenswrapper[4962]: I1003 14:31:12.503239 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-p7pqn"] Oct 03 14:31:12 crc kubenswrapper[4962]: I1003 14:31:12.505392 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-p7pqn" Oct 03 14:31:12 crc kubenswrapper[4962]: I1003 14:31:12.538411 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-p7pqn"] Oct 03 14:31:12 crc kubenswrapper[4962]: I1003 14:31:12.634414 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fbeda2b8-4f84-4dc1-b2d6-d59d604ffaed-scripts\") pod \"ovn-controller-ovs-p7pqn\" (UID: \"fbeda2b8-4f84-4dc1-b2d6-d59d604ffaed\") " pod="openstack/ovn-controller-ovs-p7pqn" Oct 03 14:31:12 crc kubenswrapper[4962]: I1003 14:31:12.634948 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/fbeda2b8-4f84-4dc1-b2d6-d59d604ffaed-etc-ovs\") pod \"ovn-controller-ovs-p7pqn\" (UID: \"fbeda2b8-4f84-4dc1-b2d6-d59d604ffaed\") " pod="openstack/ovn-controller-ovs-p7pqn" Oct 03 14:31:12 crc kubenswrapper[4962]: I1003 14:31:12.635147 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1c5fddb0-c21c-499d-b4c4-61a807e0c392-var-run-ovn\") pod \"ovn-controller-bbqrf\" (UID: \"1c5fddb0-c21c-499d-b4c4-61a807e0c392\") " pod="openstack/ovn-controller-bbqrf" Oct 03 14:31:12 crc kubenswrapper[4962]: I1003 14:31:12.635314 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1c5fddb0-c21c-499d-b4c4-61a807e0c392-var-log-ovn\") pod \"ovn-controller-bbqrf\" (UID: \"1c5fddb0-c21c-499d-b4c4-61a807e0c392\") " pod="openstack/ovn-controller-bbqrf" Oct 03 14:31:12 crc kubenswrapper[4962]: I1003 14:31:12.635510 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/fbeda2b8-4f84-4dc1-b2d6-d59d604ffaed-var-log\") pod \"ovn-controller-ovs-p7pqn\" (UID: \"fbeda2b8-4f84-4dc1-b2d6-d59d604ffaed\") " pod="openstack/ovn-controller-ovs-p7pqn" Oct 03 
14:31:12 crc kubenswrapper[4962]: I1003 14:31:12.635742 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nth8\" (UniqueName: \"kubernetes.io/projected/1c5fddb0-c21c-499d-b4c4-61a807e0c392-kube-api-access-6nth8\") pod \"ovn-controller-bbqrf\" (UID: \"1c5fddb0-c21c-499d-b4c4-61a807e0c392\") " pod="openstack/ovn-controller-bbqrf" Oct 03 14:31:12 crc kubenswrapper[4962]: I1003 14:31:12.635801 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c5fddb0-c21c-499d-b4c4-61a807e0c392-scripts\") pod \"ovn-controller-bbqrf\" (UID: \"1c5fddb0-c21c-499d-b4c4-61a807e0c392\") " pod="openstack/ovn-controller-bbqrf" Oct 03 14:31:12 crc kubenswrapper[4962]: I1003 14:31:12.635841 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/fbeda2b8-4f84-4dc1-b2d6-d59d604ffaed-var-lib\") pod \"ovn-controller-ovs-p7pqn\" (UID: \"fbeda2b8-4f84-4dc1-b2d6-d59d604ffaed\") " pod="openstack/ovn-controller-ovs-p7pqn" Oct 03 14:31:12 crc kubenswrapper[4962]: I1003 14:31:12.635986 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6fh4\" (UniqueName: \"kubernetes.io/projected/fbeda2b8-4f84-4dc1-b2d6-d59d604ffaed-kube-api-access-n6fh4\") pod \"ovn-controller-ovs-p7pqn\" (UID: \"fbeda2b8-4f84-4dc1-b2d6-d59d604ffaed\") " pod="openstack/ovn-controller-ovs-p7pqn" Oct 03 14:31:12 crc kubenswrapper[4962]: I1003 14:31:12.636080 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fbeda2b8-4f84-4dc1-b2d6-d59d604ffaed-var-run\") pod \"ovn-controller-ovs-p7pqn\" (UID: \"fbeda2b8-4f84-4dc1-b2d6-d59d604ffaed\") " pod="openstack/ovn-controller-ovs-p7pqn" Oct 03 14:31:12 crc kubenswrapper[4962]: I1003 14:31:12.636158 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1c5fddb0-c21c-499d-b4c4-61a807e0c392-var-run\") pod \"ovn-controller-bbqrf\" (UID: \"1c5fddb0-c21c-499d-b4c4-61a807e0c392\") " pod="openstack/ovn-controller-bbqrf" Oct 03 14:31:12 crc kubenswrapper[4962]: I1003 14:31:12.737522 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fbeda2b8-4f84-4dc1-b2d6-d59d604ffaed-scripts\") pod \"ovn-controller-ovs-p7pqn\" (UID: \"fbeda2b8-4f84-4dc1-b2d6-d59d604ffaed\") " pod="openstack/ovn-controller-ovs-p7pqn" Oct 03 14:31:12 crc kubenswrapper[4962]: I1003 14:31:12.737617 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/fbeda2b8-4f84-4dc1-b2d6-d59d604ffaed-etc-ovs\") pod \"ovn-controller-ovs-p7pqn\" (UID: \"fbeda2b8-4f84-4dc1-b2d6-d59d604ffaed\") " pod="openstack/ovn-controller-ovs-p7pqn" Oct 03 14:31:12 crc kubenswrapper[4962]: I1003 14:31:12.737654 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1c5fddb0-c21c-499d-b4c4-61a807e0c392-var-run-ovn\") pod \"ovn-controller-bbqrf\" (UID: \"1c5fddb0-c21c-499d-b4c4-61a807e0c392\") " pod="openstack/ovn-controller-bbqrf" Oct 03 14:31:12 crc kubenswrapper[4962]: I1003 14:31:12.737686 4962 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1c5fddb0-c21c-499d-b4c4-61a807e0c392-var-log-ovn\") pod \"ovn-controller-bbqrf\" (UID: \"1c5fddb0-c21c-499d-b4c4-61a807e0c392\") " pod="openstack/ovn-controller-bbqrf" Oct 03 14:31:12 crc kubenswrapper[4962]: I1003 14:31:12.737709 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/fbeda2b8-4f84-4dc1-b2d6-d59d604ffaed-var-log\") pod \"ovn-controller-ovs-p7pqn\" (UID: \"fbeda2b8-4f84-4dc1-b2d6-d59d604ffaed\") " pod="openstack/ovn-controller-ovs-p7pqn" Oct 03 14:31:12 crc kubenswrapper[4962]: I1003 14:31:12.737763 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nth8\" (UniqueName: \"kubernetes.io/projected/1c5fddb0-c21c-499d-b4c4-61a807e0c392-kube-api-access-6nth8\") pod \"ovn-controller-bbqrf\" (UID: \"1c5fddb0-c21c-499d-b4c4-61a807e0c392\") " pod="openstack/ovn-controller-bbqrf" Oct 03 14:31:12 crc kubenswrapper[4962]: I1003 14:31:12.737787 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c5fddb0-c21c-499d-b4c4-61a807e0c392-scripts\") pod \"ovn-controller-bbqrf\" (UID: \"1c5fddb0-c21c-499d-b4c4-61a807e0c392\") " pod="openstack/ovn-controller-bbqrf" Oct 03 14:31:12 crc kubenswrapper[4962]: I1003 14:31:12.737810 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/fbeda2b8-4f84-4dc1-b2d6-d59d604ffaed-var-lib\") pod \"ovn-controller-ovs-p7pqn\" (UID: \"fbeda2b8-4f84-4dc1-b2d6-d59d604ffaed\") " pod="openstack/ovn-controller-ovs-p7pqn" Oct 03 14:31:12 crc kubenswrapper[4962]: I1003 14:31:12.737829 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6fh4\" (UniqueName: \"kubernetes.io/projected/fbeda2b8-4f84-4dc1-b2d6-d59d604ffaed-kube-api-access-n6fh4\") pod \"ovn-controller-ovs-p7pqn\" (UID: \"fbeda2b8-4f84-4dc1-b2d6-d59d604ffaed\") " pod="openstack/ovn-controller-ovs-p7pqn" Oct 03 14:31:12 crc kubenswrapper[4962]: I1003 14:31:12.737847 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fbeda2b8-4f84-4dc1-b2d6-d59d604ffaed-var-run\") pod \"ovn-controller-ovs-p7pqn\" (UID: \"fbeda2b8-4f84-4dc1-b2d6-d59d604ffaed\") " pod="openstack/ovn-controller-ovs-p7pqn" Oct 03 14:31:12 crc kubenswrapper[4962]: I1003 14:31:12.737869 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1c5fddb0-c21c-499d-b4c4-61a807e0c392-var-run\") pod \"ovn-controller-bbqrf\" (UID: \"1c5fddb0-c21c-499d-b4c4-61a807e0c392\") " pod="openstack/ovn-controller-bbqrf" Oct 03 14:31:12 crc kubenswrapper[4962]: I1003 14:31:12.738133 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1c5fddb0-c21c-499d-b4c4-61a807e0c392-var-run\") pod \"ovn-controller-bbqrf\" (UID: \"1c5fddb0-c21c-499d-b4c4-61a807e0c392\") " pod="openstack/ovn-controller-bbqrf" Oct 03 14:31:12 crc kubenswrapper[4962]: I1003 14:31:12.738144 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1c5fddb0-c21c-499d-b4c4-61a807e0c392-var-log-ovn\") pod \"ovn-controller-bbqrf\" (UID: \"1c5fddb0-c21c-499d-b4c4-61a807e0c392\") " 
pod="openstack/ovn-controller-bbqrf" Oct 03 14:31:12 crc kubenswrapper[4962]: I1003 14:31:12.738183 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/fbeda2b8-4f84-4dc1-b2d6-d59d604ffaed-etc-ovs\") pod \"ovn-controller-ovs-p7pqn\" (UID: \"fbeda2b8-4f84-4dc1-b2d6-d59d604ffaed\") " pod="openstack/ovn-controller-ovs-p7pqn" Oct 03 14:31:12 crc kubenswrapper[4962]: I1003 14:31:12.738224 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fbeda2b8-4f84-4dc1-b2d6-d59d604ffaed-var-run\") pod \"ovn-controller-ovs-p7pqn\" (UID: \"fbeda2b8-4f84-4dc1-b2d6-d59d604ffaed\") " pod="openstack/ovn-controller-ovs-p7pqn" Oct 03 14:31:12 crc kubenswrapper[4962]: I1003 14:31:12.738194 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/fbeda2b8-4f84-4dc1-b2d6-d59d604ffaed-var-lib\") pod \"ovn-controller-ovs-p7pqn\" (UID: \"fbeda2b8-4f84-4dc1-b2d6-d59d604ffaed\") " pod="openstack/ovn-controller-ovs-p7pqn" Oct 03 14:31:12 crc kubenswrapper[4962]: I1003 14:31:12.738270 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/fbeda2b8-4f84-4dc1-b2d6-d59d604ffaed-var-log\") pod \"ovn-controller-ovs-p7pqn\" (UID: \"fbeda2b8-4f84-4dc1-b2d6-d59d604ffaed\") " pod="openstack/ovn-controller-ovs-p7pqn" Oct 03 14:31:12 crc kubenswrapper[4962]: I1003 14:31:12.738263 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1c5fddb0-c21c-499d-b4c4-61a807e0c392-var-run-ovn\") pod \"ovn-controller-bbqrf\" (UID: \"1c5fddb0-c21c-499d-b4c4-61a807e0c392\") " pod="openstack/ovn-controller-bbqrf" Oct 03 14:31:12 crc kubenswrapper[4962]: I1003 14:31:12.739525 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fbeda2b8-4f84-4dc1-b2d6-d59d604ffaed-scripts\") pod \"ovn-controller-ovs-p7pqn\" (UID: \"fbeda2b8-4f84-4dc1-b2d6-d59d604ffaed\") " pod="openstack/ovn-controller-ovs-p7pqn" Oct 03 14:31:12 crc kubenswrapper[4962]: I1003 14:31:12.740886 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c5fddb0-c21c-499d-b4c4-61a807e0c392-scripts\") pod \"ovn-controller-bbqrf\" (UID: \"1c5fddb0-c21c-499d-b4c4-61a807e0c392\") " pod="openstack/ovn-controller-bbqrf" Oct 03 14:31:12 crc kubenswrapper[4962]: I1003 14:31:12.759257 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6fh4\" (UniqueName: \"kubernetes.io/projected/fbeda2b8-4f84-4dc1-b2d6-d59d604ffaed-kube-api-access-n6fh4\") pod \"ovn-controller-ovs-p7pqn\" (UID: \"fbeda2b8-4f84-4dc1-b2d6-d59d604ffaed\") " pod="openstack/ovn-controller-ovs-p7pqn" Oct 03 14:31:12 crc kubenswrapper[4962]: I1003 14:31:12.782559 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nth8\" (UniqueName: \"kubernetes.io/projected/1c5fddb0-c21c-499d-b4c4-61a807e0c392-kube-api-access-6nth8\") pod \"ovn-controller-bbqrf\" (UID: \"1c5fddb0-c21c-499d-b4c4-61a807e0c392\") " pod="openstack/ovn-controller-bbqrf" Oct 03 14:31:12 crc kubenswrapper[4962]: I1003 14:31:12.807461 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-bbqrf" Oct 03 14:31:12 crc kubenswrapper[4962]: I1003 14:31:12.825237 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-p7pqn" Oct 03 14:31:13 crc kubenswrapper[4962]: I1003 14:31:13.270480 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-bbqrf"] Oct 03 14:31:13 crc kubenswrapper[4962]: I1003 14:31:13.374412 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bbqrf" event={"ID":"1c5fddb0-c21c-499d-b4c4-61a807e0c392","Type":"ContainerStarted","Data":"fe03f1485c0ad4cd04c2e7d168cc23da8e8cc8d8f59f4786c543ed7c747839cf"} Oct 03 14:31:13 crc kubenswrapper[4962]: I1003 14:31:13.692519 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-p7pqn"] Oct 03 14:31:14 crc kubenswrapper[4962]: I1003 14:31:14.054177 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-s5l2q"] Oct 03 14:31:14 crc kubenswrapper[4962]: I1003 14:31:14.055472 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-s5l2q" Oct 03 14:31:14 crc kubenswrapper[4962]: I1003 14:31:14.060403 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 03 14:31:14 crc kubenswrapper[4962]: I1003 14:31:14.091579 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-s5l2q"] Oct 03 14:31:14 crc kubenswrapper[4962]: I1003 14:31:14.166539 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/fec5fe67-1a3c-4be8-bd47-7fe5597e9399-ovn-rundir\") pod \"ovn-controller-metrics-s5l2q\" (UID: \"fec5fe67-1a3c-4be8-bd47-7fe5597e9399\") " pod="openstack/ovn-controller-metrics-s5l2q" Oct 03 14:31:14 crc kubenswrapper[4962]: I1003 14:31:14.166632 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fec5fe67-1a3c-4be8-bd47-7fe5597e9399-config\") pod \"ovn-controller-metrics-s5l2q\" (UID: \"fec5fe67-1a3c-4be8-bd47-7fe5597e9399\") " pod="openstack/ovn-controller-metrics-s5l2q" Oct 03 14:31:14 crc kubenswrapper[4962]: I1003 14:31:14.166738 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/fec5fe67-1a3c-4be8-bd47-7fe5597e9399-ovs-rundir\") pod \"ovn-controller-metrics-s5l2q\" (UID: \"fec5fe67-1a3c-4be8-bd47-7fe5597e9399\") " pod="openstack/ovn-controller-metrics-s5l2q" Oct 03 14:31:14 crc kubenswrapper[4962]: I1003 14:31:14.167084 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktf2c\" (UniqueName: \"kubernetes.io/projected/fec5fe67-1a3c-4be8-bd47-7fe5597e9399-kube-api-access-ktf2c\") pod \"ovn-controller-metrics-s5l2q\" (UID: \"fec5fe67-1a3c-4be8-bd47-7fe5597e9399\") " pod="openstack/ovn-controller-metrics-s5l2q" Oct 03 14:31:14 crc kubenswrapper[4962]: I1003 14:31:14.268326 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fec5fe67-1a3c-4be8-bd47-7fe5597e9399-config\") pod \"ovn-controller-metrics-s5l2q\" (UID: \"fec5fe67-1a3c-4be8-bd47-7fe5597e9399\") " pod="openstack/ovn-controller-metrics-s5l2q" Oct 03 14:31:14 
crc kubenswrapper[4962]: I1003 14:31:14.268441 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/fec5fe67-1a3c-4be8-bd47-7fe5597e9399-ovs-rundir\") pod \"ovn-controller-metrics-s5l2q\" (UID: \"fec5fe67-1a3c-4be8-bd47-7fe5597e9399\") " pod="openstack/ovn-controller-metrics-s5l2q" Oct 03 14:31:14 crc kubenswrapper[4962]: I1003 14:31:14.268497 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktf2c\" (UniqueName: \"kubernetes.io/projected/fec5fe67-1a3c-4be8-bd47-7fe5597e9399-kube-api-access-ktf2c\") pod \"ovn-controller-metrics-s5l2q\" (UID: \"fec5fe67-1a3c-4be8-bd47-7fe5597e9399\") " pod="openstack/ovn-controller-metrics-s5l2q" Oct 03 14:31:14 crc kubenswrapper[4962]: I1003 14:31:14.268523 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/fec5fe67-1a3c-4be8-bd47-7fe5597e9399-ovn-rundir\") pod \"ovn-controller-metrics-s5l2q\" (UID: \"fec5fe67-1a3c-4be8-bd47-7fe5597e9399\") " pod="openstack/ovn-controller-metrics-s5l2q" Oct 03 14:31:14 crc kubenswrapper[4962]: I1003 14:31:14.268805 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/fec5fe67-1a3c-4be8-bd47-7fe5597e9399-ovn-rundir\") pod \"ovn-controller-metrics-s5l2q\" (UID: \"fec5fe67-1a3c-4be8-bd47-7fe5597e9399\") " pod="openstack/ovn-controller-metrics-s5l2q" Oct 03 14:31:14 crc kubenswrapper[4962]: I1003 14:31:14.268840 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/fec5fe67-1a3c-4be8-bd47-7fe5597e9399-ovs-rundir\") pod \"ovn-controller-metrics-s5l2q\" (UID: \"fec5fe67-1a3c-4be8-bd47-7fe5597e9399\") " pod="openstack/ovn-controller-metrics-s5l2q" Oct 03 14:31:14 crc kubenswrapper[4962]: I1003 14:31:14.269900 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fec5fe67-1a3c-4be8-bd47-7fe5597e9399-config\") pod \"ovn-controller-metrics-s5l2q\" (UID: \"fec5fe67-1a3c-4be8-bd47-7fe5597e9399\") " pod="openstack/ovn-controller-metrics-s5l2q" Oct 03 14:31:14 crc kubenswrapper[4962]: I1003 14:31:14.289217 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktf2c\" (UniqueName: \"kubernetes.io/projected/fec5fe67-1a3c-4be8-bd47-7fe5597e9399-kube-api-access-ktf2c\") pod \"ovn-controller-metrics-s5l2q\" (UID: \"fec5fe67-1a3c-4be8-bd47-7fe5597e9399\") " pod="openstack/ovn-controller-metrics-s5l2q" Oct 03 14:31:14 crc kubenswrapper[4962]: I1003 14:31:14.384625 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bbqrf" event={"ID":"1c5fddb0-c21c-499d-b4c4-61a807e0c392","Type":"ContainerStarted","Data":"53d8c7b6d11094c7c63d2cd05324912cc35d9c842bdd151526f159c84820e880"} Oct 03 14:31:14 crc kubenswrapper[4962]: I1003 14:31:14.384732 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-bbqrf" Oct 03 14:31:14 crc kubenswrapper[4962]: I1003 14:31:14.385410 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-s5l2q" Oct 03 14:31:14 crc kubenswrapper[4962]: I1003 14:31:14.386435 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-p7pqn" event={"ID":"fbeda2b8-4f84-4dc1-b2d6-d59d604ffaed","Type":"ContainerStarted","Data":"ae4dc024c8678f78a6284e842a437afcc208e6d98c8861c0195f0f2a5e9a86d1"} Oct 03 14:31:14 crc kubenswrapper[4962]: I1003 14:31:14.386473 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-p7pqn" event={"ID":"fbeda2b8-4f84-4dc1-b2d6-d59d604ffaed","Type":"ContainerStarted","Data":"08b2a38cd5351011b5b329bf59254ce997a826cb41721fe98d302886c297fb08"} Oct 03 14:31:14 crc kubenswrapper[4962]: I1003 14:31:14.410138 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-bbqrf" podStartSLOduration=2.410116338 podStartE2EDuration="2.410116338s" podCreationTimestamp="2025-10-03 14:31:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:31:14.406021419 +0000 UTC m=+6082.809919274" watchObservedRunningTime="2025-10-03 14:31:14.410116338 +0000 UTC m=+6082.814014173" Oct 03 14:31:14 crc kubenswrapper[4962]: I1003 14:31:14.837009 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-s5l2q"] Oct 03 14:31:15 crc kubenswrapper[4962]: I1003 14:31:15.395964 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-s5l2q" event={"ID":"fec5fe67-1a3c-4be8-bd47-7fe5597e9399","Type":"ContainerStarted","Data":"8f79feba86b616b6994813da26b5c9cf8aaa14836d46e3a161e43402655da9b0"} Oct 03 14:31:15 crc kubenswrapper[4962]: I1003 14:31:15.396335 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-s5l2q" event={"ID":"fec5fe67-1a3c-4be8-bd47-7fe5597e9399","Type":"ContainerStarted","Data":"57c26d26b8abd7d69b2420e39bc0037092e06fe230fbff0b240ab9f3d551a6db"} Oct 03 14:31:15 crc kubenswrapper[4962]: I1003 14:31:15.397828 4962 generic.go:334] "Generic (PLEG): container finished" podID="fbeda2b8-4f84-4dc1-b2d6-d59d604ffaed" containerID="ae4dc024c8678f78a6284e842a437afcc208e6d98c8861c0195f0f2a5e9a86d1" exitCode=0 Oct 03 14:31:15 crc kubenswrapper[4962]: I1003 14:31:15.397922 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-p7pqn" event={"ID":"fbeda2b8-4f84-4dc1-b2d6-d59d604ffaed","Type":"ContainerDied","Data":"ae4dc024c8678f78a6284e842a437afcc208e6d98c8861c0195f0f2a5e9a86d1"} Oct 03 14:31:15 crc kubenswrapper[4962]: I1003 14:31:15.420162 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-s5l2q" podStartSLOduration=1.420139514 podStartE2EDuration="1.420139514s" podCreationTimestamp="2025-10-03 14:31:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:31:15.412152861 +0000 UTC m=+6083.816050716" watchObservedRunningTime="2025-10-03 14:31:15.420139514 +0000 UTC m=+6083.824037349" Oct 03 14:31:16 crc kubenswrapper[4962]: I1003 14:31:16.270178 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-create-752wv"] Oct 03 14:31:16 crc kubenswrapper[4962]: I1003 14:31:16.272324 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-752wv" Oct 03 14:31:16 crc kubenswrapper[4962]: I1003 14:31:16.282555 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-752wv"] Oct 03 14:31:16 crc kubenswrapper[4962]: I1003 14:31:16.410361 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42jwn\" (UniqueName: \"kubernetes.io/projected/3cc58d2d-aac1-43a9-bbbd-3c2f6bdac4fd-kube-api-access-42jwn\") pod \"octavia-db-create-752wv\" (UID: \"3cc58d2d-aac1-43a9-bbbd-3c2f6bdac4fd\") " pod="openstack/octavia-db-create-752wv" Oct 03 14:31:16 crc kubenswrapper[4962]: I1003 14:31:16.422600 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-p7pqn" event={"ID":"fbeda2b8-4f84-4dc1-b2d6-d59d604ffaed","Type":"ContainerStarted","Data":"b0f34e09d881da85d6adc66b05ac99ac237890730c95bab7268c47b6625ae2b9"} Oct 03 14:31:16 crc kubenswrapper[4962]: I1003 14:31:16.422661 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-p7pqn" event={"ID":"fbeda2b8-4f84-4dc1-b2d6-d59d604ffaed","Type":"ContainerStarted","Data":"30032fdb3ddd117ea86d95ca32375533ff381c9bf9cda8bb4b2765cf44b25f14"} Oct 03 14:31:16 crc kubenswrapper[4962]: I1003 14:31:16.422694 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-p7pqn" Oct 03 14:31:16 crc kubenswrapper[4962]: I1003 14:31:16.422713 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-p7pqn" Oct 03 14:31:16 crc kubenswrapper[4962]: I1003 14:31:16.450313 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-p7pqn" podStartSLOduration=4.450295897 podStartE2EDuration="4.450295897s" podCreationTimestamp="2025-10-03 14:31:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:31:16.442879269 +0000 UTC m=+6084.846777104" watchObservedRunningTime="2025-10-03 14:31:16.450295897 +0000 UTC m=+6084.854193732" Oct 03 14:31:16 crc kubenswrapper[4962]: I1003 14:31:16.512544 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42jwn\" (UniqueName: \"kubernetes.io/projected/3cc58d2d-aac1-43a9-bbbd-3c2f6bdac4fd-kube-api-access-42jwn\") pod \"octavia-db-create-752wv\" (UID: \"3cc58d2d-aac1-43a9-bbbd-3c2f6bdac4fd\") " pod="openstack/octavia-db-create-752wv" Oct 03 14:31:16 crc kubenswrapper[4962]: I1003 14:31:16.540864 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42jwn\" (UniqueName: \"kubernetes.io/projected/3cc58d2d-aac1-43a9-bbbd-3c2f6bdac4fd-kube-api-access-42jwn\") pod \"octavia-db-create-752wv\" (UID: \"3cc58d2d-aac1-43a9-bbbd-3c2f6bdac4fd\") " pod="openstack/octavia-db-create-752wv" Oct 03 14:31:16 crc kubenswrapper[4962]: I1003 14:31:16.590258 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-752wv" Oct 03 14:31:17 crc kubenswrapper[4962]: I1003 14:31:17.037933 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-752wv"] Oct 03 14:31:17 crc kubenswrapper[4962]: W1003 14:31:17.044549 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3cc58d2d_aac1_43a9_bbbd_3c2f6bdac4fd.slice/crio-4c4b7d9f55dedd012a0636f6bd2af67d7b6b53d9958ed0bc302cddb9218e68e9 WatchSource:0}: Error finding container 4c4b7d9f55dedd012a0636f6bd2af67d7b6b53d9958ed0bc302cddb9218e68e9: Status 404 returned error can't find the container with id 4c4b7d9f55dedd012a0636f6bd2af67d7b6b53d9958ed0bc302cddb9218e68e9 Oct 03 14:31:17 crc kubenswrapper[4962]: I1003 14:31:17.431630 4962 generic.go:334] "Generic (PLEG): container finished" podID="3cc58d2d-aac1-43a9-bbbd-3c2f6bdac4fd" containerID="b169f05aa8f299a544bfcf9a5d61dc22ead3731932e101f65a740d39a7567964" exitCode=0 Oct 03 14:31:17 crc kubenswrapper[4962]: I1003 14:31:17.431999 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-752wv" event={"ID":"3cc58d2d-aac1-43a9-bbbd-3c2f6bdac4fd","Type":"ContainerDied","Data":"b169f05aa8f299a544bfcf9a5d61dc22ead3731932e101f65a740d39a7567964"} Oct 03 14:31:17 crc kubenswrapper[4962]: I1003 14:31:17.432062 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-752wv" event={"ID":"3cc58d2d-aac1-43a9-bbbd-3c2f6bdac4fd","Type":"ContainerStarted","Data":"4c4b7d9f55dedd012a0636f6bd2af67d7b6b53d9958ed0bc302cddb9218e68e9"} Oct 03 14:31:18 crc kubenswrapper[4962]: I1003 14:31:18.798353 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-752wv" Oct 03 14:31:18 crc kubenswrapper[4962]: I1003 14:31:18.872066 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42jwn\" (UniqueName: \"kubernetes.io/projected/3cc58d2d-aac1-43a9-bbbd-3c2f6bdac4fd-kube-api-access-42jwn\") pod \"3cc58d2d-aac1-43a9-bbbd-3c2f6bdac4fd\" (UID: \"3cc58d2d-aac1-43a9-bbbd-3c2f6bdac4fd\") " Oct 03 14:31:18 crc kubenswrapper[4962]: I1003 14:31:18.878372 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cc58d2d-aac1-43a9-bbbd-3c2f6bdac4fd-kube-api-access-42jwn" (OuterVolumeSpecName: "kube-api-access-42jwn") pod "3cc58d2d-aac1-43a9-bbbd-3c2f6bdac4fd" (UID: "3cc58d2d-aac1-43a9-bbbd-3c2f6bdac4fd"). InnerVolumeSpecName "kube-api-access-42jwn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:31:18 crc kubenswrapper[4962]: I1003 14:31:18.974029 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42jwn\" (UniqueName: \"kubernetes.io/projected/3cc58d2d-aac1-43a9-bbbd-3c2f6bdac4fd-kube-api-access-42jwn\") on node \"crc\" DevicePath \"\"" Oct 03 14:31:19 crc kubenswrapper[4962]: I1003 14:31:19.227088 4962 scope.go:117] "RemoveContainer" containerID="d7dae192729a16d2e0ed7a375c7ee4990d673d8d62f560c5335516db0b1730db" Oct 03 14:31:19 crc kubenswrapper[4962]: E1003 14:31:19.227349 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:31:19 crc kubenswrapper[4962]: I1003 14:31:19.453375 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-752wv" event={"ID":"3cc58d2d-aac1-43a9-bbbd-3c2f6bdac4fd","Type":"ContainerDied","Data":"4c4b7d9f55dedd012a0636f6bd2af67d7b6b53d9958ed0bc302cddb9218e68e9"} Oct 03 14:31:19 crc kubenswrapper[4962]: I1003 14:31:19.453419 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c4b7d9f55dedd012a0636f6bd2af67d7b6b53d9958ed0bc302cddb9218e68e9" Oct 03 14:31:19 crc kubenswrapper[4962]: I1003 14:31:19.453481 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-752wv" Oct 03 14:31:28 crc kubenswrapper[4962]: I1003 14:31:28.327988 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-145e-account-create-thj75"] Oct 03 14:31:28 crc kubenswrapper[4962]: E1003 14:31:28.328949 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cc58d2d-aac1-43a9-bbbd-3c2f6bdac4fd" containerName="mariadb-database-create" Oct 03 14:31:28 crc kubenswrapper[4962]: I1003 14:31:28.328962 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cc58d2d-aac1-43a9-bbbd-3c2f6bdac4fd" containerName="mariadb-database-create" Oct 03 14:31:28 crc kubenswrapper[4962]: I1003 14:31:28.329169 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cc58d2d-aac1-43a9-bbbd-3c2f6bdac4fd" containerName="mariadb-database-create" Oct 03 14:31:28 crc kubenswrapper[4962]: I1003 14:31:28.329832 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-145e-account-create-thj75" Oct 03 14:31:28 crc kubenswrapper[4962]: I1003 14:31:28.333192 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-db-secret" Oct 03 14:31:28 crc kubenswrapper[4962]: I1003 14:31:28.341117 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-145e-account-create-thj75"] Oct 03 14:31:28 crc kubenswrapper[4962]: I1003 14:31:28.459148 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx776\" (UniqueName: \"kubernetes.io/projected/228a96ce-8078-4c26-b1d7-3076c08ce289-kube-api-access-dx776\") pod \"octavia-145e-account-create-thj75\" (UID: \"228a96ce-8078-4c26-b1d7-3076c08ce289\") " pod="openstack/octavia-145e-account-create-thj75" Oct 03 14:31:28 crc kubenswrapper[4962]: I1003 14:31:28.560805 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dx776\" (UniqueName: \"kubernetes.io/projected/228a96ce-8078-4c26-b1d7-3076c08ce289-kube-api-access-dx776\") pod \"octavia-145e-account-create-thj75\" (UID: \"228a96ce-8078-4c26-b1d7-3076c08ce289\") " pod="openstack/octavia-145e-account-create-thj75" Oct 03 14:31:28 crc kubenswrapper[4962]: I1003 14:31:28.579152 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx776\" (UniqueName: \"kubernetes.io/projected/228a96ce-8078-4c26-b1d7-3076c08ce289-kube-api-access-dx776\") pod \"octavia-145e-account-create-thj75\" (UID: \"228a96ce-8078-4c26-b1d7-3076c08ce289\") " pod="openstack/octavia-145e-account-create-thj75" Oct 03 14:31:28 crc kubenswrapper[4962]: I1003 14:31:28.683169 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-145e-account-create-thj75" Oct 03 14:31:29 crc kubenswrapper[4962]: I1003 14:31:29.135675 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-145e-account-create-thj75"] Oct 03 14:31:29 crc kubenswrapper[4962]: I1003 14:31:29.558605 4962 generic.go:334] "Generic (PLEG): container finished" podID="228a96ce-8078-4c26-b1d7-3076c08ce289" containerID="5a1bdff7bb38514a8be6c5269d8f2629161b89a24aecf1b8bf4deedd63e6bf78" exitCode=0 Oct 03 14:31:29 crc kubenswrapper[4962]: I1003 14:31:29.558826 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-145e-account-create-thj75" event={"ID":"228a96ce-8078-4c26-b1d7-3076c08ce289","Type":"ContainerDied","Data":"5a1bdff7bb38514a8be6c5269d8f2629161b89a24aecf1b8bf4deedd63e6bf78"} Oct 03 14:31:29 crc kubenswrapper[4962]: I1003 14:31:29.559142 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-145e-account-create-thj75" event={"ID":"228a96ce-8078-4c26-b1d7-3076c08ce289","Type":"ContainerStarted","Data":"ab54adf33bfcac18b73ba964831bd75f0e05327c62544ed9f08c30b97155e5fc"} Oct 03 14:31:30 crc kubenswrapper[4962]: I1003 14:31:30.227152 4962 scope.go:117] "RemoveContainer" containerID="d7dae192729a16d2e0ed7a375c7ee4990d673d8d62f560c5335516db0b1730db" Oct 03 14:31:30 crc kubenswrapper[4962]: I1003 14:31:30.568774 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerStarted","Data":"075a3a2d68fc05b9db35c066365380f2ff374a7b6a1faec1634013ff945a759f"} Oct 03 14:31:30 crc kubenswrapper[4962]: I1003 14:31:30.930104 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-145e-account-create-thj75" Oct 03 14:31:31 crc kubenswrapper[4962]: I1003 14:31:31.014393 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dx776\" (UniqueName: \"kubernetes.io/projected/228a96ce-8078-4c26-b1d7-3076c08ce289-kube-api-access-dx776\") pod \"228a96ce-8078-4c26-b1d7-3076c08ce289\" (UID: \"228a96ce-8078-4c26-b1d7-3076c08ce289\") " Oct 03 14:31:31 crc kubenswrapper[4962]: I1003 14:31:31.034360 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/228a96ce-8078-4c26-b1d7-3076c08ce289-kube-api-access-dx776" (OuterVolumeSpecName: "kube-api-access-dx776") pod "228a96ce-8078-4c26-b1d7-3076c08ce289" (UID: "228a96ce-8078-4c26-b1d7-3076c08ce289"). InnerVolumeSpecName "kube-api-access-dx776". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:31:31 crc kubenswrapper[4962]: I1003 14:31:31.116527 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dx776\" (UniqueName: \"kubernetes.io/projected/228a96ce-8078-4c26-b1d7-3076c08ce289-kube-api-access-dx776\") on node \"crc\" DevicePath \"\"" Oct 03 14:31:31 crc kubenswrapper[4962]: I1003 14:31:31.579119 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-145e-account-create-thj75" event={"ID":"228a96ce-8078-4c26-b1d7-3076c08ce289","Type":"ContainerDied","Data":"ab54adf33bfcac18b73ba964831bd75f0e05327c62544ed9f08c30b97155e5fc"} Oct 03 14:31:31 crc kubenswrapper[4962]: I1003 14:31:31.579458 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab54adf33bfcac18b73ba964831bd75f0e05327c62544ed9f08c30b97155e5fc" Oct 03 14:31:31 crc kubenswrapper[4962]: I1003 14:31:31.579158 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-145e-account-create-thj75" Oct 03 14:31:34 crc kubenswrapper[4962]: I1003 14:31:34.328653 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-persistence-db-create-ztn25"] Oct 03 14:31:34 crc kubenswrapper[4962]: E1003 14:31:34.329859 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="228a96ce-8078-4c26-b1d7-3076c08ce289" containerName="mariadb-account-create" Oct 03 14:31:34 crc kubenswrapper[4962]: I1003 14:31:34.329874 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="228a96ce-8078-4c26-b1d7-3076c08ce289" containerName="mariadb-account-create" Oct 03 14:31:34 crc kubenswrapper[4962]: I1003 14:31:34.330099 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="228a96ce-8078-4c26-b1d7-3076c08ce289" containerName="mariadb-account-create" Oct 03 14:31:34 crc kubenswrapper[4962]: I1003 14:31:34.330846 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-ztn25" Oct 03 14:31:34 crc kubenswrapper[4962]: I1003 14:31:34.340024 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-ztn25"] Oct 03 14:31:34 crc kubenswrapper[4962]: I1003 14:31:34.478308 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ldk7\" (UniqueName: \"kubernetes.io/projected/4c7a61fb-8407-4c74-8b41-896d4358bd99-kube-api-access-4ldk7\") pod \"octavia-persistence-db-create-ztn25\" (UID: \"4c7a61fb-8407-4c74-8b41-896d4358bd99\") " pod="openstack/octavia-persistence-db-create-ztn25" Oct 03 14:31:34 crc kubenswrapper[4962]: I1003 14:31:34.579927 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ldk7\" (UniqueName: \"kubernetes.io/projected/4c7a61fb-8407-4c74-8b41-896d4358bd99-kube-api-access-4ldk7\") pod \"octavia-persistence-db-create-ztn25\" (UID: \"4c7a61fb-8407-4c74-8b41-896d4358bd99\") " pod="openstack/octavia-persistence-db-create-ztn25" Oct 03 14:31:34 crc kubenswrapper[4962]: I1003 14:31:34.605980 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ldk7\" (UniqueName: \"kubernetes.io/projected/4c7a61fb-8407-4c74-8b41-896d4358bd99-kube-api-access-4ldk7\") pod \"octavia-persistence-db-create-ztn25\" (UID: \"4c7a61fb-8407-4c74-8b41-896d4358bd99\") " pod="openstack/octavia-persistence-db-create-ztn25" Oct 03 14:31:34 crc kubenswrapper[4962]: I1003 14:31:34.658091 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-ztn25" Oct 03 14:31:35 crc kubenswrapper[4962]: I1003 14:31:35.201685 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-ztn25"] Oct 03 14:31:35 crc kubenswrapper[4962]: I1003 14:31:35.612136 4962 generic.go:334] "Generic (PLEG): container finished" podID="4c7a61fb-8407-4c74-8b41-896d4358bd99" containerID="8e26f9f44b9e894b8da15898b7edcb7c8ab0f116f64b33193c2d0c31ddbd26f9" exitCode=0 Oct 03 14:31:35 crc kubenswrapper[4962]: I1003 14:31:35.612510 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-ztn25" event={"ID":"4c7a61fb-8407-4c74-8b41-896d4358bd99","Type":"ContainerDied","Data":"8e26f9f44b9e894b8da15898b7edcb7c8ab0f116f64b33193c2d0c31ddbd26f9"} Oct 03 14:31:35 crc kubenswrapper[4962]: I1003 14:31:35.612535 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-ztn25" event={"ID":"4c7a61fb-8407-4c74-8b41-896d4358bd99","Type":"ContainerStarted","Data":"11ab47e994b316f6fe3725fdd81f9af1f4dbc059f5ef479ea0f29be6465c0bcc"} Oct 03 14:31:36 crc kubenswrapper[4962]: I1003 14:31:36.969989 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-ztn25" Oct 03 14:31:37 crc kubenswrapper[4962]: I1003 14:31:37.071532 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ldk7\" (UniqueName: \"kubernetes.io/projected/4c7a61fb-8407-4c74-8b41-896d4358bd99-kube-api-access-4ldk7\") pod \"4c7a61fb-8407-4c74-8b41-896d4358bd99\" (UID: \"4c7a61fb-8407-4c74-8b41-896d4358bd99\") " Oct 03 14:31:37 crc kubenswrapper[4962]: I1003 14:31:37.077583 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c7a61fb-8407-4c74-8b41-896d4358bd99-kube-api-access-4ldk7" (OuterVolumeSpecName: "kube-api-access-4ldk7") pod "4c7a61fb-8407-4c74-8b41-896d4358bd99" (UID: "4c7a61fb-8407-4c74-8b41-896d4358bd99"). InnerVolumeSpecName "kube-api-access-4ldk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:31:37 crc kubenswrapper[4962]: I1003 14:31:37.174110 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ldk7\" (UniqueName: \"kubernetes.io/projected/4c7a61fb-8407-4c74-8b41-896d4358bd99-kube-api-access-4ldk7\") on node \"crc\" DevicePath \"\"" Oct 03 14:31:37 crc kubenswrapper[4962]: I1003 14:31:37.628992 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-ztn25" event={"ID":"4c7a61fb-8407-4c74-8b41-896d4358bd99","Type":"ContainerDied","Data":"11ab47e994b316f6fe3725fdd81f9af1f4dbc059f5ef479ea0f29be6465c0bcc"} Oct 03 14:31:37 crc kubenswrapper[4962]: I1003 14:31:37.629296 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11ab47e994b316f6fe3725fdd81f9af1f4dbc059f5ef479ea0f29be6465c0bcc" Oct 03 14:31:37 crc kubenswrapper[4962]: I1003 14:31:37.629349 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-ztn25" Oct 03 14:31:45 crc kubenswrapper[4962]: I1003 14:31:45.500415 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-265e-account-create-v5grn"] Oct 03 14:31:45 crc kubenswrapper[4962]: E1003 14:31:45.501472 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c7a61fb-8407-4c74-8b41-896d4358bd99" containerName="mariadb-database-create" Oct 03 14:31:45 crc kubenswrapper[4962]: I1003 14:31:45.501491 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c7a61fb-8407-4c74-8b41-896d4358bd99" containerName="mariadb-database-create" Oct 03 14:31:45 crc kubenswrapper[4962]: I1003 14:31:45.501781 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c7a61fb-8407-4c74-8b41-896d4358bd99" containerName="mariadb-database-create" Oct 03 14:31:45 crc kubenswrapper[4962]: I1003 14:31:45.502612 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-265e-account-create-v5grn" Oct 03 14:31:45 crc kubenswrapper[4962]: I1003 14:31:45.504425 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-persistence-db-secret" Oct 03 14:31:45 crc kubenswrapper[4962]: I1003 14:31:45.536768 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-265e-account-create-v5grn"] Oct 03 14:31:45 crc kubenswrapper[4962]: I1003 14:31:45.638658 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfmsl\" (UniqueName: \"kubernetes.io/projected/e1e67e61-5d84-4b90-9d46-60a5ddb1e357-kube-api-access-rfmsl\") pod \"octavia-265e-account-create-v5grn\" (UID: \"e1e67e61-5d84-4b90-9d46-60a5ddb1e357\") " pod="openstack/octavia-265e-account-create-v5grn" Oct 03 14:31:45 crc kubenswrapper[4962]: I1003 14:31:45.740826 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfmsl\" (UniqueName: \"kubernetes.io/projected/e1e67e61-5d84-4b90-9d46-60a5ddb1e357-kube-api-access-rfmsl\") pod \"octavia-265e-account-create-v5grn\" (UID: \"e1e67e61-5d84-4b90-9d46-60a5ddb1e357\") " pod="openstack/octavia-265e-account-create-v5grn" Oct 03 14:31:45 crc kubenswrapper[4962]: I1003 14:31:45.761451 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfmsl\" (UniqueName: \"kubernetes.io/projected/e1e67e61-5d84-4b90-9d46-60a5ddb1e357-kube-api-access-rfmsl\") pod \"octavia-265e-account-create-v5grn\" (UID: \"e1e67e61-5d84-4b90-9d46-60a5ddb1e357\") " pod="openstack/octavia-265e-account-create-v5grn" Oct 03 14:31:45 crc kubenswrapper[4962]: I1003 14:31:45.827178 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-265e-account-create-v5grn" Oct 03 14:31:46 crc kubenswrapper[4962]: I1003 14:31:46.099203 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-265e-account-create-v5grn"] Oct 03 14:31:46 crc kubenswrapper[4962]: I1003 14:31:46.714052 4962 generic.go:334] "Generic (PLEG): container finished" podID="e1e67e61-5d84-4b90-9d46-60a5ddb1e357" containerID="ca2295248d66644e2b73ef73c1e4e45f9c23c971a2253e1303fd3cdd9bd1c747" exitCode=0 Oct 03 14:31:46 crc kubenswrapper[4962]: I1003 14:31:46.714134 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-265e-account-create-v5grn" event={"ID":"e1e67e61-5d84-4b90-9d46-60a5ddb1e357","Type":"ContainerDied","Data":"ca2295248d66644e2b73ef73c1e4e45f9c23c971a2253e1303fd3cdd9bd1c747"} Oct 03 14:31:46 crc kubenswrapper[4962]: I1003 14:31:46.715530 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-265e-account-create-v5grn" event={"ID":"e1e67e61-5d84-4b90-9d46-60a5ddb1e357","Type":"ContainerStarted","Data":"796eb92ee9809fcd67dcec3302a818ea61c64380108d6458d444868ae0b0326a"} Oct 03 14:31:47 crc kubenswrapper[4962]: I1003 14:31:47.847948 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-bbqrf" podUID="1c5fddb0-c21c-499d-b4c4-61a807e0c392" containerName="ovn-controller" probeResult="failure" output=< Oct 03 14:31:47 crc kubenswrapper[4962]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 03 14:31:47 crc kubenswrapper[4962]: > Oct 03 14:31:47 crc kubenswrapper[4962]: I1003 14:31:47.875175 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-p7pqn" Oct 03 14:31:47 crc kubenswrapper[4962]: I1003 14:31:47.879441 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-p7pqn" Oct 03 14:31:48 crc kubenswrapper[4962]: I1003 14:31:48.011111 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-bbqrf-config-qpvqp"] Oct 03 14:31:48 crc kubenswrapper[4962]: I1003 14:31:48.013249 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-bbqrf-config-qpvqp" Oct 03 14:31:48 crc kubenswrapper[4962]: I1003 14:31:48.017061 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 03 14:31:48 crc kubenswrapper[4962]: I1003 14:31:48.019417 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-bbqrf-config-qpvqp"] Oct 03 14:31:48 crc kubenswrapper[4962]: I1003 14:31:48.073379 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-265e-account-create-v5grn" Oct 03 14:31:48 crc kubenswrapper[4962]: I1003 14:31:48.082015 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/02e859a0-e617-48af-b83b-a14a4e23296e-var-run\") pod \"ovn-controller-bbqrf-config-qpvqp\" (UID: \"02e859a0-e617-48af-b83b-a14a4e23296e\") " pod="openstack/ovn-controller-bbqrf-config-qpvqp" Oct 03 14:31:48 crc kubenswrapper[4962]: I1003 14:31:48.082145 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/02e859a0-e617-48af-b83b-a14a4e23296e-var-log-ovn\") pod \"ovn-controller-bbqrf-config-qpvqp\" (UID: \"02e859a0-e617-48af-b83b-a14a4e23296e\") " pod="openstack/ovn-controller-bbqrf-config-qpvqp" Oct 03 14:31:48 crc kubenswrapper[4962]: I1003 14:31:48.082221 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gtj5\" (UniqueName: \"kubernetes.io/projected/02e859a0-e617-48af-b83b-a14a4e23296e-kube-api-access-5gtj5\") pod \"ovn-controller-bbqrf-config-qpvqp\" (UID: \"02e859a0-e617-48af-b83b-a14a4e23296e\") " pod="openstack/ovn-controller-bbqrf-config-qpvqp" Oct 03 14:31:48 crc kubenswrapper[4962]: I1003 14:31:48.082253 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/02e859a0-e617-48af-b83b-a14a4e23296e-additional-scripts\") pod \"ovn-controller-bbqrf-config-qpvqp\" (UID: \"02e859a0-e617-48af-b83b-a14a4e23296e\") " pod="openstack/ovn-controller-bbqrf-config-qpvqp" Oct 03 14:31:48 crc kubenswrapper[4962]: I1003 14:31:48.082301 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/02e859a0-e617-48af-b83b-a14a4e23296e-scripts\") pod \"ovn-controller-bbqrf-config-qpvqp\" (UID: \"02e859a0-e617-48af-b83b-a14a4e23296e\") " pod="openstack/ovn-controller-bbqrf-config-qpvqp" Oct 03 14:31:48 crc kubenswrapper[4962]: I1003 14:31:48.082408 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/02e859a0-e617-48af-b83b-a14a4e23296e-var-run-ovn\") pod \"ovn-controller-bbqrf-config-qpvqp\" (UID: \"02e859a0-e617-48af-b83b-a14a4e23296e\") " pod="openstack/ovn-controller-bbqrf-config-qpvqp" Oct 03 14:31:48 crc kubenswrapper[4962]: I1003 14:31:48.183919 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfmsl\" (UniqueName: \"kubernetes.io/projected/e1e67e61-5d84-4b90-9d46-60a5ddb1e357-kube-api-access-rfmsl\") pod \"e1e67e61-5d84-4b90-9d46-60a5ddb1e357\" (UID: \"e1e67e61-5d84-4b90-9d46-60a5ddb1e357\") " Oct 03 14:31:48 crc kubenswrapper[4962]: I1003 14:31:48.184820 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/02e859a0-e617-48af-b83b-a14a4e23296e-var-log-ovn\") pod \"ovn-controller-bbqrf-config-qpvqp\" (UID: \"02e859a0-e617-48af-b83b-a14a4e23296e\") " pod="openstack/ovn-controller-bbqrf-config-qpvqp" Oct 03 14:31:48 crc kubenswrapper[4962]: I1003 14:31:48.184993 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gtj5\" (UniqueName: 
\"kubernetes.io/projected/02e859a0-e617-48af-b83b-a14a4e23296e-kube-api-access-5gtj5\") pod \"ovn-controller-bbqrf-config-qpvqp\" (UID: \"02e859a0-e617-48af-b83b-a14a4e23296e\") " pod="openstack/ovn-controller-bbqrf-config-qpvqp" Oct 03 14:31:48 crc kubenswrapper[4962]: I1003 14:31:48.185110 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/02e859a0-e617-48af-b83b-a14a4e23296e-additional-scripts\") pod \"ovn-controller-bbqrf-config-qpvqp\" (UID: \"02e859a0-e617-48af-b83b-a14a4e23296e\") " pod="openstack/ovn-controller-bbqrf-config-qpvqp" Oct 03 14:31:48 crc kubenswrapper[4962]: I1003 14:31:48.185187 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/02e859a0-e617-48af-b83b-a14a4e23296e-var-log-ovn\") pod \"ovn-controller-bbqrf-config-qpvqp\" (UID: \"02e859a0-e617-48af-b83b-a14a4e23296e\") " pod="openstack/ovn-controller-bbqrf-config-qpvqp" Oct 03 14:31:48 crc kubenswrapper[4962]: I1003 14:31:48.185298 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/02e859a0-e617-48af-b83b-a14a4e23296e-scripts\") pod \"ovn-controller-bbqrf-config-qpvqp\" (UID: \"02e859a0-e617-48af-b83b-a14a4e23296e\") " pod="openstack/ovn-controller-bbqrf-config-qpvqp" Oct 03 14:31:48 crc kubenswrapper[4962]: I1003 14:31:48.185448 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/02e859a0-e617-48af-b83b-a14a4e23296e-var-run-ovn\") pod \"ovn-controller-bbqrf-config-qpvqp\" (UID: \"02e859a0-e617-48af-b83b-a14a4e23296e\") " pod="openstack/ovn-controller-bbqrf-config-qpvqp" Oct 03 14:31:48 crc kubenswrapper[4962]: I1003 14:31:48.185561 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/02e859a0-e617-48af-b83b-a14a4e23296e-var-run\") pod \"ovn-controller-bbqrf-config-qpvqp\" (UID: \"02e859a0-e617-48af-b83b-a14a4e23296e\") " pod="openstack/ovn-controller-bbqrf-config-qpvqp" Oct 03 14:31:48 crc kubenswrapper[4962]: I1003 14:31:48.185814 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/02e859a0-e617-48af-b83b-a14a4e23296e-var-run\") pod \"ovn-controller-bbqrf-config-qpvqp\" (UID: \"02e859a0-e617-48af-b83b-a14a4e23296e\") " pod="openstack/ovn-controller-bbqrf-config-qpvqp" Oct 03 14:31:48 crc kubenswrapper[4962]: I1003 14:31:48.185874 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/02e859a0-e617-48af-b83b-a14a4e23296e-var-run-ovn\") pod \"ovn-controller-bbqrf-config-qpvqp\" (UID: \"02e859a0-e617-48af-b83b-a14a4e23296e\") " pod="openstack/ovn-controller-bbqrf-config-qpvqp" Oct 03 14:31:48 crc kubenswrapper[4962]: I1003 14:31:48.186147 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/02e859a0-e617-48af-b83b-a14a4e23296e-additional-scripts\") pod \"ovn-controller-bbqrf-config-qpvqp\" (UID: \"02e859a0-e617-48af-b83b-a14a4e23296e\") " pod="openstack/ovn-controller-bbqrf-config-qpvqp" Oct 03 14:31:48 crc kubenswrapper[4962]: I1003 14:31:48.189884 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/e1e67e61-5d84-4b90-9d46-60a5ddb1e357-kube-api-access-rfmsl" (OuterVolumeSpecName: "kube-api-access-rfmsl") pod "e1e67e61-5d84-4b90-9d46-60a5ddb1e357" (UID: "e1e67e61-5d84-4b90-9d46-60a5ddb1e357"). InnerVolumeSpecName "kube-api-access-rfmsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:31:48 crc kubenswrapper[4962]: I1003 14:31:48.191909 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/02e859a0-e617-48af-b83b-a14a4e23296e-scripts\") pod \"ovn-controller-bbqrf-config-qpvqp\" (UID: \"02e859a0-e617-48af-b83b-a14a4e23296e\") " pod="openstack/ovn-controller-bbqrf-config-qpvqp" Oct 03 14:31:48 crc kubenswrapper[4962]: I1003 14:31:48.202415 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gtj5\" (UniqueName: \"kubernetes.io/projected/02e859a0-e617-48af-b83b-a14a4e23296e-kube-api-access-5gtj5\") pod \"ovn-controller-bbqrf-config-qpvqp\" (UID: \"02e859a0-e617-48af-b83b-a14a4e23296e\") " pod="openstack/ovn-controller-bbqrf-config-qpvqp" Oct 03 14:31:48 crc kubenswrapper[4962]: I1003 14:31:48.289589 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfmsl\" (UniqueName: \"kubernetes.io/projected/e1e67e61-5d84-4b90-9d46-60a5ddb1e357-kube-api-access-rfmsl\") on node \"crc\" DevicePath \"\"" Oct 03 14:31:48 crc kubenswrapper[4962]: I1003 14:31:48.385042 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-bbqrf-config-qpvqp" Oct 03 14:31:48 crc kubenswrapper[4962]: E1003 14:31:48.417376 4962 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1e67e61_5d84_4b90_9d46_60a5ddb1e357.slice/crio-796eb92ee9809fcd67dcec3302a818ea61c64380108d6458d444868ae0b0326a\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1e67e61_5d84_4b90_9d46_60a5ddb1e357.slice\": RecentStats: unable to find data in memory cache]" Oct 03 14:31:48 crc kubenswrapper[4962]: I1003 14:31:48.733040 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-265e-account-create-v5grn" Oct 03 14:31:48 crc kubenswrapper[4962]: I1003 14:31:48.733239 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-265e-account-create-v5grn" event={"ID":"e1e67e61-5d84-4b90-9d46-60a5ddb1e357","Type":"ContainerDied","Data":"796eb92ee9809fcd67dcec3302a818ea61c64380108d6458d444868ae0b0326a"} Oct 03 14:31:48 crc kubenswrapper[4962]: I1003 14:31:48.733693 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="796eb92ee9809fcd67dcec3302a818ea61c64380108d6458d444868ae0b0326a" Oct 03 14:31:48 crc kubenswrapper[4962]: I1003 14:31:48.818467 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-bbqrf-config-qpvqp"] Oct 03 14:31:48 crc kubenswrapper[4962]: W1003 14:31:48.823545 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02e859a0_e617_48af_b83b_a14a4e23296e.slice/crio-dc3d42f301304789c8a7d6ebbaac6aaf59a41adae6f4900181190fccc2ec92db WatchSource:0}: Error finding container dc3d42f301304789c8a7d6ebbaac6aaf59a41adae6f4900181190fccc2ec92db: Status 404 returned error can't find the container with id dc3d42f301304789c8a7d6ebbaac6aaf59a41adae6f4900181190fccc2ec92db Oct 03 14:31:49 crc kubenswrapper[4962]: I1003 14:31:49.754488 4962 generic.go:334] "Generic (PLEG): container finished" podID="02e859a0-e617-48af-b83b-a14a4e23296e" containerID="74b83a3f091c631c385d468a8b1b295415938af2c71ed0c84c74ebcbd183ff16" exitCode=0 Oct 03 14:31:49 crc kubenswrapper[4962]: I1003 14:31:49.754843 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bbqrf-config-qpvqp" event={"ID":"02e859a0-e617-48af-b83b-a14a4e23296e","Type":"ContainerDied","Data":"74b83a3f091c631c385d468a8b1b295415938af2c71ed0c84c74ebcbd183ff16"} Oct 03 14:31:49 crc kubenswrapper[4962]: I1003 14:31:49.754964 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bbqrf-config-qpvqp" event={"ID":"02e859a0-e617-48af-b83b-a14a4e23296e","Type":"ContainerStarted","Data":"dc3d42f301304789c8a7d6ebbaac6aaf59a41adae6f4900181190fccc2ec92db"} Oct 03 14:31:51 crc kubenswrapper[4962]: I1003 14:31:51.088108 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-bbqrf-config-qpvqp" Oct 03 14:31:51 crc kubenswrapper[4962]: I1003 14:31:51.241034 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/02e859a0-e617-48af-b83b-a14a4e23296e-var-run\") pod \"02e859a0-e617-48af-b83b-a14a4e23296e\" (UID: \"02e859a0-e617-48af-b83b-a14a4e23296e\") " Oct 03 14:31:51 crc kubenswrapper[4962]: I1003 14:31:51.241105 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/02e859a0-e617-48af-b83b-a14a4e23296e-scripts\") pod \"02e859a0-e617-48af-b83b-a14a4e23296e\" (UID: \"02e859a0-e617-48af-b83b-a14a4e23296e\") " Oct 03 14:31:51 crc kubenswrapper[4962]: I1003 14:31:51.241165 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/02e859a0-e617-48af-b83b-a14a4e23296e-var-log-ovn\") pod \"02e859a0-e617-48af-b83b-a14a4e23296e\" (UID: \"02e859a0-e617-48af-b83b-a14a4e23296e\") " Oct 03 14:31:51 crc kubenswrapper[4962]: I1003 14:31:51.241152 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02e859a0-e617-48af-b83b-a14a4e23296e-var-run" (OuterVolumeSpecName: "var-run") pod "02e859a0-e617-48af-b83b-a14a4e23296e" (UID: "02e859a0-e617-48af-b83b-a14a4e23296e"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 14:31:51 crc kubenswrapper[4962]: I1003 14:31:51.241235 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/02e859a0-e617-48af-b83b-a14a4e23296e-additional-scripts\") pod \"02e859a0-e617-48af-b83b-a14a4e23296e\" (UID: \"02e859a0-e617-48af-b83b-a14a4e23296e\") " Oct 03 14:31:51 crc kubenswrapper[4962]: I1003 14:31:51.241289 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02e859a0-e617-48af-b83b-a14a4e23296e-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "02e859a0-e617-48af-b83b-a14a4e23296e" (UID: "02e859a0-e617-48af-b83b-a14a4e23296e"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 14:31:51 crc kubenswrapper[4962]: I1003 14:31:51.241424 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/02e859a0-e617-48af-b83b-a14a4e23296e-var-run-ovn\") pod \"02e859a0-e617-48af-b83b-a14a4e23296e\" (UID: \"02e859a0-e617-48af-b83b-a14a4e23296e\") " Oct 03 14:31:51 crc kubenswrapper[4962]: I1003 14:31:51.241504 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gtj5\" (UniqueName: \"kubernetes.io/projected/02e859a0-e617-48af-b83b-a14a4e23296e-kube-api-access-5gtj5\") pod \"02e859a0-e617-48af-b83b-a14a4e23296e\" (UID: \"02e859a0-e617-48af-b83b-a14a4e23296e\") " Oct 03 14:31:51 crc kubenswrapper[4962]: I1003 14:31:51.241546 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02e859a0-e617-48af-b83b-a14a4e23296e-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "02e859a0-e617-48af-b83b-a14a4e23296e" (UID: "02e859a0-e617-48af-b83b-a14a4e23296e"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 14:31:51 crc kubenswrapper[4962]: I1003 14:31:51.242089 4962 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/02e859a0-e617-48af-b83b-a14a4e23296e-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 03 14:31:51 crc kubenswrapper[4962]: I1003 14:31:51.242110 4962 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/02e859a0-e617-48af-b83b-a14a4e23296e-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 03 14:31:51 crc kubenswrapper[4962]: I1003 14:31:51.242122 4962 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/02e859a0-e617-48af-b83b-a14a4e23296e-var-run\") on node \"crc\" DevicePath \"\"" Oct 03 14:31:51 crc kubenswrapper[4962]: I1003 14:31:51.242205 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02e859a0-e617-48af-b83b-a14a4e23296e-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "02e859a0-e617-48af-b83b-a14a4e23296e" (UID: "02e859a0-e617-48af-b83b-a14a4e23296e"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:31:51 crc kubenswrapper[4962]: I1003 14:31:51.242444 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02e859a0-e617-48af-b83b-a14a4e23296e-scripts" (OuterVolumeSpecName: "scripts") pod "02e859a0-e617-48af-b83b-a14a4e23296e" (UID: "02e859a0-e617-48af-b83b-a14a4e23296e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:31:51 crc kubenswrapper[4962]: I1003 14:31:51.247679 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02e859a0-e617-48af-b83b-a14a4e23296e-kube-api-access-5gtj5" (OuterVolumeSpecName: "kube-api-access-5gtj5") pod "02e859a0-e617-48af-b83b-a14a4e23296e" (UID: "02e859a0-e617-48af-b83b-a14a4e23296e"). InnerVolumeSpecName "kube-api-access-5gtj5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:31:51 crc kubenswrapper[4962]: I1003 14:31:51.344709 4962 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/02e859a0-e617-48af-b83b-a14a4e23296e-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 14:31:51 crc kubenswrapper[4962]: I1003 14:31:51.344746 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gtj5\" (UniqueName: \"kubernetes.io/projected/02e859a0-e617-48af-b83b-a14a4e23296e-kube-api-access-5gtj5\") on node \"crc\" DevicePath \"\"" Oct 03 14:31:51 crc kubenswrapper[4962]: I1003 14:31:51.344760 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/02e859a0-e617-48af-b83b-a14a4e23296e-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 14:31:51 crc kubenswrapper[4962]: I1003 14:31:51.771742 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bbqrf-config-qpvqp" event={"ID":"02e859a0-e617-48af-b83b-a14a4e23296e","Type":"ContainerDied","Data":"dc3d42f301304789c8a7d6ebbaac6aaf59a41adae6f4900181190fccc2ec92db"} Oct 03 14:31:51 crc kubenswrapper[4962]: I1003 14:31:51.771791 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc3d42f301304789c8a7d6ebbaac6aaf59a41adae6f4900181190fccc2ec92db" Oct 03 14:31:51 crc kubenswrapper[4962]: I1003 14:31:51.771819 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-bbqrf-config-qpvqp" Oct 03 14:31:52 crc kubenswrapper[4962]: I1003 14:31:52.165709 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-bbqrf-config-qpvqp"] Oct 03 14:31:52 crc kubenswrapper[4962]: I1003 14:31:52.176941 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-bbqrf-config-qpvqp"] Oct 03 14:31:52 crc kubenswrapper[4962]: I1003 14:31:52.270054 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02e859a0-e617-48af-b83b-a14a4e23296e" path="/var/lib/kubelet/pods/02e859a0-e617-48af-b83b-a14a4e23296e/volumes" Oct 03 14:31:52 crc kubenswrapper[4962]: I1003 14:31:52.400990 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-api-6f6fb44c8b-xbxhh"] Oct 03 14:31:52 crc kubenswrapper[4962]: E1003 14:31:52.401355 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02e859a0-e617-48af-b83b-a14a4e23296e" containerName="ovn-config" Oct 03 14:31:52 crc kubenswrapper[4962]: I1003 14:31:52.401366 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="02e859a0-e617-48af-b83b-a14a4e23296e" containerName="ovn-config" Oct 03 14:31:52 crc kubenswrapper[4962]: E1003 14:31:52.401382 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1e67e61-5d84-4b90-9d46-60a5ddb1e357" containerName="mariadb-account-create" Oct 03 14:31:52 crc kubenswrapper[4962]: I1003 14:31:52.401388 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1e67e61-5d84-4b90-9d46-60a5ddb1e357" containerName="mariadb-account-create" Oct 03 14:31:52 crc kubenswrapper[4962]: I1003 14:31:52.401591 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1e67e61-5d84-4b90-9d46-60a5ddb1e357" containerName="mariadb-account-create" Oct 03 14:31:52 crc kubenswrapper[4962]: I1003 14:31:52.401612 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="02e859a0-e617-48af-b83b-a14a4e23296e" containerName="ovn-config" 
Oct 03 14:31:52 crc kubenswrapper[4962]: I1003 14:31:52.402997 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-6f6fb44c8b-xbxhh" Oct 03 14:31:52 crc kubenswrapper[4962]: I1003 14:31:52.407616 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-octavia-dockercfg-lwwkb" Oct 03 14:31:52 crc kubenswrapper[4962]: I1003 14:31:52.407850 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-config-data" Oct 03 14:31:52 crc kubenswrapper[4962]: I1003 14:31:52.408036 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-scripts" Oct 03 14:31:52 crc kubenswrapper[4962]: I1003 14:31:52.428741 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-6f6fb44c8b-xbxhh"] Oct 03 14:31:52 crc kubenswrapper[4962]: I1003 14:31:52.464829 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/685d50a6-2ad8-4462-9526-193412684ac5-octavia-run\") pod \"octavia-api-6f6fb44c8b-xbxhh\" (UID: \"685d50a6-2ad8-4462-9526-193412684ac5\") " pod="openstack/octavia-api-6f6fb44c8b-xbxhh" Oct 03 14:31:52 crc kubenswrapper[4962]: I1003 14:31:52.465081 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/685d50a6-2ad8-4462-9526-193412684ac5-config-data-merged\") pod \"octavia-api-6f6fb44c8b-xbxhh\" (UID: \"685d50a6-2ad8-4462-9526-193412684ac5\") " pod="openstack/octavia-api-6f6fb44c8b-xbxhh" Oct 03 14:31:52 crc kubenswrapper[4962]: I1003 14:31:52.465437 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/685d50a6-2ad8-4462-9526-193412684ac5-scripts\") pod \"octavia-api-6f6fb44c8b-xbxhh\" (UID: \"685d50a6-2ad8-4462-9526-193412684ac5\") " pod="openstack/octavia-api-6f6fb44c8b-xbxhh" Oct 03 14:31:52 crc kubenswrapper[4962]: I1003 14:31:52.465477 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/685d50a6-2ad8-4462-9526-193412684ac5-config-data\") pod \"octavia-api-6f6fb44c8b-xbxhh\" (UID: \"685d50a6-2ad8-4462-9526-193412684ac5\") " pod="openstack/octavia-api-6f6fb44c8b-xbxhh" Oct 03 14:31:52 crc kubenswrapper[4962]: I1003 14:31:52.465500 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/685d50a6-2ad8-4462-9526-193412684ac5-combined-ca-bundle\") pod \"octavia-api-6f6fb44c8b-xbxhh\" (UID: \"685d50a6-2ad8-4462-9526-193412684ac5\") " pod="openstack/octavia-api-6f6fb44c8b-xbxhh" Oct 03 14:31:52 crc kubenswrapper[4962]: I1003 14:31:52.567381 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/685d50a6-2ad8-4462-9526-193412684ac5-scripts\") pod \"octavia-api-6f6fb44c8b-xbxhh\" (UID: \"685d50a6-2ad8-4462-9526-193412684ac5\") " pod="openstack/octavia-api-6f6fb44c8b-xbxhh" Oct 03 14:31:52 crc kubenswrapper[4962]: I1003 14:31:52.567438 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/685d50a6-2ad8-4462-9526-193412684ac5-config-data\") pod \"octavia-api-6f6fb44c8b-xbxhh\" (UID: 
\"685d50a6-2ad8-4462-9526-193412684ac5\") " pod="openstack/octavia-api-6f6fb44c8b-xbxhh" Oct 03 14:31:52 crc kubenswrapper[4962]: I1003 14:31:52.567460 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/685d50a6-2ad8-4462-9526-193412684ac5-combined-ca-bundle\") pod \"octavia-api-6f6fb44c8b-xbxhh\" (UID: \"685d50a6-2ad8-4462-9526-193412684ac5\") " pod="openstack/octavia-api-6f6fb44c8b-xbxhh" Oct 03 14:31:52 crc kubenswrapper[4962]: I1003 14:31:52.567517 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/685d50a6-2ad8-4462-9526-193412684ac5-octavia-run\") pod \"octavia-api-6f6fb44c8b-xbxhh\" (UID: \"685d50a6-2ad8-4462-9526-193412684ac5\") " pod="openstack/octavia-api-6f6fb44c8b-xbxhh" Oct 03 14:31:52 crc kubenswrapper[4962]: I1003 14:31:52.567548 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/685d50a6-2ad8-4462-9526-193412684ac5-config-data-merged\") pod \"octavia-api-6f6fb44c8b-xbxhh\" (UID: \"685d50a6-2ad8-4462-9526-193412684ac5\") " pod="openstack/octavia-api-6f6fb44c8b-xbxhh" Oct 03 14:31:52 crc kubenswrapper[4962]: I1003 14:31:52.568179 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/685d50a6-2ad8-4462-9526-193412684ac5-octavia-run\") pod \"octavia-api-6f6fb44c8b-xbxhh\" (UID: \"685d50a6-2ad8-4462-9526-193412684ac5\") " pod="openstack/octavia-api-6f6fb44c8b-xbxhh" Oct 03 14:31:52 crc kubenswrapper[4962]: I1003 14:31:52.568179 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/685d50a6-2ad8-4462-9526-193412684ac5-config-data-merged\") pod \"octavia-api-6f6fb44c8b-xbxhh\" (UID: \"685d50a6-2ad8-4462-9526-193412684ac5\") " pod="openstack/octavia-api-6f6fb44c8b-xbxhh" Oct 03 14:31:52 crc kubenswrapper[4962]: I1003 14:31:52.572385 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/685d50a6-2ad8-4462-9526-193412684ac5-scripts\") pod \"octavia-api-6f6fb44c8b-xbxhh\" (UID: \"685d50a6-2ad8-4462-9526-193412684ac5\") " pod="openstack/octavia-api-6f6fb44c8b-xbxhh" Oct 03 14:31:52 crc kubenswrapper[4962]: I1003 14:31:52.574059 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/685d50a6-2ad8-4462-9526-193412684ac5-combined-ca-bundle\") pod \"octavia-api-6f6fb44c8b-xbxhh\" (UID: \"685d50a6-2ad8-4462-9526-193412684ac5\") " pod="openstack/octavia-api-6f6fb44c8b-xbxhh" Oct 03 14:31:52 crc kubenswrapper[4962]: I1003 14:31:52.589053 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/685d50a6-2ad8-4462-9526-193412684ac5-config-data\") pod \"octavia-api-6f6fb44c8b-xbxhh\" (UID: \"685d50a6-2ad8-4462-9526-193412684ac5\") " pod="openstack/octavia-api-6f6fb44c8b-xbxhh" Oct 03 14:31:52 crc kubenswrapper[4962]: I1003 14:31:52.725388 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-api-6f6fb44c8b-xbxhh" Oct 03 14:31:52 crc kubenswrapper[4962]: I1003 14:31:52.911543 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-bbqrf" Oct 03 14:31:53 crc kubenswrapper[4962]: I1003 14:31:53.373506 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-6f6fb44c8b-xbxhh"] Oct 03 14:31:53 crc kubenswrapper[4962]: W1003 14:31:53.472022 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod685d50a6_2ad8_4462_9526_193412684ac5.slice/crio-aeff1e09a589c82ab931d55b77a55d9d8fcadf8ae4a3cb0c141770bced50f30e WatchSource:0}: Error finding container aeff1e09a589c82ab931d55b77a55d9d8fcadf8ae4a3cb0c141770bced50f30e: Status 404 returned error can't find the container with id aeff1e09a589c82ab931d55b77a55d9d8fcadf8ae4a3cb0c141770bced50f30e Oct 03 14:31:53 crc kubenswrapper[4962]: I1003 14:31:53.807267 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-6f6fb44c8b-xbxhh" event={"ID":"685d50a6-2ad8-4462-9526-193412684ac5","Type":"ContainerStarted","Data":"aeff1e09a589c82ab931d55b77a55d9d8fcadf8ae4a3cb0c141770bced50f30e"} Oct 03 14:32:01 crc kubenswrapper[4962]: I1003 14:32:01.870716 4962 scope.go:117] "RemoveContainer" containerID="c49cbfe3abd20040f4005810cb1e8438c9700d23f46975313c122b4ae32695f1" Oct 03 14:32:05 crc kubenswrapper[4962]: I1003 14:32:05.710140 4962 scope.go:117] "RemoveContainer" containerID="c46c4ea63ed49dcc7e3c6542f0c95e81f46d070561af577146f41f5fd7bc7141" Oct 03 14:32:08 crc kubenswrapper[4962]: I1003 14:32:08.986838 4962 generic.go:334] "Generic (PLEG): container finished" podID="685d50a6-2ad8-4462-9526-193412684ac5" containerID="4a535b6e40b30fe8aa8fc68494fe164c796c0fd324e22f118497c50a0ba6ef10" exitCode=0 Oct 03 14:32:08 crc kubenswrapper[4962]: I1003 14:32:08.987370 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-6f6fb44c8b-xbxhh" event={"ID":"685d50a6-2ad8-4462-9526-193412684ac5","Type":"ContainerDied","Data":"4a535b6e40b30fe8aa8fc68494fe164c796c0fd324e22f118497c50a0ba6ef10"} Oct 03 14:32:09 crc kubenswrapper[4962]: I1003 14:32:09.998826 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-6f6fb44c8b-xbxhh" event={"ID":"685d50a6-2ad8-4462-9526-193412684ac5","Type":"ContainerStarted","Data":"a9b3b6d6d75b1c7e73e413b785a21d45f2cbd145b8c0d49a0dbae00231086f2a"} Oct 03 14:32:09 crc kubenswrapper[4962]: I1003 14:32:09.999446 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-6f6fb44c8b-xbxhh" event={"ID":"685d50a6-2ad8-4462-9526-193412684ac5","Type":"ContainerStarted","Data":"c3437944fa6ba95b2daa85c7098db58b8b8d7f23aec7d90dc88fc7824357a4f0"} Oct 03 14:32:10 crc kubenswrapper[4962]: I1003 14:32:10.000612 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-6f6fb44c8b-xbxhh" Oct 03 14:32:10 crc kubenswrapper[4962]: I1003 14:32:10.000656 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-6f6fb44c8b-xbxhh" Oct 03 14:32:10 crc kubenswrapper[4962]: I1003 14:32:10.026300 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-api-6f6fb44c8b-xbxhh" podStartSLOduration=3.693338269 podStartE2EDuration="18.026274772s" podCreationTimestamp="2025-10-03 14:31:52 +0000 UTC" firstStartedPulling="2025-10-03 14:31:53.474781223 +0000 UTC 
m=+6121.878679058" lastFinishedPulling="2025-10-03 14:32:07.807717726 +0000 UTC m=+6136.211615561" observedRunningTime="2025-10-03 14:32:10.016525652 +0000 UTC m=+6138.420423487" watchObservedRunningTime="2025-10-03 14:32:10.026274772 +0000 UTC m=+6138.430172607" Oct 03 14:32:22 crc kubenswrapper[4962]: I1003 14:32:22.420970 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-rsyslog-flpcl"] Oct 03 14:32:22 crc kubenswrapper[4962]: I1003 14:32:22.426172 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-flpcl" Oct 03 14:32:22 crc kubenswrapper[4962]: I1003 14:32:22.429854 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-scripts" Oct 03 14:32:22 crc kubenswrapper[4962]: I1003 14:32:22.430088 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"octavia-hmport-map" Oct 03 14:32:22 crc kubenswrapper[4962]: I1003 14:32:22.430245 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-config-data" Oct 03 14:32:22 crc kubenswrapper[4962]: I1003 14:32:22.436840 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-flpcl"] Oct 03 14:32:22 crc kubenswrapper[4962]: I1003 14:32:22.485026 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/a90613f1-d29b-41eb-b925-f28918fbcd2b-hm-ports\") pod \"octavia-rsyslog-flpcl\" (UID: \"a90613f1-d29b-41eb-b925-f28918fbcd2b\") " pod="openstack/octavia-rsyslog-flpcl" Oct 03 14:32:22 crc kubenswrapper[4962]: I1003 14:32:22.485103 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/a90613f1-d29b-41eb-b925-f28918fbcd2b-config-data-merged\") pod \"octavia-rsyslog-flpcl\" (UID: \"a90613f1-d29b-41eb-b925-f28918fbcd2b\") " pod="openstack/octavia-rsyslog-flpcl" Oct 03 14:32:22 crc kubenswrapper[4962]: I1003 14:32:22.485406 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a90613f1-d29b-41eb-b925-f28918fbcd2b-config-data\") pod \"octavia-rsyslog-flpcl\" (UID: \"a90613f1-d29b-41eb-b925-f28918fbcd2b\") " pod="openstack/octavia-rsyslog-flpcl" Oct 03 14:32:22 crc kubenswrapper[4962]: I1003 14:32:22.485554 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a90613f1-d29b-41eb-b925-f28918fbcd2b-scripts\") pod \"octavia-rsyslog-flpcl\" (UID: \"a90613f1-d29b-41eb-b925-f28918fbcd2b\") " pod="openstack/octavia-rsyslog-flpcl" Oct 03 14:32:22 crc kubenswrapper[4962]: I1003 14:32:22.587592 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/a90613f1-d29b-41eb-b925-f28918fbcd2b-config-data-merged\") pod \"octavia-rsyslog-flpcl\" (UID: \"a90613f1-d29b-41eb-b925-f28918fbcd2b\") " pod="openstack/octavia-rsyslog-flpcl" Oct 03 14:32:22 crc kubenswrapper[4962]: I1003 14:32:22.587718 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a90613f1-d29b-41eb-b925-f28918fbcd2b-config-data\") pod \"octavia-rsyslog-flpcl\" (UID: \"a90613f1-d29b-41eb-b925-f28918fbcd2b\") " pod="openstack/octavia-rsyslog-flpcl" 
Oct 03 14:32:22 crc kubenswrapper[4962]: I1003 14:32:22.587768 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a90613f1-d29b-41eb-b925-f28918fbcd2b-scripts\") pod \"octavia-rsyslog-flpcl\" (UID: \"a90613f1-d29b-41eb-b925-f28918fbcd2b\") " pod="openstack/octavia-rsyslog-flpcl" Oct 03 14:32:22 crc kubenswrapper[4962]: I1003 14:32:22.587871 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/a90613f1-d29b-41eb-b925-f28918fbcd2b-hm-ports\") pod \"octavia-rsyslog-flpcl\" (UID: \"a90613f1-d29b-41eb-b925-f28918fbcd2b\") " pod="openstack/octavia-rsyslog-flpcl" Oct 03 14:32:22 crc kubenswrapper[4962]: I1003 14:32:22.588648 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/a90613f1-d29b-41eb-b925-f28918fbcd2b-hm-ports\") pod \"octavia-rsyslog-flpcl\" (UID: \"a90613f1-d29b-41eb-b925-f28918fbcd2b\") " pod="openstack/octavia-rsyslog-flpcl" Oct 03 14:32:22 crc kubenswrapper[4962]: I1003 14:32:22.588883 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/a90613f1-d29b-41eb-b925-f28918fbcd2b-config-data-merged\") pod \"octavia-rsyslog-flpcl\" (UID: \"a90613f1-d29b-41eb-b925-f28918fbcd2b\") " pod="openstack/octavia-rsyslog-flpcl" Oct 03 14:32:22 crc kubenswrapper[4962]: I1003 14:32:22.594043 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a90613f1-d29b-41eb-b925-f28918fbcd2b-config-data\") pod \"octavia-rsyslog-flpcl\" (UID: \"a90613f1-d29b-41eb-b925-f28918fbcd2b\") " pod="openstack/octavia-rsyslog-flpcl" Oct 03 14:32:22 crc kubenswrapper[4962]: I1003 14:32:22.594830 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a90613f1-d29b-41eb-b925-f28918fbcd2b-scripts\") pod \"octavia-rsyslog-flpcl\" (UID: \"a90613f1-d29b-41eb-b925-f28918fbcd2b\") " pod="openstack/octavia-rsyslog-flpcl" Oct 03 14:32:22 crc kubenswrapper[4962]: I1003 14:32:22.762383 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-flpcl" Oct 03 14:32:23 crc kubenswrapper[4962]: I1003 14:32:23.317291 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-flpcl"] Oct 03 14:32:23 crc kubenswrapper[4962]: W1003 14:32:23.322435 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda90613f1_d29b_41eb_b925_f28918fbcd2b.slice/crio-6ff3bc6815b120c0330c8c3cccd840c1e84c48ac1723bf5015fac42c713055b9 WatchSource:0}: Error finding container 6ff3bc6815b120c0330c8c3cccd840c1e84c48ac1723bf5015fac42c713055b9: Status 404 returned error can't find the container with id 6ff3bc6815b120c0330c8c3cccd840c1e84c48ac1723bf5015fac42c713055b9 Oct 03 14:32:23 crc kubenswrapper[4962]: I1003 14:32:23.435799 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-59f8cff499-pg468"] Oct 03 14:32:23 crc kubenswrapper[4962]: I1003 14:32:23.437484 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-pg468" Oct 03 14:32:23 crc kubenswrapper[4962]: I1003 14:32:23.440326 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Oct 03 14:32:23 crc kubenswrapper[4962]: I1003 14:32:23.453623 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-pg468"] Oct 03 14:32:23 crc kubenswrapper[4962]: I1003 14:32:23.504155 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/314fac3f-f260-4b98-bc77-541aa9bc29aa-amphora-image\") pod \"octavia-image-upload-59f8cff499-pg468\" (UID: \"314fac3f-f260-4b98-bc77-541aa9bc29aa\") " pod="openstack/octavia-image-upload-59f8cff499-pg468" Oct 03 14:32:23 crc kubenswrapper[4962]: I1003 14:32:23.504698 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/314fac3f-f260-4b98-bc77-541aa9bc29aa-httpd-config\") pod \"octavia-image-upload-59f8cff499-pg468\" (UID: \"314fac3f-f260-4b98-bc77-541aa9bc29aa\") " pod="openstack/octavia-image-upload-59f8cff499-pg468" Oct 03 14:32:23 crc kubenswrapper[4962]: I1003 14:32:23.606763 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/314fac3f-f260-4b98-bc77-541aa9bc29aa-amphora-image\") pod \"octavia-image-upload-59f8cff499-pg468\" (UID: \"314fac3f-f260-4b98-bc77-541aa9bc29aa\") " pod="openstack/octavia-image-upload-59f8cff499-pg468" Oct 03 14:32:23 crc kubenswrapper[4962]: I1003 14:32:23.606867 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/314fac3f-f260-4b98-bc77-541aa9bc29aa-httpd-config\") pod \"octavia-image-upload-59f8cff499-pg468\" (UID: \"314fac3f-f260-4b98-bc77-541aa9bc29aa\") " pod="openstack/octavia-image-upload-59f8cff499-pg468" Oct 03 14:32:23 crc kubenswrapper[4962]: I1003 14:32:23.607490 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/314fac3f-f260-4b98-bc77-541aa9bc29aa-amphora-image\") pod \"octavia-image-upload-59f8cff499-pg468\" (UID: \"314fac3f-f260-4b98-bc77-541aa9bc29aa\") " pod="openstack/octavia-image-upload-59f8cff499-pg468" Oct 03 14:32:23 crc kubenswrapper[4962]: I1003 14:32:23.614818 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/314fac3f-f260-4b98-bc77-541aa9bc29aa-httpd-config\") pod \"octavia-image-upload-59f8cff499-pg468\" (UID: \"314fac3f-f260-4b98-bc77-541aa9bc29aa\") " pod="openstack/octavia-image-upload-59f8cff499-pg468" Oct 03 14:32:23 crc kubenswrapper[4962]: I1003 14:32:23.764192 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-pg468" Oct 03 14:32:24 crc kubenswrapper[4962]: I1003 14:32:24.129401 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-flpcl" event={"ID":"a90613f1-d29b-41eb-b925-f28918fbcd2b","Type":"ContainerStarted","Data":"6ff3bc6815b120c0330c8c3cccd840c1e84c48ac1723bf5015fac42c713055b9"} Oct 03 14:32:24 crc kubenswrapper[4962]: I1003 14:32:24.252300 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-pg468"] Oct 03 14:32:25 crc kubenswrapper[4962]: I1003 14:32:25.140795 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-pg468" event={"ID":"314fac3f-f260-4b98-bc77-541aa9bc29aa","Type":"ContainerStarted","Data":"e1d2a19df1f73dc8204ac2c03859165f105665f5347645dfcec2546a25dda8c9"} Oct 03 14:32:27 crc kubenswrapper[4962]: I1003 14:32:27.163753 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-flpcl" event={"ID":"a90613f1-d29b-41eb-b925-f28918fbcd2b","Type":"ContainerStarted","Data":"62ac2998ea25ea189bc364f873344c71eb4412ad5937716ef0c235c2915c5424"} Oct 03 14:32:27 crc kubenswrapper[4962]: I1003 14:32:27.179564 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-6f6fb44c8b-xbxhh" Oct 03 14:32:27 crc kubenswrapper[4962]: I1003 14:32:27.384253 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-6f6fb44c8b-xbxhh" Oct 03 14:32:29 crc kubenswrapper[4962]: I1003 14:32:29.195495 4962 generic.go:334] "Generic (PLEG): container finished" podID="a90613f1-d29b-41eb-b925-f28918fbcd2b" containerID="62ac2998ea25ea189bc364f873344c71eb4412ad5937716ef0c235c2915c5424" exitCode=0 Oct 03 14:32:29 crc kubenswrapper[4962]: I1003 14:32:29.195575 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-flpcl" event={"ID":"a90613f1-d29b-41eb-b925-f28918fbcd2b","Type":"ContainerDied","Data":"62ac2998ea25ea189bc364f873344c71eb4412ad5937716ef0c235c2915c5424"} Oct 03 14:32:30 crc kubenswrapper[4962]: I1003 14:32:30.157603 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-sync-lnxfr"] Oct 03 14:32:30 crc kubenswrapper[4962]: I1003 14:32:30.160601 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-lnxfr" Oct 03 14:32:30 crc kubenswrapper[4962]: I1003 14:32:30.163162 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-scripts" Oct 03 14:32:30 crc kubenswrapper[4962]: I1003 14:32:30.168006 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-lnxfr"] Oct 03 14:32:30 crc kubenswrapper[4962]: I1003 14:32:30.342202 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/e54df0d5-846e-417d-bfc0-98804487ed5f-config-data-merged\") pod \"octavia-db-sync-lnxfr\" (UID: \"e54df0d5-846e-417d-bfc0-98804487ed5f\") " pod="openstack/octavia-db-sync-lnxfr" Oct 03 14:32:30 crc kubenswrapper[4962]: I1003 14:32:30.343002 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e54df0d5-846e-417d-bfc0-98804487ed5f-config-data\") pod \"octavia-db-sync-lnxfr\" (UID: \"e54df0d5-846e-417d-bfc0-98804487ed5f\") " pod="openstack/octavia-db-sync-lnxfr" Oct 03 14:32:30 crc kubenswrapper[4962]: I1003 14:32:30.343121 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e54df0d5-846e-417d-bfc0-98804487ed5f-combined-ca-bundle\") pod \"octavia-db-sync-lnxfr\" (UID: \"e54df0d5-846e-417d-bfc0-98804487ed5f\") " pod="openstack/octavia-db-sync-lnxfr" Oct 03 14:32:30 crc kubenswrapper[4962]: I1003 14:32:30.343339 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e54df0d5-846e-417d-bfc0-98804487ed5f-scripts\") pod \"octavia-db-sync-lnxfr\" (UID: \"e54df0d5-846e-417d-bfc0-98804487ed5f\") " pod="openstack/octavia-db-sync-lnxfr" Oct 03 14:32:30 crc kubenswrapper[4962]: I1003 14:32:30.445107 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/e54df0d5-846e-417d-bfc0-98804487ed5f-config-data-merged\") pod \"octavia-db-sync-lnxfr\" (UID: \"e54df0d5-846e-417d-bfc0-98804487ed5f\") " pod="openstack/octavia-db-sync-lnxfr" Oct 03 14:32:30 crc kubenswrapper[4962]: I1003 14:32:30.445209 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e54df0d5-846e-417d-bfc0-98804487ed5f-config-data\") pod \"octavia-db-sync-lnxfr\" (UID: \"e54df0d5-846e-417d-bfc0-98804487ed5f\") " pod="openstack/octavia-db-sync-lnxfr" Oct 03 14:32:30 crc kubenswrapper[4962]: I1003 14:32:30.445257 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e54df0d5-846e-417d-bfc0-98804487ed5f-combined-ca-bundle\") pod \"octavia-db-sync-lnxfr\" (UID: \"e54df0d5-846e-417d-bfc0-98804487ed5f\") " pod="openstack/octavia-db-sync-lnxfr" Oct 03 14:32:30 crc kubenswrapper[4962]: I1003 14:32:30.445352 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e54df0d5-846e-417d-bfc0-98804487ed5f-scripts\") pod \"octavia-db-sync-lnxfr\" (UID: \"e54df0d5-846e-417d-bfc0-98804487ed5f\") " pod="openstack/octavia-db-sync-lnxfr" Oct 03 14:32:30 crc kubenswrapper[4962]: I1003 14:32:30.445771 4962 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/e54df0d5-846e-417d-bfc0-98804487ed5f-config-data-merged\") pod \"octavia-db-sync-lnxfr\" (UID: \"e54df0d5-846e-417d-bfc0-98804487ed5f\") " pod="openstack/octavia-db-sync-lnxfr" Oct 03 14:32:30 crc kubenswrapper[4962]: I1003 14:32:30.456935 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e54df0d5-846e-417d-bfc0-98804487ed5f-combined-ca-bundle\") pod \"octavia-db-sync-lnxfr\" (UID: \"e54df0d5-846e-417d-bfc0-98804487ed5f\") " pod="openstack/octavia-db-sync-lnxfr" Oct 03 14:32:30 crc kubenswrapper[4962]: I1003 14:32:30.457235 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e54df0d5-846e-417d-bfc0-98804487ed5f-config-data\") pod \"octavia-db-sync-lnxfr\" (UID: \"e54df0d5-846e-417d-bfc0-98804487ed5f\") " pod="openstack/octavia-db-sync-lnxfr" Oct 03 14:32:30 crc kubenswrapper[4962]: I1003 14:32:30.457814 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e54df0d5-846e-417d-bfc0-98804487ed5f-scripts\") pod \"octavia-db-sync-lnxfr\" (UID: \"e54df0d5-846e-417d-bfc0-98804487ed5f\") " pod="openstack/octavia-db-sync-lnxfr" Oct 03 14:32:30 crc kubenswrapper[4962]: I1003 14:32:30.492438 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-lnxfr" Oct 03 14:32:37 crc kubenswrapper[4962]: I1003 14:32:37.837820 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-lnxfr"] Oct 03 14:32:38 crc kubenswrapper[4962]: I1003 14:32:38.307901 4962 generic.go:334] "Generic (PLEG): container finished" podID="e54df0d5-846e-417d-bfc0-98804487ed5f" containerID="809ccc053df14594b353e17065f4067b6c0fff8a91f8e30f6137c0e7fc360d97" exitCode=0 Oct 03 14:32:38 crc kubenswrapper[4962]: I1003 14:32:38.308109 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-lnxfr" event={"ID":"e54df0d5-846e-417d-bfc0-98804487ed5f","Type":"ContainerDied","Data":"809ccc053df14594b353e17065f4067b6c0fff8a91f8e30f6137c0e7fc360d97"} Oct 03 14:32:38 crc kubenswrapper[4962]: I1003 14:32:38.308559 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-lnxfr" event={"ID":"e54df0d5-846e-417d-bfc0-98804487ed5f","Type":"ContainerStarted","Data":"79f4364e8d7b5760577aa4efdd52279c5a913262f5e1ac99ae21013f71351ed3"} Oct 03 14:32:38 crc kubenswrapper[4962]: I1003 14:32:38.312364 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-flpcl" event={"ID":"a90613f1-d29b-41eb-b925-f28918fbcd2b","Type":"ContainerStarted","Data":"9e899d48973a54dc921c2544d7113e02ca7da3d3648ab5f7a8bd8825c8d4442f"} Oct 03 14:32:38 crc kubenswrapper[4962]: I1003 14:32:38.312581 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-rsyslog-flpcl" Oct 03 14:32:38 crc kubenswrapper[4962]: I1003 14:32:38.324225 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-pg468" event={"ID":"314fac3f-f260-4b98-bc77-541aa9bc29aa","Type":"ContainerStarted","Data":"046c8c204c02d986151acee02368b74d4bf35235721095a664f2f4e87537a156"} Oct 03 14:32:38 crc kubenswrapper[4962]: I1003 14:32:38.373305 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-rsyslog-flpcl" podStartSLOduration=1.803219333 
podStartE2EDuration="16.373277131s" podCreationTimestamp="2025-10-03 14:32:22 +0000 UTC" firstStartedPulling="2025-10-03 14:32:23.325789646 +0000 UTC m=+6151.729687481" lastFinishedPulling="2025-10-03 14:32:37.895847444 +0000 UTC m=+6166.299745279" observedRunningTime="2025-10-03 14:32:38.369261654 +0000 UTC m=+6166.773159509" watchObservedRunningTime="2025-10-03 14:32:38.373277131 +0000 UTC m=+6166.777174966" Oct 03 14:32:39 crc kubenswrapper[4962]: I1003 14:32:39.335521 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-lnxfr" event={"ID":"e54df0d5-846e-417d-bfc0-98804487ed5f","Type":"ContainerStarted","Data":"b9ca7288d0995161794e1c0fcb55eb394c8071920f1ffc6215cf336ab9f77ad9"} Oct 03 14:32:39 crc kubenswrapper[4962]: I1003 14:32:39.337313 4962 generic.go:334] "Generic (PLEG): container finished" podID="314fac3f-f260-4b98-bc77-541aa9bc29aa" containerID="046c8c204c02d986151acee02368b74d4bf35235721095a664f2f4e87537a156" exitCode=0 Oct 03 14:32:39 crc kubenswrapper[4962]: I1003 14:32:39.337516 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-pg468" event={"ID":"314fac3f-f260-4b98-bc77-541aa9bc29aa","Type":"ContainerDied","Data":"046c8c204c02d986151acee02368b74d4bf35235721095a664f2f4e87537a156"} Oct 03 14:32:39 crc kubenswrapper[4962]: I1003 14:32:39.360697 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-db-sync-lnxfr" podStartSLOduration=9.360671053 podStartE2EDuration="9.360671053s" podCreationTimestamp="2025-10-03 14:32:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:32:39.349134676 +0000 UTC m=+6167.753032511" watchObservedRunningTime="2025-10-03 14:32:39.360671053 +0000 UTC m=+6167.764568918" Oct 03 14:32:43 crc kubenswrapper[4962]: I1003 14:32:43.436314 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-pg468" event={"ID":"314fac3f-f260-4b98-bc77-541aa9bc29aa","Type":"ContainerStarted","Data":"753a10965c68fd2019a1abc43a3f5cc81d75993d640d5ac2bfa4cef3b45ab21e"} Oct 03 14:32:43 crc kubenswrapper[4962]: I1003 14:32:43.465982 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-59f8cff499-pg468" podStartSLOduration=2.159514916 podStartE2EDuration="20.465965904s" podCreationTimestamp="2025-10-03 14:32:23 +0000 UTC" firstStartedPulling="2025-10-03 14:32:24.254133382 +0000 UTC m=+6152.658031217" lastFinishedPulling="2025-10-03 14:32:42.56058435 +0000 UTC m=+6170.964482205" observedRunningTime="2025-10-03 14:32:43.464009592 +0000 UTC m=+6171.867907427" watchObservedRunningTime="2025-10-03 14:32:43.465965904 +0000 UTC m=+6171.869863739" Oct 03 14:32:44 crc kubenswrapper[4962]: I1003 14:32:44.449758 4962 generic.go:334] "Generic (PLEG): container finished" podID="e54df0d5-846e-417d-bfc0-98804487ed5f" containerID="b9ca7288d0995161794e1c0fcb55eb394c8071920f1ffc6215cf336ab9f77ad9" exitCode=0 Oct 03 14:32:44 crc kubenswrapper[4962]: I1003 14:32:44.449827 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-lnxfr" event={"ID":"e54df0d5-846e-417d-bfc0-98804487ed5f","Type":"ContainerDied","Data":"b9ca7288d0995161794e1c0fcb55eb394c8071920f1ffc6215cf336ab9f77ad9"} Oct 03 14:32:45 crc kubenswrapper[4962]: I1003 14:32:45.834328 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-lnxfr" Oct 03 14:32:45 crc kubenswrapper[4962]: I1003 14:32:45.977665 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/e54df0d5-846e-417d-bfc0-98804487ed5f-config-data-merged\") pod \"e54df0d5-846e-417d-bfc0-98804487ed5f\" (UID: \"e54df0d5-846e-417d-bfc0-98804487ed5f\") " Oct 03 14:32:45 crc kubenswrapper[4962]: I1003 14:32:45.977781 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e54df0d5-846e-417d-bfc0-98804487ed5f-config-data\") pod \"e54df0d5-846e-417d-bfc0-98804487ed5f\" (UID: \"e54df0d5-846e-417d-bfc0-98804487ed5f\") " Oct 03 14:32:45 crc kubenswrapper[4962]: I1003 14:32:45.977804 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e54df0d5-846e-417d-bfc0-98804487ed5f-scripts\") pod \"e54df0d5-846e-417d-bfc0-98804487ed5f\" (UID: \"e54df0d5-846e-417d-bfc0-98804487ed5f\") " Oct 03 14:32:45 crc kubenswrapper[4962]: I1003 14:32:45.978005 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e54df0d5-846e-417d-bfc0-98804487ed5f-combined-ca-bundle\") pod \"e54df0d5-846e-417d-bfc0-98804487ed5f\" (UID: \"e54df0d5-846e-417d-bfc0-98804487ed5f\") " Oct 03 14:32:45 crc kubenswrapper[4962]: I1003 14:32:45.996539 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e54df0d5-846e-417d-bfc0-98804487ed5f-config-data" (OuterVolumeSpecName: "config-data") pod "e54df0d5-846e-417d-bfc0-98804487ed5f" (UID: "e54df0d5-846e-417d-bfc0-98804487ed5f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:32:45 crc kubenswrapper[4962]: I1003 14:32:45.996599 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e54df0d5-846e-417d-bfc0-98804487ed5f-scripts" (OuterVolumeSpecName: "scripts") pod "e54df0d5-846e-417d-bfc0-98804487ed5f" (UID: "e54df0d5-846e-417d-bfc0-98804487ed5f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:32:46 crc kubenswrapper[4962]: I1003 14:32:46.010021 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e54df0d5-846e-417d-bfc0-98804487ed5f-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "e54df0d5-846e-417d-bfc0-98804487ed5f" (UID: "e54df0d5-846e-417d-bfc0-98804487ed5f"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:32:46 crc kubenswrapper[4962]: I1003 14:32:46.011444 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e54df0d5-846e-417d-bfc0-98804487ed5f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e54df0d5-846e-417d-bfc0-98804487ed5f" (UID: "e54df0d5-846e-417d-bfc0-98804487ed5f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:32:46 crc kubenswrapper[4962]: I1003 14:32:46.081030 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/e54df0d5-846e-417d-bfc0-98804487ed5f-config-data-merged\") on node \"crc\" DevicePath \"\"" Oct 03 14:32:46 crc kubenswrapper[4962]: I1003 14:32:46.081080 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e54df0d5-846e-417d-bfc0-98804487ed5f-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:32:46 crc kubenswrapper[4962]: I1003 14:32:46.081094 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e54df0d5-846e-417d-bfc0-98804487ed5f-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 14:32:46 crc kubenswrapper[4962]: I1003 14:32:46.081106 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e54df0d5-846e-417d-bfc0-98804487ed5f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:32:46 crc kubenswrapper[4962]: I1003 14:32:46.468790 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-lnxfr" event={"ID":"e54df0d5-846e-417d-bfc0-98804487ed5f","Type":"ContainerDied","Data":"79f4364e8d7b5760577aa4efdd52279c5a913262f5e1ac99ae21013f71351ed3"} Oct 03 14:32:46 crc kubenswrapper[4962]: I1003 14:32:46.469154 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79f4364e8d7b5760577aa4efdd52279c5a913262f5e1ac99ae21013f71351ed3" Oct 03 14:32:46 crc kubenswrapper[4962]: I1003 14:32:46.468858 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-lnxfr" Oct 03 14:32:47 crc kubenswrapper[4962]: I1003 14:32:47.054624 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-zxh9l"] Oct 03 14:32:47 crc kubenswrapper[4962]: I1003 14:32:47.063568 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-zxh9l"] Oct 03 14:32:48 crc kubenswrapper[4962]: I1003 14:32:48.255449 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f87af9f-c72a-4c57-8d18-4e8a38eb9a9f" path="/var/lib/kubelet/pods/2f87af9f-c72a-4c57-8d18-4e8a38eb9a9f/volumes" Oct 03 14:32:52 crc kubenswrapper[4962]: I1003 14:32:52.800585 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-rsyslog-flpcl" Oct 03 14:32:57 crc kubenswrapper[4962]: I1003 14:32:57.027071 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-6456-account-create-6t7dd"] Oct 03 14:32:57 crc kubenswrapper[4962]: I1003 14:32:57.035527 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-6456-account-create-6t7dd"] Oct 03 14:32:58 crc kubenswrapper[4962]: I1003 14:32:58.241865 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b17774ad-1948-4212-bf5a-84c3a1b4b771" path="/var/lib/kubelet/pods/b17774ad-1948-4212-bf5a-84c3a1b4b771/volumes" Oct 03 14:33:03 crc kubenswrapper[4962]: I1003 14:33:03.028729 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-t6pd5"] Oct 03 14:33:03 crc kubenswrapper[4962]: I1003 14:33:03.040354 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-t6pd5"] Oct 03 14:33:04 crc kubenswrapper[4962]: I1003 14:33:04.243856 4962 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d71ba1f-f85a-4de7-8a9c-c86903b92300" path="/var/lib/kubelet/pods/3d71ba1f-f85a-4de7-8a9c-c86903b92300/volumes" Oct 03 14:33:07 crc kubenswrapper[4962]: I1003 14:33:07.804386 4962 scope.go:117] "RemoveContainer" containerID="176e2bc990a8132aa13de7a6f3100c02d4e785cf990da2661d22a4c7d11b34f0" Oct 03 14:33:07 crc kubenswrapper[4962]: I1003 14:33:07.847509 4962 scope.go:117] "RemoveContainer" containerID="82046b4c930316d9215712bf450a721d4933b2325ca212a1135f291351147f12" Oct 03 14:33:07 crc kubenswrapper[4962]: I1003 14:33:07.882781 4962 scope.go:117] "RemoveContainer" containerID="2150916b7e19a8427c2ec04ae45725eb127e65c44b56ebfe67b4e9eef7b9e6f3" Oct 03 14:33:08 crc kubenswrapper[4962]: I1003 14:33:08.419240 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-pg468"] Oct 03 14:33:08 crc kubenswrapper[4962]: I1003 14:33:08.419866 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-image-upload-59f8cff499-pg468" podUID="314fac3f-f260-4b98-bc77-541aa9bc29aa" containerName="octavia-amphora-httpd" containerID="cri-o://753a10965c68fd2019a1abc43a3f5cc81d75993d640d5ac2bfa4cef3b45ab21e" gracePeriod=30 Oct 03 14:33:08 crc kubenswrapper[4962]: I1003 14:33:08.672168 4962 generic.go:334] "Generic (PLEG): container finished" podID="314fac3f-f260-4b98-bc77-541aa9bc29aa" containerID="753a10965c68fd2019a1abc43a3f5cc81d75993d640d5ac2bfa4cef3b45ab21e" exitCode=0 Oct 03 14:33:08 crc kubenswrapper[4962]: I1003 14:33:08.672217 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-pg468" event={"ID":"314fac3f-f260-4b98-bc77-541aa9bc29aa","Type":"ContainerDied","Data":"753a10965c68fd2019a1abc43a3f5cc81d75993d640d5ac2bfa4cef3b45ab21e"} Oct 03 14:33:08 crc kubenswrapper[4962]: I1003 14:33:08.987080 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-pg468" Oct 03 14:33:09 crc kubenswrapper[4962]: I1003 14:33:09.134018 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/314fac3f-f260-4b98-bc77-541aa9bc29aa-httpd-config\") pod \"314fac3f-f260-4b98-bc77-541aa9bc29aa\" (UID: \"314fac3f-f260-4b98-bc77-541aa9bc29aa\") " Oct 03 14:33:09 crc kubenswrapper[4962]: I1003 14:33:09.134323 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/314fac3f-f260-4b98-bc77-541aa9bc29aa-amphora-image\") pod \"314fac3f-f260-4b98-bc77-541aa9bc29aa\" (UID: \"314fac3f-f260-4b98-bc77-541aa9bc29aa\") " Oct 03 14:33:09 crc kubenswrapper[4962]: I1003 14:33:09.170681 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/314fac3f-f260-4b98-bc77-541aa9bc29aa-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "314fac3f-f260-4b98-bc77-541aa9bc29aa" (UID: "314fac3f-f260-4b98-bc77-541aa9bc29aa"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:33:09 crc kubenswrapper[4962]: I1003 14:33:09.231010 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/314fac3f-f260-4b98-bc77-541aa9bc29aa-amphora-image" (OuterVolumeSpecName: "amphora-image") pod "314fac3f-f260-4b98-bc77-541aa9bc29aa" (UID: "314fac3f-f260-4b98-bc77-541aa9bc29aa"). 
InnerVolumeSpecName "amphora-image". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:33:09 crc kubenswrapper[4962]: I1003 14:33:09.242707 4962 reconciler_common.go:293] "Volume detached for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/314fac3f-f260-4b98-bc77-541aa9bc29aa-amphora-image\") on node \"crc\" DevicePath \"\"" Oct 03 14:33:09 crc kubenswrapper[4962]: I1003 14:33:09.243021 4962 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/314fac3f-f260-4b98-bc77-541aa9bc29aa-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:33:09 crc kubenswrapper[4962]: I1003 14:33:09.686874 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-pg468" event={"ID":"314fac3f-f260-4b98-bc77-541aa9bc29aa","Type":"ContainerDied","Data":"e1d2a19df1f73dc8204ac2c03859165f105665f5347645dfcec2546a25dda8c9"} Oct 03 14:33:09 crc kubenswrapper[4962]: I1003 14:33:09.686954 4962 scope.go:117] "RemoveContainer" containerID="753a10965c68fd2019a1abc43a3f5cc81d75993d640d5ac2bfa4cef3b45ab21e" Oct 03 14:33:09 crc kubenswrapper[4962]: I1003 14:33:09.686979 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-pg468" Oct 03 14:33:09 crc kubenswrapper[4962]: I1003 14:33:09.717910 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-pg468"] Oct 03 14:33:09 crc kubenswrapper[4962]: I1003 14:33:09.719145 4962 scope.go:117] "RemoveContainer" containerID="046c8c204c02d986151acee02368b74d4bf35235721095a664f2f4e87537a156" Oct 03 14:33:09 crc kubenswrapper[4962]: I1003 14:33:09.734085 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-pg468"] Oct 03 14:33:10 crc kubenswrapper[4962]: I1003 14:33:10.252801 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="314fac3f-f260-4b98-bc77-541aa9bc29aa" path="/var/lib/kubelet/pods/314fac3f-f260-4b98-bc77-541aa9bc29aa/volumes" Oct 03 14:33:28 crc kubenswrapper[4962]: I1003 14:33:28.826494 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-healthmanager-t4b68"] Oct 03 14:33:28 crc kubenswrapper[4962]: E1003 14:33:28.833537 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e54df0d5-846e-417d-bfc0-98804487ed5f" containerName="octavia-db-sync" Oct 03 14:33:28 crc kubenswrapper[4962]: I1003 14:33:28.833837 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="e54df0d5-846e-417d-bfc0-98804487ed5f" containerName="octavia-db-sync" Oct 03 14:33:28 crc kubenswrapper[4962]: E1003 14:33:28.833944 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="314fac3f-f260-4b98-bc77-541aa9bc29aa" containerName="octavia-amphora-httpd" Oct 03 14:33:28 crc kubenswrapper[4962]: I1003 14:33:28.834058 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="314fac3f-f260-4b98-bc77-541aa9bc29aa" containerName="octavia-amphora-httpd" Oct 03 14:33:28 crc kubenswrapper[4962]: E1003 14:33:28.834167 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e54df0d5-846e-417d-bfc0-98804487ed5f" containerName="init" Oct 03 14:33:28 crc kubenswrapper[4962]: I1003 14:33:28.834253 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="e54df0d5-846e-417d-bfc0-98804487ed5f" containerName="init" Oct 03 14:33:28 crc kubenswrapper[4962]: E1003 14:33:28.834387 4962 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="314fac3f-f260-4b98-bc77-541aa9bc29aa" containerName="init" Oct 03 14:33:28 crc kubenswrapper[4962]: I1003 14:33:28.834460 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="314fac3f-f260-4b98-bc77-541aa9bc29aa" containerName="init" Oct 03 14:33:28 crc kubenswrapper[4962]: I1003 14:33:28.835430 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="314fac3f-f260-4b98-bc77-541aa9bc29aa" containerName="octavia-amphora-httpd" Oct 03 14:33:28 crc kubenswrapper[4962]: I1003 14:33:28.835555 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="e54df0d5-846e-417d-bfc0-98804487ed5f" containerName="octavia-db-sync" Oct 03 14:33:28 crc kubenswrapper[4962]: I1003 14:33:28.839792 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-t4b68" Oct 03 14:33:28 crc kubenswrapper[4962]: I1003 14:33:28.848292 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-t4b68"] Oct 03 14:33:28 crc kubenswrapper[4962]: I1003 14:33:28.853232 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-config-data" Oct 03 14:33:28 crc kubenswrapper[4962]: I1003 14:33:28.853571 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-certs-secret" Oct 03 14:33:28 crc kubenswrapper[4962]: I1003 14:33:28.853914 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-scripts" Oct 03 14:33:28 crc kubenswrapper[4962]: I1003 14:33:28.954609 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/e1d69655-2430-4062-a2d8-da19a77ebd4a-config-data-merged\") pod \"octavia-healthmanager-t4b68\" (UID: \"e1d69655-2430-4062-a2d8-da19a77ebd4a\") " pod="openstack/octavia-healthmanager-t4b68" Oct 03 14:33:28 crc kubenswrapper[4962]: I1003 14:33:28.954728 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/e1d69655-2430-4062-a2d8-da19a77ebd4a-hm-ports\") pod \"octavia-healthmanager-t4b68\" (UID: \"e1d69655-2430-4062-a2d8-da19a77ebd4a\") " pod="openstack/octavia-healthmanager-t4b68" Oct 03 14:33:28 crc kubenswrapper[4962]: I1003 14:33:28.954751 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/e1d69655-2430-4062-a2d8-da19a77ebd4a-amphora-certs\") pod \"octavia-healthmanager-t4b68\" (UID: \"e1d69655-2430-4062-a2d8-da19a77ebd4a\") " pod="openstack/octavia-healthmanager-t4b68" Oct 03 14:33:28 crc kubenswrapper[4962]: I1003 14:33:28.954837 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1d69655-2430-4062-a2d8-da19a77ebd4a-combined-ca-bundle\") pod \"octavia-healthmanager-t4b68\" (UID: \"e1d69655-2430-4062-a2d8-da19a77ebd4a\") " pod="openstack/octavia-healthmanager-t4b68" Oct 03 14:33:28 crc kubenswrapper[4962]: I1003 14:33:28.954861 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1d69655-2430-4062-a2d8-da19a77ebd4a-scripts\") pod \"octavia-healthmanager-t4b68\" (UID: \"e1d69655-2430-4062-a2d8-da19a77ebd4a\") " pod="openstack/octavia-healthmanager-t4b68" Oct 03 
14:33:28 crc kubenswrapper[4962]: I1003 14:33:28.954880 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1d69655-2430-4062-a2d8-da19a77ebd4a-config-data\") pod \"octavia-healthmanager-t4b68\" (UID: \"e1d69655-2430-4062-a2d8-da19a77ebd4a\") " pod="openstack/octavia-healthmanager-t4b68" Oct 03 14:33:29 crc kubenswrapper[4962]: I1003 14:33:29.056088 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/e1d69655-2430-4062-a2d8-da19a77ebd4a-hm-ports\") pod \"octavia-healthmanager-t4b68\" (UID: \"e1d69655-2430-4062-a2d8-da19a77ebd4a\") " pod="openstack/octavia-healthmanager-t4b68" Oct 03 14:33:29 crc kubenswrapper[4962]: I1003 14:33:29.056130 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/e1d69655-2430-4062-a2d8-da19a77ebd4a-amphora-certs\") pod \"octavia-healthmanager-t4b68\" (UID: \"e1d69655-2430-4062-a2d8-da19a77ebd4a\") " pod="openstack/octavia-healthmanager-t4b68" Oct 03 14:33:29 crc kubenswrapper[4962]: I1003 14:33:29.056249 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1d69655-2430-4062-a2d8-da19a77ebd4a-combined-ca-bundle\") pod \"octavia-healthmanager-t4b68\" (UID: \"e1d69655-2430-4062-a2d8-da19a77ebd4a\") " pod="openstack/octavia-healthmanager-t4b68" Oct 03 14:33:29 crc kubenswrapper[4962]: I1003 14:33:29.056285 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1d69655-2430-4062-a2d8-da19a77ebd4a-scripts\") pod \"octavia-healthmanager-t4b68\" (UID: \"e1d69655-2430-4062-a2d8-da19a77ebd4a\") " pod="openstack/octavia-healthmanager-t4b68" Oct 03 14:33:29 crc kubenswrapper[4962]: I1003 14:33:29.056311 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1d69655-2430-4062-a2d8-da19a77ebd4a-config-data\") pod \"octavia-healthmanager-t4b68\" (UID: \"e1d69655-2430-4062-a2d8-da19a77ebd4a\") " pod="openstack/octavia-healthmanager-t4b68" Oct 03 14:33:29 crc kubenswrapper[4962]: I1003 14:33:29.056358 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/e1d69655-2430-4062-a2d8-da19a77ebd4a-config-data-merged\") pod \"octavia-healthmanager-t4b68\" (UID: \"e1d69655-2430-4062-a2d8-da19a77ebd4a\") " pod="openstack/octavia-healthmanager-t4b68" Oct 03 14:33:29 crc kubenswrapper[4962]: I1003 14:33:29.057189 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/e1d69655-2430-4062-a2d8-da19a77ebd4a-hm-ports\") pod \"octavia-healthmanager-t4b68\" (UID: \"e1d69655-2430-4062-a2d8-da19a77ebd4a\") " pod="openstack/octavia-healthmanager-t4b68" Oct 03 14:33:29 crc kubenswrapper[4962]: I1003 14:33:29.057344 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/e1d69655-2430-4062-a2d8-da19a77ebd4a-config-data-merged\") pod \"octavia-healthmanager-t4b68\" (UID: \"e1d69655-2430-4062-a2d8-da19a77ebd4a\") " pod="openstack/octavia-healthmanager-t4b68" Oct 03 14:33:29 crc kubenswrapper[4962]: I1003 14:33:29.061929 4962 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1d69655-2430-4062-a2d8-da19a77ebd4a-combined-ca-bundle\") pod \"octavia-healthmanager-t4b68\" (UID: \"e1d69655-2430-4062-a2d8-da19a77ebd4a\") " pod="openstack/octavia-healthmanager-t4b68" Oct 03 14:33:29 crc kubenswrapper[4962]: I1003 14:33:29.062237 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1d69655-2430-4062-a2d8-da19a77ebd4a-config-data\") pod \"octavia-healthmanager-t4b68\" (UID: \"e1d69655-2430-4062-a2d8-da19a77ebd4a\") " pod="openstack/octavia-healthmanager-t4b68" Oct 03 14:33:29 crc kubenswrapper[4962]: I1003 14:33:29.062930 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1d69655-2430-4062-a2d8-da19a77ebd4a-scripts\") pod \"octavia-healthmanager-t4b68\" (UID: \"e1d69655-2430-4062-a2d8-da19a77ebd4a\") " pod="openstack/octavia-healthmanager-t4b68" Oct 03 14:33:29 crc kubenswrapper[4962]: I1003 14:33:29.077671 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/e1d69655-2430-4062-a2d8-da19a77ebd4a-amphora-certs\") pod \"octavia-healthmanager-t4b68\" (UID: \"e1d69655-2430-4062-a2d8-da19a77ebd4a\") " pod="openstack/octavia-healthmanager-t4b68" Oct 03 14:33:29 crc kubenswrapper[4962]: I1003 14:33:29.177031 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-t4b68" Oct 03 14:33:29 crc kubenswrapper[4962]: I1003 14:33:29.755628 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-t4b68"] Oct 03 14:33:29 crc kubenswrapper[4962]: I1003 14:33:29.879715 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-t4b68" event={"ID":"e1d69655-2430-4062-a2d8-da19a77ebd4a","Type":"ContainerStarted","Data":"73e4696bce0989bf6f9915629936c6b4d111236d5b5ee193d3dc17fc5c9118b3"} Oct 03 14:33:30 crc kubenswrapper[4962]: I1003 14:33:30.889687 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-t4b68" event={"ID":"e1d69655-2430-4062-a2d8-da19a77ebd4a","Type":"ContainerStarted","Data":"79ee3559a2f752dfb07bef85e848548c8857f05062c14dda4992743cd387dff7"} Oct 03 14:33:31 crc kubenswrapper[4962]: I1003 14:33:31.288246 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-housekeeping-kkdhs"] Oct 03 14:33:31 crc kubenswrapper[4962]: I1003 14:33:31.291342 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-housekeeping-kkdhs" Oct 03 14:33:31 crc kubenswrapper[4962]: I1003 14:33:31.293874 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-scripts" Oct 03 14:33:31 crc kubenswrapper[4962]: I1003 14:33:31.294086 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-config-data" Oct 03 14:33:31 crc kubenswrapper[4962]: I1003 14:33:31.299306 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-kkdhs"] Oct 03 14:33:31 crc kubenswrapper[4962]: I1003 14:33:31.400846 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/cc005e9c-8257-4def-8acb-ef953aa375c4-config-data-merged\") pod \"octavia-housekeeping-kkdhs\" (UID: \"cc005e9c-8257-4def-8acb-ef953aa375c4\") " pod="openstack/octavia-housekeeping-kkdhs" Oct 03 14:33:31 crc kubenswrapper[4962]: I1003 14:33:31.400954 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc005e9c-8257-4def-8acb-ef953aa375c4-scripts\") pod \"octavia-housekeeping-kkdhs\" (UID: \"cc005e9c-8257-4def-8acb-ef953aa375c4\") " pod="openstack/octavia-housekeeping-kkdhs" Oct 03 14:33:31 crc kubenswrapper[4962]: I1003 14:33:31.401022 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc005e9c-8257-4def-8acb-ef953aa375c4-combined-ca-bundle\") pod \"octavia-housekeeping-kkdhs\" (UID: \"cc005e9c-8257-4def-8acb-ef953aa375c4\") " pod="openstack/octavia-housekeeping-kkdhs" Oct 03 14:33:31 crc kubenswrapper[4962]: I1003 14:33:31.401110 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc005e9c-8257-4def-8acb-ef953aa375c4-config-data\") pod \"octavia-housekeeping-kkdhs\" (UID: \"cc005e9c-8257-4def-8acb-ef953aa375c4\") " pod="openstack/octavia-housekeeping-kkdhs" Oct 03 14:33:31 crc kubenswrapper[4962]: I1003 14:33:31.401192 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/cc005e9c-8257-4def-8acb-ef953aa375c4-hm-ports\") pod \"octavia-housekeeping-kkdhs\" (UID: \"cc005e9c-8257-4def-8acb-ef953aa375c4\") " pod="openstack/octavia-housekeeping-kkdhs" Oct 03 14:33:31 crc kubenswrapper[4962]: I1003 14:33:31.401597 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/cc005e9c-8257-4def-8acb-ef953aa375c4-amphora-certs\") pod \"octavia-housekeeping-kkdhs\" (UID: \"cc005e9c-8257-4def-8acb-ef953aa375c4\") " pod="openstack/octavia-housekeeping-kkdhs" Oct 03 14:33:31 crc kubenswrapper[4962]: I1003 14:33:31.503439 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/cc005e9c-8257-4def-8acb-ef953aa375c4-config-data-merged\") pod \"octavia-housekeeping-kkdhs\" (UID: \"cc005e9c-8257-4def-8acb-ef953aa375c4\") " pod="openstack/octavia-housekeeping-kkdhs" Oct 03 14:33:31 crc kubenswrapper[4962]: I1003 14:33:31.503492 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/cc005e9c-8257-4def-8acb-ef953aa375c4-scripts\") pod \"octavia-housekeeping-kkdhs\" (UID: \"cc005e9c-8257-4def-8acb-ef953aa375c4\") " pod="openstack/octavia-housekeeping-kkdhs" Oct 03 14:33:31 crc kubenswrapper[4962]: I1003 14:33:31.503524 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc005e9c-8257-4def-8acb-ef953aa375c4-combined-ca-bundle\") pod \"octavia-housekeeping-kkdhs\" (UID: \"cc005e9c-8257-4def-8acb-ef953aa375c4\") " pod="openstack/octavia-housekeeping-kkdhs" Oct 03 14:33:31 crc kubenswrapper[4962]: I1003 14:33:31.503556 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc005e9c-8257-4def-8acb-ef953aa375c4-config-data\") pod \"octavia-housekeeping-kkdhs\" (UID: \"cc005e9c-8257-4def-8acb-ef953aa375c4\") " pod="openstack/octavia-housekeeping-kkdhs" Oct 03 14:33:31 crc kubenswrapper[4962]: I1003 14:33:31.503612 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/cc005e9c-8257-4def-8acb-ef953aa375c4-hm-ports\") pod \"octavia-housekeeping-kkdhs\" (UID: \"cc005e9c-8257-4def-8acb-ef953aa375c4\") " pod="openstack/octavia-housekeeping-kkdhs" Oct 03 14:33:31 crc kubenswrapper[4962]: I1003 14:33:31.503657 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/cc005e9c-8257-4def-8acb-ef953aa375c4-amphora-certs\") pod \"octavia-housekeeping-kkdhs\" (UID: \"cc005e9c-8257-4def-8acb-ef953aa375c4\") " pod="openstack/octavia-housekeeping-kkdhs" Oct 03 14:33:31 crc kubenswrapper[4962]: I1003 14:33:31.504112 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/cc005e9c-8257-4def-8acb-ef953aa375c4-config-data-merged\") pod \"octavia-housekeeping-kkdhs\" (UID: \"cc005e9c-8257-4def-8acb-ef953aa375c4\") " pod="openstack/octavia-housekeeping-kkdhs" Oct 03 14:33:31 crc kubenswrapper[4962]: I1003 14:33:31.505630 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/cc005e9c-8257-4def-8acb-ef953aa375c4-hm-ports\") pod \"octavia-housekeeping-kkdhs\" (UID: \"cc005e9c-8257-4def-8acb-ef953aa375c4\") " pod="openstack/octavia-housekeeping-kkdhs" Oct 03 14:33:31 crc kubenswrapper[4962]: I1003 14:33:31.510558 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc005e9c-8257-4def-8acb-ef953aa375c4-combined-ca-bundle\") pod \"octavia-housekeeping-kkdhs\" (UID: \"cc005e9c-8257-4def-8acb-ef953aa375c4\") " pod="openstack/octavia-housekeeping-kkdhs" Oct 03 14:33:31 crc kubenswrapper[4962]: I1003 14:33:31.510783 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc005e9c-8257-4def-8acb-ef953aa375c4-scripts\") pod \"octavia-housekeeping-kkdhs\" (UID: \"cc005e9c-8257-4def-8acb-ef953aa375c4\") " pod="openstack/octavia-housekeeping-kkdhs" Oct 03 14:33:31 crc kubenswrapper[4962]: I1003 14:33:31.510998 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/cc005e9c-8257-4def-8acb-ef953aa375c4-amphora-certs\") pod \"octavia-housekeeping-kkdhs\" (UID: \"cc005e9c-8257-4def-8acb-ef953aa375c4\") " 
pod="openstack/octavia-housekeeping-kkdhs" Oct 03 14:33:31 crc kubenswrapper[4962]: I1003 14:33:31.511061 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc005e9c-8257-4def-8acb-ef953aa375c4-config-data\") pod \"octavia-housekeeping-kkdhs\" (UID: \"cc005e9c-8257-4def-8acb-ef953aa375c4\") " pod="openstack/octavia-housekeeping-kkdhs" Oct 03 14:33:31 crc kubenswrapper[4962]: I1003 14:33:31.619312 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-housekeeping-kkdhs" Oct 03 14:33:31 crc kubenswrapper[4962]: I1003 14:33:31.902376 4962 generic.go:334] "Generic (PLEG): container finished" podID="e1d69655-2430-4062-a2d8-da19a77ebd4a" containerID="79ee3559a2f752dfb07bef85e848548c8857f05062c14dda4992743cd387dff7" exitCode=0 Oct 03 14:33:31 crc kubenswrapper[4962]: I1003 14:33:31.902502 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-t4b68" event={"ID":"e1d69655-2430-4062-a2d8-da19a77ebd4a","Type":"ContainerDied","Data":"79ee3559a2f752dfb07bef85e848548c8857f05062c14dda4992743cd387dff7"} Oct 03 14:33:32 crc kubenswrapper[4962]: I1003 14:33:32.187880 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-kkdhs"] Oct 03 14:33:32 crc kubenswrapper[4962]: I1003 14:33:32.915754 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-t4b68" event={"ID":"e1d69655-2430-4062-a2d8-da19a77ebd4a","Type":"ContainerStarted","Data":"7f76dba23820823c74ef6588a76312f738afeafbdb39ec69042b912a7834072c"} Oct 03 14:33:32 crc kubenswrapper[4962]: I1003 14:33:32.916258 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-healthmanager-t4b68" Oct 03 14:33:32 crc kubenswrapper[4962]: I1003 14:33:32.917825 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-kkdhs" event={"ID":"cc005e9c-8257-4def-8acb-ef953aa375c4","Type":"ContainerStarted","Data":"f7931144b417b3c96fcc4ad416b24fcf0d03db9994154768532b3d479d0fe049"} Oct 03 14:33:32 crc kubenswrapper[4962]: I1003 14:33:32.937627 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-healthmanager-t4b68" podStartSLOduration=4.937593605 podStartE2EDuration="4.937593605s" podCreationTimestamp="2025-10-03 14:33:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:33:32.931166814 +0000 UTC m=+6221.335064659" watchObservedRunningTime="2025-10-03 14:33:32.937593605 +0000 UTC m=+6221.341491440" Oct 03 14:33:33 crc kubenswrapper[4962]: I1003 14:33:33.038735 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-rbcsx"] Oct 03 14:33:33 crc kubenswrapper[4962]: I1003 14:33:33.048144 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-rbcsx"] Oct 03 14:33:34 crc kubenswrapper[4962]: I1003 14:33:34.092213 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-worker-dk769"] Oct 03 14:33:34 crc kubenswrapper[4962]: I1003 14:33:34.094411 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-worker-dk769" Oct 03 14:33:34 crc kubenswrapper[4962]: I1003 14:33:34.096764 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-scripts" Oct 03 14:33:34 crc kubenswrapper[4962]: I1003 14:33:34.096948 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-config-data" Oct 03 14:33:34 crc kubenswrapper[4962]: I1003 14:33:34.102902 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-dk769"] Oct 03 14:33:34 crc kubenswrapper[4962]: I1003 14:33:34.244031 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="089bba13-506d-4b03-8b69-46e49102539a" path="/var/lib/kubelet/pods/089bba13-506d-4b03-8b69-46e49102539a/volumes" Oct 03 14:33:34 crc kubenswrapper[4962]: I1003 14:33:34.260795 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dec89ffb-653b-42d8-a160-fc87029742f7-scripts\") pod \"octavia-worker-dk769\" (UID: \"dec89ffb-653b-42d8-a160-fc87029742f7\") " pod="openstack/octavia-worker-dk769" Oct 03 14:33:34 crc kubenswrapper[4962]: I1003 14:33:34.260846 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/dec89ffb-653b-42d8-a160-fc87029742f7-hm-ports\") pod \"octavia-worker-dk769\" (UID: \"dec89ffb-653b-42d8-a160-fc87029742f7\") " pod="openstack/octavia-worker-dk769" Oct 03 14:33:34 crc kubenswrapper[4962]: I1003 14:33:34.260900 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dec89ffb-653b-42d8-a160-fc87029742f7-combined-ca-bundle\") pod \"octavia-worker-dk769\" (UID: \"dec89ffb-653b-42d8-a160-fc87029742f7\") " pod="openstack/octavia-worker-dk769" Oct 03 14:33:34 crc kubenswrapper[4962]: I1003 14:33:34.260929 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dec89ffb-653b-42d8-a160-fc87029742f7-config-data\") pod \"octavia-worker-dk769\" (UID: \"dec89ffb-653b-42d8-a160-fc87029742f7\") " pod="openstack/octavia-worker-dk769" Oct 03 14:33:34 crc kubenswrapper[4962]: I1003 14:33:34.261385 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/dec89ffb-653b-42d8-a160-fc87029742f7-amphora-certs\") pod \"octavia-worker-dk769\" (UID: \"dec89ffb-653b-42d8-a160-fc87029742f7\") " pod="openstack/octavia-worker-dk769" Oct 03 14:33:34 crc kubenswrapper[4962]: I1003 14:33:34.261495 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/dec89ffb-653b-42d8-a160-fc87029742f7-config-data-merged\") pod \"octavia-worker-dk769\" (UID: \"dec89ffb-653b-42d8-a160-fc87029742f7\") " pod="openstack/octavia-worker-dk769" Oct 03 14:33:34 crc kubenswrapper[4962]: I1003 14:33:34.362395 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/dec89ffb-653b-42d8-a160-fc87029742f7-hm-ports\") pod \"octavia-worker-dk769\" (UID: \"dec89ffb-653b-42d8-a160-fc87029742f7\") " pod="openstack/octavia-worker-dk769" Oct 03 14:33:34 crc kubenswrapper[4962]: I1003 14:33:34.362462 
4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dec89ffb-653b-42d8-a160-fc87029742f7-combined-ca-bundle\") pod \"octavia-worker-dk769\" (UID: \"dec89ffb-653b-42d8-a160-fc87029742f7\") " pod="openstack/octavia-worker-dk769" Oct 03 14:33:34 crc kubenswrapper[4962]: I1003 14:33:34.362495 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dec89ffb-653b-42d8-a160-fc87029742f7-config-data\") pod \"octavia-worker-dk769\" (UID: \"dec89ffb-653b-42d8-a160-fc87029742f7\") " pod="openstack/octavia-worker-dk769" Oct 03 14:33:34 crc kubenswrapper[4962]: I1003 14:33:34.362576 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/dec89ffb-653b-42d8-a160-fc87029742f7-amphora-certs\") pod \"octavia-worker-dk769\" (UID: \"dec89ffb-653b-42d8-a160-fc87029742f7\") " pod="openstack/octavia-worker-dk769" Oct 03 14:33:34 crc kubenswrapper[4962]: I1003 14:33:34.362607 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/dec89ffb-653b-42d8-a160-fc87029742f7-config-data-merged\") pod \"octavia-worker-dk769\" (UID: \"dec89ffb-653b-42d8-a160-fc87029742f7\") " pod="openstack/octavia-worker-dk769" Oct 03 14:33:34 crc kubenswrapper[4962]: I1003 14:33:34.362626 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dec89ffb-653b-42d8-a160-fc87029742f7-scripts\") pod \"octavia-worker-dk769\" (UID: \"dec89ffb-653b-42d8-a160-fc87029742f7\") " pod="openstack/octavia-worker-dk769" Oct 03 14:33:34 crc kubenswrapper[4962]: I1003 14:33:34.363741 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/dec89ffb-653b-42d8-a160-fc87029742f7-config-data-merged\") pod \"octavia-worker-dk769\" (UID: \"dec89ffb-653b-42d8-a160-fc87029742f7\") " pod="openstack/octavia-worker-dk769" Oct 03 14:33:34 crc kubenswrapper[4962]: I1003 14:33:34.364811 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/dec89ffb-653b-42d8-a160-fc87029742f7-hm-ports\") pod \"octavia-worker-dk769\" (UID: \"dec89ffb-653b-42d8-a160-fc87029742f7\") " pod="openstack/octavia-worker-dk769" Oct 03 14:33:34 crc kubenswrapper[4962]: I1003 14:33:34.369213 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dec89ffb-653b-42d8-a160-fc87029742f7-scripts\") pod \"octavia-worker-dk769\" (UID: \"dec89ffb-653b-42d8-a160-fc87029742f7\") " pod="openstack/octavia-worker-dk769" Oct 03 14:33:34 crc kubenswrapper[4962]: I1003 14:33:34.369234 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dec89ffb-653b-42d8-a160-fc87029742f7-combined-ca-bundle\") pod \"octavia-worker-dk769\" (UID: \"dec89ffb-653b-42d8-a160-fc87029742f7\") " pod="openstack/octavia-worker-dk769" Oct 03 14:33:34 crc kubenswrapper[4962]: I1003 14:33:34.369865 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/dec89ffb-653b-42d8-a160-fc87029742f7-amphora-certs\") pod \"octavia-worker-dk769\" (UID: \"dec89ffb-653b-42d8-a160-fc87029742f7\") " 
pod="openstack/octavia-worker-dk769" Oct 03 14:33:34 crc kubenswrapper[4962]: I1003 14:33:34.374300 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dec89ffb-653b-42d8-a160-fc87029742f7-config-data\") pod \"octavia-worker-dk769\" (UID: \"dec89ffb-653b-42d8-a160-fc87029742f7\") " pod="openstack/octavia-worker-dk769" Oct 03 14:33:34 crc kubenswrapper[4962]: I1003 14:33:34.421075 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-worker-dk769" Oct 03 14:33:35 crc kubenswrapper[4962]: I1003 14:33:35.246170 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-dk769"] Oct 03 14:33:35 crc kubenswrapper[4962]: W1003 14:33:35.258261 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddec89ffb_653b_42d8_a160_fc87029742f7.slice/crio-d8f68765c771016deccfe91898dd36e0e392978a54fc1457325d8fb8d0738904 WatchSource:0}: Error finding container d8f68765c771016deccfe91898dd36e0e392978a54fc1457325d8fb8d0738904: Status 404 returned error can't find the container with id d8f68765c771016deccfe91898dd36e0e392978a54fc1457325d8fb8d0738904 Oct 03 14:33:35 crc kubenswrapper[4962]: I1003 14:33:35.944052 4962 generic.go:334] "Generic (PLEG): container finished" podID="cc005e9c-8257-4def-8acb-ef953aa375c4" containerID="65ce697e5edb3b33add6f4d70a7c0d9f7f9c396ebaeb273085a8369c7e18c86a" exitCode=0 Oct 03 14:33:35 crc kubenswrapper[4962]: I1003 14:33:35.944295 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-kkdhs" event={"ID":"cc005e9c-8257-4def-8acb-ef953aa375c4","Type":"ContainerDied","Data":"65ce697e5edb3b33add6f4d70a7c0d9f7f9c396ebaeb273085a8369c7e18c86a"} Oct 03 14:33:35 crc kubenswrapper[4962]: I1003 14:33:35.946039 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-dk769" event={"ID":"dec89ffb-653b-42d8-a160-fc87029742f7","Type":"ContainerStarted","Data":"d8f68765c771016deccfe91898dd36e0e392978a54fc1457325d8fb8d0738904"} Oct 03 14:33:36 crc kubenswrapper[4962]: I1003 14:33:36.957670 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-kkdhs" event={"ID":"cc005e9c-8257-4def-8acb-ef953aa375c4","Type":"ContainerStarted","Data":"7654290d3dd84d934f9fffef94417addab59efa0956d55a0e3f4795bb2b59562"} Oct 03 14:33:36 crc kubenswrapper[4962]: I1003 14:33:36.958235 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-housekeeping-kkdhs" Oct 03 14:33:36 crc kubenswrapper[4962]: I1003 14:33:36.981086 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-housekeeping-kkdhs" podStartSLOduration=3.422746917 podStartE2EDuration="5.981058957s" podCreationTimestamp="2025-10-03 14:33:31 +0000 UTC" firstStartedPulling="2025-10-03 14:33:32.198721924 +0000 UTC m=+6220.602619759" lastFinishedPulling="2025-10-03 14:33:34.757033954 +0000 UTC m=+6223.160931799" observedRunningTime="2025-10-03 14:33:36.980174764 +0000 UTC m=+6225.384072599" watchObservedRunningTime="2025-10-03 14:33:36.981058957 +0000 UTC m=+6225.384956792" Oct 03 14:33:38 crc kubenswrapper[4962]: I1003 14:33:38.981195 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-dk769" 
event={"ID":"dec89ffb-653b-42d8-a160-fc87029742f7","Type":"ContainerStarted","Data":"c220d59f22a011d6b365b249cf60844a4861ab170061233bb395aa7539c4bfb0"} Oct 03 14:33:39 crc kubenswrapper[4962]: I1003 14:33:39.618075 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-42pv7"] Oct 03 14:33:39 crc kubenswrapper[4962]: I1003 14:33:39.621267 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-42pv7" Oct 03 14:33:39 crc kubenswrapper[4962]: I1003 14:33:39.625214 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-42pv7"] Oct 03 14:33:39 crc kubenswrapper[4962]: I1003 14:33:39.720913 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96bd605f-e98f-4805-85a0-a928a7794ddd-catalog-content\") pod \"certified-operators-42pv7\" (UID: \"96bd605f-e98f-4805-85a0-a928a7794ddd\") " pod="openshift-marketplace/certified-operators-42pv7" Oct 03 14:33:39 crc kubenswrapper[4962]: I1003 14:33:39.720973 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgcjw\" (UniqueName: \"kubernetes.io/projected/96bd605f-e98f-4805-85a0-a928a7794ddd-kube-api-access-dgcjw\") pod \"certified-operators-42pv7\" (UID: \"96bd605f-e98f-4805-85a0-a928a7794ddd\") " pod="openshift-marketplace/certified-operators-42pv7" Oct 03 14:33:39 crc kubenswrapper[4962]: I1003 14:33:39.721282 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96bd605f-e98f-4805-85a0-a928a7794ddd-utilities\") pod \"certified-operators-42pv7\" (UID: \"96bd605f-e98f-4805-85a0-a928a7794ddd\") " pod="openshift-marketplace/certified-operators-42pv7" Oct 03 14:33:39 crc kubenswrapper[4962]: I1003 14:33:39.823891 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96bd605f-e98f-4805-85a0-a928a7794ddd-catalog-content\") pod \"certified-operators-42pv7\" (UID: \"96bd605f-e98f-4805-85a0-a928a7794ddd\") " pod="openshift-marketplace/certified-operators-42pv7" Oct 03 14:33:39 crc kubenswrapper[4962]: I1003 14:33:39.823945 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgcjw\" (UniqueName: \"kubernetes.io/projected/96bd605f-e98f-4805-85a0-a928a7794ddd-kube-api-access-dgcjw\") pod \"certified-operators-42pv7\" (UID: \"96bd605f-e98f-4805-85a0-a928a7794ddd\") " pod="openshift-marketplace/certified-operators-42pv7" Oct 03 14:33:39 crc kubenswrapper[4962]: I1003 14:33:39.824052 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96bd605f-e98f-4805-85a0-a928a7794ddd-utilities\") pod \"certified-operators-42pv7\" (UID: \"96bd605f-e98f-4805-85a0-a928a7794ddd\") " pod="openshift-marketplace/certified-operators-42pv7" Oct 03 14:33:39 crc kubenswrapper[4962]: I1003 14:33:39.824717 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96bd605f-e98f-4805-85a0-a928a7794ddd-utilities\") pod \"certified-operators-42pv7\" (UID: \"96bd605f-e98f-4805-85a0-a928a7794ddd\") " pod="openshift-marketplace/certified-operators-42pv7" Oct 03 14:33:39 crc kubenswrapper[4962]: I1003 
14:33:39.824943 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96bd605f-e98f-4805-85a0-a928a7794ddd-catalog-content\") pod \"certified-operators-42pv7\" (UID: \"96bd605f-e98f-4805-85a0-a928a7794ddd\") " pod="openshift-marketplace/certified-operators-42pv7" Oct 03 14:33:39 crc kubenswrapper[4962]: I1003 14:33:39.847907 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgcjw\" (UniqueName: \"kubernetes.io/projected/96bd605f-e98f-4805-85a0-a928a7794ddd-kube-api-access-dgcjw\") pod \"certified-operators-42pv7\" (UID: \"96bd605f-e98f-4805-85a0-a928a7794ddd\") " pod="openshift-marketplace/certified-operators-42pv7" Oct 03 14:33:39 crc kubenswrapper[4962]: I1003 14:33:39.986485 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-42pv7" Oct 03 14:33:40 crc kubenswrapper[4962]: I1003 14:33:40.006943 4962 generic.go:334] "Generic (PLEG): container finished" podID="dec89ffb-653b-42d8-a160-fc87029742f7" containerID="c220d59f22a011d6b365b249cf60844a4861ab170061233bb395aa7539c4bfb0" exitCode=0 Oct 03 14:33:40 crc kubenswrapper[4962]: I1003 14:33:40.006981 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-dk769" event={"ID":"dec89ffb-653b-42d8-a160-fc87029742f7","Type":"ContainerDied","Data":"c220d59f22a011d6b365b249cf60844a4861ab170061233bb395aa7539c4bfb0"} Oct 03 14:33:40 crc kubenswrapper[4962]: I1003 14:33:40.523554 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-42pv7"] Oct 03 14:33:41 crc kubenswrapper[4962]: I1003 14:33:41.021007 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-dk769" event={"ID":"dec89ffb-653b-42d8-a160-fc87029742f7","Type":"ContainerStarted","Data":"7f7bdffa48d47c403def53b950e410b7141d3bf6cc047954a3ed36c21e4eeafe"} Oct 03 14:33:41 crc kubenswrapper[4962]: I1003 14:33:41.032023 4962 generic.go:334] "Generic (PLEG): container finished" podID="96bd605f-e98f-4805-85a0-a928a7794ddd" containerID="56068d622379d97dc32c4cde27d9c824b383380e6abe7629f8ce5fb164ca5029" exitCode=0 Oct 03 14:33:41 crc kubenswrapper[4962]: I1003 14:33:41.032275 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-42pv7" event={"ID":"96bd605f-e98f-4805-85a0-a928a7794ddd","Type":"ContainerDied","Data":"56068d622379d97dc32c4cde27d9c824b383380e6abe7629f8ce5fb164ca5029"} Oct 03 14:33:41 crc kubenswrapper[4962]: I1003 14:33:41.032801 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-42pv7" event={"ID":"96bd605f-e98f-4805-85a0-a928a7794ddd","Type":"ContainerStarted","Data":"e18138576eb4eb079bd1069e35c6d6adb5a2b8f03eb2e192a3ff04d2d4e4bc28"} Oct 03 14:33:41 crc kubenswrapper[4962]: I1003 14:33:41.047166 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-worker-dk769" podStartSLOduration=4.415864075 podStartE2EDuration="7.047148662s" podCreationTimestamp="2025-10-03 14:33:34 +0000 UTC" firstStartedPulling="2025-10-03 14:33:35.262129259 +0000 UTC m=+6223.666027104" lastFinishedPulling="2025-10-03 14:33:37.893413856 +0000 UTC m=+6226.297311691" observedRunningTime="2025-10-03 14:33:41.042484728 +0000 UTC m=+6229.446382563" watchObservedRunningTime="2025-10-03 14:33:41.047148662 +0000 UTC m=+6229.451046497" Oct 03 14:33:42 crc kubenswrapper[4962]: I1003 
14:33:42.049014 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-42pv7" event={"ID":"96bd605f-e98f-4805-85a0-a928a7794ddd","Type":"ContainerStarted","Data":"ee5eaa1acf039b36569a753d2365e0bfc73f57c236747251f15623decf15579b"} Oct 03 14:33:42 crc kubenswrapper[4962]: I1003 14:33:42.049394 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-worker-dk769" Oct 03 14:33:43 crc kubenswrapper[4962]: I1003 14:33:43.041387 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-1dbf-account-create-tdd5n"] Oct 03 14:33:43 crc kubenswrapper[4962]: I1003 14:33:43.050099 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-1dbf-account-create-tdd5n"] Oct 03 14:33:43 crc kubenswrapper[4962]: I1003 14:33:43.060496 4962 generic.go:334] "Generic (PLEG): container finished" podID="96bd605f-e98f-4805-85a0-a928a7794ddd" containerID="ee5eaa1acf039b36569a753d2365e0bfc73f57c236747251f15623decf15579b" exitCode=0 Oct 03 14:33:43 crc kubenswrapper[4962]: I1003 14:33:43.062134 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-42pv7" event={"ID":"96bd605f-e98f-4805-85a0-a928a7794ddd","Type":"ContainerDied","Data":"ee5eaa1acf039b36569a753d2365e0bfc73f57c236747251f15623decf15579b"} Oct 03 14:33:44 crc kubenswrapper[4962]: I1003 14:33:44.078483 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-42pv7" event={"ID":"96bd605f-e98f-4805-85a0-a928a7794ddd","Type":"ContainerStarted","Data":"546cf25e20aeab6128922afa85b53f442617d4e74b5b442f1314d4219465a28d"} Oct 03 14:33:44 crc kubenswrapper[4962]: I1003 14:33:44.105500 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-42pv7" podStartSLOduration=2.586439169 podStartE2EDuration="5.105469181s" podCreationTimestamp="2025-10-03 14:33:39 +0000 UTC" firstStartedPulling="2025-10-03 14:33:41.03393421 +0000 UTC m=+6229.437832045" lastFinishedPulling="2025-10-03 14:33:43.552964212 +0000 UTC m=+6231.956862057" observedRunningTime="2025-10-03 14:33:44.097699824 +0000 UTC m=+6232.501597679" watchObservedRunningTime="2025-10-03 14:33:44.105469181 +0000 UTC m=+6232.509367036" Oct 03 14:33:44 crc kubenswrapper[4962]: I1003 14:33:44.215513 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-healthmanager-t4b68" Oct 03 14:33:44 crc kubenswrapper[4962]: I1003 14:33:44.256780 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fce07167-a5cd-4c88-b07a-6daedc6888ff" path="/var/lib/kubelet/pods/fce07167-a5cd-4c88-b07a-6daedc6888ff/volumes" Oct 03 14:33:46 crc kubenswrapper[4962]: I1003 14:33:46.652514 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-housekeeping-kkdhs" Oct 03 14:33:49 crc kubenswrapper[4962]: I1003 14:33:49.454698 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-worker-dk769" Oct 03 14:33:49 crc kubenswrapper[4962]: I1003 14:33:49.987497 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-42pv7" Oct 03 14:33:49 crc kubenswrapper[4962]: I1003 14:33:49.987819 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-42pv7" Oct 03 14:33:50 crc kubenswrapper[4962]: I1003 14:33:50.038286 4962 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-42pv7" Oct 03 14:33:50 crc kubenswrapper[4962]: I1003 14:33:50.197206 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-42pv7" Oct 03 14:33:50 crc kubenswrapper[4962]: I1003 14:33:50.277086 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-42pv7"] Oct 03 14:33:52 crc kubenswrapper[4962]: I1003 14:33:52.024907 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-qcmsk"] Oct 03 14:33:52 crc kubenswrapper[4962]: I1003 14:33:52.037692 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-qcmsk"] Oct 03 14:33:52 crc kubenswrapper[4962]: I1003 14:33:52.166378 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-42pv7" podUID="96bd605f-e98f-4805-85a0-a928a7794ddd" containerName="registry-server" containerID="cri-o://546cf25e20aeab6128922afa85b53f442617d4e74b5b442f1314d4219465a28d" gracePeriod=2 Oct 03 14:33:52 crc kubenswrapper[4962]: I1003 14:33:52.242834 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44a69f85-585f-430d-bd92-311a41410a8b" path="/var/lib/kubelet/pods/44a69f85-585f-430d-bd92-311a41410a8b/volumes" Oct 03 14:33:52 crc kubenswrapper[4962]: I1003 14:33:52.686077 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-42pv7" Oct 03 14:33:52 crc kubenswrapper[4962]: I1003 14:33:52.819281 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgcjw\" (UniqueName: \"kubernetes.io/projected/96bd605f-e98f-4805-85a0-a928a7794ddd-kube-api-access-dgcjw\") pod \"96bd605f-e98f-4805-85a0-a928a7794ddd\" (UID: \"96bd605f-e98f-4805-85a0-a928a7794ddd\") " Oct 03 14:33:52 crc kubenswrapper[4962]: I1003 14:33:52.819536 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96bd605f-e98f-4805-85a0-a928a7794ddd-catalog-content\") pod \"96bd605f-e98f-4805-85a0-a928a7794ddd\" (UID: \"96bd605f-e98f-4805-85a0-a928a7794ddd\") " Oct 03 14:33:52 crc kubenswrapper[4962]: I1003 14:33:52.819561 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96bd605f-e98f-4805-85a0-a928a7794ddd-utilities\") pod \"96bd605f-e98f-4805-85a0-a928a7794ddd\" (UID: \"96bd605f-e98f-4805-85a0-a928a7794ddd\") " Oct 03 14:33:52 crc kubenswrapper[4962]: I1003 14:33:52.820333 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96bd605f-e98f-4805-85a0-a928a7794ddd-utilities" (OuterVolumeSpecName: "utilities") pod "96bd605f-e98f-4805-85a0-a928a7794ddd" (UID: "96bd605f-e98f-4805-85a0-a928a7794ddd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:33:52 crc kubenswrapper[4962]: I1003 14:33:52.827881 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96bd605f-e98f-4805-85a0-a928a7794ddd-kube-api-access-dgcjw" (OuterVolumeSpecName: "kube-api-access-dgcjw") pod "96bd605f-e98f-4805-85a0-a928a7794ddd" (UID: "96bd605f-e98f-4805-85a0-a928a7794ddd"). InnerVolumeSpecName "kube-api-access-dgcjw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:33:52 crc kubenswrapper[4962]: I1003 14:33:52.879871 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96bd605f-e98f-4805-85a0-a928a7794ddd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "96bd605f-e98f-4805-85a0-a928a7794ddd" (UID: "96bd605f-e98f-4805-85a0-a928a7794ddd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:33:52 crc kubenswrapper[4962]: I1003 14:33:52.921616 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96bd605f-e98f-4805-85a0-a928a7794ddd-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 14:33:52 crc kubenswrapper[4962]: I1003 14:33:52.921666 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96bd605f-e98f-4805-85a0-a928a7794ddd-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 14:33:52 crc kubenswrapper[4962]: I1003 14:33:52.921676 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgcjw\" (UniqueName: \"kubernetes.io/projected/96bd605f-e98f-4805-85a0-a928a7794ddd-kube-api-access-dgcjw\") on node \"crc\" DevicePath \"\"" Oct 03 14:33:53 crc kubenswrapper[4962]: I1003 14:33:53.176139 4962 generic.go:334] "Generic (PLEG): container finished" podID="96bd605f-e98f-4805-85a0-a928a7794ddd" containerID="546cf25e20aeab6128922afa85b53f442617d4e74b5b442f1314d4219465a28d" exitCode=0 Oct 03 14:33:53 crc kubenswrapper[4962]: I1003 14:33:53.176181 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-42pv7" event={"ID":"96bd605f-e98f-4805-85a0-a928a7794ddd","Type":"ContainerDied","Data":"546cf25e20aeab6128922afa85b53f442617d4e74b5b442f1314d4219465a28d"} Oct 03 14:33:53 crc kubenswrapper[4962]: I1003 14:33:53.176208 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-42pv7" event={"ID":"96bd605f-e98f-4805-85a0-a928a7794ddd","Type":"ContainerDied","Data":"e18138576eb4eb079bd1069e35c6d6adb5a2b8f03eb2e192a3ff04d2d4e4bc28"} Oct 03 14:33:53 crc kubenswrapper[4962]: I1003 14:33:53.176248 4962 scope.go:117] "RemoveContainer" containerID="546cf25e20aeab6128922afa85b53f442617d4e74b5b442f1314d4219465a28d" Oct 03 14:33:53 crc kubenswrapper[4962]: I1003 14:33:53.176402 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-42pv7" Oct 03 14:33:53 crc kubenswrapper[4962]: I1003 14:33:53.222174 4962 scope.go:117] "RemoveContainer" containerID="ee5eaa1acf039b36569a753d2365e0bfc73f57c236747251f15623decf15579b" Oct 03 14:33:53 crc kubenswrapper[4962]: I1003 14:33:53.250149 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-42pv7"] Oct 03 14:33:53 crc kubenswrapper[4962]: I1003 14:33:53.256555 4962 scope.go:117] "RemoveContainer" containerID="56068d622379d97dc32c4cde27d9c824b383380e6abe7629f8ce5fb164ca5029" Oct 03 14:33:53 crc kubenswrapper[4962]: I1003 14:33:53.260962 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-42pv7"] Oct 03 14:33:53 crc kubenswrapper[4962]: I1003 14:33:53.297346 4962 scope.go:117] "RemoveContainer" containerID="546cf25e20aeab6128922afa85b53f442617d4e74b5b442f1314d4219465a28d" Oct 03 14:33:53 crc kubenswrapper[4962]: E1003 14:33:53.298370 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"546cf25e20aeab6128922afa85b53f442617d4e74b5b442f1314d4219465a28d\": container with ID starting with 546cf25e20aeab6128922afa85b53f442617d4e74b5b442f1314d4219465a28d not found: ID does not exist" containerID="546cf25e20aeab6128922afa85b53f442617d4e74b5b442f1314d4219465a28d" Oct 03 14:33:53 crc kubenswrapper[4962]: I1003 14:33:53.298447 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"546cf25e20aeab6128922afa85b53f442617d4e74b5b442f1314d4219465a28d"} err="failed to get container status \"546cf25e20aeab6128922afa85b53f442617d4e74b5b442f1314d4219465a28d\": rpc error: code = NotFound desc = could not find container \"546cf25e20aeab6128922afa85b53f442617d4e74b5b442f1314d4219465a28d\": container with ID starting with 546cf25e20aeab6128922afa85b53f442617d4e74b5b442f1314d4219465a28d not found: ID does not exist" Oct 03 14:33:53 crc kubenswrapper[4962]: I1003 14:33:53.298507 4962 scope.go:117] "RemoveContainer" containerID="ee5eaa1acf039b36569a753d2365e0bfc73f57c236747251f15623decf15579b" Oct 03 14:33:53 crc kubenswrapper[4962]: E1003 14:33:53.301699 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee5eaa1acf039b36569a753d2365e0bfc73f57c236747251f15623decf15579b\": container with ID starting with ee5eaa1acf039b36569a753d2365e0bfc73f57c236747251f15623decf15579b not found: ID does not exist" containerID="ee5eaa1acf039b36569a753d2365e0bfc73f57c236747251f15623decf15579b" Oct 03 14:33:53 crc kubenswrapper[4962]: I1003 14:33:53.301762 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee5eaa1acf039b36569a753d2365e0bfc73f57c236747251f15623decf15579b"} err="failed to get container status \"ee5eaa1acf039b36569a753d2365e0bfc73f57c236747251f15623decf15579b\": rpc error: code = NotFound desc = could not find container \"ee5eaa1acf039b36569a753d2365e0bfc73f57c236747251f15623decf15579b\": container with ID starting with ee5eaa1acf039b36569a753d2365e0bfc73f57c236747251f15623decf15579b not found: ID does not exist" Oct 03 14:33:53 crc kubenswrapper[4962]: I1003 14:33:53.301806 4962 scope.go:117] "RemoveContainer" containerID="56068d622379d97dc32c4cde27d9c824b383380e6abe7629f8ce5fb164ca5029" Oct 03 14:33:53 crc kubenswrapper[4962]: E1003 14:33:53.302187 4962 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"56068d622379d97dc32c4cde27d9c824b383380e6abe7629f8ce5fb164ca5029\": container with ID starting with 56068d622379d97dc32c4cde27d9c824b383380e6abe7629f8ce5fb164ca5029 not found: ID does not exist" containerID="56068d622379d97dc32c4cde27d9c824b383380e6abe7629f8ce5fb164ca5029" Oct 03 14:33:53 crc kubenswrapper[4962]: I1003 14:33:53.302221 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56068d622379d97dc32c4cde27d9c824b383380e6abe7629f8ce5fb164ca5029"} err="failed to get container status \"56068d622379d97dc32c4cde27d9c824b383380e6abe7629f8ce5fb164ca5029\": rpc error: code = NotFound desc = could not find container \"56068d622379d97dc32c4cde27d9c824b383380e6abe7629f8ce5fb164ca5029\": container with ID starting with 56068d622379d97dc32c4cde27d9c824b383380e6abe7629f8ce5fb164ca5029 not found: ID does not exist" Oct 03 14:33:54 crc kubenswrapper[4962]: I1003 14:33:54.243540 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96bd605f-e98f-4805-85a0-a928a7794ddd" path="/var/lib/kubelet/pods/96bd605f-e98f-4805-85a0-a928a7794ddd/volumes" Oct 03 14:33:54 crc kubenswrapper[4962]: I1003 14:33:54.659898 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:33:54 crc kubenswrapper[4962]: I1003 14:33:54.659963 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:34:08 crc kubenswrapper[4962]: I1003 14:34:08.036389 4962 scope.go:117] "RemoveContainer" containerID="41f5a8c1e42028f6ccf01c15c51f81c2abaad0ace3913fef4c86fb929c802224" Oct 03 14:34:08 crc kubenswrapper[4962]: I1003 14:34:08.059791 4962 scope.go:117] "RemoveContainer" containerID="d2f4e3c9ce8ee87de1cc861cb2364de1ca8e2f046ac7918dd5152574b4f1de2b" Oct 03 14:34:08 crc kubenswrapper[4962]: I1003 14:34:08.090607 4962 scope.go:117] "RemoveContainer" containerID="b2a120ce016b4e1ff1c72b31cb479e33eecde2f14859b2b5317c090a4e6be976" Oct 03 14:34:08 crc kubenswrapper[4962]: I1003 14:34:08.125921 4962 scope.go:117] "RemoveContainer" containerID="97670cf30456e344ea9e34ccb387ae3ec34ac566de098185d45c066d9076cf5b" Oct 03 14:34:08 crc kubenswrapper[4962]: I1003 14:34:08.174505 4962 scope.go:117] "RemoveContainer" containerID="373228e79da2587786749685118a5a45e03f6a2255d81587683b66d99706a139" Oct 03 14:34:08 crc kubenswrapper[4962]: I1003 14:34:08.256036 4962 scope.go:117] "RemoveContainer" containerID="45f2d7888f216cb137768205351d3f09eecc33645aebd354d812696b2cb8502d" Oct 03 14:34:08 crc kubenswrapper[4962]: I1003 14:34:08.284162 4962 scope.go:117] "RemoveContainer" containerID="c776e40f624375deb15523775e764ffd7088a4fff687e80a560aa19c5722fbd4" Oct 03 14:34:24 crc kubenswrapper[4962]: I1003 14:34:24.659864 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 
14:34:24 crc kubenswrapper[4962]: I1003 14:34:24.660738 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:34:34 crc kubenswrapper[4962]: I1003 14:34:34.044286 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-xr8j4"] Oct 03 14:34:34 crc kubenswrapper[4962]: I1003 14:34:34.053035 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-xr8j4"] Oct 03 14:34:34 crc kubenswrapper[4962]: I1003 14:34:34.244460 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d35a75ca-500f-46c6-b933-02e980a88fe5" path="/var/lib/kubelet/pods/d35a75ca-500f-46c6-b933-02e980a88fe5/volumes" Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.222849 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-9cb97f545-qzjwt"] Oct 03 14:34:43 crc kubenswrapper[4962]: E1003 14:34:43.225873 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96bd605f-e98f-4805-85a0-a928a7794ddd" containerName="extract-utilities" Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.225900 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="96bd605f-e98f-4805-85a0-a928a7794ddd" containerName="extract-utilities" Oct 03 14:34:43 crc kubenswrapper[4962]: E1003 14:34:43.225936 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96bd605f-e98f-4805-85a0-a928a7794ddd" containerName="registry-server" Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.225947 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="96bd605f-e98f-4805-85a0-a928a7794ddd" containerName="registry-server" Oct 03 14:34:43 crc kubenswrapper[4962]: E1003 14:34:43.225973 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96bd605f-e98f-4805-85a0-a928a7794ddd" containerName="extract-content" Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.225987 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="96bd605f-e98f-4805-85a0-a928a7794ddd" containerName="extract-content" Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.226441 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="96bd605f-e98f-4805-85a0-a928a7794ddd" containerName="registry-server" Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.282316 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-9cb97f545-qzjwt" Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.284936 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-97qkq" Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.287176 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.289073 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.293355 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-9cb97f545-qzjwt"] Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.294018 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.314024 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.314255 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2c967c3d-1f67-42ff-9849-dcd585648bb8" containerName="glance-log" containerID="cri-o://9b7d13c58dc06972aa0112710d144a8e1c55f80e3094ec53d1dc34ed57b57252" gracePeriod=30 Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.314489 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2c967c3d-1f67-42ff-9849-dcd585648bb8" containerName="glance-httpd" containerID="cri-o://c9f09c18f69924294689c67ad8cd1e601b8e4deb075871b299c705dc3c27e121" gracePeriod=30 Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.360803 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6f57db8649-rvfnl"] Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.363776 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6f57db8649-rvfnl" Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.368961 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m8pt\" (UniqueName: \"kubernetes.io/projected/48962e75-a3a8-46ef-8731-b3c692bb245e-kube-api-access-2m8pt\") pod \"horizon-9cb97f545-qzjwt\" (UID: \"48962e75-a3a8-46ef-8731-b3c692bb245e\") " pod="openstack/horizon-9cb97f545-qzjwt" Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.369079 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/48962e75-a3a8-46ef-8731-b3c692bb245e-config-data\") pod \"horizon-9cb97f545-qzjwt\" (UID: \"48962e75-a3a8-46ef-8731-b3c692bb245e\") " pod="openstack/horizon-9cb97f545-qzjwt" Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.369122 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48962e75-a3a8-46ef-8731-b3c692bb245e-logs\") pod \"horizon-9cb97f545-qzjwt\" (UID: \"48962e75-a3a8-46ef-8731-b3c692bb245e\") " pod="openstack/horizon-9cb97f545-qzjwt" Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.369212 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48962e75-a3a8-46ef-8731-b3c692bb245e-scripts\") pod \"horizon-9cb97f545-qzjwt\" (UID: \"48962e75-a3a8-46ef-8731-b3c692bb245e\") " pod="openstack/horizon-9cb97f545-qzjwt" Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.369294 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/48962e75-a3a8-46ef-8731-b3c692bb245e-horizon-secret-key\") pod \"horizon-9cb97f545-qzjwt\" (UID: \"48962e75-a3a8-46ef-8731-b3c692bb245e\") " pod="openstack/horizon-9cb97f545-qzjwt" Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.398860 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6f57db8649-rvfnl"] Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.454281 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.454870 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ab453a1b-a691-409f-8826-ef5286ea8efc" containerName="glance-log" containerID="cri-o://01694b5bb759b57df173cde34bcd5520ef6b98929175f03f6ea30b50526f8ddb" gracePeriod=30 Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.455500 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ab453a1b-a691-409f-8826-ef5286ea8efc" containerName="glance-httpd" containerID="cri-o://32a933e95b7bb998f66c1c9e494eb984df233d3076b47477db1578faec0062d0" gracePeriod=30 Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.471943 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/89694890-d266-4cbb-be14-cda9a74927e4-scripts\") pod \"horizon-6f57db8649-rvfnl\" (UID: \"89694890-d266-4cbb-be14-cda9a74927e4\") " pod="openstack/horizon-6f57db8649-rvfnl" Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.472007 4962 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/48962e75-a3a8-46ef-8731-b3c692bb245e-horizon-secret-key\") pod \"horizon-9cb97f545-qzjwt\" (UID: \"48962e75-a3a8-46ef-8731-b3c692bb245e\") " pod="openstack/horizon-9cb97f545-qzjwt" Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.472073 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m8pt\" (UniqueName: \"kubernetes.io/projected/48962e75-a3a8-46ef-8731-b3c692bb245e-kube-api-access-2m8pt\") pod \"horizon-9cb97f545-qzjwt\" (UID: \"48962e75-a3a8-46ef-8731-b3c692bb245e\") " pod="openstack/horizon-9cb97f545-qzjwt" Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.472126 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/89694890-d266-4cbb-be14-cda9a74927e4-horizon-secret-key\") pod \"horizon-6f57db8649-rvfnl\" (UID: \"89694890-d266-4cbb-be14-cda9a74927e4\") " pod="openstack/horizon-6f57db8649-rvfnl" Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.472157 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/48962e75-a3a8-46ef-8731-b3c692bb245e-config-data\") pod \"horizon-9cb97f545-qzjwt\" (UID: \"48962e75-a3a8-46ef-8731-b3c692bb245e\") " pod="openstack/horizon-9cb97f545-qzjwt" Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.472178 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwnck\" (UniqueName: \"kubernetes.io/projected/89694890-d266-4cbb-be14-cda9a74927e4-kube-api-access-rwnck\") pod \"horizon-6f57db8649-rvfnl\" (UID: \"89694890-d266-4cbb-be14-cda9a74927e4\") " pod="openstack/horizon-6f57db8649-rvfnl" Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.472197 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48962e75-a3a8-46ef-8731-b3c692bb245e-logs\") pod \"horizon-9cb97f545-qzjwt\" (UID: \"48962e75-a3a8-46ef-8731-b3c692bb245e\") " pod="openstack/horizon-9cb97f545-qzjwt" Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.472249 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89694890-d266-4cbb-be14-cda9a74927e4-logs\") pod \"horizon-6f57db8649-rvfnl\" (UID: \"89694890-d266-4cbb-be14-cda9a74927e4\") " pod="openstack/horizon-6f57db8649-rvfnl" Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.472275 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/89694890-d266-4cbb-be14-cda9a74927e4-config-data\") pod \"horizon-6f57db8649-rvfnl\" (UID: \"89694890-d266-4cbb-be14-cda9a74927e4\") " pod="openstack/horizon-6f57db8649-rvfnl" Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.472300 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48962e75-a3a8-46ef-8731-b3c692bb245e-scripts\") pod \"horizon-9cb97f545-qzjwt\" (UID: \"48962e75-a3a8-46ef-8731-b3c692bb245e\") " pod="openstack/horizon-9cb97f545-qzjwt" Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.474120 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/48962e75-a3a8-46ef-8731-b3c692bb245e-scripts\") pod \"horizon-9cb97f545-qzjwt\" (UID: \"48962e75-a3a8-46ef-8731-b3c692bb245e\") " pod="openstack/horizon-9cb97f545-qzjwt" Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.474153 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48962e75-a3a8-46ef-8731-b3c692bb245e-logs\") pod \"horizon-9cb97f545-qzjwt\" (UID: \"48962e75-a3a8-46ef-8731-b3c692bb245e\") " pod="openstack/horizon-9cb97f545-qzjwt" Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.474625 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/48962e75-a3a8-46ef-8731-b3c692bb245e-config-data\") pod \"horizon-9cb97f545-qzjwt\" (UID: \"48962e75-a3a8-46ef-8731-b3c692bb245e\") " pod="openstack/horizon-9cb97f545-qzjwt" Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.485391 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/48962e75-a3a8-46ef-8731-b3c692bb245e-horizon-secret-key\") pod \"horizon-9cb97f545-qzjwt\" (UID: \"48962e75-a3a8-46ef-8731-b3c692bb245e\") " pod="openstack/horizon-9cb97f545-qzjwt" Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.495909 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m8pt\" (UniqueName: \"kubernetes.io/projected/48962e75-a3a8-46ef-8731-b3c692bb245e-kube-api-access-2m8pt\") pod \"horizon-9cb97f545-qzjwt\" (UID: \"48962e75-a3a8-46ef-8731-b3c692bb245e\") " pod="openstack/horizon-9cb97f545-qzjwt" Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.574472 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/89694890-d266-4cbb-be14-cda9a74927e4-horizon-secret-key\") pod \"horizon-6f57db8649-rvfnl\" (UID: \"89694890-d266-4cbb-be14-cda9a74927e4\") " pod="openstack/horizon-6f57db8649-rvfnl" Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.574563 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwnck\" (UniqueName: \"kubernetes.io/projected/89694890-d266-4cbb-be14-cda9a74927e4-kube-api-access-rwnck\") pod \"horizon-6f57db8649-rvfnl\" (UID: \"89694890-d266-4cbb-be14-cda9a74927e4\") " pod="openstack/horizon-6f57db8649-rvfnl" Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.574674 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89694890-d266-4cbb-be14-cda9a74927e4-logs\") pod \"horizon-6f57db8649-rvfnl\" (UID: \"89694890-d266-4cbb-be14-cda9a74927e4\") " pod="openstack/horizon-6f57db8649-rvfnl" Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.574713 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/89694890-d266-4cbb-be14-cda9a74927e4-config-data\") pod \"horizon-6f57db8649-rvfnl\" (UID: \"89694890-d266-4cbb-be14-cda9a74927e4\") " pod="openstack/horizon-6f57db8649-rvfnl" Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.574755 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/89694890-d266-4cbb-be14-cda9a74927e4-scripts\") pod \"horizon-6f57db8649-rvfnl\" (UID: \"89694890-d266-4cbb-be14-cda9a74927e4\") " 
pod="openstack/horizon-6f57db8649-rvfnl" Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.575463 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89694890-d266-4cbb-be14-cda9a74927e4-logs\") pod \"horizon-6f57db8649-rvfnl\" (UID: \"89694890-d266-4cbb-be14-cda9a74927e4\") " pod="openstack/horizon-6f57db8649-rvfnl" Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.576527 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/89694890-d266-4cbb-be14-cda9a74927e4-scripts\") pod \"horizon-6f57db8649-rvfnl\" (UID: \"89694890-d266-4cbb-be14-cda9a74927e4\") " pod="openstack/horizon-6f57db8649-rvfnl" Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.576971 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/89694890-d266-4cbb-be14-cda9a74927e4-config-data\") pod \"horizon-6f57db8649-rvfnl\" (UID: \"89694890-d266-4cbb-be14-cda9a74927e4\") " pod="openstack/horizon-6f57db8649-rvfnl" Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.586156 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/89694890-d266-4cbb-be14-cda9a74927e4-horizon-secret-key\") pod \"horizon-6f57db8649-rvfnl\" (UID: \"89694890-d266-4cbb-be14-cda9a74927e4\") " pod="openstack/horizon-6f57db8649-rvfnl" Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.591101 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwnck\" (UniqueName: \"kubernetes.io/projected/89694890-d266-4cbb-be14-cda9a74927e4-kube-api-access-rwnck\") pod \"horizon-6f57db8649-rvfnl\" (UID: \"89694890-d266-4cbb-be14-cda9a74927e4\") " pod="openstack/horizon-6f57db8649-rvfnl" Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.617654 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-9cb97f545-qzjwt" Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.662437 4962 generic.go:334] "Generic (PLEG): container finished" podID="ab453a1b-a691-409f-8826-ef5286ea8efc" containerID="01694b5bb759b57df173cde34bcd5520ef6b98929175f03f6ea30b50526f8ddb" exitCode=143 Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.662513 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ab453a1b-a691-409f-8826-ef5286ea8efc","Type":"ContainerDied","Data":"01694b5bb759b57df173cde34bcd5520ef6b98929175f03f6ea30b50526f8ddb"} Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.664293 4962 generic.go:334] "Generic (PLEG): container finished" podID="2c967c3d-1f67-42ff-9849-dcd585648bb8" containerID="9b7d13c58dc06972aa0112710d144a8e1c55f80e3094ec53d1dc34ed57b57252" exitCode=143 Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.664335 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2c967c3d-1f67-42ff-9849-dcd585648bb8","Type":"ContainerDied","Data":"9b7d13c58dc06972aa0112710d144a8e1c55f80e3094ec53d1dc34ed57b57252"} Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.688018 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6f57db8649-rvfnl" Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.803739 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6f57db8649-rvfnl"] Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.850965 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5dcfc8787c-gf2hf"] Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.853910 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5dcfc8787c-gf2hf" Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.889119 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5dcfc8787c-gf2hf"] Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.981113 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78dcd167-b38a-4da9-a9c6-63e48eed5832-scripts\") pod \"horizon-5dcfc8787c-gf2hf\" (UID: \"78dcd167-b38a-4da9-a9c6-63e48eed5832\") " pod="openstack/horizon-5dcfc8787c-gf2hf" Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.981196 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78dcd167-b38a-4da9-a9c6-63e48eed5832-logs\") pod \"horizon-5dcfc8787c-gf2hf\" (UID: \"78dcd167-b38a-4da9-a9c6-63e48eed5832\") " pod="openstack/horizon-5dcfc8787c-gf2hf" Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.981234 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/78dcd167-b38a-4da9-a9c6-63e48eed5832-config-data\") pod \"horizon-5dcfc8787c-gf2hf\" (UID: \"78dcd167-b38a-4da9-a9c6-63e48eed5832\") " pod="openstack/horizon-5dcfc8787c-gf2hf" Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.981392 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/78dcd167-b38a-4da9-a9c6-63e48eed5832-horizon-secret-key\") pod \"horizon-5dcfc8787c-gf2hf\" (UID: \"78dcd167-b38a-4da9-a9c6-63e48eed5832\") " pod="openstack/horizon-5dcfc8787c-gf2hf" Oct 03 14:34:43 crc kubenswrapper[4962]: I1003 14:34:43.981427 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxtm6\" (UniqueName: \"kubernetes.io/projected/78dcd167-b38a-4da9-a9c6-63e48eed5832-kube-api-access-xxtm6\") pod \"horizon-5dcfc8787c-gf2hf\" (UID: \"78dcd167-b38a-4da9-a9c6-63e48eed5832\") " pod="openstack/horizon-5dcfc8787c-gf2hf" Oct 03 14:34:44 crc kubenswrapper[4962]: I1003 14:34:44.046922 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-4674-account-create-g99l4"] Oct 03 14:34:44 crc kubenswrapper[4962]: I1003 14:34:44.056859 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-4674-account-create-g99l4"] Oct 03 14:34:44 crc kubenswrapper[4962]: I1003 14:34:44.083257 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/78dcd167-b38a-4da9-a9c6-63e48eed5832-horizon-secret-key\") pod \"horizon-5dcfc8787c-gf2hf\" (UID: \"78dcd167-b38a-4da9-a9c6-63e48eed5832\") " pod="openstack/horizon-5dcfc8787c-gf2hf" Oct 03 14:34:44 crc kubenswrapper[4962]: I1003 14:34:44.083296 4962 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-xxtm6\" (UniqueName: \"kubernetes.io/projected/78dcd167-b38a-4da9-a9c6-63e48eed5832-kube-api-access-xxtm6\") pod \"horizon-5dcfc8787c-gf2hf\" (UID: \"78dcd167-b38a-4da9-a9c6-63e48eed5832\") " pod="openstack/horizon-5dcfc8787c-gf2hf" Oct 03 14:34:44 crc kubenswrapper[4962]: I1003 14:34:44.083341 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78dcd167-b38a-4da9-a9c6-63e48eed5832-scripts\") pod \"horizon-5dcfc8787c-gf2hf\" (UID: \"78dcd167-b38a-4da9-a9c6-63e48eed5832\") " pod="openstack/horizon-5dcfc8787c-gf2hf" Oct 03 14:34:44 crc kubenswrapper[4962]: I1003 14:34:44.083377 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78dcd167-b38a-4da9-a9c6-63e48eed5832-logs\") pod \"horizon-5dcfc8787c-gf2hf\" (UID: \"78dcd167-b38a-4da9-a9c6-63e48eed5832\") " pod="openstack/horizon-5dcfc8787c-gf2hf" Oct 03 14:34:44 crc kubenswrapper[4962]: I1003 14:34:44.083397 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/78dcd167-b38a-4da9-a9c6-63e48eed5832-config-data\") pod \"horizon-5dcfc8787c-gf2hf\" (UID: \"78dcd167-b38a-4da9-a9c6-63e48eed5832\") " pod="openstack/horizon-5dcfc8787c-gf2hf" Oct 03 14:34:44 crc kubenswrapper[4962]: I1003 14:34:44.084017 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78dcd167-b38a-4da9-a9c6-63e48eed5832-logs\") pod \"horizon-5dcfc8787c-gf2hf\" (UID: \"78dcd167-b38a-4da9-a9c6-63e48eed5832\") " pod="openstack/horizon-5dcfc8787c-gf2hf" Oct 03 14:34:44 crc kubenswrapper[4962]: I1003 14:34:44.084357 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78dcd167-b38a-4da9-a9c6-63e48eed5832-scripts\") pod \"horizon-5dcfc8787c-gf2hf\" (UID: \"78dcd167-b38a-4da9-a9c6-63e48eed5832\") " pod="openstack/horizon-5dcfc8787c-gf2hf" Oct 03 14:34:44 crc kubenswrapper[4962]: I1003 14:34:44.084596 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/78dcd167-b38a-4da9-a9c6-63e48eed5832-config-data\") pod \"horizon-5dcfc8787c-gf2hf\" (UID: \"78dcd167-b38a-4da9-a9c6-63e48eed5832\") " pod="openstack/horizon-5dcfc8787c-gf2hf" Oct 03 14:34:44 crc kubenswrapper[4962]: I1003 14:34:44.090035 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/78dcd167-b38a-4da9-a9c6-63e48eed5832-horizon-secret-key\") pod \"horizon-5dcfc8787c-gf2hf\" (UID: \"78dcd167-b38a-4da9-a9c6-63e48eed5832\") " pod="openstack/horizon-5dcfc8787c-gf2hf" Oct 03 14:34:44 crc kubenswrapper[4962]: I1003 14:34:44.099280 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxtm6\" (UniqueName: \"kubernetes.io/projected/78dcd167-b38a-4da9-a9c6-63e48eed5832-kube-api-access-xxtm6\") pod \"horizon-5dcfc8787c-gf2hf\" (UID: \"78dcd167-b38a-4da9-a9c6-63e48eed5832\") " pod="openstack/horizon-5dcfc8787c-gf2hf" Oct 03 14:34:44 crc kubenswrapper[4962]: I1003 14:34:44.146134 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-9cb97f545-qzjwt"] Oct 03 14:34:44 crc kubenswrapper[4962]: I1003 14:34:44.151112 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 
14:34:44 crc kubenswrapper[4962]: I1003 14:34:44.194034 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5dcfc8787c-gf2hf" Oct 03 14:34:44 crc kubenswrapper[4962]: I1003 14:34:44.239455 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db99eb0e-0fce-44df-8c81-19936bc939e1" path="/var/lib/kubelet/pods/db99eb0e-0fce-44df-8c81-19936bc939e1/volumes" Oct 03 14:34:44 crc kubenswrapper[4962]: I1003 14:34:44.261434 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6f57db8649-rvfnl"] Oct 03 14:34:44 crc kubenswrapper[4962]: W1003 14:34:44.270529 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89694890_d266_4cbb_be14_cda9a74927e4.slice/crio-934a78c4b07730ee03da250e13bb9d2b447ddaae8c438129b57a58f1de953215 WatchSource:0}: Error finding container 934a78c4b07730ee03da250e13bb9d2b447ddaae8c438129b57a58f1de953215: Status 404 returned error can't find the container with id 934a78c4b07730ee03da250e13bb9d2b447ddaae8c438129b57a58f1de953215 Oct 03 14:34:44 crc kubenswrapper[4962]: I1003 14:34:44.675309 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f57db8649-rvfnl" event={"ID":"89694890-d266-4cbb-be14-cda9a74927e4","Type":"ContainerStarted","Data":"934a78c4b07730ee03da250e13bb9d2b447ddaae8c438129b57a58f1de953215"} Oct 03 14:34:44 crc kubenswrapper[4962]: I1003 14:34:44.676461 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9cb97f545-qzjwt" event={"ID":"48962e75-a3a8-46ef-8731-b3c692bb245e","Type":"ContainerStarted","Data":"e33f027110beae95ea416ee90a6656513e966dc89a3080fc026cdbaf333a6dcd"} Oct 03 14:34:44 crc kubenswrapper[4962]: I1003 14:34:44.733629 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5dcfc8787c-gf2hf"] Oct 03 14:34:44 crc kubenswrapper[4962]: W1003 14:34:44.737838 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78dcd167_b38a_4da9_a9c6_63e48eed5832.slice/crio-1e1b7547583f1c44cb91845d344552594e5cb0592d23527ae6a200ab1dedbad6 WatchSource:0}: Error finding container 1e1b7547583f1c44cb91845d344552594e5cb0592d23527ae6a200ab1dedbad6: Status 404 returned error can't find the container with id 1e1b7547583f1c44cb91845d344552594e5cb0592d23527ae6a200ab1dedbad6 Oct 03 14:34:45 crc kubenswrapper[4962]: I1003 14:34:45.686291 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5dcfc8787c-gf2hf" event={"ID":"78dcd167-b38a-4da9-a9c6-63e48eed5832","Type":"ContainerStarted","Data":"1e1b7547583f1c44cb91845d344552594e5cb0592d23527ae6a200ab1dedbad6"} Oct 03 14:34:45 crc kubenswrapper[4962]: E1003 14:34:45.690654 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = parsing image configuration: Get 
\"https://cdn01.quay.io/quayio-production-s3/sha256/d4/d4c1bb03bed8e83081b02b71b7d408f3b40f437bb3096f5c73438f57b85bd170?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20251003%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20251003T143444Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=267b225156f206caa61e7babe3bd4db348887a4c65a1015cd525b5261f2be640®ion=us-east-1&namespace=podified-antelope-centos9&username=openshift-release-dev+ocm_access_1b89217552bc42d1be3fb06a1aed001a&repo_name=openstack-horizon&akamai_signature=exp=1759502984~hmac=0707cff1cfd463ad4fc754a7df26f8e4a62ed69a12b2bf4a7c579e3c135c86d7\": remote error: tls: internal error" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Oct 03 14:34:45 crc kubenswrapper[4962]: E1003 14:34:45.690817 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n587h674h674h5d9h8bh6ch656hb5hf7h5bch8h5fdh685h5cfh75h684h68hc4h85h699h5cbhddh5b4h644h5fbh646hd9hf7h648h95h56hfcq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2m8pt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-9cb97f545-qzjwt_openstack(48962e75-a3a8-46ef-8731-b3c692bb245e): ErrImagePull: parsing image configuration: Get \"https://cdn01.quay.io/quayio-production-s3/sha256/d4/d4c1bb03bed8e83081b02b71b7d408f3b40f437bb3096f5c73438f57b85bd170?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20251003%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20251003T143444Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=267b225156f206caa61e7babe3bd4db348887a4c65a1015cd525b5261f2be640®ion=us-east-1&namespace=podified-antelope-centos9&username=openshift-release-dev+ocm_access_1b89217552bc42d1be3fb06a1aed001a&repo_name=openstack-horizon&akamai_signature=exp=1759502984~hmac=0707cff1cfd463ad4fc754a7df26f8e4a62ed69a12b2bf4a7c579e3c135c86d7\": 
remote error: tls: internal error" logger="UnhandledError" Oct 03 14:34:45 crc kubenswrapper[4962]: E1003 14:34:45.692663 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"parsing image configuration: Get \\\"https://cdn01.quay.io/quayio-production-s3/sha256/d4/d4c1bb03bed8e83081b02b71b7d408f3b40f437bb3096f5c73438f57b85bd170?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20251003%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20251003T143444Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=267b225156f206caa61e7babe3bd4db348887a4c65a1015cd525b5261f2be640®ion=us-east-1&namespace=podified-antelope-centos9&username=openshift-release-dev+ocm_access_1b89217552bc42d1be3fb06a1aed001a&repo_name=openstack-horizon&akamai_signature=exp=1759502984~hmac=0707cff1cfd463ad4fc754a7df26f8e4a62ed69a12b2bf4a7c579e3c135c86d7\\\": remote error: tls: internal error\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-9cb97f545-qzjwt" podUID="48962e75-a3a8-46ef-8731-b3c692bb245e" Oct 03 14:34:45 crc kubenswrapper[4962]: E1003 14:34:45.710749 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = parsing image configuration: Get \"https://cdn01.quay.io/quayio-production-s3/sha256/d4/d4c1bb03bed8e83081b02b71b7d408f3b40f437bb3096f5c73438f57b85bd170?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20251003%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20251003T143444Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=267b225156f206caa61e7babe3bd4db348887a4c65a1015cd525b5261f2be640®ion=us-east-1&namespace=podified-antelope-centos9&username=openshift-release-dev+ocm_access_1b89217552bc42d1be3fb06a1aed001a&repo_name=openstack-horizon&akamai_signature=exp=1759502984~hmac=0707cff1cfd463ad4fc754a7df26f8e4a62ed69a12b2bf4a7c579e3c135c86d7\": remote error: tls: internal error" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Oct 03 14:34:45 crc kubenswrapper[4962]: E1003 14:34:45.710971 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nf5h559hbch9h66h594h57h546h66ch568h55fh69h5d9hbh5fh7h87h579h67bh5b9h86h577h79h5f4h59bh565h4h54h94h8ch666h55bq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rwnck,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-6f57db8649-rvfnl_openstack(89694890-d266-4cbb-be14-cda9a74927e4): ErrImagePull: parsing image configuration: Get \"https://cdn01.quay.io/quayio-production-s3/sha256/d4/d4c1bb03bed8e83081b02b71b7d408f3b40f437bb3096f5c73438f57b85bd170?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20251003%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20251003T143444Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=267b225156f206caa61e7babe3bd4db348887a4c65a1015cd525b5261f2be640&region=us-east-1&namespace=podified-antelope-centos9&username=openshift-release-dev+ocm_access_1b89217552bc42d1be3fb06a1aed001a&repo_name=openstack-horizon&akamai_signature=exp=1759502984~hmac=0707cff1cfd463ad4fc754a7df26f8e4a62ed69a12b2bf4a7c579e3c135c86d7\": remote error: tls: internal error" logger="UnhandledError" Oct 03 14:34:45 crc kubenswrapper[4962]: E1003 14:34:45.712983 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"parsing image configuration: Get \\\"https://cdn01.quay.io/quayio-production-s3/sha256/d4/d4c1bb03bed8e83081b02b71b7d408f3b40f437bb3096f5c73438f57b85bd170?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20251003%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20251003T143444Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=267b225156f206caa61e7babe3bd4db348887a4c65a1015cd525b5261f2be640&region=us-east-1&namespace=podified-antelope-centos9&username=openshift-release-dev+ocm_access_1b89217552bc42d1be3fb06a1aed001a&repo_name=openstack-horizon&akamai_signature=exp=1759502984~hmac=0707cff1cfd463ad4fc754a7df26f8e4a62ed69a12b2bf4a7c579e3c135c86d7\\\": remote error: tls: internal error\", failed to \"StartContainer\" for \"horizon\" with
ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-6f57db8649-rvfnl" podUID="89694890-d266-4cbb-be14-cda9a74927e4" Oct 03 14:34:46 crc kubenswrapper[4962]: E1003 14:34:46.208376 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = parsing image configuration: Get \"https://cdn01.quay.io/quayio-production-s3/sha256/d4/d4c1bb03bed8e83081b02b71b7d408f3b40f437bb3096f5c73438f57b85bd170?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20251003%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20251003T143445Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=2924117f4fe1ddcd0f6e9e8167cf18dd308a7db8100c2e6613d5c58026c24855®ion=us-east-1&namespace=podified-antelope-centos9&username=openshift-release-dev+ocm_access_1b89217552bc42d1be3fb06a1aed001a&repo_name=openstack-horizon&akamai_signature=exp=1759502985~hmac=8ddc3224887aa3eaccb7a70523f43b338f2babc34e5b5e1c2227146291a3c60d\": remote error: tls: internal error" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Oct 03 14:34:46 crc kubenswrapper[4962]: E1003 14:34:46.209025 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n8ch6ch58fh658h54ch67h5b6h5f5hb7h5ddh5d9h88h5f5hdbh688h55h58fh8ch58ch678hbdh5d7h68fh647h556h6bhc5h669h66bh694hb8h6cq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xxtm6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-5dcfc8787c-gf2hf_openstack(78dcd167-b38a-4da9-a9c6-63e48eed5832): ErrImagePull: parsing image configuration: Get 
\"https://cdn01.quay.io/quayio-production-s3/sha256/d4/d4c1bb03bed8e83081b02b71b7d408f3b40f437bb3096f5c73438f57b85bd170?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20251003%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20251003T143445Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=2924117f4fe1ddcd0f6e9e8167cf18dd308a7db8100c2e6613d5c58026c24855®ion=us-east-1&namespace=podified-antelope-centos9&username=openshift-release-dev+ocm_access_1b89217552bc42d1be3fb06a1aed001a&repo_name=openstack-horizon&akamai_signature=exp=1759502985~hmac=8ddc3224887aa3eaccb7a70523f43b338f2babc34e5b5e1c2227146291a3c60d\": remote error: tls: internal error" logger="UnhandledError" Oct 03 14:34:46 crc kubenswrapper[4962]: E1003 14:34:46.211930 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"parsing image configuration: Get \\\"https://cdn01.quay.io/quayio-production-s3/sha256/d4/d4c1bb03bed8e83081b02b71b7d408f3b40f437bb3096f5c73438f57b85bd170?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20251003%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20251003T143445Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=2924117f4fe1ddcd0f6e9e8167cf18dd308a7db8100c2e6613d5c58026c24855®ion=us-east-1&namespace=podified-antelope-centos9&username=openshift-release-dev+ocm_access_1b89217552bc42d1be3fb06a1aed001a&repo_name=openstack-horizon&akamai_signature=exp=1759502985~hmac=8ddc3224887aa3eaccb7a70523f43b338f2babc34e5b5e1c2227146291a3c60d\\\": remote error: tls: internal error\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-5dcfc8787c-gf2hf" podUID="78dcd167-b38a-4da9-a9c6-63e48eed5832" Oct 03 14:34:46 crc kubenswrapper[4962]: I1003 14:34:46.698098 4962 generic.go:334] "Generic (PLEG): container finished" podID="ab453a1b-a691-409f-8826-ef5286ea8efc" containerID="32a933e95b7bb998f66c1c9e494eb984df233d3076b47477db1578faec0062d0" exitCode=0 Oct 03 14:34:46 crc kubenswrapper[4962]: I1003 14:34:46.698146 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ab453a1b-a691-409f-8826-ef5286ea8efc","Type":"ContainerDied","Data":"32a933e95b7bb998f66c1c9e494eb984df233d3076b47477db1578faec0062d0"} Oct 03 14:34:46 crc kubenswrapper[4962]: I1003 14:34:46.701238 4962 generic.go:334] "Generic (PLEG): container finished" podID="2c967c3d-1f67-42ff-9849-dcd585648bb8" containerID="c9f09c18f69924294689c67ad8cd1e601b8e4deb075871b299c705dc3c27e121" exitCode=0 Oct 03 14:34:46 crc kubenswrapper[4962]: I1003 14:34:46.701352 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2c967c3d-1f67-42ff-9849-dcd585648bb8","Type":"ContainerDied","Data":"c9f09c18f69924294689c67ad8cd1e601b8e4deb075871b299c705dc3c27e121"} Oct 03 14:34:46 crc kubenswrapper[4962]: E1003 14:34:46.705797 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-5dcfc8787c-gf2hf" 
podUID="78dcd167-b38a-4da9-a9c6-63e48eed5832" Oct 03 14:34:46 crc kubenswrapper[4962]: E1003 14:34:46.705835 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-9cb97f545-qzjwt" podUID="48962e75-a3a8-46ef-8731-b3c692bb245e" Oct 03 14:34:46 crc kubenswrapper[4962]: I1003 14:34:46.997918 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.041771 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c967c3d-1f67-42ff-9849-dcd585648bb8-config-data\") pod \"2c967c3d-1f67-42ff-9849-dcd585648bb8\" (UID: \"2c967c3d-1f67-42ff-9849-dcd585648bb8\") " Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.041935 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2c967c3d-1f67-42ff-9849-dcd585648bb8-httpd-run\") pod \"2c967c3d-1f67-42ff-9849-dcd585648bb8\" (UID: \"2c967c3d-1f67-42ff-9849-dcd585648bb8\") " Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.042128 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c967c3d-1f67-42ff-9849-dcd585648bb8-combined-ca-bundle\") pod \"2c967c3d-1f67-42ff-9849-dcd585648bb8\" (UID: \"2c967c3d-1f67-42ff-9849-dcd585648bb8\") " Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.042150 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2c967c3d-1f67-42ff-9849-dcd585648bb8-ceph\") pod \"2c967c3d-1f67-42ff-9849-dcd585648bb8\" (UID: \"2c967c3d-1f67-42ff-9849-dcd585648bb8\") " Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.042183 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c967c3d-1f67-42ff-9849-dcd585648bb8-scripts\") pod \"2c967c3d-1f67-42ff-9849-dcd585648bb8\" (UID: \"2c967c3d-1f67-42ff-9849-dcd585648bb8\") " Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.042204 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c967c3d-1f67-42ff-9849-dcd585648bb8-logs\") pod \"2c967c3d-1f67-42ff-9849-dcd585648bb8\" (UID: \"2c967c3d-1f67-42ff-9849-dcd585648bb8\") " Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.042281 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbzrt\" (UniqueName: \"kubernetes.io/projected/2c967c3d-1f67-42ff-9849-dcd585648bb8-kube-api-access-mbzrt\") pod \"2c967c3d-1f67-42ff-9849-dcd585648bb8\" (UID: \"2c967c3d-1f67-42ff-9849-dcd585648bb8\") " Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.042628 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c967c3d-1f67-42ff-9849-dcd585648bb8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2c967c3d-1f67-42ff-9849-dcd585648bb8" (UID: 
"2c967c3d-1f67-42ff-9849-dcd585648bb8"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.043071 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c967c3d-1f67-42ff-9849-dcd585648bb8-logs" (OuterVolumeSpecName: "logs") pod "2c967c3d-1f67-42ff-9849-dcd585648bb8" (UID: "2c967c3d-1f67-42ff-9849-dcd585648bb8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.043345 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c967c3d-1f67-42ff-9849-dcd585648bb8-logs\") on node \"crc\" DevicePath \"\"" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.044272 4962 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2c967c3d-1f67-42ff-9849-dcd585648bb8-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.054314 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c967c3d-1f67-42ff-9849-dcd585648bb8-kube-api-access-mbzrt" (OuterVolumeSpecName: "kube-api-access-mbzrt") pod "2c967c3d-1f67-42ff-9849-dcd585648bb8" (UID: "2c967c3d-1f67-42ff-9849-dcd585648bb8"). InnerVolumeSpecName "kube-api-access-mbzrt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.055171 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c967c3d-1f67-42ff-9849-dcd585648bb8-ceph" (OuterVolumeSpecName: "ceph") pod "2c967c3d-1f67-42ff-9849-dcd585648bb8" (UID: "2c967c3d-1f67-42ff-9849-dcd585648bb8"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.069581 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c967c3d-1f67-42ff-9849-dcd585648bb8-scripts" (OuterVolumeSpecName: "scripts") pod "2c967c3d-1f67-42ff-9849-dcd585648bb8" (UID: "2c967c3d-1f67-42ff-9849-dcd585648bb8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.094780 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c967c3d-1f67-42ff-9849-dcd585648bb8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2c967c3d-1f67-42ff-9849-dcd585648bb8" (UID: "2c967c3d-1f67-42ff-9849-dcd585648bb8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.134652 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f57db8649-rvfnl" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.136168 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c967c3d-1f67-42ff-9849-dcd585648bb8-config-data" (OuterVolumeSpecName: "config-data") pod "2c967c3d-1f67-42ff-9849-dcd585648bb8" (UID: "2c967c3d-1f67-42ff-9849-dcd585648bb8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.146144 4962 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2c967c3d-1f67-42ff-9849-dcd585648bb8-ceph\") on node \"crc\" DevicePath \"\"" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.146170 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c967c3d-1f67-42ff-9849-dcd585648bb8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.146181 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c967c3d-1f67-42ff-9849-dcd585648bb8-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.146192 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbzrt\" (UniqueName: \"kubernetes.io/projected/2c967c3d-1f67-42ff-9849-dcd585648bb8-kube-api-access-mbzrt\") on node \"crc\" DevicePath \"\"" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.146201 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c967c3d-1f67-42ff-9849-dcd585648bb8-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.182197 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.247213 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab453a1b-a691-409f-8826-ef5286ea8efc-combined-ca-bundle\") pod \"ab453a1b-a691-409f-8826-ef5286ea8efc\" (UID: \"ab453a1b-a691-409f-8826-ef5286ea8efc\") " Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.247300 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab453a1b-a691-409f-8826-ef5286ea8efc-logs\") pod \"ab453a1b-a691-409f-8826-ef5286ea8efc\" (UID: \"ab453a1b-a691-409f-8826-ef5286ea8efc\") " Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.247338 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89694890-d266-4cbb-be14-cda9a74927e4-logs\") pod \"89694890-d266-4cbb-be14-cda9a74927e4\" (UID: \"89694890-d266-4cbb-be14-cda9a74927e4\") " Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.247361 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwnck\" (UniqueName: \"kubernetes.io/projected/89694890-d266-4cbb-be14-cda9a74927e4-kube-api-access-rwnck\") pod \"89694890-d266-4cbb-be14-cda9a74927e4\" (UID: \"89694890-d266-4cbb-be14-cda9a74927e4\") " Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.247406 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ab453a1b-a691-409f-8826-ef5286ea8efc-ceph\") pod \"ab453a1b-a691-409f-8826-ef5286ea8efc\" (UID: \"ab453a1b-a691-409f-8826-ef5286ea8efc\") " Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.247445 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/89694890-d266-4cbb-be14-cda9a74927e4-scripts\") pod 
\"89694890-d266-4cbb-be14-cda9a74927e4\" (UID: \"89694890-d266-4cbb-be14-cda9a74927e4\") " Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.247490 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab453a1b-a691-409f-8826-ef5286ea8efc-config-data\") pod \"ab453a1b-a691-409f-8826-ef5286ea8efc\" (UID: \"ab453a1b-a691-409f-8826-ef5286ea8efc\") " Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.247544 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ab453a1b-a691-409f-8826-ef5286ea8efc-httpd-run\") pod \"ab453a1b-a691-409f-8826-ef5286ea8efc\" (UID: \"ab453a1b-a691-409f-8826-ef5286ea8efc\") " Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.247582 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/89694890-d266-4cbb-be14-cda9a74927e4-config-data\") pod \"89694890-d266-4cbb-be14-cda9a74927e4\" (UID: \"89694890-d266-4cbb-be14-cda9a74927e4\") " Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.247615 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvxj2\" (UniqueName: \"kubernetes.io/projected/ab453a1b-a691-409f-8826-ef5286ea8efc-kube-api-access-lvxj2\") pod \"ab453a1b-a691-409f-8826-ef5286ea8efc\" (UID: \"ab453a1b-a691-409f-8826-ef5286ea8efc\") " Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.247709 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/89694890-d266-4cbb-be14-cda9a74927e4-horizon-secret-key\") pod \"89694890-d266-4cbb-be14-cda9a74927e4\" (UID: \"89694890-d266-4cbb-be14-cda9a74927e4\") " Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.247755 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab453a1b-a691-409f-8826-ef5286ea8efc-scripts\") pod \"ab453a1b-a691-409f-8826-ef5286ea8efc\" (UID: \"ab453a1b-a691-409f-8826-ef5286ea8efc\") " Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.248189 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab453a1b-a691-409f-8826-ef5286ea8efc-logs" (OuterVolumeSpecName: "logs") pod "ab453a1b-a691-409f-8826-ef5286ea8efc" (UID: "ab453a1b-a691-409f-8826-ef5286ea8efc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.248434 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab453a1b-a691-409f-8826-ef5286ea8efc-logs\") on node \"crc\" DevicePath \"\"" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.249443 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89694890-d266-4cbb-be14-cda9a74927e4-config-data" (OuterVolumeSpecName: "config-data") pod "89694890-d266-4cbb-be14-cda9a74927e4" (UID: "89694890-d266-4cbb-be14-cda9a74927e4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.250225 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89694890-d266-4cbb-be14-cda9a74927e4-logs" (OuterVolumeSpecName: "logs") pod "89694890-d266-4cbb-be14-cda9a74927e4" (UID: "89694890-d266-4cbb-be14-cda9a74927e4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.250492 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89694890-d266-4cbb-be14-cda9a74927e4-scripts" (OuterVolumeSpecName: "scripts") pod "89694890-d266-4cbb-be14-cda9a74927e4" (UID: "89694890-d266-4cbb-be14-cda9a74927e4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.250818 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab453a1b-a691-409f-8826-ef5286ea8efc-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ab453a1b-a691-409f-8826-ef5286ea8efc" (UID: "ab453a1b-a691-409f-8826-ef5286ea8efc"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.252883 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89694890-d266-4cbb-be14-cda9a74927e4-kube-api-access-rwnck" (OuterVolumeSpecName: "kube-api-access-rwnck") pod "89694890-d266-4cbb-be14-cda9a74927e4" (UID: "89694890-d266-4cbb-be14-cda9a74927e4"). InnerVolumeSpecName "kube-api-access-rwnck". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.253018 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab453a1b-a691-409f-8826-ef5286ea8efc-ceph" (OuterVolumeSpecName: "ceph") pod "ab453a1b-a691-409f-8826-ef5286ea8efc" (UID: "ab453a1b-a691-409f-8826-ef5286ea8efc"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.253252 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab453a1b-a691-409f-8826-ef5286ea8efc-kube-api-access-lvxj2" (OuterVolumeSpecName: "kube-api-access-lvxj2") pod "ab453a1b-a691-409f-8826-ef5286ea8efc" (UID: "ab453a1b-a691-409f-8826-ef5286ea8efc"). InnerVolumeSpecName "kube-api-access-lvxj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.254702 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab453a1b-a691-409f-8826-ef5286ea8efc-scripts" (OuterVolumeSpecName: "scripts") pod "ab453a1b-a691-409f-8826-ef5286ea8efc" (UID: "ab453a1b-a691-409f-8826-ef5286ea8efc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.255192 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89694890-d266-4cbb-be14-cda9a74927e4-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "89694890-d266-4cbb-be14-cda9a74927e4" (UID: "89694890-d266-4cbb-be14-cda9a74927e4"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.277527 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab453a1b-a691-409f-8826-ef5286ea8efc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab453a1b-a691-409f-8826-ef5286ea8efc" (UID: "ab453a1b-a691-409f-8826-ef5286ea8efc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.299397 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab453a1b-a691-409f-8826-ef5286ea8efc-config-data" (OuterVolumeSpecName: "config-data") pod "ab453a1b-a691-409f-8826-ef5286ea8efc" (UID: "ab453a1b-a691-409f-8826-ef5286ea8efc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.350446 4962 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ab453a1b-a691-409f-8826-ef5286ea8efc-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.350474 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/89694890-d266-4cbb-be14-cda9a74927e4-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.350484 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvxj2\" (UniqueName: \"kubernetes.io/projected/ab453a1b-a691-409f-8826-ef5286ea8efc-kube-api-access-lvxj2\") on node \"crc\" DevicePath \"\"" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.350492 4962 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/89694890-d266-4cbb-be14-cda9a74927e4-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.350500 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab453a1b-a691-409f-8826-ef5286ea8efc-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.350507 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab453a1b-a691-409f-8826-ef5286ea8efc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.350515 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwnck\" (UniqueName: \"kubernetes.io/projected/89694890-d266-4cbb-be14-cda9a74927e4-kube-api-access-rwnck\") on node \"crc\" DevicePath \"\"" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.350523 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89694890-d266-4cbb-be14-cda9a74927e4-logs\") on node \"crc\" DevicePath \"\"" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.350532 4962 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ab453a1b-a691-409f-8826-ef5286ea8efc-ceph\") on node \"crc\" DevicePath \"\"" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.350540 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/89694890-d266-4cbb-be14-cda9a74927e4-scripts\") on node \"crc\" DevicePath 
\"\"" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.350547 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab453a1b-a691-409f-8826-ef5286ea8efc-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.714964 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ab453a1b-a691-409f-8826-ef5286ea8efc","Type":"ContainerDied","Data":"7d1e0c4aedb956cc03f296b66d1f870ddfc8e8e039a92729d225484c5b348c64"} Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.715266 4962 scope.go:117] "RemoveContainer" containerID="32a933e95b7bb998f66c1c9e494eb984df233d3076b47477db1578faec0062d0" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.715023 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.716838 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f57db8649-rvfnl" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.716880 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f57db8649-rvfnl" event={"ID":"89694890-d266-4cbb-be14-cda9a74927e4","Type":"ContainerDied","Data":"934a78c4b07730ee03da250e13bb9d2b447ddaae8c438129b57a58f1de953215"} Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.719407 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2c967c3d-1f67-42ff-9849-dcd585648bb8","Type":"ContainerDied","Data":"5fd722d43f15ab83b3a8c49abe9530a5c27ee768fef9cc6cd388242ac41383b2"} Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.719522 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.739181 4962 scope.go:117] "RemoveContainer" containerID="01694b5bb759b57df173cde34bcd5520ef6b98929175f03f6ea30b50526f8ddb" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.759143 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.768572 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.779986 4962 scope.go:117] "RemoveContainer" containerID="c9f09c18f69924294689c67ad8cd1e601b8e4deb075871b299c705dc3c27e121" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.812896 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 14:34:47 crc kubenswrapper[4962]: E1003 14:34:47.813414 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab453a1b-a691-409f-8826-ef5286ea8efc" containerName="glance-log" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.813427 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab453a1b-a691-409f-8826-ef5286ea8efc" containerName="glance-log" Oct 03 14:34:47 crc kubenswrapper[4962]: E1003 14:34:47.813458 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c967c3d-1f67-42ff-9849-dcd585648bb8" containerName="glance-httpd" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.813464 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c967c3d-1f67-42ff-9849-dcd585648bb8" containerName="glance-httpd" Oct 03 14:34:47 crc kubenswrapper[4962]: E1003 14:34:47.813474 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab453a1b-a691-409f-8826-ef5286ea8efc" containerName="glance-httpd" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.813481 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab453a1b-a691-409f-8826-ef5286ea8efc" containerName="glance-httpd" Oct 03 14:34:47 crc kubenswrapper[4962]: E1003 14:34:47.813497 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c967c3d-1f67-42ff-9849-dcd585648bb8" containerName="glance-log" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.813503 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c967c3d-1f67-42ff-9849-dcd585648bb8" containerName="glance-log" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.813798 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab453a1b-a691-409f-8826-ef5286ea8efc" containerName="glance-httpd" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.813812 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c967c3d-1f67-42ff-9849-dcd585648bb8" containerName="glance-log" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.813828 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c967c3d-1f67-42ff-9849-dcd585648bb8" containerName="glance-httpd" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.813841 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab453a1b-a691-409f-8826-ef5286ea8efc" containerName="glance-log" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.815026 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.819232 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.819456 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.821831 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-4pr5m" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.824298 4962 scope.go:117] "RemoveContainer" containerID="9b7d13c58dc06972aa0112710d144a8e1c55f80e3094ec53d1dc34ed57b57252" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.848057 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6f57db8649-rvfnl"] Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.856654 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6f57db8649-rvfnl"] Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.863308 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1afb0a2b-e861-4676-a5a6-f762c65ac044-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1afb0a2b-e861-4676-a5a6-f762c65ac044\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.863406 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1afb0a2b-e861-4676-a5a6-f762c65ac044-logs\") pod \"glance-default-internal-api-0\" (UID: \"1afb0a2b-e861-4676-a5a6-f762c65ac044\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.863547 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1afb0a2b-e861-4676-a5a6-f762c65ac044-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1afb0a2b-e861-4676-a5a6-f762c65ac044\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.863621 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1afb0a2b-e861-4676-a5a6-f762c65ac044-ceph\") pod \"glance-default-internal-api-0\" (UID: \"1afb0a2b-e861-4676-a5a6-f762c65ac044\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.863680 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1afb0a2b-e861-4676-a5a6-f762c65ac044-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1afb0a2b-e861-4676-a5a6-f762c65ac044\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.863754 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txg6h\" (UniqueName: \"kubernetes.io/projected/1afb0a2b-e861-4676-a5a6-f762c65ac044-kube-api-access-txg6h\") pod \"glance-default-internal-api-0\" (UID: \"1afb0a2b-e861-4676-a5a6-f762c65ac044\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.863826 4962 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1afb0a2b-e861-4676-a5a6-f762c65ac044-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1afb0a2b-e861-4676-a5a6-f762c65ac044\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.864993 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.874623 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.882623 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.890249 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.892295 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.895872 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.897912 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.965881 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43335121-1384-48d5-b2e8-ba845364de87-logs\") pod \"glance-default-external-api-0\" (UID: \"43335121-1384-48d5-b2e8-ba845364de87\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.966017 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1afb0a2b-e861-4676-a5a6-f762c65ac044-ceph\") pod \"glance-default-internal-api-0\" (UID: \"1afb0a2b-e861-4676-a5a6-f762c65ac044\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.966076 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1afb0a2b-e861-4676-a5a6-f762c65ac044-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1afb0a2b-e861-4676-a5a6-f762c65ac044\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.966158 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txg6h\" (UniqueName: \"kubernetes.io/projected/1afb0a2b-e861-4676-a5a6-f762c65ac044-kube-api-access-txg6h\") pod \"glance-default-internal-api-0\" (UID: \"1afb0a2b-e861-4676-a5a6-f762c65ac044\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.966243 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1afb0a2b-e861-4676-a5a6-f762c65ac044-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1afb0a2b-e861-4676-a5a6-f762c65ac044\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.966275 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/43335121-1384-48d5-b2e8-ba845364de87-ceph\") pod \"glance-default-external-api-0\" (UID: \"43335121-1384-48d5-b2e8-ba845364de87\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.966329 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/43335121-1384-48d5-b2e8-ba845364de87-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"43335121-1384-48d5-b2e8-ba845364de87\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.966353 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1afb0a2b-e861-4676-a5a6-f762c65ac044-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1afb0a2b-e861-4676-a5a6-f762c65ac044\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.966396 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1afb0a2b-e861-4676-a5a6-f762c65ac044-logs\") pod \"glance-default-internal-api-0\" (UID: \"1afb0a2b-e861-4676-a5a6-f762c65ac044\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.966429 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43335121-1384-48d5-b2e8-ba845364de87-config-data\") pod \"glance-default-external-api-0\" (UID: \"43335121-1384-48d5-b2e8-ba845364de87\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.966478 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43335121-1384-48d5-b2e8-ba845364de87-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"43335121-1384-48d5-b2e8-ba845364de87\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.966510 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43335121-1384-48d5-b2e8-ba845364de87-scripts\") pod \"glance-default-external-api-0\" (UID: \"43335121-1384-48d5-b2e8-ba845364de87\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.966588 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fvqb\" (UniqueName: \"kubernetes.io/projected/43335121-1384-48d5-b2e8-ba845364de87-kube-api-access-5fvqb\") pod \"glance-default-external-api-0\" (UID: \"43335121-1384-48d5-b2e8-ba845364de87\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.966660 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1afb0a2b-e861-4676-a5a6-f762c65ac044-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1afb0a2b-e861-4676-a5a6-f762c65ac044\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.967129 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1afb0a2b-e861-4676-a5a6-f762c65ac044-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1afb0a2b-e861-4676-a5a6-f762c65ac044\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.967257 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1afb0a2b-e861-4676-a5a6-f762c65ac044-logs\") pod \"glance-default-internal-api-0\" (UID: \"1afb0a2b-e861-4676-a5a6-f762c65ac044\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.971052 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1afb0a2b-e861-4676-a5a6-f762c65ac044-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1afb0a2b-e861-4676-a5a6-f762c65ac044\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.971865 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1afb0a2b-e861-4676-a5a6-f762c65ac044-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1afb0a2b-e861-4676-a5a6-f762c65ac044\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.976227 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1afb0a2b-e861-4676-a5a6-f762c65ac044-ceph\") pod \"glance-default-internal-api-0\" (UID: \"1afb0a2b-e861-4676-a5a6-f762c65ac044\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.977188 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1afb0a2b-e861-4676-a5a6-f762c65ac044-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1afb0a2b-e861-4676-a5a6-f762c65ac044\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:34:47 crc kubenswrapper[4962]: I1003 14:34:47.984022 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txg6h\" (UniqueName: \"kubernetes.io/projected/1afb0a2b-e861-4676-a5a6-f762c65ac044-kube-api-access-txg6h\") pod \"glance-default-internal-api-0\" (UID: \"1afb0a2b-e861-4676-a5a6-f762c65ac044\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:34:48 crc kubenswrapper[4962]: I1003 14:34:48.068840 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43335121-1384-48d5-b2e8-ba845364de87-config-data\") pod \"glance-default-external-api-0\" (UID: \"43335121-1384-48d5-b2e8-ba845364de87\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:34:48 crc kubenswrapper[4962]: I1003 14:34:48.068895 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43335121-1384-48d5-b2e8-ba845364de87-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"43335121-1384-48d5-b2e8-ba845364de87\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:34:48 crc kubenswrapper[4962]: I1003 14:34:48.068918 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43335121-1384-48d5-b2e8-ba845364de87-scripts\") pod \"glance-default-external-api-0\" (UID: \"43335121-1384-48d5-b2e8-ba845364de87\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:34:48 crc kubenswrapper[4962]: I1003 14:34:48.068973 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fvqb\" (UniqueName: \"kubernetes.io/projected/43335121-1384-48d5-b2e8-ba845364de87-kube-api-access-5fvqb\") pod \"glance-default-external-api-0\" (UID: \"43335121-1384-48d5-b2e8-ba845364de87\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:34:48 crc kubenswrapper[4962]: I1003 14:34:48.069011 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43335121-1384-48d5-b2e8-ba845364de87-logs\") pod \"glance-default-external-api-0\" (UID: \"43335121-1384-48d5-b2e8-ba845364de87\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:34:48 crc kubenswrapper[4962]: I1003 14:34:48.069092 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/43335121-1384-48d5-b2e8-ba845364de87-ceph\") pod \"glance-default-external-api-0\" (UID: \"43335121-1384-48d5-b2e8-ba845364de87\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:34:48 crc kubenswrapper[4962]: I1003 14:34:48.069114 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/43335121-1384-48d5-b2e8-ba845364de87-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"43335121-1384-48d5-b2e8-ba845364de87\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:34:48 crc kubenswrapper[4962]: I1003 14:34:48.069575 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/43335121-1384-48d5-b2e8-ba845364de87-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"43335121-1384-48d5-b2e8-ba845364de87\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:34:48 crc kubenswrapper[4962]: I1003 14:34:48.069870 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43335121-1384-48d5-b2e8-ba845364de87-logs\") pod \"glance-default-external-api-0\" (UID: \"43335121-1384-48d5-b2e8-ba845364de87\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:34:48 crc kubenswrapper[4962]: I1003 14:34:48.073755 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43335121-1384-48d5-b2e8-ba845364de87-scripts\") pod \"glance-default-external-api-0\" (UID: \"43335121-1384-48d5-b2e8-ba845364de87\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:34:48 crc kubenswrapper[4962]: I1003 14:34:48.073895 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/43335121-1384-48d5-b2e8-ba845364de87-ceph\") pod \"glance-default-external-api-0\" (UID: \"43335121-1384-48d5-b2e8-ba845364de87\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:34:48 crc kubenswrapper[4962]: I1003 14:34:48.074448 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43335121-1384-48d5-b2e8-ba845364de87-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"43335121-1384-48d5-b2e8-ba845364de87\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:34:48 crc kubenswrapper[4962]: I1003 14:34:48.074610 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43335121-1384-48d5-b2e8-ba845364de87-config-data\") pod \"glance-default-external-api-0\" (UID: \"43335121-1384-48d5-b2e8-ba845364de87\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:34:48 crc kubenswrapper[4962]: I1003 14:34:48.087807 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fvqb\" (UniqueName: \"kubernetes.io/projected/43335121-1384-48d5-b2e8-ba845364de87-kube-api-access-5fvqb\") pod \"glance-default-external-api-0\" (UID: \"43335121-1384-48d5-b2e8-ba845364de87\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:34:48 crc kubenswrapper[4962]: I1003 14:34:48.145214 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 03 14:34:48 crc kubenswrapper[4962]: I1003 14:34:48.216050 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Oct 03 14:34:48 crc kubenswrapper[4962]: I1003 14:34:48.255815 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c967c3d-1f67-42ff-9849-dcd585648bb8" path="/var/lib/kubelet/pods/2c967c3d-1f67-42ff-9849-dcd585648bb8/volumes"
Oct 03 14:34:48 crc kubenswrapper[4962]: I1003 14:34:48.263590 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89694890-d266-4cbb-be14-cda9a74927e4" path="/var/lib/kubelet/pods/89694890-d266-4cbb-be14-cda9a74927e4/volumes"
Oct 03 14:34:48 crc kubenswrapper[4962]: I1003 14:34:48.264152 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab453a1b-a691-409f-8826-ef5286ea8efc" path="/var/lib/kubelet/pods/ab453a1b-a691-409f-8826-ef5286ea8efc/volumes"
Oct 03 14:34:48 crc kubenswrapper[4962]: I1003 14:34:48.683626 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 03 14:34:48 crc kubenswrapper[4962]: W1003 14:34:48.715725 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1afb0a2b_e861_4676_a5a6_f762c65ac044.slice/crio-0ec1974646dbd10593b0f869070698e5890a3b169668de98e8de585dde236b37 WatchSource:0}: Error finding container 0ec1974646dbd10593b0f869070698e5890a3b169668de98e8de585dde236b37: Status 404 returned error can't find the container with id 0ec1974646dbd10593b0f869070698e5890a3b169668de98e8de585dde236b37
Oct 03 14:34:48 crc kubenswrapper[4962]: I1003 14:34:48.748723 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1afb0a2b-e861-4676-a5a6-f762c65ac044","Type":"ContainerStarted","Data":"0ec1974646dbd10593b0f869070698e5890a3b169668de98e8de585dde236b37"}
Oct 03 14:34:48 crc kubenswrapper[4962]: W1003 14:34:48.856693 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43335121_1384_48d5_b2e8_ba845364de87.slice/crio-9f6d61ae3862110743c27ef871337dca804726b2cc5e5beafb9fdb0150cfd850 WatchSource:0}: Error finding container 9f6d61ae3862110743c27ef871337dca804726b2cc5e5beafb9fdb0150cfd850: Status 404 returned error can't find the container with id 9f6d61ae3862110743c27ef871337dca804726b2cc5e5beafb9fdb0150cfd850
Oct 03 14:34:48 crc kubenswrapper[4962]: I1003 14:34:48.870143 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 03 14:34:49 crc kubenswrapper[4962]: I1003 14:34:49.768821 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"43335121-1384-48d5-b2e8-ba845364de87","Type":"ContainerStarted","Data":"64c76eb6b2d94fbaa801a5c2ea123804e9a4c7b16a5a70f9e74dd77459c73d48"}
Oct 03 14:34:49 crc kubenswrapper[4962]: I1003 14:34:49.769811 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"43335121-1384-48d5-b2e8-ba845364de87","Type":"ContainerStarted","Data":"9f6d61ae3862110743c27ef871337dca804726b2cc5e5beafb9fdb0150cfd850"}
Oct 03 14:34:49 crc kubenswrapper[4962]: I1003 14:34:49.778125 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1afb0a2b-e861-4676-a5a6-f762c65ac044","Type":"ContainerStarted","Data":"89497839b1d00062917549944ccda3655581e83a92d71e81bbb625140a56ae1d"}
Oct 03 14:34:50 crc kubenswrapper[4962]: I1003 14:34:50.790510 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1afb0a2b-e861-4676-a5a6-f762c65ac044","Type":"ContainerStarted","Data":"57fb3256fca6b4ce365be3ba84f83250ac656daf47a187d9233e5b5369c8b681"}
Oct 03 14:34:50 crc kubenswrapper[4962]: I1003 14:34:50.792537 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"43335121-1384-48d5-b2e8-ba845364de87","Type":"ContainerStarted","Data":"722c30fff7c4bab55ae8f40820b4ef749d64f75d8fe6caf45c44a0fefd98e6f4"}
Oct 03 14:34:50 crc kubenswrapper[4962]: I1003 14:34:50.815531 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.815508238 podStartE2EDuration="3.815508238s" podCreationTimestamp="2025-10-03 14:34:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:34:50.80847699 +0000 UTC m=+6299.212374845" watchObservedRunningTime="2025-10-03 14:34:50.815508238 +0000 UTC m=+6299.219406073"
Oct 03 14:34:50 crc kubenswrapper[4962]: I1003 14:34:50.839159 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.839141368 podStartE2EDuration="3.839141368s" podCreationTimestamp="2025-10-03 14:34:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:34:50.829678096 +0000 UTC m=+6299.233575931" watchObservedRunningTime="2025-10-03 14:34:50.839141368 +0000 UTC m=+6299.243039193"
Oct 03 14:34:54 crc kubenswrapper[4962]: I1003 14:34:54.029676 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-lblt6"]
Oct 03 14:34:54 crc kubenswrapper[4962]: I1003 14:34:54.040109 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-lblt6"]
Oct 03 14:34:54 crc kubenswrapper[4962]: I1003 14:34:54.243234 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa6b2f6a-d9bd-456a-9842-b3975ea2a778" path="/var/lib/kubelet/pods/fa6b2f6a-d9bd-456a-9842-b3975ea2a778/volumes"
Oct 03 14:34:54 crc kubenswrapper[4962]: I1003 14:34:54.660382 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 14:34:54 crc kubenswrapper[4962]: I1003 14:34:54.660441 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 14:34:54 crc kubenswrapper[4962]: I1003 14:34:54.660493 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-46vck"
Oct 03 14:34:54 crc kubenswrapper[4962]: I1003 14:34:54.661312 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"075a3a2d68fc05b9db35c066365380f2ff374a7b6a1faec1634013ff945a759f"} pod="openshift-machine-config-operator/machine-config-daemon-46vck" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 03 14:34:54 crc kubenswrapper[4962]: I1003 14:34:54.661386 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" containerID="cri-o://075a3a2d68fc05b9db35c066365380f2ff374a7b6a1faec1634013ff945a759f" gracePeriod=600
Oct 03 14:34:54 crc kubenswrapper[4962]: I1003 14:34:54.830754 4962 generic.go:334] "Generic (PLEG): container finished" podID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerID="075a3a2d68fc05b9db35c066365380f2ff374a7b6a1faec1634013ff945a759f" exitCode=0
Oct 03 14:34:54 crc kubenswrapper[4962]: I1003 14:34:54.830878 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerDied","Data":"075a3a2d68fc05b9db35c066365380f2ff374a7b6a1faec1634013ff945a759f"}
Oct 03 14:34:54 crc kubenswrapper[4962]: I1003 14:34:54.831173 4962 scope.go:117] "RemoveContainer" containerID="d7dae192729a16d2e0ed7a375c7ee4990d673d8d62f560c5335516db0b1730db"
Oct 03 14:34:55 crc kubenswrapper[4962]: I1003 14:34:55.847321 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerStarted","Data":"e273395719cf61a5af9b7bc53d31b2386f724aead6ba873c67b6fad6aa1c9256"}
Oct 03 14:34:58 crc kubenswrapper[4962]: I1003 14:34:58.145978 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Oct 03 14:34:58 crc kubenswrapper[4962]: I1003 14:34:58.147500 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Oct 03 14:34:58 crc kubenswrapper[4962]: I1003 14:34:58.176238 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Oct 03 14:34:58 crc kubenswrapper[4962]: I1003 14:34:58.187075 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Oct 03 14:34:58 crc kubenswrapper[4962]: I1003 14:34:58.219851 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Oct 03 14:34:58 crc kubenswrapper[4962]: I1003 14:34:58.219987 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Oct 03 14:34:58 crc kubenswrapper[4962]: I1003 14:34:58.265223 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Oct 03 14:34:58 crc kubenswrapper[4962]: I1003 14:34:58.267190 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Oct 03 14:34:58 crc kubenswrapper[4962]: I1003 14:34:58.875148 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Oct 03 14:34:58 crc kubenswrapper[4962]: I1003 14:34:58.875240 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Oct 03 14:34:58 crc kubenswrapper[4962]: I1003 14:34:58.875258 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Oct 03 14:34:58 crc kubenswrapper[4962]: I1003 14:34:58.875270 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Oct 03 14:35:00 crc kubenswrapper[4962]: I1003 14:35:00.905930 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Oct 03 14:35:00 crc kubenswrapper[4962]: I1003 14:35:00.906740 4962 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Oct 03 14:35:00 crc kubenswrapper[4962]: I1003 14:35:00.917842 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Oct 03 14:35:00 crc kubenswrapper[4962]: I1003 14:35:00.933041 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Oct 03 14:35:00 crc kubenswrapper[4962]: I1003 14:35:00.933146 4962 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Oct 03 14:35:00 crc kubenswrapper[4962]: I1003 14:35:00.933325 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Oct 03 14:35:05 crc kubenswrapper[4962]: I1003 14:35:05.949280 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5dcfc8787c-gf2hf" event={"ID":"78dcd167-b38a-4da9-a9c6-63e48eed5832","Type":"ContainerStarted","Data":"90e55e78eef3f9fa1f0cf90b987de2073157b9275f8e66ae908f2792324b0d8a"}
Oct 03 14:35:05 crc kubenswrapper[4962]: I1003 14:35:05.949809 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5dcfc8787c-gf2hf" event={"ID":"78dcd167-b38a-4da9-a9c6-63e48eed5832","Type":"ContainerStarted","Data":"e9be0cd3d3b34f1841b1ce5732b9b56e4bb767747c6952425399c719f4741ffa"}
Oct 03 14:35:05 crc kubenswrapper[4962]: I1003 14:35:05.953528 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9cb97f545-qzjwt" event={"ID":"48962e75-a3a8-46ef-8731-b3c692bb245e","Type":"ContainerStarted","Data":"b3691b848384dcc992ed93a916c86b6f4c0f0b9f9b2a4d59ceb5a10f391e953f"}
Oct 03 14:35:05 crc kubenswrapper[4962]: I1003 14:35:05.953568 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9cb97f545-qzjwt" event={"ID":"48962e75-a3a8-46ef-8731-b3c692bb245e","Type":"ContainerStarted","Data":"d9052a8ca693c11478efee47005036d820709cc1af52a6696c7e0795d2079cc7"}
Oct 03 14:35:06 crc kubenswrapper[4962]: I1003 14:35:06.002214 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-9cb97f545-qzjwt" podStartSLOduration=2.180451837 podStartE2EDuration="23.002193496s" podCreationTimestamp="2025-10-03 14:34:43 +0000 UTC" firstStartedPulling="2025-10-03 14:34:44.150793747 +0000 UTC m=+6292.554691582" lastFinishedPulling="2025-10-03 14:35:04.972535386 +0000 UTC m=+6313.376433241" observedRunningTime="2025-10-03 14:35:05.996536725 +0000 UTC m=+6314.400434560" watchObservedRunningTime="2025-10-03 14:35:06.002193496 +0000 UTC m=+6314.406091331"
Oct 03 14:35:06 crc kubenswrapper[4962]: I1003 14:35:06.006973 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5dcfc8787c-gf2hf" podStartSLOduration=2.746947299 podStartE2EDuration="23.006963103s" podCreationTimestamp="2025-10-03 14:34:43 +0000 UTC" firstStartedPulling="2025-10-03 14:34:44.740803917 +0000 UTC m=+6293.144701752" lastFinishedPulling="2025-10-03 14:35:05.000819711 +0000 UTC m=+6313.404717556" observedRunningTime="2025-10-03 14:35:05.980101986 +0000 UTC m=+6314.383999831" watchObservedRunningTime="2025-10-03 14:35:06.006963103 +0000 UTC m=+6314.410860938"
Oct 03 14:35:08 crc kubenswrapper[4962]: I1003 14:35:08.484091 4962 scope.go:117] "RemoveContainer" containerID="69643303cefdd941ae859ff917a25e89598c7a5baa3be5eb0c60c200b92e8c4d"
Oct 03 14:35:08 crc kubenswrapper[4962]: I1003 14:35:08.507123 4962 scope.go:117] "RemoveContainer" containerID="9e9853734eb445883ad10f8e551909f5dd2d34ba9ed862ce3c6bca2cc1514f1b"
Oct 03 14:35:08 crc kubenswrapper[4962]: I1003 14:35:08.569225 4962 scope.go:117] "RemoveContainer" containerID="2f5b5a163b992f0535467c25207302cf628d40694517c4508c3788dbf65d42d9"
Oct 03 14:35:13 crc kubenswrapper[4962]: I1003 14:35:13.618197 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-9cb97f545-qzjwt"
Oct 03 14:35:13 crc kubenswrapper[4962]: I1003 14:35:13.619718 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-9cb97f545-qzjwt"
Oct 03 14:35:14 crc kubenswrapper[4962]: I1003 14:35:14.194473 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5dcfc8787c-gf2hf"
Oct 03 14:35:14 crc kubenswrapper[4962]: I1003 14:35:14.195008 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5dcfc8787c-gf2hf"
Oct 03 14:35:23 crc kubenswrapper[4962]: I1003 14:35:23.620445 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-9cb97f545-qzjwt" podUID="48962e75-a3a8-46ef-8731-b3c692bb245e" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.108:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.108:8080: connect: connection refused"
Oct 03 14:35:24 crc kubenswrapper[4962]: I1003 14:35:24.063060 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-mlb24"]
Oct 03 14:35:24 crc kubenswrapper[4962]: I1003 14:35:24.071126 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-mlb24"]
Oct 03 14:35:24 crc kubenswrapper[4962]: I1003 14:35:24.198154 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5dcfc8787c-gf2hf" podUID="78dcd167-b38a-4da9-a9c6-63e48eed5832" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.110:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.110:8080: connect: connection refused"
Oct 03 14:35:24 crc kubenswrapper[4962]: I1003 14:35:24.246394 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17e3b8ae-be38-4566-914b-dfc9776e2218" path="/var/lib/kubelet/pods/17e3b8ae-be38-4566-914b-dfc9776e2218/volumes"
Oct 03 14:35:29 crc kubenswrapper[4962]: I1003 14:35:29.582239 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-g258p"]
Oct 03 14:35:29 crc kubenswrapper[4962]: I1003 14:35:29.585262 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g258p"
Oct 03 14:35:29 crc kubenswrapper[4962]: I1003 14:35:29.595017 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g258p"]
Oct 03 14:35:29 crc kubenswrapper[4962]: I1003 14:35:29.749576 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8695b814-fe22-4797-b53e-b0a1a25b9afe-utilities\") pod \"community-operators-g258p\" (UID: \"8695b814-fe22-4797-b53e-b0a1a25b9afe\") " pod="openshift-marketplace/community-operators-g258p"
Oct 03 14:35:29 crc kubenswrapper[4962]: I1003 14:35:29.749677 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8695b814-fe22-4797-b53e-b0a1a25b9afe-catalog-content\") pod \"community-operators-g258p\" (UID: \"8695b814-fe22-4797-b53e-b0a1a25b9afe\") " pod="openshift-marketplace/community-operators-g258p"
Oct 03 14:35:29 crc kubenswrapper[4962]: I1003 14:35:29.749721 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltzd4\" (UniqueName: \"kubernetes.io/projected/8695b814-fe22-4797-b53e-b0a1a25b9afe-kube-api-access-ltzd4\") pod \"community-operators-g258p\" (UID: \"8695b814-fe22-4797-b53e-b0a1a25b9afe\") " pod="openshift-marketplace/community-operators-g258p"
Oct 03 14:35:29 crc kubenswrapper[4962]: I1003 14:35:29.851475 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8695b814-fe22-4797-b53e-b0a1a25b9afe-utilities\") pod \"community-operators-g258p\" (UID: \"8695b814-fe22-4797-b53e-b0a1a25b9afe\") " pod="openshift-marketplace/community-operators-g258p"
Oct 03 14:35:29 crc kubenswrapper[4962]: I1003 14:35:29.851819 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8695b814-fe22-4797-b53e-b0a1a25b9afe-catalog-content\") pod \"community-operators-g258p\" (UID: \"8695b814-fe22-4797-b53e-b0a1a25b9afe\") " pod="openshift-marketplace/community-operators-g258p"
Oct 03 14:35:29 crc kubenswrapper[4962]: I1003 14:35:29.851868 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltzd4\" (UniqueName: \"kubernetes.io/projected/8695b814-fe22-4797-b53e-b0a1a25b9afe-kube-api-access-ltzd4\") pod \"community-operators-g258p\" (UID: \"8695b814-fe22-4797-b53e-b0a1a25b9afe\") " pod="openshift-marketplace/community-operators-g258p"
Oct 03 14:35:29 crc kubenswrapper[4962]: I1003 14:35:29.852070 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8695b814-fe22-4797-b53e-b0a1a25b9afe-utilities\") pod \"community-operators-g258p\" (UID: \"8695b814-fe22-4797-b53e-b0a1a25b9afe\") " pod="openshift-marketplace/community-operators-g258p"
Oct 03 14:35:29 crc kubenswrapper[4962]: I1003 14:35:29.852199 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8695b814-fe22-4797-b53e-b0a1a25b9afe-catalog-content\") pod \"community-operators-g258p\" (UID: \"8695b814-fe22-4797-b53e-b0a1a25b9afe\") " pod="openshift-marketplace/community-operators-g258p"
Oct 03 14:35:29 crc kubenswrapper[4962]: I1003 14:35:29.875386 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltzd4\" (UniqueName: \"kubernetes.io/projected/8695b814-fe22-4797-b53e-b0a1a25b9afe-kube-api-access-ltzd4\") pod \"community-operators-g258p\" (UID: \"8695b814-fe22-4797-b53e-b0a1a25b9afe\") " pod="openshift-marketplace/community-operators-g258p"
Oct 03 14:35:29 crc kubenswrapper[4962]: I1003 14:35:29.913769 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g258p"
Oct 03 14:35:30 crc kubenswrapper[4962]: I1003 14:35:30.548180 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g258p"]
Oct 03 14:35:31 crc kubenswrapper[4962]: I1003 14:35:31.212891 4962 generic.go:334] "Generic (PLEG): container finished" podID="8695b814-fe22-4797-b53e-b0a1a25b9afe" containerID="88fde9703ddb9d813d1646fe6457e46cfc496c5682a907d905f8788639468af6" exitCode=0
Oct 03 14:35:31 crc kubenswrapper[4962]: I1003 14:35:31.212939 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g258p" event={"ID":"8695b814-fe22-4797-b53e-b0a1a25b9afe","Type":"ContainerDied","Data":"88fde9703ddb9d813d1646fe6457e46cfc496c5682a907d905f8788639468af6"}
Oct 03 14:35:31 crc kubenswrapper[4962]: I1003 14:35:31.212965 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g258p" event={"ID":"8695b814-fe22-4797-b53e-b0a1a25b9afe","Type":"ContainerStarted","Data":"b2bc4af6de1c99ed5312dacb48f2ba442a3f553588223962cc8ae215e8036cd9"}
Oct 03 14:35:32 crc kubenswrapper[4962]: I1003 14:35:32.239255 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g258p" event={"ID":"8695b814-fe22-4797-b53e-b0a1a25b9afe","Type":"ContainerStarted","Data":"d68ae78524ca93ef02146f5ff5a2474a0c26de130e779929dc55e1e0dc66a774"}
Oct 03 14:35:33 crc kubenswrapper[4962]: I1003 14:35:33.250876 4962 generic.go:334] "Generic (PLEG): container finished" podID="8695b814-fe22-4797-b53e-b0a1a25b9afe" containerID="d68ae78524ca93ef02146f5ff5a2474a0c26de130e779929dc55e1e0dc66a774" exitCode=0
Oct 03 14:35:33 crc kubenswrapper[4962]: I1003 14:35:33.250937 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g258p" event={"ID":"8695b814-fe22-4797-b53e-b0a1a25b9afe","Type":"ContainerDied","Data":"d68ae78524ca93ef02146f5ff5a2474a0c26de130e779929dc55e1e0dc66a774"}
Oct 03 14:35:34 crc kubenswrapper[4962]: I1003 14:35:34.027821 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-ec53-account-create-q5vph"]
Oct 03 14:35:34 crc kubenswrapper[4962]: I1003 14:35:34.038835 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-ec53-account-create-q5vph"]
Oct 03 14:35:34 crc kubenswrapper[4962]: I1003 14:35:34.250936 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ec444c0-1410-4946-908d-60384d2b766f" path="/var/lib/kubelet/pods/8ec444c0-1410-4946-908d-60384d2b766f/volumes"
Oct 03 14:35:34 crc kubenswrapper[4962]: I1003 14:35:34.261055 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g258p" event={"ID":"8695b814-fe22-4797-b53e-b0a1a25b9afe","Type":"ContainerStarted","Data":"901a7918c0d4066168b6066bfbd55f3a22fe0bbda8469b62e05cd246da85d282"}
Oct 03 14:35:35 crc kubenswrapper[4962]: I1003 14:35:35.401023 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-9cb97f545-qzjwt"
Oct 03 14:35:35 crc kubenswrapper[4962]: I1003 14:35:35.429896 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-g258p" podStartSLOduration=4.00410279 podStartE2EDuration="6.429874905s" podCreationTimestamp="2025-10-03 14:35:29 +0000 UTC" firstStartedPulling="2025-10-03 14:35:31.215294309 +0000 UTC m=+6339.619192134" lastFinishedPulling="2025-10-03 14:35:33.641066414 +0000 UTC m=+6342.044964249" observedRunningTime="2025-10-03 14:35:34.281847338 +0000 UTC m=+6342.685745173" watchObservedRunningTime="2025-10-03 14:35:35.429874905 +0000 UTC m=+6343.833772740"
Oct 03 14:35:36 crc kubenswrapper[4962]: I1003 14:35:36.025259 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5dcfc8787c-gf2hf"
Oct 03 14:35:37 crc kubenswrapper[4962]: I1003 14:35:37.323390 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-9cb97f545-qzjwt"
Oct 03 14:35:37 crc kubenswrapper[4962]: I1003 14:35:37.790797 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5dcfc8787c-gf2hf"
Oct 03 14:35:37 crc kubenswrapper[4962]: I1003 14:35:37.867516 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-9cb97f545-qzjwt"]
Oct 03 14:35:38 crc kubenswrapper[4962]: I1003 14:35:38.307509 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-9cb97f545-qzjwt" podUID="48962e75-a3a8-46ef-8731-b3c692bb245e" containerName="horizon-log" containerID="cri-o://d9052a8ca693c11478efee47005036d820709cc1af52a6696c7e0795d2079cc7" gracePeriod=30
Oct 03 14:35:38 crc kubenswrapper[4962]: I1003 14:35:38.307586 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-9cb97f545-qzjwt" podUID="48962e75-a3a8-46ef-8731-b3c692bb245e" containerName="horizon" containerID="cri-o://b3691b848384dcc992ed93a916c86b6f4c0f0b9f9b2a4d59ceb5a10f391e953f" gracePeriod=30
Oct 03 14:35:39 crc kubenswrapper[4962]: I1003 14:35:39.914317 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-g258p"
Oct 03 14:35:39 crc kubenswrapper[4962]: I1003 14:35:39.914768 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-g258p"
Oct 03 14:35:39 crc kubenswrapper[4962]: I1003 14:35:39.967191 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-g258p"
Oct 03 14:35:40 crc kubenswrapper[4962]: I1003 14:35:40.368207 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-g258p"
Oct 03 14:35:40 crc kubenswrapper[4962]: I1003 14:35:40.417387 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g258p"]
Oct 03 14:35:41 crc kubenswrapper[4962]: I1003 14:35:41.063498 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-2fjvl"]
Oct 03 14:35:41 crc kubenswrapper[4962]: I1003 14:35:41.077223 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-2fjvl"]
Oct 03 14:35:42 crc kubenswrapper[4962]: I1003 14:35:42.244971 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2d090d8-3ccd-4f67-91f5-8737048283dc" path="/var/lib/kubelet/pods/b2d090d8-3ccd-4f67-91f5-8737048283dc/volumes"
Oct 03 14:35:42 crc kubenswrapper[4962]: I1003 14:35:42.341756 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9cb97f545-qzjwt" event={"ID":"48962e75-a3a8-46ef-8731-b3c692bb245e","Type":"ContainerDied","Data":"b3691b848384dcc992ed93a916c86b6f4c0f0b9f9b2a4d59ceb5a10f391e953f"}
Oct 03 14:35:42 crc kubenswrapper[4962]: I1003 14:35:42.341765 4962 generic.go:334] "Generic (PLEG): container finished" podID="48962e75-a3a8-46ef-8731-b3c692bb245e" containerID="b3691b848384dcc992ed93a916c86b6f4c0f0b9f9b2a4d59ceb5a10f391e953f" exitCode=0
Oct 03 14:35:42 crc kubenswrapper[4962]: I1003 14:35:42.341977 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-g258p" podUID="8695b814-fe22-4797-b53e-b0a1a25b9afe" containerName="registry-server" containerID="cri-o://901a7918c0d4066168b6066bfbd55f3a22fe0bbda8469b62e05cd246da85d282" gracePeriod=2
Oct 03 14:35:43 crc kubenswrapper[4962]: I1003 14:35:42.836210 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g258p"
Oct 03 14:35:43 crc kubenswrapper[4962]: I1003 14:35:42.926764 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8695b814-fe22-4797-b53e-b0a1a25b9afe-catalog-content\") pod \"8695b814-fe22-4797-b53e-b0a1a25b9afe\" (UID: \"8695b814-fe22-4797-b53e-b0a1a25b9afe\") "
Oct 03 14:35:43 crc kubenswrapper[4962]: I1003 14:35:42.927236 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8695b814-fe22-4797-b53e-b0a1a25b9afe-utilities\") pod \"8695b814-fe22-4797-b53e-b0a1a25b9afe\" (UID: \"8695b814-fe22-4797-b53e-b0a1a25b9afe\") "
Oct 03 14:35:43 crc kubenswrapper[4962]: I1003 14:35:42.927300 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltzd4\" (UniqueName: \"kubernetes.io/projected/8695b814-fe22-4797-b53e-b0a1a25b9afe-kube-api-access-ltzd4\") pod \"8695b814-fe22-4797-b53e-b0a1a25b9afe\" (UID: \"8695b814-fe22-4797-b53e-b0a1a25b9afe\") "
Oct 03 14:35:43 crc kubenswrapper[4962]: I1003 14:35:42.928071 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8695b814-fe22-4797-b53e-b0a1a25b9afe-utilities" (OuterVolumeSpecName: "utilities") pod "8695b814-fe22-4797-b53e-b0a1a25b9afe" (UID: "8695b814-fe22-4797-b53e-b0a1a25b9afe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:35:43 crc kubenswrapper[4962]: I1003 14:35:42.934231 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8695b814-fe22-4797-b53e-b0a1a25b9afe-kube-api-access-ltzd4" (OuterVolumeSpecName: "kube-api-access-ltzd4") pod "8695b814-fe22-4797-b53e-b0a1a25b9afe" (UID: "8695b814-fe22-4797-b53e-b0a1a25b9afe"). InnerVolumeSpecName "kube-api-access-ltzd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:35:43 crc kubenswrapper[4962]: I1003 14:35:42.970406 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8695b814-fe22-4797-b53e-b0a1a25b9afe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8695b814-fe22-4797-b53e-b0a1a25b9afe" (UID: "8695b814-fe22-4797-b53e-b0a1a25b9afe"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:35:43 crc kubenswrapper[4962]: I1003 14:35:43.030527 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8695b814-fe22-4797-b53e-b0a1a25b9afe-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 14:35:43 crc kubenswrapper[4962]: I1003 14:35:43.030559 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8695b814-fe22-4797-b53e-b0a1a25b9afe-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 14:35:43 crc kubenswrapper[4962]: I1003 14:35:43.030569 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltzd4\" (UniqueName: \"kubernetes.io/projected/8695b814-fe22-4797-b53e-b0a1a25b9afe-kube-api-access-ltzd4\") on node \"crc\" DevicePath \"\"" Oct 03 14:35:43 crc kubenswrapper[4962]: I1003 14:35:43.352084 4962 generic.go:334] "Generic (PLEG): container finished" podID="8695b814-fe22-4797-b53e-b0a1a25b9afe" containerID="901a7918c0d4066168b6066bfbd55f3a22fe0bbda8469b62e05cd246da85d282" exitCode=0 Oct 03 14:35:43 crc kubenswrapper[4962]: I1003 14:35:43.352123 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g258p" Oct 03 14:35:43 crc kubenswrapper[4962]: I1003 14:35:43.352134 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g258p" event={"ID":"8695b814-fe22-4797-b53e-b0a1a25b9afe","Type":"ContainerDied","Data":"901a7918c0d4066168b6066bfbd55f3a22fe0bbda8469b62e05cd246da85d282"} Oct 03 14:35:43 crc kubenswrapper[4962]: I1003 14:35:43.352168 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g258p" event={"ID":"8695b814-fe22-4797-b53e-b0a1a25b9afe","Type":"ContainerDied","Data":"b2bc4af6de1c99ed5312dacb48f2ba442a3f553588223962cc8ae215e8036cd9"} Oct 03 14:35:43 crc kubenswrapper[4962]: I1003 14:35:43.352188 4962 scope.go:117] "RemoveContainer" containerID="901a7918c0d4066168b6066bfbd55f3a22fe0bbda8469b62e05cd246da85d282" Oct 03 14:35:43 crc kubenswrapper[4962]: I1003 14:35:43.370126 4962 scope.go:117] "RemoveContainer" containerID="d68ae78524ca93ef02146f5ff5a2474a0c26de130e779929dc55e1e0dc66a774" Oct 03 14:35:43 crc kubenswrapper[4962]: I1003 14:35:43.389391 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g258p"] Oct 03 14:35:43 crc kubenswrapper[4962]: I1003 14:35:43.397033 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-g258p"] Oct 03 14:35:43 crc kubenswrapper[4962]: I1003 14:35:43.414179 4962 scope.go:117] "RemoveContainer" containerID="88fde9703ddb9d813d1646fe6457e46cfc496c5682a907d905f8788639468af6" Oct 03 14:35:43 crc kubenswrapper[4962]: I1003 14:35:43.439528 4962 scope.go:117] "RemoveContainer" containerID="901a7918c0d4066168b6066bfbd55f3a22fe0bbda8469b62e05cd246da85d282" Oct 03 14:35:43 crc kubenswrapper[4962]: E1003 14:35:43.440005 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"901a7918c0d4066168b6066bfbd55f3a22fe0bbda8469b62e05cd246da85d282\": container with ID starting with 901a7918c0d4066168b6066bfbd55f3a22fe0bbda8469b62e05cd246da85d282 not found: ID does not exist" containerID="901a7918c0d4066168b6066bfbd55f3a22fe0bbda8469b62e05cd246da85d282" Oct 03 14:35:43 crc kubenswrapper[4962]: I1003 14:35:43.440050 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"901a7918c0d4066168b6066bfbd55f3a22fe0bbda8469b62e05cd246da85d282"} err="failed to get container status \"901a7918c0d4066168b6066bfbd55f3a22fe0bbda8469b62e05cd246da85d282\": rpc error: code = NotFound desc = could not find container \"901a7918c0d4066168b6066bfbd55f3a22fe0bbda8469b62e05cd246da85d282\": container with ID starting with 901a7918c0d4066168b6066bfbd55f3a22fe0bbda8469b62e05cd246da85d282 not found: ID does not exist" Oct 03 14:35:43 crc kubenswrapper[4962]: I1003 14:35:43.440088 4962 scope.go:117] "RemoveContainer" containerID="d68ae78524ca93ef02146f5ff5a2474a0c26de130e779929dc55e1e0dc66a774" Oct 03 14:35:43 crc kubenswrapper[4962]: E1003 14:35:43.440628 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d68ae78524ca93ef02146f5ff5a2474a0c26de130e779929dc55e1e0dc66a774\": container with ID starting with d68ae78524ca93ef02146f5ff5a2474a0c26de130e779929dc55e1e0dc66a774 not found: ID does not exist" containerID="d68ae78524ca93ef02146f5ff5a2474a0c26de130e779929dc55e1e0dc66a774" Oct 03 14:35:43 crc kubenswrapper[4962]: I1003 14:35:43.440731 4962 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d68ae78524ca93ef02146f5ff5a2474a0c26de130e779929dc55e1e0dc66a774"} err="failed to get container status \"d68ae78524ca93ef02146f5ff5a2474a0c26de130e779929dc55e1e0dc66a774\": rpc error: code = NotFound desc = could not find container \"d68ae78524ca93ef02146f5ff5a2474a0c26de130e779929dc55e1e0dc66a774\": container with ID starting with d68ae78524ca93ef02146f5ff5a2474a0c26de130e779929dc55e1e0dc66a774 not found: ID does not exist" Oct 03 14:35:43 crc kubenswrapper[4962]: I1003 14:35:43.440810 4962 scope.go:117] "RemoveContainer" containerID="88fde9703ddb9d813d1646fe6457e46cfc496c5682a907d905f8788639468af6" Oct 03 14:35:43 crc kubenswrapper[4962]: E1003 14:35:43.441410 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88fde9703ddb9d813d1646fe6457e46cfc496c5682a907d905f8788639468af6\": container with ID starting with 88fde9703ddb9d813d1646fe6457e46cfc496c5682a907d905f8788639468af6 not found: ID does not exist" containerID="88fde9703ddb9d813d1646fe6457e46cfc496c5682a907d905f8788639468af6" Oct 03 14:35:43 crc kubenswrapper[4962]: I1003 14:35:43.441497 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88fde9703ddb9d813d1646fe6457e46cfc496c5682a907d905f8788639468af6"} err="failed to get container status \"88fde9703ddb9d813d1646fe6457e46cfc496c5682a907d905f8788639468af6\": rpc error: code = NotFound desc = could not find container \"88fde9703ddb9d813d1646fe6457e46cfc496c5682a907d905f8788639468af6\": container with ID starting with 88fde9703ddb9d813d1646fe6457e46cfc496c5682a907d905f8788639468af6 not found: ID does not exist" Oct 03 14:35:43 crc kubenswrapper[4962]: I1003 14:35:43.619599 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-9cb97f545-qzjwt" podUID="48962e75-a3a8-46ef-8731-b3c692bb245e" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.108:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.108:8080: connect: connection refused" Oct 03 14:35:44 crc kubenswrapper[4962]: I1003 14:35:44.244164 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8695b814-fe22-4797-b53e-b0a1a25b9afe" path="/var/lib/kubelet/pods/8695b814-fe22-4797-b53e-b0a1a25b9afe/volumes" Oct 03 14:35:46 crc kubenswrapper[4962]: I1003 14:35:46.543432 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-675fb576f7-5q7qf"] Oct 03 14:35:46 crc kubenswrapper[4962]: E1003 14:35:46.545314 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8695b814-fe22-4797-b53e-b0a1a25b9afe" containerName="extract-content" Oct 03 14:35:46 crc kubenswrapper[4962]: I1003 14:35:46.545332 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8695b814-fe22-4797-b53e-b0a1a25b9afe" containerName="extract-content" Oct 03 14:35:46 crc kubenswrapper[4962]: E1003 14:35:46.545351 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8695b814-fe22-4797-b53e-b0a1a25b9afe" containerName="extract-utilities" Oct 03 14:35:46 crc kubenswrapper[4962]: I1003 14:35:46.545358 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8695b814-fe22-4797-b53e-b0a1a25b9afe" containerName="extract-utilities" Oct 03 14:35:46 crc kubenswrapper[4962]: E1003 14:35:46.545371 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8695b814-fe22-4797-b53e-b0a1a25b9afe" 
containerName="registry-server" Oct 03 14:35:46 crc kubenswrapper[4962]: I1003 14:35:46.545377 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8695b814-fe22-4797-b53e-b0a1a25b9afe" containerName="registry-server" Oct 03 14:35:46 crc kubenswrapper[4962]: I1003 14:35:46.545599 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="8695b814-fe22-4797-b53e-b0a1a25b9afe" containerName="registry-server" Oct 03 14:35:46 crc kubenswrapper[4962]: I1003 14:35:46.546768 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-675fb576f7-5q7qf" Oct 03 14:35:46 crc kubenswrapper[4962]: I1003 14:35:46.564110 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-675fb576f7-5q7qf"] Oct 03 14:35:46 crc kubenswrapper[4962]: I1003 14:35:46.595597 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9eeb086e-a213-41d6-af7c-592d92f6feec-config-data\") pod \"horizon-675fb576f7-5q7qf\" (UID: \"9eeb086e-a213-41d6-af7c-592d92f6feec\") " pod="openstack/horizon-675fb576f7-5q7qf" Oct 03 14:35:46 crc kubenswrapper[4962]: I1003 14:35:46.595682 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clj4m\" (UniqueName: \"kubernetes.io/projected/9eeb086e-a213-41d6-af7c-592d92f6feec-kube-api-access-clj4m\") pod \"horizon-675fb576f7-5q7qf\" (UID: \"9eeb086e-a213-41d6-af7c-592d92f6feec\") " pod="openstack/horizon-675fb576f7-5q7qf" Oct 03 14:35:46 crc kubenswrapper[4962]: I1003 14:35:46.595716 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9eeb086e-a213-41d6-af7c-592d92f6feec-horizon-secret-key\") pod \"horizon-675fb576f7-5q7qf\" (UID: \"9eeb086e-a213-41d6-af7c-592d92f6feec\") " pod="openstack/horizon-675fb576f7-5q7qf" Oct 03 14:35:46 crc kubenswrapper[4962]: I1003 14:35:46.595754 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9eeb086e-a213-41d6-af7c-592d92f6feec-scripts\") pod \"horizon-675fb576f7-5q7qf\" (UID: \"9eeb086e-a213-41d6-af7c-592d92f6feec\") " pod="openstack/horizon-675fb576f7-5q7qf" Oct 03 14:35:46 crc kubenswrapper[4962]: I1003 14:35:46.595800 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9eeb086e-a213-41d6-af7c-592d92f6feec-logs\") pod \"horizon-675fb576f7-5q7qf\" (UID: \"9eeb086e-a213-41d6-af7c-592d92f6feec\") " pod="openstack/horizon-675fb576f7-5q7qf" Oct 03 14:35:46 crc kubenswrapper[4962]: I1003 14:35:46.697446 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9eeb086e-a213-41d6-af7c-592d92f6feec-config-data\") pod \"horizon-675fb576f7-5q7qf\" (UID: \"9eeb086e-a213-41d6-af7c-592d92f6feec\") " pod="openstack/horizon-675fb576f7-5q7qf" Oct 03 14:35:46 crc kubenswrapper[4962]: I1003 14:35:46.697500 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clj4m\" (UniqueName: \"kubernetes.io/projected/9eeb086e-a213-41d6-af7c-592d92f6feec-kube-api-access-clj4m\") pod \"horizon-675fb576f7-5q7qf\" (UID: \"9eeb086e-a213-41d6-af7c-592d92f6feec\") " pod="openstack/horizon-675fb576f7-5q7qf" Oct 03 14:35:46 crc 
Oct 03 14:35:46 crc kubenswrapper[4962]: I1003 14:35:46.697531 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9eeb086e-a213-41d6-af7c-592d92f6feec-horizon-secret-key\") pod \"horizon-675fb576f7-5q7qf\" (UID: \"9eeb086e-a213-41d6-af7c-592d92f6feec\") " pod="openstack/horizon-675fb576f7-5q7qf"
Oct 03 14:35:46 crc kubenswrapper[4962]: I1003 14:35:46.697592 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9eeb086e-a213-41d6-af7c-592d92f6feec-scripts\") pod \"horizon-675fb576f7-5q7qf\" (UID: \"9eeb086e-a213-41d6-af7c-592d92f6feec\") " pod="openstack/horizon-675fb576f7-5q7qf"
Oct 03 14:35:46 crc kubenswrapper[4962]: I1003 14:35:46.698478 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9eeb086e-a213-41d6-af7c-592d92f6feec-scripts\") pod \"horizon-675fb576f7-5q7qf\" (UID: \"9eeb086e-a213-41d6-af7c-592d92f6feec\") " pod="openstack/horizon-675fb576f7-5q7qf"
Oct 03 14:35:46 crc kubenswrapper[4962]: I1003 14:35:46.698797 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9eeb086e-a213-41d6-af7c-592d92f6feec-config-data\") pod \"horizon-675fb576f7-5q7qf\" (UID: \"9eeb086e-a213-41d6-af7c-592d92f6feec\") " pod="openstack/horizon-675fb576f7-5q7qf"
Oct 03 14:35:46 crc kubenswrapper[4962]: I1003 14:35:46.699304 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9eeb086e-a213-41d6-af7c-592d92f6feec-logs\") pod \"horizon-675fb576f7-5q7qf\" (UID: \"9eeb086e-a213-41d6-af7c-592d92f6feec\") " pod="openstack/horizon-675fb576f7-5q7qf"
Oct 03 14:35:46 crc kubenswrapper[4962]: I1003 14:35:46.699578 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9eeb086e-a213-41d6-af7c-592d92f6feec-logs\") pod \"horizon-675fb576f7-5q7qf\" (UID: \"9eeb086e-a213-41d6-af7c-592d92f6feec\") " pod="openstack/horizon-675fb576f7-5q7qf"
Oct 03 14:35:46 crc kubenswrapper[4962]: I1003 14:35:46.705161 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9eeb086e-a213-41d6-af7c-592d92f6feec-horizon-secret-key\") pod \"horizon-675fb576f7-5q7qf\" (UID: \"9eeb086e-a213-41d6-af7c-592d92f6feec\") " pod="openstack/horizon-675fb576f7-5q7qf"
Oct 03 14:35:46 crc kubenswrapper[4962]: I1003 14:35:46.713901 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clj4m\" (UniqueName: \"kubernetes.io/projected/9eeb086e-a213-41d6-af7c-592d92f6feec-kube-api-access-clj4m\") pod \"horizon-675fb576f7-5q7qf\" (UID: \"9eeb086e-a213-41d6-af7c-592d92f6feec\") " pod="openstack/horizon-675fb576f7-5q7qf"
Oct 03 14:35:46 crc kubenswrapper[4962]: I1003 14:35:46.884851 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-675fb576f7-5q7qf"
Oct 03 14:35:47 crc kubenswrapper[4962]: I1003 14:35:47.363893 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-675fb576f7-5q7qf"]
Oct 03 14:35:47 crc kubenswrapper[4962]: I1003 14:35:47.389290 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-675fb576f7-5q7qf" event={"ID":"9eeb086e-a213-41d6-af7c-592d92f6feec","Type":"ContainerStarted","Data":"8ecd5063049146f39ff335a0d8e04b55eb4ea89db2c34e8eea7b633a79352ef4"}
Oct 03 14:35:48 crc kubenswrapper[4962]: I1003 14:35:48.216109 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-gw495"]
Oct 03 14:35:48 crc kubenswrapper[4962]: I1003 14:35:48.218652 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-gw495"
Oct 03 14:35:48 crc kubenswrapper[4962]: I1003 14:35:48.243677 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-gw495"]
Oct 03 14:35:48 crc kubenswrapper[4962]: I1003 14:35:48.246784 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7brqc\" (UniqueName: \"kubernetes.io/projected/cd6b564a-9316-4aea-a8bc-d764e23fc7f8-kube-api-access-7brqc\") pod \"heat-db-create-gw495\" (UID: \"cd6b564a-9316-4aea-a8bc-d764e23fc7f8\") " pod="openstack/heat-db-create-gw495"
Oct 03 14:35:48 crc kubenswrapper[4962]: I1003 14:35:48.348540 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7brqc\" (UniqueName: \"kubernetes.io/projected/cd6b564a-9316-4aea-a8bc-d764e23fc7f8-kube-api-access-7brqc\") pod \"heat-db-create-gw495\" (UID: \"cd6b564a-9316-4aea-a8bc-d764e23fc7f8\") " pod="openstack/heat-db-create-gw495"
Oct 03 14:35:48 crc kubenswrapper[4962]: I1003 14:35:48.368321 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7brqc\" (UniqueName: \"kubernetes.io/projected/cd6b564a-9316-4aea-a8bc-d764e23fc7f8-kube-api-access-7brqc\") pod \"heat-db-create-gw495\" (UID: \"cd6b564a-9316-4aea-a8bc-d764e23fc7f8\") " pod="openstack/heat-db-create-gw495"
Oct 03 14:35:48 crc kubenswrapper[4962]: I1003 14:35:48.405879 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-675fb576f7-5q7qf" event={"ID":"9eeb086e-a213-41d6-af7c-592d92f6feec","Type":"ContainerStarted","Data":"ad9981f273c443c8c0b946d137096ca34804c2a61735037dd00d504452530590"}
Oct 03 14:35:48 crc kubenswrapper[4962]: I1003 14:35:48.405915 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-675fb576f7-5q7qf" event={"ID":"9eeb086e-a213-41d6-af7c-592d92f6feec","Type":"ContainerStarted","Data":"45e551c60987648996e2b4bfa2d67b181dec106f52da9a9e6b2f40fff2f37b38"}
Oct 03 14:35:48 crc kubenswrapper[4962]: I1003 14:35:48.428138 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-675fb576f7-5q7qf" podStartSLOduration=2.42811382 podStartE2EDuration="2.42811382s" podCreationTimestamp="2025-10-03 14:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:35:48.425883761 +0000 UTC m=+6356.829781596" watchObservedRunningTime="2025-10-03 14:35:48.42811382 +0000 UTC m=+6356.832011655"
Oct 03 14:35:48 crc kubenswrapper[4962]: I1003 14:35:48.548960 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-gw495"
Oct 03 14:35:49 crc kubenswrapper[4962]: I1003 14:35:49.057315 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-gw495"]
Oct 03 14:35:49 crc kubenswrapper[4962]: W1003 14:35:49.066913 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd6b564a_9316_4aea_a8bc_d764e23fc7f8.slice/crio-7e6eb2a98e46be4e8067ac215d31ec300bf5fa2b9781c96504763dc4f2699d67 WatchSource:0}: Error finding container 7e6eb2a98e46be4e8067ac215d31ec300bf5fa2b9781c96504763dc4f2699d67: Status 404 returned error can't find the container with id 7e6eb2a98e46be4e8067ac215d31ec300bf5fa2b9781c96504763dc4f2699d67
Oct 03 14:35:49 crc kubenswrapper[4962]: I1003 14:35:49.415949 4962 generic.go:334] "Generic (PLEG): container finished" podID="cd6b564a-9316-4aea-a8bc-d764e23fc7f8" containerID="20d504b0b4d59e6a77e8321bd3bb127b50930cf7551b1b1405b734d04f80bc45" exitCode=0
Oct 03 14:35:49 crc kubenswrapper[4962]: I1003 14:35:49.416032 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-gw495" event={"ID":"cd6b564a-9316-4aea-a8bc-d764e23fc7f8","Type":"ContainerDied","Data":"20d504b0b4d59e6a77e8321bd3bb127b50930cf7551b1b1405b734d04f80bc45"}
Oct 03 14:35:49 crc kubenswrapper[4962]: I1003 14:35:49.417154 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-gw495" event={"ID":"cd6b564a-9316-4aea-a8bc-d764e23fc7f8","Type":"ContainerStarted","Data":"7e6eb2a98e46be4e8067ac215d31ec300bf5fa2b9781c96504763dc4f2699d67"}
Oct 03 14:35:50 crc kubenswrapper[4962]: I1003 14:35:50.790492 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-gw495"
Oct 03 14:35:50 crc kubenswrapper[4962]: I1003 14:35:50.901386 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7brqc\" (UniqueName: \"kubernetes.io/projected/cd6b564a-9316-4aea-a8bc-d764e23fc7f8-kube-api-access-7brqc\") pod \"cd6b564a-9316-4aea-a8bc-d764e23fc7f8\" (UID: \"cd6b564a-9316-4aea-a8bc-d764e23fc7f8\") "
Oct 03 14:35:50 crc kubenswrapper[4962]: I1003 14:35:50.907942 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd6b564a-9316-4aea-a8bc-d764e23fc7f8-kube-api-access-7brqc" (OuterVolumeSpecName: "kube-api-access-7brqc") pod "cd6b564a-9316-4aea-a8bc-d764e23fc7f8" (UID: "cd6b564a-9316-4aea-a8bc-d764e23fc7f8"). InnerVolumeSpecName "kube-api-access-7brqc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:35:51 crc kubenswrapper[4962]: I1003 14:35:51.004357 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7brqc\" (UniqueName: \"kubernetes.io/projected/cd6b564a-9316-4aea-a8bc-d764e23fc7f8-kube-api-access-7brqc\") on node \"crc\" DevicePath \"\""
Oct 03 14:35:51 crc kubenswrapper[4962]: I1003 14:35:51.446533 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-gw495" event={"ID":"cd6b564a-9316-4aea-a8bc-d764e23fc7f8","Type":"ContainerDied","Data":"7e6eb2a98e46be4e8067ac215d31ec300bf5fa2b9781c96504763dc4f2699d67"}
Oct 03 14:35:51 crc kubenswrapper[4962]: I1003 14:35:51.446583 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e6eb2a98e46be4e8067ac215d31ec300bf5fa2b9781c96504763dc4f2699d67"
Oct 03 14:35:51 crc kubenswrapper[4962]: I1003 14:35:51.446664 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-gw495"
Oct 03 14:35:53 crc kubenswrapper[4962]: I1003 14:35:53.619421 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-9cb97f545-qzjwt" podUID="48962e75-a3a8-46ef-8731-b3c692bb245e" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.108:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.108:8080: connect: connection refused"
Oct 03 14:35:56 crc kubenswrapper[4962]: I1003 14:35:56.885791 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-675fb576f7-5q7qf"
Oct 03 14:35:56 crc kubenswrapper[4962]: I1003 14:35:56.886348 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-675fb576f7-5q7qf"
Oct 03 14:35:58 crc kubenswrapper[4962]: I1003 14:35:58.304846 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-dccc-account-create-7gz7x"]
Oct 03 14:35:58 crc kubenswrapper[4962]: E1003 14:35:58.306466 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd6b564a-9316-4aea-a8bc-d764e23fc7f8" containerName="mariadb-database-create"
Oct 03 14:35:58 crc kubenswrapper[4962]: I1003 14:35:58.306566 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd6b564a-9316-4aea-a8bc-d764e23fc7f8" containerName="mariadb-database-create"
Oct 03 14:35:58 crc kubenswrapper[4962]: I1003 14:35:58.306915 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd6b564a-9316-4aea-a8bc-d764e23fc7f8" containerName="mariadb-database-create"
Oct 03 14:35:58 crc kubenswrapper[4962]: I1003 14:35:58.307885 4962 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/heat-dccc-account-create-7gz7x" Oct 03 14:35:58 crc kubenswrapper[4962]: I1003 14:35:58.316176 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Oct 03 14:35:58 crc kubenswrapper[4962]: I1003 14:35:58.323238 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-dccc-account-create-7gz7x"] Oct 03 14:35:58 crc kubenswrapper[4962]: I1003 14:35:58.379375 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cppk\" (UniqueName: \"kubernetes.io/projected/a9adc7f7-76c2-4c5e-aca6-f8ace634f403-kube-api-access-2cppk\") pod \"heat-dccc-account-create-7gz7x\" (UID: \"a9adc7f7-76c2-4c5e-aca6-f8ace634f403\") " pod="openstack/heat-dccc-account-create-7gz7x" Oct 03 14:35:58 crc kubenswrapper[4962]: I1003 14:35:58.481514 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cppk\" (UniqueName: \"kubernetes.io/projected/a9adc7f7-76c2-4c5e-aca6-f8ace634f403-kube-api-access-2cppk\") pod \"heat-dccc-account-create-7gz7x\" (UID: \"a9adc7f7-76c2-4c5e-aca6-f8ace634f403\") " pod="openstack/heat-dccc-account-create-7gz7x" Oct 03 14:35:58 crc kubenswrapper[4962]: I1003 14:35:58.516053 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cppk\" (UniqueName: \"kubernetes.io/projected/a9adc7f7-76c2-4c5e-aca6-f8ace634f403-kube-api-access-2cppk\") pod \"heat-dccc-account-create-7gz7x\" (UID: \"a9adc7f7-76c2-4c5e-aca6-f8ace634f403\") " pod="openstack/heat-dccc-account-create-7gz7x" Oct 03 14:35:58 crc kubenswrapper[4962]: I1003 14:35:58.630117 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-dccc-account-create-7gz7x" Oct 03 14:35:59 crc kubenswrapper[4962]: I1003 14:35:59.091754 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-dccc-account-create-7gz7x"] Oct 03 14:35:59 crc kubenswrapper[4962]: W1003 14:35:59.097146 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9adc7f7_76c2_4c5e_aca6_f8ace634f403.slice/crio-61933c1fe3b0c419bbb6fe5e5e6ee9c1faf747b111ea2b23464ff03bbbe542fa WatchSource:0}: Error finding container 61933c1fe3b0c419bbb6fe5e5e6ee9c1faf747b111ea2b23464ff03bbbe542fa: Status 404 returned error can't find the container with id 61933c1fe3b0c419bbb6fe5e5e6ee9c1faf747b111ea2b23464ff03bbbe542fa Oct 03 14:35:59 crc kubenswrapper[4962]: I1003 14:35:59.513964 4962 generic.go:334] "Generic (PLEG): container finished" podID="a9adc7f7-76c2-4c5e-aca6-f8ace634f403" containerID="3c28c0c458724d1145069d0adbd5091c578364b762c0523c131500c2920fa0e3" exitCode=0 Oct 03 14:35:59 crc kubenswrapper[4962]: I1003 14:35:59.514011 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-dccc-account-create-7gz7x" event={"ID":"a9adc7f7-76c2-4c5e-aca6-f8ace634f403","Type":"ContainerDied","Data":"3c28c0c458724d1145069d0adbd5091c578364b762c0523c131500c2920fa0e3"} Oct 03 14:35:59 crc kubenswrapper[4962]: I1003 14:35:59.514297 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-dccc-account-create-7gz7x" event={"ID":"a9adc7f7-76c2-4c5e-aca6-f8ace634f403","Type":"ContainerStarted","Data":"61933c1fe3b0c419bbb6fe5e5e6ee9c1faf747b111ea2b23464ff03bbbe542fa"} Oct 03 14:36:00 crc kubenswrapper[4962]: I1003 14:36:00.881443 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-dccc-account-create-7gz7x" Oct 03 14:36:00 crc kubenswrapper[4962]: I1003 14:36:00.927502 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cppk\" (UniqueName: \"kubernetes.io/projected/a9adc7f7-76c2-4c5e-aca6-f8ace634f403-kube-api-access-2cppk\") pod \"a9adc7f7-76c2-4c5e-aca6-f8ace634f403\" (UID: \"a9adc7f7-76c2-4c5e-aca6-f8ace634f403\") " Oct 03 14:36:00 crc kubenswrapper[4962]: I1003 14:36:00.933307 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9adc7f7-76c2-4c5e-aca6-f8ace634f403-kube-api-access-2cppk" (OuterVolumeSpecName: "kube-api-access-2cppk") pod "a9adc7f7-76c2-4c5e-aca6-f8ace634f403" (UID: "a9adc7f7-76c2-4c5e-aca6-f8ace634f403"). InnerVolumeSpecName "kube-api-access-2cppk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:36:01 crc kubenswrapper[4962]: I1003 14:36:01.030571 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cppk\" (UniqueName: \"kubernetes.io/projected/a9adc7f7-76c2-4c5e-aca6-f8ace634f403-kube-api-access-2cppk\") on node \"crc\" DevicePath \"\"" Oct 03 14:36:01 crc kubenswrapper[4962]: I1003 14:36:01.535129 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-dccc-account-create-7gz7x" event={"ID":"a9adc7f7-76c2-4c5e-aca6-f8ace634f403","Type":"ContainerDied","Data":"61933c1fe3b0c419bbb6fe5e5e6ee9c1faf747b111ea2b23464ff03bbbe542fa"} Oct 03 14:36:01 crc kubenswrapper[4962]: I1003 14:36:01.535968 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61933c1fe3b0c419bbb6fe5e5e6ee9c1faf747b111ea2b23464ff03bbbe542fa" Oct 03 14:36:01 crc kubenswrapper[4962]: I1003 14:36:01.536132 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-dccc-account-create-7gz7x" Oct 03 14:36:03 crc kubenswrapper[4962]: I1003 14:36:03.351251 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-mlxbc"] Oct 03 14:36:03 crc kubenswrapper[4962]: E1003 14:36:03.352015 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9adc7f7-76c2-4c5e-aca6-f8ace634f403" containerName="mariadb-account-create" Oct 03 14:36:03 crc kubenswrapper[4962]: I1003 14:36:03.352036 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9adc7f7-76c2-4c5e-aca6-f8ace634f403" containerName="mariadb-account-create" Oct 03 14:36:03 crc kubenswrapper[4962]: I1003 14:36:03.352238 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9adc7f7-76c2-4c5e-aca6-f8ace634f403" containerName="mariadb-account-create" Oct 03 14:36:03 crc kubenswrapper[4962]: I1003 14:36:03.352949 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-mlxbc" Oct 03 14:36:03 crc kubenswrapper[4962]: I1003 14:36:03.356228 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-n8pqf" Oct 03 14:36:03 crc kubenswrapper[4962]: I1003 14:36:03.356485 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Oct 03 14:36:03 crc kubenswrapper[4962]: I1003 14:36:03.361734 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-mlxbc"] Oct 03 14:36:03 crc kubenswrapper[4962]: I1003 14:36:03.379597 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbbf01e6-b3e6-4e62-aca9-7f905967ed80-config-data\") pod \"heat-db-sync-mlxbc\" (UID: \"fbbf01e6-b3e6-4e62-aca9-7f905967ed80\") " pod="openstack/heat-db-sync-mlxbc" Oct 03 14:36:03 crc kubenswrapper[4962]: I1003 14:36:03.379760 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbbf01e6-b3e6-4e62-aca9-7f905967ed80-combined-ca-bundle\") pod \"heat-db-sync-mlxbc\" (UID: \"fbbf01e6-b3e6-4e62-aca9-7f905967ed80\") " pod="openstack/heat-db-sync-mlxbc" Oct 03 14:36:03 crc kubenswrapper[4962]: I1003 14:36:03.379808 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7xqj\" (UniqueName: \"kubernetes.io/projected/fbbf01e6-b3e6-4e62-aca9-7f905967ed80-kube-api-access-n7xqj\") pod \"heat-db-sync-mlxbc\" (UID: \"fbbf01e6-b3e6-4e62-aca9-7f905967ed80\") " pod="openstack/heat-db-sync-mlxbc" Oct 03 14:36:03 crc kubenswrapper[4962]: I1003 14:36:03.482010 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7xqj\" (UniqueName: \"kubernetes.io/projected/fbbf01e6-b3e6-4e62-aca9-7f905967ed80-kube-api-access-n7xqj\") pod \"heat-db-sync-mlxbc\" (UID: \"fbbf01e6-b3e6-4e62-aca9-7f905967ed80\") " pod="openstack/heat-db-sync-mlxbc" Oct 03 14:36:03 crc kubenswrapper[4962]: I1003 14:36:03.482423 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbbf01e6-b3e6-4e62-aca9-7f905967ed80-config-data\") pod \"heat-db-sync-mlxbc\" (UID: \"fbbf01e6-b3e6-4e62-aca9-7f905967ed80\") " pod="openstack/heat-db-sync-mlxbc" Oct 03 14:36:03 crc kubenswrapper[4962]: I1003 14:36:03.482588 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbbf01e6-b3e6-4e62-aca9-7f905967ed80-combined-ca-bundle\") pod \"heat-db-sync-mlxbc\" (UID: \"fbbf01e6-b3e6-4e62-aca9-7f905967ed80\") " pod="openstack/heat-db-sync-mlxbc" Oct 03 14:36:03 crc kubenswrapper[4962]: I1003 14:36:03.491333 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbbf01e6-b3e6-4e62-aca9-7f905967ed80-config-data\") pod \"heat-db-sync-mlxbc\" (UID: \"fbbf01e6-b3e6-4e62-aca9-7f905967ed80\") " pod="openstack/heat-db-sync-mlxbc" Oct 03 14:36:03 crc kubenswrapper[4962]: I1003 14:36:03.494184 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbbf01e6-b3e6-4e62-aca9-7f905967ed80-combined-ca-bundle\") pod \"heat-db-sync-mlxbc\" (UID: \"fbbf01e6-b3e6-4e62-aca9-7f905967ed80\") " pod="openstack/heat-db-sync-mlxbc" 
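The entries above trace the kubelet's two-phase volume handling for pod openstack/heat-db-sync-mlxbc: first VerifyControllerAttachedVolume confirms each desired volume is attached, then MountVolume/SetUp mounts it. As a minimal illustrative sketch only (the `volume` struct and `reconcile` helper below are hypothetical names, not kubelet's actual types; volume and pod names are taken from the log), the ordering looks like this:

```go
// Illustrative sketch of the attach-then-mount ordering recorded by the
// reconciler_common.go / operation_generator.go entries above. Not kubelet
// source; the types and function names here are invented for illustration.
package main

import "log"

type volume struct {
	name     string // e.g. "config-data"
	plugin   string // e.g. "kubernetes.io/secret"
	attached bool   // controller reports the volume attached to the node
	mounted  bool
}

// reconcile walks the desired volumes for one pod in the same two phases the
// log shows: VerifyControllerAttachedVolume first, then MountVolume.SetUp.
func reconcile(pod string, desired []*volume) {
	for _, v := range desired {
		if !v.attached {
			log.Printf("operationExecutor.VerifyControllerAttachedVolume started for volume %q pod=%q", v.name, pod)
			v.attached = true // assume the attach check succeeds in this sketch
		}
	}
	for _, v := range desired {
		if v.attached && !v.mounted {
			log.Printf("operationExecutor.MountVolume started for volume %q (%s) pod=%q", v.name, v.plugin, pod)
			v.mounted = true
			log.Printf("MountVolume.SetUp succeeded for volume %q pod=%q", v.name, pod)
		}
	}
}

func main() {
	reconcile("openstack/heat-db-sync-mlxbc", []*volume{
		{name: "config-data", plugin: "kubernetes.io/secret"},
		{name: "combined-ca-bundle", plugin: "kubernetes.io/secret"},
		{name: "kube-api-access-n7xqj", plugin: "kubernetes.io/projected"},
	})
}
```

The unmount entries later in the log (UnmountVolume.TearDown followed by "Volume detached") are the same reconciliation run in reverse once the pod terminates.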
Oct 03 14:36:03 crc kubenswrapper[4962]: I1003 14:36:03.500150 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7xqj\" (UniqueName: \"kubernetes.io/projected/fbbf01e6-b3e6-4e62-aca9-7f905967ed80-kube-api-access-n7xqj\") pod \"heat-db-sync-mlxbc\" (UID: \"fbbf01e6-b3e6-4e62-aca9-7f905967ed80\") " pod="openstack/heat-db-sync-mlxbc" Oct 03 14:36:03 crc kubenswrapper[4962]: I1003 14:36:03.619570 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-9cb97f545-qzjwt" podUID="48962e75-a3a8-46ef-8731-b3c692bb245e" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.108:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.108:8080: connect: connection refused" Oct 03 14:36:03 crc kubenswrapper[4962]: I1003 14:36:03.619791 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-9cb97f545-qzjwt" Oct 03 14:36:03 crc kubenswrapper[4962]: I1003 14:36:03.686597 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-mlxbc" Oct 03 14:36:04 crc kubenswrapper[4962]: I1003 14:36:04.184472 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-mlxbc"] Oct 03 14:36:04 crc kubenswrapper[4962]: I1003 14:36:04.562746 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-mlxbc" event={"ID":"fbbf01e6-b3e6-4e62-aca9-7f905967ed80","Type":"ContainerStarted","Data":"48874432a373803b4de45f48e990e50d485b4976f81a1db70f3886d6c73e989c"} Oct 03 14:36:08 crc kubenswrapper[4962]: I1003 14:36:08.603454 4962 generic.go:334] "Generic (PLEG): container finished" podID="48962e75-a3a8-46ef-8731-b3c692bb245e" containerID="d9052a8ca693c11478efee47005036d820709cc1af52a6696c7e0795d2079cc7" exitCode=137 Oct 03 14:36:08 crc kubenswrapper[4962]: I1003 14:36:08.604067 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9cb97f545-qzjwt" event={"ID":"48962e75-a3a8-46ef-8731-b3c692bb245e","Type":"ContainerDied","Data":"d9052a8ca693c11478efee47005036d820709cc1af52a6696c7e0795d2079cc7"} Oct 03 14:36:08 crc kubenswrapper[4962]: I1003 14:36:08.731823 4962 scope.go:117] "RemoveContainer" containerID="99801711faa9ecf8a74d9a4a5e1a95dda8c79c466fffc9d77f533103e27747ad" Oct 03 14:36:08 crc kubenswrapper[4962]: I1003 14:36:08.758985 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-675fb576f7-5q7qf" Oct 03 14:36:10 crc kubenswrapper[4962]: I1003 14:36:10.959144 4962 scope.go:117] "RemoveContainer" containerID="d31bf978cdfb5b39c756569726205c7d17011ee2009ce1800b0242ebce14a3fb" Oct 03 14:36:11 crc kubenswrapper[4962]: I1003 14:36:11.015492 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-675fb576f7-5q7qf" Oct 03 14:36:11 crc kubenswrapper[4962]: I1003 14:36:11.082097 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5dcfc8787c-gf2hf"] Oct 03 14:36:11 crc kubenswrapper[4962]: I1003 14:36:11.082866 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5dcfc8787c-gf2hf" podUID="78dcd167-b38a-4da9-a9c6-63e48eed5832" containerName="horizon-log" containerID="cri-o://e9be0cd3d3b34f1841b1ce5732b9b56e4bb767747c6952425399c719f4741ffa" gracePeriod=30 Oct 03 14:36:11 crc kubenswrapper[4962]: I1003 14:36:11.083192 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5dcfc8787c-gf2hf" 
podUID="78dcd167-b38a-4da9-a9c6-63e48eed5832" containerName="horizon" containerID="cri-o://90e55e78eef3f9fa1f0cf90b987de2073157b9275f8e66ae908f2792324b0d8a" gracePeriod=30 Oct 03 14:36:11 crc kubenswrapper[4962]: I1003 14:36:11.212993 4962 scope.go:117] "RemoveContainer" containerID="a329031bb0d35a598b4a8b1d68885bd88e39d01f60a838603e6c6659120ac3b2" Oct 03 14:36:11 crc kubenswrapper[4962]: I1003 14:36:11.442889 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-9cb97f545-qzjwt" Oct 03 14:36:11 crc kubenswrapper[4962]: I1003 14:36:11.488622 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48962e75-a3a8-46ef-8731-b3c692bb245e-logs\") pod \"48962e75-a3a8-46ef-8731-b3c692bb245e\" (UID: \"48962e75-a3a8-46ef-8731-b3c692bb245e\") " Oct 03 14:36:11 crc kubenswrapper[4962]: I1003 14:36:11.488758 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/48962e75-a3a8-46ef-8731-b3c692bb245e-config-data\") pod \"48962e75-a3a8-46ef-8731-b3c692bb245e\" (UID: \"48962e75-a3a8-46ef-8731-b3c692bb245e\") " Oct 03 14:36:11 crc kubenswrapper[4962]: I1003 14:36:11.488817 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48962e75-a3a8-46ef-8731-b3c692bb245e-scripts\") pod \"48962e75-a3a8-46ef-8731-b3c692bb245e\" (UID: \"48962e75-a3a8-46ef-8731-b3c692bb245e\") " Oct 03 14:36:11 crc kubenswrapper[4962]: I1003 14:36:11.488873 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/48962e75-a3a8-46ef-8731-b3c692bb245e-horizon-secret-key\") pod \"48962e75-a3a8-46ef-8731-b3c692bb245e\" (UID: \"48962e75-a3a8-46ef-8731-b3c692bb245e\") " Oct 03 14:36:11 crc kubenswrapper[4962]: I1003 14:36:11.489029 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2m8pt\" (UniqueName: \"kubernetes.io/projected/48962e75-a3a8-46ef-8731-b3c692bb245e-kube-api-access-2m8pt\") pod \"48962e75-a3a8-46ef-8731-b3c692bb245e\" (UID: \"48962e75-a3a8-46ef-8731-b3c692bb245e\") " Oct 03 14:36:11 crc kubenswrapper[4962]: I1003 14:36:11.496271 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48962e75-a3a8-46ef-8731-b3c692bb245e-logs" (OuterVolumeSpecName: "logs") pod "48962e75-a3a8-46ef-8731-b3c692bb245e" (UID: "48962e75-a3a8-46ef-8731-b3c692bb245e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:36:11 crc kubenswrapper[4962]: I1003 14:36:11.501817 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48962e75-a3a8-46ef-8731-b3c692bb245e-kube-api-access-2m8pt" (OuterVolumeSpecName: "kube-api-access-2m8pt") pod "48962e75-a3a8-46ef-8731-b3c692bb245e" (UID: "48962e75-a3a8-46ef-8731-b3c692bb245e"). InnerVolumeSpecName "kube-api-access-2m8pt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:36:11 crc kubenswrapper[4962]: I1003 14:36:11.502009 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48962e75-a3a8-46ef-8731-b3c692bb245e-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "48962e75-a3a8-46ef-8731-b3c692bb245e" (UID: "48962e75-a3a8-46ef-8731-b3c692bb245e"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:36:11 crc kubenswrapper[4962]: I1003 14:36:11.513879 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48962e75-a3a8-46ef-8731-b3c692bb245e-scripts" (OuterVolumeSpecName: "scripts") pod "48962e75-a3a8-46ef-8731-b3c692bb245e" (UID: "48962e75-a3a8-46ef-8731-b3c692bb245e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:36:11 crc kubenswrapper[4962]: I1003 14:36:11.515941 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48962e75-a3a8-46ef-8731-b3c692bb245e-config-data" (OuterVolumeSpecName: "config-data") pod "48962e75-a3a8-46ef-8731-b3c692bb245e" (UID: "48962e75-a3a8-46ef-8731-b3c692bb245e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:36:11 crc kubenswrapper[4962]: I1003 14:36:11.591069 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2m8pt\" (UniqueName: \"kubernetes.io/projected/48962e75-a3a8-46ef-8731-b3c692bb245e-kube-api-access-2m8pt\") on node \"crc\" DevicePath \"\"" Oct 03 14:36:11 crc kubenswrapper[4962]: I1003 14:36:11.591117 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48962e75-a3a8-46ef-8731-b3c692bb245e-logs\") on node \"crc\" DevicePath \"\"" Oct 03 14:36:11 crc kubenswrapper[4962]: I1003 14:36:11.591132 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/48962e75-a3a8-46ef-8731-b3c692bb245e-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:36:11 crc kubenswrapper[4962]: I1003 14:36:11.591145 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48962e75-a3a8-46ef-8731-b3c692bb245e-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 14:36:11 crc kubenswrapper[4962]: I1003 14:36:11.591157 4962 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/48962e75-a3a8-46ef-8731-b3c692bb245e-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 03 14:36:11 crc kubenswrapper[4962]: I1003 14:36:11.633342 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9cb97f545-qzjwt" event={"ID":"48962e75-a3a8-46ef-8731-b3c692bb245e","Type":"ContainerDied","Data":"e33f027110beae95ea416ee90a6656513e966dc89a3080fc026cdbaf333a6dcd"} Oct 03 14:36:11 crc kubenswrapper[4962]: I1003 14:36:11.633398 4962 scope.go:117] "RemoveContainer" containerID="b3691b848384dcc992ed93a916c86b6f4c0f0b9f9b2a4d59ceb5a10f391e953f" Oct 03 14:36:11 crc kubenswrapper[4962]: I1003 14:36:11.633517 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-9cb97f545-qzjwt" Oct 03 14:36:11 crc kubenswrapper[4962]: I1003 14:36:11.641227 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-mlxbc" event={"ID":"fbbf01e6-b3e6-4e62-aca9-7f905967ed80","Type":"ContainerStarted","Data":"faccd713a09335c3cee17c137068d8227a0197bda26b9189a6370ffdd1b7855d"} Oct 03 14:36:11 crc kubenswrapper[4962]: I1003 14:36:11.667888 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-mlxbc" podStartSLOduration=1.838586638 podStartE2EDuration="8.667852428s" podCreationTimestamp="2025-10-03 14:36:03 +0000 UTC" firstStartedPulling="2025-10-03 14:36:04.196054777 +0000 UTC m=+6372.599952612" lastFinishedPulling="2025-10-03 14:36:11.025320567 +0000 UTC m=+6379.429218402" observedRunningTime="2025-10-03 14:36:11.660091311 +0000 UTC m=+6380.063989156" watchObservedRunningTime="2025-10-03 14:36:11.667852428 +0000 UTC m=+6380.071750263" Oct 03 14:36:11 crc kubenswrapper[4962]: I1003 14:36:11.686650 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-9cb97f545-qzjwt"] Oct 03 14:36:11 crc kubenswrapper[4962]: I1003 14:36:11.698296 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-9cb97f545-qzjwt"] Oct 03 14:36:11 crc kubenswrapper[4962]: I1003 14:36:11.812528 4962 scope.go:117] "RemoveContainer" containerID="d9052a8ca693c11478efee47005036d820709cc1af52a6696c7e0795d2079cc7" Oct 03 14:36:12 crc kubenswrapper[4962]: I1003 14:36:12.237686 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48962e75-a3a8-46ef-8731-b3c692bb245e" path="/var/lib/kubelet/pods/48962e75-a3a8-46ef-8731-b3c692bb245e/volumes" Oct 03 14:36:14 crc kubenswrapper[4962]: I1003 14:36:14.235480 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5dcfc8787c-gf2hf" podUID="78dcd167-b38a-4da9-a9c6-63e48eed5832" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.110:8080/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:53666->10.217.1.110:8080: read: connection reset by peer" Oct 03 14:36:14 crc kubenswrapper[4962]: I1003 14:36:14.669683 4962 generic.go:334] "Generic (PLEG): container finished" podID="78dcd167-b38a-4da9-a9c6-63e48eed5832" containerID="90e55e78eef3f9fa1f0cf90b987de2073157b9275f8e66ae908f2792324b0d8a" exitCode=0 Oct 03 14:36:14 crc kubenswrapper[4962]: I1003 14:36:14.669804 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5dcfc8787c-gf2hf" event={"ID":"78dcd167-b38a-4da9-a9c6-63e48eed5832","Type":"ContainerDied","Data":"90e55e78eef3f9fa1f0cf90b987de2073157b9275f8e66ae908f2792324b0d8a"} Oct 03 14:36:14 crc kubenswrapper[4962]: I1003 14:36:14.671870 4962 generic.go:334] "Generic (PLEG): container finished" podID="fbbf01e6-b3e6-4e62-aca9-7f905967ed80" containerID="faccd713a09335c3cee17c137068d8227a0197bda26b9189a6370ffdd1b7855d" exitCode=0 Oct 03 14:36:14 crc kubenswrapper[4962]: I1003 14:36:14.671897 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-mlxbc" event={"ID":"fbbf01e6-b3e6-4e62-aca9-7f905967ed80","Type":"ContainerDied","Data":"faccd713a09335c3cee17c137068d8227a0197bda26b9189a6370ffdd1b7855d"} Oct 03 14:36:16 crc kubenswrapper[4962]: I1003 14:36:16.066039 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-mlxbc" Oct 03 14:36:16 crc kubenswrapper[4962]: I1003 14:36:16.188819 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7xqj\" (UniqueName: \"kubernetes.io/projected/fbbf01e6-b3e6-4e62-aca9-7f905967ed80-kube-api-access-n7xqj\") pod \"fbbf01e6-b3e6-4e62-aca9-7f905967ed80\" (UID: \"fbbf01e6-b3e6-4e62-aca9-7f905967ed80\") " Oct 03 14:36:16 crc kubenswrapper[4962]: I1003 14:36:16.188959 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbbf01e6-b3e6-4e62-aca9-7f905967ed80-combined-ca-bundle\") pod \"fbbf01e6-b3e6-4e62-aca9-7f905967ed80\" (UID: \"fbbf01e6-b3e6-4e62-aca9-7f905967ed80\") " Oct 03 14:36:16 crc kubenswrapper[4962]: I1003 14:36:16.188995 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbbf01e6-b3e6-4e62-aca9-7f905967ed80-config-data\") pod \"fbbf01e6-b3e6-4e62-aca9-7f905967ed80\" (UID: \"fbbf01e6-b3e6-4e62-aca9-7f905967ed80\") " Oct 03 14:36:16 crc kubenswrapper[4962]: I1003 14:36:16.195063 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbbf01e6-b3e6-4e62-aca9-7f905967ed80-kube-api-access-n7xqj" (OuterVolumeSpecName: "kube-api-access-n7xqj") pod "fbbf01e6-b3e6-4e62-aca9-7f905967ed80" (UID: "fbbf01e6-b3e6-4e62-aca9-7f905967ed80"). InnerVolumeSpecName "kube-api-access-n7xqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:36:16 crc kubenswrapper[4962]: I1003 14:36:16.225682 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbbf01e6-b3e6-4e62-aca9-7f905967ed80-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fbbf01e6-b3e6-4e62-aca9-7f905967ed80" (UID: "fbbf01e6-b3e6-4e62-aca9-7f905967ed80"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:36:16 crc kubenswrapper[4962]: I1003 14:36:16.263410 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbbf01e6-b3e6-4e62-aca9-7f905967ed80-config-data" (OuterVolumeSpecName: "config-data") pod "fbbf01e6-b3e6-4e62-aca9-7f905967ed80" (UID: "fbbf01e6-b3e6-4e62-aca9-7f905967ed80"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:36:16 crc kubenswrapper[4962]: I1003 14:36:16.291829 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7xqj\" (UniqueName: \"kubernetes.io/projected/fbbf01e6-b3e6-4e62-aca9-7f905967ed80-kube-api-access-n7xqj\") on node \"crc\" DevicePath \"\"" Oct 03 14:36:16 crc kubenswrapper[4962]: I1003 14:36:16.291865 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbbf01e6-b3e6-4e62-aca9-7f905967ed80-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:36:16 crc kubenswrapper[4962]: I1003 14:36:16.291878 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbbf01e6-b3e6-4e62-aca9-7f905967ed80-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:36:16 crc kubenswrapper[4962]: I1003 14:36:16.689952 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-mlxbc" event={"ID":"fbbf01e6-b3e6-4e62-aca9-7f905967ed80","Type":"ContainerDied","Data":"48874432a373803b4de45f48e990e50d485b4976f81a1db70f3886d6c73e989c"} Oct 03 14:36:16 crc kubenswrapper[4962]: I1003 14:36:16.690333 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48874432a373803b4de45f48e990e50d485b4976f81a1db70f3886d6c73e989c" Oct 03 14:36:16 crc kubenswrapper[4962]: I1003 14:36:16.690047 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-mlxbc" Oct 03 14:36:17 crc kubenswrapper[4962]: I1003 14:36:17.958403 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-77d94b4755-qkln9"] Oct 03 14:36:17 crc kubenswrapper[4962]: E1003 14:36:17.959209 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48962e75-a3a8-46ef-8731-b3c692bb245e" containerName="horizon-log" Oct 03 14:36:17 crc kubenswrapper[4962]: I1003 14:36:17.959229 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="48962e75-a3a8-46ef-8731-b3c692bb245e" containerName="horizon-log" Oct 03 14:36:17 crc kubenswrapper[4962]: E1003 14:36:17.959298 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbbf01e6-b3e6-4e62-aca9-7f905967ed80" containerName="heat-db-sync" Oct 03 14:36:17 crc kubenswrapper[4962]: I1003 14:36:17.959309 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbbf01e6-b3e6-4e62-aca9-7f905967ed80" containerName="heat-db-sync" Oct 03 14:36:17 crc kubenswrapper[4962]: E1003 14:36:17.959334 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48962e75-a3a8-46ef-8731-b3c692bb245e" containerName="horizon" Oct 03 14:36:17 crc kubenswrapper[4962]: I1003 14:36:17.959343 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="48962e75-a3a8-46ef-8731-b3c692bb245e" containerName="horizon" Oct 03 14:36:17 crc kubenswrapper[4962]: I1003 14:36:17.959671 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="48962e75-a3a8-46ef-8731-b3c692bb245e" containerName="horizon" Oct 03 14:36:17 crc kubenswrapper[4962]: I1003 14:36:17.959689 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="48962e75-a3a8-46ef-8731-b3c692bb245e" containerName="horizon-log" Oct 03 14:36:17 crc kubenswrapper[4962]: I1003 14:36:17.959716 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbbf01e6-b3e6-4e62-aca9-7f905967ed80" containerName="heat-db-sync" Oct 03 14:36:17 crc kubenswrapper[4962]: I1003 14:36:17.960919 4962 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-77d94b4755-qkln9" Oct 03 14:36:17 crc kubenswrapper[4962]: I1003 14:36:17.964080 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Oct 03 14:36:17 crc kubenswrapper[4962]: I1003 14:36:17.964329 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-n8pqf" Oct 03 14:36:17 crc kubenswrapper[4962]: I1003 14:36:17.964541 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Oct 03 14:36:17 crc kubenswrapper[4962]: I1003 14:36:17.999377 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-77d94b4755-qkln9"] Oct 03 14:36:18 crc kubenswrapper[4962]: I1003 14:36:18.022938 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5b82z\" (UniqueName: \"kubernetes.io/projected/89db9039-38fa-408a-b07b-7fdd4f55c6bd-kube-api-access-5b82z\") pod \"heat-engine-77d94b4755-qkln9\" (UID: \"89db9039-38fa-408a-b07b-7fdd4f55c6bd\") " pod="openstack/heat-engine-77d94b4755-qkln9" Oct 03 14:36:18 crc kubenswrapper[4962]: I1003 14:36:18.023027 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89db9039-38fa-408a-b07b-7fdd4f55c6bd-config-data\") pod \"heat-engine-77d94b4755-qkln9\" (UID: \"89db9039-38fa-408a-b07b-7fdd4f55c6bd\") " pod="openstack/heat-engine-77d94b4755-qkln9" Oct 03 14:36:18 crc kubenswrapper[4962]: I1003 14:36:18.023114 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89db9039-38fa-408a-b07b-7fdd4f55c6bd-combined-ca-bundle\") pod \"heat-engine-77d94b4755-qkln9\" (UID: \"89db9039-38fa-408a-b07b-7fdd4f55c6bd\") " pod="openstack/heat-engine-77d94b4755-qkln9" Oct 03 14:36:18 crc kubenswrapper[4962]: I1003 14:36:18.023151 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/89db9039-38fa-408a-b07b-7fdd4f55c6bd-config-data-custom\") pod \"heat-engine-77d94b4755-qkln9\" (UID: \"89db9039-38fa-408a-b07b-7fdd4f55c6bd\") " pod="openstack/heat-engine-77d94b4755-qkln9" Oct 03 14:36:18 crc kubenswrapper[4962]: I1003 14:36:18.028973 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-69556c5889-9hbl8"] Oct 03 14:36:18 crc kubenswrapper[4962]: I1003 14:36:18.030699 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-69556c5889-9hbl8" Oct 03 14:36:18 crc kubenswrapper[4962]: I1003 14:36:18.032983 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Oct 03 14:36:18 crc kubenswrapper[4962]: I1003 14:36:18.074229 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-69556c5889-9hbl8"] Oct 03 14:36:18 crc kubenswrapper[4962]: I1003 14:36:18.110680 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-dc84b6f66-2tzcd"] Oct 03 14:36:18 crc kubenswrapper[4962]: I1003 14:36:18.112029 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-dc84b6f66-2tzcd" Oct 03 14:36:18 crc kubenswrapper[4962]: I1003 14:36:18.117977 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Oct 03 14:36:18 crc kubenswrapper[4962]: I1003 14:36:18.123430 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-dc84b6f66-2tzcd"] Oct 03 14:36:18 crc kubenswrapper[4962]: I1003 14:36:18.124957 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqsxf\" (UniqueName: \"kubernetes.io/projected/c42e9aa3-5ec6-4434-8e77-14ed49101590-kube-api-access-bqsxf\") pod \"heat-api-69556c5889-9hbl8\" (UID: \"c42e9aa3-5ec6-4434-8e77-14ed49101590\") " pod="openstack/heat-api-69556c5889-9hbl8" Oct 03 14:36:18 crc kubenswrapper[4962]: I1003 14:36:18.125001 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5b82z\" (UniqueName: \"kubernetes.io/projected/89db9039-38fa-408a-b07b-7fdd4f55c6bd-kube-api-access-5b82z\") pod \"heat-engine-77d94b4755-qkln9\" (UID: \"89db9039-38fa-408a-b07b-7fdd4f55c6bd\") " pod="openstack/heat-engine-77d94b4755-qkln9" Oct 03 14:36:18 crc kubenswrapper[4962]: I1003 14:36:18.125053 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89db9039-38fa-408a-b07b-7fdd4f55c6bd-config-data\") pod \"heat-engine-77d94b4755-qkln9\" (UID: \"89db9039-38fa-408a-b07b-7fdd4f55c6bd\") " pod="openstack/heat-engine-77d94b4755-qkln9" Oct 03 14:36:18 crc kubenswrapper[4962]: I1003 14:36:18.125074 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c42e9aa3-5ec6-4434-8e77-14ed49101590-combined-ca-bundle\") pod \"heat-api-69556c5889-9hbl8\" (UID: \"c42e9aa3-5ec6-4434-8e77-14ed49101590\") " pod="openstack/heat-api-69556c5889-9hbl8" Oct 03 14:36:18 crc kubenswrapper[4962]: I1003 14:36:18.125116 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c42e9aa3-5ec6-4434-8e77-14ed49101590-config-data-custom\") pod \"heat-api-69556c5889-9hbl8\" (UID: \"c42e9aa3-5ec6-4434-8e77-14ed49101590\") " pod="openstack/heat-api-69556c5889-9hbl8" Oct 03 14:36:18 crc kubenswrapper[4962]: I1003 14:36:18.125144 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c42e9aa3-5ec6-4434-8e77-14ed49101590-config-data\") pod \"heat-api-69556c5889-9hbl8\" (UID: \"c42e9aa3-5ec6-4434-8e77-14ed49101590\") " pod="openstack/heat-api-69556c5889-9hbl8" Oct 03 14:36:18 crc kubenswrapper[4962]: I1003 14:36:18.125165 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89db9039-38fa-408a-b07b-7fdd4f55c6bd-combined-ca-bundle\") pod \"heat-engine-77d94b4755-qkln9\" (UID: \"89db9039-38fa-408a-b07b-7fdd4f55c6bd\") " pod="openstack/heat-engine-77d94b4755-qkln9" Oct 03 14:36:18 crc kubenswrapper[4962]: I1003 14:36:18.125190 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/89db9039-38fa-408a-b07b-7fdd4f55c6bd-config-data-custom\") pod \"heat-engine-77d94b4755-qkln9\" (UID: \"89db9039-38fa-408a-b07b-7fdd4f55c6bd\") " 
pod="openstack/heat-engine-77d94b4755-qkln9" Oct 03 14:36:18 crc kubenswrapper[4962]: I1003 14:36:18.139192 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/89db9039-38fa-408a-b07b-7fdd4f55c6bd-config-data-custom\") pod \"heat-engine-77d94b4755-qkln9\" (UID: \"89db9039-38fa-408a-b07b-7fdd4f55c6bd\") " pod="openstack/heat-engine-77d94b4755-qkln9" Oct 03 14:36:18 crc kubenswrapper[4962]: I1003 14:36:18.140357 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89db9039-38fa-408a-b07b-7fdd4f55c6bd-config-data\") pod \"heat-engine-77d94b4755-qkln9\" (UID: \"89db9039-38fa-408a-b07b-7fdd4f55c6bd\") " pod="openstack/heat-engine-77d94b4755-qkln9" Oct 03 14:36:18 crc kubenswrapper[4962]: I1003 14:36:18.147095 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89db9039-38fa-408a-b07b-7fdd4f55c6bd-combined-ca-bundle\") pod \"heat-engine-77d94b4755-qkln9\" (UID: \"89db9039-38fa-408a-b07b-7fdd4f55c6bd\") " pod="openstack/heat-engine-77d94b4755-qkln9" Oct 03 14:36:18 crc kubenswrapper[4962]: I1003 14:36:18.153009 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5b82z\" (UniqueName: \"kubernetes.io/projected/89db9039-38fa-408a-b07b-7fdd4f55c6bd-kube-api-access-5b82z\") pod \"heat-engine-77d94b4755-qkln9\" (UID: \"89db9039-38fa-408a-b07b-7fdd4f55c6bd\") " pod="openstack/heat-engine-77d94b4755-qkln9" Oct 03 14:36:18 crc kubenswrapper[4962]: I1003 14:36:18.226556 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43f66771-9c85-4e51-81c8-ff54001c8702-config-data\") pod \"heat-cfnapi-dc84b6f66-2tzcd\" (UID: \"43f66771-9c85-4e51-81c8-ff54001c8702\") " pod="openstack/heat-cfnapi-dc84b6f66-2tzcd" Oct 03 14:36:18 crc kubenswrapper[4962]: I1003 14:36:18.226683 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzk77\" (UniqueName: \"kubernetes.io/projected/43f66771-9c85-4e51-81c8-ff54001c8702-kube-api-access-bzk77\") pod \"heat-cfnapi-dc84b6f66-2tzcd\" (UID: \"43f66771-9c85-4e51-81c8-ff54001c8702\") " pod="openstack/heat-cfnapi-dc84b6f66-2tzcd" Oct 03 14:36:18 crc kubenswrapper[4962]: I1003 14:36:18.226756 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqsxf\" (UniqueName: \"kubernetes.io/projected/c42e9aa3-5ec6-4434-8e77-14ed49101590-kube-api-access-bqsxf\") pod \"heat-api-69556c5889-9hbl8\" (UID: \"c42e9aa3-5ec6-4434-8e77-14ed49101590\") " pod="openstack/heat-api-69556c5889-9hbl8" Oct 03 14:36:18 crc kubenswrapper[4962]: I1003 14:36:18.226793 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/43f66771-9c85-4e51-81c8-ff54001c8702-config-data-custom\") pod \"heat-cfnapi-dc84b6f66-2tzcd\" (UID: \"43f66771-9c85-4e51-81c8-ff54001c8702\") " pod="openstack/heat-cfnapi-dc84b6f66-2tzcd" Oct 03 14:36:18 crc kubenswrapper[4962]: I1003 14:36:18.226828 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c42e9aa3-5ec6-4434-8e77-14ed49101590-combined-ca-bundle\") pod \"heat-api-69556c5889-9hbl8\" (UID: \"c42e9aa3-5ec6-4434-8e77-14ed49101590\") " 
pod="openstack/heat-api-69556c5889-9hbl8" Oct 03 14:36:18 crc kubenswrapper[4962]: I1003 14:36:18.226849 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43f66771-9c85-4e51-81c8-ff54001c8702-combined-ca-bundle\") pod \"heat-cfnapi-dc84b6f66-2tzcd\" (UID: \"43f66771-9c85-4e51-81c8-ff54001c8702\") " pod="openstack/heat-cfnapi-dc84b6f66-2tzcd" Oct 03 14:36:18 crc kubenswrapper[4962]: I1003 14:36:18.226908 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c42e9aa3-5ec6-4434-8e77-14ed49101590-config-data-custom\") pod \"heat-api-69556c5889-9hbl8\" (UID: \"c42e9aa3-5ec6-4434-8e77-14ed49101590\") " pod="openstack/heat-api-69556c5889-9hbl8" Oct 03 14:36:18 crc kubenswrapper[4962]: I1003 14:36:18.226945 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c42e9aa3-5ec6-4434-8e77-14ed49101590-config-data\") pod \"heat-api-69556c5889-9hbl8\" (UID: \"c42e9aa3-5ec6-4434-8e77-14ed49101590\") " pod="openstack/heat-api-69556c5889-9hbl8" Oct 03 14:36:18 crc kubenswrapper[4962]: I1003 14:36:18.232500 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c42e9aa3-5ec6-4434-8e77-14ed49101590-config-data\") pod \"heat-api-69556c5889-9hbl8\" (UID: \"c42e9aa3-5ec6-4434-8e77-14ed49101590\") " pod="openstack/heat-api-69556c5889-9hbl8" Oct 03 14:36:18 crc kubenswrapper[4962]: I1003 14:36:18.233209 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c42e9aa3-5ec6-4434-8e77-14ed49101590-combined-ca-bundle\") pod \"heat-api-69556c5889-9hbl8\" (UID: \"c42e9aa3-5ec6-4434-8e77-14ed49101590\") " pod="openstack/heat-api-69556c5889-9hbl8" Oct 03 14:36:18 crc kubenswrapper[4962]: I1003 14:36:18.238280 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c42e9aa3-5ec6-4434-8e77-14ed49101590-config-data-custom\") pod \"heat-api-69556c5889-9hbl8\" (UID: \"c42e9aa3-5ec6-4434-8e77-14ed49101590\") " pod="openstack/heat-api-69556c5889-9hbl8" Oct 03 14:36:18 crc kubenswrapper[4962]: I1003 14:36:18.257500 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqsxf\" (UniqueName: \"kubernetes.io/projected/c42e9aa3-5ec6-4434-8e77-14ed49101590-kube-api-access-bqsxf\") pod \"heat-api-69556c5889-9hbl8\" (UID: \"c42e9aa3-5ec6-4434-8e77-14ed49101590\") " pod="openstack/heat-api-69556c5889-9hbl8" Oct 03 14:36:18 crc kubenswrapper[4962]: I1003 14:36:18.295870 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-77d94b4755-qkln9" Oct 03 14:36:18 crc kubenswrapper[4962]: I1003 14:36:18.328663 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43f66771-9c85-4e51-81c8-ff54001c8702-combined-ca-bundle\") pod \"heat-cfnapi-dc84b6f66-2tzcd\" (UID: \"43f66771-9c85-4e51-81c8-ff54001c8702\") " pod="openstack/heat-cfnapi-dc84b6f66-2tzcd" Oct 03 14:36:18 crc kubenswrapper[4962]: I1003 14:36:18.329074 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43f66771-9c85-4e51-81c8-ff54001c8702-config-data\") pod \"heat-cfnapi-dc84b6f66-2tzcd\" (UID: \"43f66771-9c85-4e51-81c8-ff54001c8702\") " pod="openstack/heat-cfnapi-dc84b6f66-2tzcd" Oct 03 14:36:18 crc kubenswrapper[4962]: I1003 14:36:18.329119 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzk77\" (UniqueName: \"kubernetes.io/projected/43f66771-9c85-4e51-81c8-ff54001c8702-kube-api-access-bzk77\") pod \"heat-cfnapi-dc84b6f66-2tzcd\" (UID: \"43f66771-9c85-4e51-81c8-ff54001c8702\") " pod="openstack/heat-cfnapi-dc84b6f66-2tzcd" Oct 03 14:36:18 crc kubenswrapper[4962]: I1003 14:36:18.329178 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/43f66771-9c85-4e51-81c8-ff54001c8702-config-data-custom\") pod \"heat-cfnapi-dc84b6f66-2tzcd\" (UID: \"43f66771-9c85-4e51-81c8-ff54001c8702\") " pod="openstack/heat-cfnapi-dc84b6f66-2tzcd" Oct 03 14:36:18 crc kubenswrapper[4962]: I1003 14:36:18.332747 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/43f66771-9c85-4e51-81c8-ff54001c8702-config-data-custom\") pod \"heat-cfnapi-dc84b6f66-2tzcd\" (UID: \"43f66771-9c85-4e51-81c8-ff54001c8702\") " pod="openstack/heat-cfnapi-dc84b6f66-2tzcd" Oct 03 14:36:18 crc kubenswrapper[4962]: I1003 14:36:18.333071 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43f66771-9c85-4e51-81c8-ff54001c8702-config-data\") pod \"heat-cfnapi-dc84b6f66-2tzcd\" (UID: \"43f66771-9c85-4e51-81c8-ff54001c8702\") " pod="openstack/heat-cfnapi-dc84b6f66-2tzcd" Oct 03 14:36:18 crc kubenswrapper[4962]: I1003 14:36:18.334275 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43f66771-9c85-4e51-81c8-ff54001c8702-combined-ca-bundle\") pod \"heat-cfnapi-dc84b6f66-2tzcd\" (UID: \"43f66771-9c85-4e51-81c8-ff54001c8702\") " pod="openstack/heat-cfnapi-dc84b6f66-2tzcd" Oct 03 14:36:18 crc kubenswrapper[4962]: I1003 14:36:18.346760 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzk77\" (UniqueName: \"kubernetes.io/projected/43f66771-9c85-4e51-81c8-ff54001c8702-kube-api-access-bzk77\") pod \"heat-cfnapi-dc84b6f66-2tzcd\" (UID: \"43f66771-9c85-4e51-81c8-ff54001c8702\") " pod="openstack/heat-cfnapi-dc84b6f66-2tzcd" Oct 03 14:36:18 crc kubenswrapper[4962]: I1003 14:36:18.349950 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-69556c5889-9hbl8" Oct 03 14:36:18 crc kubenswrapper[4962]: I1003 14:36:18.522271 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-dc84b6f66-2tzcd"
Oct 03 14:36:18 crc kubenswrapper[4962]: I1003 14:36:18.809182 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-77d94b4755-qkln9"]
Oct 03 14:36:18 crc kubenswrapper[4962]: I1003 14:36:18.920250 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-69556c5889-9hbl8"]
Oct 03 14:36:19 crc kubenswrapper[4962]: I1003 14:36:19.088886 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-dc84b6f66-2tzcd"]
Oct 03 14:36:19 crc kubenswrapper[4962]: W1003 14:36:19.094203 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43f66771_9c85_4e51_81c8_ff54001c8702.slice/crio-10b53b4c812b1f25e7faeddfe215616b1f0c07df1b488a01930a8103b19ab500 WatchSource:0}: Error finding container 10b53b4c812b1f25e7faeddfe215616b1f0c07df1b488a01930a8103b19ab500: Status 404 returned error can't find the container with id 10b53b4c812b1f25e7faeddfe215616b1f0c07df1b488a01930a8103b19ab500
Oct 03 14:36:19 crc kubenswrapper[4962]: I1003 14:36:19.735701 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-dc84b6f66-2tzcd" event={"ID":"43f66771-9c85-4e51-81c8-ff54001c8702","Type":"ContainerStarted","Data":"10b53b4c812b1f25e7faeddfe215616b1f0c07df1b488a01930a8103b19ab500"}
Oct 03 14:36:19 crc kubenswrapper[4962]: I1003 14:36:19.738283 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-69556c5889-9hbl8" event={"ID":"c42e9aa3-5ec6-4434-8e77-14ed49101590","Type":"ContainerStarted","Data":"c40fa18346890344a1dc62f83f10839ccdcbe9f10073078e63cb3153acdff5b3"}
Oct 03 14:36:19 crc kubenswrapper[4962]: I1003 14:36:19.748295 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-77d94b4755-qkln9" event={"ID":"89db9039-38fa-408a-b07b-7fdd4f55c6bd","Type":"ContainerStarted","Data":"7b3d127e5f4856c7d8697f63c4fd9f9bd0fe4669c40895255f6486d3b9519b04"}
Oct 03 14:36:19 crc kubenswrapper[4962]: I1003 14:36:19.748341 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-77d94b4755-qkln9" event={"ID":"89db9039-38fa-408a-b07b-7fdd4f55c6bd","Type":"ContainerStarted","Data":"d36c635f7cdbc5219d17df75153aade766235f0e6f9697ee87a0c5e09edea32e"}
Oct 03 14:36:19 crc kubenswrapper[4962]: I1003 14:36:19.748519 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-77d94b4755-qkln9"
Oct 03 14:36:19 crc kubenswrapper[4962]: I1003 14:36:19.774241 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-77d94b4755-qkln9" podStartSLOduration=2.7742193889999998 podStartE2EDuration="2.774219389s" podCreationTimestamp="2025-10-03 14:36:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:36:19.764467299 +0000 UTC m=+6388.168365134" watchObservedRunningTime="2025-10-03 14:36:19.774219389 +0000 UTC m=+6388.178117224"
Oct 03 14:36:21 crc kubenswrapper[4962]: I1003 14:36:21.769955 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-dc84b6f66-2tzcd" event={"ID":"43f66771-9c85-4e51-81c8-ff54001c8702","Type":"ContainerStarted","Data":"01c48c5882c6e04c23e739b2f497b82bea494cc9722d57102f6df8d435fb5cb6"}
Oct 03 14:36:21 crc kubenswrapper[4962]: I1003 14:36:21.770979 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-dc84b6f66-2tzcd"
Oct 03 14:36:21 crc kubenswrapper[4962]: I1003 14:36:21.771925 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-69556c5889-9hbl8" event={"ID":"c42e9aa3-5ec6-4434-8e77-14ed49101590","Type":"ContainerStarted","Data":"4b353dd52a11b280108e0d5e6df667e26655c7b888326d458b8b99fe22008e13"}
Oct 03 14:36:21 crc kubenswrapper[4962]: I1003 14:36:21.772028 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-69556c5889-9hbl8"
Oct 03 14:36:21 crc kubenswrapper[4962]: I1003 14:36:21.799889 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-dc84b6f66-2tzcd" podStartSLOduration=1.931545527 podStartE2EDuration="3.79985676s" podCreationTimestamp="2025-10-03 14:36:18 +0000 UTC" firstStartedPulling="2025-10-03 14:36:19.097257249 +0000 UTC m=+6387.501155074" lastFinishedPulling="2025-10-03 14:36:20.965568472 +0000 UTC m=+6389.369466307" observedRunningTime="2025-10-03 14:36:21.790778597 +0000 UTC m=+6390.194676452" watchObservedRunningTime="2025-10-03 14:36:21.79985676 +0000 UTC m=+6390.203754595"
Oct 03 14:36:24 crc kubenswrapper[4962]: I1003 14:36:24.195312 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5dcfc8787c-gf2hf" podUID="78dcd167-b38a-4da9-a9c6-63e48eed5832" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.110:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.110:8080: connect: connection refused"
Oct 03 14:36:29 crc kubenswrapper[4962]: I1003 14:36:29.721478 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-69556c5889-9hbl8"
Oct 03 14:36:29 crc kubenswrapper[4962]: I1003 14:36:29.742374 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-69556c5889-9hbl8" podStartSLOduration=10.725869532 podStartE2EDuration="12.742356708s" podCreationTimestamp="2025-10-03 14:36:17 +0000 UTC" firstStartedPulling="2025-10-03 14:36:18.944892664 +0000 UTC m=+6387.348790499" lastFinishedPulling="2025-10-03 14:36:20.96137984 +0000 UTC m=+6389.365277675" observedRunningTime="2025-10-03 14:36:21.818546158 +0000 UTC m=+6390.222443993" watchObservedRunningTime="2025-10-03 14:36:29.742356708 +0000 UTC m=+6398.146254543"
Oct 03 14:36:29 crc kubenswrapper[4962]: I1003 14:36:29.895018 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-dc84b6f66-2tzcd"
Oct 03 14:36:34 crc kubenswrapper[4962]: I1003 14:36:34.195706 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5dcfc8787c-gf2hf" podUID="78dcd167-b38a-4da9-a9c6-63e48eed5832" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.110:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.110:8080: connect: connection refused"
Oct 03 14:36:34 crc kubenswrapper[4962]: I1003 14:36:34.196398 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5dcfc8787c-gf2hf"
Oct 03 14:36:38 crc kubenswrapper[4962]: I1003 14:36:38.037607 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-dmq9v"]
Oct 03 14:36:38 crc kubenswrapper[4962]: I1003 14:36:38.047306 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-fvgd9"]
Oct 03 14:36:38 crc kubenswrapper[4962]: I1003 14:36:38.055940 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-wrfxq"]
Oct 03 14:36:38 crc kubenswrapper[4962]: I1003 14:36:38.065315 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-wrfxq"]
Oct 03 14:36:38 crc kubenswrapper[4962]: I1003 14:36:38.073162 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-fvgd9"]
Oct 03 14:36:38 crc kubenswrapper[4962]: I1003 14:36:38.080503 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-dmq9v"]
Oct 03 14:36:38 crc kubenswrapper[4962]: I1003 14:36:38.270962 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10ad472c-890f-4720-bd48-5b637f821344" path="/var/lib/kubelet/pods/10ad472c-890f-4720-bd48-5b637f821344/volumes"
Oct 03 14:36:38 crc kubenswrapper[4962]: I1003 14:36:38.271529 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b03ca85-b639-4480-b117-ba6dce52030f" path="/var/lib/kubelet/pods/6b03ca85-b639-4480-b117-ba6dce52030f/volumes"
Oct 03 14:36:38 crc kubenswrapper[4962]: I1003 14:36:38.272069 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b590b3c6-3b78-42fa-af59-6c7379135360" path="/var/lib/kubelet/pods/b590b3c6-3b78-42fa-af59-6c7379135360/volumes"
Oct 03 14:36:38 crc kubenswrapper[4962]: I1003 14:36:38.337459 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-77d94b4755-qkln9"
Oct 03 14:36:41 crc kubenswrapper[4962]: I1003 14:36:41.520837 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5dcfc8787c-gf2hf"
Oct 03 14:36:41 crc kubenswrapper[4962]: I1003 14:36:41.634624 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/78dcd167-b38a-4da9-a9c6-63e48eed5832-config-data\") pod \"78dcd167-b38a-4da9-a9c6-63e48eed5832\" (UID: \"78dcd167-b38a-4da9-a9c6-63e48eed5832\") "
Oct 03 14:36:41 crc kubenswrapper[4962]: I1003 14:36:41.635081 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78dcd167-b38a-4da9-a9c6-63e48eed5832-logs\") pod \"78dcd167-b38a-4da9-a9c6-63e48eed5832\" (UID: \"78dcd167-b38a-4da9-a9c6-63e48eed5832\") "
Oct 03 14:36:41 crc kubenswrapper[4962]: I1003 14:36:41.635228 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/78dcd167-b38a-4da9-a9c6-63e48eed5832-horizon-secret-key\") pod \"78dcd167-b38a-4da9-a9c6-63e48eed5832\" (UID: \"78dcd167-b38a-4da9-a9c6-63e48eed5832\") "
Oct 03 14:36:41 crc kubenswrapper[4962]: I1003 14:36:41.635333 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78dcd167-b38a-4da9-a9c6-63e48eed5832-scripts\") pod \"78dcd167-b38a-4da9-a9c6-63e48eed5832\" (UID: \"78dcd167-b38a-4da9-a9c6-63e48eed5832\") "
Oct 03 14:36:41 crc kubenswrapper[4962]: I1003 14:36:41.635453 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxtm6\" (UniqueName: \"kubernetes.io/projected/78dcd167-b38a-4da9-a9c6-63e48eed5832-kube-api-access-xxtm6\") pod \"78dcd167-b38a-4da9-a9c6-63e48eed5832\" (UID: \"78dcd167-b38a-4da9-a9c6-63e48eed5832\") "
Oct 03 14:36:41 crc kubenswrapper[4962]: I1003 14:36:41.636737 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78dcd167-b38a-4da9-a9c6-63e48eed5832-logs" (OuterVolumeSpecName: "logs") pod "78dcd167-b38a-4da9-a9c6-63e48eed5832" (UID: "78dcd167-b38a-4da9-a9c6-63e48eed5832"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:36:41 crc kubenswrapper[4962]: I1003 14:36:41.639865 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78dcd167-b38a-4da9-a9c6-63e48eed5832-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "78dcd167-b38a-4da9-a9c6-63e48eed5832" (UID: "78dcd167-b38a-4da9-a9c6-63e48eed5832"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:36:41 crc kubenswrapper[4962]: I1003 14:36:41.640189 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78dcd167-b38a-4da9-a9c6-63e48eed5832-kube-api-access-xxtm6" (OuterVolumeSpecName: "kube-api-access-xxtm6") pod "78dcd167-b38a-4da9-a9c6-63e48eed5832" (UID: "78dcd167-b38a-4da9-a9c6-63e48eed5832"). InnerVolumeSpecName "kube-api-access-xxtm6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:36:41 crc kubenswrapper[4962]: I1003 14:36:41.664457 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78dcd167-b38a-4da9-a9c6-63e48eed5832-scripts" (OuterVolumeSpecName: "scripts") pod "78dcd167-b38a-4da9-a9c6-63e48eed5832" (UID: "78dcd167-b38a-4da9-a9c6-63e48eed5832"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:36:41 crc kubenswrapper[4962]: I1003 14:36:41.665269 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78dcd167-b38a-4da9-a9c6-63e48eed5832-config-data" (OuterVolumeSpecName: "config-data") pod "78dcd167-b38a-4da9-a9c6-63e48eed5832" (UID: "78dcd167-b38a-4da9-a9c6-63e48eed5832"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:36:41 crc kubenswrapper[4962]: I1003 14:36:41.737551 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78dcd167-b38a-4da9-a9c6-63e48eed5832-logs\") on node \"crc\" DevicePath \"\""
Oct 03 14:36:41 crc kubenswrapper[4962]: I1003 14:36:41.737585 4962 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/78dcd167-b38a-4da9-a9c6-63e48eed5832-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Oct 03 14:36:41 crc kubenswrapper[4962]: I1003 14:36:41.737595 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78dcd167-b38a-4da9-a9c6-63e48eed5832-scripts\") on node \"crc\" DevicePath \"\""
Oct 03 14:36:41 crc kubenswrapper[4962]: I1003 14:36:41.737604 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxtm6\" (UniqueName: \"kubernetes.io/projected/78dcd167-b38a-4da9-a9c6-63e48eed5832-kube-api-access-xxtm6\") on node \"crc\" DevicePath \"\""
Oct 03 14:36:41 crc kubenswrapper[4962]: I1003 14:36:41.737613 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/78dcd167-b38a-4da9-a9c6-63e48eed5832-config-data\") on node \"crc\" DevicePath \"\""
Oct 03 14:36:41 crc kubenswrapper[4962]: I1003 14:36:41.948312 4962 generic.go:334] "Generic (PLEG): container finished" podID="78dcd167-b38a-4da9-a9c6-63e48eed5832" containerID="e9be0cd3d3b34f1841b1ce5732b9b56e4bb767747c6952425399c719f4741ffa" exitCode=137
Oct 03 14:36:41 crc kubenswrapper[4962]: I1003 14:36:41.948374 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5dcfc8787c-gf2hf" event={"ID":"78dcd167-b38a-4da9-a9c6-63e48eed5832","Type":"ContainerDied","Data":"e9be0cd3d3b34f1841b1ce5732b9b56e4bb767747c6952425399c719f4741ffa"}
Oct 03 14:36:41 crc kubenswrapper[4962]: I1003 14:36:41.948390 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5dcfc8787c-gf2hf"
Oct 03 14:36:41 crc kubenswrapper[4962]: I1003 14:36:41.948412 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5dcfc8787c-gf2hf" event={"ID":"78dcd167-b38a-4da9-a9c6-63e48eed5832","Type":"ContainerDied","Data":"1e1b7547583f1c44cb91845d344552594e5cb0592d23527ae6a200ab1dedbad6"}
Oct 03 14:36:41 crc kubenswrapper[4962]: I1003 14:36:41.948439 4962 scope.go:117] "RemoveContainer" containerID="90e55e78eef3f9fa1f0cf90b987de2073157b9275f8e66ae908f2792324b0d8a"
Oct 03 14:36:41 crc kubenswrapper[4962]: I1003 14:36:41.987096 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5dcfc8787c-gf2hf"]
Oct 03 14:36:41 crc kubenswrapper[4962]: I1003 14:36:41.995105 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5dcfc8787c-gf2hf"]
Oct 03 14:36:42 crc kubenswrapper[4962]: I1003 14:36:42.128160 4962 scope.go:117] "RemoveContainer" containerID="e9be0cd3d3b34f1841b1ce5732b9b56e4bb767747c6952425399c719f4741ffa"
Oct 03 14:36:42 crc kubenswrapper[4962]: I1003 14:36:42.152984 4962 scope.go:117] "RemoveContainer" containerID="90e55e78eef3f9fa1f0cf90b987de2073157b9275f8e66ae908f2792324b0d8a"
Oct 03 14:36:42 crc kubenswrapper[4962]: E1003 14:36:42.153434 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90e55e78eef3f9fa1f0cf90b987de2073157b9275f8e66ae908f2792324b0d8a\": container with ID starting with 90e55e78eef3f9fa1f0cf90b987de2073157b9275f8e66ae908f2792324b0d8a not found: ID does not exist" containerID="90e55e78eef3f9fa1f0cf90b987de2073157b9275f8e66ae908f2792324b0d8a"
Oct 03 14:36:42 crc kubenswrapper[4962]: I1003 14:36:42.153477 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90e55e78eef3f9fa1f0cf90b987de2073157b9275f8e66ae908f2792324b0d8a"} err="failed to get container status \"90e55e78eef3f9fa1f0cf90b987de2073157b9275f8e66ae908f2792324b0d8a\": rpc error: code = NotFound desc = could not find container \"90e55e78eef3f9fa1f0cf90b987de2073157b9275f8e66ae908f2792324b0d8a\": container with ID starting with 90e55e78eef3f9fa1f0cf90b987de2073157b9275f8e66ae908f2792324b0d8a not found: ID does not exist"
Oct 03 14:36:42 crc kubenswrapper[4962]: I1003 14:36:42.153503 4962 scope.go:117] "RemoveContainer" containerID="e9be0cd3d3b34f1841b1ce5732b9b56e4bb767747c6952425399c719f4741ffa"
Oct 03 14:36:42 crc kubenswrapper[4962]: E1003 14:36:42.153874 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9be0cd3d3b34f1841b1ce5732b9b56e4bb767747c6952425399c719f4741ffa\": container with ID starting with e9be0cd3d3b34f1841b1ce5732b9b56e4bb767747c6952425399c719f4741ffa not found: ID does not exist" containerID="e9be0cd3d3b34f1841b1ce5732b9b56e4bb767747c6952425399c719f4741ffa"
Oct 03 14:36:42 crc kubenswrapper[4962]: I1003 14:36:42.153917 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9be0cd3d3b34f1841b1ce5732b9b56e4bb767747c6952425399c719f4741ffa"} err="failed to get container status \"e9be0cd3d3b34f1841b1ce5732b9b56e4bb767747c6952425399c719f4741ffa\": rpc error: code = NotFound desc = could not find container \"e9be0cd3d3b34f1841b1ce5732b9b56e4bb767747c6952425399c719f4741ffa\": container with ID starting with e9be0cd3d3b34f1841b1ce5732b9b56e4bb767747c6952425399c719f4741ffa not found: ID does not exist"
Oct 03 14:36:42 crc kubenswrapper[4962]: I1003 14:36:42.240753 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78dcd167-b38a-4da9-a9c6-63e48eed5832" path="/var/lib/kubelet/pods/78dcd167-b38a-4da9-a9c6-63e48eed5832/volumes"
Oct 03 14:36:48 crc kubenswrapper[4962]: I1003 14:36:48.039307 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-aaaf-account-create-vzmss"]
Oct 03 14:36:48 crc kubenswrapper[4962]: I1003 14:36:48.046689 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-4c80-account-create-pjt4h"]
Oct 03 14:36:48 crc kubenswrapper[4962]: I1003 14:36:48.055943 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-aaaf-account-create-vzmss"]
Oct 03 14:36:48 crc kubenswrapper[4962]: I1003 14:36:48.064282 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-832c-account-create-7q5x2"]
Oct 03 14:36:48 crc kubenswrapper[4962]: I1003 14:36:48.070876 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-4c80-account-create-pjt4h"]
Oct 03 14:36:48 crc kubenswrapper[4962]: I1003 14:36:48.077319 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-832c-account-create-7q5x2"]
Oct 03 14:36:48 crc kubenswrapper[4962]: I1003 14:36:48.241141 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="559da102-fa5c-4c60-89e9-e54596f5172a" path="/var/lib/kubelet/pods/559da102-fa5c-4c60-89e9-e54596f5172a/volumes"
Oct 03 14:36:48 crc kubenswrapper[4962]: I1003 14:36:48.243257 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5989ecf4-c65d-4310-940c-468b5d2fa698" path="/var/lib/kubelet/pods/5989ecf4-c65d-4310-940c-468b5d2fa698/volumes"
Oct 03 14:36:48 crc kubenswrapper[4962]: I1003 14:36:48.244042 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc15827a-60c3-4aba-bdbe-234d52d256b1" path="/var/lib/kubelet/pods/cc15827a-60c3-4aba-bdbe-234d52d256b1/volumes"
Oct 03 14:36:49 crc kubenswrapper[4962]: I1003 14:36:49.640297 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dtknd"]
Oct 03 14:36:49 crc kubenswrapper[4962]: E1003 14:36:49.641168 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78dcd167-b38a-4da9-a9c6-63e48eed5832" containerName="horizon"
Oct 03 14:36:49 crc kubenswrapper[4962]: I1003 14:36:49.641189 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="78dcd167-b38a-4da9-a9c6-63e48eed5832" containerName="horizon"
Oct 03 14:36:49 crc kubenswrapper[4962]: E1003 14:36:49.641212 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78dcd167-b38a-4da9-a9c6-63e48eed5832" containerName="horizon-log"
Oct 03 14:36:49 crc kubenswrapper[4962]: I1003 14:36:49.641222 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="78dcd167-b38a-4da9-a9c6-63e48eed5832" containerName="horizon-log"
Oct 03 14:36:49 crc kubenswrapper[4962]: I1003 14:36:49.641479 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="78dcd167-b38a-4da9-a9c6-63e48eed5832" containerName="horizon-log"
Oct 03 14:36:49 crc kubenswrapper[4962]: I1003 14:36:49.641512 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="78dcd167-b38a-4da9-a9c6-63e48eed5832" containerName="horizon"
Oct 03 14:36:49 crc kubenswrapper[4962]: I1003 14:36:49.643346 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dtknd"
Oct 03 14:36:49 crc kubenswrapper[4962]: I1003 14:36:49.653425 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dtknd"]
Oct 03 14:36:49 crc kubenswrapper[4962]: I1003 14:36:49.686875 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27stn\" (UniqueName: \"kubernetes.io/projected/9f3f2a62-c0d7-450c-85d8-afd52aecc899-kube-api-access-27stn\") pod \"redhat-operators-dtknd\" (UID: \"9f3f2a62-c0d7-450c-85d8-afd52aecc899\") " pod="openshift-marketplace/redhat-operators-dtknd"
Oct 03 14:36:49 crc kubenswrapper[4962]: I1003 14:36:49.686936 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f3f2a62-c0d7-450c-85d8-afd52aecc899-utilities\") pod \"redhat-operators-dtknd\" (UID: \"9f3f2a62-c0d7-450c-85d8-afd52aecc899\") " pod="openshift-marketplace/redhat-operators-dtknd"
Oct 03 14:36:49 crc kubenswrapper[4962]: I1003 14:36:49.687116 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f3f2a62-c0d7-450c-85d8-afd52aecc899-catalog-content\") pod \"redhat-operators-dtknd\" (UID: \"9f3f2a62-c0d7-450c-85d8-afd52aecc899\") " pod="openshift-marketplace/redhat-operators-dtknd"
Oct 03 14:36:49 crc kubenswrapper[4962]: I1003 14:36:49.789313 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27stn\" (UniqueName: \"kubernetes.io/projected/9f3f2a62-c0d7-450c-85d8-afd52aecc899-kube-api-access-27stn\") pod \"redhat-operators-dtknd\" (UID: \"9f3f2a62-c0d7-450c-85d8-afd52aecc899\") " pod="openshift-marketplace/redhat-operators-dtknd"
Oct 03 14:36:49 crc kubenswrapper[4962]: I1003 14:36:49.789390 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f3f2a62-c0d7-450c-85d8-afd52aecc899-utilities\") pod \"redhat-operators-dtknd\" (UID: \"9f3f2a62-c0d7-450c-85d8-afd52aecc899\") " pod="openshift-marketplace/redhat-operators-dtknd"
Oct 03 14:36:49 crc kubenswrapper[4962]: I1003 14:36:49.789447 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f3f2a62-c0d7-450c-85d8-afd52aecc899-catalog-content\") pod \"redhat-operators-dtknd\" (UID: \"9f3f2a62-c0d7-450c-85d8-afd52aecc899\") " pod="openshift-marketplace/redhat-operators-dtknd"
Oct 03 14:36:49 crc kubenswrapper[4962]: I1003 14:36:49.790058 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f3f2a62-c0d7-450c-85d8-afd52aecc899-utilities\") pod \"redhat-operators-dtknd\" (UID: \"9f3f2a62-c0d7-450c-85d8-afd52aecc899\") " pod="openshift-marketplace/redhat-operators-dtknd"
Oct 03 14:36:49 crc kubenswrapper[4962]: I1003 14:36:49.790148 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f3f2a62-c0d7-450c-85d8-afd52aecc899-catalog-content\") pod \"redhat-operators-dtknd\" (UID: \"9f3f2a62-c0d7-450c-85d8-afd52aecc899\") " pod="openshift-marketplace/redhat-operators-dtknd"
Oct 03 14:36:49 crc kubenswrapper[4962]: I1003 14:36:49.813906 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27stn\" (UniqueName: \"kubernetes.io/projected/9f3f2a62-c0d7-450c-85d8-afd52aecc899-kube-api-access-27stn\") pod \"redhat-operators-dtknd\" (UID: \"9f3f2a62-c0d7-450c-85d8-afd52aecc899\") " pod="openshift-marketplace/redhat-operators-dtknd"
Oct 03 14:36:49 crc kubenswrapper[4962]: I1003 14:36:49.970354 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dtknd"
Oct 03 14:36:50 crc kubenswrapper[4962]: I1003 14:36:50.511696 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dtknd"]
Oct 03 14:36:51 crc kubenswrapper[4962]: I1003 14:36:51.042406 4962 generic.go:334] "Generic (PLEG): container finished" podID="9f3f2a62-c0d7-450c-85d8-afd52aecc899" containerID="46f6cb21643b8e2b9bca7121dc5bba6fce71001ddbb547eed12bc2703ea4ba04" exitCode=0
Oct 03 14:36:51 crc kubenswrapper[4962]: I1003 14:36:51.042471 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dtknd" event={"ID":"9f3f2a62-c0d7-450c-85d8-afd52aecc899","Type":"ContainerDied","Data":"46f6cb21643b8e2b9bca7121dc5bba6fce71001ddbb547eed12bc2703ea4ba04"}
Oct 03 14:36:51 crc kubenswrapper[4962]: I1003 14:36:51.042722 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dtknd" event={"ID":"9f3f2a62-c0d7-450c-85d8-afd52aecc899","Type":"ContainerStarted","Data":"d35f940c67545f1ba36a9b1cd63989ad545160a5a71b55f42bc69733c26bb1b4"}
Oct 03 14:36:53 crc kubenswrapper[4962]: I1003 14:36:53.062276 4962 generic.go:334] "Generic (PLEG): container finished" podID="9f3f2a62-c0d7-450c-85d8-afd52aecc899" containerID="51f827b85aeb19f9b14e92d585fbc620f9cb534c20a301e6fb17984d91894c93" exitCode=0
Oct 03 14:36:53 crc kubenswrapper[4962]: I1003 14:36:53.062615 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dtknd" event={"ID":"9f3f2a62-c0d7-450c-85d8-afd52aecc899","Type":"ContainerDied","Data":"51f827b85aeb19f9b14e92d585fbc620f9cb534c20a301e6fb17984d91894c93"}
Oct 03 14:36:54 crc kubenswrapper[4962]: I1003 14:36:54.660048 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 14:36:54 crc kubenswrapper[4962]: I1003 14:36:54.660449 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 14:36:55 crc kubenswrapper[4962]: I1003 14:36:55.084255 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dtknd" event={"ID":"9f3f2a62-c0d7-450c-85d8-afd52aecc899","Type":"ContainerStarted","Data":"b19fc2cbc60b692167c1d5190c2d9e9f19b32a1e87af1ba26b9c9bfef1e4a59d"}
Oct 03 14:36:55 crc kubenswrapper[4962]: I1003 14:36:55.115960 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dtknd" podStartSLOduration=2.693393387 podStartE2EDuration="6.115945113s" podCreationTimestamp="2025-10-03 14:36:49 +0000 UTC" firstStartedPulling="2025-10-03 14:36:51.04505109 +0000 UTC m=+6419.448948915" lastFinishedPulling="2025-10-03 14:36:54.467602796 +0000 UTC m=+6422.871500641" observedRunningTime="2025-10-03 14:36:55.10795256 +0000 UTC m=+6423.511850395" watchObservedRunningTime="2025-10-03 14:36:55.115945113 +0000 UTC m=+6423.519842948"
Oct 03 14:36:57 crc kubenswrapper[4962]: I1003 14:36:57.478313 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dj2k5b"]
Oct 03 14:36:57 crc kubenswrapper[4962]: I1003 14:36:57.481504 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dj2k5b"
Oct 03 14:36:57 crc kubenswrapper[4962]: I1003 14:36:57.488186 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Oct 03 14:36:57 crc kubenswrapper[4962]: I1003 14:36:57.490247 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dj2k5b"]
Oct 03 14:36:57 crc kubenswrapper[4962]: I1003 14:36:57.554566 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dj2k5b\" (UID: \"96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dj2k5b"
Oct 03 14:36:57 crc kubenswrapper[4962]: I1003 14:36:57.554676 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4hll\" (UniqueName: \"kubernetes.io/projected/96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f-kube-api-access-m4hll\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dj2k5b\" (UID: \"96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dj2k5b"
Oct 03 14:36:57 crc kubenswrapper[4962]: I1003 14:36:57.554806 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dj2k5b\" (UID: \"96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dj2k5b"
Oct 03 14:36:57 crc kubenswrapper[4962]: I1003 14:36:57.657362 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dj2k5b\" (UID: \"96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dj2k5b"
Oct 03 14:36:57 crc kubenswrapper[4962]: I1003 14:36:57.657431 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4hll\" (UniqueName: \"kubernetes.io/projected/96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f-kube-api-access-m4hll\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dj2k5b\" (UID: \"96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dj2k5b"
Oct 03 14:36:57 crc kubenswrapper[4962]: I1003 14:36:57.657479 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dj2k5b\" (UID: \"96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dj2k5b"
Oct 03 14:36:57 crc kubenswrapper[4962]: I1003 14:36:57.657950 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dj2k5b\" (UID: \"96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dj2k5b"
Oct 03 14:36:57 crc kubenswrapper[4962]: I1003 14:36:57.658022 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dj2k5b\" (UID: \"96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dj2k5b"
Oct 03 14:36:57 crc kubenswrapper[4962]: I1003 14:36:57.675970 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4hll\" (UniqueName: \"kubernetes.io/projected/96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f-kube-api-access-m4hll\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dj2k5b\" (UID: \"96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dj2k5b"
Oct 03 14:36:57 crc kubenswrapper[4962]: I1003 14:36:57.803159 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dj2k5b"
Oct 03 14:36:58 crc kubenswrapper[4962]: I1003 14:36:58.064622 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qc5nh"]
Oct 03 14:36:58 crc kubenswrapper[4962]: I1003 14:36:58.075124 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qc5nh"]
Oct 03 14:36:58 crc kubenswrapper[4962]: I1003 14:36:58.238308 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d13977e9-fabe-4c69-a8dc-18f841d73e6c" path="/var/lib/kubelet/pods/d13977e9-fabe-4c69-a8dc-18f841d73e6c/volumes"
Oct 03 14:36:58 crc kubenswrapper[4962]: W1003 14:36:58.385219 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96ddfe55_8a74_4f1b_96d6_5da6bfeb8e6f.slice/crio-419e4b10727acab253ef37e821550df1acdb4352c1028c8b54ff0f2b47b7a339 WatchSource:0}: Error finding container 419e4b10727acab253ef37e821550df1acdb4352c1028c8b54ff0f2b47b7a339: Status 404 returned error can't find the container with id 419e4b10727acab253ef37e821550df1acdb4352c1028c8b54ff0f2b47b7a339
Oct 03 14:36:58 crc kubenswrapper[4962]: I1003 14:36:58.386201 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dj2k5b"]
Oct 03 14:36:59 crc kubenswrapper[4962]: I1003 14:36:59.120924 4962 generic.go:334] "Generic (PLEG): container finished" podID="96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f" containerID="002dff5f92c42a5109b8126932433a7fdf779bcf20408256974efefb3241f898" exitCode=0
Oct 03 14:36:59 crc kubenswrapper[4962]: I1003 14:36:59.120964 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dj2k5b" event={"ID":"96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f","Type":"ContainerDied","Data":"002dff5f92c42a5109b8126932433a7fdf779bcf20408256974efefb3241f898"}
Oct 03 14:36:59 crc kubenswrapper[4962]: I1003 14:36:59.122373 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dj2k5b" event={"ID":"96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f","Type":"ContainerStarted","Data":"419e4b10727acab253ef37e821550df1acdb4352c1028c8b54ff0f2b47b7a339"}
Oct 03 14:36:59 crc kubenswrapper[4962]: I1003 14:36:59.971271 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dtknd"
Oct 03 14:36:59 crc kubenswrapper[4962]: I1003 14:36:59.971319 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dtknd"
Oct 03 14:37:00 crc kubenswrapper[4962]: I1003 14:37:00.026014 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dtknd"
Oct 03 14:37:00 crc kubenswrapper[4962]: I1003 14:37:00.184399 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dtknd"
Oct 03 14:37:02 crc kubenswrapper[4962]: I1003 14:37:02.156957 4962 generic.go:334] "Generic (PLEG): container finished" podID="96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f" containerID="ed2933061aae34e845c45227bad5ac27f12e50fd1da1c064719e5ac71c2fb89f" exitCode=0
Oct 03 14:37:02 crc kubenswrapper[4962]: I1003 14:37:02.157005 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dj2k5b" event={"ID":"96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f","Type":"ContainerDied","Data":"ed2933061aae34e845c45227bad5ac27f12e50fd1da1c064719e5ac71c2fb89f"}
Oct 03 14:37:02 crc kubenswrapper[4962]: I1003 14:37:02.430058 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dtknd"]
Oct 03 14:37:02 crc kubenswrapper[4962]: I1003 14:37:02.430667 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dtknd" podUID="9f3f2a62-c0d7-450c-85d8-afd52aecc899" containerName="registry-server" containerID="cri-o://b19fc2cbc60b692167c1d5190c2d9e9f19b32a1e87af1ba26b9c9bfef1e4a59d" gracePeriod=2
Oct 03 14:37:02 crc kubenswrapper[4962]: I1003 14:37:02.873725 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dtknd"
Oct 03 14:37:02 crc kubenswrapper[4962]: I1003 14:37:02.971873 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f3f2a62-c0d7-450c-85d8-afd52aecc899-utilities\") pod \"9f3f2a62-c0d7-450c-85d8-afd52aecc899\" (UID: \"9f3f2a62-c0d7-450c-85d8-afd52aecc899\") "
Oct 03 14:37:02 crc kubenswrapper[4962]: I1003 14:37:02.972087 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f3f2a62-c0d7-450c-85d8-afd52aecc899-catalog-content\") pod \"9f3f2a62-c0d7-450c-85d8-afd52aecc899\" (UID: \"9f3f2a62-c0d7-450c-85d8-afd52aecc899\") "
Oct 03 14:37:02 crc kubenswrapper[4962]: I1003 14:37:02.972134 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27stn\" (UniqueName: \"kubernetes.io/projected/9f3f2a62-c0d7-450c-85d8-afd52aecc899-kube-api-access-27stn\") pod \"9f3f2a62-c0d7-450c-85d8-afd52aecc899\" (UID: \"9f3f2a62-c0d7-450c-85d8-afd52aecc899\") "
Oct 03 14:37:02 crc kubenswrapper[4962]: I1003 14:37:02.972747 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f3f2a62-c0d7-450c-85d8-afd52aecc899-utilities" (OuterVolumeSpecName: "utilities") pod "9f3f2a62-c0d7-450c-85d8-afd52aecc899" (UID: "9f3f2a62-c0d7-450c-85d8-afd52aecc899"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:37:02 crc kubenswrapper[4962]: I1003 14:37:02.979879 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f3f2a62-c0d7-450c-85d8-afd52aecc899-kube-api-access-27stn" (OuterVolumeSpecName: "kube-api-access-27stn") pod "9f3f2a62-c0d7-450c-85d8-afd52aecc899" (UID: "9f3f2a62-c0d7-450c-85d8-afd52aecc899"). InnerVolumeSpecName "kube-api-access-27stn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:37:03 crc kubenswrapper[4962]: I1003 14:37:03.044327 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f3f2a62-c0d7-450c-85d8-afd52aecc899-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9f3f2a62-c0d7-450c-85d8-afd52aecc899" (UID: "9f3f2a62-c0d7-450c-85d8-afd52aecc899"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:37:03 crc kubenswrapper[4962]: I1003 14:37:03.074399 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f3f2a62-c0d7-450c-85d8-afd52aecc899-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 03 14:37:03 crc kubenswrapper[4962]: I1003 14:37:03.074435 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27stn\" (UniqueName: \"kubernetes.io/projected/9f3f2a62-c0d7-450c-85d8-afd52aecc899-kube-api-access-27stn\") on node \"crc\" DevicePath \"\""
Oct 03 14:37:03 crc kubenswrapper[4962]: I1003 14:37:03.074445 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f3f2a62-c0d7-450c-85d8-afd52aecc899-utilities\") on node \"crc\" DevicePath \"\""
Oct 03 14:37:03 crc kubenswrapper[4962]: I1003 14:37:03.167231 4962 generic.go:334] "Generic (PLEG): container finished" podID="96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f" containerID="075cce827969d780292c390152f3b5b208d61cbf2061ebbf802cc372cf17e0f3" exitCode=0
Oct 03 14:37:03 crc kubenswrapper[4962]: I1003 14:37:03.167332 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dj2k5b" event={"ID":"96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f","Type":"ContainerDied","Data":"075cce827969d780292c390152f3b5b208d61cbf2061ebbf802cc372cf17e0f3"}
Oct 03 14:37:03 crc kubenswrapper[4962]: I1003 14:37:03.171279 4962 generic.go:334] "Generic (PLEG): container finished" podID="9f3f2a62-c0d7-450c-85d8-afd52aecc899" containerID="b19fc2cbc60b692167c1d5190c2d9e9f19b32a1e87af1ba26b9c9bfef1e4a59d" exitCode=0
Oct 03 14:37:03 crc kubenswrapper[4962]: I1003 14:37:03.171322 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dtknd" event={"ID":"9f3f2a62-c0d7-450c-85d8-afd52aecc899","Type":"ContainerDied","Data":"b19fc2cbc60b692167c1d5190c2d9e9f19b32a1e87af1ba26b9c9bfef1e4a59d"}
Oct 03 14:37:03 crc kubenswrapper[4962]: I1003 14:37:03.171344 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dtknd" event={"ID":"9f3f2a62-c0d7-450c-85d8-afd52aecc899","Type":"ContainerDied","Data":"d35f940c67545f1ba36a9b1cd63989ad545160a5a71b55f42bc69733c26bb1b4"}
Oct 03 14:37:03 crc kubenswrapper[4962]: I1003 14:37:03.171360 4962 scope.go:117] "RemoveContainer" containerID="b19fc2cbc60b692167c1d5190c2d9e9f19b32a1e87af1ba26b9c9bfef1e4a59d"
Oct 03 14:37:03 crc kubenswrapper[4962]: I1003 14:37:03.171497 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dtknd"
Oct 03 14:37:03 crc kubenswrapper[4962]: I1003 14:37:03.206605 4962 scope.go:117] "RemoveContainer" containerID="51f827b85aeb19f9b14e92d585fbc620f9cb534c20a301e6fb17984d91894c93"
Oct 03 14:37:03 crc kubenswrapper[4962]: I1003 14:37:03.208749 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dtknd"]
Oct 03 14:37:03 crc kubenswrapper[4962]: I1003 14:37:03.218436 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dtknd"]
Oct 03 14:37:03 crc kubenswrapper[4962]: I1003 14:37:03.227210 4962 scope.go:117] "RemoveContainer" containerID="46f6cb21643b8e2b9bca7121dc5bba6fce71001ddbb547eed12bc2703ea4ba04"
Oct 03 14:37:03 crc kubenswrapper[4962]: I1003 14:37:03.275162 4962 scope.go:117] "RemoveContainer" containerID="b19fc2cbc60b692167c1d5190c2d9e9f19b32a1e87af1ba26b9c9bfef1e4a59d"
Oct 03 14:37:03 crc kubenswrapper[4962]: E1003 14:37:03.275691 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b19fc2cbc60b692167c1d5190c2d9e9f19b32a1e87af1ba26b9c9bfef1e4a59d\": container with ID starting with b19fc2cbc60b692167c1d5190c2d9e9f19b32a1e87af1ba26b9c9bfef1e4a59d not found: ID does not exist" containerID="b19fc2cbc60b692167c1d5190c2d9e9f19b32a1e87af1ba26b9c9bfef1e4a59d"
Oct 03 14:37:03 crc kubenswrapper[4962]: I1003 14:37:03.275729 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b19fc2cbc60b692167c1d5190c2d9e9f19b32a1e87af1ba26b9c9bfef1e4a59d"} err="failed to get container status \"b19fc2cbc60b692167c1d5190c2d9e9f19b32a1e87af1ba26b9c9bfef1e4a59d\": rpc error: code = NotFound desc = could not find container \"b19fc2cbc60b692167c1d5190c2d9e9f19b32a1e87af1ba26b9c9bfef1e4a59d\": container with ID starting with b19fc2cbc60b692167c1d5190c2d9e9f19b32a1e87af1ba26b9c9bfef1e4a59d not found: ID does not exist"
Oct 03 14:37:03 crc kubenswrapper[4962]: I1003 14:37:03.275755 4962 scope.go:117] "RemoveContainer" containerID="51f827b85aeb19f9b14e92d585fbc620f9cb534c20a301e6fb17984d91894c93"
Oct 03 14:37:03 crc kubenswrapper[4962]: E1003 14:37:03.275999 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51f827b85aeb19f9b14e92d585fbc620f9cb534c20a301e6fb17984d91894c93\": container with ID starting with 51f827b85aeb19f9b14e92d585fbc620f9cb534c20a301e6fb17984d91894c93 not found: ID does not exist" containerID="51f827b85aeb19f9b14e92d585fbc620f9cb534c20a301e6fb17984d91894c93"
Oct 03 14:37:03 crc kubenswrapper[4962]: I1003 14:37:03.276020 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51f827b85aeb19f9b14e92d585fbc620f9cb534c20a301e6fb17984d91894c93"} err="failed to get container status \"51f827b85aeb19f9b14e92d585fbc620f9cb534c20a301e6fb17984d91894c93\": rpc error: code = NotFound desc = could not find container \"51f827b85aeb19f9b14e92d585fbc620f9cb534c20a301e6fb17984d91894c93\": container with ID starting with 51f827b85aeb19f9b14e92d585fbc620f9cb534c20a301e6fb17984d91894c93 not found: ID does not exist"
Oct 03 14:37:03 crc kubenswrapper[4962]: I1003 14:37:03.276038 4962 scope.go:117] "RemoveContainer" containerID="46f6cb21643b8e2b9bca7121dc5bba6fce71001ddbb547eed12bc2703ea4ba04"
Oct 03 14:37:03 crc kubenswrapper[4962]: E1003 14:37:03.276277 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46f6cb21643b8e2b9bca7121dc5bba6fce71001ddbb547eed12bc2703ea4ba04\": container with ID starting with 46f6cb21643b8e2b9bca7121dc5bba6fce71001ddbb547eed12bc2703ea4ba04 not found: ID does not exist" containerID="46f6cb21643b8e2b9bca7121dc5bba6fce71001ddbb547eed12bc2703ea4ba04"
Oct 03 14:37:03 crc kubenswrapper[4962]: I1003 14:37:03.276329 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46f6cb21643b8e2b9bca7121dc5bba6fce71001ddbb547eed12bc2703ea4ba04"} err="failed to get container status \"46f6cb21643b8e2b9bca7121dc5bba6fce71001ddbb547eed12bc2703ea4ba04\": rpc error: code = NotFound desc = could not find container \"46f6cb21643b8e2b9bca7121dc5bba6fce71001ddbb547eed12bc2703ea4ba04\": container with ID starting with 46f6cb21643b8e2b9bca7121dc5bba6fce71001ddbb547eed12bc2703ea4ba04 not found: ID does not exist"
Oct 03 14:37:04 crc kubenswrapper[4962]: I1003 14:37:04.238745 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f3f2a62-c0d7-450c-85d8-afd52aecc899" path="/var/lib/kubelet/pods/9f3f2a62-c0d7-450c-85d8-afd52aecc899/volumes"
Oct 03 14:37:04 crc kubenswrapper[4962]: I1003 14:37:04.525818 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dj2k5b"
Oct 03 14:37:04 crc kubenswrapper[4962]: I1003 14:37:04.603209 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4hll\" (UniqueName: \"kubernetes.io/projected/96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f-kube-api-access-m4hll\") pod \"96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f\" (UID: \"96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f\") "
Oct 03 14:37:04 crc kubenswrapper[4962]: I1003 14:37:04.603328 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f-util\") pod \"96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f\" (UID: \"96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f\") "
Oct 03 14:37:04 crc kubenswrapper[4962]: I1003 14:37:04.603456 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f-bundle\") pod \"96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f\" (UID: \"96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f\") "
Oct 03 14:37:04 crc kubenswrapper[4962]: I1003 14:37:04.606127 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f-bundle" (OuterVolumeSpecName: "bundle") pod "96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f" (UID: "96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:37:04 crc kubenswrapper[4962]: I1003 14:37:04.614913 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f-kube-api-access-m4hll" (OuterVolumeSpecName: "kube-api-access-m4hll") pod "96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f" (UID: "96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f"). InnerVolumeSpecName "kube-api-access-m4hll". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:37:04 crc kubenswrapper[4962]: I1003 14:37:04.615663 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f-util" (OuterVolumeSpecName: "util") pod "96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f" (UID: "96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:37:04 crc kubenswrapper[4962]: I1003 14:37:04.705633 4962 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f-util\") on node \"crc\" DevicePath \"\""
Oct 03 14:37:04 crc kubenswrapper[4962]: I1003 14:37:04.705682 4962 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 14:37:04 crc kubenswrapper[4962]: I1003 14:37:04.705692 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4hll\" (UniqueName: \"kubernetes.io/projected/96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f-kube-api-access-m4hll\") on node \"crc\" DevicePath \"\""
Oct 03 14:37:05 crc kubenswrapper[4962]: I1003 14:37:05.192404 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dj2k5b" event={"ID":"96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f","Type":"ContainerDied","Data":"419e4b10727acab253ef37e821550df1acdb4352c1028c8b54ff0f2b47b7a339"}
Oct 03 14:37:05 crc kubenswrapper[4962]: I1003 14:37:05.192438 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="419e4b10727acab253ef37e821550df1acdb4352c1028c8b54ff0f2b47b7a339"
Oct 03 14:37:05 crc kubenswrapper[4962]: I1003 14:37:05.192465 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dj2k5b"
Oct 03 14:37:11 crc kubenswrapper[4962]: I1003 14:37:11.507187 4962 scope.go:117] "RemoveContainer" containerID="53f5ba95097b5b29609cca88f6f3efa13dfa1340cfb71dc5f503acaa90d3118b"
Oct 03 14:37:11 crc kubenswrapper[4962]: I1003 14:37:11.531730 4962 scope.go:117] "RemoveContainer" containerID="06f8a0e8d62bca357dedef5d867dfbe054c77020f2a02096f7d8a89930584f77"
Oct 03 14:37:11 crc kubenswrapper[4962]: I1003 14:37:11.606246 4962 scope.go:117] "RemoveContainer" containerID="2c8060d6ad024cbfc6ed48bdca3d180d8f2b87f439a1208004eeddff9c7aa97c"
Oct 03 14:37:11 crc kubenswrapper[4962]: I1003 14:37:11.654588 4962 scope.go:117] "RemoveContainer" containerID="15b6ba7acc08c937f4afe7c75cba7bb92208c523af552bffd7588150c9a439e1"
Oct 03 14:37:11 crc kubenswrapper[4962]: I1003 14:37:11.723892 4962 scope.go:117] "RemoveContainer" containerID="16779953fa726f8ce1e3ad8375f5fcbe2341d0d825b43a044b374ccba5cd87c0"
Oct 03 14:37:11 crc kubenswrapper[4962]: I1003 14:37:11.748186 4962 scope.go:117] "RemoveContainer" containerID="f554efcb536bb4eceaaf716ce4565fc501591bb77fa5cc344c79e6d980d9d6ee"
Oct 03 14:37:11 crc kubenswrapper[4962]: I1003 14:37:11.795321 4962 scope.go:117] "RemoveContainer" containerID="c702f3912c0c0cff4948df79d888ae7553e08c96d52bdd60e2342cfbe37f4d9a"
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.007906 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-f4nnk"]
Oct 03 14:37:13 crc kubenswrapper[4962]: E1003 14:37:13.008566 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f3f2a62-c0d7-450c-85d8-afd52aecc899" containerName="extract-content"
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.008580 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f3f2a62-c0d7-450c-85d8-afd52aecc899" containerName="extract-content"
Oct 03 14:37:13 crc kubenswrapper[4962]: E1003 14:37:13.008586 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f" containerName="extract"
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.008594 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f" containerName="extract"
Oct 03 14:37:13 crc kubenswrapper[4962]: E1003 14:37:13.008604 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f3f2a62-c0d7-450c-85d8-afd52aecc899" containerName="registry-server"
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.008625 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f3f2a62-c0d7-450c-85d8-afd52aecc899" containerName="registry-server"
Oct 03 14:37:13 crc kubenswrapper[4962]: E1003 14:37:13.008661 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f" containerName="pull"
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.008668 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f" containerName="pull"
Oct 03 14:37:13 crc kubenswrapper[4962]: E1003 14:37:13.008693 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f3f2a62-c0d7-450c-85d8-afd52aecc899" containerName="extract-utilities"
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.008699 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f3f2a62-c0d7-450c-85d8-afd52aecc899" containerName="extract-utilities"
Oct 03 14:37:13 crc kubenswrapper[4962]: E1003 14:37:13.008706 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f" containerName="util"
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.008712 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f" containerName="util"
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.008890 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f3f2a62-c0d7-450c-85d8-afd52aecc899" containerName="registry-server"
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.008911 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f" containerName="extract"
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.009577 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-f4nnk"
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.014182 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt"
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.014267 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt"
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.019449 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-f4nnk"]
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.020826 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-pgbch"
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.072560 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkgfc\" (UniqueName: \"kubernetes.io/projected/266013b3-c058-4cc7-aa50-b02f13810284-kube-api-access-hkgfc\") pod \"obo-prometheus-operator-7c8cf85677-f4nnk\" (UID: \"266013b3-c058-4cc7-aa50-b02f13810284\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-f4nnk"
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.135964 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5996f7f648-wslh7"]
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.137559 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5996f7f648-wslh7"
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.139208 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert"
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.150014 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5996f7f648-p7khz"]
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.152026 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5996f7f648-p7khz"
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.155778 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-n7trs"
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.164762 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5996f7f648-wslh7"]
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.175600 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1985af21-3784-432c-9a82-0a71b8c5f830-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5996f7f648-wslh7\" (UID: \"1985af21-3784-432c-9a82-0a71b8c5f830\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5996f7f648-wslh7"
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.175749 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7fa44c96-655f-4152-8681-0c8090139b68-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5996f7f648-p7khz\" (UID: \"7fa44c96-655f-4152-8681-0c8090139b68\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5996f7f648-p7khz"
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.175828 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7fa44c96-655f-4152-8681-0c8090139b68-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5996f7f648-p7khz\" (UID: \"7fa44c96-655f-4152-8681-0c8090139b68\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5996f7f648-p7khz"
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.175980 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1985af21-3784-432c-9a82-0a71b8c5f830-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5996f7f648-wslh7\" (UID: \"1985af21-3784-432c-9a82-0a71b8c5f830\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5996f7f648-wslh7"
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.176035 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkgfc\" (UniqueName: \"kubernetes.io/projected/266013b3-c058-4cc7-aa50-b02f13810284-kube-api-access-hkgfc\") pod \"obo-prometheus-operator-7c8cf85677-f4nnk\" (UID: \"266013b3-c058-4cc7-aa50-b02f13810284\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-f4nnk"
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.184844 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5996f7f648-p7khz"]
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.211734 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkgfc\" (UniqueName: \"kubernetes.io/projected/266013b3-c058-4cc7-aa50-b02f13810284-kube-api-access-hkgfc\") pod \"obo-prometheus-operator-7c8cf85677-f4nnk\" (UID: \"266013b3-c058-4cc7-aa50-b02f13810284\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-f4nnk"
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.278905 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1985af21-3784-432c-9a82-0a71b8c5f830-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5996f7f648-wslh7\" (UID: \"1985af21-3784-432c-9a82-0a71b8c5f830\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5996f7f648-wslh7"
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.279410 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1985af21-3784-432c-9a82-0a71b8c5f830-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5996f7f648-wslh7\" (UID: \"1985af21-3784-432c-9a82-0a71b8c5f830\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5996f7f648-wslh7"
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.279536 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7fa44c96-655f-4152-8681-0c8090139b68-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5996f7f648-p7khz\" (UID: \"7fa44c96-655f-4152-8681-0c8090139b68\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5996f7f648-p7khz"
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.279692 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7fa44c96-655f-4152-8681-0c8090139b68-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5996f7f648-p7khz\" (UID: \"7fa44c96-655f-4152-8681-0c8090139b68\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5996f7f648-p7khz"
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.293124 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1985af21-3784-432c-9a82-0a71b8c5f830-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5996f7f648-wslh7\" (UID: \"1985af21-3784-432c-9a82-0a71b8c5f830\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5996f7f648-wslh7"
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.294551 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1985af21-3784-432c-9a82-0a71b8c5f830-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5996f7f648-wslh7\" (UID: \"1985af21-3784-432c-9a82-0a71b8c5f830\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5996f7f648-wslh7"
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.296816 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7fa44c96-655f-4152-8681-0c8090139b68-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5996f7f648-p7khz\" (UID: \"7fa44c96-655f-4152-8681-0c8090139b68\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5996f7f648-p7khz"
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.306729 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7fa44c96-655f-4152-8681-0c8090139b68-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5996f7f648-p7khz\" (UID: \"7fa44c96-655f-4152-8681-0c8090139b68\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5996f7f648-p7khz"
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.330816 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-f4nnk"
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.348802 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-kmgdq"]
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.351001 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-kmgdq"
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.359442 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls"
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.360116 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-nrpxs"
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.367072 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-kmgdq"]
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.382862 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g4ff\" (UniqueName: \"kubernetes.io/projected/f5832b59-1c49-493e-a996-b83f81d1e279-kube-api-access-8g4ff\") pod \"observability-operator-cc5f78dfc-kmgdq\" (UID: \"f5832b59-1c49-493e-a996-b83f81d1e279\") " pod="openshift-operators/observability-operator-cc5f78dfc-kmgdq"
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.382958 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/f5832b59-1c49-493e-a996-b83f81d1e279-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-kmgdq\" (UID: \"f5832b59-1c49-493e-a996-b83f81d1e279\") " pod="openshift-operators/observability-operator-cc5f78dfc-kmgdq"
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.461810 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5996f7f648-wslh7"
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.479948 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5996f7f648-p7khz"
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.498695 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g4ff\" (UniqueName: \"kubernetes.io/projected/f5832b59-1c49-493e-a996-b83f81d1e279-kube-api-access-8g4ff\") pod \"observability-operator-cc5f78dfc-kmgdq\" (UID: \"f5832b59-1c49-493e-a996-b83f81d1e279\") " pod="openshift-operators/observability-operator-cc5f78dfc-kmgdq"
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.499816 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/f5832b59-1c49-493e-a996-b83f81d1e279-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-kmgdq\" (UID: \"f5832b59-1c49-493e-a996-b83f81d1e279\") " pod="openshift-operators/observability-operator-cc5f78dfc-kmgdq"
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.531859 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g4ff\" (UniqueName: \"kubernetes.io/projected/f5832b59-1c49-493e-a996-b83f81d1e279-kube-api-access-8g4ff\") pod \"observability-operator-cc5f78dfc-kmgdq\" (UID: \"f5832b59-1c49-493e-a996-b83f81d1e279\") " pod="openshift-operators/observability-operator-cc5f78dfc-kmgdq"
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.533488 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/f5832b59-1c49-493e-a996-b83f81d1e279-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-kmgdq\" (UID: \"f5832b59-1c49-493e-a996-b83f81d1e279\") " pod="openshift-operators/observability-operator-cc5f78dfc-kmgdq"
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.661596 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-6zl9z"]
Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.669481 4962 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-6zl9z" Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.674767 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-74dkl" Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.678928 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-6zl9z"] Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.714043 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/02a8ad14-f79e-44dc-8c54-31e3702b287c-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-6zl9z\" (UID: \"02a8ad14-f79e-44dc-8c54-31e3702b287c\") " pod="openshift-operators/perses-operator-54bc95c9fb-6zl9z" Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.714167 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g2ql\" (UniqueName: \"kubernetes.io/projected/02a8ad14-f79e-44dc-8c54-31e3702b287c-kube-api-access-7g2ql\") pod \"perses-operator-54bc95c9fb-6zl9z\" (UID: \"02a8ad14-f79e-44dc-8c54-31e3702b287c\") " pod="openshift-operators/perses-operator-54bc95c9fb-6zl9z" Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.732330 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-kmgdq" Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.824973 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7g2ql\" (UniqueName: \"kubernetes.io/projected/02a8ad14-f79e-44dc-8c54-31e3702b287c-kube-api-access-7g2ql\") pod \"perses-operator-54bc95c9fb-6zl9z\" (UID: \"02a8ad14-f79e-44dc-8c54-31e3702b287c\") " pod="openshift-operators/perses-operator-54bc95c9fb-6zl9z" Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.825114 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/02a8ad14-f79e-44dc-8c54-31e3702b287c-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-6zl9z\" (UID: \"02a8ad14-f79e-44dc-8c54-31e3702b287c\") " pod="openshift-operators/perses-operator-54bc95c9fb-6zl9z" Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.826608 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/02a8ad14-f79e-44dc-8c54-31e3702b287c-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-6zl9z\" (UID: \"02a8ad14-f79e-44dc-8c54-31e3702b287c\") " pod="openshift-operators/perses-operator-54bc95c9fb-6zl9z" Oct 03 14:37:13 crc kubenswrapper[4962]: I1003 14:37:13.851223 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g2ql\" (UniqueName: \"kubernetes.io/projected/02a8ad14-f79e-44dc-8c54-31e3702b287c-kube-api-access-7g2ql\") pod \"perses-operator-54bc95c9fb-6zl9z\" (UID: \"02a8ad14-f79e-44dc-8c54-31e3702b287c\") " pod="openshift-operators/perses-operator-54bc95c9fb-6zl9z" Oct 03 14:37:14 crc kubenswrapper[4962]: I1003 14:37:14.018408 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-6zl9z" Oct 03 14:37:14 crc kubenswrapper[4962]: I1003 14:37:14.036457 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-f4nnk"] Oct 03 14:37:14 crc kubenswrapper[4962]: I1003 14:37:14.180285 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5996f7f648-wslh7"] Oct 03 14:37:14 crc kubenswrapper[4962]: I1003 14:37:14.208160 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5996f7f648-p7khz"] Oct 03 14:37:14 crc kubenswrapper[4962]: W1003 14:37:14.219061 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fa44c96_655f_4152_8681_0c8090139b68.slice/crio-2976747075bfd287122376d9928f7713a6338318bfb13c2f5e173d51909c2c0e WatchSource:0}: Error finding container 2976747075bfd287122376d9928f7713a6338318bfb13c2f5e173d51909c2c0e: Status 404 returned error can't find the container with id 2976747075bfd287122376d9928f7713a6338318bfb13c2f5e173d51909c2c0e Oct 03 14:37:14 crc kubenswrapper[4962]: I1003 14:37:14.299587 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5996f7f648-wslh7" event={"ID":"1985af21-3784-432c-9a82-0a71b8c5f830","Type":"ContainerStarted","Data":"0ba5b9c2907919af175e96e61f442459d9809eec6f208385815851739b71bfdc"} Oct 03 14:37:14 crc kubenswrapper[4962]: I1003 14:37:14.301738 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5996f7f648-p7khz" event={"ID":"7fa44c96-655f-4152-8681-0c8090139b68","Type":"ContainerStarted","Data":"2976747075bfd287122376d9928f7713a6338318bfb13c2f5e173d51909c2c0e"} Oct 03 14:37:14 crc kubenswrapper[4962]: I1003 14:37:14.304040 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-f4nnk" event={"ID":"266013b3-c058-4cc7-aa50-b02f13810284","Type":"ContainerStarted","Data":"00976bcf2314f5fcaecce0dfcab9418151aa46a28a83beb5242778cf5bca330e"} Oct 03 14:37:14 crc kubenswrapper[4962]: I1003 14:37:14.340271 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-kmgdq"] Oct 03 14:37:14 crc kubenswrapper[4962]: W1003 14:37:14.343798 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5832b59_1c49_493e_a996_b83f81d1e279.slice/crio-0477133eb4f269cf8763644cc613bad152950d5c2ae30bcccf8d118990f70573 WatchSource:0}: Error finding container 0477133eb4f269cf8763644cc613bad152950d5c2ae30bcccf8d118990f70573: Status 404 returned error can't find the container with id 0477133eb4f269cf8763644cc613bad152950d5c2ae30bcccf8d118990f70573 Oct 03 14:37:14 crc kubenswrapper[4962]: I1003 14:37:14.527695 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-6zl9z"] Oct 03 14:37:14 crc kubenswrapper[4962]: W1003 14:37:14.532671 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02a8ad14_f79e_44dc_8c54_31e3702b287c.slice/crio-eca481c0b57a6e69869736f5ebcd808a257493e4f16a65a8343a0d9fac76dd04 WatchSource:0}: Error finding container 
eca481c0b57a6e69869736f5ebcd808a257493e4f16a65a8343a0d9fac76dd04: Status 404 returned error can't find the container with id eca481c0b57a6e69869736f5ebcd808a257493e4f16a65a8343a0d9fac76dd04 Oct 03 14:37:15 crc kubenswrapper[4962]: I1003 14:37:15.324624 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-6zl9z" event={"ID":"02a8ad14-f79e-44dc-8c54-31e3702b287c","Type":"ContainerStarted","Data":"eca481c0b57a6e69869736f5ebcd808a257493e4f16a65a8343a0d9fac76dd04"} Oct 03 14:37:15 crc kubenswrapper[4962]: I1003 14:37:15.330070 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-kmgdq" event={"ID":"f5832b59-1c49-493e-a996-b83f81d1e279","Type":"ContainerStarted","Data":"0477133eb4f269cf8763644cc613bad152950d5c2ae30bcccf8d118990f70573"} Oct 03 14:37:16 crc kubenswrapper[4962]: I1003 14:37:16.087082 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9d2p8"] Oct 03 14:37:16 crc kubenswrapper[4962]: I1003 14:37:16.105420 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9d2p8"] Oct 03 14:37:16 crc kubenswrapper[4962]: I1003 14:37:16.256055 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eccc4804-3bd4-428b-9ff6-cd364d7f61b9" path="/var/lib/kubelet/pods/eccc4804-3bd4-428b-9ff6-cd364d7f61b9/volumes" Oct 03 14:37:17 crc kubenswrapper[4962]: I1003 14:37:17.047684 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-bvvxc"] Oct 03 14:37:17 crc kubenswrapper[4962]: I1003 14:37:17.057771 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-bvvxc"] Oct 03 14:37:18 crc kubenswrapper[4962]: I1003 14:37:18.242119 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8608a2df-334e-4b2c-a93d-05276e2afe0f" path="/var/lib/kubelet/pods/8608a2df-334e-4b2c-a93d-05276e2afe0f/volumes" Oct 03 14:37:24 crc kubenswrapper[4962]: I1003 14:37:24.661342 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:37:24 crc kubenswrapper[4962]: I1003 14:37:24.662033 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:37:25 crc kubenswrapper[4962]: I1003 14:37:25.472011 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-kmgdq" event={"ID":"f5832b59-1c49-493e-a996-b83f81d1e279","Type":"ContainerStarted","Data":"7751dd179e11237ad8db773039574ee78bd380a034b4c5230afd20572e777f31"} Oct 03 14:37:25 crc kubenswrapper[4962]: I1003 14:37:25.475077 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-cc5f78dfc-kmgdq" Oct 03 14:37:25 crc kubenswrapper[4962]: I1003 14:37:25.475673 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-cc5f78dfc-kmgdq" Oct 03 14:37:25 crc kubenswrapper[4962]: I1003 
14:37:25.476984 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-f4nnk" event={"ID":"266013b3-c058-4cc7-aa50-b02f13810284","Type":"ContainerStarted","Data":"99c0b2c7ba15d12ea6b332106b95c769d3f8b7d307bf7fe0b0d1ec29682012ab"} Oct 03 14:37:25 crc kubenswrapper[4962]: I1003 14:37:25.478561 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-6zl9z" event={"ID":"02a8ad14-f79e-44dc-8c54-31e3702b287c","Type":"ContainerStarted","Data":"1e1dcb23223888edecb6af1e407b4e25dea423b77469ac8c23ee5dea9c24f7ac"} Oct 03 14:37:25 crc kubenswrapper[4962]: I1003 14:37:25.480368 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5996f7f648-wslh7" event={"ID":"1985af21-3784-432c-9a82-0a71b8c5f830","Type":"ContainerStarted","Data":"164e3a04e8f1d3cbc26646d5201427305ef8caf5c645381848cbcc16f6195bd4"} Oct 03 14:37:25 crc kubenswrapper[4962]: I1003 14:37:25.482047 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5996f7f648-p7khz" event={"ID":"7fa44c96-655f-4152-8681-0c8090139b68","Type":"ContainerStarted","Data":"9bc6d476910506fdb7b5ade4d83b482545b7060df4c3e07d287e12a90f569302"} Oct 03 14:37:25 crc kubenswrapper[4962]: I1003 14:37:25.502989 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-cc5f78dfc-kmgdq" podStartSLOduration=2.11359154 podStartE2EDuration="12.502964407s" podCreationTimestamp="2025-10-03 14:37:13 +0000 UTC" firstStartedPulling="2025-10-03 14:37:14.356770299 +0000 UTC m=+6442.760668124" lastFinishedPulling="2025-10-03 14:37:24.746143156 +0000 UTC m=+6453.150040991" observedRunningTime="2025-10-03 14:37:25.496814993 +0000 UTC m=+6453.900712848" watchObservedRunningTime="2025-10-03 14:37:25.502964407 +0000 UTC m=+6453.906862242" Oct 03 14:37:25 crc kubenswrapper[4962]: I1003 14:37:25.532691 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-54bc95c9fb-6zl9z" podStartSLOduration=2.459670713 podStartE2EDuration="12.532611818s" podCreationTimestamp="2025-10-03 14:37:13 +0000 UTC" firstStartedPulling="2025-10-03 14:37:14.535490917 +0000 UTC m=+6442.939388752" lastFinishedPulling="2025-10-03 14:37:24.608432022 +0000 UTC m=+6453.012329857" observedRunningTime="2025-10-03 14:37:25.529213457 +0000 UTC m=+6453.933111292" watchObservedRunningTime="2025-10-03 14:37:25.532611818 +0000 UTC m=+6453.936509653" Oct 03 14:37:25 crc kubenswrapper[4962]: I1003 14:37:25.549367 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-f4nnk" podStartSLOduration=3.005607098 podStartE2EDuration="13.549350764s" podCreationTimestamp="2025-10-03 14:37:12 +0000 UTC" firstStartedPulling="2025-10-03 14:37:14.058665746 +0000 UTC m=+6442.462563581" lastFinishedPulling="2025-10-03 14:37:24.602409392 +0000 UTC m=+6453.006307247" observedRunningTime="2025-10-03 14:37:25.54807854 +0000 UTC m=+6453.951976395" watchObservedRunningTime="2025-10-03 14:37:25.549350764 +0000 UTC m=+6453.953248599" Oct 03 14:37:25 crc kubenswrapper[4962]: I1003 14:37:25.621820 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5996f7f648-p7khz" podStartSLOduration=2.243045093 podStartE2EDuration="12.621801627s" 
podCreationTimestamp="2025-10-03 14:37:13 +0000 UTC" firstStartedPulling="2025-10-03 14:37:14.223051402 +0000 UTC m=+6442.626949237" lastFinishedPulling="2025-10-03 14:37:24.601807946 +0000 UTC m=+6453.005705771" observedRunningTime="2025-10-03 14:37:25.617544504 +0000 UTC m=+6454.021442359" watchObservedRunningTime="2025-10-03 14:37:25.621801627 +0000 UTC m=+6454.025699462" Oct 03 14:37:25 crc kubenswrapper[4962]: I1003 14:37:25.660218 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5996f7f648-wslh7" podStartSLOduration=2.264533715 podStartE2EDuration="12.660195231s" podCreationTimestamp="2025-10-03 14:37:13 +0000 UTC" firstStartedPulling="2025-10-03 14:37:14.205354809 +0000 UTC m=+6442.609252644" lastFinishedPulling="2025-10-03 14:37:24.601016315 +0000 UTC m=+6453.004914160" observedRunningTime="2025-10-03 14:37:25.643343482 +0000 UTC m=+6454.047241327" watchObservedRunningTime="2025-10-03 14:37:25.660195231 +0000 UTC m=+6454.064093066" Oct 03 14:37:26 crc kubenswrapper[4962]: I1003 14:37:26.491584 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-54bc95c9fb-6zl9z" Oct 03 14:37:30 crc kubenswrapper[4962]: I1003 14:37:30.027219 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-lnpxv"] Oct 03 14:37:30 crc kubenswrapper[4962]: I1003 14:37:30.036980 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-lnpxv"] Oct 03 14:37:30 crc kubenswrapper[4962]: I1003 14:37:30.238914 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c1b2832-4711-4253-98c5-f8b543b55c80" path="/var/lib/kubelet/pods/0c1b2832-4711-4253-98c5-f8b543b55c80/volumes" Oct 03 14:37:34 crc kubenswrapper[4962]: I1003 14:37:34.023370 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-54bc95c9fb-6zl9z" Oct 03 14:37:36 crc kubenswrapper[4962]: I1003 14:37:36.425560 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 03 14:37:36 crc kubenswrapper[4962]: I1003 14:37:36.426265 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="c583a81c-2241-4f3f-a190-2b90cff0b4db" containerName="openstackclient" containerID="cri-o://71817b2098a855f7b56344bd1399519c0ace5645fa5371c1eda128acb86a9434" gracePeriod=2 Oct 03 14:37:36 crc kubenswrapper[4962]: I1003 14:37:36.441607 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 03 14:37:36 crc kubenswrapper[4962]: I1003 14:37:36.463763 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 03 14:37:36 crc kubenswrapper[4962]: E1003 14:37:36.464358 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c583a81c-2241-4f3f-a190-2b90cff0b4db" containerName="openstackclient" Oct 03 14:37:36 crc kubenswrapper[4962]: I1003 14:37:36.464377 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="c583a81c-2241-4f3f-a190-2b90cff0b4db" containerName="openstackclient" Oct 03 14:37:36 crc kubenswrapper[4962]: I1003 14:37:36.464580 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="c583a81c-2241-4f3f-a190-2b90cff0b4db" containerName="openstackclient" Oct 03 14:37:36 crc kubenswrapper[4962]: I1003 14:37:36.465473 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 03 14:37:36 crc kubenswrapper[4962]: I1003 14:37:36.473576 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 03 14:37:36 crc kubenswrapper[4962]: I1003 14:37:36.492992 4962 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="c583a81c-2241-4f3f-a190-2b90cff0b4db" podUID="4f6ca686-77e2-4a7b-a8c7-bdc02f5f8812" Oct 03 14:37:36 crc kubenswrapper[4962]: I1003 14:37:36.614009 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4f6ca686-77e2-4a7b-a8c7-bdc02f5f8812-openstack-config\") pod \"openstackclient\" (UID: \"4f6ca686-77e2-4a7b-a8c7-bdc02f5f8812\") " pod="openstack/openstackclient" Oct 03 14:37:36 crc kubenswrapper[4962]: I1003 14:37:36.614165 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggwrm\" (UniqueName: \"kubernetes.io/projected/4f6ca686-77e2-4a7b-a8c7-bdc02f5f8812-kube-api-access-ggwrm\") pod \"openstackclient\" (UID: \"4f6ca686-77e2-4a7b-a8c7-bdc02f5f8812\") " pod="openstack/openstackclient" Oct 03 14:37:36 crc kubenswrapper[4962]: I1003 14:37:36.614536 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4f6ca686-77e2-4a7b-a8c7-bdc02f5f8812-openstack-config-secret\") pod \"openstackclient\" (UID: \"4f6ca686-77e2-4a7b-a8c7-bdc02f5f8812\") " pod="openstack/openstackclient" Oct 03 14:37:36 crc kubenswrapper[4962]: I1003 14:37:36.664481 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 14:37:36 crc kubenswrapper[4962]: I1003 14:37:36.666124 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 14:37:36 crc kubenswrapper[4962]: I1003 14:37:36.672157 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-dwrk8" Oct 03 14:37:36 crc kubenswrapper[4962]: I1003 14:37:36.716659 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4f6ca686-77e2-4a7b-a8c7-bdc02f5f8812-openstack-config\") pod \"openstackclient\" (UID: \"4f6ca686-77e2-4a7b-a8c7-bdc02f5f8812\") " pod="openstack/openstackclient" Oct 03 14:37:36 crc kubenswrapper[4962]: I1003 14:37:36.716763 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggwrm\" (UniqueName: \"kubernetes.io/projected/4f6ca686-77e2-4a7b-a8c7-bdc02f5f8812-kube-api-access-ggwrm\") pod \"openstackclient\" (UID: \"4f6ca686-77e2-4a7b-a8c7-bdc02f5f8812\") " pod="openstack/openstackclient" Oct 03 14:37:36 crc kubenswrapper[4962]: I1003 14:37:36.716874 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4f6ca686-77e2-4a7b-a8c7-bdc02f5f8812-openstack-config-secret\") pod \"openstackclient\" (UID: \"4f6ca686-77e2-4a7b-a8c7-bdc02f5f8812\") " pod="openstack/openstackclient" Oct 03 14:37:36 crc kubenswrapper[4962]: I1003 14:37:36.718113 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4f6ca686-77e2-4a7b-a8c7-bdc02f5f8812-openstack-config\") pod \"openstackclient\" (UID: \"4f6ca686-77e2-4a7b-a8c7-bdc02f5f8812\") " pod="openstack/openstackclient" Oct 03 14:37:36 crc kubenswrapper[4962]: I1003 14:37:36.725281 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4f6ca686-77e2-4a7b-a8c7-bdc02f5f8812-openstack-config-secret\") pod \"openstackclient\" (UID: \"4f6ca686-77e2-4a7b-a8c7-bdc02f5f8812\") " pod="openstack/openstackclient" Oct 03 14:37:36 crc kubenswrapper[4962]: I1003 14:37:36.742415 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggwrm\" (UniqueName: \"kubernetes.io/projected/4f6ca686-77e2-4a7b-a8c7-bdc02f5f8812-kube-api-access-ggwrm\") pod \"openstackclient\" (UID: \"4f6ca686-77e2-4a7b-a8c7-bdc02f5f8812\") " pod="openstack/openstackclient" Oct 03 14:37:36 crc kubenswrapper[4962]: I1003 14:37:36.794759 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 14:37:36 crc kubenswrapper[4962]: I1003 14:37:36.799180 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 03 14:37:36 crc kubenswrapper[4962]: I1003 14:37:36.822845 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whlb5\" (UniqueName: \"kubernetes.io/projected/0444f80d-1176-4ee9-963d-b3d29af93dea-kube-api-access-whlb5\") pod \"kube-state-metrics-0\" (UID: \"0444f80d-1176-4ee9-963d-b3d29af93dea\") " pod="openstack/kube-state-metrics-0" Oct 03 14:37:36 crc kubenswrapper[4962]: I1003 14:37:36.927876 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whlb5\" (UniqueName: \"kubernetes.io/projected/0444f80d-1176-4ee9-963d-b3d29af93dea-kube-api-access-whlb5\") pod \"kube-state-metrics-0\" (UID: \"0444f80d-1176-4ee9-963d-b3d29af93dea\") " pod="openstack/kube-state-metrics-0" Oct 03 14:37:36 crc kubenswrapper[4962]: I1003 14:37:36.954544 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whlb5\" (UniqueName: \"kubernetes.io/projected/0444f80d-1176-4ee9-963d-b3d29af93dea-kube-api-access-whlb5\") pod \"kube-state-metrics-0\" (UID: \"0444f80d-1176-4ee9-963d-b3d29af93dea\") " pod="openstack/kube-state-metrics-0" Oct 03 14:37:36 crc kubenswrapper[4962]: I1003 14:37:36.988131 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 14:37:37 crc kubenswrapper[4962]: I1003 14:37:37.688025 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 03 14:37:37 crc kubenswrapper[4962]: I1003 14:37:37.926777 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 14:37:37 crc kubenswrapper[4962]: I1003 14:37:37.973587 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Oct 03 14:37:37 crc kubenswrapper[4962]: I1003 14:37:37.990120 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Oct 03 14:37:37 crc kubenswrapper[4962]: I1003 14:37:37.994571 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-wfd29" Oct 03 14:37:37 crc kubenswrapper[4962]: I1003 14:37:37.994758 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Oct 03 14:37:37 crc kubenswrapper[4962]: I1003 14:37:37.994786 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Oct 03 14:37:37 crc kubenswrapper[4962]: I1003 14:37:37.994903 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.003374 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.084860 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/147c89e8-3065-48f5-ae75-cb029cd4f447-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"147c89e8-3065-48f5-ae75-cb029cd4f447\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.084916 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/147c89e8-3065-48f5-ae75-cb029cd4f447-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"147c89e8-3065-48f5-ae75-cb029cd4f447\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.084940 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/147c89e8-3065-48f5-ae75-cb029cd4f447-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"147c89e8-3065-48f5-ae75-cb029cd4f447\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.084976 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmwtx\" (UniqueName: \"kubernetes.io/projected/147c89e8-3065-48f5-ae75-cb029cd4f447-kube-api-access-hmwtx\") pod \"alertmanager-metric-storage-0\" (UID: \"147c89e8-3065-48f5-ae75-cb029cd4f447\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.085023 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/147c89e8-3065-48f5-ae75-cb029cd4f447-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"147c89e8-3065-48f5-ae75-cb029cd4f447\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.085037 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/147c89e8-3065-48f5-ae75-cb029cd4f447-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"147c89e8-3065-48f5-ae75-cb029cd4f447\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.186461 4962 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/147c89e8-3065-48f5-ae75-cb029cd4f447-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"147c89e8-3065-48f5-ae75-cb029cd4f447\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.186531 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/147c89e8-3065-48f5-ae75-cb029cd4f447-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"147c89e8-3065-48f5-ae75-cb029cd4f447\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.186572 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/147c89e8-3065-48f5-ae75-cb029cd4f447-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"147c89e8-3065-48f5-ae75-cb029cd4f447\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.187250 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmwtx\" (UniqueName: \"kubernetes.io/projected/147c89e8-3065-48f5-ae75-cb029cd4f447-kube-api-access-hmwtx\") pod \"alertmanager-metric-storage-0\" (UID: \"147c89e8-3065-48f5-ae75-cb029cd4f447\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.187362 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/147c89e8-3065-48f5-ae75-cb029cd4f447-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"147c89e8-3065-48f5-ae75-cb029cd4f447\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.187404 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/147c89e8-3065-48f5-ae75-cb029cd4f447-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"147c89e8-3065-48f5-ae75-cb029cd4f447\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.189115 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/147c89e8-3065-48f5-ae75-cb029cd4f447-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"147c89e8-3065-48f5-ae75-cb029cd4f447\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.198313 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/147c89e8-3065-48f5-ae75-cb029cd4f447-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"147c89e8-3065-48f5-ae75-cb029cd4f447\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.199089 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/147c89e8-3065-48f5-ae75-cb029cd4f447-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"147c89e8-3065-48f5-ae75-cb029cd4f447\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.207500 4962 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/147c89e8-3065-48f5-ae75-cb029cd4f447-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"147c89e8-3065-48f5-ae75-cb029cd4f447\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.247501 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmwtx\" (UniqueName: \"kubernetes.io/projected/147c89e8-3065-48f5-ae75-cb029cd4f447-kube-api-access-hmwtx\") pod \"alertmanager-metric-storage-0\" (UID: \"147c89e8-3065-48f5-ae75-cb029cd4f447\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.249970 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/147c89e8-3065-48f5-ae75-cb029cd4f447-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"147c89e8-3065-48f5-ae75-cb029cd4f447\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.362908 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.618965 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0444f80d-1176-4ee9-963d-b3d29af93dea","Type":"ContainerStarted","Data":"b67a3d17a1fa324fb9b54eb7c598afd0f710725134f11e60954109bd5b37dabf"} Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.623433 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"4f6ca686-77e2-4a7b-a8c7-bdc02f5f8812","Type":"ContainerStarted","Data":"d1f3671c73d79f491d5af7d166249522fcc2f2ddbf95a8771b60ab9b6a2029d5"} Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.623587 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"4f6ca686-77e2-4a7b-a8c7-bdc02f5f8812","Type":"ContainerStarted","Data":"d78f872b067fe688e37f90f613fcfbec2072ee8f06c2172562315aedef10d90a"} Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.625883 4962 generic.go:334] "Generic (PLEG): container finished" podID="c583a81c-2241-4f3f-a190-2b90cff0b4db" containerID="71817b2098a855f7b56344bd1399519c0ace5645fa5371c1eda128acb86a9434" exitCode=137 Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.681747 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.684567 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.691394 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.692992 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.693469 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.693544 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-2bhxp" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.693656 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.693704 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.715921 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.71590141 podStartE2EDuration="2.71590141s" podCreationTimestamp="2025-10-03 14:37:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:37:38.662169967 +0000 UTC m=+6467.066067792" watchObservedRunningTime="2025-10-03 14:37:38.71590141 +0000 UTC m=+6467.119799245" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.747134 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/44378bf5-5a0a-4e9f-98c9-7bd8b42387da-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"44378bf5-5a0a-4e9f-98c9-7bd8b42387da\") " pod="openstack/prometheus-metric-storage-0" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.748128 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/44378bf5-5a0a-4e9f-98c9-7bd8b42387da-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"44378bf5-5a0a-4e9f-98c9-7bd8b42387da\") " pod="openstack/prometheus-metric-storage-0" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.748453 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/44378bf5-5a0a-4e9f-98c9-7bd8b42387da-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"44378bf5-5a0a-4e9f-98c9-7bd8b42387da\") " pod="openstack/prometheus-metric-storage-0" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.760086 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbc47\" (UniqueName: \"kubernetes.io/projected/44378bf5-5a0a-4e9f-98c9-7bd8b42387da-kube-api-access-tbc47\") pod \"prometheus-metric-storage-0\" (UID: \"44378bf5-5a0a-4e9f-98c9-7bd8b42387da\") " pod="openstack/prometheus-metric-storage-0" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.760221 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"pvc-31eb030d-7561-48f8-975b-31ea0739af63\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-31eb030d-7561-48f8-975b-31ea0739af63\") pod \"prometheus-metric-storage-0\" (UID: \"44378bf5-5a0a-4e9f-98c9-7bd8b42387da\") " pod="openstack/prometheus-metric-storage-0" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.760336 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/44378bf5-5a0a-4e9f-98c9-7bd8b42387da-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"44378bf5-5a0a-4e9f-98c9-7bd8b42387da\") " pod="openstack/prometheus-metric-storage-0" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.760424 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/44378bf5-5a0a-4e9f-98c9-7bd8b42387da-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"44378bf5-5a0a-4e9f-98c9-7bd8b42387da\") " pod="openstack/prometheus-metric-storage-0" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.760558 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/44378bf5-5a0a-4e9f-98c9-7bd8b42387da-config\") pod \"prometheus-metric-storage-0\" (UID: \"44378bf5-5a0a-4e9f-98c9-7bd8b42387da\") " pod="openstack/prometheus-metric-storage-0" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.755744 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.863540 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/44378bf5-5a0a-4e9f-98c9-7bd8b42387da-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"44378bf5-5a0a-4e9f-98c9-7bd8b42387da\") " pod="openstack/prometheus-metric-storage-0" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.864940 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/44378bf5-5a0a-4e9f-98c9-7bd8b42387da-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"44378bf5-5a0a-4e9f-98c9-7bd8b42387da\") " pod="openstack/prometheus-metric-storage-0" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.865022 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/44378bf5-5a0a-4e9f-98c9-7bd8b42387da-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"44378bf5-5a0a-4e9f-98c9-7bd8b42387da\") " pod="openstack/prometheus-metric-storage-0" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.865080 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbc47\" (UniqueName: \"kubernetes.io/projected/44378bf5-5a0a-4e9f-98c9-7bd8b42387da-kube-api-access-tbc47\") pod \"prometheus-metric-storage-0\" (UID: \"44378bf5-5a0a-4e9f-98c9-7bd8b42387da\") " pod="openstack/prometheus-metric-storage-0" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.865131 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-31eb030d-7561-48f8-975b-31ea0739af63\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-31eb030d-7561-48f8-975b-31ea0739af63\") pod \"prometheus-metric-storage-0\" (UID: \"44378bf5-5a0a-4e9f-98c9-7bd8b42387da\") " pod="openstack/prometheus-metric-storage-0" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.865167 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/44378bf5-5a0a-4e9f-98c9-7bd8b42387da-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"44378bf5-5a0a-4e9f-98c9-7bd8b42387da\") " pod="openstack/prometheus-metric-storage-0" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.865207 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/44378bf5-5a0a-4e9f-98c9-7bd8b42387da-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"44378bf5-5a0a-4e9f-98c9-7bd8b42387da\") " pod="openstack/prometheus-metric-storage-0" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.865284 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/44378bf5-5a0a-4e9f-98c9-7bd8b42387da-config\") pod \"prometheus-metric-storage-0\" (UID: \"44378bf5-5a0a-4e9f-98c9-7bd8b42387da\") " pod="openstack/prometheus-metric-storage-0" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.865472 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/44378bf5-5a0a-4e9f-98c9-7bd8b42387da-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"44378bf5-5a0a-4e9f-98c9-7bd8b42387da\") " pod="openstack/prometheus-metric-storage-0" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.874545 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/44378bf5-5a0a-4e9f-98c9-7bd8b42387da-config\") pod \"prometheus-metric-storage-0\" (UID: \"44378bf5-5a0a-4e9f-98c9-7bd8b42387da\") " pod="openstack/prometheus-metric-storage-0" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.877463 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/44378bf5-5a0a-4e9f-98c9-7bd8b42387da-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"44378bf5-5a0a-4e9f-98c9-7bd8b42387da\") " pod="openstack/prometheus-metric-storage-0" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.881002 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/44378bf5-5a0a-4e9f-98c9-7bd8b42387da-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"44378bf5-5a0a-4e9f-98c9-7bd8b42387da\") " pod="openstack/prometheus-metric-storage-0" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.881263 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/44378bf5-5a0a-4e9f-98c9-7bd8b42387da-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"44378bf5-5a0a-4e9f-98c9-7bd8b42387da\") " pod="openstack/prometheus-metric-storage-0" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.885921 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/44378bf5-5a0a-4e9f-98c9-7bd8b42387da-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"44378bf5-5a0a-4e9f-98c9-7bd8b42387da\") " pod="openstack/prometheus-metric-storage-0" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.965768 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbc47\" (UniqueName: \"kubernetes.io/projected/44378bf5-5a0a-4e9f-98c9-7bd8b42387da-kube-api-access-tbc47\") pod \"prometheus-metric-storage-0\" (UID: \"44378bf5-5a0a-4e9f-98c9-7bd8b42387da\") " pod="openstack/prometheus-metric-storage-0" Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.967558 4962 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 03 14:37:38 crc kubenswrapper[4962]: I1003 14:37:38.967615 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-31eb030d-7561-48f8-975b-31ea0739af63\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-31eb030d-7561-48f8-975b-31ea0739af63\") pod \"prometheus-metric-storage-0\" (UID: \"44378bf5-5a0a-4e9f-98c9-7bd8b42387da\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2a235d15215619f9311a7e7e96882dbf19de2c3b96c19c1665a53cada3f5e6ea/globalmount\"" pod="openstack/prometheus-metric-storage-0" Oct 03 14:37:39 crc kubenswrapper[4962]: I1003 14:37:39.097238 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Oct 03 14:37:39 crc kubenswrapper[4962]: I1003 14:37:39.274452 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-31eb030d-7561-48f8-975b-31ea0739af63\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-31eb030d-7561-48f8-975b-31ea0739af63\") pod \"prometheus-metric-storage-0\" (UID: \"44378bf5-5a0a-4e9f-98c9-7bd8b42387da\") " pod="openstack/prometheus-metric-storage-0" Oct 03 14:37:39 crc kubenswrapper[4962]: I1003 14:37:39.377379 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 03 14:37:39 crc kubenswrapper[4962]: I1003 14:37:39.378421 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c583a81c-2241-4f3f-a190-2b90cff0b4db-openstack-config-secret\") pod \"c583a81c-2241-4f3f-a190-2b90cff0b4db\" (UID: \"c583a81c-2241-4f3f-a190-2b90cff0b4db\") " Oct 03 14:37:39 crc kubenswrapper[4962]: I1003 14:37:39.378487 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c583a81c-2241-4f3f-a190-2b90cff0b4db-openstack-config\") pod \"c583a81c-2241-4f3f-a190-2b90cff0b4db\" (UID: \"c583a81c-2241-4f3f-a190-2b90cff0b4db\") " Oct 03 14:37:39 crc kubenswrapper[4962]: I1003 14:37:39.378733 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jpqk\" (UniqueName: \"kubernetes.io/projected/c583a81c-2241-4f3f-a190-2b90cff0b4db-kube-api-access-6jpqk\") pod \"c583a81c-2241-4f3f-a190-2b90cff0b4db\" (UID: \"c583a81c-2241-4f3f-a190-2b90cff0b4db\") " Oct 03 14:37:39 crc kubenswrapper[4962]: I1003 14:37:39.407415 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c583a81c-2241-4f3f-a190-2b90cff0b4db-kube-api-access-6jpqk" (OuterVolumeSpecName: "kube-api-access-6jpqk") pod "c583a81c-2241-4f3f-a190-2b90cff0b4db" (UID: "c583a81c-2241-4f3f-a190-2b90cff0b4db"). InnerVolumeSpecName "kube-api-access-6jpqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:37:39 crc kubenswrapper[4962]: I1003 14:37:39.414942 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 03 14:37:39 crc kubenswrapper[4962]: I1003 14:37:39.456345 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c583a81c-2241-4f3f-a190-2b90cff0b4db-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "c583a81c-2241-4f3f-a190-2b90cff0b4db" (UID: "c583a81c-2241-4f3f-a190-2b90cff0b4db"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:37:39 crc kubenswrapper[4962]: I1003 14:37:39.484486 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jpqk\" (UniqueName: \"kubernetes.io/projected/c583a81c-2241-4f3f-a190-2b90cff0b4db-kube-api-access-6jpqk\") on node \"crc\" DevicePath \"\"" Oct 03 14:37:39 crc kubenswrapper[4962]: I1003 14:37:39.484832 4962 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c583a81c-2241-4f3f-a190-2b90cff0b4db-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:37:39 crc kubenswrapper[4962]: I1003 14:37:39.561547 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c583a81c-2241-4f3f-a190-2b90cff0b4db-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "c583a81c-2241-4f3f-a190-2b90cff0b4db" (UID: "c583a81c-2241-4f3f-a190-2b90cff0b4db"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:37:39 crc kubenswrapper[4962]: I1003 14:37:39.586525 4962 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c583a81c-2241-4f3f-a190-2b90cff0b4db-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 03 14:37:39 crc kubenswrapper[4962]: I1003 14:37:39.640988 4962 scope.go:117] "RemoveContainer" containerID="71817b2098a855f7b56344bd1399519c0ace5645fa5371c1eda128acb86a9434" Oct 03 14:37:39 crc kubenswrapper[4962]: I1003 14:37:39.641202 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 03 14:37:39 crc kubenswrapper[4962]: I1003 14:37:39.661149 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0444f80d-1176-4ee9-963d-b3d29af93dea","Type":"ContainerStarted","Data":"c42fce36ab0730507f9d7f0b0d7e54cec1c519f2116b915897b202048dc20b78"} Oct 03 14:37:39 crc kubenswrapper[4962]: I1003 14:37:39.662386 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 03 14:37:39 crc kubenswrapper[4962]: I1003 14:37:39.672962 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"147c89e8-3065-48f5-ae75-cb029cd4f447","Type":"ContainerStarted","Data":"7895955e890993dce32ad218ee7d1f69aeed0a4e3d681a79660cc6ae18c262af"} Oct 03 14:37:39 crc kubenswrapper[4962]: I1003 14:37:39.704432 4962 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="c583a81c-2241-4f3f-a190-2b90cff0b4db" podUID="4f6ca686-77e2-4a7b-a8c7-bdc02f5f8812" Oct 03 14:37:39 crc kubenswrapper[4962]: I1003 14:37:39.719584 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.238614735 podStartE2EDuration="3.719564986s" podCreationTimestamp="2025-10-03 14:37:36 +0000 UTC" firstStartedPulling="2025-10-03 14:37:37.991634648 +0000 UTC m=+6466.395532483" lastFinishedPulling="2025-10-03 14:37:38.472584899 +0000 UTC m=+6466.876482734" observedRunningTime="2025-10-03 14:37:39.697033525 +0000 UTC m=+6468.100931360" watchObservedRunningTime="2025-10-03 14:37:39.719564986 +0000 UTC m=+6468.123462821" Oct 03 14:37:40 crc kubenswrapper[4962]: I1003 14:37:40.128809 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 14:37:40 crc kubenswrapper[4962]: I1003 14:37:40.247938 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c583a81c-2241-4f3f-a190-2b90cff0b4db" path="/var/lib/kubelet/pods/c583a81c-2241-4f3f-a190-2b90cff0b4db/volumes" Oct 03 14:37:40 crc kubenswrapper[4962]: I1003 14:37:40.680956 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"44378bf5-5a0a-4e9f-98c9-7bd8b42387da","Type":"ContainerStarted","Data":"8cb1f7e40922bb60be73e14a0c5a3b178ecde07fc83558469369d9f04616a7f4"} Oct 03 14:37:45 crc kubenswrapper[4962]: I1003 14:37:45.726992 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"44378bf5-5a0a-4e9f-98c9-7bd8b42387da","Type":"ContainerStarted","Data":"94ee6bfa2116a4517d6b8f9e4401a7aeb3c4ad9e15cfa8369f977873d36b71dd"} Oct 03 14:37:45 crc kubenswrapper[4962]: I1003 14:37:45.728716 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/alertmanager-metric-storage-0" event={"ID":"147c89e8-3065-48f5-ae75-cb029cd4f447","Type":"ContainerStarted","Data":"99450dbc0e4df6a5d2018c9eeaa5be960d086d241d0ace72d4c5633d745b3f46"} Oct 03 14:37:46 crc kubenswrapper[4962]: I1003 14:37:46.992870 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 03 14:37:51 crc kubenswrapper[4962]: I1003 14:37:51.782240 4962 generic.go:334] "Generic (PLEG): container finished" podID="44378bf5-5a0a-4e9f-98c9-7bd8b42387da" containerID="94ee6bfa2116a4517d6b8f9e4401a7aeb3c4ad9e15cfa8369f977873d36b71dd" exitCode=0 Oct 03 14:37:51 crc kubenswrapper[4962]: I1003 14:37:51.782332 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"44378bf5-5a0a-4e9f-98c9-7bd8b42387da","Type":"ContainerDied","Data":"94ee6bfa2116a4517d6b8f9e4401a7aeb3c4ad9e15cfa8369f977873d36b71dd"} Oct 03 14:37:52 crc kubenswrapper[4962]: I1003 14:37:52.794380 4962 generic.go:334] "Generic (PLEG): container finished" podID="147c89e8-3065-48f5-ae75-cb029cd4f447" containerID="99450dbc0e4df6a5d2018c9eeaa5be960d086d241d0ace72d4c5633d745b3f46" exitCode=0 Oct 03 14:37:52 crc kubenswrapper[4962]: I1003 14:37:52.794745 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"147c89e8-3065-48f5-ae75-cb029cd4f447","Type":"ContainerDied","Data":"99450dbc0e4df6a5d2018c9eeaa5be960d086d241d0ace72d4c5633d745b3f46"} Oct 03 14:37:54 crc kubenswrapper[4962]: I1003 14:37:54.659880 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:37:54 crc kubenswrapper[4962]: I1003 14:37:54.663051 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:37:54 crc kubenswrapper[4962]: I1003 14:37:54.663098 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-46vck" Oct 03 14:37:54 crc kubenswrapper[4962]: I1003 14:37:54.663947 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e273395719cf61a5af9b7bc53d31b2386f724aead6ba873c67b6fad6aa1c9256"} pod="openshift-machine-config-operator/machine-config-daemon-46vck" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 14:37:54 crc kubenswrapper[4962]: I1003 14:37:54.664013 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" containerID="cri-o://e273395719cf61a5af9b7bc53d31b2386f724aead6ba873c67b6fad6aa1c9256" gracePeriod=600 Oct 03 14:37:54 crc kubenswrapper[4962]: I1003 14:37:54.821174 4962 generic.go:334] "Generic (PLEG): container finished" podID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerID="e273395719cf61a5af9b7bc53d31b2386f724aead6ba873c67b6fad6aa1c9256" exitCode=0 Oct 03 
14:37:54 crc kubenswrapper[4962]: I1003 14:37:54.821224 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerDied","Data":"e273395719cf61a5af9b7bc53d31b2386f724aead6ba873c67b6fad6aa1c9256"} Oct 03 14:37:54 crc kubenswrapper[4962]: I1003 14:37:54.821257 4962 scope.go:117] "RemoveContainer" containerID="075a3a2d68fc05b9db35c066365380f2ff374a7b6a1faec1634013ff945a759f" Oct 03 14:37:55 crc kubenswrapper[4962]: E1003 14:37:55.137101 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:37:55 crc kubenswrapper[4962]: I1003 14:37:55.831950 4962 scope.go:117] "RemoveContainer" containerID="e273395719cf61a5af9b7bc53d31b2386f724aead6ba873c67b6fad6aa1c9256" Oct 03 14:37:55 crc kubenswrapper[4962]: E1003 14:37:55.832654 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:38:00 crc kubenswrapper[4962]: I1003 14:38:00.890043 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"44378bf5-5a0a-4e9f-98c9-7bd8b42387da","Type":"ContainerStarted","Data":"71455cf3b9cc16f38ca67948321b0fe10008c5e6ea625af6213e10a4951e10bf"} Oct 03 14:38:00 crc kubenswrapper[4962]: I1003 14:38:00.892502 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"147c89e8-3065-48f5-ae75-cb029cd4f447","Type":"ContainerStarted","Data":"b8307131cb1d65187c8728fb1f31f1b0190372b73ff410f7b64ebe09b475e939"} Oct 03 14:38:03 crc kubenswrapper[4962]: I1003 14:38:03.923442 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"44378bf5-5a0a-4e9f-98c9-7bd8b42387da","Type":"ContainerStarted","Data":"5debb05fd018b5cbe4d2d8d4f42955e8c42498250fe25bbe49fd1ab77aa08542"} Oct 03 14:38:03 crc kubenswrapper[4962]: I1003 14:38:03.925967 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"147c89e8-3065-48f5-ae75-cb029cd4f447","Type":"ContainerStarted","Data":"fb6bfb4b3bff33f44b1f748456fcf2b908318c23aced8d92965856ad70226879"} Oct 03 14:38:03 crc kubenswrapper[4962]: I1003 14:38:03.926242 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Oct 03 14:38:03 crc kubenswrapper[4962]: I1003 14:38:03.930233 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Oct 03 14:38:03 crc kubenswrapper[4962]: I1003 14:38:03.952762 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=6.536599796 podStartE2EDuration="26.952740247s" 
podCreationTimestamp="2025-10-03 14:37:37 +0000 UTC" firstStartedPulling="2025-10-03 14:37:39.252882746 +0000 UTC m=+6467.656780581" lastFinishedPulling="2025-10-03 14:37:59.669023197 +0000 UTC m=+6488.072921032" observedRunningTime="2025-10-03 14:38:03.947302612 +0000 UTC m=+6492.351200467" watchObservedRunningTime="2025-10-03 14:38:03.952740247 +0000 UTC m=+6492.356638082" Oct 03 14:38:06 crc kubenswrapper[4962]: I1003 14:38:06.962141 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"44378bf5-5a0a-4e9f-98c9-7bd8b42387da","Type":"ContainerStarted","Data":"2b1bb0e56371b55519283a929d528b56a6321ff9ca3c44195f0dc56d3e7a86d1"} Oct 03 14:38:06 crc kubenswrapper[4962]: I1003 14:38:06.989871 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=4.224602417 podStartE2EDuration="29.989852311s" podCreationTimestamp="2025-10-03 14:37:37 +0000 UTC" firstStartedPulling="2025-10-03 14:37:40.14103316 +0000 UTC m=+6468.544930995" lastFinishedPulling="2025-10-03 14:38:05.906283054 +0000 UTC m=+6494.310180889" observedRunningTime="2025-10-03 14:38:06.988515626 +0000 UTC m=+6495.392413491" watchObservedRunningTime="2025-10-03 14:38:06.989852311 +0000 UTC m=+6495.393750156" Oct 03 14:38:09 crc kubenswrapper[4962]: I1003 14:38:09.416761 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 03 14:38:09 crc kubenswrapper[4962]: I1003 14:38:09.417355 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 03 14:38:09 crc kubenswrapper[4962]: I1003 14:38:09.419451 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Oct 03 14:38:09 crc kubenswrapper[4962]: I1003 14:38:09.999569 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Oct 03 14:38:10 crc kubenswrapper[4962]: I1003 14:38:10.228797 4962 scope.go:117] "RemoveContainer" containerID="e273395719cf61a5af9b7bc53d31b2386f724aead6ba873c67b6fad6aa1c9256" Oct 03 14:38:10 crc kubenswrapper[4962]: E1003 14:38:10.229546 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:38:11 crc kubenswrapper[4962]: I1003 14:38:11.632015 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 14:38:11 crc kubenswrapper[4962]: I1003 14:38:11.634415 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 14:38:11 crc kubenswrapper[4962]: I1003 14:38:11.638770 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 14:38:11 crc kubenswrapper[4962]: I1003 14:38:11.638832 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 14:38:11 crc kubenswrapper[4962]: I1003 14:38:11.678733 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 14:38:11 crc kubenswrapper[4962]: I1003 14:38:11.697815 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d5003405-8745-4637-8d2a-abd18a2929dc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d5003405-8745-4637-8d2a-abd18a2929dc\") " pod="openstack/ceilometer-0" Oct 03 14:38:11 crc kubenswrapper[4962]: I1003 14:38:11.697959 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d5003405-8745-4637-8d2a-abd18a2929dc-run-httpd\") pod \"ceilometer-0\" (UID: \"d5003405-8745-4637-8d2a-abd18a2929dc\") " pod="openstack/ceilometer-0" Oct 03 14:38:11 crc kubenswrapper[4962]: I1003 14:38:11.698017 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rncb\" (UniqueName: \"kubernetes.io/projected/d5003405-8745-4637-8d2a-abd18a2929dc-kube-api-access-6rncb\") pod \"ceilometer-0\" (UID: \"d5003405-8745-4637-8d2a-abd18a2929dc\") " pod="openstack/ceilometer-0" Oct 03 14:38:11 crc kubenswrapper[4962]: I1003 14:38:11.698049 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d5003405-8745-4637-8d2a-abd18a2929dc-log-httpd\") pod \"ceilometer-0\" (UID: \"d5003405-8745-4637-8d2a-abd18a2929dc\") " pod="openstack/ceilometer-0" Oct 03 14:38:11 crc kubenswrapper[4962]: I1003 14:38:11.698133 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5003405-8745-4637-8d2a-abd18a2929dc-config-data\") pod \"ceilometer-0\" (UID: \"d5003405-8745-4637-8d2a-abd18a2929dc\") " pod="openstack/ceilometer-0" Oct 03 14:38:11 crc kubenswrapper[4962]: I1003 14:38:11.698237 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5003405-8745-4637-8d2a-abd18a2929dc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d5003405-8745-4637-8d2a-abd18a2929dc\") " pod="openstack/ceilometer-0" Oct 03 14:38:11 crc kubenswrapper[4962]: I1003 14:38:11.698274 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5003405-8745-4637-8d2a-abd18a2929dc-scripts\") pod \"ceilometer-0\" (UID: \"d5003405-8745-4637-8d2a-abd18a2929dc\") " pod="openstack/ceilometer-0" Oct 03 14:38:11 crc kubenswrapper[4962]: I1003 14:38:11.799604 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d5003405-8745-4637-8d2a-abd18a2929dc-run-httpd\") pod \"ceilometer-0\" (UID: \"d5003405-8745-4637-8d2a-abd18a2929dc\") " pod="openstack/ceilometer-0" Oct 03 14:38:11 crc kubenswrapper[4962]: I1003 14:38:11.799701 4962 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rncb\" (UniqueName: \"kubernetes.io/projected/d5003405-8745-4637-8d2a-abd18a2929dc-kube-api-access-6rncb\") pod \"ceilometer-0\" (UID: \"d5003405-8745-4637-8d2a-abd18a2929dc\") " pod="openstack/ceilometer-0" Oct 03 14:38:11 crc kubenswrapper[4962]: I1003 14:38:11.799733 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d5003405-8745-4637-8d2a-abd18a2929dc-log-httpd\") pod \"ceilometer-0\" (UID: \"d5003405-8745-4637-8d2a-abd18a2929dc\") " pod="openstack/ceilometer-0" Oct 03 14:38:11 crc kubenswrapper[4962]: I1003 14:38:11.799805 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5003405-8745-4637-8d2a-abd18a2929dc-config-data\") pod \"ceilometer-0\" (UID: \"d5003405-8745-4637-8d2a-abd18a2929dc\") " pod="openstack/ceilometer-0" Oct 03 14:38:11 crc kubenswrapper[4962]: I1003 14:38:11.799886 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5003405-8745-4637-8d2a-abd18a2929dc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d5003405-8745-4637-8d2a-abd18a2929dc\") " pod="openstack/ceilometer-0" Oct 03 14:38:11 crc kubenswrapper[4962]: I1003 14:38:11.799910 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5003405-8745-4637-8d2a-abd18a2929dc-scripts\") pod \"ceilometer-0\" (UID: \"d5003405-8745-4637-8d2a-abd18a2929dc\") " pod="openstack/ceilometer-0" Oct 03 14:38:11 crc kubenswrapper[4962]: I1003 14:38:11.799964 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d5003405-8745-4637-8d2a-abd18a2929dc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d5003405-8745-4637-8d2a-abd18a2929dc\") " pod="openstack/ceilometer-0" Oct 03 14:38:11 crc kubenswrapper[4962]: I1003 14:38:11.800151 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d5003405-8745-4637-8d2a-abd18a2929dc-run-httpd\") pod \"ceilometer-0\" (UID: \"d5003405-8745-4637-8d2a-abd18a2929dc\") " pod="openstack/ceilometer-0" Oct 03 14:38:11 crc kubenswrapper[4962]: I1003 14:38:11.801488 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d5003405-8745-4637-8d2a-abd18a2929dc-log-httpd\") pod \"ceilometer-0\" (UID: \"d5003405-8745-4637-8d2a-abd18a2929dc\") " pod="openstack/ceilometer-0" Oct 03 14:38:11 crc kubenswrapper[4962]: I1003 14:38:11.805361 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d5003405-8745-4637-8d2a-abd18a2929dc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d5003405-8745-4637-8d2a-abd18a2929dc\") " pod="openstack/ceilometer-0" Oct 03 14:38:11 crc kubenswrapper[4962]: I1003 14:38:11.806189 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5003405-8745-4637-8d2a-abd18a2929dc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d5003405-8745-4637-8d2a-abd18a2929dc\") " pod="openstack/ceilometer-0" Oct 03 14:38:11 crc kubenswrapper[4962]: I1003 14:38:11.811361 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5003405-8745-4637-8d2a-abd18a2929dc-config-data\") pod \"ceilometer-0\" (UID: \"d5003405-8745-4637-8d2a-abd18a2929dc\") " pod="openstack/ceilometer-0" Oct 03 14:38:11 crc kubenswrapper[4962]: I1003 14:38:11.812889 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5003405-8745-4637-8d2a-abd18a2929dc-scripts\") pod \"ceilometer-0\" (UID: \"d5003405-8745-4637-8d2a-abd18a2929dc\") " pod="openstack/ceilometer-0" Oct 03 14:38:11 crc kubenswrapper[4962]: I1003 14:38:11.822896 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rncb\" (UniqueName: \"kubernetes.io/projected/d5003405-8745-4637-8d2a-abd18a2929dc-kube-api-access-6rncb\") pod \"ceilometer-0\" (UID: \"d5003405-8745-4637-8d2a-abd18a2929dc\") " pod="openstack/ceilometer-0" Oct 03 14:38:11 crc kubenswrapper[4962]: I1003 14:38:11.985978 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 14:38:12 crc kubenswrapper[4962]: I1003 14:38:12.041658 4962 scope.go:117] "RemoveContainer" containerID="dd33081fcdc467722740d05790b9837884bd6833fc620dffc078559f01138d65" Oct 03 14:38:12 crc kubenswrapper[4962]: I1003 14:38:12.172951 4962 scope.go:117] "RemoveContainer" containerID="4030968e65f60914b5a439d72cf60169b10d152c1d3f9d5de04ca20f32f9b0ed" Oct 03 14:38:12 crc kubenswrapper[4962]: I1003 14:38:12.229892 4962 scope.go:117] "RemoveContainer" containerID="74b83a3f091c631c385d468a8b1b295415938af2c71ed0c84c74ebcbd183ff16" Oct 03 14:38:12 crc kubenswrapper[4962]: I1003 14:38:12.257179 4962 scope.go:117] "RemoveContainer" containerID="9e997fc2cdcfe42eb6d1e97272d49f7ecb908656cee0906191ba94b1cff9f8b4" Oct 03 14:38:12 crc kubenswrapper[4962]: I1003 14:38:12.478480 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 14:38:12 crc kubenswrapper[4962]: W1003 14:38:12.497160 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5003405_8745_4637_8d2a_abd18a2929dc.slice/crio-b13708235889c0311e114720231452f62efd11e047014eab0137465542530d89 WatchSource:0}: Error finding container b13708235889c0311e114720231452f62efd11e047014eab0137465542530d89: Status 404 returned error can't find the container with id b13708235889c0311e114720231452f62efd11e047014eab0137465542530d89 Oct 03 14:38:13 crc kubenswrapper[4962]: I1003 14:38:13.018730 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d5003405-8745-4637-8d2a-abd18a2929dc","Type":"ContainerStarted","Data":"b13708235889c0311e114720231452f62efd11e047014eab0137465542530d89"} Oct 03 14:38:14 crc kubenswrapper[4962]: I1003 14:38:14.047443 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-6q9fz"] Oct 03 14:38:14 crc kubenswrapper[4962]: I1003 14:38:14.057249 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-6q9fz"] Oct 03 14:38:14 crc kubenswrapper[4962]: I1003 14:38:14.058543 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d5003405-8745-4637-8d2a-abd18a2929dc","Type":"ContainerStarted","Data":"4dc5b7c06728c5477aece923c305e09b3f91257bf4f8fe21990207701a73058f"} Oct 03 14:38:14 crc kubenswrapper[4962]: I1003 14:38:14.059406 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"d5003405-8745-4637-8d2a-abd18a2929dc","Type":"ContainerStarted","Data":"de3153f8b8eb9aea8401d3157510044d55d5c2eb6603a26c5f42e12845fe347d"} Oct 03 14:38:14 crc kubenswrapper[4962]: I1003 14:38:14.239439 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98032380-e6f6-4472-b4e5-f21afd8f78d6" path="/var/lib/kubelet/pods/98032380-e6f6-4472-b4e5-f21afd8f78d6/volumes" Oct 03 14:38:15 crc kubenswrapper[4962]: I1003 14:38:15.069701 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d5003405-8745-4637-8d2a-abd18a2929dc","Type":"ContainerStarted","Data":"ffd54043145dc1c06b77cdec7acf61fca10a1c29b840ef62213a3312a13692e4"} Oct 03 14:38:17 crc kubenswrapper[4962]: I1003 14:38:17.100427 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d5003405-8745-4637-8d2a-abd18a2929dc","Type":"ContainerStarted","Data":"d304e7c7090cc46f6dc67da64062c9776dd406199980947bf14dd126b7c95293"} Oct 03 14:38:17 crc kubenswrapper[4962]: I1003 14:38:17.101054 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 14:38:17 crc kubenswrapper[4962]: I1003 14:38:17.130472 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.353342316 podStartE2EDuration="6.130452281s" podCreationTimestamp="2025-10-03 14:38:11 +0000 UTC" firstStartedPulling="2025-10-03 14:38:12.500182856 +0000 UTC m=+6500.904080691" lastFinishedPulling="2025-10-03 14:38:16.277292821 +0000 UTC m=+6504.681190656" observedRunningTime="2025-10-03 14:38:17.119358595 +0000 UTC m=+6505.523256430" watchObservedRunningTime="2025-10-03 14:38:17.130452281 +0000 UTC m=+6505.534350116" Oct 03 14:38:20 crc kubenswrapper[4962]: I1003 14:38:20.887316 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-v748j"] Oct 03 14:38:20 crc kubenswrapper[4962]: I1003 14:38:20.889310 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-v748j" Oct 03 14:38:20 crc kubenswrapper[4962]: I1003 14:38:20.898034 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-v748j"] Oct 03 14:38:21 crc kubenswrapper[4962]: I1003 14:38:21.006105 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lk8r\" (UniqueName: \"kubernetes.io/projected/e2edecf6-a382-402a-ad10-8776c6c34c81-kube-api-access-5lk8r\") pod \"aodh-db-create-v748j\" (UID: \"e2edecf6-a382-402a-ad10-8776c6c34c81\") " pod="openstack/aodh-db-create-v748j" Oct 03 14:38:21 crc kubenswrapper[4962]: I1003 14:38:21.107757 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lk8r\" (UniqueName: \"kubernetes.io/projected/e2edecf6-a382-402a-ad10-8776c6c34c81-kube-api-access-5lk8r\") pod \"aodh-db-create-v748j\" (UID: \"e2edecf6-a382-402a-ad10-8776c6c34c81\") " pod="openstack/aodh-db-create-v748j" Oct 03 14:38:21 crc kubenswrapper[4962]: I1003 14:38:21.127778 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lk8r\" (UniqueName: \"kubernetes.io/projected/e2edecf6-a382-402a-ad10-8776c6c34c81-kube-api-access-5lk8r\") pod \"aodh-db-create-v748j\" (UID: \"e2edecf6-a382-402a-ad10-8776c6c34c81\") " pod="openstack/aodh-db-create-v748j" Oct 03 14:38:21 crc kubenswrapper[4962]: I1003 14:38:21.220952 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-v748j" Oct 03 14:38:21 crc kubenswrapper[4962]: I1003 14:38:21.227351 4962 scope.go:117] "RemoveContainer" containerID="e273395719cf61a5af9b7bc53d31b2386f724aead6ba873c67b6fad6aa1c9256" Oct 03 14:38:21 crc kubenswrapper[4962]: E1003 14:38:21.227712 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:38:21 crc kubenswrapper[4962]: I1003 14:38:21.759994 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-v748j"] Oct 03 14:38:22 crc kubenswrapper[4962]: I1003 14:38:22.150620 4962 generic.go:334] "Generic (PLEG): container finished" podID="e2edecf6-a382-402a-ad10-8776c6c34c81" containerID="6b44bb465493cfe4b0f061bd45787e36399016e18348496648a91b2c858e3def" exitCode=0 Oct 03 14:38:22 crc kubenswrapper[4962]: I1003 14:38:22.150703 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-v748j" event={"ID":"e2edecf6-a382-402a-ad10-8776c6c34c81","Type":"ContainerDied","Data":"6b44bb465493cfe4b0f061bd45787e36399016e18348496648a91b2c858e3def"} Oct 03 14:38:22 crc kubenswrapper[4962]: I1003 14:38:22.150912 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-v748j" event={"ID":"e2edecf6-a382-402a-ad10-8776c6c34c81","Type":"ContainerStarted","Data":"4dd116e65ba94402960f7e8c70bbeed7f2f49f9d9b366b59021a921ec5332754"} Oct 03 14:38:23 crc kubenswrapper[4962]: I1003 14:38:23.571761 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-v748j" Oct 03 14:38:23 crc kubenswrapper[4962]: I1003 14:38:23.655383 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lk8r\" (UniqueName: \"kubernetes.io/projected/e2edecf6-a382-402a-ad10-8776c6c34c81-kube-api-access-5lk8r\") pod \"e2edecf6-a382-402a-ad10-8776c6c34c81\" (UID: \"e2edecf6-a382-402a-ad10-8776c6c34c81\") " Oct 03 14:38:23 crc kubenswrapper[4962]: I1003 14:38:23.661887 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2edecf6-a382-402a-ad10-8776c6c34c81-kube-api-access-5lk8r" (OuterVolumeSpecName: "kube-api-access-5lk8r") pod "e2edecf6-a382-402a-ad10-8776c6c34c81" (UID: "e2edecf6-a382-402a-ad10-8776c6c34c81"). InnerVolumeSpecName "kube-api-access-5lk8r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:38:23 crc kubenswrapper[4962]: I1003 14:38:23.757910 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lk8r\" (UniqueName: \"kubernetes.io/projected/e2edecf6-a382-402a-ad10-8776c6c34c81-kube-api-access-5lk8r\") on node \"crc\" DevicePath \"\"" Oct 03 14:38:24 crc kubenswrapper[4962]: I1003 14:38:24.049922 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-5c1b-account-create-ntkjx"] Oct 03 14:38:24 crc kubenswrapper[4962]: I1003 14:38:24.075709 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-5c1b-account-create-ntkjx"] Oct 03 14:38:24 crc kubenswrapper[4962]: I1003 14:38:24.169892 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-v748j" event={"ID":"e2edecf6-a382-402a-ad10-8776c6c34c81","Type":"ContainerDied","Data":"4dd116e65ba94402960f7e8c70bbeed7f2f49f9d9b366b59021a921ec5332754"} Oct 03 14:38:24 crc kubenswrapper[4962]: I1003 14:38:24.169936 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4dd116e65ba94402960f7e8c70bbeed7f2f49f9d9b366b59021a921ec5332754" Oct 03 14:38:24 crc kubenswrapper[4962]: I1003 14:38:24.169943 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-v748j" Oct 03 14:38:24 crc kubenswrapper[4962]: I1003 14:38:24.240242 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c852e8e4-8bb5-4f0a-bad3-09d0fb0bf836" path="/var/lib/kubelet/pods/c852e8e4-8bb5-4f0a-bad3-09d0fb0bf836/volumes" Oct 03 14:38:30 crc kubenswrapper[4962]: I1003 14:38:30.976445 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-3d7c-account-create-w8922"] Oct 03 14:38:30 crc kubenswrapper[4962]: E1003 14:38:30.977411 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2edecf6-a382-402a-ad10-8776c6c34c81" containerName="mariadb-database-create" Oct 03 14:38:30 crc kubenswrapper[4962]: I1003 14:38:30.977424 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2edecf6-a382-402a-ad10-8776c6c34c81" containerName="mariadb-database-create" Oct 03 14:38:30 crc kubenswrapper[4962]: I1003 14:38:30.977624 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2edecf6-a382-402a-ad10-8776c6c34c81" containerName="mariadb-database-create" Oct 03 14:38:30 crc kubenswrapper[4962]: I1003 14:38:30.978430 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-3d7c-account-create-w8922" Oct 03 14:38:30 crc kubenswrapper[4962]: I1003 14:38:30.995356 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Oct 03 14:38:31 crc kubenswrapper[4962]: I1003 14:38:31.004798 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-3d7c-account-create-w8922"] Oct 03 14:38:31 crc kubenswrapper[4962]: I1003 14:38:31.102701 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfmvz\" (UniqueName: \"kubernetes.io/projected/dd068190-fdd3-4767-a4b6-873321e9117e-kube-api-access-rfmvz\") pod \"aodh-3d7c-account-create-w8922\" (UID: \"dd068190-fdd3-4767-a4b6-873321e9117e\") " pod="openstack/aodh-3d7c-account-create-w8922" Oct 03 14:38:31 crc kubenswrapper[4962]: I1003 14:38:31.204474 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfmvz\" (UniqueName: \"kubernetes.io/projected/dd068190-fdd3-4767-a4b6-873321e9117e-kube-api-access-rfmvz\") pod \"aodh-3d7c-account-create-w8922\" (UID: \"dd068190-fdd3-4767-a4b6-873321e9117e\") " pod="openstack/aodh-3d7c-account-create-w8922" Oct 03 14:38:31 crc kubenswrapper[4962]: I1003 14:38:31.231369 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfmvz\" (UniqueName: \"kubernetes.io/projected/dd068190-fdd3-4767-a4b6-873321e9117e-kube-api-access-rfmvz\") pod \"aodh-3d7c-account-create-w8922\" (UID: \"dd068190-fdd3-4767-a4b6-873321e9117e\") " pod="openstack/aodh-3d7c-account-create-w8922" Oct 03 14:38:31 crc kubenswrapper[4962]: I1003 14:38:31.295936 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-3d7c-account-create-w8922" Oct 03 14:38:31 crc kubenswrapper[4962]: I1003 14:38:31.762044 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-3d7c-account-create-w8922"] Oct 03 14:38:31 crc kubenswrapper[4962]: W1003 14:38:31.767245 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd068190_fdd3_4767_a4b6_873321e9117e.slice/crio-4a98b6592d09225b583b8cf68f83c5cbcc72b9e9b5874c50ddf58e373c918066 WatchSource:0}: Error finding container 4a98b6592d09225b583b8cf68f83c5cbcc72b9e9b5874c50ddf58e373c918066: Status 404 returned error can't find the container with id 4a98b6592d09225b583b8cf68f83c5cbcc72b9e9b5874c50ddf58e373c918066 Oct 03 14:38:32 crc kubenswrapper[4962]: I1003 14:38:32.030443 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-mm8m8"] Oct 03 14:38:32 crc kubenswrapper[4962]: I1003 14:38:32.042874 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-mm8m8"] Oct 03 14:38:32 crc kubenswrapper[4962]: I1003 14:38:32.234655 4962 scope.go:117] "RemoveContainer" containerID="e273395719cf61a5af9b7bc53d31b2386f724aead6ba873c67b6fad6aa1c9256" Oct 03 14:38:32 crc kubenswrapper[4962]: E1003 14:38:32.234948 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:38:32 crc kubenswrapper[4962]: I1003 
14:38:32.249119 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54553a51-c0a4-445e-9415-e3e7373c56a7" path="/var/lib/kubelet/pods/54553a51-c0a4-445e-9415-e3e7373c56a7/volumes" Oct 03 14:38:32 crc kubenswrapper[4962]: I1003 14:38:32.267269 4962 generic.go:334] "Generic (PLEG): container finished" podID="dd068190-fdd3-4767-a4b6-873321e9117e" containerID="071c2a146cfa412b6979bbe0fdeb312fd8456a1c9ce3044aecbd9e45a12300c1" exitCode=0 Oct 03 14:38:32 crc kubenswrapper[4962]: I1003 14:38:32.267329 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-3d7c-account-create-w8922" event={"ID":"dd068190-fdd3-4767-a4b6-873321e9117e","Type":"ContainerDied","Data":"071c2a146cfa412b6979bbe0fdeb312fd8456a1c9ce3044aecbd9e45a12300c1"} Oct 03 14:38:32 crc kubenswrapper[4962]: I1003 14:38:32.267393 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-3d7c-account-create-w8922" event={"ID":"dd068190-fdd3-4767-a4b6-873321e9117e","Type":"ContainerStarted","Data":"4a98b6592d09225b583b8cf68f83c5cbcc72b9e9b5874c50ddf58e373c918066"} Oct 03 14:38:33 crc kubenswrapper[4962]: I1003 14:38:33.634271 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-3d7c-account-create-w8922" Oct 03 14:38:33 crc kubenswrapper[4962]: I1003 14:38:33.759413 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfmvz\" (UniqueName: \"kubernetes.io/projected/dd068190-fdd3-4767-a4b6-873321e9117e-kube-api-access-rfmvz\") pod \"dd068190-fdd3-4767-a4b6-873321e9117e\" (UID: \"dd068190-fdd3-4767-a4b6-873321e9117e\") " Oct 03 14:38:33 crc kubenswrapper[4962]: I1003 14:38:33.765170 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd068190-fdd3-4767-a4b6-873321e9117e-kube-api-access-rfmvz" (OuterVolumeSpecName: "kube-api-access-rfmvz") pod "dd068190-fdd3-4767-a4b6-873321e9117e" (UID: "dd068190-fdd3-4767-a4b6-873321e9117e"). InnerVolumeSpecName "kube-api-access-rfmvz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:38:33 crc kubenswrapper[4962]: I1003 14:38:33.861975 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfmvz\" (UniqueName: \"kubernetes.io/projected/dd068190-fdd3-4767-a4b6-873321e9117e-kube-api-access-rfmvz\") on node \"crc\" DevicePath \"\"" Oct 03 14:38:34 crc kubenswrapper[4962]: I1003 14:38:34.285818 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-3d7c-account-create-w8922" event={"ID":"dd068190-fdd3-4767-a4b6-873321e9117e","Type":"ContainerDied","Data":"4a98b6592d09225b583b8cf68f83c5cbcc72b9e9b5874c50ddf58e373c918066"} Oct 03 14:38:34 crc kubenswrapper[4962]: I1003 14:38:34.285867 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a98b6592d09225b583b8cf68f83c5cbcc72b9e9b5874c50ddf58e373c918066" Oct 03 14:38:34 crc kubenswrapper[4962]: I1003 14:38:34.285865 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-3d7c-account-create-w8922" Oct 03 14:38:36 crc kubenswrapper[4962]: I1003 14:38:36.399279 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-cmwqv"] Oct 03 14:38:36 crc kubenswrapper[4962]: E1003 14:38:36.400588 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd068190-fdd3-4767-a4b6-873321e9117e" containerName="mariadb-account-create" Oct 03 14:38:36 crc kubenswrapper[4962]: I1003 14:38:36.400606 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd068190-fdd3-4767-a4b6-873321e9117e" containerName="mariadb-account-create" Oct 03 14:38:36 crc kubenswrapper[4962]: I1003 14:38:36.401028 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd068190-fdd3-4767-a4b6-873321e9117e" containerName="mariadb-account-create" Oct 03 14:38:36 crc kubenswrapper[4962]: I1003 14:38:36.402577 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-cmwqv" Oct 03 14:38:36 crc kubenswrapper[4962]: I1003 14:38:36.404305 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-mzmlh" Oct 03 14:38:36 crc kubenswrapper[4962]: I1003 14:38:36.404517 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Oct 03 14:38:36 crc kubenswrapper[4962]: I1003 14:38:36.404959 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Oct 03 14:38:36 crc kubenswrapper[4962]: I1003 14:38:36.414853 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-cmwqv"] Oct 03 14:38:36 crc kubenswrapper[4962]: I1003 14:38:36.528830 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2880ced7-28e6-476c-94cb-72a9d7a7f33e-scripts\") pod \"aodh-db-sync-cmwqv\" (UID: \"2880ced7-28e6-476c-94cb-72a9d7a7f33e\") " pod="openstack/aodh-db-sync-cmwqv" Oct 03 14:38:36 crc kubenswrapper[4962]: I1003 14:38:36.528904 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wjg8\" (UniqueName: \"kubernetes.io/projected/2880ced7-28e6-476c-94cb-72a9d7a7f33e-kube-api-access-2wjg8\") pod \"aodh-db-sync-cmwqv\" (UID: \"2880ced7-28e6-476c-94cb-72a9d7a7f33e\") " pod="openstack/aodh-db-sync-cmwqv" Oct 03 14:38:36 crc kubenswrapper[4962]: I1003 14:38:36.529251 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2880ced7-28e6-476c-94cb-72a9d7a7f33e-config-data\") pod \"aodh-db-sync-cmwqv\" (UID: \"2880ced7-28e6-476c-94cb-72a9d7a7f33e\") " pod="openstack/aodh-db-sync-cmwqv" Oct 03 14:38:36 crc kubenswrapper[4962]: I1003 14:38:36.529591 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2880ced7-28e6-476c-94cb-72a9d7a7f33e-combined-ca-bundle\") pod \"aodh-db-sync-cmwqv\" (UID: \"2880ced7-28e6-476c-94cb-72a9d7a7f33e\") " pod="openstack/aodh-db-sync-cmwqv" Oct 03 14:38:36 crc kubenswrapper[4962]: I1003 14:38:36.631247 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2880ced7-28e6-476c-94cb-72a9d7a7f33e-combined-ca-bundle\") pod \"aodh-db-sync-cmwqv\" (UID: \"2880ced7-28e6-476c-94cb-72a9d7a7f33e\") 
" pod="openstack/aodh-db-sync-cmwqv" Oct 03 14:38:36 crc kubenswrapper[4962]: I1003 14:38:36.631553 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2880ced7-28e6-476c-94cb-72a9d7a7f33e-scripts\") pod \"aodh-db-sync-cmwqv\" (UID: \"2880ced7-28e6-476c-94cb-72a9d7a7f33e\") " pod="openstack/aodh-db-sync-cmwqv" Oct 03 14:38:36 crc kubenswrapper[4962]: I1003 14:38:36.631730 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wjg8\" (UniqueName: \"kubernetes.io/projected/2880ced7-28e6-476c-94cb-72a9d7a7f33e-kube-api-access-2wjg8\") pod \"aodh-db-sync-cmwqv\" (UID: \"2880ced7-28e6-476c-94cb-72a9d7a7f33e\") " pod="openstack/aodh-db-sync-cmwqv" Oct 03 14:38:36 crc kubenswrapper[4962]: I1003 14:38:36.632194 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2880ced7-28e6-476c-94cb-72a9d7a7f33e-config-data\") pod \"aodh-db-sync-cmwqv\" (UID: \"2880ced7-28e6-476c-94cb-72a9d7a7f33e\") " pod="openstack/aodh-db-sync-cmwqv" Oct 03 14:38:36 crc kubenswrapper[4962]: I1003 14:38:36.636595 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2880ced7-28e6-476c-94cb-72a9d7a7f33e-combined-ca-bundle\") pod \"aodh-db-sync-cmwqv\" (UID: \"2880ced7-28e6-476c-94cb-72a9d7a7f33e\") " pod="openstack/aodh-db-sync-cmwqv" Oct 03 14:38:36 crc kubenswrapper[4962]: I1003 14:38:36.636979 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2880ced7-28e6-476c-94cb-72a9d7a7f33e-config-data\") pod \"aodh-db-sync-cmwqv\" (UID: \"2880ced7-28e6-476c-94cb-72a9d7a7f33e\") " pod="openstack/aodh-db-sync-cmwqv" Oct 03 14:38:36 crc kubenswrapper[4962]: I1003 14:38:36.637333 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2880ced7-28e6-476c-94cb-72a9d7a7f33e-scripts\") pod \"aodh-db-sync-cmwqv\" (UID: \"2880ced7-28e6-476c-94cb-72a9d7a7f33e\") " pod="openstack/aodh-db-sync-cmwqv" Oct 03 14:38:36 crc kubenswrapper[4962]: I1003 14:38:36.648451 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wjg8\" (UniqueName: \"kubernetes.io/projected/2880ced7-28e6-476c-94cb-72a9d7a7f33e-kube-api-access-2wjg8\") pod \"aodh-db-sync-cmwqv\" (UID: \"2880ced7-28e6-476c-94cb-72a9d7a7f33e\") " pod="openstack/aodh-db-sync-cmwqv" Oct 03 14:38:36 crc kubenswrapper[4962]: I1003 14:38:36.722262 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-cmwqv" Oct 03 14:38:37 crc kubenswrapper[4962]: I1003 14:38:37.215799 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-cmwqv"] Oct 03 14:38:37 crc kubenswrapper[4962]: W1003 14:38:37.219500 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2880ced7_28e6_476c_94cb_72a9d7a7f33e.slice/crio-3905c7392a7008eaaaa5f8f87fe186cc9163d75f259fde21e3859fdc995fd8f6 WatchSource:0}: Error finding container 3905c7392a7008eaaaa5f8f87fe186cc9163d75f259fde21e3859fdc995fd8f6: Status 404 returned error can't find the container with id 3905c7392a7008eaaaa5f8f87fe186cc9163d75f259fde21e3859fdc995fd8f6 Oct 03 14:38:37 crc kubenswrapper[4962]: I1003 14:38:37.324388 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-cmwqv" event={"ID":"2880ced7-28e6-476c-94cb-72a9d7a7f33e","Type":"ContainerStarted","Data":"3905c7392a7008eaaaa5f8f87fe186cc9163d75f259fde21e3859fdc995fd8f6"} Oct 03 14:38:41 crc kubenswrapper[4962]: I1003 14:38:41.992117 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 03 14:38:45 crc kubenswrapper[4962]: I1003 14:38:45.226713 4962 scope.go:117] "RemoveContainer" containerID="e273395719cf61a5af9b7bc53d31b2386f724aead6ba873c67b6fad6aa1c9256" Oct 03 14:38:45 crc kubenswrapper[4962]: E1003 14:38:45.227554 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:38:45 crc kubenswrapper[4962]: I1003 14:38:45.396334 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-cmwqv" event={"ID":"2880ced7-28e6-476c-94cb-72a9d7a7f33e","Type":"ContainerStarted","Data":"d04390e033c630a90a7c81819158cf7efb26397c2099d41f2ebc31161f2e23f5"} Oct 03 14:38:45 crc kubenswrapper[4962]: I1003 14:38:45.413361 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-cmwqv" podStartSLOduration=1.7906248420000002 podStartE2EDuration="9.413343199s" podCreationTimestamp="2025-10-03 14:38:36 +0000 UTC" firstStartedPulling="2025-10-03 14:38:37.222696972 +0000 UTC m=+6525.626594807" lastFinishedPulling="2025-10-03 14:38:44.845415329 +0000 UTC m=+6533.249313164" observedRunningTime="2025-10-03 14:38:45.408297055 +0000 UTC m=+6533.812194890" watchObservedRunningTime="2025-10-03 14:38:45.413343199 +0000 UTC m=+6533.817241044" Oct 03 14:38:47 crc kubenswrapper[4962]: I1003 14:38:47.414691 4962 generic.go:334] "Generic (PLEG): container finished" podID="2880ced7-28e6-476c-94cb-72a9d7a7f33e" containerID="d04390e033c630a90a7c81819158cf7efb26397c2099d41f2ebc31161f2e23f5" exitCode=0 Oct 03 14:38:47 crc kubenswrapper[4962]: I1003 14:38:47.414775 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-cmwqv" event={"ID":"2880ced7-28e6-476c-94cb-72a9d7a7f33e","Type":"ContainerDied","Data":"d04390e033c630a90a7c81819158cf7efb26397c2099d41f2ebc31161f2e23f5"} Oct 03 14:38:48 crc kubenswrapper[4962]: I1003 14:38:48.851588 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-cmwqv" Oct 03 14:38:48 crc kubenswrapper[4962]: I1003 14:38:48.896752 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2880ced7-28e6-476c-94cb-72a9d7a7f33e-combined-ca-bundle\") pod \"2880ced7-28e6-476c-94cb-72a9d7a7f33e\" (UID: \"2880ced7-28e6-476c-94cb-72a9d7a7f33e\") " Oct 03 14:38:48 crc kubenswrapper[4962]: I1003 14:38:48.896827 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wjg8\" (UniqueName: \"kubernetes.io/projected/2880ced7-28e6-476c-94cb-72a9d7a7f33e-kube-api-access-2wjg8\") pod \"2880ced7-28e6-476c-94cb-72a9d7a7f33e\" (UID: \"2880ced7-28e6-476c-94cb-72a9d7a7f33e\") " Oct 03 14:38:48 crc kubenswrapper[4962]: I1003 14:38:48.897069 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2880ced7-28e6-476c-94cb-72a9d7a7f33e-config-data\") pod \"2880ced7-28e6-476c-94cb-72a9d7a7f33e\" (UID: \"2880ced7-28e6-476c-94cb-72a9d7a7f33e\") " Oct 03 14:38:48 crc kubenswrapper[4962]: I1003 14:38:48.897134 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2880ced7-28e6-476c-94cb-72a9d7a7f33e-scripts\") pod \"2880ced7-28e6-476c-94cb-72a9d7a7f33e\" (UID: \"2880ced7-28e6-476c-94cb-72a9d7a7f33e\") " Oct 03 14:38:48 crc kubenswrapper[4962]: I1003 14:38:48.902363 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2880ced7-28e6-476c-94cb-72a9d7a7f33e-kube-api-access-2wjg8" (OuterVolumeSpecName: "kube-api-access-2wjg8") pod "2880ced7-28e6-476c-94cb-72a9d7a7f33e" (UID: "2880ced7-28e6-476c-94cb-72a9d7a7f33e"). InnerVolumeSpecName "kube-api-access-2wjg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:38:48 crc kubenswrapper[4962]: I1003 14:38:48.903689 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2880ced7-28e6-476c-94cb-72a9d7a7f33e-scripts" (OuterVolumeSpecName: "scripts") pod "2880ced7-28e6-476c-94cb-72a9d7a7f33e" (UID: "2880ced7-28e6-476c-94cb-72a9d7a7f33e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:38:48 crc kubenswrapper[4962]: I1003 14:38:48.928102 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2880ced7-28e6-476c-94cb-72a9d7a7f33e-config-data" (OuterVolumeSpecName: "config-data") pod "2880ced7-28e6-476c-94cb-72a9d7a7f33e" (UID: "2880ced7-28e6-476c-94cb-72a9d7a7f33e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:38:48 crc kubenswrapper[4962]: I1003 14:38:48.941037 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2880ced7-28e6-476c-94cb-72a9d7a7f33e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2880ced7-28e6-476c-94cb-72a9d7a7f33e" (UID: "2880ced7-28e6-476c-94cb-72a9d7a7f33e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:38:48 crc kubenswrapper[4962]: I1003 14:38:48.999608 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2880ced7-28e6-476c-94cb-72a9d7a7f33e-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:38:48 crc kubenswrapper[4962]: I1003 14:38:48.999656 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2880ced7-28e6-476c-94cb-72a9d7a7f33e-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 14:38:48 crc kubenswrapper[4962]: I1003 14:38:48.999667 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2880ced7-28e6-476c-94cb-72a9d7a7f33e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:38:48 crc kubenswrapper[4962]: I1003 14:38:48.999685 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wjg8\" (UniqueName: \"kubernetes.io/projected/2880ced7-28e6-476c-94cb-72a9d7a7f33e-kube-api-access-2wjg8\") on node \"crc\" DevicePath \"\"" Oct 03 14:38:49 crc kubenswrapper[4962]: I1003 14:38:49.436736 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-cmwqv" event={"ID":"2880ced7-28e6-476c-94cb-72a9d7a7f33e","Type":"ContainerDied","Data":"3905c7392a7008eaaaa5f8f87fe186cc9163d75f259fde21e3859fdc995fd8f6"} Oct 03 14:38:49 crc kubenswrapper[4962]: I1003 14:38:49.437149 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3905c7392a7008eaaaa5f8f87fe186cc9163d75f259fde21e3859fdc995fd8f6" Oct 03 14:38:49 crc kubenswrapper[4962]: I1003 14:38:49.436832 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-cmwqv" Oct 03 14:38:49 crc kubenswrapper[4962]: E1003 14:38:49.523891 4962 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2880ced7_28e6_476c_94cb_72a9d7a7f33e.slice\": RecentStats: unable to find data in memory cache]" Oct 03 14:38:51 crc kubenswrapper[4962]: I1003 14:38:51.516965 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Oct 03 14:38:51 crc kubenswrapper[4962]: E1003 14:38:51.517799 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2880ced7-28e6-476c-94cb-72a9d7a7f33e" containerName="aodh-db-sync" Oct 03 14:38:51 crc kubenswrapper[4962]: I1003 14:38:51.517817 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2880ced7-28e6-476c-94cb-72a9d7a7f33e" containerName="aodh-db-sync" Oct 03 14:38:51 crc kubenswrapper[4962]: I1003 14:38:51.518156 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2880ced7-28e6-476c-94cb-72a9d7a7f33e" containerName="aodh-db-sync" Oct 03 14:38:51 crc kubenswrapper[4962]: I1003 14:38:51.520528 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Oct 03 14:38:51 crc kubenswrapper[4962]: I1003 14:38:51.532139 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Oct 03 14:38:51 crc kubenswrapper[4962]: I1003 14:38:51.532517 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Oct 03 14:38:51 crc kubenswrapper[4962]: I1003 14:38:51.532887 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-mzmlh" Oct 03 14:38:51 crc kubenswrapper[4962]: I1003 14:38:51.556835 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 03 14:38:51 crc kubenswrapper[4962]: I1003 14:38:51.661751 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgdmp\" (UniqueName: \"kubernetes.io/projected/57e8140d-8b35-48b0-a27c-4f1279c29f5c-kube-api-access-bgdmp\") pod \"aodh-0\" (UID: \"57e8140d-8b35-48b0-a27c-4f1279c29f5c\") " pod="openstack/aodh-0" Oct 03 14:38:51 crc kubenswrapper[4962]: I1003 14:38:51.661849 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57e8140d-8b35-48b0-a27c-4f1279c29f5c-config-data\") pod \"aodh-0\" (UID: \"57e8140d-8b35-48b0-a27c-4f1279c29f5c\") " pod="openstack/aodh-0" Oct 03 14:38:51 crc kubenswrapper[4962]: I1003 14:38:51.661888 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57e8140d-8b35-48b0-a27c-4f1279c29f5c-scripts\") pod \"aodh-0\" (UID: \"57e8140d-8b35-48b0-a27c-4f1279c29f5c\") " pod="openstack/aodh-0" Oct 03 14:38:51 crc kubenswrapper[4962]: I1003 14:38:51.661907 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57e8140d-8b35-48b0-a27c-4f1279c29f5c-combined-ca-bundle\") pod \"aodh-0\" (UID: \"57e8140d-8b35-48b0-a27c-4f1279c29f5c\") " pod="openstack/aodh-0" Oct 03 14:38:51 crc kubenswrapper[4962]: I1003 14:38:51.763615 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgdmp\" (UniqueName: \"kubernetes.io/projected/57e8140d-8b35-48b0-a27c-4f1279c29f5c-kube-api-access-bgdmp\") pod \"aodh-0\" (UID: \"57e8140d-8b35-48b0-a27c-4f1279c29f5c\") " pod="openstack/aodh-0" Oct 03 14:38:51 crc kubenswrapper[4962]: I1003 14:38:51.763729 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57e8140d-8b35-48b0-a27c-4f1279c29f5c-config-data\") pod \"aodh-0\" (UID: \"57e8140d-8b35-48b0-a27c-4f1279c29f5c\") " pod="openstack/aodh-0" Oct 03 14:38:51 crc kubenswrapper[4962]: I1003 14:38:51.763769 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57e8140d-8b35-48b0-a27c-4f1279c29f5c-scripts\") pod \"aodh-0\" (UID: \"57e8140d-8b35-48b0-a27c-4f1279c29f5c\") " pod="openstack/aodh-0" Oct 03 14:38:51 crc kubenswrapper[4962]: I1003 14:38:51.763788 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57e8140d-8b35-48b0-a27c-4f1279c29f5c-combined-ca-bundle\") pod \"aodh-0\" (UID: \"57e8140d-8b35-48b0-a27c-4f1279c29f5c\") " pod="openstack/aodh-0" Oct 03 14:38:51 crc kubenswrapper[4962]: 
I1003 14:38:51.772207 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57e8140d-8b35-48b0-a27c-4f1279c29f5c-scripts\") pod \"aodh-0\" (UID: \"57e8140d-8b35-48b0-a27c-4f1279c29f5c\") " pod="openstack/aodh-0" Oct 03 14:38:51 crc kubenswrapper[4962]: I1003 14:38:51.772385 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57e8140d-8b35-48b0-a27c-4f1279c29f5c-combined-ca-bundle\") pod \"aodh-0\" (UID: \"57e8140d-8b35-48b0-a27c-4f1279c29f5c\") " pod="openstack/aodh-0" Oct 03 14:38:51 crc kubenswrapper[4962]: I1003 14:38:51.772743 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57e8140d-8b35-48b0-a27c-4f1279c29f5c-config-data\") pod \"aodh-0\" (UID: \"57e8140d-8b35-48b0-a27c-4f1279c29f5c\") " pod="openstack/aodh-0" Oct 03 14:38:51 crc kubenswrapper[4962]: I1003 14:38:51.780458 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgdmp\" (UniqueName: \"kubernetes.io/projected/57e8140d-8b35-48b0-a27c-4f1279c29f5c-kube-api-access-bgdmp\") pod \"aodh-0\" (UID: \"57e8140d-8b35-48b0-a27c-4f1279c29f5c\") " pod="openstack/aodh-0" Oct 03 14:38:51 crc kubenswrapper[4962]: I1003 14:38:51.878716 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Oct 03 14:38:52 crc kubenswrapper[4962]: I1003 14:38:52.402392 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 03 14:38:52 crc kubenswrapper[4962]: I1003 14:38:52.467329 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"57e8140d-8b35-48b0-a27c-4f1279c29f5c","Type":"ContainerStarted","Data":"e7f247d8b4253eee433935964e4de633ff938d9a54cc0cba5facc8e779c73131"} Oct 03 14:38:53 crc kubenswrapper[4962]: I1003 14:38:53.478724 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"57e8140d-8b35-48b0-a27c-4f1279c29f5c","Type":"ContainerStarted","Data":"7da1cdb868eddb22b0bd50181068f8c3c16126130226abe2a875df08f15013e1"} Oct 03 14:38:53 crc kubenswrapper[4962]: I1003 14:38:53.732051 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 14:38:53 crc kubenswrapper[4962]: I1003 14:38:53.732591 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d5003405-8745-4637-8d2a-abd18a2929dc" containerName="ceilometer-central-agent" containerID="cri-o://de3153f8b8eb9aea8401d3157510044d55d5c2eb6603a26c5f42e12845fe347d" gracePeriod=30 Oct 03 14:38:53 crc kubenswrapper[4962]: I1003 14:38:53.732756 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d5003405-8745-4637-8d2a-abd18a2929dc" containerName="ceilometer-notification-agent" containerID="cri-o://4dc5b7c06728c5477aece923c305e09b3f91257bf4f8fe21990207701a73058f" gracePeriod=30 Oct 03 14:38:53 crc kubenswrapper[4962]: I1003 14:38:53.732833 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d5003405-8745-4637-8d2a-abd18a2929dc" containerName="sg-core" containerID="cri-o://ffd54043145dc1c06b77cdec7acf61fca10a1c29b840ef62213a3312a13692e4" gracePeriod=30 Oct 03 14:38:53 crc kubenswrapper[4962]: I1003 14:38:53.732773 4962 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="d5003405-8745-4637-8d2a-abd18a2929dc" containerName="proxy-httpd" containerID="cri-o://d304e7c7090cc46f6dc67da64062c9776dd406199980947bf14dd126b7c95293" gracePeriod=30 Oct 03 14:38:54 crc kubenswrapper[4962]: I1003 14:38:54.489092 4962 generic.go:334] "Generic (PLEG): container finished" podID="d5003405-8745-4637-8d2a-abd18a2929dc" containerID="d304e7c7090cc46f6dc67da64062c9776dd406199980947bf14dd126b7c95293" exitCode=0 Oct 03 14:38:54 crc kubenswrapper[4962]: I1003 14:38:54.489119 4962 generic.go:334] "Generic (PLEG): container finished" podID="d5003405-8745-4637-8d2a-abd18a2929dc" containerID="ffd54043145dc1c06b77cdec7acf61fca10a1c29b840ef62213a3312a13692e4" exitCode=2 Oct 03 14:38:54 crc kubenswrapper[4962]: I1003 14:38:54.489126 4962 generic.go:334] "Generic (PLEG): container finished" podID="d5003405-8745-4637-8d2a-abd18a2929dc" containerID="de3153f8b8eb9aea8401d3157510044d55d5c2eb6603a26c5f42e12845fe347d" exitCode=0 Oct 03 14:38:54 crc kubenswrapper[4962]: I1003 14:38:54.489145 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d5003405-8745-4637-8d2a-abd18a2929dc","Type":"ContainerDied","Data":"d304e7c7090cc46f6dc67da64062c9776dd406199980947bf14dd126b7c95293"} Oct 03 14:38:54 crc kubenswrapper[4962]: I1003 14:38:54.489169 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d5003405-8745-4637-8d2a-abd18a2929dc","Type":"ContainerDied","Data":"ffd54043145dc1c06b77cdec7acf61fca10a1c29b840ef62213a3312a13692e4"} Oct 03 14:38:54 crc kubenswrapper[4962]: I1003 14:38:54.489180 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d5003405-8745-4637-8d2a-abd18a2929dc","Type":"ContainerDied","Data":"de3153f8b8eb9aea8401d3157510044d55d5c2eb6603a26c5f42e12845fe347d"} Oct 03 14:38:56 crc kubenswrapper[4962]: I1003 14:38:56.510970 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"57e8140d-8b35-48b0-a27c-4f1279c29f5c","Type":"ContainerStarted","Data":"32bc6698fac1066bb070fb0c118a6a71a565f616d6d691d200d4fe90470312de"} Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.076989 4962 util.go:48] "No ready sandbox for pod can be found. 
Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.168924 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d5003405-8745-4637-8d2a-abd18a2929dc-sg-core-conf-yaml\") pod \"d5003405-8745-4637-8d2a-abd18a2929dc\" (UID: \"d5003405-8745-4637-8d2a-abd18a2929dc\") "
Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.168960 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5003405-8745-4637-8d2a-abd18a2929dc-scripts\") pod \"d5003405-8745-4637-8d2a-abd18a2929dc\" (UID: \"d5003405-8745-4637-8d2a-abd18a2929dc\") "
Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.169063 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d5003405-8745-4637-8d2a-abd18a2929dc-run-httpd\") pod \"d5003405-8745-4637-8d2a-abd18a2929dc\" (UID: \"d5003405-8745-4637-8d2a-abd18a2929dc\") "
Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.169321 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rncb\" (UniqueName: \"kubernetes.io/projected/d5003405-8745-4637-8d2a-abd18a2929dc-kube-api-access-6rncb\") pod \"d5003405-8745-4637-8d2a-abd18a2929dc\" (UID: \"d5003405-8745-4637-8d2a-abd18a2929dc\") "
Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.169439 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d5003405-8745-4637-8d2a-abd18a2929dc-log-httpd\") pod \"d5003405-8745-4637-8d2a-abd18a2929dc\" (UID: \"d5003405-8745-4637-8d2a-abd18a2929dc\") "
Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.169527 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5003405-8745-4637-8d2a-abd18a2929dc-config-data\") pod \"d5003405-8745-4637-8d2a-abd18a2929dc\" (UID: \"d5003405-8745-4637-8d2a-abd18a2929dc\") "
Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.169551 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5003405-8745-4637-8d2a-abd18a2929dc-combined-ca-bundle\") pod \"d5003405-8745-4637-8d2a-abd18a2929dc\" (UID: \"d5003405-8745-4637-8d2a-abd18a2929dc\") "
Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.171027 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5003405-8745-4637-8d2a-abd18a2929dc-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d5003405-8745-4637-8d2a-abd18a2929dc" (UID: "d5003405-8745-4637-8d2a-abd18a2929dc"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.171162 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5003405-8745-4637-8d2a-abd18a2929dc-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d5003405-8745-4637-8d2a-abd18a2929dc" (UID: "d5003405-8745-4637-8d2a-abd18a2929dc"). InnerVolumeSpecName "log-httpd".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.171704 4962 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d5003405-8745-4637-8d2a-abd18a2929dc-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.171724 4962 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d5003405-8745-4637-8d2a-abd18a2929dc-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.175914 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5003405-8745-4637-8d2a-abd18a2929dc-scripts" (OuterVolumeSpecName: "scripts") pod "d5003405-8745-4637-8d2a-abd18a2929dc" (UID: "d5003405-8745-4637-8d2a-abd18a2929dc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.176000 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5003405-8745-4637-8d2a-abd18a2929dc-kube-api-access-6rncb" (OuterVolumeSpecName: "kube-api-access-6rncb") pod "d5003405-8745-4637-8d2a-abd18a2929dc" (UID: "d5003405-8745-4637-8d2a-abd18a2929dc"). InnerVolumeSpecName "kube-api-access-6rncb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.210866 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5003405-8745-4637-8d2a-abd18a2929dc-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d5003405-8745-4637-8d2a-abd18a2929dc" (UID: "d5003405-8745-4637-8d2a-abd18a2929dc"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.250258 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5003405-8745-4637-8d2a-abd18a2929dc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d5003405-8745-4637-8d2a-abd18a2929dc" (UID: "d5003405-8745-4637-8d2a-abd18a2929dc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.273762 4962 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d5003405-8745-4637-8d2a-abd18a2929dc-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.273801 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5003405-8745-4637-8d2a-abd18a2929dc-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.273813 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rncb\" (UniqueName: \"kubernetes.io/projected/d5003405-8745-4637-8d2a-abd18a2929dc-kube-api-access-6rncb\") on node \"crc\" DevicePath \"\"" Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.273824 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5003405-8745-4637-8d2a-abd18a2929dc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.275192 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5003405-8745-4637-8d2a-abd18a2929dc-config-data" (OuterVolumeSpecName: "config-data") pod "d5003405-8745-4637-8d2a-abd18a2929dc" (UID: "d5003405-8745-4637-8d2a-abd18a2929dc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.375392 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5003405-8745-4637-8d2a-abd18a2929dc-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.521469 4962 generic.go:334] "Generic (PLEG): container finished" podID="d5003405-8745-4637-8d2a-abd18a2929dc" containerID="4dc5b7c06728c5477aece923c305e09b3f91257bf4f8fe21990207701a73058f" exitCode=0 Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.521508 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d5003405-8745-4637-8d2a-abd18a2929dc","Type":"ContainerDied","Data":"4dc5b7c06728c5477aece923c305e09b3f91257bf4f8fe21990207701a73058f"} Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.521538 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d5003405-8745-4637-8d2a-abd18a2929dc","Type":"ContainerDied","Data":"b13708235889c0311e114720231452f62efd11e047014eab0137465542530d89"} Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.521559 4962 scope.go:117] "RemoveContainer" containerID="d304e7c7090cc46f6dc67da64062c9776dd406199980947bf14dd126b7c95293" Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.521694 4962 util.go:48] "No ready sandbox for pod can be found. 
Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.552095 4962 scope.go:117] "RemoveContainer" containerID="ffd54043145dc1c06b77cdec7acf61fca10a1c29b840ef62213a3312a13692e4"
Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.557458 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.572658 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.574003 4962 scope.go:117] "RemoveContainer" containerID="4dc5b7c06728c5477aece923c305e09b3f91257bf4f8fe21990207701a73058f"
Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.586377 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Oct 03 14:38:57 crc kubenswrapper[4962]: E1003 14:38:57.587119 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5003405-8745-4637-8d2a-abd18a2929dc" containerName="sg-core"
Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.587147 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5003405-8745-4637-8d2a-abd18a2929dc" containerName="sg-core"
Oct 03 14:38:57 crc kubenswrapper[4962]: E1003 14:38:57.587168 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5003405-8745-4637-8d2a-abd18a2929dc" containerName="ceilometer-central-agent"
Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.587177 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5003405-8745-4637-8d2a-abd18a2929dc" containerName="ceilometer-central-agent"
Oct 03 14:38:57 crc kubenswrapper[4962]: E1003 14:38:57.587218 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5003405-8745-4637-8d2a-abd18a2929dc" containerName="proxy-httpd"
Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.587227 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5003405-8745-4637-8d2a-abd18a2929dc" containerName="proxy-httpd"
Oct 03 14:38:57 crc kubenswrapper[4962]: E1003 14:38:57.587252 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5003405-8745-4637-8d2a-abd18a2929dc" containerName="ceilometer-notification-agent"
Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.587260 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5003405-8745-4637-8d2a-abd18a2929dc" containerName="ceilometer-notification-agent"
Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.587541 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5003405-8745-4637-8d2a-abd18a2929dc" containerName="ceilometer-notification-agent"
Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.587567 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5003405-8745-4637-8d2a-abd18a2929dc" containerName="ceilometer-central-agent"
Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.587583 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5003405-8745-4637-8d2a-abd18a2929dc" containerName="sg-core"
Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.587608 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5003405-8745-4637-8d2a-abd18a2929dc" containerName="proxy-httpd"
Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.589522 4962 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/ceilometer-0"
Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.593115 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.593216 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.604162 4962 scope.go:117] "RemoveContainer" containerID="de3153f8b8eb9aea8401d3157510044d55d5c2eb6603a26c5f42e12845fe347d"
Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.614521 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.636222 4962 scope.go:117] "RemoveContainer" containerID="d304e7c7090cc46f6dc67da64062c9776dd406199980947bf14dd126b7c95293"
Oct 03 14:38:57 crc kubenswrapper[4962]: E1003 14:38:57.636855 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d304e7c7090cc46f6dc67da64062c9776dd406199980947bf14dd126b7c95293\": container with ID starting with d304e7c7090cc46f6dc67da64062c9776dd406199980947bf14dd126b7c95293 not found: ID does not exist" containerID="d304e7c7090cc46f6dc67da64062c9776dd406199980947bf14dd126b7c95293"
Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.637049 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d304e7c7090cc46f6dc67da64062c9776dd406199980947bf14dd126b7c95293"} err="failed to get container status \"d304e7c7090cc46f6dc67da64062c9776dd406199980947bf14dd126b7c95293\": rpc error: code = NotFound desc = could not find container \"d304e7c7090cc46f6dc67da64062c9776dd406199980947bf14dd126b7c95293\": container with ID starting with d304e7c7090cc46f6dc67da64062c9776dd406199980947bf14dd126b7c95293 not found: ID does not exist"
Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.637085 4962 scope.go:117] "RemoveContainer" containerID="ffd54043145dc1c06b77cdec7acf61fca10a1c29b840ef62213a3312a13692e4"
Oct 03 14:38:57 crc kubenswrapper[4962]: E1003 14:38:57.637351 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffd54043145dc1c06b77cdec7acf61fca10a1c29b840ef62213a3312a13692e4\": container with ID starting with ffd54043145dc1c06b77cdec7acf61fca10a1c29b840ef62213a3312a13692e4 not found: ID does not exist" containerID="ffd54043145dc1c06b77cdec7acf61fca10a1c29b840ef62213a3312a13692e4"
Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.637381 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffd54043145dc1c06b77cdec7acf61fca10a1c29b840ef62213a3312a13692e4"} err="failed to get container status \"ffd54043145dc1c06b77cdec7acf61fca10a1c29b840ef62213a3312a13692e4\": rpc error: code = NotFound desc = could not find container \"ffd54043145dc1c06b77cdec7acf61fca10a1c29b840ef62213a3312a13692e4\": container with ID starting with ffd54043145dc1c06b77cdec7acf61fca10a1c29b840ef62213a3312a13692e4 not found: ID does not exist"
Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.637400 4962 scope.go:117] "RemoveContainer" containerID="4dc5b7c06728c5477aece923c305e09b3f91257bf4f8fe21990207701a73058f"
Oct 03 14:38:57 crc kubenswrapper[4962]: E1003 14:38:57.637585 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dc5b7c06728c5477aece923c305e09b3f91257bf4f8fe21990207701a73058f\": container with ID starting with 4dc5b7c06728c5477aece923c305e09b3f91257bf4f8fe21990207701a73058f not found: ID does not exist" containerID="4dc5b7c06728c5477aece923c305e09b3f91257bf4f8fe21990207701a73058f"
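
The NotFound errors above are benign: these containers were already removed, so when cleanup asks the runtime for their status the lookup fails and the kubelet logs it and moves on. The usual way to write such cleanup is to treat not-found as success, since the caller's desired end state ("no such container") already holds. A sketch with a stand-in error value (the real check is against the gRPC NotFound status code, not a sentinel):

package main

import (
	"errors"
	"fmt"
)

// errNotFound stands in for the runtime's gRPC NotFound in the
// entries above; illustrative only, not the kubelet's code path.
var errNotFound = errors.New("container not found")

// removeContainer deletes a container but treats "already gone" as
// success, which keeps retries and garbage collection idempotent.
func removeContainer(id string, remove func(id string) error) error {
	err := remove(id)
	if errors.Is(err, errNotFound) {
		fmt.Printf("container %s already removed, ignoring\n", id)
		return nil
	}
	return err
}

func main() {
	alreadyGone := func(string) error { return errNotFound }
	fmt.Println(removeContainer("d304e7c7090c", alreadyGone)) // prints <nil>
}
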
\"4dc5b7c06728c5477aece923c305e09b3f91257bf4f8fe21990207701a73058f\": container with ID starting with 4dc5b7c06728c5477aece923c305e09b3f91257bf4f8fe21990207701a73058f not found: ID does not exist" containerID="4dc5b7c06728c5477aece923c305e09b3f91257bf4f8fe21990207701a73058f" Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.637607 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dc5b7c06728c5477aece923c305e09b3f91257bf4f8fe21990207701a73058f"} err="failed to get container status \"4dc5b7c06728c5477aece923c305e09b3f91257bf4f8fe21990207701a73058f\": rpc error: code = NotFound desc = could not find container \"4dc5b7c06728c5477aece923c305e09b3f91257bf4f8fe21990207701a73058f\": container with ID starting with 4dc5b7c06728c5477aece923c305e09b3f91257bf4f8fe21990207701a73058f not found: ID does not exist" Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.637621 4962 scope.go:117] "RemoveContainer" containerID="de3153f8b8eb9aea8401d3157510044d55d5c2eb6603a26c5f42e12845fe347d" Oct 03 14:38:57 crc kubenswrapper[4962]: E1003 14:38:57.637822 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de3153f8b8eb9aea8401d3157510044d55d5c2eb6603a26c5f42e12845fe347d\": container with ID starting with de3153f8b8eb9aea8401d3157510044d55d5c2eb6603a26c5f42e12845fe347d not found: ID does not exist" containerID="de3153f8b8eb9aea8401d3157510044d55d5c2eb6603a26c5f42e12845fe347d" Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.637850 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de3153f8b8eb9aea8401d3157510044d55d5c2eb6603a26c5f42e12845fe347d"} err="failed to get container status \"de3153f8b8eb9aea8401d3157510044d55d5c2eb6603a26c5f42e12845fe347d\": rpc error: code = NotFound desc = could not find container \"de3153f8b8eb9aea8401d3157510044d55d5c2eb6603a26c5f42e12845fe347d\": container with ID starting with de3153f8b8eb9aea8401d3157510044d55d5c2eb6603a26c5f42e12845fe347d not found: ID does not exist" Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.681252 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw9tr\" (UniqueName: \"kubernetes.io/projected/70003b92-6321-4e11-97b5-f25362d6d29d-kube-api-access-fw9tr\") pod \"ceilometer-0\" (UID: \"70003b92-6321-4e11-97b5-f25362d6d29d\") " pod="openstack/ceilometer-0" Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.681312 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70003b92-6321-4e11-97b5-f25362d6d29d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"70003b92-6321-4e11-97b5-f25362d6d29d\") " pod="openstack/ceilometer-0" Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.681387 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70003b92-6321-4e11-97b5-f25362d6d29d-log-httpd\") pod \"ceilometer-0\" (UID: \"70003b92-6321-4e11-97b5-f25362d6d29d\") " pod="openstack/ceilometer-0" Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.681480 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70003b92-6321-4e11-97b5-f25362d6d29d-run-httpd\") pod \"ceilometer-0\" (UID: 
\"70003b92-6321-4e11-97b5-f25362d6d29d\") " pod="openstack/ceilometer-0" Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.681514 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/70003b92-6321-4e11-97b5-f25362d6d29d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"70003b92-6321-4e11-97b5-f25362d6d29d\") " pod="openstack/ceilometer-0" Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.681548 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70003b92-6321-4e11-97b5-f25362d6d29d-config-data\") pod \"ceilometer-0\" (UID: \"70003b92-6321-4e11-97b5-f25362d6d29d\") " pod="openstack/ceilometer-0" Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.681577 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70003b92-6321-4e11-97b5-f25362d6d29d-scripts\") pod \"ceilometer-0\" (UID: \"70003b92-6321-4e11-97b5-f25362d6d29d\") " pod="openstack/ceilometer-0" Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.784115 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70003b92-6321-4e11-97b5-f25362d6d29d-log-httpd\") pod \"ceilometer-0\" (UID: \"70003b92-6321-4e11-97b5-f25362d6d29d\") " pod="openstack/ceilometer-0" Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.784533 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70003b92-6321-4e11-97b5-f25362d6d29d-run-httpd\") pod \"ceilometer-0\" (UID: \"70003b92-6321-4e11-97b5-f25362d6d29d\") " pod="openstack/ceilometer-0" Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.784680 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/70003b92-6321-4e11-97b5-f25362d6d29d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"70003b92-6321-4e11-97b5-f25362d6d29d\") " pod="openstack/ceilometer-0" Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.784854 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70003b92-6321-4e11-97b5-f25362d6d29d-config-data\") pod \"ceilometer-0\" (UID: \"70003b92-6321-4e11-97b5-f25362d6d29d\") " pod="openstack/ceilometer-0" Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.784990 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70003b92-6321-4e11-97b5-f25362d6d29d-scripts\") pod \"ceilometer-0\" (UID: \"70003b92-6321-4e11-97b5-f25362d6d29d\") " pod="openstack/ceilometer-0" Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.785204 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw9tr\" (UniqueName: \"kubernetes.io/projected/70003b92-6321-4e11-97b5-f25362d6d29d-kube-api-access-fw9tr\") pod \"ceilometer-0\" (UID: \"70003b92-6321-4e11-97b5-f25362d6d29d\") " pod="openstack/ceilometer-0" Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.784891 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70003b92-6321-4e11-97b5-f25362d6d29d-run-httpd\") pod \"ceilometer-0\" 
(UID: \"70003b92-6321-4e11-97b5-f25362d6d29d\") " pod="openstack/ceilometer-0" Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.784739 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70003b92-6321-4e11-97b5-f25362d6d29d-log-httpd\") pod \"ceilometer-0\" (UID: \"70003b92-6321-4e11-97b5-f25362d6d29d\") " pod="openstack/ceilometer-0" Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.785462 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70003b92-6321-4e11-97b5-f25362d6d29d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"70003b92-6321-4e11-97b5-f25362d6d29d\") " pod="openstack/ceilometer-0" Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.790123 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70003b92-6321-4e11-97b5-f25362d6d29d-scripts\") pod \"ceilometer-0\" (UID: \"70003b92-6321-4e11-97b5-f25362d6d29d\") " pod="openstack/ceilometer-0" Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.792301 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70003b92-6321-4e11-97b5-f25362d6d29d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"70003b92-6321-4e11-97b5-f25362d6d29d\") " pod="openstack/ceilometer-0" Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.792455 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/70003b92-6321-4e11-97b5-f25362d6d29d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"70003b92-6321-4e11-97b5-f25362d6d29d\") " pod="openstack/ceilometer-0" Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.795813 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70003b92-6321-4e11-97b5-f25362d6d29d-config-data\") pod \"ceilometer-0\" (UID: \"70003b92-6321-4e11-97b5-f25362d6d29d\") " pod="openstack/ceilometer-0" Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.807800 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw9tr\" (UniqueName: \"kubernetes.io/projected/70003b92-6321-4e11-97b5-f25362d6d29d-kube-api-access-fw9tr\") pod \"ceilometer-0\" (UID: \"70003b92-6321-4e11-97b5-f25362d6d29d\") " pod="openstack/ceilometer-0" Oct 03 14:38:57 crc kubenswrapper[4962]: I1003 14:38:57.913183 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0"
Oct 03 14:38:58 crc kubenswrapper[4962]: I1003 14:38:58.242057 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5003405-8745-4637-8d2a-abd18a2929dc" path="/var/lib/kubelet/pods/d5003405-8745-4637-8d2a-abd18a2929dc/volumes"
Oct 03 14:38:58 crc kubenswrapper[4962]: I1003 14:38:58.357246 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 03 14:38:58 crc kubenswrapper[4962]: I1003 14:38:58.538853 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70003b92-6321-4e11-97b5-f25362d6d29d","Type":"ContainerStarted","Data":"efd9b883cc08d6a0b1ac834e41ebaac40b8406dc744a270b082ece8dd3785185"}
Oct 03 14:38:59 crc kubenswrapper[4962]: I1003 14:38:59.229463 4962 scope.go:117] "RemoveContainer" containerID="e273395719cf61a5af9b7bc53d31b2386f724aead6ba873c67b6fad6aa1c9256"
Oct 03 14:38:59 crc kubenswrapper[4962]: E1003 14:38:59.233177 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 14:38:59 crc kubenswrapper[4962]: I1003 14:38:59.549583 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70003b92-6321-4e11-97b5-f25362d6d29d","Type":"ContainerStarted","Data":"afe94f2aeba3732068aedd5205f7c7a8b1466f92c8bdc8cf72d7bb2b64cf689a"}
Oct 03 14:38:59 crc kubenswrapper[4962]: I1003 14:38:59.553208 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"57e8140d-8b35-48b0-a27c-4f1279c29f5c","Type":"ContainerStarted","Data":"a743cb59c8f39fee97595a6f84d609ea0f88975be8b1a714e34f7cfe9a16b52c"}
Oct 03 14:39:00 crc kubenswrapper[4962]: I1003 14:39:00.569560 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70003b92-6321-4e11-97b5-f25362d6d29d","Type":"ContainerStarted","Data":"764d02f035315964298f451338fe9be9a5741d3ca019716d964e895a8822b49b"}
Oct 03 14:39:01 crc kubenswrapper[4962]: I1003 14:39:01.580141 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70003b92-6321-4e11-97b5-f25362d6d29d","Type":"ContainerStarted","Data":"58a18586e670b7ad5d11b44daf34e3b66e275de56df9250ec49de92f0cc28b5d"}
Oct 03 14:39:02 crc kubenswrapper[4962]: I1003 14:39:02.612587 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"57e8140d-8b35-48b0-a27c-4f1279c29f5c","Type":"ContainerStarted","Data":"94a585578711911d7a654cc413e7e04c46a6b9bb66c06d3258ecd08c36058296"}
Oct 03 14:39:02 crc kubenswrapper[4962]: I1003 14:39:02.648261 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.518590771 podStartE2EDuration="11.648236691s" podCreationTimestamp="2025-10-03 14:38:51 +0000 UTC" firstStartedPulling="2025-10-03 14:38:52.424678418 +0000 UTC m=+6540.828576253" lastFinishedPulling="2025-10-03 14:39:01.554324338 +0000 UTC m=+6549.958222173" observedRunningTime="2025-10-03 14:39:02.630726034 +0000 UTC m=+6551.034623879" watchObservedRunningTime="2025-10-03 14:39:02.648236691 +0000 UTC m=+6551.052134526"
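
The startup-latency entry above is internally consistent: podStartE2EDuration runs from podCreationTimestamp (14:38:51) to watchObservedRunningTime (14:39:02.648), and podStartSLOduration is that span minus the image-pull window (firstStartedPulling to lastFinishedPulling); the ceilometer-0 entry just below follows the same rule (6.646668847s - 4.188975883s = 2.457692964s). Checking the aodh-0 arithmetic against the logged timestamps (the subtraction is inferred from these numbers, not read from kubelet source):

package main

import (
	"fmt"
	"time"
)

func main() {
	// Layout matches the log's "2025-10-03 14:38:51 +0000 UTC" form;
	// the fractional-seconds part is optional when parsing.
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	parse := func(s string) time.Time {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		return t
	}
	created := parse("2025-10-03 14:38:51 +0000 UTC")
	firstPull := parse("2025-10-03 14:38:52.424678418 +0000 UTC")
	lastPull := parse("2025-10-03 14:39:01.554324338 +0000 UTC")
	running := parse("2025-10-03 14:39:02.648236691 +0000 UTC")

	e2e := running.Sub(created)
	pull := lastPull.Sub(firstPull)
	fmt.Println("podStartE2EDuration:", e2e)      // 11.648236691s, as logged
	fmt.Println("podStartSLOduration:", e2e-pull) // 2.518590771s, as logged
}
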
Oct 03 14:39:03 crc kubenswrapper[4962]: I1003 14:39:03.625517 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70003b92-6321-4e11-97b5-f25362d6d29d","Type":"ContainerStarted","Data":"25cc75d6e4777cdc689c7d8f33cf836d01709650997197823544bd89ff33e926"}
Oct 03 14:39:03 crc kubenswrapper[4962]: I1003 14:39:03.646695 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.457692964 podStartE2EDuration="6.646668847s" podCreationTimestamp="2025-10-03 14:38:57 +0000 UTC" firstStartedPulling="2025-10-03 14:38:58.369333029 +0000 UTC m=+6546.773230864" lastFinishedPulling="2025-10-03 14:39:02.558308912 +0000 UTC m=+6550.962206747" observedRunningTime="2025-10-03 14:39:03.644051187 +0000 UTC m=+6552.047949042" watchObservedRunningTime="2025-10-03 14:39:03.646668847 +0000 UTC m=+6552.050566692"
Oct 03 14:39:04 crc kubenswrapper[4962]: I1003 14:39:04.636258 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Oct 03 14:39:08 crc kubenswrapper[4962]: I1003 14:39:08.325661 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-brhld"]
Oct 03 14:39:08 crc kubenswrapper[4962]: I1003 14:39:08.327880 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-brhld"
Oct 03 14:39:08 crc kubenswrapper[4962]: I1003 14:39:08.334802 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-brhld"]
Oct 03 14:39:08 crc kubenswrapper[4962]: I1003 14:39:08.430036 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9nmv\" (UniqueName: \"kubernetes.io/projected/484190c6-e084-4d4a-9f78-a1d6a7c0d814-kube-api-access-n9nmv\") pod \"manila-db-create-brhld\" (UID: \"484190c6-e084-4d4a-9f78-a1d6a7c0d814\") " pod="openstack/manila-db-create-brhld"
Oct 03 14:39:08 crc kubenswrapper[4962]: I1003 14:39:08.532371 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9nmv\" (UniqueName: \"kubernetes.io/projected/484190c6-e084-4d4a-9f78-a1d6a7c0d814-kube-api-access-n9nmv\") pod \"manila-db-create-brhld\" (UID: \"484190c6-e084-4d4a-9f78-a1d6a7c0d814\") " pod="openstack/manila-db-create-brhld"
Oct 03 14:39:08 crc kubenswrapper[4962]: I1003 14:39:08.551942 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9nmv\" (UniqueName: \"kubernetes.io/projected/484190c6-e084-4d4a-9f78-a1d6a7c0d814-kube-api-access-n9nmv\") pod \"manila-db-create-brhld\" (UID: \"484190c6-e084-4d4a-9f78-a1d6a7c0d814\") " pod="openstack/manila-db-create-brhld"
Oct 03 14:39:08 crc kubenswrapper[4962]: I1003 14:39:08.654194 4962 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/manila-db-create-brhld" Oct 03 14:39:09 crc kubenswrapper[4962]: I1003 14:39:09.196432 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-brhld"] Oct 03 14:39:09 crc kubenswrapper[4962]: I1003 14:39:09.691468 4962 generic.go:334] "Generic (PLEG): container finished" podID="484190c6-e084-4d4a-9f78-a1d6a7c0d814" containerID="b6c9a3d3cfd5d26a7f7f5cc088e51968efbfbfd86ed2c6329733b1c6d2bba66a" exitCode=0 Oct 03 14:39:09 crc kubenswrapper[4962]: I1003 14:39:09.691523 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-brhld" event={"ID":"484190c6-e084-4d4a-9f78-a1d6a7c0d814","Type":"ContainerDied","Data":"b6c9a3d3cfd5d26a7f7f5cc088e51968efbfbfd86ed2c6329733b1c6d2bba66a"} Oct 03 14:39:09 crc kubenswrapper[4962]: I1003 14:39:09.691845 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-brhld" event={"ID":"484190c6-e084-4d4a-9f78-a1d6a7c0d814","Type":"ContainerStarted","Data":"4d52a2fdb7f0f1649e3934ba9e4fcddaee82179e0b954eb2b5deb38d7fc30538"} Oct 03 14:39:11 crc kubenswrapper[4962]: I1003 14:39:11.128894 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-brhld" Oct 03 14:39:11 crc kubenswrapper[4962]: I1003 14:39:11.182886 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9nmv\" (UniqueName: \"kubernetes.io/projected/484190c6-e084-4d4a-9f78-a1d6a7c0d814-kube-api-access-n9nmv\") pod \"484190c6-e084-4d4a-9f78-a1d6a7c0d814\" (UID: \"484190c6-e084-4d4a-9f78-a1d6a7c0d814\") " Oct 03 14:39:11 crc kubenswrapper[4962]: I1003 14:39:11.188876 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/484190c6-e084-4d4a-9f78-a1d6a7c0d814-kube-api-access-n9nmv" (OuterVolumeSpecName: "kube-api-access-n9nmv") pod "484190c6-e084-4d4a-9f78-a1d6a7c0d814" (UID: "484190c6-e084-4d4a-9f78-a1d6a7c0d814"). InnerVolumeSpecName "kube-api-access-n9nmv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:39:11 crc kubenswrapper[4962]: I1003 14:39:11.227497 4962 scope.go:117] "RemoveContainer" containerID="e273395719cf61a5af9b7bc53d31b2386f724aead6ba873c67b6fad6aa1c9256" Oct 03 14:39:11 crc kubenswrapper[4962]: E1003 14:39:11.227831 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:39:11 crc kubenswrapper[4962]: I1003 14:39:11.285712 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9nmv\" (UniqueName: \"kubernetes.io/projected/484190c6-e084-4d4a-9f78-a1d6a7c0d814-kube-api-access-n9nmv\") on node \"crc\" DevicePath \"\"" Oct 03 14:39:11 crc kubenswrapper[4962]: I1003 14:39:11.709887 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-brhld" event={"ID":"484190c6-e084-4d4a-9f78-a1d6a7c0d814","Type":"ContainerDied","Data":"4d52a2fdb7f0f1649e3934ba9e4fcddaee82179e0b954eb2b5deb38d7fc30538"} Oct 03 14:39:11 crc kubenswrapper[4962]: I1003 14:39:11.709922 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d52a2fdb7f0f1649e3934ba9e4fcddaee82179e0b954eb2b5deb38d7fc30538" Oct 03 14:39:11 crc kubenswrapper[4962]: I1003 14:39:11.709933 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-brhld" Oct 03 14:39:12 crc kubenswrapper[4962]: I1003 14:39:12.399870 4962 scope.go:117] "RemoveContainer" containerID="9067176d1c9eca3683cb7bdced5aa755b678a6cb65168410343eec3901ce1df9" Oct 03 14:39:12 crc kubenswrapper[4962]: I1003 14:39:12.430607 4962 scope.go:117] "RemoveContainer" containerID="e09dd6270db59d7ac784dd1277c11c7cfecc7d85e9ad03ce28c2d6e5fdfc80c2" Oct 03 14:39:12 crc kubenswrapper[4962]: I1003 14:39:12.482417 4962 scope.go:117] "RemoveContainer" containerID="4bdecd72f95dab8313827b86da82e6252270febdfce63429a42e19046f358360" Oct 03 14:39:18 crc kubenswrapper[4962]: I1003 14:39:18.359095 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-7b9d-account-create-87dh8"] Oct 03 14:39:18 crc kubenswrapper[4962]: E1003 14:39:18.359988 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="484190c6-e084-4d4a-9f78-a1d6a7c0d814" containerName="mariadb-database-create" Oct 03 14:39:18 crc kubenswrapper[4962]: I1003 14:39:18.360001 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="484190c6-e084-4d4a-9f78-a1d6a7c0d814" containerName="mariadb-database-create" Oct 03 14:39:18 crc kubenswrapper[4962]: I1003 14:39:18.360202 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="484190c6-e084-4d4a-9f78-a1d6a7c0d814" containerName="mariadb-database-create" Oct 03 14:39:18 crc kubenswrapper[4962]: I1003 14:39:18.360925 4962 util.go:30] "No sandbox for pod can be found. 
Oct 03 14:39:18 crc kubenswrapper[4962]: I1003 14:39:18.363199 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret"
Oct 03 14:39:18 crc kubenswrapper[4962]: I1003 14:39:18.368143 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-7b9d-account-create-87dh8"]
Oct 03 14:39:18 crc kubenswrapper[4962]: I1003 14:39:18.441596 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g47zw\" (UniqueName: \"kubernetes.io/projected/0ac5b7b7-db12-4364-9220-05368465d505-kube-api-access-g47zw\") pod \"manila-7b9d-account-create-87dh8\" (UID: \"0ac5b7b7-db12-4364-9220-05368465d505\") " pod="openstack/manila-7b9d-account-create-87dh8"
Oct 03 14:39:18 crc kubenswrapper[4962]: I1003 14:39:18.544302 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g47zw\" (UniqueName: \"kubernetes.io/projected/0ac5b7b7-db12-4364-9220-05368465d505-kube-api-access-g47zw\") pod \"manila-7b9d-account-create-87dh8\" (UID: \"0ac5b7b7-db12-4364-9220-05368465d505\") " pod="openstack/manila-7b9d-account-create-87dh8"
Oct 03 14:39:18 crc kubenswrapper[4962]: I1003 14:39:18.563120 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g47zw\" (UniqueName: \"kubernetes.io/projected/0ac5b7b7-db12-4364-9220-05368465d505-kube-api-access-g47zw\") pod \"manila-7b9d-account-create-87dh8\" (UID: \"0ac5b7b7-db12-4364-9220-05368465d505\") " pod="openstack/manila-7b9d-account-create-87dh8"
Oct 03 14:39:18 crc kubenswrapper[4962]: I1003 14:39:18.685178 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-7b9d-account-create-87dh8"
Oct 03 14:39:19 crc kubenswrapper[4962]: I1003 14:39:19.168800 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-7b9d-account-create-87dh8"]
Oct 03 14:39:19 crc kubenswrapper[4962]: I1003 14:39:19.789557 4962 generic.go:334] "Generic (PLEG): container finished" podID="0ac5b7b7-db12-4364-9220-05368465d505" containerID="50365993e76f92599473046ebfd35ae09eecadcc6a3f03181b68f0072f91f5f7" exitCode=0
Oct 03 14:39:19 crc kubenswrapper[4962]: I1003 14:39:19.789620 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-7b9d-account-create-87dh8" event={"ID":"0ac5b7b7-db12-4364-9220-05368465d505","Type":"ContainerDied","Data":"50365993e76f92599473046ebfd35ae09eecadcc6a3f03181b68f0072f91f5f7"}
Oct 03 14:39:19 crc kubenswrapper[4962]: I1003 14:39:19.789925 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-7b9d-account-create-87dh8" event={"ID":"0ac5b7b7-db12-4364-9220-05368465d505","Type":"ContainerStarted","Data":"3d47620fc90ca22d0d27445f71e546cf329b0510ae926414b578ff75bbd9d17f"}
Oct 03 14:39:21 crc kubenswrapper[4962]: I1003 14:39:21.244664 4962 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/manila-7b9d-account-create-87dh8" Oct 03 14:39:21 crc kubenswrapper[4962]: I1003 14:39:21.398536 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g47zw\" (UniqueName: \"kubernetes.io/projected/0ac5b7b7-db12-4364-9220-05368465d505-kube-api-access-g47zw\") pod \"0ac5b7b7-db12-4364-9220-05368465d505\" (UID: \"0ac5b7b7-db12-4364-9220-05368465d505\") " Oct 03 14:39:21 crc kubenswrapper[4962]: I1003 14:39:21.406549 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ac5b7b7-db12-4364-9220-05368465d505-kube-api-access-g47zw" (OuterVolumeSpecName: "kube-api-access-g47zw") pod "0ac5b7b7-db12-4364-9220-05368465d505" (UID: "0ac5b7b7-db12-4364-9220-05368465d505"). InnerVolumeSpecName "kube-api-access-g47zw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:39:21 crc kubenswrapper[4962]: I1003 14:39:21.501043 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g47zw\" (UniqueName: \"kubernetes.io/projected/0ac5b7b7-db12-4364-9220-05368465d505-kube-api-access-g47zw\") on node \"crc\" DevicePath \"\"" Oct 03 14:39:21 crc kubenswrapper[4962]: I1003 14:39:21.820590 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-7b9d-account-create-87dh8" event={"ID":"0ac5b7b7-db12-4364-9220-05368465d505","Type":"ContainerDied","Data":"3d47620fc90ca22d0d27445f71e546cf329b0510ae926414b578ff75bbd9d17f"} Oct 03 14:39:21 crc kubenswrapper[4962]: I1003 14:39:21.820711 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d47620fc90ca22d0d27445f71e546cf329b0510ae926414b578ff75bbd9d17f" Oct 03 14:39:21 crc kubenswrapper[4962]: I1003 14:39:21.820715 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-7b9d-account-create-87dh8" Oct 03 14:39:23 crc kubenswrapper[4962]: I1003 14:39:23.227817 4962 scope.go:117] "RemoveContainer" containerID="e273395719cf61a5af9b7bc53d31b2386f724aead6ba873c67b6fad6aa1c9256" Oct 03 14:39:23 crc kubenswrapper[4962]: E1003 14:39:23.228574 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:39:23 crc kubenswrapper[4962]: I1003 14:39:23.609542 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-sggtq"] Oct 03 14:39:23 crc kubenswrapper[4962]: E1003 14:39:23.610034 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ac5b7b7-db12-4364-9220-05368465d505" containerName="mariadb-account-create" Oct 03 14:39:23 crc kubenswrapper[4962]: I1003 14:39:23.610054 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ac5b7b7-db12-4364-9220-05368465d505" containerName="mariadb-account-create" Oct 03 14:39:23 crc kubenswrapper[4962]: I1003 14:39:23.610230 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ac5b7b7-db12-4364-9220-05368465d505" containerName="mariadb-account-create" Oct 03 14:39:23 crc kubenswrapper[4962]: I1003 14:39:23.611036 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-sggtq" Oct 03 14:39:23 crc kubenswrapper[4962]: I1003 14:39:23.614998 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Oct 03 14:39:23 crc kubenswrapper[4962]: I1003 14:39:23.617470 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-86jzd" Oct 03 14:39:23 crc kubenswrapper[4962]: I1003 14:39:23.623136 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-sggtq"] Oct 03 14:39:23 crc kubenswrapper[4962]: I1003 14:39:23.656427 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b183145f-a6ac-4f76-ac84-5c237e065e37-combined-ca-bundle\") pod \"manila-db-sync-sggtq\" (UID: \"b183145f-a6ac-4f76-ac84-5c237e065e37\") " pod="openstack/manila-db-sync-sggtq" Oct 03 14:39:23 crc kubenswrapper[4962]: I1003 14:39:23.656587 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b183145f-a6ac-4f76-ac84-5c237e065e37-config-data\") pod \"manila-db-sync-sggtq\" (UID: \"b183145f-a6ac-4f76-ac84-5c237e065e37\") " pod="openstack/manila-db-sync-sggtq" Oct 03 14:39:23 crc kubenswrapper[4962]: I1003 14:39:23.656682 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/b183145f-a6ac-4f76-ac84-5c237e065e37-job-config-data\") pod \"manila-db-sync-sggtq\" (UID: \"b183145f-a6ac-4f76-ac84-5c237e065e37\") " pod="openstack/manila-db-sync-sggtq" Oct 03 14:39:23 crc kubenswrapper[4962]: I1003 14:39:23.656728 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thljs\" (UniqueName: \"kubernetes.io/projected/b183145f-a6ac-4f76-ac84-5c237e065e37-kube-api-access-thljs\") pod \"manila-db-sync-sggtq\" (UID: \"b183145f-a6ac-4f76-ac84-5c237e065e37\") " pod="openstack/manila-db-sync-sggtq" Oct 03 14:39:23 crc kubenswrapper[4962]: I1003 14:39:23.757899 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b183145f-a6ac-4f76-ac84-5c237e065e37-combined-ca-bundle\") pod \"manila-db-sync-sggtq\" (UID: \"b183145f-a6ac-4f76-ac84-5c237e065e37\") " pod="openstack/manila-db-sync-sggtq" Oct 03 14:39:23 crc kubenswrapper[4962]: I1003 14:39:23.758023 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b183145f-a6ac-4f76-ac84-5c237e065e37-config-data\") pod \"manila-db-sync-sggtq\" (UID: \"b183145f-a6ac-4f76-ac84-5c237e065e37\") " pod="openstack/manila-db-sync-sggtq" Oct 03 14:39:23 crc kubenswrapper[4962]: I1003 14:39:23.758081 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/b183145f-a6ac-4f76-ac84-5c237e065e37-job-config-data\") pod \"manila-db-sync-sggtq\" (UID: \"b183145f-a6ac-4f76-ac84-5c237e065e37\") " pod="openstack/manila-db-sync-sggtq" Oct 03 14:39:23 crc kubenswrapper[4962]: I1003 14:39:23.758118 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thljs\" (UniqueName: \"kubernetes.io/projected/b183145f-a6ac-4f76-ac84-5c237e065e37-kube-api-access-thljs\") pod \"manila-db-sync-sggtq\" (UID: 
\"b183145f-a6ac-4f76-ac84-5c237e065e37\") " pod="openstack/manila-db-sync-sggtq" Oct 03 14:39:23 crc kubenswrapper[4962]: I1003 14:39:23.763815 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b183145f-a6ac-4f76-ac84-5c237e065e37-config-data\") pod \"manila-db-sync-sggtq\" (UID: \"b183145f-a6ac-4f76-ac84-5c237e065e37\") " pod="openstack/manila-db-sync-sggtq" Oct 03 14:39:23 crc kubenswrapper[4962]: I1003 14:39:23.767295 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/b183145f-a6ac-4f76-ac84-5c237e065e37-job-config-data\") pod \"manila-db-sync-sggtq\" (UID: \"b183145f-a6ac-4f76-ac84-5c237e065e37\") " pod="openstack/manila-db-sync-sggtq" Oct 03 14:39:23 crc kubenswrapper[4962]: I1003 14:39:23.767462 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b183145f-a6ac-4f76-ac84-5c237e065e37-combined-ca-bundle\") pod \"manila-db-sync-sggtq\" (UID: \"b183145f-a6ac-4f76-ac84-5c237e065e37\") " pod="openstack/manila-db-sync-sggtq" Oct 03 14:39:23 crc kubenswrapper[4962]: I1003 14:39:23.774399 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thljs\" (UniqueName: \"kubernetes.io/projected/b183145f-a6ac-4f76-ac84-5c237e065e37-kube-api-access-thljs\") pod \"manila-db-sync-sggtq\" (UID: \"b183145f-a6ac-4f76-ac84-5c237e065e37\") " pod="openstack/manila-db-sync-sggtq" Oct 03 14:39:23 crc kubenswrapper[4962]: I1003 14:39:23.937231 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-sggtq" Oct 03 14:39:24 crc kubenswrapper[4962]: I1003 14:39:24.879602 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-sggtq"] Oct 03 14:39:25 crc kubenswrapper[4962]: I1003 14:39:25.860747 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-sggtq" event={"ID":"b183145f-a6ac-4f76-ac84-5c237e065e37","Type":"ContainerStarted","Data":"cca0a1450ff91f44cab78f97503ad5299bdf51c91c56930ba3eb343b58314189"} Oct 03 14:39:27 crc kubenswrapper[4962]: I1003 14:39:27.920412 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 03 14:39:32 crc kubenswrapper[4962]: I1003 14:39:32.927578 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-sggtq" event={"ID":"b183145f-a6ac-4f76-ac84-5c237e065e37","Type":"ContainerStarted","Data":"e3c408a217b48e113de3b5b561824c4222df905dae4b8334dce68e1ff9cc44c2"} Oct 03 14:39:32 crc kubenswrapper[4962]: I1003 14:39:32.953472 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-sggtq" podStartSLOduration=2.801604483 podStartE2EDuration="9.95345747s" podCreationTimestamp="2025-10-03 14:39:23 +0000 UTC" firstStartedPulling="2025-10-03 14:39:24.883679955 +0000 UTC m=+6573.287577790" lastFinishedPulling="2025-10-03 14:39:32.035532902 +0000 UTC m=+6580.439430777" observedRunningTime="2025-10-03 14:39:32.952034962 +0000 UTC m=+6581.355932797" watchObservedRunningTime="2025-10-03 14:39:32.95345747 +0000 UTC m=+6581.357355305" Oct 03 14:39:34 crc kubenswrapper[4962]: I1003 14:39:34.948599 4962 generic.go:334] "Generic (PLEG): container finished" podID="b183145f-a6ac-4f76-ac84-5c237e065e37" containerID="e3c408a217b48e113de3b5b561824c4222df905dae4b8334dce68e1ff9cc44c2" exitCode=0 Oct 03 14:39:34 crc 
kubenswrapper[4962]: I1003 14:39:34.948973 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-sggtq" event={"ID":"b183145f-a6ac-4f76-ac84-5c237e065e37","Type":"ContainerDied","Data":"e3c408a217b48e113de3b5b561824c4222df905dae4b8334dce68e1ff9cc44c2"} Oct 03 14:39:36 crc kubenswrapper[4962]: I1003 14:39:36.543715 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-sggtq" Oct 03 14:39:36 crc kubenswrapper[4962]: I1003 14:39:36.713997 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b183145f-a6ac-4f76-ac84-5c237e065e37-config-data\") pod \"b183145f-a6ac-4f76-ac84-5c237e065e37\" (UID: \"b183145f-a6ac-4f76-ac84-5c237e065e37\") " Oct 03 14:39:36 crc kubenswrapper[4962]: I1003 14:39:36.714075 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b183145f-a6ac-4f76-ac84-5c237e065e37-combined-ca-bundle\") pod \"b183145f-a6ac-4f76-ac84-5c237e065e37\" (UID: \"b183145f-a6ac-4f76-ac84-5c237e065e37\") " Oct 03 14:39:36 crc kubenswrapper[4962]: I1003 14:39:36.714301 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thljs\" (UniqueName: \"kubernetes.io/projected/b183145f-a6ac-4f76-ac84-5c237e065e37-kube-api-access-thljs\") pod \"b183145f-a6ac-4f76-ac84-5c237e065e37\" (UID: \"b183145f-a6ac-4f76-ac84-5c237e065e37\") " Oct 03 14:39:36 crc kubenswrapper[4962]: I1003 14:39:36.714429 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/b183145f-a6ac-4f76-ac84-5c237e065e37-job-config-data\") pod \"b183145f-a6ac-4f76-ac84-5c237e065e37\" (UID: \"b183145f-a6ac-4f76-ac84-5c237e065e37\") " Oct 03 14:39:36 crc kubenswrapper[4962]: I1003 14:39:36.721776 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b183145f-a6ac-4f76-ac84-5c237e065e37-kube-api-access-thljs" (OuterVolumeSpecName: "kube-api-access-thljs") pod "b183145f-a6ac-4f76-ac84-5c237e065e37" (UID: "b183145f-a6ac-4f76-ac84-5c237e065e37"). InnerVolumeSpecName "kube-api-access-thljs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:39:36 crc kubenswrapper[4962]: I1003 14:39:36.723146 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b183145f-a6ac-4f76-ac84-5c237e065e37-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "b183145f-a6ac-4f76-ac84-5c237e065e37" (UID: "b183145f-a6ac-4f76-ac84-5c237e065e37"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:39:36 crc kubenswrapper[4962]: I1003 14:39:36.726620 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b183145f-a6ac-4f76-ac84-5c237e065e37-config-data" (OuterVolumeSpecName: "config-data") pod "b183145f-a6ac-4f76-ac84-5c237e065e37" (UID: "b183145f-a6ac-4f76-ac84-5c237e065e37"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:39:36 crc kubenswrapper[4962]: I1003 14:39:36.745571 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b183145f-a6ac-4f76-ac84-5c237e065e37-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b183145f-a6ac-4f76-ac84-5c237e065e37" (UID: "b183145f-a6ac-4f76-ac84-5c237e065e37"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:39:36 crc kubenswrapper[4962]: I1003 14:39:36.817865 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thljs\" (UniqueName: \"kubernetes.io/projected/b183145f-a6ac-4f76-ac84-5c237e065e37-kube-api-access-thljs\") on node \"crc\" DevicePath \"\"" Oct 03 14:39:36 crc kubenswrapper[4962]: I1003 14:39:36.817919 4962 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/b183145f-a6ac-4f76-ac84-5c237e065e37-job-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:39:36 crc kubenswrapper[4962]: I1003 14:39:36.817940 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b183145f-a6ac-4f76-ac84-5c237e065e37-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:39:36 crc kubenswrapper[4962]: I1003 14:39:36.817952 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b183145f-a6ac-4f76-ac84-5c237e065e37-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:39:36 crc kubenswrapper[4962]: I1003 14:39:36.968547 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-sggtq" event={"ID":"b183145f-a6ac-4f76-ac84-5c237e065e37","Type":"ContainerDied","Data":"cca0a1450ff91f44cab78f97503ad5299bdf51c91c56930ba3eb343b58314189"} Oct 03 14:39:36 crc kubenswrapper[4962]: I1003 14:39:36.968580 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-sggtq" Oct 03 14:39:36 crc kubenswrapper[4962]: I1003 14:39:36.968582 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cca0a1450ff91f44cab78f97503ad5299bdf51c91c56930ba3eb343b58314189" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.228675 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.228940 4962 scope.go:117] "RemoveContainer" containerID="e273395719cf61a5af9b7bc53d31b2386f724aead6ba873c67b6fad6aa1c9256" Oct 03 14:39:37 crc kubenswrapper[4962]: E1003 14:39:37.229240 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b183145f-a6ac-4f76-ac84-5c237e065e37" containerName="manila-db-sync" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.229258 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b183145f-a6ac-4f76-ac84-5c237e065e37" containerName="manila-db-sync" Oct 03 14:39:37 crc kubenswrapper[4962]: E1003 14:39:37.229255 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.229503 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b183145f-a6ac-4f76-ac84-5c237e065e37" containerName="manila-db-sync" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.230659 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.239893 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-86jzd" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.240193 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.240272 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.240526 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.259072 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.309751 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.311803 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.314732 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.332851 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.430805 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c57bd1b-33d0-464b-ac4e-204e90cebd48-config-data\") pod \"manila-scheduler-0\" (UID: \"1c57bd1b-33d0-464b-ac4e-204e90cebd48\") " pod="openstack/manila-scheduler-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.430849 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c57bd1b-33d0-464b-ac4e-204e90cebd48-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"1c57bd1b-33d0-464b-ac4e-204e90cebd48\") " pod="openstack/manila-scheduler-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.430876 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c57bd1b-33d0-464b-ac4e-204e90cebd48-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"1c57bd1b-33d0-464b-ac4e-204e90cebd48\") " pod="openstack/manila-scheduler-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.430918 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5lzz\" (UniqueName: \"kubernetes.io/projected/1b5b9fea-e568-4955-ae40-e3fb2cc52743-kube-api-access-m5lzz\") pod \"manila-share-share1-0\" (UID: \"1b5b9fea-e568-4955-ae40-e3fb2cc52743\") " pod="openstack/manila-share-share1-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.430970 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1b5b9fea-e568-4955-ae40-e3fb2cc52743-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"1b5b9fea-e568-4955-ae40-e3fb2cc52743\") " pod="openstack/manila-share-share1-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.430988 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b5b9fea-e568-4955-ae40-e3fb2cc52743-config-data\") pod \"manila-share-share1-0\" (UID: \"1b5b9fea-e568-4955-ae40-e3fb2cc52743\") " pod="openstack/manila-share-share1-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.431013 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7d6bn\" (UniqueName: \"kubernetes.io/projected/1c57bd1b-33d0-464b-ac4e-204e90cebd48-kube-api-access-7d6bn\") pod \"manila-scheduler-0\" (UID: \"1c57bd1b-33d0-464b-ac4e-204e90cebd48\") " pod="openstack/manila-scheduler-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.431028 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c57bd1b-33d0-464b-ac4e-204e90cebd48-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"1c57bd1b-33d0-464b-ac4e-204e90cebd48\") " pod="openstack/manila-scheduler-0" Oct 03 14:39:37 
crc kubenswrapper[4962]: I1003 14:39:37.431051 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1b5b9fea-e568-4955-ae40-e3fb2cc52743-ceph\") pod \"manila-share-share1-0\" (UID: \"1b5b9fea-e568-4955-ae40-e3fb2cc52743\") " pod="openstack/manila-share-share1-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.431090 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1b5b9fea-e568-4955-ae40-e3fb2cc52743-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"1b5b9fea-e568-4955-ae40-e3fb2cc52743\") " pod="openstack/manila-share-share1-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.431107 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b5b9fea-e568-4955-ae40-e3fb2cc52743-scripts\") pod \"manila-share-share1-0\" (UID: \"1b5b9fea-e568-4955-ae40-e3fb2cc52743\") " pod="openstack/manila-share-share1-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.431130 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/1b5b9fea-e568-4955-ae40-e3fb2cc52743-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"1b5b9fea-e568-4955-ae40-e3fb2cc52743\") " pod="openstack/manila-share-share1-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.431145 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b5b9fea-e568-4955-ae40-e3fb2cc52743-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"1b5b9fea-e568-4955-ae40-e3fb2cc52743\") " pod="openstack/manila-share-share1-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.431166 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c57bd1b-33d0-464b-ac4e-204e90cebd48-scripts\") pod \"manila-scheduler-0\" (UID: \"1c57bd1b-33d0-464b-ac4e-204e90cebd48\") " pod="openstack/manila-scheduler-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.436844 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b74d5677c-vqlcm"] Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.438656 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b74d5677c-vqlcm" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.464963 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b74d5677c-vqlcm"] Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.535845 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805-ovsdbserver-nb\") pod \"dnsmasq-dns-5b74d5677c-vqlcm\" (UID: \"b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805\") " pod="openstack/dnsmasq-dns-5b74d5677c-vqlcm" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.535891 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805-ovsdbserver-sb\") pod \"dnsmasq-dns-5b74d5677c-vqlcm\" (UID: \"b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805\") " pod="openstack/dnsmasq-dns-5b74d5677c-vqlcm" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.535917 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1b5b9fea-e568-4955-ae40-e3fb2cc52743-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"1b5b9fea-e568-4955-ae40-e3fb2cc52743\") " pod="openstack/manila-share-share1-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.535942 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b5b9fea-e568-4955-ae40-e3fb2cc52743-config-data\") pod \"manila-share-share1-0\" (UID: \"1b5b9fea-e568-4955-ae40-e3fb2cc52743\") " pod="openstack/manila-share-share1-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.535974 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7d6bn\" (UniqueName: \"kubernetes.io/projected/1c57bd1b-33d0-464b-ac4e-204e90cebd48-kube-api-access-7d6bn\") pod \"manila-scheduler-0\" (UID: \"1c57bd1b-33d0-464b-ac4e-204e90cebd48\") " pod="openstack/manila-scheduler-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.535992 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c57bd1b-33d0-464b-ac4e-204e90cebd48-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"1c57bd1b-33d0-464b-ac4e-204e90cebd48\") " pod="openstack/manila-scheduler-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.536017 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1b5b9fea-e568-4955-ae40-e3fb2cc52743-ceph\") pod \"manila-share-share1-0\" (UID: \"1b5b9fea-e568-4955-ae40-e3fb2cc52743\") " pod="openstack/manila-share-share1-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.536059 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1b5b9fea-e568-4955-ae40-e3fb2cc52743-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"1b5b9fea-e568-4955-ae40-e3fb2cc52743\") " pod="openstack/manila-share-share1-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.536077 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b5b9fea-e568-4955-ae40-e3fb2cc52743-scripts\") pod \"manila-share-share1-0\" 
(UID: \"1b5b9fea-e568-4955-ae40-e3fb2cc52743\") " pod="openstack/manila-share-share1-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.536104 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/1b5b9fea-e568-4955-ae40-e3fb2cc52743-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"1b5b9fea-e568-4955-ae40-e3fb2cc52743\") " pod="openstack/manila-share-share1-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.536121 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b5b9fea-e568-4955-ae40-e3fb2cc52743-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"1b5b9fea-e568-4955-ae40-e3fb2cc52743\") " pod="openstack/manila-share-share1-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.536143 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c57bd1b-33d0-464b-ac4e-204e90cebd48-scripts\") pod \"manila-scheduler-0\" (UID: \"1c57bd1b-33d0-464b-ac4e-204e90cebd48\") " pod="openstack/manila-scheduler-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.536191 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c57bd1b-33d0-464b-ac4e-204e90cebd48-config-data\") pod \"manila-scheduler-0\" (UID: \"1c57bd1b-33d0-464b-ac4e-204e90cebd48\") " pod="openstack/manila-scheduler-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.536210 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c57bd1b-33d0-464b-ac4e-204e90cebd48-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"1c57bd1b-33d0-464b-ac4e-204e90cebd48\") " pod="openstack/manila-scheduler-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.536230 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c57bd1b-33d0-464b-ac4e-204e90cebd48-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"1c57bd1b-33d0-464b-ac4e-204e90cebd48\") " pod="openstack/manila-scheduler-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.536257 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805-dns-svc\") pod \"dnsmasq-dns-5b74d5677c-vqlcm\" (UID: \"b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805\") " pod="openstack/dnsmasq-dns-5b74d5677c-vqlcm" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.536285 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5lzz\" (UniqueName: \"kubernetes.io/projected/1b5b9fea-e568-4955-ae40-e3fb2cc52743-kube-api-access-m5lzz\") pod \"manila-share-share1-0\" (UID: \"1b5b9fea-e568-4955-ae40-e3fb2cc52743\") " pod="openstack/manila-share-share1-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.536313 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805-config\") pod \"dnsmasq-dns-5b74d5677c-vqlcm\" (UID: \"b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805\") " pod="openstack/dnsmasq-dns-5b74d5677c-vqlcm" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.536344 4962 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvqmj\" (UniqueName: \"kubernetes.io/projected/b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805-kube-api-access-wvqmj\") pod \"dnsmasq-dns-5b74d5677c-vqlcm\" (UID: \"b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805\") " pod="openstack/dnsmasq-dns-5b74d5677c-vqlcm" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.536441 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1b5b9fea-e568-4955-ae40-e3fb2cc52743-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"1b5b9fea-e568-4955-ae40-e3fb2cc52743\") " pod="openstack/manila-share-share1-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.541755 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c57bd1b-33d0-464b-ac4e-204e90cebd48-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"1c57bd1b-33d0-464b-ac4e-204e90cebd48\") " pod="openstack/manila-scheduler-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.553713 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/1b5b9fea-e568-4955-ae40-e3fb2cc52743-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"1b5b9fea-e568-4955-ae40-e3fb2cc52743\") " pod="openstack/manila-share-share1-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.554884 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c57bd1b-33d0-464b-ac4e-204e90cebd48-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"1c57bd1b-33d0-464b-ac4e-204e90cebd48\") " pod="openstack/manila-scheduler-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.556164 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b5b9fea-e568-4955-ae40-e3fb2cc52743-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"1b5b9fea-e568-4955-ae40-e3fb2cc52743\") " pod="openstack/manila-share-share1-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.562161 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1b5b9fea-e568-4955-ae40-e3fb2cc52743-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"1b5b9fea-e568-4955-ae40-e3fb2cc52743\") " pod="openstack/manila-share-share1-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.566139 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c57bd1b-33d0-464b-ac4e-204e90cebd48-config-data\") pod \"manila-scheduler-0\" (UID: \"1c57bd1b-33d0-464b-ac4e-204e90cebd48\") " pod="openstack/manila-scheduler-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.566552 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b5b9fea-e568-4955-ae40-e3fb2cc52743-config-data\") pod \"manila-share-share1-0\" (UID: \"1b5b9fea-e568-4955-ae40-e3fb2cc52743\") " pod="openstack/manila-share-share1-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.570088 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c57bd1b-33d0-464b-ac4e-204e90cebd48-scripts\") pod \"manila-scheduler-0\" (UID: 
\"1c57bd1b-33d0-464b-ac4e-204e90cebd48\") " pod="openstack/manila-scheduler-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.570406 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1b5b9fea-e568-4955-ae40-e3fb2cc52743-ceph\") pod \"manila-share-share1-0\" (UID: \"1b5b9fea-e568-4955-ae40-e3fb2cc52743\") " pod="openstack/manila-share-share1-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.570531 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c57bd1b-33d0-464b-ac4e-204e90cebd48-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"1c57bd1b-33d0-464b-ac4e-204e90cebd48\") " pod="openstack/manila-scheduler-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.573059 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b5b9fea-e568-4955-ae40-e3fb2cc52743-scripts\") pod \"manila-share-share1-0\" (UID: \"1b5b9fea-e568-4955-ae40-e3fb2cc52743\") " pod="openstack/manila-share-share1-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.584267 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5lzz\" (UniqueName: \"kubernetes.io/projected/1b5b9fea-e568-4955-ae40-e3fb2cc52743-kube-api-access-m5lzz\") pod \"manila-share-share1-0\" (UID: \"1b5b9fea-e568-4955-ae40-e3fb2cc52743\") " pod="openstack/manila-share-share1-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.585211 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7d6bn\" (UniqueName: \"kubernetes.io/projected/1c57bd1b-33d0-464b-ac4e-204e90cebd48-kube-api-access-7d6bn\") pod \"manila-scheduler-0\" (UID: \"1c57bd1b-33d0-464b-ac4e-204e90cebd48\") " pod="openstack/manila-scheduler-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.643511 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.643936 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805-config\") pod \"dnsmasq-dns-5b74d5677c-vqlcm\" (UID: \"b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805\") " pod="openstack/dnsmasq-dns-5b74d5677c-vqlcm" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.644005 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvqmj\" (UniqueName: \"kubernetes.io/projected/b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805-kube-api-access-wvqmj\") pod \"dnsmasq-dns-5b74d5677c-vqlcm\" (UID: \"b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805\") " pod="openstack/dnsmasq-dns-5b74d5677c-vqlcm" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.644024 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805-ovsdbserver-nb\") pod \"dnsmasq-dns-5b74d5677c-vqlcm\" (UID: \"b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805\") " pod="openstack/dnsmasq-dns-5b74d5677c-vqlcm" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.644039 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805-ovsdbserver-sb\") pod \"dnsmasq-dns-5b74d5677c-vqlcm\" (UID: \"b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805\") " pod="openstack/dnsmasq-dns-5b74d5677c-vqlcm" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.644192 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805-dns-svc\") pod \"dnsmasq-dns-5b74d5677c-vqlcm\" (UID: \"b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805\") " pod="openstack/dnsmasq-dns-5b74d5677c-vqlcm" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.645426 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805-ovsdbserver-nb\") pod \"dnsmasq-dns-5b74d5677c-vqlcm\" (UID: \"b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805\") " pod="openstack/dnsmasq-dns-5b74d5677c-vqlcm" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.645878 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805-config\") pod \"dnsmasq-dns-5b74d5677c-vqlcm\" (UID: \"b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805\") " pod="openstack/dnsmasq-dns-5b74d5677c-vqlcm" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.645951 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805-dns-svc\") pod \"dnsmasq-dns-5b74d5677c-vqlcm\" (UID: \"b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805\") " pod="openstack/dnsmasq-dns-5b74d5677c-vqlcm" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.647266 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805-ovsdbserver-sb\") pod \"dnsmasq-dns-5b74d5677c-vqlcm\" (UID: \"b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805\") " pod="openstack/dnsmasq-dns-5b74d5677c-vqlcm" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.693169 4962 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvqmj\" (UniqueName: \"kubernetes.io/projected/b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805-kube-api-access-wvqmj\") pod \"dnsmasq-dns-5b74d5677c-vqlcm\" (UID: \"b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805\") " pod="openstack/dnsmasq-dns-5b74d5677c-vqlcm" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.698087 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.701791 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.712322 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.742158 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.780260 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b74d5677c-vqlcm" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.851807 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k8bf\" (UniqueName: \"kubernetes.io/projected/46631f63-74e6-4de8-9b05-38d7e110e604-kube-api-access-2k8bf\") pod \"manila-api-0\" (UID: \"46631f63-74e6-4de8-9b05-38d7e110e604\") " pod="openstack/manila-api-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.851965 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/46631f63-74e6-4de8-9b05-38d7e110e604-etc-machine-id\") pod \"manila-api-0\" (UID: \"46631f63-74e6-4de8-9b05-38d7e110e604\") " pod="openstack/manila-api-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.852290 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/46631f63-74e6-4de8-9b05-38d7e110e604-config-data-custom\") pod \"manila-api-0\" (UID: \"46631f63-74e6-4de8-9b05-38d7e110e604\") " pod="openstack/manila-api-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.852391 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46631f63-74e6-4de8-9b05-38d7e110e604-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"46631f63-74e6-4de8-9b05-38d7e110e604\") " pod="openstack/manila-api-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.852414 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46631f63-74e6-4de8-9b05-38d7e110e604-scripts\") pod \"manila-api-0\" (UID: \"46631f63-74e6-4de8-9b05-38d7e110e604\") " pod="openstack/manila-api-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.852530 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46631f63-74e6-4de8-9b05-38d7e110e604-config-data\") pod \"manila-api-0\" (UID: \"46631f63-74e6-4de8-9b05-38d7e110e604\") " pod="openstack/manila-api-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.852556 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46631f63-74e6-4de8-9b05-38d7e110e604-logs\") pod \"manila-api-0\" (UID: \"46631f63-74e6-4de8-9b05-38d7e110e604\") " pod="openstack/manila-api-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.867339 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.954324 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/46631f63-74e6-4de8-9b05-38d7e110e604-config-data-custom\") pod \"manila-api-0\" (UID: \"46631f63-74e6-4de8-9b05-38d7e110e604\") " pod="openstack/manila-api-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.954371 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46631f63-74e6-4de8-9b05-38d7e110e604-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"46631f63-74e6-4de8-9b05-38d7e110e604\") " pod="openstack/manila-api-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.954389 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46631f63-74e6-4de8-9b05-38d7e110e604-scripts\") pod \"manila-api-0\" (UID: \"46631f63-74e6-4de8-9b05-38d7e110e604\") " pod="openstack/manila-api-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.954464 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46631f63-74e6-4de8-9b05-38d7e110e604-config-data\") pod \"manila-api-0\" (UID: \"46631f63-74e6-4de8-9b05-38d7e110e604\") " pod="openstack/manila-api-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.954484 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46631f63-74e6-4de8-9b05-38d7e110e604-logs\") pod \"manila-api-0\" (UID: \"46631f63-74e6-4de8-9b05-38d7e110e604\") " pod="openstack/manila-api-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.954543 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k8bf\" (UniqueName: \"kubernetes.io/projected/46631f63-74e6-4de8-9b05-38d7e110e604-kube-api-access-2k8bf\") pod \"manila-api-0\" (UID: \"46631f63-74e6-4de8-9b05-38d7e110e604\") " pod="openstack/manila-api-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.954604 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/46631f63-74e6-4de8-9b05-38d7e110e604-etc-machine-id\") pod \"manila-api-0\" (UID: \"46631f63-74e6-4de8-9b05-38d7e110e604\") " pod="openstack/manila-api-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.954784 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/46631f63-74e6-4de8-9b05-38d7e110e604-etc-machine-id\") pod \"manila-api-0\" (UID: \"46631f63-74e6-4de8-9b05-38d7e110e604\") " pod="openstack/manila-api-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.958077 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46631f63-74e6-4de8-9b05-38d7e110e604-logs\") pod \"manila-api-0\" (UID: \"46631f63-74e6-4de8-9b05-38d7e110e604\") " pod="openstack/manila-api-0" Oct 03 14:39:37 crc 
kubenswrapper[4962]: I1003 14:39:37.974469 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46631f63-74e6-4de8-9b05-38d7e110e604-scripts\") pod \"manila-api-0\" (UID: \"46631f63-74e6-4de8-9b05-38d7e110e604\") " pod="openstack/manila-api-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.974490 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46631f63-74e6-4de8-9b05-38d7e110e604-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"46631f63-74e6-4de8-9b05-38d7e110e604\") " pod="openstack/manila-api-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.975574 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/46631f63-74e6-4de8-9b05-38d7e110e604-config-data-custom\") pod \"manila-api-0\" (UID: \"46631f63-74e6-4de8-9b05-38d7e110e604\") " pod="openstack/manila-api-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.980034 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k8bf\" (UniqueName: \"kubernetes.io/projected/46631f63-74e6-4de8-9b05-38d7e110e604-kube-api-access-2k8bf\") pod \"manila-api-0\" (UID: \"46631f63-74e6-4de8-9b05-38d7e110e604\") " pod="openstack/manila-api-0" Oct 03 14:39:37 crc kubenswrapper[4962]: I1003 14:39:37.980936 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46631f63-74e6-4de8-9b05-38d7e110e604-config-data\") pod \"manila-api-0\" (UID: \"46631f63-74e6-4de8-9b05-38d7e110e604\") " pod="openstack/manila-api-0" Oct 03 14:39:38 crc kubenswrapper[4962]: I1003 14:39:38.073567 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Oct 03 14:39:38 crc kubenswrapper[4962]: I1003 14:39:38.406659 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 03 14:39:38 crc kubenswrapper[4962]: I1003 14:39:38.503195 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b74d5677c-vqlcm"] Oct 03 14:39:38 crc kubenswrapper[4962]: W1003 14:39:38.509102 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5bcf41e_bc2f_4f59_b0ca_a1ff18fc8805.slice/crio-8b614a8b851786c9014f6e9fdbe0de4bb9c53ebacbccc6d5fe1b8620484280ae WatchSource:0}: Error finding container 8b614a8b851786c9014f6e9fdbe0de4bb9c53ebacbccc6d5fe1b8620484280ae: Status 404 returned error can't find the container with id 8b614a8b851786c9014f6e9fdbe0de4bb9c53ebacbccc6d5fe1b8620484280ae Oct 03 14:39:38 crc kubenswrapper[4962]: I1003 14:39:38.616344 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 03 14:39:38 crc kubenswrapper[4962]: I1003 14:39:38.877776 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 03 14:39:38 crc kubenswrapper[4962]: W1003 14:39:38.905506 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46631f63_74e6_4de8_9b05_38d7e110e604.slice/crio-593b5d853575c054f833246dc22fe2802619268d28e4fb27b97589b099b58c7e WatchSource:0}: Error finding container 593b5d853575c054f833246dc22fe2802619268d28e4fb27b97589b099b58c7e: Status 404 returned error can't find the container with id 593b5d853575c054f833246dc22fe2802619268d28e4fb27b97589b099b58c7e Oct 03 14:39:39 crc kubenswrapper[4962]: I1003 14:39:39.053566 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"1b5b9fea-e568-4955-ae40-e3fb2cc52743","Type":"ContainerStarted","Data":"cae3cbe0a58b06331313812bd91f945b6a9247e3125c1c409810db4d74f8d987"} Oct 03 14:39:39 crc kubenswrapper[4962]: I1003 14:39:39.060800 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"46631f63-74e6-4de8-9b05-38d7e110e604","Type":"ContainerStarted","Data":"593b5d853575c054f833246dc22fe2802619268d28e4fb27b97589b099b58c7e"} Oct 03 14:39:39 crc kubenswrapper[4962]: I1003 14:39:39.062238 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"1c57bd1b-33d0-464b-ac4e-204e90cebd48","Type":"ContainerStarted","Data":"814702591f4eaa79d7c968a64b8698e5c54249ff51bf89a06a4a9f0133d1c169"} Oct 03 14:39:39 crc kubenswrapper[4962]: I1003 14:39:39.064580 4962 generic.go:334] "Generic (PLEG): container finished" podID="b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805" containerID="5d64a2a476ff200465bd6d1813460740ea54d4cff4bb5359082057a36b430d68" exitCode=0 Oct 03 14:39:39 crc kubenswrapper[4962]: I1003 14:39:39.064652 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b74d5677c-vqlcm" event={"ID":"b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805","Type":"ContainerDied","Data":"5d64a2a476ff200465bd6d1813460740ea54d4cff4bb5359082057a36b430d68"} Oct 03 14:39:39 crc kubenswrapper[4962]: I1003 14:39:39.064687 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b74d5677c-vqlcm" event={"ID":"b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805","Type":"ContainerStarted","Data":"8b614a8b851786c9014f6e9fdbe0de4bb9c53ebacbccc6d5fe1b8620484280ae"} 
Oct 03 14:39:40 crc kubenswrapper[4962]: I1003 14:39:40.081834 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"46631f63-74e6-4de8-9b05-38d7e110e604","Type":"ContainerStarted","Data":"48fbab8c32a8dd99ee365b9a4434bb9c896016c311086babaaf27b2089e83858"} Oct 03 14:39:40 crc kubenswrapper[4962]: I1003 14:39:40.082392 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"46631f63-74e6-4de8-9b05-38d7e110e604","Type":"ContainerStarted","Data":"4a39f93382ba364d62b0e7a1b9b990153e0c2937e70dfa19fe610db405933fc0"} Oct 03 14:39:40 crc kubenswrapper[4962]: I1003 14:39:40.082462 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Oct 03 14:39:40 crc kubenswrapper[4962]: I1003 14:39:40.088098 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b74d5677c-vqlcm" event={"ID":"b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805","Type":"ContainerStarted","Data":"8a11b24b46677cc27fe9bbf875625c223be5cb61c5819ecc86fb66eb9b50c14d"} Oct 03 14:39:40 crc kubenswrapper[4962]: I1003 14:39:40.088323 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b74d5677c-vqlcm" Oct 03 14:39:40 crc kubenswrapper[4962]: I1003 14:39:40.133472 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=3.133456588 podStartE2EDuration="3.133456588s" podCreationTimestamp="2025-10-03 14:39:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:39:40.110939087 +0000 UTC m=+6588.514836932" watchObservedRunningTime="2025-10-03 14:39:40.133456588 +0000 UTC m=+6588.537354413" Oct 03 14:39:40 crc kubenswrapper[4962]: I1003 14:39:40.135949 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b74d5677c-vqlcm" podStartSLOduration=3.135940084 podStartE2EDuration="3.135940084s" podCreationTimestamp="2025-10-03 14:39:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:39:40.128272569 +0000 UTC m=+6588.532170424" watchObservedRunningTime="2025-10-03 14:39:40.135940084 +0000 UTC m=+6588.539837919" Oct 03 14:39:41 crc kubenswrapper[4962]: I1003 14:39:41.099118 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"1c57bd1b-33d0-464b-ac4e-204e90cebd48","Type":"ContainerStarted","Data":"c0fbd6ec506a37a29a3f9e25c4515d8ded2c94aa3f93577f067f86c3e6876f3c"} Oct 03 14:39:41 crc kubenswrapper[4962]: I1003 14:39:41.099696 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"1c57bd1b-33d0-464b-ac4e-204e90cebd48","Type":"ContainerStarted","Data":"8c63a2408ceec71cc6b0d981ba44af52a38ab032338c231c6e53017754a3ea6f"} Oct 03 14:39:41 crc kubenswrapper[4962]: I1003 14:39:41.132939 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=2.858202705 podStartE2EDuration="4.132898111s" podCreationTimestamp="2025-10-03 14:39:37 +0000 UTC" firstStartedPulling="2025-10-03 14:39:38.63715358 +0000 UTC m=+6587.041051415" lastFinishedPulling="2025-10-03 14:39:39.911848976 +0000 UTC m=+6588.315746821" observedRunningTime="2025-10-03 14:39:41.115681282 +0000 UTC m=+6589.519579137" watchObservedRunningTime="2025-10-03 
14:39:41.132898111 +0000 UTC m=+6589.536795946" Oct 03 14:39:47 crc kubenswrapper[4962]: I1003 14:39:47.783587 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b74d5677c-vqlcm" Oct 03 14:39:47 crc kubenswrapper[4962]: I1003 14:39:47.854094 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-699498fcb9-wgjvx"] Oct 03 14:39:47 crc kubenswrapper[4962]: I1003 14:39:47.854359 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-699498fcb9-wgjvx" podUID="2b7423e1-7cb7-456a-aaf3-011b6795240d" containerName="dnsmasq-dns" containerID="cri-o://fcda40a21a7008ba001fad062163bb3582f05f2d8f2afe32740fb8b3fc040fe3" gracePeriod=10 Oct 03 14:39:47 crc kubenswrapper[4962]: I1003 14:39:47.868223 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Oct 03 14:39:48 crc kubenswrapper[4962]: I1003 14:39:48.160965 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-699498fcb9-wgjvx" podUID="2b7423e1-7cb7-456a-aaf3-011b6795240d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.77:5353: connect: connection refused" Oct 03 14:39:48 crc kubenswrapper[4962]: I1003 14:39:48.186184 4962 generic.go:334] "Generic (PLEG): container finished" podID="2b7423e1-7cb7-456a-aaf3-011b6795240d" containerID="fcda40a21a7008ba001fad062163bb3582f05f2d8f2afe32740fb8b3fc040fe3" exitCode=0 Oct 03 14:39:48 crc kubenswrapper[4962]: I1003 14:39:48.186223 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699498fcb9-wgjvx" event={"ID":"2b7423e1-7cb7-456a-aaf3-011b6795240d","Type":"ContainerDied","Data":"fcda40a21a7008ba001fad062163bb3582f05f2d8f2afe32740fb8b3fc040fe3"} Oct 03 14:39:50 crc kubenswrapper[4962]: I1003 14:39:50.819949 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-699498fcb9-wgjvx" Oct 03 14:39:50 crc kubenswrapper[4962]: I1003 14:39:50.878935 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b7423e1-7cb7-456a-aaf3-011b6795240d-dns-svc\") pod \"2b7423e1-7cb7-456a-aaf3-011b6795240d\" (UID: \"2b7423e1-7cb7-456a-aaf3-011b6795240d\") " Oct 03 14:39:50 crc kubenswrapper[4962]: I1003 14:39:50.878999 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b7423e1-7cb7-456a-aaf3-011b6795240d-ovsdbserver-sb\") pod \"2b7423e1-7cb7-456a-aaf3-011b6795240d\" (UID: \"2b7423e1-7cb7-456a-aaf3-011b6795240d\") " Oct 03 14:39:50 crc kubenswrapper[4962]: I1003 14:39:50.879042 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jvhd\" (UniqueName: \"kubernetes.io/projected/2b7423e1-7cb7-456a-aaf3-011b6795240d-kube-api-access-4jvhd\") pod \"2b7423e1-7cb7-456a-aaf3-011b6795240d\" (UID: \"2b7423e1-7cb7-456a-aaf3-011b6795240d\") " Oct 03 14:39:50 crc kubenswrapper[4962]: I1003 14:39:50.879113 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b7423e1-7cb7-456a-aaf3-011b6795240d-ovsdbserver-nb\") pod \"2b7423e1-7cb7-456a-aaf3-011b6795240d\" (UID: \"2b7423e1-7cb7-456a-aaf3-011b6795240d\") " Oct 03 14:39:50 crc kubenswrapper[4962]: I1003 14:39:50.879159 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b7423e1-7cb7-456a-aaf3-011b6795240d-config\") pod \"2b7423e1-7cb7-456a-aaf3-011b6795240d\" (UID: \"2b7423e1-7cb7-456a-aaf3-011b6795240d\") " Oct 03 14:39:50 crc kubenswrapper[4962]: I1003 14:39:50.890578 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b7423e1-7cb7-456a-aaf3-011b6795240d-kube-api-access-4jvhd" (OuterVolumeSpecName: "kube-api-access-4jvhd") pod "2b7423e1-7cb7-456a-aaf3-011b6795240d" (UID: "2b7423e1-7cb7-456a-aaf3-011b6795240d"). InnerVolumeSpecName "kube-api-access-4jvhd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:39:50 crc kubenswrapper[4962]: I1003 14:39:50.932657 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 14:39:50 crc kubenswrapper[4962]: I1003 14:39:50.932914 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="70003b92-6321-4e11-97b5-f25362d6d29d" containerName="ceilometer-central-agent" containerID="cri-o://afe94f2aeba3732068aedd5205f7c7a8b1466f92c8bdc8cf72d7bb2b64cf689a" gracePeriod=30 Oct 03 14:39:50 crc kubenswrapper[4962]: I1003 14:39:50.933317 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="70003b92-6321-4e11-97b5-f25362d6d29d" containerName="proxy-httpd" containerID="cri-o://25cc75d6e4777cdc689c7d8f33cf836d01709650997197823544bd89ff33e926" gracePeriod=30 Oct 03 14:39:50 crc kubenswrapper[4962]: I1003 14:39:50.933364 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="70003b92-6321-4e11-97b5-f25362d6d29d" containerName="sg-core" containerID="cri-o://58a18586e670b7ad5d11b44daf34e3b66e275de56df9250ec49de92f0cc28b5d" gracePeriod=30 Oct 03 14:39:50 crc kubenswrapper[4962]: I1003 14:39:50.933401 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="70003b92-6321-4e11-97b5-f25362d6d29d" containerName="ceilometer-notification-agent" containerID="cri-o://764d02f035315964298f451338fe9be9a5741d3ca019716d964e895a8822b49b" gracePeriod=30 Oct 03 14:39:50 crc kubenswrapper[4962]: I1003 14:39:50.965573 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b7423e1-7cb7-456a-aaf3-011b6795240d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2b7423e1-7cb7-456a-aaf3-011b6795240d" (UID: "2b7423e1-7cb7-456a-aaf3-011b6795240d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:39:50 crc kubenswrapper[4962]: I1003 14:39:50.965678 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b7423e1-7cb7-456a-aaf3-011b6795240d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2b7423e1-7cb7-456a-aaf3-011b6795240d" (UID: "2b7423e1-7cb7-456a-aaf3-011b6795240d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:39:50 crc kubenswrapper[4962]: I1003 14:39:50.967306 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b7423e1-7cb7-456a-aaf3-011b6795240d-config" (OuterVolumeSpecName: "config") pod "2b7423e1-7cb7-456a-aaf3-011b6795240d" (UID: "2b7423e1-7cb7-456a-aaf3-011b6795240d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:39:50 crc kubenswrapper[4962]: I1003 14:39:50.981035 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b7423e1-7cb7-456a-aaf3-011b6795240d-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:39:50 crc kubenswrapper[4962]: I1003 14:39:50.981067 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b7423e1-7cb7-456a-aaf3-011b6795240d-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 14:39:50 crc kubenswrapper[4962]: I1003 14:39:50.981078 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jvhd\" (UniqueName: \"kubernetes.io/projected/2b7423e1-7cb7-456a-aaf3-011b6795240d-kube-api-access-4jvhd\") on node \"crc\" DevicePath \"\"" Oct 03 14:39:50 crc kubenswrapper[4962]: I1003 14:39:50.981091 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b7423e1-7cb7-456a-aaf3-011b6795240d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 14:39:50 crc kubenswrapper[4962]: I1003 14:39:50.988392 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b7423e1-7cb7-456a-aaf3-011b6795240d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2b7423e1-7cb7-456a-aaf3-011b6795240d" (UID: "2b7423e1-7cb7-456a-aaf3-011b6795240d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:39:51 crc kubenswrapper[4962]: I1003 14:39:51.081911 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b7423e1-7cb7-456a-aaf3-011b6795240d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 14:39:51 crc kubenswrapper[4962]: I1003 14:39:51.221018 4962 generic.go:334] "Generic (PLEG): container finished" podID="70003b92-6321-4e11-97b5-f25362d6d29d" containerID="25cc75d6e4777cdc689c7d8f33cf836d01709650997197823544bd89ff33e926" exitCode=0 Oct 03 14:39:51 crc kubenswrapper[4962]: I1003 14:39:51.221047 4962 generic.go:334] "Generic (PLEG): container finished" podID="70003b92-6321-4e11-97b5-f25362d6d29d" containerID="58a18586e670b7ad5d11b44daf34e3b66e275de56df9250ec49de92f0cc28b5d" exitCode=2 Oct 03 14:39:51 crc kubenswrapper[4962]: I1003 14:39:51.221053 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70003b92-6321-4e11-97b5-f25362d6d29d","Type":"ContainerDied","Data":"25cc75d6e4777cdc689c7d8f33cf836d01709650997197823544bd89ff33e926"} Oct 03 14:39:51 crc kubenswrapper[4962]: I1003 14:39:51.221090 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70003b92-6321-4e11-97b5-f25362d6d29d","Type":"ContainerDied","Data":"58a18586e670b7ad5d11b44daf34e3b66e275de56df9250ec49de92f0cc28b5d"} Oct 03 14:39:51 crc kubenswrapper[4962]: I1003 14:39:51.222803 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"1b5b9fea-e568-4955-ae40-e3fb2cc52743","Type":"ContainerStarted","Data":"63c64fb2e2e8d29ff54c0544243b599d4432c94419688f4d7c06a6a168013c90"} Oct 03 14:39:51 crc kubenswrapper[4962]: E1003 14:39:51.223022 4962 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70003b92_6321_4e11_97b5_f25362d6d29d.slice/crio-58a18586e670b7ad5d11b44daf34e3b66e275de56df9250ec49de92f0cc28b5d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70003b92_6321_4e11_97b5_f25362d6d29d.slice/crio-conmon-25cc75d6e4777cdc689c7d8f33cf836d01709650997197823544bd89ff33e926.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70003b92_6321_4e11_97b5_f25362d6d29d.slice/crio-conmon-58a18586e670b7ad5d11b44daf34e3b66e275de56df9250ec49de92f0cc28b5d.scope\": RecentStats: unable to find data in memory cache]" Oct 03 14:39:51 crc kubenswrapper[4962]: I1003 14:39:51.224668 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699498fcb9-wgjvx" event={"ID":"2b7423e1-7cb7-456a-aaf3-011b6795240d","Type":"ContainerDied","Data":"0af41f5ee0742003c6de5107c24c3f1c2113762fab195694cbd2b2221f97e14b"} Oct 03 14:39:51 crc kubenswrapper[4962]: I1003 14:39:51.224712 4962 scope.go:117] "RemoveContainer" containerID="fcda40a21a7008ba001fad062163bb3582f05f2d8f2afe32740fb8b3fc040fe3" Oct 03 14:39:51 crc kubenswrapper[4962]: I1003 14:39:51.224839 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-699498fcb9-wgjvx" Oct 03 14:39:51 crc kubenswrapper[4962]: I1003 14:39:51.274588 4962 scope.go:117] "RemoveContainer" containerID="bcadc39a9449b21064b23ef1501c63a320c3d5fcbea108fd322458cafb40e85d" Oct 03 14:39:51 crc kubenswrapper[4962]: I1003 14:39:51.279352 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-699498fcb9-wgjvx"] Oct 03 14:39:51 crc kubenswrapper[4962]: I1003 14:39:51.287209 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-699498fcb9-wgjvx"] Oct 03 14:39:52 crc kubenswrapper[4962]: I1003 14:39:52.236578 4962 scope.go:117] "RemoveContainer" containerID="e273395719cf61a5af9b7bc53d31b2386f724aead6ba873c67b6fad6aa1c9256" Oct 03 14:39:52 crc kubenswrapper[4962]: E1003 14:39:52.237470 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:39:52 crc kubenswrapper[4962]: I1003 14:39:52.244550 4962 generic.go:334] "Generic (PLEG): container finished" podID="70003b92-6321-4e11-97b5-f25362d6d29d" containerID="afe94f2aeba3732068aedd5205f7c7a8b1466f92c8bdc8cf72d7bb2b64cf689a" exitCode=0 Oct 03 14:39:52 crc kubenswrapper[4962]: I1003 14:39:52.245209 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b7423e1-7cb7-456a-aaf3-011b6795240d" path="/var/lib/kubelet/pods/2b7423e1-7cb7-456a-aaf3-011b6795240d/volumes" Oct 03 14:39:52 crc kubenswrapper[4962]: I1003 14:39:52.247287 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70003b92-6321-4e11-97b5-f25362d6d29d","Type":"ContainerDied","Data":"afe94f2aeba3732068aedd5205f7c7a8b1466f92c8bdc8cf72d7bb2b64cf689a"} Oct 03 14:39:52 crc kubenswrapper[4962]: I1003 14:39:52.247319 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" 
event={"ID":"1b5b9fea-e568-4955-ae40-e3fb2cc52743","Type":"ContainerStarted","Data":"7974fb04c3421d26b36a8a034ceeae7102c2ce9f7385180c8c12097fdfe1339d"} Oct 03 14:39:52 crc kubenswrapper[4962]: I1003 14:39:52.269956 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.328358718 podStartE2EDuration="15.269939034s" podCreationTimestamp="2025-10-03 14:39:37 +0000 UTC" firstStartedPulling="2025-10-03 14:39:38.417842179 +0000 UTC m=+6586.821740014" lastFinishedPulling="2025-10-03 14:39:50.359422495 +0000 UTC m=+6598.763320330" observedRunningTime="2025-10-03 14:39:52.265757963 +0000 UTC m=+6600.669655818" watchObservedRunningTime="2025-10-03 14:39:52.269939034 +0000 UTC m=+6600.673836869" Oct 03 14:39:53 crc kubenswrapper[4962]: I1003 14:39:53.373135 4962 generic.go:334] "Generic (PLEG): container finished" podID="70003b92-6321-4e11-97b5-f25362d6d29d" containerID="764d02f035315964298f451338fe9be9a5741d3ca019716d964e895a8822b49b" exitCode=0 Oct 03 14:39:53 crc kubenswrapper[4962]: I1003 14:39:53.373792 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70003b92-6321-4e11-97b5-f25362d6d29d","Type":"ContainerDied","Data":"764d02f035315964298f451338fe9be9a5741d3ca019716d964e895a8822b49b"} Oct 03 14:39:53 crc kubenswrapper[4962]: I1003 14:39:53.583471 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 14:39:53 crc kubenswrapper[4962]: I1003 14:39:53.744110 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70003b92-6321-4e11-97b5-f25362d6d29d-log-httpd\") pod \"70003b92-6321-4e11-97b5-f25362d6d29d\" (UID: \"70003b92-6321-4e11-97b5-f25362d6d29d\") " Oct 03 14:39:53 crc kubenswrapper[4962]: I1003 14:39:53.744171 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70003b92-6321-4e11-97b5-f25362d6d29d-run-httpd\") pod \"70003b92-6321-4e11-97b5-f25362d6d29d\" (UID: \"70003b92-6321-4e11-97b5-f25362d6d29d\") " Oct 03 14:39:53 crc kubenswrapper[4962]: I1003 14:39:53.744307 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70003b92-6321-4e11-97b5-f25362d6d29d-config-data\") pod \"70003b92-6321-4e11-97b5-f25362d6d29d\" (UID: \"70003b92-6321-4e11-97b5-f25362d6d29d\") " Oct 03 14:39:53 crc kubenswrapper[4962]: I1003 14:39:53.744358 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/70003b92-6321-4e11-97b5-f25362d6d29d-sg-core-conf-yaml\") pod \"70003b92-6321-4e11-97b5-f25362d6d29d\" (UID: \"70003b92-6321-4e11-97b5-f25362d6d29d\") " Oct 03 14:39:53 crc kubenswrapper[4962]: I1003 14:39:53.744384 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70003b92-6321-4e11-97b5-f25362d6d29d-combined-ca-bundle\") pod \"70003b92-6321-4e11-97b5-f25362d6d29d\" (UID: \"70003b92-6321-4e11-97b5-f25362d6d29d\") " Oct 03 14:39:53 crc kubenswrapper[4962]: I1003 14:39:53.744422 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fw9tr\" (UniqueName: \"kubernetes.io/projected/70003b92-6321-4e11-97b5-f25362d6d29d-kube-api-access-fw9tr\") pod 
\"70003b92-6321-4e11-97b5-f25362d6d29d\" (UID: \"70003b92-6321-4e11-97b5-f25362d6d29d\") " Oct 03 14:39:53 crc kubenswrapper[4962]: I1003 14:39:53.744457 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70003b92-6321-4e11-97b5-f25362d6d29d-scripts\") pod \"70003b92-6321-4e11-97b5-f25362d6d29d\" (UID: \"70003b92-6321-4e11-97b5-f25362d6d29d\") " Oct 03 14:39:53 crc kubenswrapper[4962]: I1003 14:39:53.744559 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70003b92-6321-4e11-97b5-f25362d6d29d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "70003b92-6321-4e11-97b5-f25362d6d29d" (UID: "70003b92-6321-4e11-97b5-f25362d6d29d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:39:53 crc kubenswrapper[4962]: I1003 14:39:53.745384 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70003b92-6321-4e11-97b5-f25362d6d29d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "70003b92-6321-4e11-97b5-f25362d6d29d" (UID: "70003b92-6321-4e11-97b5-f25362d6d29d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:39:53 crc kubenswrapper[4962]: I1003 14:39:53.745519 4962 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70003b92-6321-4e11-97b5-f25362d6d29d-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 14:39:53 crc kubenswrapper[4962]: I1003 14:39:53.752011 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70003b92-6321-4e11-97b5-f25362d6d29d-kube-api-access-fw9tr" (OuterVolumeSpecName: "kube-api-access-fw9tr") pod "70003b92-6321-4e11-97b5-f25362d6d29d" (UID: "70003b92-6321-4e11-97b5-f25362d6d29d"). InnerVolumeSpecName "kube-api-access-fw9tr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:39:53 crc kubenswrapper[4962]: I1003 14:39:53.753597 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70003b92-6321-4e11-97b5-f25362d6d29d-scripts" (OuterVolumeSpecName: "scripts") pod "70003b92-6321-4e11-97b5-f25362d6d29d" (UID: "70003b92-6321-4e11-97b5-f25362d6d29d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:39:53 crc kubenswrapper[4962]: I1003 14:39:53.774068 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70003b92-6321-4e11-97b5-f25362d6d29d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "70003b92-6321-4e11-97b5-f25362d6d29d" (UID: "70003b92-6321-4e11-97b5-f25362d6d29d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:39:53 crc kubenswrapper[4962]: I1003 14:39:53.833509 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70003b92-6321-4e11-97b5-f25362d6d29d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "70003b92-6321-4e11-97b5-f25362d6d29d" (UID: "70003b92-6321-4e11-97b5-f25362d6d29d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:39:53 crc kubenswrapper[4962]: I1003 14:39:53.848165 4962 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70003b92-6321-4e11-97b5-f25362d6d29d-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 14:39:53 crc kubenswrapper[4962]: I1003 14:39:53.848205 4962 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/70003b92-6321-4e11-97b5-f25362d6d29d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 14:39:53 crc kubenswrapper[4962]: I1003 14:39:53.848219 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70003b92-6321-4e11-97b5-f25362d6d29d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:39:53 crc kubenswrapper[4962]: I1003 14:39:53.848233 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fw9tr\" (UniqueName: \"kubernetes.io/projected/70003b92-6321-4e11-97b5-f25362d6d29d-kube-api-access-fw9tr\") on node \"crc\" DevicePath \"\"" Oct 03 14:39:53 crc kubenswrapper[4962]: I1003 14:39:53.848247 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70003b92-6321-4e11-97b5-f25362d6d29d-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 14:39:53 crc kubenswrapper[4962]: I1003 14:39:53.870502 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70003b92-6321-4e11-97b5-f25362d6d29d-config-data" (OuterVolumeSpecName: "config-data") pod "70003b92-6321-4e11-97b5-f25362d6d29d" (UID: "70003b92-6321-4e11-97b5-f25362d6d29d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:39:53 crc kubenswrapper[4962]: I1003 14:39:53.950620 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70003b92-6321-4e11-97b5-f25362d6d29d-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:39:54 crc kubenswrapper[4962]: I1003 14:39:54.385655 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70003b92-6321-4e11-97b5-f25362d6d29d","Type":"ContainerDied","Data":"efd9b883cc08d6a0b1ac834e41ebaac40b8406dc744a270b082ece8dd3785185"} Oct 03 14:39:54 crc kubenswrapper[4962]: I1003 14:39:54.386729 4962 scope.go:117] "RemoveContainer" containerID="25cc75d6e4777cdc689c7d8f33cf836d01709650997197823544bd89ff33e926" Oct 03 14:39:54 crc kubenswrapper[4962]: I1003 14:39:54.385734 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 14:39:54 crc kubenswrapper[4962]: I1003 14:39:54.412357 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 14:39:54 crc kubenswrapper[4962]: I1003 14:39:54.421491 4962 scope.go:117] "RemoveContainer" containerID="58a18586e670b7ad5d11b44daf34e3b66e275de56df9250ec49de92f0cc28b5d" Oct 03 14:39:54 crc kubenswrapper[4962]: I1003 14:39:54.425880 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 14:39:54 crc kubenswrapper[4962]: I1003 14:39:54.445686 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 14:39:54 crc kubenswrapper[4962]: E1003 14:39:54.446190 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70003b92-6321-4e11-97b5-f25362d6d29d" containerName="proxy-httpd" Oct 03 14:39:54 crc kubenswrapper[4962]: I1003 14:39:54.446215 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="70003b92-6321-4e11-97b5-f25362d6d29d" containerName="proxy-httpd" Oct 03 14:39:54 crc kubenswrapper[4962]: E1003 14:39:54.446227 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70003b92-6321-4e11-97b5-f25362d6d29d" containerName="sg-core" Oct 03 14:39:54 crc kubenswrapper[4962]: I1003 14:39:54.446233 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="70003b92-6321-4e11-97b5-f25362d6d29d" containerName="sg-core" Oct 03 14:39:54 crc kubenswrapper[4962]: E1003 14:39:54.446248 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b7423e1-7cb7-456a-aaf3-011b6795240d" containerName="init" Oct 03 14:39:54 crc kubenswrapper[4962]: I1003 14:39:54.446254 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b7423e1-7cb7-456a-aaf3-011b6795240d" containerName="init" Oct 03 14:39:54 crc kubenswrapper[4962]: E1003 14:39:54.446279 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70003b92-6321-4e11-97b5-f25362d6d29d" containerName="ceilometer-central-agent" Oct 03 14:39:54 crc kubenswrapper[4962]: I1003 14:39:54.446284 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="70003b92-6321-4e11-97b5-f25362d6d29d" containerName="ceilometer-central-agent" Oct 03 14:39:54 crc kubenswrapper[4962]: E1003 14:39:54.446305 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70003b92-6321-4e11-97b5-f25362d6d29d" containerName="ceilometer-notification-agent" Oct 03 14:39:54 crc kubenswrapper[4962]: I1003 14:39:54.446311 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="70003b92-6321-4e11-97b5-f25362d6d29d" containerName="ceilometer-notification-agent" Oct 03 14:39:54 crc kubenswrapper[4962]: E1003 14:39:54.446325 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b7423e1-7cb7-456a-aaf3-011b6795240d" containerName="dnsmasq-dns" Oct 03 14:39:54 crc kubenswrapper[4962]: I1003 14:39:54.446331 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b7423e1-7cb7-456a-aaf3-011b6795240d" containerName="dnsmasq-dns" Oct 03 14:39:54 crc kubenswrapper[4962]: I1003 14:39:54.447712 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="70003b92-6321-4e11-97b5-f25362d6d29d" containerName="ceilometer-notification-agent" Oct 03 14:39:54 crc kubenswrapper[4962]: I1003 14:39:54.447756 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="70003b92-6321-4e11-97b5-f25362d6d29d" containerName="proxy-httpd" Oct 03 14:39:54 crc kubenswrapper[4962]: I1003 14:39:54.447768 4962 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="70003b92-6321-4e11-97b5-f25362d6d29d" containerName="ceilometer-central-agent" Oct 03 14:39:54 crc kubenswrapper[4962]: I1003 14:39:54.447790 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="70003b92-6321-4e11-97b5-f25362d6d29d" containerName="sg-core" Oct 03 14:39:54 crc kubenswrapper[4962]: I1003 14:39:54.447807 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b7423e1-7cb7-456a-aaf3-011b6795240d" containerName="dnsmasq-dns" Oct 03 14:39:54 crc kubenswrapper[4962]: I1003 14:39:54.449800 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 14:39:54 crc kubenswrapper[4962]: I1003 14:39:54.454007 4962 scope.go:117] "RemoveContainer" containerID="764d02f035315964298f451338fe9be9a5741d3ca019716d964e895a8822b49b" Oct 03 14:39:54 crc kubenswrapper[4962]: I1003 14:39:54.455780 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 14:39:54 crc kubenswrapper[4962]: I1003 14:39:54.456152 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 14:39:54 crc kubenswrapper[4962]: I1003 14:39:54.465048 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 14:39:54 crc kubenswrapper[4962]: I1003 14:39:54.490827 4962 scope.go:117] "RemoveContainer" containerID="afe94f2aeba3732068aedd5205f7c7a8b1466f92c8bdc8cf72d7bb2b64cf689a" Oct 03 14:39:54 crc kubenswrapper[4962]: I1003 14:39:54.567015 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xzgf\" (UniqueName: \"kubernetes.io/projected/e4905874-9e48-44e5-9c3d-e5e10844a4b2-kube-api-access-2xzgf\") pod \"ceilometer-0\" (UID: \"e4905874-9e48-44e5-9c3d-e5e10844a4b2\") " pod="openstack/ceilometer-0" Oct 03 14:39:54 crc kubenswrapper[4962]: I1003 14:39:54.567060 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4905874-9e48-44e5-9c3d-e5e10844a4b2-scripts\") pod \"ceilometer-0\" (UID: \"e4905874-9e48-44e5-9c3d-e5e10844a4b2\") " pod="openstack/ceilometer-0" Oct 03 14:39:54 crc kubenswrapper[4962]: I1003 14:39:54.567471 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4905874-9e48-44e5-9c3d-e5e10844a4b2-log-httpd\") pod \"ceilometer-0\" (UID: \"e4905874-9e48-44e5-9c3d-e5e10844a4b2\") " pod="openstack/ceilometer-0" Oct 03 14:39:54 crc kubenswrapper[4962]: I1003 14:39:54.568018 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4905874-9e48-44e5-9c3d-e5e10844a4b2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e4905874-9e48-44e5-9c3d-e5e10844a4b2\") " pod="openstack/ceilometer-0" Oct 03 14:39:54 crc kubenswrapper[4962]: I1003 14:39:54.568065 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4905874-9e48-44e5-9c3d-e5e10844a4b2-run-httpd\") pod \"ceilometer-0\" (UID: \"e4905874-9e48-44e5-9c3d-e5e10844a4b2\") " pod="openstack/ceilometer-0" Oct 03 14:39:54 crc kubenswrapper[4962]: I1003 14:39:54.568083 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/e4905874-9e48-44e5-9c3d-e5e10844a4b2-config-data\") pod \"ceilometer-0\" (UID: \"e4905874-9e48-44e5-9c3d-e5e10844a4b2\") " pod="openstack/ceilometer-0" Oct 03 14:39:54 crc kubenswrapper[4962]: I1003 14:39:54.568153 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4905874-9e48-44e5-9c3d-e5e10844a4b2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e4905874-9e48-44e5-9c3d-e5e10844a4b2\") " pod="openstack/ceilometer-0" Oct 03 14:39:54 crc kubenswrapper[4962]: I1003 14:39:54.669475 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4905874-9e48-44e5-9c3d-e5e10844a4b2-log-httpd\") pod \"ceilometer-0\" (UID: \"e4905874-9e48-44e5-9c3d-e5e10844a4b2\") " pod="openstack/ceilometer-0" Oct 03 14:39:54 crc kubenswrapper[4962]: I1003 14:39:54.669577 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4905874-9e48-44e5-9c3d-e5e10844a4b2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e4905874-9e48-44e5-9c3d-e5e10844a4b2\") " pod="openstack/ceilometer-0" Oct 03 14:39:54 crc kubenswrapper[4962]: I1003 14:39:54.669628 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4905874-9e48-44e5-9c3d-e5e10844a4b2-run-httpd\") pod \"ceilometer-0\" (UID: \"e4905874-9e48-44e5-9c3d-e5e10844a4b2\") " pod="openstack/ceilometer-0" Oct 03 14:39:54 crc kubenswrapper[4962]: I1003 14:39:54.669660 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4905874-9e48-44e5-9c3d-e5e10844a4b2-config-data\") pod \"ceilometer-0\" (UID: \"e4905874-9e48-44e5-9c3d-e5e10844a4b2\") " pod="openstack/ceilometer-0" Oct 03 14:39:54 crc kubenswrapper[4962]: I1003 14:39:54.669710 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4905874-9e48-44e5-9c3d-e5e10844a4b2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e4905874-9e48-44e5-9c3d-e5e10844a4b2\") " pod="openstack/ceilometer-0" Oct 03 14:39:54 crc kubenswrapper[4962]: I1003 14:39:54.669761 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xzgf\" (UniqueName: \"kubernetes.io/projected/e4905874-9e48-44e5-9c3d-e5e10844a4b2-kube-api-access-2xzgf\") pod \"ceilometer-0\" (UID: \"e4905874-9e48-44e5-9c3d-e5e10844a4b2\") " pod="openstack/ceilometer-0" Oct 03 14:39:54 crc kubenswrapper[4962]: I1003 14:39:54.669784 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4905874-9e48-44e5-9c3d-e5e10844a4b2-scripts\") pod \"ceilometer-0\" (UID: \"e4905874-9e48-44e5-9c3d-e5e10844a4b2\") " pod="openstack/ceilometer-0" Oct 03 14:39:54 crc kubenswrapper[4962]: I1003 14:39:54.670161 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4905874-9e48-44e5-9c3d-e5e10844a4b2-run-httpd\") pod \"ceilometer-0\" (UID: \"e4905874-9e48-44e5-9c3d-e5e10844a4b2\") " pod="openstack/ceilometer-0" Oct 03 14:39:54 crc kubenswrapper[4962]: I1003 14:39:54.670434 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/e4905874-9e48-44e5-9c3d-e5e10844a4b2-log-httpd\") pod \"ceilometer-0\" (UID: \"e4905874-9e48-44e5-9c3d-e5e10844a4b2\") " pod="openstack/ceilometer-0" Oct 03 14:39:54 crc kubenswrapper[4962]: I1003 14:39:54.674261 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4905874-9e48-44e5-9c3d-e5e10844a4b2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e4905874-9e48-44e5-9c3d-e5e10844a4b2\") " pod="openstack/ceilometer-0" Oct 03 14:39:54 crc kubenswrapper[4962]: I1003 14:39:54.674867 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4905874-9e48-44e5-9c3d-e5e10844a4b2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e4905874-9e48-44e5-9c3d-e5e10844a4b2\") " pod="openstack/ceilometer-0" Oct 03 14:39:54 crc kubenswrapper[4962]: I1003 14:39:54.675332 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4905874-9e48-44e5-9c3d-e5e10844a4b2-config-data\") pod \"ceilometer-0\" (UID: \"e4905874-9e48-44e5-9c3d-e5e10844a4b2\") " pod="openstack/ceilometer-0" Oct 03 14:39:54 crc kubenswrapper[4962]: I1003 14:39:54.675340 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4905874-9e48-44e5-9c3d-e5e10844a4b2-scripts\") pod \"ceilometer-0\" (UID: \"e4905874-9e48-44e5-9c3d-e5e10844a4b2\") " pod="openstack/ceilometer-0" Oct 03 14:39:54 crc kubenswrapper[4962]: I1003 14:39:54.691176 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xzgf\" (UniqueName: \"kubernetes.io/projected/e4905874-9e48-44e5-9c3d-e5e10844a4b2-kube-api-access-2xzgf\") pod \"ceilometer-0\" (UID: \"e4905874-9e48-44e5-9c3d-e5e10844a4b2\") " pod="openstack/ceilometer-0" Oct 03 14:39:54 crc kubenswrapper[4962]: I1003 14:39:54.776002 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 14:39:55 crc kubenswrapper[4962]: I1003 14:39:55.258179 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 14:39:55 crc kubenswrapper[4962]: W1003 14:39:55.262444 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4905874_9e48_44e5_9c3d_e5e10844a4b2.slice/crio-bae2cf3083575e6ee5204c277a1ad75bcc39185e696724bf900b6883c4dcb38e WatchSource:0}: Error finding container bae2cf3083575e6ee5204c277a1ad75bcc39185e696724bf900b6883c4dcb38e: Status 404 returned error can't find the container with id bae2cf3083575e6ee5204c277a1ad75bcc39185e696724bf900b6883c4dcb38e Oct 03 14:39:55 crc kubenswrapper[4962]: I1003 14:39:55.264890 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 14:39:55 crc kubenswrapper[4962]: I1003 14:39:55.396301 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4905874-9e48-44e5-9c3d-e5e10844a4b2","Type":"ContainerStarted","Data":"bae2cf3083575e6ee5204c277a1ad75bcc39185e696724bf900b6883c4dcb38e"} Oct 03 14:39:56 crc kubenswrapper[4962]: I1003 14:39:56.248904 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70003b92-6321-4e11-97b5-f25362d6d29d" path="/var/lib/kubelet/pods/70003b92-6321-4e11-97b5-f25362d6d29d/volumes" Oct 03 14:39:56 crc kubenswrapper[4962]: I1003 14:39:56.416969 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4905874-9e48-44e5-9c3d-e5e10844a4b2","Type":"ContainerStarted","Data":"83e96e2fcd5d6b2296325480da2b41920c0e06f27d901d1806fb7a435716e97c"} Oct 03 14:39:57 crc kubenswrapper[4962]: I1003 14:39:57.435745 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4905874-9e48-44e5-9c3d-e5e10844a4b2","Type":"ContainerStarted","Data":"60224ad7ddf9f3380cae8bb213c5509237f8f16c9369869b33085a036fb17299"} Oct 03 14:39:57 crc kubenswrapper[4962]: I1003 14:39:57.436171 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4905874-9e48-44e5-9c3d-e5e10844a4b2","Type":"ContainerStarted","Data":"e9b97069c53d7ebd68c738282fa3ced5cf79c0d989f38f1f63c0ba2883c9ccdc"} Oct 03 14:39:57 crc kubenswrapper[4962]: I1003 14:39:57.643782 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Oct 03 14:39:59 crc kubenswrapper[4962]: I1003 14:39:59.441990 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Oct 03 14:39:59 crc kubenswrapper[4962]: I1003 14:39:59.466415 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Oct 03 14:40:00 crc kubenswrapper[4962]: I1003 14:40:00.464038 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4905874-9e48-44e5-9c3d-e5e10844a4b2","Type":"ContainerStarted","Data":"a10e30d8e4d9e0f774448b7c91a35fd0d63ef5d5f3ee0ab2ed6d786bfe2ab906"} Oct 03 14:40:00 crc kubenswrapper[4962]: I1003 14:40:00.464603 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 14:40:00 crc kubenswrapper[4962]: I1003 14:40:00.495033 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.552476893 
podStartE2EDuration="6.495017032s" podCreationTimestamp="2025-10-03 14:39:54 +0000 UTC" firstStartedPulling="2025-10-03 14:39:55.264551944 +0000 UTC m=+6603.668449779" lastFinishedPulling="2025-10-03 14:39:59.207092083 +0000 UTC m=+6607.610989918" observedRunningTime="2025-10-03 14:40:00.485045536 +0000 UTC m=+6608.888943381" watchObservedRunningTime="2025-10-03 14:40:00.495017032 +0000 UTC m=+6608.898914867" Oct 03 14:40:03 crc kubenswrapper[4962]: I1003 14:40:03.227287 4962 scope.go:117] "RemoveContainer" containerID="e273395719cf61a5af9b7bc53d31b2386f724aead6ba873c67b6fad6aa1c9256" Oct 03 14:40:03 crc kubenswrapper[4962]: E1003 14:40:03.227854 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:40:09 crc kubenswrapper[4962]: I1003 14:40:09.236369 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Oct 03 14:40:17 crc kubenswrapper[4962]: I1003 14:40:17.227541 4962 scope.go:117] "RemoveContainer" containerID="e273395719cf61a5af9b7bc53d31b2386f724aead6ba873c67b6fad6aa1c9256" Oct 03 14:40:17 crc kubenswrapper[4962]: E1003 14:40:17.229483 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:40:23 crc kubenswrapper[4962]: I1003 14:40:23.349887 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zs6wr"] Oct 03 14:40:23 crc kubenswrapper[4962]: I1003 14:40:23.353237 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zs6wr" Oct 03 14:40:23 crc kubenswrapper[4962]: I1003 14:40:23.364087 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zs6wr"] Oct 03 14:40:23 crc kubenswrapper[4962]: I1003 14:40:23.515118 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a1e2014-a326-494b-b87e-3d102b91652e-catalog-content\") pod \"redhat-marketplace-zs6wr\" (UID: \"3a1e2014-a326-494b-b87e-3d102b91652e\") " pod="openshift-marketplace/redhat-marketplace-zs6wr" Oct 03 14:40:23 crc kubenswrapper[4962]: I1003 14:40:23.515239 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlcpq\" (UniqueName: \"kubernetes.io/projected/3a1e2014-a326-494b-b87e-3d102b91652e-kube-api-access-tlcpq\") pod \"redhat-marketplace-zs6wr\" (UID: \"3a1e2014-a326-494b-b87e-3d102b91652e\") " pod="openshift-marketplace/redhat-marketplace-zs6wr" Oct 03 14:40:23 crc kubenswrapper[4962]: I1003 14:40:23.515705 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a1e2014-a326-494b-b87e-3d102b91652e-utilities\") pod \"redhat-marketplace-zs6wr\" (UID: \"3a1e2014-a326-494b-b87e-3d102b91652e\") " pod="openshift-marketplace/redhat-marketplace-zs6wr" Oct 03 14:40:23 crc kubenswrapper[4962]: I1003 14:40:23.618166 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a1e2014-a326-494b-b87e-3d102b91652e-utilities\") pod \"redhat-marketplace-zs6wr\" (UID: \"3a1e2014-a326-494b-b87e-3d102b91652e\") " pod="openshift-marketplace/redhat-marketplace-zs6wr" Oct 03 14:40:23 crc kubenswrapper[4962]: I1003 14:40:23.618341 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a1e2014-a326-494b-b87e-3d102b91652e-catalog-content\") pod \"redhat-marketplace-zs6wr\" (UID: \"3a1e2014-a326-494b-b87e-3d102b91652e\") " pod="openshift-marketplace/redhat-marketplace-zs6wr" Oct 03 14:40:23 crc kubenswrapper[4962]: I1003 14:40:23.618416 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlcpq\" (UniqueName: \"kubernetes.io/projected/3a1e2014-a326-494b-b87e-3d102b91652e-kube-api-access-tlcpq\") pod \"redhat-marketplace-zs6wr\" (UID: \"3a1e2014-a326-494b-b87e-3d102b91652e\") " pod="openshift-marketplace/redhat-marketplace-zs6wr" Oct 03 14:40:23 crc kubenswrapper[4962]: I1003 14:40:23.618721 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a1e2014-a326-494b-b87e-3d102b91652e-utilities\") pod \"redhat-marketplace-zs6wr\" (UID: \"3a1e2014-a326-494b-b87e-3d102b91652e\") " pod="openshift-marketplace/redhat-marketplace-zs6wr" Oct 03 14:40:23 crc kubenswrapper[4962]: I1003 14:40:23.618827 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a1e2014-a326-494b-b87e-3d102b91652e-catalog-content\") pod \"redhat-marketplace-zs6wr\" (UID: \"3a1e2014-a326-494b-b87e-3d102b91652e\") " pod="openshift-marketplace/redhat-marketplace-zs6wr" Oct 03 14:40:23 crc kubenswrapper[4962]: I1003 14:40:23.637278 4962 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-tlcpq\" (UniqueName: \"kubernetes.io/projected/3a1e2014-a326-494b-b87e-3d102b91652e-kube-api-access-tlcpq\") pod \"redhat-marketplace-zs6wr\" (UID: \"3a1e2014-a326-494b-b87e-3d102b91652e\") " pod="openshift-marketplace/redhat-marketplace-zs6wr" Oct 03 14:40:23 crc kubenswrapper[4962]: I1003 14:40:23.723824 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zs6wr" Oct 03 14:40:24 crc kubenswrapper[4962]: I1003 14:40:24.211949 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zs6wr"] Oct 03 14:40:24 crc kubenswrapper[4962]: I1003 14:40:24.701659 4962 generic.go:334] "Generic (PLEG): container finished" podID="3a1e2014-a326-494b-b87e-3d102b91652e" containerID="25bf0f115c524665d53250e4c2fdfda7a0b2d349e8df90adbc01e3dfe947e612" exitCode=0 Oct 03 14:40:24 crc kubenswrapper[4962]: I1003 14:40:24.701724 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zs6wr" event={"ID":"3a1e2014-a326-494b-b87e-3d102b91652e","Type":"ContainerDied","Data":"25bf0f115c524665d53250e4c2fdfda7a0b2d349e8df90adbc01e3dfe947e612"} Oct 03 14:40:24 crc kubenswrapper[4962]: I1003 14:40:24.701968 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zs6wr" event={"ID":"3a1e2014-a326-494b-b87e-3d102b91652e","Type":"ContainerStarted","Data":"d5ce6c4b04136ef8a52988a34b1d79614db3f110216612121092599dc8ce247b"} Oct 03 14:40:24 crc kubenswrapper[4962]: I1003 14:40:24.783827 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 03 14:40:25 crc kubenswrapper[4962]: I1003 14:40:25.715012 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zs6wr" event={"ID":"3a1e2014-a326-494b-b87e-3d102b91652e","Type":"ContainerStarted","Data":"4a13932ad26672e767dc2d3e99fefec658ae95864b1203fe2be4dd166489de7e"} Oct 03 14:40:26 crc kubenswrapper[4962]: I1003 14:40:26.724457 4962 generic.go:334] "Generic (PLEG): container finished" podID="3a1e2014-a326-494b-b87e-3d102b91652e" containerID="4a13932ad26672e767dc2d3e99fefec658ae95864b1203fe2be4dd166489de7e" exitCode=0 Oct 03 14:40:26 crc kubenswrapper[4962]: I1003 14:40:26.724554 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zs6wr" event={"ID":"3a1e2014-a326-494b-b87e-3d102b91652e","Type":"ContainerDied","Data":"4a13932ad26672e767dc2d3e99fefec658ae95864b1203fe2be4dd166489de7e"} Oct 03 14:40:27 crc kubenswrapper[4962]: I1003 14:40:27.742270 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zs6wr" event={"ID":"3a1e2014-a326-494b-b87e-3d102b91652e","Type":"ContainerStarted","Data":"fcf2216e1c9cca37ddf342202b01fbc5f071b86c9067c85a4f6b2a83ac48cb56"} Oct 03 14:40:27 crc kubenswrapper[4962]: I1003 14:40:27.763492 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zs6wr" podStartSLOduration=2.358262492 podStartE2EDuration="4.763475308s" podCreationTimestamp="2025-10-03 14:40:23 +0000 UTC" firstStartedPulling="2025-10-03 14:40:24.703185586 +0000 UTC m=+6633.107083411" lastFinishedPulling="2025-10-03 14:40:27.108398392 +0000 UTC m=+6635.512296227" observedRunningTime="2025-10-03 14:40:27.75752921 +0000 UTC m=+6636.161427095" watchObservedRunningTime="2025-10-03 14:40:27.763475308 +0000 UTC 
m=+6636.167373133" Oct 03 14:40:32 crc kubenswrapper[4962]: I1003 14:40:32.239116 4962 scope.go:117] "RemoveContainer" containerID="e273395719cf61a5af9b7bc53d31b2386f724aead6ba873c67b6fad6aa1c9256" Oct 03 14:40:32 crc kubenswrapper[4962]: E1003 14:40:32.240174 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:40:33 crc kubenswrapper[4962]: I1003 14:40:33.724240 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zs6wr" Oct 03 14:40:33 crc kubenswrapper[4962]: I1003 14:40:33.724530 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zs6wr" Oct 03 14:40:33 crc kubenswrapper[4962]: I1003 14:40:33.769330 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zs6wr" Oct 03 14:40:33 crc kubenswrapper[4962]: I1003 14:40:33.836549 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zs6wr" Oct 03 14:40:34 crc kubenswrapper[4962]: I1003 14:40:34.138109 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zs6wr"] Oct 03 14:40:35 crc kubenswrapper[4962]: I1003 14:40:35.819725 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zs6wr" podUID="3a1e2014-a326-494b-b87e-3d102b91652e" containerName="registry-server" containerID="cri-o://fcf2216e1c9cca37ddf342202b01fbc5f071b86c9067c85a4f6b2a83ac48cb56" gracePeriod=2 Oct 03 14:40:36 crc kubenswrapper[4962]: I1003 14:40:36.584692 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zs6wr" Oct 03 14:40:36 crc kubenswrapper[4962]: I1003 14:40:36.688194 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a1e2014-a326-494b-b87e-3d102b91652e-utilities\") pod \"3a1e2014-a326-494b-b87e-3d102b91652e\" (UID: \"3a1e2014-a326-494b-b87e-3d102b91652e\") " Oct 03 14:40:36 crc kubenswrapper[4962]: I1003 14:40:36.688395 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlcpq\" (UniqueName: \"kubernetes.io/projected/3a1e2014-a326-494b-b87e-3d102b91652e-kube-api-access-tlcpq\") pod \"3a1e2014-a326-494b-b87e-3d102b91652e\" (UID: \"3a1e2014-a326-494b-b87e-3d102b91652e\") " Oct 03 14:40:36 crc kubenswrapper[4962]: I1003 14:40:36.688475 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a1e2014-a326-494b-b87e-3d102b91652e-catalog-content\") pod \"3a1e2014-a326-494b-b87e-3d102b91652e\" (UID: \"3a1e2014-a326-494b-b87e-3d102b91652e\") " Oct 03 14:40:36 crc kubenswrapper[4962]: I1003 14:40:36.690100 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a1e2014-a326-494b-b87e-3d102b91652e-utilities" (OuterVolumeSpecName: "utilities") pod "3a1e2014-a326-494b-b87e-3d102b91652e" (UID: "3a1e2014-a326-494b-b87e-3d102b91652e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:40:36 crc kubenswrapper[4962]: I1003 14:40:36.693628 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a1e2014-a326-494b-b87e-3d102b91652e-kube-api-access-tlcpq" (OuterVolumeSpecName: "kube-api-access-tlcpq") pod "3a1e2014-a326-494b-b87e-3d102b91652e" (UID: "3a1e2014-a326-494b-b87e-3d102b91652e"). InnerVolumeSpecName "kube-api-access-tlcpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:40:36 crc kubenswrapper[4962]: I1003 14:40:36.702138 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a1e2014-a326-494b-b87e-3d102b91652e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3a1e2014-a326-494b-b87e-3d102b91652e" (UID: "3a1e2014-a326-494b-b87e-3d102b91652e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:40:36 crc kubenswrapper[4962]: I1003 14:40:36.790881 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlcpq\" (UniqueName: \"kubernetes.io/projected/3a1e2014-a326-494b-b87e-3d102b91652e-kube-api-access-tlcpq\") on node \"crc\" DevicePath \"\"" Oct 03 14:40:36 crc kubenswrapper[4962]: I1003 14:40:36.790915 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a1e2014-a326-494b-b87e-3d102b91652e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 14:40:36 crc kubenswrapper[4962]: I1003 14:40:36.790924 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a1e2014-a326-494b-b87e-3d102b91652e-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 14:40:36 crc kubenswrapper[4962]: I1003 14:40:36.838083 4962 generic.go:334] "Generic (PLEG): container finished" podID="3a1e2014-a326-494b-b87e-3d102b91652e" containerID="fcf2216e1c9cca37ddf342202b01fbc5f071b86c9067c85a4f6b2a83ac48cb56" exitCode=0 Oct 03 14:40:36 crc kubenswrapper[4962]: I1003 14:40:36.838132 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zs6wr" event={"ID":"3a1e2014-a326-494b-b87e-3d102b91652e","Type":"ContainerDied","Data":"fcf2216e1c9cca37ddf342202b01fbc5f071b86c9067c85a4f6b2a83ac48cb56"} Oct 03 14:40:36 crc kubenswrapper[4962]: I1003 14:40:36.838162 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zs6wr" event={"ID":"3a1e2014-a326-494b-b87e-3d102b91652e","Type":"ContainerDied","Data":"d5ce6c4b04136ef8a52988a34b1d79614db3f110216612121092599dc8ce247b"} Oct 03 14:40:36 crc kubenswrapper[4962]: I1003 14:40:36.838183 4962 scope.go:117] "RemoveContainer" containerID="fcf2216e1c9cca37ddf342202b01fbc5f071b86c9067c85a4f6b2a83ac48cb56" Oct 03 14:40:36 crc kubenswrapper[4962]: I1003 14:40:36.838190 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zs6wr" Oct 03 14:40:36 crc kubenswrapper[4962]: I1003 14:40:36.872180 4962 scope.go:117] "RemoveContainer" containerID="4a13932ad26672e767dc2d3e99fefec658ae95864b1203fe2be4dd166489de7e" Oct 03 14:40:36 crc kubenswrapper[4962]: I1003 14:40:36.885784 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zs6wr"] Oct 03 14:40:36 crc kubenswrapper[4962]: I1003 14:40:36.896256 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zs6wr"] Oct 03 14:40:36 crc kubenswrapper[4962]: I1003 14:40:36.909174 4962 scope.go:117] "RemoveContainer" containerID="25bf0f115c524665d53250e4c2fdfda7a0b2d349e8df90adbc01e3dfe947e612" Oct 03 14:40:36 crc kubenswrapper[4962]: I1003 14:40:36.952517 4962 scope.go:117] "RemoveContainer" containerID="fcf2216e1c9cca37ddf342202b01fbc5f071b86c9067c85a4f6b2a83ac48cb56" Oct 03 14:40:36 crc kubenswrapper[4962]: E1003 14:40:36.952949 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcf2216e1c9cca37ddf342202b01fbc5f071b86c9067c85a4f6b2a83ac48cb56\": container with ID starting with fcf2216e1c9cca37ddf342202b01fbc5f071b86c9067c85a4f6b2a83ac48cb56 not found: ID does not exist" containerID="fcf2216e1c9cca37ddf342202b01fbc5f071b86c9067c85a4f6b2a83ac48cb56" Oct 03 14:40:36 crc kubenswrapper[4962]: I1003 14:40:36.953055 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcf2216e1c9cca37ddf342202b01fbc5f071b86c9067c85a4f6b2a83ac48cb56"} err="failed to get container status \"fcf2216e1c9cca37ddf342202b01fbc5f071b86c9067c85a4f6b2a83ac48cb56\": rpc error: code = NotFound desc = could not find container \"fcf2216e1c9cca37ddf342202b01fbc5f071b86c9067c85a4f6b2a83ac48cb56\": container with ID starting with fcf2216e1c9cca37ddf342202b01fbc5f071b86c9067c85a4f6b2a83ac48cb56 not found: ID does not exist" Oct 03 14:40:36 crc kubenswrapper[4962]: I1003 14:40:36.953181 4962 scope.go:117] "RemoveContainer" containerID="4a13932ad26672e767dc2d3e99fefec658ae95864b1203fe2be4dd166489de7e" Oct 03 14:40:36 crc kubenswrapper[4962]: E1003 14:40:36.953692 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a13932ad26672e767dc2d3e99fefec658ae95864b1203fe2be4dd166489de7e\": container with ID starting with 4a13932ad26672e767dc2d3e99fefec658ae95864b1203fe2be4dd166489de7e not found: ID does not exist" containerID="4a13932ad26672e767dc2d3e99fefec658ae95864b1203fe2be4dd166489de7e" Oct 03 14:40:36 crc kubenswrapper[4962]: I1003 14:40:36.953722 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a13932ad26672e767dc2d3e99fefec658ae95864b1203fe2be4dd166489de7e"} err="failed to get container status \"4a13932ad26672e767dc2d3e99fefec658ae95864b1203fe2be4dd166489de7e\": rpc error: code = NotFound desc = could not find container \"4a13932ad26672e767dc2d3e99fefec658ae95864b1203fe2be4dd166489de7e\": container with ID starting with 4a13932ad26672e767dc2d3e99fefec658ae95864b1203fe2be4dd166489de7e not found: ID does not exist" Oct 03 14:40:36 crc kubenswrapper[4962]: I1003 14:40:36.953744 4962 scope.go:117] "RemoveContainer" containerID="25bf0f115c524665d53250e4c2fdfda7a0b2d349e8df90adbc01e3dfe947e612" Oct 03 14:40:36 crc kubenswrapper[4962]: E1003 14:40:36.954007 4962 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"25bf0f115c524665d53250e4c2fdfda7a0b2d349e8df90adbc01e3dfe947e612\": container with ID starting with 25bf0f115c524665d53250e4c2fdfda7a0b2d349e8df90adbc01e3dfe947e612 not found: ID does not exist" containerID="25bf0f115c524665d53250e4c2fdfda7a0b2d349e8df90adbc01e3dfe947e612" Oct 03 14:40:36 crc kubenswrapper[4962]: I1003 14:40:36.954028 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25bf0f115c524665d53250e4c2fdfda7a0b2d349e8df90adbc01e3dfe947e612"} err="failed to get container status \"25bf0f115c524665d53250e4c2fdfda7a0b2d349e8df90adbc01e3dfe947e612\": rpc error: code = NotFound desc = could not find container \"25bf0f115c524665d53250e4c2fdfda7a0b2d349e8df90adbc01e3dfe947e612\": container with ID starting with 25bf0f115c524665d53250e4c2fdfda7a0b2d349e8df90adbc01e3dfe947e612 not found: ID does not exist" Oct 03 14:40:38 crc kubenswrapper[4962]: I1003 14:40:38.242220 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a1e2014-a326-494b-b87e-3d102b91652e" path="/var/lib/kubelet/pods/3a1e2014-a326-494b-b87e-3d102b91652e/volumes" Oct 03 14:40:47 crc kubenswrapper[4962]: I1003 14:40:47.227358 4962 scope.go:117] "RemoveContainer" containerID="e273395719cf61a5af9b7bc53d31b2386f724aead6ba873c67b6fad6aa1c9256" Oct 03 14:40:47 crc kubenswrapper[4962]: E1003 14:40:47.228022 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:40:55 crc kubenswrapper[4962]: I1003 14:40:55.446768 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-676db89ff9-qgzxp"] Oct 03 14:40:55 crc kubenswrapper[4962]: E1003 14:40:55.447900 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a1e2014-a326-494b-b87e-3d102b91652e" containerName="extract-content" Oct 03 14:40:55 crc kubenswrapper[4962]: I1003 14:40:55.447920 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a1e2014-a326-494b-b87e-3d102b91652e" containerName="extract-content" Oct 03 14:40:55 crc kubenswrapper[4962]: E1003 14:40:55.447950 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a1e2014-a326-494b-b87e-3d102b91652e" containerName="registry-server" Oct 03 14:40:55 crc kubenswrapper[4962]: I1003 14:40:55.447958 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a1e2014-a326-494b-b87e-3d102b91652e" containerName="registry-server" Oct 03 14:40:55 crc kubenswrapper[4962]: E1003 14:40:55.447978 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a1e2014-a326-494b-b87e-3d102b91652e" containerName="extract-utilities" Oct 03 14:40:55 crc kubenswrapper[4962]: I1003 14:40:55.447987 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a1e2014-a326-494b-b87e-3d102b91652e" containerName="extract-utilities" Oct 03 14:40:55 crc kubenswrapper[4962]: I1003 14:40:55.448256 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a1e2014-a326-494b-b87e-3d102b91652e" containerName="registry-server" Oct 03 14:40:55 crc kubenswrapper[4962]: I1003 14:40:55.449820 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-676db89ff9-qgzxp" Oct 03 14:40:55 crc kubenswrapper[4962]: I1003 14:40:55.463939 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1" Oct 03 14:40:55 crc kubenswrapper[4962]: I1003 14:40:55.468079 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-676db89ff9-qgzxp"] Oct 03 14:40:55 crc kubenswrapper[4962]: I1003 14:40:55.590710 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f3810707-a0b2-4cf8-a9cb-68a51b9f44d8-ovsdbserver-nb\") pod \"dnsmasq-dns-676db89ff9-qgzxp\" (UID: \"f3810707-a0b2-4cf8-a9cb-68a51b9f44d8\") " pod="openstack/dnsmasq-dns-676db89ff9-qgzxp" Oct 03 14:40:55 crc kubenswrapper[4962]: I1003 14:40:55.590765 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f3810707-a0b2-4cf8-a9cb-68a51b9f44d8-dns-svc\") pod \"dnsmasq-dns-676db89ff9-qgzxp\" (UID: \"f3810707-a0b2-4cf8-a9cb-68a51b9f44d8\") " pod="openstack/dnsmasq-dns-676db89ff9-qgzxp" Oct 03 14:40:55 crc kubenswrapper[4962]: I1003 14:40:55.590786 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f3810707-a0b2-4cf8-a9cb-68a51b9f44d8-ovsdbserver-sb\") pod \"dnsmasq-dns-676db89ff9-qgzxp\" (UID: \"f3810707-a0b2-4cf8-a9cb-68a51b9f44d8\") " pod="openstack/dnsmasq-dns-676db89ff9-qgzxp" Oct 03 14:40:55 crc kubenswrapper[4962]: I1003 14:40:55.590858 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/f3810707-a0b2-4cf8-a9cb-68a51b9f44d8-openstack-cell1\") pod \"dnsmasq-dns-676db89ff9-qgzxp\" (UID: \"f3810707-a0b2-4cf8-a9cb-68a51b9f44d8\") " pod="openstack/dnsmasq-dns-676db89ff9-qgzxp" Oct 03 14:40:55 crc kubenswrapper[4962]: I1003 14:40:55.590990 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3810707-a0b2-4cf8-a9cb-68a51b9f44d8-config\") pod \"dnsmasq-dns-676db89ff9-qgzxp\" (UID: \"f3810707-a0b2-4cf8-a9cb-68a51b9f44d8\") " pod="openstack/dnsmasq-dns-676db89ff9-qgzxp" Oct 03 14:40:55 crc kubenswrapper[4962]: I1003 14:40:55.591042 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6hk7\" (UniqueName: \"kubernetes.io/projected/f3810707-a0b2-4cf8-a9cb-68a51b9f44d8-kube-api-access-p6hk7\") pod \"dnsmasq-dns-676db89ff9-qgzxp\" (UID: \"f3810707-a0b2-4cf8-a9cb-68a51b9f44d8\") " pod="openstack/dnsmasq-dns-676db89ff9-qgzxp" Oct 03 14:40:55 crc kubenswrapper[4962]: I1003 14:40:55.692656 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f3810707-a0b2-4cf8-a9cb-68a51b9f44d8-ovsdbserver-nb\") pod \"dnsmasq-dns-676db89ff9-qgzxp\" (UID: \"f3810707-a0b2-4cf8-a9cb-68a51b9f44d8\") " pod="openstack/dnsmasq-dns-676db89ff9-qgzxp" Oct 03 14:40:55 crc kubenswrapper[4962]: I1003 14:40:55.692710 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f3810707-a0b2-4cf8-a9cb-68a51b9f44d8-dns-svc\") pod \"dnsmasq-dns-676db89ff9-qgzxp\" (UID: \"f3810707-a0b2-4cf8-a9cb-68a51b9f44d8\") " 
pod="openstack/dnsmasq-dns-676db89ff9-qgzxp" Oct 03 14:40:55 crc kubenswrapper[4962]: I1003 14:40:55.692731 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f3810707-a0b2-4cf8-a9cb-68a51b9f44d8-ovsdbserver-sb\") pod \"dnsmasq-dns-676db89ff9-qgzxp\" (UID: \"f3810707-a0b2-4cf8-a9cb-68a51b9f44d8\") " pod="openstack/dnsmasq-dns-676db89ff9-qgzxp" Oct 03 14:40:55 crc kubenswrapper[4962]: I1003 14:40:55.692784 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/f3810707-a0b2-4cf8-a9cb-68a51b9f44d8-openstack-cell1\") pod \"dnsmasq-dns-676db89ff9-qgzxp\" (UID: \"f3810707-a0b2-4cf8-a9cb-68a51b9f44d8\") " pod="openstack/dnsmasq-dns-676db89ff9-qgzxp" Oct 03 14:40:55 crc kubenswrapper[4962]: I1003 14:40:55.692869 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3810707-a0b2-4cf8-a9cb-68a51b9f44d8-config\") pod \"dnsmasq-dns-676db89ff9-qgzxp\" (UID: \"f3810707-a0b2-4cf8-a9cb-68a51b9f44d8\") " pod="openstack/dnsmasq-dns-676db89ff9-qgzxp" Oct 03 14:40:55 crc kubenswrapper[4962]: I1003 14:40:55.692904 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6hk7\" (UniqueName: \"kubernetes.io/projected/f3810707-a0b2-4cf8-a9cb-68a51b9f44d8-kube-api-access-p6hk7\") pod \"dnsmasq-dns-676db89ff9-qgzxp\" (UID: \"f3810707-a0b2-4cf8-a9cb-68a51b9f44d8\") " pod="openstack/dnsmasq-dns-676db89ff9-qgzxp" Oct 03 14:40:55 crc kubenswrapper[4962]: I1003 14:40:55.694608 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f3810707-a0b2-4cf8-a9cb-68a51b9f44d8-ovsdbserver-sb\") pod \"dnsmasq-dns-676db89ff9-qgzxp\" (UID: \"f3810707-a0b2-4cf8-a9cb-68a51b9f44d8\") " pod="openstack/dnsmasq-dns-676db89ff9-qgzxp" Oct 03 14:40:55 crc kubenswrapper[4962]: I1003 14:40:55.694902 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/f3810707-a0b2-4cf8-a9cb-68a51b9f44d8-openstack-cell1\") pod \"dnsmasq-dns-676db89ff9-qgzxp\" (UID: \"f3810707-a0b2-4cf8-a9cb-68a51b9f44d8\") " pod="openstack/dnsmasq-dns-676db89ff9-qgzxp" Oct 03 14:40:55 crc kubenswrapper[4962]: I1003 14:40:55.694984 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3810707-a0b2-4cf8-a9cb-68a51b9f44d8-config\") pod \"dnsmasq-dns-676db89ff9-qgzxp\" (UID: \"f3810707-a0b2-4cf8-a9cb-68a51b9f44d8\") " pod="openstack/dnsmasq-dns-676db89ff9-qgzxp" Oct 03 14:40:55 crc kubenswrapper[4962]: I1003 14:40:55.694993 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f3810707-a0b2-4cf8-a9cb-68a51b9f44d8-dns-svc\") pod \"dnsmasq-dns-676db89ff9-qgzxp\" (UID: \"f3810707-a0b2-4cf8-a9cb-68a51b9f44d8\") " pod="openstack/dnsmasq-dns-676db89ff9-qgzxp" Oct 03 14:40:55 crc kubenswrapper[4962]: I1003 14:40:55.695094 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f3810707-a0b2-4cf8-a9cb-68a51b9f44d8-ovsdbserver-nb\") pod \"dnsmasq-dns-676db89ff9-qgzxp\" (UID: \"f3810707-a0b2-4cf8-a9cb-68a51b9f44d8\") " pod="openstack/dnsmasq-dns-676db89ff9-qgzxp" Oct 03 14:40:55 crc kubenswrapper[4962]: I1003 14:40:55.718106 4962 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6hk7\" (UniqueName: \"kubernetes.io/projected/f3810707-a0b2-4cf8-a9cb-68a51b9f44d8-kube-api-access-p6hk7\") pod \"dnsmasq-dns-676db89ff9-qgzxp\" (UID: \"f3810707-a0b2-4cf8-a9cb-68a51b9f44d8\") " pod="openstack/dnsmasq-dns-676db89ff9-qgzxp" Oct 03 14:40:55 crc kubenswrapper[4962]: I1003 14:40:55.786425 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-676db89ff9-qgzxp" Oct 03 14:40:56 crc kubenswrapper[4962]: I1003 14:40:56.291291 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-676db89ff9-qgzxp"] Oct 03 14:40:57 crc kubenswrapper[4962]: I1003 14:40:57.053395 4962 generic.go:334] "Generic (PLEG): container finished" podID="f3810707-a0b2-4cf8-a9cb-68a51b9f44d8" containerID="b65b91fb6ae4bd2139305523923cd91b937fa5d5530cf4264b9ee3451a82dd33" exitCode=0 Oct 03 14:40:57 crc kubenswrapper[4962]: I1003 14:40:57.053442 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-676db89ff9-qgzxp" event={"ID":"f3810707-a0b2-4cf8-a9cb-68a51b9f44d8","Type":"ContainerDied","Data":"b65b91fb6ae4bd2139305523923cd91b937fa5d5530cf4264b9ee3451a82dd33"} Oct 03 14:40:57 crc kubenswrapper[4962]: I1003 14:40:57.054269 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-676db89ff9-qgzxp" event={"ID":"f3810707-a0b2-4cf8-a9cb-68a51b9f44d8","Type":"ContainerStarted","Data":"b67c1157aff663ab00728aff979a4ce65bd525bb823ead9415f3ff5e53a0dd04"} Oct 03 14:40:58 crc kubenswrapper[4962]: I1003 14:40:58.065683 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-676db89ff9-qgzxp" event={"ID":"f3810707-a0b2-4cf8-a9cb-68a51b9f44d8","Type":"ContainerStarted","Data":"90bba2b6b9532fd6619ce7078ded020af397163eba609ab58192e3d742e2f37b"} Oct 03 14:40:58 crc kubenswrapper[4962]: I1003 14:40:58.066158 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-676db89ff9-qgzxp" Oct 03 14:40:58 crc kubenswrapper[4962]: I1003 14:40:58.088101 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-676db89ff9-qgzxp" podStartSLOduration=3.088083456 podStartE2EDuration="3.088083456s" podCreationTimestamp="2025-10-03 14:40:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:40:58.080382361 +0000 UTC m=+6666.484280206" watchObservedRunningTime="2025-10-03 14:40:58.088083456 +0000 UTC m=+6666.491981291" Oct 03 14:41:02 crc kubenswrapper[4962]: I1003 14:41:02.237082 4962 scope.go:117] "RemoveContainer" containerID="e273395719cf61a5af9b7bc53d31b2386f724aead6ba873c67b6fad6aa1c9256" Oct 03 14:41:02 crc kubenswrapper[4962]: E1003 14:41:02.238052 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:41:05 crc kubenswrapper[4962]: I1003 14:41:05.788513 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-676db89ff9-qgzxp" Oct 03 14:41:05 crc kubenswrapper[4962]: I1003 14:41:05.859600 4962 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b74d5677c-vqlcm"] Oct 03 14:41:05 crc kubenswrapper[4962]: I1003 14:41:05.859994 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b74d5677c-vqlcm" podUID="b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805" containerName="dnsmasq-dns" containerID="cri-o://8a11b24b46677cc27fe9bbf875625c223be5cb61c5819ecc86fb66eb9b50c14d" gracePeriod=10 Oct 03 14:41:06 crc kubenswrapper[4962]: I1003 14:41:06.019488 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b449d676c-2nhlt"] Oct 03 14:41:06 crc kubenswrapper[4962]: I1003 14:41:06.021512 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b449d676c-2nhlt" Oct 03 14:41:06 crc kubenswrapper[4962]: I1003 14:41:06.057041 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b449d676c-2nhlt"] Oct 03 14:41:06 crc kubenswrapper[4962]: I1003 14:41:06.132966 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pnmv\" (UniqueName: \"kubernetes.io/projected/57a2d47f-f545-4930-8fa1-4833e0b03da3-kube-api-access-2pnmv\") pod \"dnsmasq-dns-5b449d676c-2nhlt\" (UID: \"57a2d47f-f545-4930-8fa1-4833e0b03da3\") " pod="openstack/dnsmasq-dns-5b449d676c-2nhlt" Oct 03 14:41:06 crc kubenswrapper[4962]: I1003 14:41:06.133266 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/57a2d47f-f545-4930-8fa1-4833e0b03da3-ovsdbserver-nb\") pod \"dnsmasq-dns-5b449d676c-2nhlt\" (UID: \"57a2d47f-f545-4930-8fa1-4833e0b03da3\") " pod="openstack/dnsmasq-dns-5b449d676c-2nhlt" Oct 03 14:41:06 crc kubenswrapper[4962]: I1003 14:41:06.133406 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/57a2d47f-f545-4930-8fa1-4833e0b03da3-openstack-cell1\") pod \"dnsmasq-dns-5b449d676c-2nhlt\" (UID: \"57a2d47f-f545-4930-8fa1-4833e0b03da3\") " pod="openstack/dnsmasq-dns-5b449d676c-2nhlt" Oct 03 14:41:06 crc kubenswrapper[4962]: I1003 14:41:06.133578 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57a2d47f-f545-4930-8fa1-4833e0b03da3-config\") pod \"dnsmasq-dns-5b449d676c-2nhlt\" (UID: \"57a2d47f-f545-4930-8fa1-4833e0b03da3\") " pod="openstack/dnsmasq-dns-5b449d676c-2nhlt" Oct 03 14:41:06 crc kubenswrapper[4962]: I1003 14:41:06.133708 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57a2d47f-f545-4930-8fa1-4833e0b03da3-dns-svc\") pod \"dnsmasq-dns-5b449d676c-2nhlt\" (UID: \"57a2d47f-f545-4930-8fa1-4833e0b03da3\") " pod="openstack/dnsmasq-dns-5b449d676c-2nhlt" Oct 03 14:41:06 crc kubenswrapper[4962]: I1003 14:41:06.133792 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/57a2d47f-f545-4930-8fa1-4833e0b03da3-ovsdbserver-sb\") pod \"dnsmasq-dns-5b449d676c-2nhlt\" (UID: \"57a2d47f-f545-4930-8fa1-4833e0b03da3\") " pod="openstack/dnsmasq-dns-5b449d676c-2nhlt" Oct 03 14:41:06 crc kubenswrapper[4962]: I1003 14:41:06.167793 4962 generic.go:334] "Generic (PLEG): container finished" 
podID="b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805" containerID="8a11b24b46677cc27fe9bbf875625c223be5cb61c5819ecc86fb66eb9b50c14d" exitCode=0 Oct 03 14:41:06 crc kubenswrapper[4962]: I1003 14:41:06.167846 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b74d5677c-vqlcm" event={"ID":"b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805","Type":"ContainerDied","Data":"8a11b24b46677cc27fe9bbf875625c223be5cb61c5819ecc86fb66eb9b50c14d"} Oct 03 14:41:06 crc kubenswrapper[4962]: I1003 14:41:06.235294 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57a2d47f-f545-4930-8fa1-4833e0b03da3-config\") pod \"dnsmasq-dns-5b449d676c-2nhlt\" (UID: \"57a2d47f-f545-4930-8fa1-4833e0b03da3\") " pod="openstack/dnsmasq-dns-5b449d676c-2nhlt" Oct 03 14:41:06 crc kubenswrapper[4962]: I1003 14:41:06.235342 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57a2d47f-f545-4930-8fa1-4833e0b03da3-dns-svc\") pod \"dnsmasq-dns-5b449d676c-2nhlt\" (UID: \"57a2d47f-f545-4930-8fa1-4833e0b03da3\") " pod="openstack/dnsmasq-dns-5b449d676c-2nhlt" Oct 03 14:41:06 crc kubenswrapper[4962]: I1003 14:41:06.235358 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/57a2d47f-f545-4930-8fa1-4833e0b03da3-ovsdbserver-sb\") pod \"dnsmasq-dns-5b449d676c-2nhlt\" (UID: \"57a2d47f-f545-4930-8fa1-4833e0b03da3\") " pod="openstack/dnsmasq-dns-5b449d676c-2nhlt" Oct 03 14:41:06 crc kubenswrapper[4962]: I1003 14:41:06.235447 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pnmv\" (UniqueName: \"kubernetes.io/projected/57a2d47f-f545-4930-8fa1-4833e0b03da3-kube-api-access-2pnmv\") pod \"dnsmasq-dns-5b449d676c-2nhlt\" (UID: \"57a2d47f-f545-4930-8fa1-4833e0b03da3\") " pod="openstack/dnsmasq-dns-5b449d676c-2nhlt" Oct 03 14:41:06 crc kubenswrapper[4962]: I1003 14:41:06.235506 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/57a2d47f-f545-4930-8fa1-4833e0b03da3-ovsdbserver-nb\") pod \"dnsmasq-dns-5b449d676c-2nhlt\" (UID: \"57a2d47f-f545-4930-8fa1-4833e0b03da3\") " pod="openstack/dnsmasq-dns-5b449d676c-2nhlt" Oct 03 14:41:06 crc kubenswrapper[4962]: I1003 14:41:06.235534 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/57a2d47f-f545-4930-8fa1-4833e0b03da3-openstack-cell1\") pod \"dnsmasq-dns-5b449d676c-2nhlt\" (UID: \"57a2d47f-f545-4930-8fa1-4833e0b03da3\") " pod="openstack/dnsmasq-dns-5b449d676c-2nhlt" Oct 03 14:41:06 crc kubenswrapper[4962]: I1003 14:41:06.236616 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/57a2d47f-f545-4930-8fa1-4833e0b03da3-ovsdbserver-sb\") pod \"dnsmasq-dns-5b449d676c-2nhlt\" (UID: \"57a2d47f-f545-4930-8fa1-4833e0b03da3\") " pod="openstack/dnsmasq-dns-5b449d676c-2nhlt" Oct 03 14:41:06 crc kubenswrapper[4962]: I1003 14:41:06.236760 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/57a2d47f-f545-4930-8fa1-4833e0b03da3-openstack-cell1\") pod \"dnsmasq-dns-5b449d676c-2nhlt\" (UID: \"57a2d47f-f545-4930-8fa1-4833e0b03da3\") " pod="openstack/dnsmasq-dns-5b449d676c-2nhlt" Oct 03 14:41:06 crc 
kubenswrapper[4962]: I1003 14:41:06.236973 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/57a2d47f-f545-4930-8fa1-4833e0b03da3-ovsdbserver-nb\") pod \"dnsmasq-dns-5b449d676c-2nhlt\" (UID: \"57a2d47f-f545-4930-8fa1-4833e0b03da3\") " pod="openstack/dnsmasq-dns-5b449d676c-2nhlt" Oct 03 14:41:06 crc kubenswrapper[4962]: I1003 14:41:06.237275 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57a2d47f-f545-4930-8fa1-4833e0b03da3-config\") pod \"dnsmasq-dns-5b449d676c-2nhlt\" (UID: \"57a2d47f-f545-4930-8fa1-4833e0b03da3\") " pod="openstack/dnsmasq-dns-5b449d676c-2nhlt" Oct 03 14:41:06 crc kubenswrapper[4962]: I1003 14:41:06.237509 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57a2d47f-f545-4930-8fa1-4833e0b03da3-dns-svc\") pod \"dnsmasq-dns-5b449d676c-2nhlt\" (UID: \"57a2d47f-f545-4930-8fa1-4833e0b03da3\") " pod="openstack/dnsmasq-dns-5b449d676c-2nhlt" Oct 03 14:41:06 crc kubenswrapper[4962]: I1003 14:41:06.253884 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pnmv\" (UniqueName: \"kubernetes.io/projected/57a2d47f-f545-4930-8fa1-4833e0b03da3-kube-api-access-2pnmv\") pod \"dnsmasq-dns-5b449d676c-2nhlt\" (UID: \"57a2d47f-f545-4930-8fa1-4833e0b03da3\") " pod="openstack/dnsmasq-dns-5b449d676c-2nhlt" Oct 03 14:41:06 crc kubenswrapper[4962]: I1003 14:41:06.350193 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b449d676c-2nhlt" Oct 03 14:41:06 crc kubenswrapper[4962]: I1003 14:41:06.554099 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b74d5677c-vqlcm" Oct 03 14:41:06 crc kubenswrapper[4962]: I1003 14:41:06.655996 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805-dns-svc\") pod \"b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805\" (UID: \"b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805\") " Oct 03 14:41:06 crc kubenswrapper[4962]: I1003 14:41:06.656286 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805-config\") pod \"b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805\" (UID: \"b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805\") " Oct 03 14:41:06 crc kubenswrapper[4962]: I1003 14:41:06.656429 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805-ovsdbserver-nb\") pod \"b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805\" (UID: \"b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805\") " Oct 03 14:41:06 crc kubenswrapper[4962]: I1003 14:41:06.656493 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805-ovsdbserver-sb\") pod \"b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805\" (UID: \"b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805\") " Oct 03 14:41:06 crc kubenswrapper[4962]: I1003 14:41:06.656659 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvqmj\" (UniqueName: \"kubernetes.io/projected/b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805-kube-api-access-wvqmj\") pod \"b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805\" (UID: \"b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805\") " Oct 03 14:41:06 crc kubenswrapper[4962]: I1003 14:41:06.665672 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805-kube-api-access-wvqmj" (OuterVolumeSpecName: "kube-api-access-wvqmj") pod "b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805" (UID: "b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805"). InnerVolumeSpecName "kube-api-access-wvqmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:41:06 crc kubenswrapper[4962]: I1003 14:41:06.722457 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805" (UID: "b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:41:06 crc kubenswrapper[4962]: I1003 14:41:06.747327 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805" (UID: "b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:41:06 crc kubenswrapper[4962]: I1003 14:41:06.759522 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvqmj\" (UniqueName: \"kubernetes.io/projected/b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805-kube-api-access-wvqmj\") on node \"crc\" DevicePath \"\"" Oct 03 14:41:06 crc kubenswrapper[4962]: I1003 14:41:06.759561 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 14:41:06 crc kubenswrapper[4962]: I1003 14:41:06.759571 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 14:41:06 crc kubenswrapper[4962]: I1003 14:41:06.762384 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805" (UID: "b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:41:06 crc kubenswrapper[4962]: I1003 14:41:06.766536 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805-config" (OuterVolumeSpecName: "config") pod "b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805" (UID: "b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:41:06 crc kubenswrapper[4962]: I1003 14:41:06.861396 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:41:06 crc kubenswrapper[4962]: I1003 14:41:06.861431 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 14:41:06 crc kubenswrapper[4962]: I1003 14:41:06.901239 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b449d676c-2nhlt"] Oct 03 14:41:06 crc kubenswrapper[4962]: W1003 14:41:06.906168 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57a2d47f_f545_4930_8fa1_4833e0b03da3.slice/crio-eb1a8a5029b8001839dcd80127a639aedd2c386818d07c12b84b61cdab8fb9b6 WatchSource:0}: Error finding container eb1a8a5029b8001839dcd80127a639aedd2c386818d07c12b84b61cdab8fb9b6: Status 404 returned error can't find the container with id eb1a8a5029b8001839dcd80127a639aedd2c386818d07c12b84b61cdab8fb9b6 Oct 03 14:41:07 crc kubenswrapper[4962]: I1003 14:41:07.185459 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b74d5677c-vqlcm" event={"ID":"b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805","Type":"ContainerDied","Data":"8b614a8b851786c9014f6e9fdbe0de4bb9c53ebacbccc6d5fe1b8620484280ae"} Oct 03 14:41:07 crc kubenswrapper[4962]: I1003 14:41:07.185583 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b74d5677c-vqlcm" Oct 03 14:41:07 crc kubenswrapper[4962]: I1003 14:41:07.185882 4962 scope.go:117] "RemoveContainer" containerID="8a11b24b46677cc27fe9bbf875625c223be5cb61c5819ecc86fb66eb9b50c14d" Oct 03 14:41:07 crc kubenswrapper[4962]: I1003 14:41:07.187194 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b449d676c-2nhlt" event={"ID":"57a2d47f-f545-4930-8fa1-4833e0b03da3","Type":"ContainerStarted","Data":"eb1a8a5029b8001839dcd80127a639aedd2c386818d07c12b84b61cdab8fb9b6"} Oct 03 14:41:07 crc kubenswrapper[4962]: I1003 14:41:07.215560 4962 scope.go:117] "RemoveContainer" containerID="5d64a2a476ff200465bd6d1813460740ea54d4cff4bb5359082057a36b430d68" Oct 03 14:41:07 crc kubenswrapper[4962]: I1003 14:41:07.248948 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b74d5677c-vqlcm"] Oct 03 14:41:07 crc kubenswrapper[4962]: I1003 14:41:07.265404 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b74d5677c-vqlcm"] Oct 03 14:41:08 crc kubenswrapper[4962]: I1003 14:41:08.265697 4962 generic.go:334] "Generic (PLEG): container finished" podID="57a2d47f-f545-4930-8fa1-4833e0b03da3" containerID="2e2ce9c5239eac18c8a4385e25ff132c8379b39fcbf99b84cac52491fb04e720" exitCode=0 Oct 03 14:41:08 crc kubenswrapper[4962]: I1003 14:41:08.291941 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805" path="/var/lib/kubelet/pods/b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805/volumes" Oct 03 14:41:08 crc kubenswrapper[4962]: I1003 14:41:08.292654 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b449d676c-2nhlt" event={"ID":"57a2d47f-f545-4930-8fa1-4833e0b03da3","Type":"ContainerDied","Data":"2e2ce9c5239eac18c8a4385e25ff132c8379b39fcbf99b84cac52491fb04e720"} Oct 03 14:41:09 crc kubenswrapper[4962]: I1003 14:41:09.275493 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b449d676c-2nhlt" event={"ID":"57a2d47f-f545-4930-8fa1-4833e0b03da3","Type":"ContainerStarted","Data":"ce3cfb97824aa0699a4295e2724d9f1ec369345ccce53f42216ff9bdc04bb2cd"} Oct 03 14:41:09 crc kubenswrapper[4962]: I1003 14:41:09.276003 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b449d676c-2nhlt" Oct 03 14:41:09 crc kubenswrapper[4962]: I1003 14:41:09.301293 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b449d676c-2nhlt" podStartSLOduration=4.301277762 podStartE2EDuration="4.301277762s" podCreationTimestamp="2025-10-03 14:41:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:41:09.291030658 +0000 UTC m=+6677.694928513" watchObservedRunningTime="2025-10-03 14:41:09.301277762 +0000 UTC m=+6677.705175597" Oct 03 14:41:14 crc kubenswrapper[4962]: I1003 14:41:14.232278 4962 scope.go:117] "RemoveContainer" containerID="e273395719cf61a5af9b7bc53d31b2386f724aead6ba873c67b6fad6aa1c9256" Oct 03 14:41:14 crc kubenswrapper[4962]: E1003 14:41:14.233318 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:41:16 crc kubenswrapper[4962]: I1003 14:41:16.353758 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b449d676c-2nhlt" Oct 03 14:41:16 crc kubenswrapper[4962]: I1003 14:41:16.443895 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-676db89ff9-qgzxp"] Oct 03 14:41:16 crc kubenswrapper[4962]: I1003 14:41:16.444245 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-676db89ff9-qgzxp" podUID="f3810707-a0b2-4cf8-a9cb-68a51b9f44d8" containerName="dnsmasq-dns" containerID="cri-o://90bba2b6b9532fd6619ce7078ded020af397163eba609ab58192e3d742e2f37b" gracePeriod=10 Oct 03 14:41:17 crc kubenswrapper[4962]: I1003 14:41:17.077810 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-676db89ff9-qgzxp" Oct 03 14:41:17 crc kubenswrapper[4962]: I1003 14:41:17.201116 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6hk7\" (UniqueName: \"kubernetes.io/projected/f3810707-a0b2-4cf8-a9cb-68a51b9f44d8-kube-api-access-p6hk7\") pod \"f3810707-a0b2-4cf8-a9cb-68a51b9f44d8\" (UID: \"f3810707-a0b2-4cf8-a9cb-68a51b9f44d8\") " Oct 03 14:41:17 crc kubenswrapper[4962]: I1003 14:41:17.201162 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f3810707-a0b2-4cf8-a9cb-68a51b9f44d8-dns-svc\") pod \"f3810707-a0b2-4cf8-a9cb-68a51b9f44d8\" (UID: \"f3810707-a0b2-4cf8-a9cb-68a51b9f44d8\") " Oct 03 14:41:17 crc kubenswrapper[4962]: I1003 14:41:17.201298 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f3810707-a0b2-4cf8-a9cb-68a51b9f44d8-ovsdbserver-nb\") pod \"f3810707-a0b2-4cf8-a9cb-68a51b9f44d8\" (UID: \"f3810707-a0b2-4cf8-a9cb-68a51b9f44d8\") " Oct 03 14:41:17 crc kubenswrapper[4962]: I1003 14:41:17.201347 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/f3810707-a0b2-4cf8-a9cb-68a51b9f44d8-openstack-cell1\") pod \"f3810707-a0b2-4cf8-a9cb-68a51b9f44d8\" (UID: \"f3810707-a0b2-4cf8-a9cb-68a51b9f44d8\") " Oct 03 14:41:17 crc kubenswrapper[4962]: I1003 14:41:17.201503 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3810707-a0b2-4cf8-a9cb-68a51b9f44d8-config\") pod \"f3810707-a0b2-4cf8-a9cb-68a51b9f44d8\" (UID: \"f3810707-a0b2-4cf8-a9cb-68a51b9f44d8\") " Oct 03 14:41:17 crc kubenswrapper[4962]: I1003 14:41:17.201563 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f3810707-a0b2-4cf8-a9cb-68a51b9f44d8-ovsdbserver-sb\") pod \"f3810707-a0b2-4cf8-a9cb-68a51b9f44d8\" (UID: \"f3810707-a0b2-4cf8-a9cb-68a51b9f44d8\") " Oct 03 14:41:17 crc kubenswrapper[4962]: I1003 14:41:17.209409 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3810707-a0b2-4cf8-a9cb-68a51b9f44d8-kube-api-access-p6hk7" (OuterVolumeSpecName: "kube-api-access-p6hk7") pod "f3810707-a0b2-4cf8-a9cb-68a51b9f44d8" (UID: "f3810707-a0b2-4cf8-a9cb-68a51b9f44d8"). InnerVolumeSpecName "kube-api-access-p6hk7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:41:17 crc kubenswrapper[4962]: I1003 14:41:17.271431 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3810707-a0b2-4cf8-a9cb-68a51b9f44d8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f3810707-a0b2-4cf8-a9cb-68a51b9f44d8" (UID: "f3810707-a0b2-4cf8-a9cb-68a51b9f44d8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:41:17 crc kubenswrapper[4962]: I1003 14:41:17.272490 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3810707-a0b2-4cf8-a9cb-68a51b9f44d8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f3810707-a0b2-4cf8-a9cb-68a51b9f44d8" (UID: "f3810707-a0b2-4cf8-a9cb-68a51b9f44d8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:41:17 crc kubenswrapper[4962]: I1003 14:41:17.277767 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3810707-a0b2-4cf8-a9cb-68a51b9f44d8-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "f3810707-a0b2-4cf8-a9cb-68a51b9f44d8" (UID: "f3810707-a0b2-4cf8-a9cb-68a51b9f44d8"). InnerVolumeSpecName "openstack-cell1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:41:17 crc kubenswrapper[4962]: I1003 14:41:17.290243 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3810707-a0b2-4cf8-a9cb-68a51b9f44d8-config" (OuterVolumeSpecName: "config") pod "f3810707-a0b2-4cf8-a9cb-68a51b9f44d8" (UID: "f3810707-a0b2-4cf8-a9cb-68a51b9f44d8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:41:17 crc kubenswrapper[4962]: I1003 14:41:17.300212 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3810707-a0b2-4cf8-a9cb-68a51b9f44d8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f3810707-a0b2-4cf8-a9cb-68a51b9f44d8" (UID: "f3810707-a0b2-4cf8-a9cb-68a51b9f44d8"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:41:17 crc kubenswrapper[4962]: I1003 14:41:17.309613 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6hk7\" (UniqueName: \"kubernetes.io/projected/f3810707-a0b2-4cf8-a9cb-68a51b9f44d8-kube-api-access-p6hk7\") on node \"crc\" DevicePath \"\"" Oct 03 14:41:17 crc kubenswrapper[4962]: I1003 14:41:17.309658 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f3810707-a0b2-4cf8-a9cb-68a51b9f44d8-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 14:41:17 crc kubenswrapper[4962]: I1003 14:41:17.309668 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f3810707-a0b2-4cf8-a9cb-68a51b9f44d8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 14:41:17 crc kubenswrapper[4962]: I1003 14:41:17.309678 4962 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/f3810707-a0b2-4cf8-a9cb-68a51b9f44d8-openstack-cell1\") on node \"crc\" DevicePath \"\"" Oct 03 14:41:17 crc kubenswrapper[4962]: I1003 14:41:17.309690 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3810707-a0b2-4cf8-a9cb-68a51b9f44d8-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:41:17 crc kubenswrapper[4962]: I1003 14:41:17.309698 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f3810707-a0b2-4cf8-a9cb-68a51b9f44d8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 14:41:17 crc kubenswrapper[4962]: I1003 14:41:17.349196 4962 generic.go:334] "Generic (PLEG): container finished" podID="f3810707-a0b2-4cf8-a9cb-68a51b9f44d8" containerID="90bba2b6b9532fd6619ce7078ded020af397163eba609ab58192e3d742e2f37b" exitCode=0 Oct 03 14:41:17 crc kubenswrapper[4962]: I1003 14:41:17.349241 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-676db89ff9-qgzxp" event={"ID":"f3810707-a0b2-4cf8-a9cb-68a51b9f44d8","Type":"ContainerDied","Data":"90bba2b6b9532fd6619ce7078ded020af397163eba609ab58192e3d742e2f37b"} Oct 03 14:41:17 crc kubenswrapper[4962]: I1003 14:41:17.349268 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-676db89ff9-qgzxp" event={"ID":"f3810707-a0b2-4cf8-a9cb-68a51b9f44d8","Type":"ContainerDied","Data":"b67c1157aff663ab00728aff979a4ce65bd525bb823ead9415f3ff5e53a0dd04"} Oct 03 14:41:17 crc kubenswrapper[4962]: I1003 14:41:17.349268 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-676db89ff9-qgzxp" Oct 03 14:41:17 crc kubenswrapper[4962]: I1003 14:41:17.349284 4962 scope.go:117] "RemoveContainer" containerID="90bba2b6b9532fd6619ce7078ded020af397163eba609ab58192e3d742e2f37b" Oct 03 14:41:17 crc kubenswrapper[4962]: I1003 14:41:17.385175 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-676db89ff9-qgzxp"] Oct 03 14:41:17 crc kubenswrapper[4962]: I1003 14:41:17.391905 4962 scope.go:117] "RemoveContainer" containerID="b65b91fb6ae4bd2139305523923cd91b937fa5d5530cf4264b9ee3451a82dd33" Oct 03 14:41:17 crc kubenswrapper[4962]: I1003 14:41:17.396012 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-676db89ff9-qgzxp"] Oct 03 14:41:17 crc kubenswrapper[4962]: I1003 14:41:17.418056 4962 scope.go:117] "RemoveContainer" containerID="90bba2b6b9532fd6619ce7078ded020af397163eba609ab58192e3d742e2f37b" Oct 03 14:41:17 crc kubenswrapper[4962]: E1003 14:41:17.418461 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90bba2b6b9532fd6619ce7078ded020af397163eba609ab58192e3d742e2f37b\": container with ID starting with 90bba2b6b9532fd6619ce7078ded020af397163eba609ab58192e3d742e2f37b not found: ID does not exist" containerID="90bba2b6b9532fd6619ce7078ded020af397163eba609ab58192e3d742e2f37b" Oct 03 14:41:17 crc kubenswrapper[4962]: I1003 14:41:17.418510 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90bba2b6b9532fd6619ce7078ded020af397163eba609ab58192e3d742e2f37b"} err="failed to get container status \"90bba2b6b9532fd6619ce7078ded020af397163eba609ab58192e3d742e2f37b\": rpc error: code = NotFound desc = could not find container \"90bba2b6b9532fd6619ce7078ded020af397163eba609ab58192e3d742e2f37b\": container with ID starting with 90bba2b6b9532fd6619ce7078ded020af397163eba609ab58192e3d742e2f37b not found: ID does not exist" Oct 03 14:41:17 crc kubenswrapper[4962]: I1003 14:41:17.418530 4962 scope.go:117] "RemoveContainer" containerID="b65b91fb6ae4bd2139305523923cd91b937fa5d5530cf4264b9ee3451a82dd33" Oct 03 14:41:17 crc kubenswrapper[4962]: E1003 14:41:17.418983 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b65b91fb6ae4bd2139305523923cd91b937fa5d5530cf4264b9ee3451a82dd33\": container with ID starting with b65b91fb6ae4bd2139305523923cd91b937fa5d5530cf4264b9ee3451a82dd33 not found: ID does not exist" containerID="b65b91fb6ae4bd2139305523923cd91b937fa5d5530cf4264b9ee3451a82dd33" Oct 03 14:41:17 crc kubenswrapper[4962]: I1003 14:41:17.419119 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b65b91fb6ae4bd2139305523923cd91b937fa5d5530cf4264b9ee3451a82dd33"} err="failed to get container status \"b65b91fb6ae4bd2139305523923cd91b937fa5d5530cf4264b9ee3451a82dd33\": rpc error: code = NotFound desc = could not find container \"b65b91fb6ae4bd2139305523923cd91b937fa5d5530cf4264b9ee3451a82dd33\": container with ID starting with b65b91fb6ae4bd2139305523923cd91b937fa5d5530cf4264b9ee3451a82dd33 not found: ID does not exist" Oct 03 14:41:18 crc kubenswrapper[4962]: I1003 14:41:18.243045 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3810707-a0b2-4cf8-a9cb-68a51b9f44d8" path="/var/lib/kubelet/pods/f3810707-a0b2-4cf8-a9cb-68a51b9f44d8/volumes" Oct 03 14:41:19 crc kubenswrapper[4962]: I1003 14:41:19.090086 4962 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-create-752wv"] Oct 03 14:41:19 crc kubenswrapper[4962]: I1003 14:41:19.102923 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-create-752wv"] Oct 03 14:41:20 crc kubenswrapper[4962]: I1003 14:41:20.248671 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cc58d2d-aac1-43a9-bbbd-3c2f6bdac4fd" path="/var/lib/kubelet/pods/3cc58d2d-aac1-43a9-bbbd-3c2f6bdac4fd/volumes" Oct 03 14:41:27 crc kubenswrapper[4962]: I1003 14:41:27.418969 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6zxnw"] Oct 03 14:41:27 crc kubenswrapper[4962]: E1003 14:41:27.419956 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805" containerName="dnsmasq-dns" Oct 03 14:41:27 crc kubenswrapper[4962]: I1003 14:41:27.419970 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805" containerName="dnsmasq-dns" Oct 03 14:41:27 crc kubenswrapper[4962]: E1003 14:41:27.419992 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805" containerName="init" Oct 03 14:41:27 crc kubenswrapper[4962]: I1003 14:41:27.419998 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805" containerName="init" Oct 03 14:41:27 crc kubenswrapper[4962]: E1003 14:41:27.420016 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3810707-a0b2-4cf8-a9cb-68a51b9f44d8" containerName="init" Oct 03 14:41:27 crc kubenswrapper[4962]: I1003 14:41:27.420022 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3810707-a0b2-4cf8-a9cb-68a51b9f44d8" containerName="init" Oct 03 14:41:27 crc kubenswrapper[4962]: E1003 14:41:27.420038 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3810707-a0b2-4cf8-a9cb-68a51b9f44d8" containerName="dnsmasq-dns" Oct 03 14:41:27 crc kubenswrapper[4962]: I1003 14:41:27.420044 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3810707-a0b2-4cf8-a9cb-68a51b9f44d8" containerName="dnsmasq-dns" Oct 03 14:41:27 crc kubenswrapper[4962]: I1003 14:41:27.420263 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5bcf41e-bc2f-4f59-b0ca-a1ff18fc8805" containerName="dnsmasq-dns" Oct 03 14:41:27 crc kubenswrapper[4962]: I1003 14:41:27.420286 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3810707-a0b2-4cf8-a9cb-68a51b9f44d8" containerName="dnsmasq-dns" Oct 03 14:41:27 crc kubenswrapper[4962]: I1003 14:41:27.420985 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6zxnw" Oct 03 14:41:27 crc kubenswrapper[4962]: I1003 14:41:27.423023 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-98wnm" Oct 03 14:41:27 crc kubenswrapper[4962]: I1003 14:41:27.424601 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 14:41:27 crc kubenswrapper[4962]: I1003 14:41:27.424725 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 03 14:41:27 crc kubenswrapper[4962]: I1003 14:41:27.424964 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 03 14:41:27 crc kubenswrapper[4962]: I1003 14:41:27.482039 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6zxnw"] Oct 03 14:41:27 crc kubenswrapper[4962]: I1003 14:41:27.547893 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7565767b-b366-48c0-a4ed-5244d16e32a1-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6zxnw\" (UID: \"7565767b-b366-48c0-a4ed-5244d16e32a1\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6zxnw" Oct 03 14:41:27 crc kubenswrapper[4962]: I1003 14:41:27.547982 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7565767b-b366-48c0-a4ed-5244d16e32a1-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6zxnw\" (UID: \"7565767b-b366-48c0-a4ed-5244d16e32a1\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6zxnw" Oct 03 14:41:27 crc kubenswrapper[4962]: I1003 14:41:27.548020 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xkjg\" (UniqueName: \"kubernetes.io/projected/7565767b-b366-48c0-a4ed-5244d16e32a1-kube-api-access-6xkjg\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6zxnw\" (UID: \"7565767b-b366-48c0-a4ed-5244d16e32a1\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6zxnw" Oct 03 14:41:27 crc kubenswrapper[4962]: I1003 14:41:27.548052 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7565767b-b366-48c0-a4ed-5244d16e32a1-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6zxnw\" (UID: \"7565767b-b366-48c0-a4ed-5244d16e32a1\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6zxnw" Oct 03 14:41:27 crc kubenswrapper[4962]: I1003 14:41:27.548094 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7565767b-b366-48c0-a4ed-5244d16e32a1-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6zxnw\" (UID: \"7565767b-b366-48c0-a4ed-5244d16e32a1\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6zxnw" Oct 03 14:41:27 crc kubenswrapper[4962]: I1003 14:41:27.650215 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7565767b-b366-48c0-a4ed-5244d16e32a1-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6zxnw\" (UID: \"7565767b-b366-48c0-a4ed-5244d16e32a1\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6zxnw" Oct 03 14:41:27 crc kubenswrapper[4962]: I1003 14:41:27.650278 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xkjg\" (UniqueName: \"kubernetes.io/projected/7565767b-b366-48c0-a4ed-5244d16e32a1-kube-api-access-6xkjg\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6zxnw\" (UID: \"7565767b-b366-48c0-a4ed-5244d16e32a1\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6zxnw" Oct 03 14:41:27 crc kubenswrapper[4962]: I1003 14:41:27.650314 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7565767b-b366-48c0-a4ed-5244d16e32a1-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6zxnw\" (UID: \"7565767b-b366-48c0-a4ed-5244d16e32a1\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6zxnw" Oct 03 14:41:27 crc kubenswrapper[4962]: I1003 14:41:27.650359 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7565767b-b366-48c0-a4ed-5244d16e32a1-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6zxnw\" (UID: \"7565767b-b366-48c0-a4ed-5244d16e32a1\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6zxnw" Oct 03 14:41:27 crc kubenswrapper[4962]: I1003 14:41:27.650456 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7565767b-b366-48c0-a4ed-5244d16e32a1-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6zxnw\" (UID: \"7565767b-b366-48c0-a4ed-5244d16e32a1\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6zxnw" Oct 03 14:41:27 crc kubenswrapper[4962]: I1003 14:41:27.657400 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7565767b-b366-48c0-a4ed-5244d16e32a1-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6zxnw\" (UID: \"7565767b-b366-48c0-a4ed-5244d16e32a1\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6zxnw" Oct 03 14:41:27 crc kubenswrapper[4962]: I1003 14:41:27.658578 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7565767b-b366-48c0-a4ed-5244d16e32a1-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6zxnw\" (UID: \"7565767b-b366-48c0-a4ed-5244d16e32a1\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6zxnw" Oct 03 14:41:27 crc kubenswrapper[4962]: I1003 14:41:27.658582 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7565767b-b366-48c0-a4ed-5244d16e32a1-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6zxnw\" (UID: \"7565767b-b366-48c0-a4ed-5244d16e32a1\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6zxnw" Oct 03 
14:41:27 crc kubenswrapper[4962]: I1003 14:41:27.659103 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7565767b-b366-48c0-a4ed-5244d16e32a1-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6zxnw\" (UID: \"7565767b-b366-48c0-a4ed-5244d16e32a1\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6zxnw" Oct 03 14:41:27 crc kubenswrapper[4962]: I1003 14:41:27.666191 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xkjg\" (UniqueName: \"kubernetes.io/projected/7565767b-b366-48c0-a4ed-5244d16e32a1-kube-api-access-6xkjg\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6zxnw\" (UID: \"7565767b-b366-48c0-a4ed-5244d16e32a1\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6zxnw" Oct 03 14:41:27 crc kubenswrapper[4962]: I1003 14:41:27.746757 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6zxnw" Oct 03 14:41:28 crc kubenswrapper[4962]: I1003 14:41:28.227198 4962 scope.go:117] "RemoveContainer" containerID="e273395719cf61a5af9b7bc53d31b2386f724aead6ba873c67b6fad6aa1c9256" Oct 03 14:41:28 crc kubenswrapper[4962]: E1003 14:41:28.227701 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:41:28 crc kubenswrapper[4962]: I1003 14:41:28.372339 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6zxnw"] Oct 03 14:41:28 crc kubenswrapper[4962]: I1003 14:41:28.472270 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6zxnw" event={"ID":"7565767b-b366-48c0-a4ed-5244d16e32a1","Type":"ContainerStarted","Data":"0251c3b2abd405545d8708ad525724865c0a75e122f3b4b53d74d3ab627baeee"} Oct 03 14:41:31 crc kubenswrapper[4962]: I1003 14:41:31.033274 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-145e-account-create-thj75"] Oct 03 14:41:31 crc kubenswrapper[4962]: I1003 14:41:31.049572 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-145e-account-create-thj75"] Oct 03 14:41:32 crc kubenswrapper[4962]: I1003 14:41:32.237480 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="228a96ce-8078-4c26-b1d7-3076c08ce289" path="/var/lib/kubelet/pods/228a96ce-8078-4c26-b1d7-3076c08ce289/volumes" Oct 03 14:41:37 crc kubenswrapper[4962]: I1003 14:41:37.028531 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-persistence-db-create-ztn25"] Oct 03 14:41:37 crc kubenswrapper[4962]: I1003 14:41:37.057864 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-persistence-db-create-ztn25"] Oct 03 14:41:38 crc kubenswrapper[4962]: I1003 14:41:38.263168 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c7a61fb-8407-4c74-8b41-896d4358bd99" path="/var/lib/kubelet/pods/4c7a61fb-8407-4c74-8b41-896d4358bd99/volumes" Oct 03 14:41:42 crc kubenswrapper[4962]: I1003 
14:41:42.233702 4962 scope.go:117] "RemoveContainer" containerID="e273395719cf61a5af9b7bc53d31b2386f724aead6ba873c67b6fad6aa1c9256" Oct 03 14:41:42 crc kubenswrapper[4962]: E1003 14:41:42.234452 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:41:43 crc kubenswrapper[4962]: I1003 14:41:43.638116 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6zxnw" event={"ID":"7565767b-b366-48c0-a4ed-5244d16e32a1","Type":"ContainerStarted","Data":"cecb92c7d040f40736a66ba0ec3d16b0b63bc63386dfb36dab7b6963b24bbef5"} Oct 03 14:41:43 crc kubenswrapper[4962]: I1003 14:41:43.661060 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6zxnw" podStartSLOduration=1.704098007 podStartE2EDuration="16.661037248s" podCreationTimestamp="2025-10-03 14:41:27 +0000 UTC" firstStartedPulling="2025-10-03 14:41:28.376064276 +0000 UTC m=+6696.779962111" lastFinishedPulling="2025-10-03 14:41:43.333003517 +0000 UTC m=+6711.736901352" observedRunningTime="2025-10-03 14:41:43.654593946 +0000 UTC m=+6712.058491831" watchObservedRunningTime="2025-10-03 14:41:43.661037248 +0000 UTC m=+6712.064935093" Oct 03 14:41:49 crc kubenswrapper[4962]: I1003 14:41:49.049441 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-265e-account-create-v5grn"] Oct 03 14:41:49 crc kubenswrapper[4962]: I1003 14:41:49.060344 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-265e-account-create-v5grn"] Oct 03 14:41:50 crc kubenswrapper[4962]: I1003 14:41:50.239461 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1e67e61-5d84-4b90-9d46-60a5ddb1e357" path="/var/lib/kubelet/pods/e1e67e61-5d84-4b90-9d46-60a5ddb1e357/volumes" Oct 03 14:41:56 crc kubenswrapper[4962]: I1003 14:41:56.227332 4962 scope.go:117] "RemoveContainer" containerID="e273395719cf61a5af9b7bc53d31b2386f724aead6ba873c67b6fad6aa1c9256" Oct 03 14:41:56 crc kubenswrapper[4962]: E1003 14:41:56.228145 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:41:57 crc kubenswrapper[4962]: I1003 14:41:57.774490 4962 generic.go:334] "Generic (PLEG): container finished" podID="7565767b-b366-48c0-a4ed-5244d16e32a1" containerID="cecb92c7d040f40736a66ba0ec3d16b0b63bc63386dfb36dab7b6963b24bbef5" exitCode=0 Oct 03 14:41:57 crc kubenswrapper[4962]: I1003 14:41:57.774842 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6zxnw" event={"ID":"7565767b-b366-48c0-a4ed-5244d16e32a1","Type":"ContainerDied","Data":"cecb92c7d040f40736a66ba0ec3d16b0b63bc63386dfb36dab7b6963b24bbef5"} Oct 03 14:41:59 crc kubenswrapper[4962]: I1003 
Oct 03 14:41:59 crc kubenswrapper[4962]: I1003 14:41:59.468520 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7565767b-b366-48c0-a4ed-5244d16e32a1-ceph\") pod \"7565767b-b366-48c0-a4ed-5244d16e32a1\" (UID: \"7565767b-b366-48c0-a4ed-5244d16e32a1\") "
Oct 03 14:41:59 crc kubenswrapper[4962]: I1003 14:41:59.468584 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7565767b-b366-48c0-a4ed-5244d16e32a1-pre-adoption-validation-combined-ca-bundle\") pod \"7565767b-b366-48c0-a4ed-5244d16e32a1\" (UID: \"7565767b-b366-48c0-a4ed-5244d16e32a1\") "
Oct 03 14:41:59 crc kubenswrapper[4962]: I1003 14:41:59.469091 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xkjg\" (UniqueName: \"kubernetes.io/projected/7565767b-b366-48c0-a4ed-5244d16e32a1-kube-api-access-6xkjg\") pod \"7565767b-b366-48c0-a4ed-5244d16e32a1\" (UID: \"7565767b-b366-48c0-a4ed-5244d16e32a1\") "
Oct 03 14:41:59 crc kubenswrapper[4962]: I1003 14:41:59.469149 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7565767b-b366-48c0-a4ed-5244d16e32a1-inventory\") pod \"7565767b-b366-48c0-a4ed-5244d16e32a1\" (UID: \"7565767b-b366-48c0-a4ed-5244d16e32a1\") "
Oct 03 14:41:59 crc kubenswrapper[4962]: I1003 14:41:59.469243 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7565767b-b366-48c0-a4ed-5244d16e32a1-ssh-key\") pod \"7565767b-b366-48c0-a4ed-5244d16e32a1\" (UID: \"7565767b-b366-48c0-a4ed-5244d16e32a1\") "
Oct 03 14:41:59 crc kubenswrapper[4962]: I1003 14:41:59.476242 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7565767b-b366-48c0-a4ed-5244d16e32a1-ceph" (OuterVolumeSpecName: "ceph") pod "7565767b-b366-48c0-a4ed-5244d16e32a1" (UID: "7565767b-b366-48c0-a4ed-5244d16e32a1"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:41:59 crc kubenswrapper[4962]: I1003 14:41:59.477608 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7565767b-b366-48c0-a4ed-5244d16e32a1-kube-api-access-6xkjg" (OuterVolumeSpecName: "kube-api-access-6xkjg") pod "7565767b-b366-48c0-a4ed-5244d16e32a1" (UID: "7565767b-b366-48c0-a4ed-5244d16e32a1"). InnerVolumeSpecName "kube-api-access-6xkjg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:41:59 crc kubenswrapper[4962]: I1003 14:41:59.478450 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7565767b-b366-48c0-a4ed-5244d16e32a1-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: "pre-adoption-validation-combined-ca-bundle") pod "7565767b-b366-48c0-a4ed-5244d16e32a1" (UID: "7565767b-b366-48c0-a4ed-5244d16e32a1"). InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:41:59 crc kubenswrapper[4962]: I1003 14:41:59.506466 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7565767b-b366-48c0-a4ed-5244d16e32a1-inventory" (OuterVolumeSpecName: "inventory") pod "7565767b-b366-48c0-a4ed-5244d16e32a1" (UID: "7565767b-b366-48c0-a4ed-5244d16e32a1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:41:59 crc kubenswrapper[4962]: I1003 14:41:59.506899 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7565767b-b366-48c0-a4ed-5244d16e32a1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7565767b-b366-48c0-a4ed-5244d16e32a1" (UID: "7565767b-b366-48c0-a4ed-5244d16e32a1"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:41:59 crc kubenswrapper[4962]: I1003 14:41:59.571618 4962 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7565767b-b366-48c0-a4ed-5244d16e32a1-inventory\") on node \"crc\" DevicePath \"\""
Oct 03 14:41:59 crc kubenswrapper[4962]: I1003 14:41:59.571673 4962 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7565767b-b366-48c0-a4ed-5244d16e32a1-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 03 14:41:59 crc kubenswrapper[4962]: I1003 14:41:59.571685 4962 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7565767b-b366-48c0-a4ed-5244d16e32a1-ceph\") on node \"crc\" DevicePath \"\""
Oct 03 14:41:59 crc kubenswrapper[4962]: I1003 14:41:59.571698 4962 reconciler_common.go:293] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7565767b-b366-48c0-a4ed-5244d16e32a1-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 14:41:59 crc kubenswrapper[4962]: I1003 14:41:59.571713 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xkjg\" (UniqueName: \"kubernetes.io/projected/7565767b-b366-48c0-a4ed-5244d16e32a1-kube-api-access-6xkjg\") on node \"crc\" DevicePath \"\""
Oct 03 14:41:59 crc kubenswrapper[4962]: I1003 14:41:59.792845 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6zxnw" event={"ID":"7565767b-b366-48c0-a4ed-5244d16e32a1","Type":"ContainerDied","Data":"0251c3b2abd405545d8708ad525724865c0a75e122f3b4b53d74d3ab627baeee"}
Oct 03 14:41:59 crc kubenswrapper[4962]: I1003 14:41:59.792881 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6zxnw"
Oct 03 14:41:59 crc kubenswrapper[4962]: I1003 14:41:59.792893 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0251c3b2abd405545d8708ad525724865c0a75e122f3b4b53d74d3ab627baeee"
Oct 03 14:42:10 crc kubenswrapper[4962]: I1003 14:42:10.227730 4962 scope.go:117] "RemoveContainer" containerID="e273395719cf61a5af9b7bc53d31b2386f724aead6ba873c67b6fad6aa1c9256"
Oct 03 14:42:10 crc kubenswrapper[4962]: E1003 14:42:10.228772 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 14:42:11 crc kubenswrapper[4962]: I1003 14:42:11.080785 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bblfh"]
Oct 03 14:42:11 crc kubenswrapper[4962]: E1003 14:42:11.081332 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7565767b-b366-48c0-a4ed-5244d16e32a1" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1"
Oct 03 14:42:11 crc kubenswrapper[4962]: I1003 14:42:11.081355 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="7565767b-b366-48c0-a4ed-5244d16e32a1" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1"
Oct 03 14:42:11 crc kubenswrapper[4962]: I1003 14:42:11.081613 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="7565767b-b366-48c0-a4ed-5244d16e32a1" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1"
Oct 03 14:42:11 crc kubenswrapper[4962]: I1003 14:42:11.082574 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bblfh"
Oct 03 14:42:11 crc kubenswrapper[4962]: I1003 14:42:11.086365 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 03 14:42:11 crc kubenswrapper[4962]: I1003 14:42:11.086910 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-98wnm"
Oct 03 14:42:11 crc kubenswrapper[4962]: I1003 14:42:11.087447 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Oct 03 14:42:11 crc kubenswrapper[4962]: I1003 14:42:11.087452 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Oct 03 14:42:11 crc kubenswrapper[4962]: I1003 14:42:11.108495 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bblfh"]
Oct 03 14:42:11 crc kubenswrapper[4962]: I1003 14:42:11.223538 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqsf8\" (UniqueName: \"kubernetes.io/projected/429b4a0d-c23b-4c71-8844-7ec3cd482a4f-kube-api-access-mqsf8\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-bblfh\" (UID: \"429b4a0d-c23b-4c71-8844-7ec3cd482a4f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bblfh"
Oct 03 14:42:11 crc kubenswrapper[4962]: I1003 14:42:11.223740 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/429b4a0d-c23b-4c71-8844-7ec3cd482a4f-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-bblfh\" (UID: \"429b4a0d-c23b-4c71-8844-7ec3cd482a4f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bblfh"
Oct 03 14:42:11 crc kubenswrapper[4962]: I1003 14:42:11.223807 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429b4a0d-c23b-4c71-8844-7ec3cd482a4f-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-bblfh\" (UID: \"429b4a0d-c23b-4c71-8844-7ec3cd482a4f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bblfh"
Oct 03 14:42:11 crc kubenswrapper[4962]: I1003 14:42:11.223863 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/429b4a0d-c23b-4c71-8844-7ec3cd482a4f-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-bblfh\" (UID: \"429b4a0d-c23b-4c71-8844-7ec3cd482a4f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bblfh"
Oct 03 14:42:11 crc kubenswrapper[4962]: I1003 14:42:11.224177 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/429b4a0d-c23b-4c71-8844-7ec3cd482a4f-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-bblfh\" (UID: \"429b4a0d-c23b-4c71-8844-7ec3cd482a4f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bblfh"
Oct 03 14:42:11 crc kubenswrapper[4962]: I1003 14:42:11.328224 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429b4a0d-c23b-4c71-8844-7ec3cd482a4f-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-bblfh\" (UID: \"429b4a0d-c23b-4c71-8844-7ec3cd482a4f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bblfh"
\"kubernetes.io/secret/429b4a0d-c23b-4c71-8844-7ec3cd482a4f-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-bblfh\" (UID: \"429b4a0d-c23b-4c71-8844-7ec3cd482a4f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bblfh" Oct 03 14:42:11 crc kubenswrapper[4962]: I1003 14:42:11.328896 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/429b4a0d-c23b-4c71-8844-7ec3cd482a4f-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-bblfh\" (UID: \"429b4a0d-c23b-4c71-8844-7ec3cd482a4f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bblfh" Oct 03 14:42:11 crc kubenswrapper[4962]: I1003 14:42:11.329007 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/429b4a0d-c23b-4c71-8844-7ec3cd482a4f-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-bblfh\" (UID: \"429b4a0d-c23b-4c71-8844-7ec3cd482a4f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bblfh" Oct 03 14:42:11 crc kubenswrapper[4962]: I1003 14:42:11.329142 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqsf8\" (UniqueName: \"kubernetes.io/projected/429b4a0d-c23b-4c71-8844-7ec3cd482a4f-kube-api-access-mqsf8\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-bblfh\" (UID: \"429b4a0d-c23b-4c71-8844-7ec3cd482a4f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bblfh" Oct 03 14:42:11 crc kubenswrapper[4962]: I1003 14:42:11.329294 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/429b4a0d-c23b-4c71-8844-7ec3cd482a4f-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-bblfh\" (UID: \"429b4a0d-c23b-4c71-8844-7ec3cd482a4f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bblfh" Oct 03 14:42:11 crc kubenswrapper[4962]: I1003 14:42:11.335736 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/429b4a0d-c23b-4c71-8844-7ec3cd482a4f-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-bblfh\" (UID: \"429b4a0d-c23b-4c71-8844-7ec3cd482a4f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bblfh" Oct 03 14:42:11 crc kubenswrapper[4962]: I1003 14:42:11.336243 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/429b4a0d-c23b-4c71-8844-7ec3cd482a4f-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-bblfh\" (UID: \"429b4a0d-c23b-4c71-8844-7ec3cd482a4f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bblfh" Oct 03 14:42:11 crc kubenswrapper[4962]: I1003 14:42:11.337844 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429b4a0d-c23b-4c71-8844-7ec3cd482a4f-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-bblfh\" (UID: \"429b4a0d-c23b-4c71-8844-7ec3cd482a4f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bblfh" Oct 03 14:42:11 crc kubenswrapper[4962]: I1003 14:42:11.338219 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/429b4a0d-c23b-4c71-8844-7ec3cd482a4f-inventory\") pod 
\"tripleo-cleanup-tripleo-cleanup-openstack-cell1-bblfh\" (UID: \"429b4a0d-c23b-4c71-8844-7ec3cd482a4f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bblfh" Oct 03 14:42:11 crc kubenswrapper[4962]: I1003 14:42:11.347230 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqsf8\" (UniqueName: \"kubernetes.io/projected/429b4a0d-c23b-4c71-8844-7ec3cd482a4f-kube-api-access-mqsf8\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-bblfh\" (UID: \"429b4a0d-c23b-4c71-8844-7ec3cd482a4f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bblfh" Oct 03 14:42:11 crc kubenswrapper[4962]: I1003 14:42:11.409833 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bblfh" Oct 03 14:42:11 crc kubenswrapper[4962]: I1003 14:42:11.949211 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bblfh"] Oct 03 14:42:12 crc kubenswrapper[4962]: I1003 14:42:12.699218 4962 scope.go:117] "RemoveContainer" containerID="ca2295248d66644e2b73ef73c1e4e45f9c23c971a2253e1303fd3cdd9bd1c747" Oct 03 14:42:12 crc kubenswrapper[4962]: I1003 14:42:12.725285 4962 scope.go:117] "RemoveContainer" containerID="8e26f9f44b9e894b8da15898b7edcb7c8ab0f116f64b33193c2d0c31ddbd26f9" Oct 03 14:42:12 crc kubenswrapper[4962]: I1003 14:42:12.771609 4962 scope.go:117] "RemoveContainer" containerID="5a1bdff7bb38514a8be6c5269d8f2629161b89a24aecf1b8bf4deedd63e6bf78" Oct 03 14:42:12 crc kubenswrapper[4962]: I1003 14:42:12.815484 4962 scope.go:117] "RemoveContainer" containerID="b169f05aa8f299a544bfcf9a5d61dc22ead3731932e101f65a740d39a7567964" Oct 03 14:42:12 crc kubenswrapper[4962]: I1003 14:42:12.925825 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bblfh" event={"ID":"429b4a0d-c23b-4c71-8844-7ec3cd482a4f","Type":"ContainerStarted","Data":"c9c086dde3c6b8c570f2df74b05540a9e4c4dc12e43ea9bf9bfe9bdccd5248d7"} Oct 03 14:42:12 crc kubenswrapper[4962]: I1003 14:42:12.925870 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bblfh" event={"ID":"429b4a0d-c23b-4c71-8844-7ec3cd482a4f","Type":"ContainerStarted","Data":"8c04daab3a33ebdb47eb4d52336c003ed13c86b47e86cc771a68adb9d5f2c3e1"} Oct 03 14:42:12 crc kubenswrapper[4962]: I1003 14:42:12.951324 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bblfh" podStartSLOduration=1.799741868 podStartE2EDuration="1.951305062s" podCreationTimestamp="2025-10-03 14:42:11 +0000 UTC" firstStartedPulling="2025-10-03 14:42:11.957789206 +0000 UTC m=+6740.361687041" lastFinishedPulling="2025-10-03 14:42:12.1093524 +0000 UTC m=+6740.513250235" observedRunningTime="2025-10-03 14:42:12.942708102 +0000 UTC m=+6741.346605967" watchObservedRunningTime="2025-10-03 14:42:12.951305062 +0000 UTC m=+6741.355202897" Oct 03 14:42:25 crc kubenswrapper[4962]: I1003 14:42:25.227394 4962 scope.go:117] "RemoveContainer" containerID="e273395719cf61a5af9b7bc53d31b2386f724aead6ba873c67b6fad6aa1c9256" Oct 03 14:42:25 crc kubenswrapper[4962]: E1003 14:42:25.228165 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
Oct 03 14:42:25 crc kubenswrapper[4962]: I1003 14:42:25.227394 4962 scope.go:117] "RemoveContainer" containerID="e273395719cf61a5af9b7bc53d31b2386f724aead6ba873c67b6fad6aa1c9256"
Oct 03 14:42:25 crc kubenswrapper[4962]: E1003 14:42:25.228165 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 14:42:38 crc kubenswrapper[4962]: I1003 14:42:38.228168 4962 scope.go:117] "RemoveContainer" containerID="e273395719cf61a5af9b7bc53d31b2386f724aead6ba873c67b6fad6aa1c9256"
Oct 03 14:42:38 crc kubenswrapper[4962]: E1003 14:42:38.229011 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 14:42:46 crc kubenswrapper[4962]: I1003 14:42:46.045834 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-sync-lnxfr"]
Oct 03 14:42:46 crc kubenswrapper[4962]: I1003 14:42:46.056010 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-sync-lnxfr"]
Oct 03 14:42:46 crc kubenswrapper[4962]: I1003 14:42:46.246425 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e54df0d5-846e-417d-bfc0-98804487ed5f" path="/var/lib/kubelet/pods/e54df0d5-846e-417d-bfc0-98804487ed5f/volumes"
Oct 03 14:42:52 crc kubenswrapper[4962]: I1003 14:42:52.240595 4962 scope.go:117] "RemoveContainer" containerID="e273395719cf61a5af9b7bc53d31b2386f724aead6ba873c67b6fad6aa1c9256"
Oct 03 14:42:52 crc kubenswrapper[4962]: E1003 14:42:52.241788 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 14:43:07 crc kubenswrapper[4962]: I1003 14:43:07.227194 4962 scope.go:117] "RemoveContainer" containerID="e273395719cf61a5af9b7bc53d31b2386f724aead6ba873c67b6fad6aa1c9256"
Oct 03 14:43:08 crc kubenswrapper[4962]: I1003 14:43:08.480658 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerStarted","Data":"2456cc5807e33f67351b02ef43a646828ba891bf3f306896eda2c0ba865fddfd"}
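The alternating "RemoveContainer" / "Error syncing pod, skipping" pairs at 14:42:25, 14:42:38 and 14:42:52 are pod-worker syncs being refused while machine-config-daemon sits in CrashLoopBackOff; "back-off 5m0s" is the cap of the kubelet's restart delay, which starts small and doubles per crash. The 14:43:07 attempt finally falls outside the back-off window, so a fresh container starts at 14:43:08. A sketch of that delay schedule, assuming the kubelet's usual defaults of a 10s initial delay and a 5m cap:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Assumed kubelet defaults: the CrashLoopBackOff delay starts at
	// 10s and doubles per restart until it hits the 5m cap seen in
	// the "back-off 5m0s" messages above.
	const initial = 10 * time.Second
	const maxDelay = 5 * time.Minute
	delay := initial
	for restart := 1; delay < maxDelay; restart++ {
		fmt.Printf("restart %d: wait %v\n", restart, delay)
		delay *= 2
	}
	fmt.Printf("later restarts: wait %v (cap reached)\n", maxDelay)
}
```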
Oct 03 14:43:13 crc kubenswrapper[4962]: I1003 14:43:13.005374 4962 scope.go:117] "RemoveContainer" containerID="b9ca7288d0995161794e1c0fcb55eb394c8071920f1ffc6215cf336ab9f77ad9"
Oct 03 14:43:13 crc kubenswrapper[4962]: I1003 14:43:13.044752 4962 scope.go:117] "RemoveContainer" containerID="809ccc053df14594b353e17065f4067b6c0fff8a91f8e30f6137c0e7fc360d97"
Oct 03 14:43:41 crc kubenswrapper[4962]: I1003 14:43:41.135669 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mtcmf"]
Oct 03 14:43:41 crc kubenswrapper[4962]: I1003 14:43:41.138958 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mtcmf"
Oct 03 14:43:41 crc kubenswrapper[4962]: I1003 14:43:41.144161 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mtcmf"]
Oct 03 14:43:41 crc kubenswrapper[4962]: I1003 14:43:41.281964 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs5nt\" (UniqueName: \"kubernetes.io/projected/3d6b9057-1871-4a3c-a345-756906a23352-kube-api-access-bs5nt\") pod \"certified-operators-mtcmf\" (UID: \"3d6b9057-1871-4a3c-a345-756906a23352\") " pod="openshift-marketplace/certified-operators-mtcmf"
Oct 03 14:43:41 crc kubenswrapper[4962]: I1003 14:43:41.282187 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d6b9057-1871-4a3c-a345-756906a23352-catalog-content\") pod \"certified-operators-mtcmf\" (UID: \"3d6b9057-1871-4a3c-a345-756906a23352\") " pod="openshift-marketplace/certified-operators-mtcmf"
Oct 03 14:43:41 crc kubenswrapper[4962]: I1003 14:43:41.282307 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d6b9057-1871-4a3c-a345-756906a23352-utilities\") pod \"certified-operators-mtcmf\" (UID: \"3d6b9057-1871-4a3c-a345-756906a23352\") " pod="openshift-marketplace/certified-operators-mtcmf"
Oct 03 14:43:41 crc kubenswrapper[4962]: I1003 14:43:41.384268 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bs5nt\" (UniqueName: \"kubernetes.io/projected/3d6b9057-1871-4a3c-a345-756906a23352-kube-api-access-bs5nt\") pod \"certified-operators-mtcmf\" (UID: \"3d6b9057-1871-4a3c-a345-756906a23352\") " pod="openshift-marketplace/certified-operators-mtcmf"
Oct 03 14:43:41 crc kubenswrapper[4962]: I1003 14:43:41.384372 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d6b9057-1871-4a3c-a345-756906a23352-catalog-content\") pod \"certified-operators-mtcmf\" (UID: \"3d6b9057-1871-4a3c-a345-756906a23352\") " pod="openshift-marketplace/certified-operators-mtcmf"
Oct 03 14:43:41 crc kubenswrapper[4962]: I1003 14:43:41.384401 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d6b9057-1871-4a3c-a345-756906a23352-utilities\") pod \"certified-operators-mtcmf\" (UID: \"3d6b9057-1871-4a3c-a345-756906a23352\") " pod="openshift-marketplace/certified-operators-mtcmf"
Oct 03 14:43:41 crc kubenswrapper[4962]: I1003 14:43:41.384990 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d6b9057-1871-4a3c-a345-756906a23352-utilities\") pod \"certified-operators-mtcmf\" (UID: \"3d6b9057-1871-4a3c-a345-756906a23352\") " pod="openshift-marketplace/certified-operators-mtcmf"
Oct 03 14:43:41 crc kubenswrapper[4962]: I1003 14:43:41.385084 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d6b9057-1871-4a3c-a345-756906a23352-catalog-content\") pod \"certified-operators-mtcmf\" (UID: \"3d6b9057-1871-4a3c-a345-756906a23352\") " pod="openshift-marketplace/certified-operators-mtcmf"
Oct 03 14:43:41 crc kubenswrapper[4962]: I1003 14:43:41.406434 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs5nt\" (UniqueName: \"kubernetes.io/projected/3d6b9057-1871-4a3c-a345-756906a23352-kube-api-access-bs5nt\") pod \"certified-operators-mtcmf\" (UID: \"3d6b9057-1871-4a3c-a345-756906a23352\") " pod="openshift-marketplace/certified-operators-mtcmf"
Oct 03 14:43:41 crc kubenswrapper[4962]: I1003 14:43:41.465126 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mtcmf"
Oct 03 14:43:41 crc kubenswrapper[4962]: I1003 14:43:41.920181 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mtcmf"]
Oct 03 14:43:42 crc kubenswrapper[4962]: I1003 14:43:42.838765 4962 generic.go:334] "Generic (PLEG): container finished" podID="3d6b9057-1871-4a3c-a345-756906a23352" containerID="ac51cac2e98ebf958b78fcc5400e15443ecc4e257aea4f26b04af7c2b613b49d" exitCode=0
Oct 03 14:43:42 crc kubenswrapper[4962]: I1003 14:43:42.838978 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mtcmf" event={"ID":"3d6b9057-1871-4a3c-a345-756906a23352","Type":"ContainerDied","Data":"ac51cac2e98ebf958b78fcc5400e15443ecc4e257aea4f26b04af7c2b613b49d"}
Oct 03 14:43:42 crc kubenswrapper[4962]: I1003 14:43:42.839104 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mtcmf" event={"ID":"3d6b9057-1871-4a3c-a345-756906a23352","Type":"ContainerStarted","Data":"814013ae944547af814d9a3a8ef492521012a647c5a4ed8fbd010b113b791432"}
Oct 03 14:43:43 crc kubenswrapper[4962]: I1003 14:43:43.850834 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mtcmf" event={"ID":"3d6b9057-1871-4a3c-a345-756906a23352","Type":"ContainerStarted","Data":"71ac34810a38118a69e9935f7b6b3b0461616373cc3a522da09ec62b531cca5c"}
Oct 03 14:43:45 crc kubenswrapper[4962]: I1003 14:43:45.873870 4962 generic.go:334] "Generic (PLEG): container finished" podID="3d6b9057-1871-4a3c-a345-756906a23352" containerID="71ac34810a38118a69e9935f7b6b3b0461616373cc3a522da09ec62b531cca5c" exitCode=0
Oct 03 14:43:45 crc kubenswrapper[4962]: I1003 14:43:45.874048 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mtcmf" event={"ID":"3d6b9057-1871-4a3c-a345-756906a23352","Type":"ContainerDied","Data":"71ac34810a38118a69e9935f7b6b3b0461616373cc3a522da09ec62b531cca5c"}
Oct 03 14:43:46 crc kubenswrapper[4962]: I1003 14:43:46.895288 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mtcmf" event={"ID":"3d6b9057-1871-4a3c-a345-756906a23352","Type":"ContainerStarted","Data":"7bf786634c1f24bd6ffe7e47c016ab7a3e049f60f6530b1a7c5e76b6075661b1"}
Oct 03 14:43:46 crc kubenswrapper[4962]: I1003 14:43:46.929189 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mtcmf" podStartSLOduration=2.299011029 podStartE2EDuration="5.929170814s" podCreationTimestamp="2025-10-03 14:43:41 +0000 UTC" firstStartedPulling="2025-10-03 14:43:42.84175713 +0000 UTC m=+6831.245654965" lastFinishedPulling="2025-10-03 14:43:46.471916915 +0000 UTC m=+6834.875814750" observedRunningTime="2025-10-03 14:43:46.920829582 +0000 UTC m=+6835.324727437" watchObservedRunningTime="2025-10-03 14:43:46.929170814 +0000 UTC m=+6835.333068649"
status="" pod="openshift-marketplace/certified-operators-mtcmf" Oct 03 14:43:51 crc kubenswrapper[4962]: I1003 14:43:51.466111 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mtcmf" Oct 03 14:43:51 crc kubenswrapper[4962]: I1003 14:43:51.520955 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mtcmf" Oct 03 14:43:51 crc kubenswrapper[4962]: I1003 14:43:51.989125 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mtcmf" Oct 03 14:43:53 crc kubenswrapper[4962]: I1003 14:43:53.510494 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mtcmf"] Oct 03 14:43:53 crc kubenswrapper[4962]: I1003 14:43:53.961589 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mtcmf" podUID="3d6b9057-1871-4a3c-a345-756906a23352" containerName="registry-server" containerID="cri-o://7bf786634c1f24bd6ffe7e47c016ab7a3e049f60f6530b1a7c5e76b6075661b1" gracePeriod=2 Oct 03 14:43:54 crc kubenswrapper[4962]: I1003 14:43:54.446100 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mtcmf" Oct 03 14:43:54 crc kubenswrapper[4962]: I1003 14:43:54.592348 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d6b9057-1871-4a3c-a345-756906a23352-utilities\") pod \"3d6b9057-1871-4a3c-a345-756906a23352\" (UID: \"3d6b9057-1871-4a3c-a345-756906a23352\") " Oct 03 14:43:54 crc kubenswrapper[4962]: I1003 14:43:54.592438 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d6b9057-1871-4a3c-a345-756906a23352-catalog-content\") pod \"3d6b9057-1871-4a3c-a345-756906a23352\" (UID: \"3d6b9057-1871-4a3c-a345-756906a23352\") " Oct 03 14:43:54 crc kubenswrapper[4962]: I1003 14:43:54.592469 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bs5nt\" (UniqueName: \"kubernetes.io/projected/3d6b9057-1871-4a3c-a345-756906a23352-kube-api-access-bs5nt\") pod \"3d6b9057-1871-4a3c-a345-756906a23352\" (UID: \"3d6b9057-1871-4a3c-a345-756906a23352\") " Oct 03 14:43:54 crc kubenswrapper[4962]: I1003 14:43:54.593594 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d6b9057-1871-4a3c-a345-756906a23352-utilities" (OuterVolumeSpecName: "utilities") pod "3d6b9057-1871-4a3c-a345-756906a23352" (UID: "3d6b9057-1871-4a3c-a345-756906a23352"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:43:54 crc kubenswrapper[4962]: I1003 14:43:54.599835 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d6b9057-1871-4a3c-a345-756906a23352-kube-api-access-bs5nt" (OuterVolumeSpecName: "kube-api-access-bs5nt") pod "3d6b9057-1871-4a3c-a345-756906a23352" (UID: "3d6b9057-1871-4a3c-a345-756906a23352"). InnerVolumeSpecName "kube-api-access-bs5nt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:43:54 crc kubenswrapper[4962]: I1003 14:43:54.650874 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d6b9057-1871-4a3c-a345-756906a23352-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3d6b9057-1871-4a3c-a345-756906a23352" (UID: "3d6b9057-1871-4a3c-a345-756906a23352"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:43:54 crc kubenswrapper[4962]: I1003 14:43:54.694574 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d6b9057-1871-4a3c-a345-756906a23352-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:54 crc kubenswrapper[4962]: I1003 14:43:54.694626 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bs5nt\" (UniqueName: \"kubernetes.io/projected/3d6b9057-1871-4a3c-a345-756906a23352-kube-api-access-bs5nt\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:54 crc kubenswrapper[4962]: I1003 14:43:54.694660 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d6b9057-1871-4a3c-a345-756906a23352-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 14:43:54 crc kubenswrapper[4962]: I1003 14:43:54.973279 4962 generic.go:334] "Generic (PLEG): container finished" podID="3d6b9057-1871-4a3c-a345-756906a23352" containerID="7bf786634c1f24bd6ffe7e47c016ab7a3e049f60f6530b1a7c5e76b6075661b1" exitCode=0 Oct 03 14:43:54 crc kubenswrapper[4962]: I1003 14:43:54.973348 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mtcmf" Oct 03 14:43:54 crc kubenswrapper[4962]: I1003 14:43:54.973335 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mtcmf" event={"ID":"3d6b9057-1871-4a3c-a345-756906a23352","Type":"ContainerDied","Data":"7bf786634c1f24bd6ffe7e47c016ab7a3e049f60f6530b1a7c5e76b6075661b1"} Oct 03 14:43:54 crc kubenswrapper[4962]: I1003 14:43:54.973491 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mtcmf" event={"ID":"3d6b9057-1871-4a3c-a345-756906a23352","Type":"ContainerDied","Data":"814013ae944547af814d9a3a8ef492521012a647c5a4ed8fbd010b113b791432"} Oct 03 14:43:54 crc kubenswrapper[4962]: I1003 14:43:54.973514 4962 scope.go:117] "RemoveContainer" containerID="7bf786634c1f24bd6ffe7e47c016ab7a3e049f60f6530b1a7c5e76b6075661b1" Oct 03 14:43:55 crc kubenswrapper[4962]: I1003 14:43:55.002504 4962 scope.go:117] "RemoveContainer" containerID="71ac34810a38118a69e9935f7b6b3b0461616373cc3a522da09ec62b531cca5c" Oct 03 14:43:55 crc kubenswrapper[4962]: I1003 14:43:55.019363 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mtcmf"] Oct 03 14:43:55 crc kubenswrapper[4962]: I1003 14:43:55.041208 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mtcmf"] Oct 03 14:43:55 crc kubenswrapper[4962]: I1003 14:43:55.051520 4962 scope.go:117] "RemoveContainer" containerID="ac51cac2e98ebf958b78fcc5400e15443ecc4e257aea4f26b04af7c2b613b49d" Oct 03 14:43:55 crc kubenswrapper[4962]: I1003 14:43:55.087404 4962 scope.go:117] "RemoveContainer" containerID="7bf786634c1f24bd6ffe7e47c016ab7a3e049f60f6530b1a7c5e76b6075661b1" Oct 03 14:43:55 crc kubenswrapper[4962]: E1003 14:43:55.087991 4962 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bf786634c1f24bd6ffe7e47c016ab7a3e049f60f6530b1a7c5e76b6075661b1\": container with ID starting with 7bf786634c1f24bd6ffe7e47c016ab7a3e049f60f6530b1a7c5e76b6075661b1 not found: ID does not exist" containerID="7bf786634c1f24bd6ffe7e47c016ab7a3e049f60f6530b1a7c5e76b6075661b1" Oct 03 14:43:55 crc kubenswrapper[4962]: I1003 14:43:55.088153 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bf786634c1f24bd6ffe7e47c016ab7a3e049f60f6530b1a7c5e76b6075661b1"} err="failed to get container status \"7bf786634c1f24bd6ffe7e47c016ab7a3e049f60f6530b1a7c5e76b6075661b1\": rpc error: code = NotFound desc = could not find container \"7bf786634c1f24bd6ffe7e47c016ab7a3e049f60f6530b1a7c5e76b6075661b1\": container with ID starting with 7bf786634c1f24bd6ffe7e47c016ab7a3e049f60f6530b1a7c5e76b6075661b1 not found: ID does not exist" Oct 03 14:43:55 crc kubenswrapper[4962]: I1003 14:43:55.088242 4962 scope.go:117] "RemoveContainer" containerID="71ac34810a38118a69e9935f7b6b3b0461616373cc3a522da09ec62b531cca5c" Oct 03 14:43:55 crc kubenswrapper[4962]: E1003 14:43:55.088656 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71ac34810a38118a69e9935f7b6b3b0461616373cc3a522da09ec62b531cca5c\": container with ID starting with 71ac34810a38118a69e9935f7b6b3b0461616373cc3a522da09ec62b531cca5c not found: ID does not exist" containerID="71ac34810a38118a69e9935f7b6b3b0461616373cc3a522da09ec62b531cca5c" Oct 03 14:43:55 crc kubenswrapper[4962]: I1003 14:43:55.088758 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71ac34810a38118a69e9935f7b6b3b0461616373cc3a522da09ec62b531cca5c"} err="failed to get container status \"71ac34810a38118a69e9935f7b6b3b0461616373cc3a522da09ec62b531cca5c\": rpc error: code = NotFound desc = could not find container \"71ac34810a38118a69e9935f7b6b3b0461616373cc3a522da09ec62b531cca5c\": container with ID starting with 71ac34810a38118a69e9935f7b6b3b0461616373cc3a522da09ec62b531cca5c not found: ID does not exist" Oct 03 14:43:55 crc kubenswrapper[4962]: I1003 14:43:55.088828 4962 scope.go:117] "RemoveContainer" containerID="ac51cac2e98ebf958b78fcc5400e15443ecc4e257aea4f26b04af7c2b613b49d" Oct 03 14:43:55 crc kubenswrapper[4962]: E1003 14:43:55.089199 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac51cac2e98ebf958b78fcc5400e15443ecc4e257aea4f26b04af7c2b613b49d\": container with ID starting with ac51cac2e98ebf958b78fcc5400e15443ecc4e257aea4f26b04af7c2b613b49d not found: ID does not exist" containerID="ac51cac2e98ebf958b78fcc5400e15443ecc4e257aea4f26b04af7c2b613b49d" Oct 03 14:43:55 crc kubenswrapper[4962]: I1003 14:43:55.089301 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac51cac2e98ebf958b78fcc5400e15443ecc4e257aea4f26b04af7c2b613b49d"} err="failed to get container status \"ac51cac2e98ebf958b78fcc5400e15443ecc4e257aea4f26b04af7c2b613b49d\": rpc error: code = NotFound desc = could not find container \"ac51cac2e98ebf958b78fcc5400e15443ecc4e257aea4f26b04af7c2b613b49d\": container with ID starting with ac51cac2e98ebf958b78fcc5400e15443ecc4e257aea4f26b04af7c2b613b49d not found: ID does not exist" Oct 03 14:43:56 crc kubenswrapper[4962]: I1003 14:43:56.242240 4962 kubelet_volumes.go:163] "Cleaned 
Oct 03 14:43:56 crc kubenswrapper[4962]: I1003 14:43:56.242240 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d6b9057-1871-4a3c-a345-756906a23352" path="/var/lib/kubelet/pods/3d6b9057-1871-4a3c-a345-756906a23352/volumes"
Oct 03 14:45:00 crc kubenswrapper[4962]: I1003 14:45:00.159041 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325045-dwmqc"]
Oct 03 14:45:00 crc kubenswrapper[4962]: E1003 14:45:00.160319 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d6b9057-1871-4a3c-a345-756906a23352" containerName="registry-server"
Oct 03 14:45:00 crc kubenswrapper[4962]: I1003 14:45:00.160335 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d6b9057-1871-4a3c-a345-756906a23352" containerName="registry-server"
Oct 03 14:45:00 crc kubenswrapper[4962]: E1003 14:45:00.160353 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d6b9057-1871-4a3c-a345-756906a23352" containerName="extract-content"
Oct 03 14:45:00 crc kubenswrapper[4962]: I1003 14:45:00.160359 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d6b9057-1871-4a3c-a345-756906a23352" containerName="extract-content"
Oct 03 14:45:00 crc kubenswrapper[4962]: E1003 14:45:00.160375 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d6b9057-1871-4a3c-a345-756906a23352" containerName="extract-utilities"
Oct 03 14:45:00 crc kubenswrapper[4962]: I1003 14:45:00.160381 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d6b9057-1871-4a3c-a345-756906a23352" containerName="extract-utilities"
Oct 03 14:45:00 crc kubenswrapper[4962]: I1003 14:45:00.160599 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d6b9057-1871-4a3c-a345-756906a23352" containerName="registry-server"
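The 29325045 in the pod name collect-profiles-29325045-dwmqc is the CronJob controller's scheduled-time suffix: minutes since the Unix epoch, which decodes to exactly the 14:45:00 timestamp on the "SyncLoop ADD" above. The surrounding cpu_manager/memory_manager "RemoveStaleState" records are the resource managers dropping per-container state left over from the certified-operators pod deleted at 14:43:55. A quick check of the name arithmetic:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// CronJob-spawned Jobs carry the scheduled time, in minutes since
	// the Unix epoch, as the suffix in their name.
	const suffix = 29325045
	scheduled := time.Unix(suffix*60, 0).UTC()
	fmt.Println(scheduled) // 2025-10-03 14:45:00 +0000 UTC
}
```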
Oct 03 14:45:00 crc kubenswrapper[4962]: I1003 14:45:00.161483 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325045-dwmqc"
Oct 03 14:45:00 crc kubenswrapper[4962]: I1003 14:45:00.163687 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 03 14:45:00 crc kubenswrapper[4962]: I1003 14:45:00.165882 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 03 14:45:00 crc kubenswrapper[4962]: I1003 14:45:00.171953 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325045-dwmqc"]
Oct 03 14:45:00 crc kubenswrapper[4962]: I1003 14:45:00.297095 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b90dcf00-a383-48b4-9fa3-0867587b4341-config-volume\") pod \"collect-profiles-29325045-dwmqc\" (UID: \"b90dcf00-a383-48b4-9fa3-0867587b4341\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325045-dwmqc"
Oct 03 14:45:00 crc kubenswrapper[4962]: I1003 14:45:00.297269 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbhwd\" (UniqueName: \"kubernetes.io/projected/b90dcf00-a383-48b4-9fa3-0867587b4341-kube-api-access-kbhwd\") pod \"collect-profiles-29325045-dwmqc\" (UID: \"b90dcf00-a383-48b4-9fa3-0867587b4341\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325045-dwmqc"
Oct 03 14:45:00 crc kubenswrapper[4962]: I1003 14:45:00.297513 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b90dcf00-a383-48b4-9fa3-0867587b4341-secret-volume\") pod \"collect-profiles-29325045-dwmqc\" (UID: \"b90dcf00-a383-48b4-9fa3-0867587b4341\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325045-dwmqc"
Oct 03 14:45:00 crc kubenswrapper[4962]: I1003 14:45:00.400954 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b90dcf00-a383-48b4-9fa3-0867587b4341-secret-volume\") pod \"collect-profiles-29325045-dwmqc\" (UID: \"b90dcf00-a383-48b4-9fa3-0867587b4341\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325045-dwmqc"
Oct 03 14:45:00 crc kubenswrapper[4962]: I1003 14:45:00.402420 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b90dcf00-a383-48b4-9fa3-0867587b4341-config-volume\") pod \"collect-profiles-29325045-dwmqc\" (UID: \"b90dcf00-a383-48b4-9fa3-0867587b4341\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325045-dwmqc"
Oct 03 14:45:00 crc kubenswrapper[4962]: I1003 14:45:00.402855 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b90dcf00-a383-48b4-9fa3-0867587b4341-config-volume\") pod \"collect-profiles-29325045-dwmqc\" (UID: \"b90dcf00-a383-48b4-9fa3-0867587b4341\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325045-dwmqc"
Oct 03 14:45:00 crc kubenswrapper[4962]: I1003 14:45:00.403200 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbhwd\" (UniqueName: \"kubernetes.io/projected/b90dcf00-a383-48b4-9fa3-0867587b4341-kube-api-access-kbhwd\") pod \"collect-profiles-29325045-dwmqc\" (UID: \"b90dcf00-a383-48b4-9fa3-0867587b4341\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325045-dwmqc"
Oct 03 14:45:00 crc kubenswrapper[4962]: I1003 14:45:00.407685 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b90dcf00-a383-48b4-9fa3-0867587b4341-secret-volume\") pod \"collect-profiles-29325045-dwmqc\" (UID: \"b90dcf00-a383-48b4-9fa3-0867587b4341\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325045-dwmqc"
Oct 03 14:45:00 crc kubenswrapper[4962]: I1003 14:45:00.431393 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbhwd\" (UniqueName: \"kubernetes.io/projected/b90dcf00-a383-48b4-9fa3-0867587b4341-kube-api-access-kbhwd\") pod \"collect-profiles-29325045-dwmqc\" (UID: \"b90dcf00-a383-48b4-9fa3-0867587b4341\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325045-dwmqc"
Oct 03 14:45:00 crc kubenswrapper[4962]: I1003 14:45:00.497203 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325045-dwmqc"
Oct 03 14:45:00 crc kubenswrapper[4962]: I1003 14:45:00.964465 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325045-dwmqc"]
Oct 03 14:45:01 crc kubenswrapper[4962]: I1003 14:45:01.623987 4962 generic.go:334] "Generic (PLEG): container finished" podID="b90dcf00-a383-48b4-9fa3-0867587b4341" containerID="5d9ee5af611fd4b2cdc4c4fa428e1eb85dbf863c8d4afe6a37868b20c4fd0788" exitCode=0
Oct 03 14:45:01 crc kubenswrapper[4962]: I1003 14:45:01.624040 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325045-dwmqc" event={"ID":"b90dcf00-a383-48b4-9fa3-0867587b4341","Type":"ContainerDied","Data":"5d9ee5af611fd4b2cdc4c4fa428e1eb85dbf863c8d4afe6a37868b20c4fd0788"}
Oct 03 14:45:01 crc kubenswrapper[4962]: I1003 14:45:01.624075 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325045-dwmqc" event={"ID":"b90dcf00-a383-48b4-9fa3-0867587b4341","Type":"ContainerStarted","Data":"f3484dd664b3636faaa0923b9e296c5fea234082019cc9f4094a9687f4fd3661"}
Oct 03 14:45:03 crc kubenswrapper[4962]: I1003 14:45:03.001012 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325045-dwmqc"
Oct 03 14:45:03 crc kubenswrapper[4962]: I1003 14:45:03.163728 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbhwd\" (UniqueName: \"kubernetes.io/projected/b90dcf00-a383-48b4-9fa3-0867587b4341-kube-api-access-kbhwd\") pod \"b90dcf00-a383-48b4-9fa3-0867587b4341\" (UID: \"b90dcf00-a383-48b4-9fa3-0867587b4341\") "
Oct 03 14:45:03 crc kubenswrapper[4962]: I1003 14:45:03.163848 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b90dcf00-a383-48b4-9fa3-0867587b4341-config-volume\") pod \"b90dcf00-a383-48b4-9fa3-0867587b4341\" (UID: \"b90dcf00-a383-48b4-9fa3-0867587b4341\") "
Oct 03 14:45:03 crc kubenswrapper[4962]: I1003 14:45:03.163899 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b90dcf00-a383-48b4-9fa3-0867587b4341-secret-volume\") pod \"b90dcf00-a383-48b4-9fa3-0867587b4341\" (UID: \"b90dcf00-a383-48b4-9fa3-0867587b4341\") "
Oct 03 14:45:03 crc kubenswrapper[4962]: I1003 14:45:03.164524 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b90dcf00-a383-48b4-9fa3-0867587b4341-config-volume" (OuterVolumeSpecName: "config-volume") pod "b90dcf00-a383-48b4-9fa3-0867587b4341" (UID: "b90dcf00-a383-48b4-9fa3-0867587b4341"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:45:03 crc kubenswrapper[4962]: I1003 14:45:03.168646 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b90dcf00-a383-48b4-9fa3-0867587b4341-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b90dcf00-a383-48b4-9fa3-0867587b4341" (UID: "b90dcf00-a383-48b4-9fa3-0867587b4341"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:45:03 crc kubenswrapper[4962]: I1003 14:45:03.168656 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b90dcf00-a383-48b4-9fa3-0867587b4341-kube-api-access-kbhwd" (OuterVolumeSpecName: "kube-api-access-kbhwd") pod "b90dcf00-a383-48b4-9fa3-0867587b4341" (UID: "b90dcf00-a383-48b4-9fa3-0867587b4341"). InnerVolumeSpecName "kube-api-access-kbhwd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:45:03 crc kubenswrapper[4962]: I1003 14:45:03.266838 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbhwd\" (UniqueName: \"kubernetes.io/projected/b90dcf00-a383-48b4-9fa3-0867587b4341-kube-api-access-kbhwd\") on node \"crc\" DevicePath \"\""
Oct 03 14:45:03 crc kubenswrapper[4962]: I1003 14:45:03.266875 4962 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b90dcf00-a383-48b4-9fa3-0867587b4341-config-volume\") on node \"crc\" DevicePath \"\""
Oct 03 14:45:03 crc kubenswrapper[4962]: I1003 14:45:03.266884 4962 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b90dcf00-a383-48b4-9fa3-0867587b4341-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 03 14:45:03 crc kubenswrapper[4962]: I1003 14:45:03.645066 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325045-dwmqc" event={"ID":"b90dcf00-a383-48b4-9fa3-0867587b4341","Type":"ContainerDied","Data":"f3484dd664b3636faaa0923b9e296c5fea234082019cc9f4094a9687f4fd3661"}
Oct 03 14:45:03 crc kubenswrapper[4962]: I1003 14:45:03.645377 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3484dd664b3636faaa0923b9e296c5fea234082019cc9f4094a9687f4fd3661"
Oct 03 14:45:03 crc kubenswrapper[4962]: I1003 14:45:03.645228 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325045-dwmqc"
Oct 03 14:45:04 crc kubenswrapper[4962]: I1003 14:45:04.086478 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325000-dxmn6"]
Oct 03 14:45:04 crc kubenswrapper[4962]: I1003 14:45:04.096799 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325000-dxmn6"]
Oct 03 14:45:04 crc kubenswrapper[4962]: I1003 14:45:04.241166 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb16333f-5f61-4323-bc2f-4a394e9ad6bb" path="/var/lib/kubelet/pods/fb16333f-5f61-4323-bc2f-4a394e9ad6bb/volumes"
Oct 03 14:45:13 crc kubenswrapper[4962]: I1003 14:45:13.194779 4962 scope.go:117] "RemoveContainer" containerID="7264fb0675d236a052f6285b069ba0c322c8da031143de6d25e453a179ec09ae"
Oct 03 14:45:24 crc kubenswrapper[4962]: I1003 14:45:24.660180 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 14:45:24 crc kubenswrapper[4962]: I1003 14:45:24.660752 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
containerName="collect-profiles" Oct 03 14:45:38 crc kubenswrapper[4962]: I1003 14:45:38.329384 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b90dcf00-a383-48b4-9fa3-0867587b4341" containerName="collect-profiles" Oct 03 14:45:38 crc kubenswrapper[4962]: I1003 14:45:38.329631 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b90dcf00-a383-48b4-9fa3-0867587b4341" containerName="collect-profiles" Oct 03 14:45:38 crc kubenswrapper[4962]: I1003 14:45:38.331290 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k7cgt" Oct 03 14:45:38 crc kubenswrapper[4962]: I1003 14:45:38.357689 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k7cgt"] Oct 03 14:45:38 crc kubenswrapper[4962]: I1003 14:45:38.440768 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31e17840-7a2f-4902-ab16-1b3d1ce079ee-catalog-content\") pod \"community-operators-k7cgt\" (UID: \"31e17840-7a2f-4902-ab16-1b3d1ce079ee\") " pod="openshift-marketplace/community-operators-k7cgt" Oct 03 14:45:38 crc kubenswrapper[4962]: I1003 14:45:38.440893 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31e17840-7a2f-4902-ab16-1b3d1ce079ee-utilities\") pod \"community-operators-k7cgt\" (UID: \"31e17840-7a2f-4902-ab16-1b3d1ce079ee\") " pod="openshift-marketplace/community-operators-k7cgt" Oct 03 14:45:38 crc kubenswrapper[4962]: I1003 14:45:38.441019 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fzwg\" (UniqueName: \"kubernetes.io/projected/31e17840-7a2f-4902-ab16-1b3d1ce079ee-kube-api-access-9fzwg\") pod \"community-operators-k7cgt\" (UID: \"31e17840-7a2f-4902-ab16-1b3d1ce079ee\") " pod="openshift-marketplace/community-operators-k7cgt" Oct 03 14:45:38 crc kubenswrapper[4962]: I1003 14:45:38.543125 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31e17840-7a2f-4902-ab16-1b3d1ce079ee-catalog-content\") pod \"community-operators-k7cgt\" (UID: \"31e17840-7a2f-4902-ab16-1b3d1ce079ee\") " pod="openshift-marketplace/community-operators-k7cgt" Oct 03 14:45:38 crc kubenswrapper[4962]: I1003 14:45:38.543534 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31e17840-7a2f-4902-ab16-1b3d1ce079ee-utilities\") pod \"community-operators-k7cgt\" (UID: \"31e17840-7a2f-4902-ab16-1b3d1ce079ee\") " pod="openshift-marketplace/community-operators-k7cgt" Oct 03 14:45:38 crc kubenswrapper[4962]: I1003 14:45:38.543538 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31e17840-7a2f-4902-ab16-1b3d1ce079ee-catalog-content\") pod \"community-operators-k7cgt\" (UID: \"31e17840-7a2f-4902-ab16-1b3d1ce079ee\") " pod="openshift-marketplace/community-operators-k7cgt" Oct 03 14:45:38 crc kubenswrapper[4962]: I1003 14:45:38.543697 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fzwg\" (UniqueName: \"kubernetes.io/projected/31e17840-7a2f-4902-ab16-1b3d1ce079ee-kube-api-access-9fzwg\") pod \"community-operators-k7cgt\" (UID: 
\"31e17840-7a2f-4902-ab16-1b3d1ce079ee\") " pod="openshift-marketplace/community-operators-k7cgt" Oct 03 14:45:38 crc kubenswrapper[4962]: I1003 14:45:38.543779 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31e17840-7a2f-4902-ab16-1b3d1ce079ee-utilities\") pod \"community-operators-k7cgt\" (UID: \"31e17840-7a2f-4902-ab16-1b3d1ce079ee\") " pod="openshift-marketplace/community-operators-k7cgt" Oct 03 14:45:38 crc kubenswrapper[4962]: I1003 14:45:38.571683 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fzwg\" (UniqueName: \"kubernetes.io/projected/31e17840-7a2f-4902-ab16-1b3d1ce079ee-kube-api-access-9fzwg\") pod \"community-operators-k7cgt\" (UID: \"31e17840-7a2f-4902-ab16-1b3d1ce079ee\") " pod="openshift-marketplace/community-operators-k7cgt" Oct 03 14:45:38 crc kubenswrapper[4962]: I1003 14:45:38.660176 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k7cgt" Oct 03 14:45:39 crc kubenswrapper[4962]: I1003 14:45:39.175312 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k7cgt"] Oct 03 14:45:39 crc kubenswrapper[4962]: I1003 14:45:39.963356 4962 generic.go:334] "Generic (PLEG): container finished" podID="31e17840-7a2f-4902-ab16-1b3d1ce079ee" containerID="df8c176f7dd8035be77c700dfa63e85bc1694256cdcecb51cdee080dabfd7b31" exitCode=0 Oct 03 14:45:39 crc kubenswrapper[4962]: I1003 14:45:39.963417 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k7cgt" event={"ID":"31e17840-7a2f-4902-ab16-1b3d1ce079ee","Type":"ContainerDied","Data":"df8c176f7dd8035be77c700dfa63e85bc1694256cdcecb51cdee080dabfd7b31"} Oct 03 14:45:39 crc kubenswrapper[4962]: I1003 14:45:39.963685 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k7cgt" event={"ID":"31e17840-7a2f-4902-ab16-1b3d1ce079ee","Type":"ContainerStarted","Data":"c1648a21ae668362dbb55ca050b3cabf45509380379ac1025348d345e4c81666"} Oct 03 14:45:39 crc kubenswrapper[4962]: I1003 14:45:39.965756 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 14:45:40 crc kubenswrapper[4962]: I1003 14:45:40.978014 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k7cgt" event={"ID":"31e17840-7a2f-4902-ab16-1b3d1ce079ee","Type":"ContainerStarted","Data":"006d4e9643875e57a2262e368b4e6504793c526cb9c96acb5a63d3556524f4ed"} Oct 03 14:45:41 crc kubenswrapper[4962]: I1003 14:45:41.989331 4962 generic.go:334] "Generic (PLEG): container finished" podID="31e17840-7a2f-4902-ab16-1b3d1ce079ee" containerID="006d4e9643875e57a2262e368b4e6504793c526cb9c96acb5a63d3556524f4ed" exitCode=0 Oct 03 14:45:41 crc kubenswrapper[4962]: I1003 14:45:41.989395 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k7cgt" event={"ID":"31e17840-7a2f-4902-ab16-1b3d1ce079ee","Type":"ContainerDied","Data":"006d4e9643875e57a2262e368b4e6504793c526cb9c96acb5a63d3556524f4ed"} Oct 03 14:45:43 crc kubenswrapper[4962]: I1003 14:45:43.006566 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k7cgt" event={"ID":"31e17840-7a2f-4902-ab16-1b3d1ce079ee","Type":"ContainerStarted","Data":"736746ab973b67ebc6ef59fa5bd7410c882ee3b3ae077b8d5146196a3bc647b1"} Oct 03 14:45:43 crc 
kubenswrapper[4962]: I1003 14:45:43.033219 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-k7cgt" podStartSLOduration=2.56688382 podStartE2EDuration="5.033198545s" podCreationTimestamp="2025-10-03 14:45:38 +0000 UTC" firstStartedPulling="2025-10-03 14:45:39.965536567 +0000 UTC m=+6948.369434402" lastFinishedPulling="2025-10-03 14:45:42.431851292 +0000 UTC m=+6950.835749127" observedRunningTime="2025-10-03 14:45:43.025736126 +0000 UTC m=+6951.429634041" watchObservedRunningTime="2025-10-03 14:45:43.033198545 +0000 UTC m=+6951.437096380" Oct 03 14:45:48 crc kubenswrapper[4962]: I1003 14:45:48.661268 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-k7cgt" Oct 03 14:45:48 crc kubenswrapper[4962]: I1003 14:45:48.661953 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-k7cgt" Oct 03 14:45:48 crc kubenswrapper[4962]: I1003 14:45:48.719729 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-k7cgt" Oct 03 14:45:49 crc kubenswrapper[4962]: I1003 14:45:49.124687 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-k7cgt" Oct 03 14:45:49 crc kubenswrapper[4962]: I1003 14:45:49.180909 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k7cgt"] Oct 03 14:45:51 crc kubenswrapper[4962]: I1003 14:45:51.041969 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-gw495"] Oct 03 14:45:51 crc kubenswrapper[4962]: I1003 14:45:51.052477 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-gw495"] Oct 03 14:45:51 crc kubenswrapper[4962]: I1003 14:45:51.090590 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-k7cgt" podUID="31e17840-7a2f-4902-ab16-1b3d1ce079ee" containerName="registry-server" containerID="cri-o://736746ab973b67ebc6ef59fa5bd7410c882ee3b3ae077b8d5146196a3bc647b1" gracePeriod=2 Oct 03 14:45:51 crc kubenswrapper[4962]: I1003 14:45:51.573350 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k7cgt" Oct 03 14:45:51 crc kubenswrapper[4962]: I1003 14:45:51.665036 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fzwg\" (UniqueName: \"kubernetes.io/projected/31e17840-7a2f-4902-ab16-1b3d1ce079ee-kube-api-access-9fzwg\") pod \"31e17840-7a2f-4902-ab16-1b3d1ce079ee\" (UID: \"31e17840-7a2f-4902-ab16-1b3d1ce079ee\") " Oct 03 14:45:51 crc kubenswrapper[4962]: I1003 14:45:51.665142 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31e17840-7a2f-4902-ab16-1b3d1ce079ee-utilities\") pod \"31e17840-7a2f-4902-ab16-1b3d1ce079ee\" (UID: \"31e17840-7a2f-4902-ab16-1b3d1ce079ee\") " Oct 03 14:45:51 crc kubenswrapper[4962]: I1003 14:45:51.665339 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31e17840-7a2f-4902-ab16-1b3d1ce079ee-catalog-content\") pod \"31e17840-7a2f-4902-ab16-1b3d1ce079ee\" (UID: \"31e17840-7a2f-4902-ab16-1b3d1ce079ee\") " Oct 03 14:45:51 crc kubenswrapper[4962]: I1003 14:45:51.666021 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31e17840-7a2f-4902-ab16-1b3d1ce079ee-utilities" (OuterVolumeSpecName: "utilities") pod "31e17840-7a2f-4902-ab16-1b3d1ce079ee" (UID: "31e17840-7a2f-4902-ab16-1b3d1ce079ee"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:45:51 crc kubenswrapper[4962]: I1003 14:45:51.670951 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31e17840-7a2f-4902-ab16-1b3d1ce079ee-kube-api-access-9fzwg" (OuterVolumeSpecName: "kube-api-access-9fzwg") pod "31e17840-7a2f-4902-ab16-1b3d1ce079ee" (UID: "31e17840-7a2f-4902-ab16-1b3d1ce079ee"). InnerVolumeSpecName "kube-api-access-9fzwg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:45:51 crc kubenswrapper[4962]: I1003 14:45:51.712268 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31e17840-7a2f-4902-ab16-1b3d1ce079ee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "31e17840-7a2f-4902-ab16-1b3d1ce079ee" (UID: "31e17840-7a2f-4902-ab16-1b3d1ce079ee"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:45:51 crc kubenswrapper[4962]: I1003 14:45:51.769200 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31e17840-7a2f-4902-ab16-1b3d1ce079ee-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 14:45:51 crc kubenswrapper[4962]: I1003 14:45:51.769265 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fzwg\" (UniqueName: \"kubernetes.io/projected/31e17840-7a2f-4902-ab16-1b3d1ce079ee-kube-api-access-9fzwg\") on node \"crc\" DevicePath \"\"" Oct 03 14:45:51 crc kubenswrapper[4962]: I1003 14:45:51.769279 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31e17840-7a2f-4902-ab16-1b3d1ce079ee-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 14:45:52 crc kubenswrapper[4962]: I1003 14:45:52.099868 4962 generic.go:334] "Generic (PLEG): container finished" podID="31e17840-7a2f-4902-ab16-1b3d1ce079ee" containerID="736746ab973b67ebc6ef59fa5bd7410c882ee3b3ae077b8d5146196a3bc647b1" exitCode=0 Oct 03 14:45:52 crc kubenswrapper[4962]: I1003 14:45:52.099925 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k7cgt" Oct 03 14:45:52 crc kubenswrapper[4962]: I1003 14:45:52.099955 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k7cgt" event={"ID":"31e17840-7a2f-4902-ab16-1b3d1ce079ee","Type":"ContainerDied","Data":"736746ab973b67ebc6ef59fa5bd7410c882ee3b3ae077b8d5146196a3bc647b1"} Oct 03 14:45:52 crc kubenswrapper[4962]: I1003 14:45:52.100416 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k7cgt" event={"ID":"31e17840-7a2f-4902-ab16-1b3d1ce079ee","Type":"ContainerDied","Data":"c1648a21ae668362dbb55ca050b3cabf45509380379ac1025348d345e4c81666"} Oct 03 14:45:52 crc kubenswrapper[4962]: I1003 14:45:52.100442 4962 scope.go:117] "RemoveContainer" containerID="736746ab973b67ebc6ef59fa5bd7410c882ee3b3ae077b8d5146196a3bc647b1" Oct 03 14:45:52 crc kubenswrapper[4962]: I1003 14:45:52.120229 4962 scope.go:117] "RemoveContainer" containerID="006d4e9643875e57a2262e368b4e6504793c526cb9c96acb5a63d3556524f4ed" Oct 03 14:45:52 crc kubenswrapper[4962]: I1003 14:45:52.148937 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k7cgt"] Oct 03 14:45:52 crc kubenswrapper[4962]: I1003 14:45:52.156327 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-k7cgt"] Oct 03 14:45:52 crc kubenswrapper[4962]: I1003 14:45:52.167412 4962 scope.go:117] "RemoveContainer" containerID="df8c176f7dd8035be77c700dfa63e85bc1694256cdcecb51cdee080dabfd7b31" Oct 03 14:45:52 crc kubenswrapper[4962]: I1003 14:45:52.206358 4962 scope.go:117] "RemoveContainer" containerID="736746ab973b67ebc6ef59fa5bd7410c882ee3b3ae077b8d5146196a3bc647b1" Oct 03 14:45:52 crc kubenswrapper[4962]: E1003 14:45:52.212740 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"736746ab973b67ebc6ef59fa5bd7410c882ee3b3ae077b8d5146196a3bc647b1\": container with ID starting with 736746ab973b67ebc6ef59fa5bd7410c882ee3b3ae077b8d5146196a3bc647b1 not found: ID does not exist" containerID="736746ab973b67ebc6ef59fa5bd7410c882ee3b3ae077b8d5146196a3bc647b1" Oct 03 14:45:52 crc kubenswrapper[4962]: I1003 14:45:52.212786 
4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"736746ab973b67ebc6ef59fa5bd7410c882ee3b3ae077b8d5146196a3bc647b1"} err="failed to get container status \"736746ab973b67ebc6ef59fa5bd7410c882ee3b3ae077b8d5146196a3bc647b1\": rpc error: code = NotFound desc = could not find container \"736746ab973b67ebc6ef59fa5bd7410c882ee3b3ae077b8d5146196a3bc647b1\": container with ID starting with 736746ab973b67ebc6ef59fa5bd7410c882ee3b3ae077b8d5146196a3bc647b1 not found: ID does not exist" Oct 03 14:45:52 crc kubenswrapper[4962]: I1003 14:45:52.212815 4962 scope.go:117] "RemoveContainer" containerID="006d4e9643875e57a2262e368b4e6504793c526cb9c96acb5a63d3556524f4ed" Oct 03 14:45:52 crc kubenswrapper[4962]: E1003 14:45:52.213248 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"006d4e9643875e57a2262e368b4e6504793c526cb9c96acb5a63d3556524f4ed\": container with ID starting with 006d4e9643875e57a2262e368b4e6504793c526cb9c96acb5a63d3556524f4ed not found: ID does not exist" containerID="006d4e9643875e57a2262e368b4e6504793c526cb9c96acb5a63d3556524f4ed" Oct 03 14:45:52 crc kubenswrapper[4962]: I1003 14:45:52.213298 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"006d4e9643875e57a2262e368b4e6504793c526cb9c96acb5a63d3556524f4ed"} err="failed to get container status \"006d4e9643875e57a2262e368b4e6504793c526cb9c96acb5a63d3556524f4ed\": rpc error: code = NotFound desc = could not find container \"006d4e9643875e57a2262e368b4e6504793c526cb9c96acb5a63d3556524f4ed\": container with ID starting with 006d4e9643875e57a2262e368b4e6504793c526cb9c96acb5a63d3556524f4ed not found: ID does not exist" Oct 03 14:45:52 crc kubenswrapper[4962]: I1003 14:45:52.213333 4962 scope.go:117] "RemoveContainer" containerID="df8c176f7dd8035be77c700dfa63e85bc1694256cdcecb51cdee080dabfd7b31" Oct 03 14:45:52 crc kubenswrapper[4962]: E1003 14:45:52.213853 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df8c176f7dd8035be77c700dfa63e85bc1694256cdcecb51cdee080dabfd7b31\": container with ID starting with df8c176f7dd8035be77c700dfa63e85bc1694256cdcecb51cdee080dabfd7b31 not found: ID does not exist" containerID="df8c176f7dd8035be77c700dfa63e85bc1694256cdcecb51cdee080dabfd7b31" Oct 03 14:45:52 crc kubenswrapper[4962]: I1003 14:45:52.213875 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df8c176f7dd8035be77c700dfa63e85bc1694256cdcecb51cdee080dabfd7b31"} err="failed to get container status \"df8c176f7dd8035be77c700dfa63e85bc1694256cdcecb51cdee080dabfd7b31\": rpc error: code = NotFound desc = could not find container \"df8c176f7dd8035be77c700dfa63e85bc1694256cdcecb51cdee080dabfd7b31\": container with ID starting with df8c176f7dd8035be77c700dfa63e85bc1694256cdcecb51cdee080dabfd7b31 not found: ID does not exist" Oct 03 14:45:52 crc kubenswrapper[4962]: I1003 14:45:52.245817 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31e17840-7a2f-4902-ab16-1b3d1ce079ee" path="/var/lib/kubelet/pods/31e17840-7a2f-4902-ab16-1b3d1ce079ee/volumes" Oct 03 14:45:52 crc kubenswrapper[4962]: I1003 14:45:52.246986 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd6b564a-9316-4aea-a8bc-d764e23fc7f8" path="/var/lib/kubelet/pods/cd6b564a-9316-4aea-a8bc-d764e23fc7f8/volumes" Oct 03 14:45:54 crc kubenswrapper[4962]: I1003 
Oct 03 14:45:54 crc kubenswrapper[4962]: I1003 14:45:54.660097 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 14:46:01 crc kubenswrapper[4962]: I1003 14:46:01.027970 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-dccc-account-create-7gz7x"]
Oct 03 14:46:01 crc kubenswrapper[4962]: I1003 14:46:01.035612 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-dccc-account-create-7gz7x"]
Oct 03 14:46:02 crc kubenswrapper[4962]: I1003 14:46:02.238971 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9adc7f7-76c2-4c5e-aca6-f8ace634f403" path="/var/lib/kubelet/pods/a9adc7f7-76c2-4c5e-aca6-f8ace634f403/volumes"
Oct 03 14:46:13 crc kubenswrapper[4962]: I1003 14:46:13.258155 4962 scope.go:117] "RemoveContainer" containerID="20d504b0b4d59e6a77e8321bd3bb127b50930cf7551b1b1405b734d04f80bc45"
Oct 03 14:46:13 crc kubenswrapper[4962]: I1003 14:46:13.290285 4962 scope.go:117] "RemoveContainer" containerID="3c28c0c458724d1145069d0adbd5091c578364b762c0523c131500c2920fa0e3"
Oct 03 14:46:16 crc kubenswrapper[4962]: I1003 14:46:16.040703 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-mlxbc"]
Oct 03 14:46:16 crc kubenswrapper[4962]: I1003 14:46:16.050267 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-mlxbc"]
Oct 03 14:46:16 crc kubenswrapper[4962]: I1003 14:46:16.239201 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbbf01e6-b3e6-4e62-aca9-7f905967ed80" path="/var/lib/kubelet/pods/fbbf01e6-b3e6-4e62-aca9-7f905967ed80/volumes"
Oct 03 14:46:24 crc kubenswrapper[4962]: I1003 14:46:24.660285 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 14:46:24 crc kubenswrapper[4962]: I1003 14:46:24.660774 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 14:46:24 crc kubenswrapper[4962]: I1003 14:46:24.660823 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-46vck"
Oct 03 14:46:24 crc kubenswrapper[4962]: I1003 14:46:24.661709 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2456cc5807e33f67351b02ef43a646828ba891bf3f306896eda2c0ba865fddfd"} pod="openshift-machine-config-operator/machine-config-daemon-46vck" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
liveness probe, will be restarted" Oct 03 14:46:24 crc kubenswrapper[4962]: I1003 14:46:24.661762 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" containerID="cri-o://2456cc5807e33f67351b02ef43a646828ba891bf3f306896eda2c0ba865fddfd" gracePeriod=600 Oct 03 14:46:25 crc kubenswrapper[4962]: I1003 14:46:25.427801 4962 generic.go:334] "Generic (PLEG): container finished" podID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerID="2456cc5807e33f67351b02ef43a646828ba891bf3f306896eda2c0ba865fddfd" exitCode=0 Oct 03 14:46:25 crc kubenswrapper[4962]: I1003 14:46:25.428354 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerDied","Data":"2456cc5807e33f67351b02ef43a646828ba891bf3f306896eda2c0ba865fddfd"} Oct 03 14:46:25 crc kubenswrapper[4962]: I1003 14:46:25.428383 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerStarted","Data":"4a67ceef9e53181de3d922469b2925159c518928ed34b001eb82ca061d24874e"} Oct 03 14:46:25 crc kubenswrapper[4962]: I1003 14:46:25.428401 4962 scope.go:117] "RemoveContainer" containerID="e273395719cf61a5af9b7bc53d31b2386f724aead6ba873c67b6fad6aa1c9256" Oct 03 14:47:13 crc kubenswrapper[4962]: I1003 14:47:13.402135 4962 scope.go:117] "RemoveContainer" containerID="faccd713a09335c3cee17c137068d8227a0197bda26b9189a6370ffdd1b7855d" Oct 03 14:48:15 crc kubenswrapper[4962]: I1003 14:48:15.956388 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gk7v6"] Oct 03 14:48:15 crc kubenswrapper[4962]: E1003 14:48:15.957546 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31e17840-7a2f-4902-ab16-1b3d1ce079ee" containerName="registry-server" Oct 03 14:48:15 crc kubenswrapper[4962]: I1003 14:48:15.957565 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="31e17840-7a2f-4902-ab16-1b3d1ce079ee" containerName="registry-server" Oct 03 14:48:15 crc kubenswrapper[4962]: E1003 14:48:15.957592 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31e17840-7a2f-4902-ab16-1b3d1ce079ee" containerName="extract-utilities" Oct 03 14:48:15 crc kubenswrapper[4962]: I1003 14:48:15.957600 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="31e17840-7a2f-4902-ab16-1b3d1ce079ee" containerName="extract-utilities" Oct 03 14:48:15 crc kubenswrapper[4962]: E1003 14:48:15.957620 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31e17840-7a2f-4902-ab16-1b3d1ce079ee" containerName="extract-content" Oct 03 14:48:15 crc kubenswrapper[4962]: I1003 14:48:15.957628 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="31e17840-7a2f-4902-ab16-1b3d1ce079ee" containerName="extract-content" Oct 03 14:48:15 crc kubenswrapper[4962]: I1003 14:48:15.957924 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="31e17840-7a2f-4902-ab16-1b3d1ce079ee" containerName="registry-server" Oct 03 14:48:15 crc kubenswrapper[4962]: I1003 14:48:15.959734 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gk7v6" Oct 03 14:48:15 crc kubenswrapper[4962]: I1003 14:48:15.966165 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gk7v6"] Oct 03 14:48:16 crc kubenswrapper[4962]: I1003 14:48:16.067795 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8s76\" (UniqueName: \"kubernetes.io/projected/6dec6b6c-058f-44ed-ab58-3b045b5d6acd-kube-api-access-h8s76\") pod \"redhat-operators-gk7v6\" (UID: \"6dec6b6c-058f-44ed-ab58-3b045b5d6acd\") " pod="openshift-marketplace/redhat-operators-gk7v6" Oct 03 14:48:16 crc kubenswrapper[4962]: I1003 14:48:16.068072 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dec6b6c-058f-44ed-ab58-3b045b5d6acd-utilities\") pod \"redhat-operators-gk7v6\" (UID: \"6dec6b6c-058f-44ed-ab58-3b045b5d6acd\") " pod="openshift-marketplace/redhat-operators-gk7v6" Oct 03 14:48:16 crc kubenswrapper[4962]: I1003 14:48:16.068557 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dec6b6c-058f-44ed-ab58-3b045b5d6acd-catalog-content\") pod \"redhat-operators-gk7v6\" (UID: \"6dec6b6c-058f-44ed-ab58-3b045b5d6acd\") " pod="openshift-marketplace/redhat-operators-gk7v6" Oct 03 14:48:16 crc kubenswrapper[4962]: I1003 14:48:16.170992 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dec6b6c-058f-44ed-ab58-3b045b5d6acd-utilities\") pod \"redhat-operators-gk7v6\" (UID: \"6dec6b6c-058f-44ed-ab58-3b045b5d6acd\") " pod="openshift-marketplace/redhat-operators-gk7v6" Oct 03 14:48:16 crc kubenswrapper[4962]: I1003 14:48:16.171157 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dec6b6c-058f-44ed-ab58-3b045b5d6acd-catalog-content\") pod \"redhat-operators-gk7v6\" (UID: \"6dec6b6c-058f-44ed-ab58-3b045b5d6acd\") " pod="openshift-marketplace/redhat-operators-gk7v6" Oct 03 14:48:16 crc kubenswrapper[4962]: I1003 14:48:16.171259 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8s76\" (UniqueName: \"kubernetes.io/projected/6dec6b6c-058f-44ed-ab58-3b045b5d6acd-kube-api-access-h8s76\") pod \"redhat-operators-gk7v6\" (UID: \"6dec6b6c-058f-44ed-ab58-3b045b5d6acd\") " pod="openshift-marketplace/redhat-operators-gk7v6" Oct 03 14:48:16 crc kubenswrapper[4962]: I1003 14:48:16.171541 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dec6b6c-058f-44ed-ab58-3b045b5d6acd-utilities\") pod \"redhat-operators-gk7v6\" (UID: \"6dec6b6c-058f-44ed-ab58-3b045b5d6acd\") " pod="openshift-marketplace/redhat-operators-gk7v6" Oct 03 14:48:16 crc kubenswrapper[4962]: I1003 14:48:16.171568 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dec6b6c-058f-44ed-ab58-3b045b5d6acd-catalog-content\") pod \"redhat-operators-gk7v6\" (UID: \"6dec6b6c-058f-44ed-ab58-3b045b5d6acd\") " pod="openshift-marketplace/redhat-operators-gk7v6" Oct 03 14:48:16 crc kubenswrapper[4962]: I1003 14:48:16.191862 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-h8s76\" (UniqueName: \"kubernetes.io/projected/6dec6b6c-058f-44ed-ab58-3b045b5d6acd-kube-api-access-h8s76\") pod \"redhat-operators-gk7v6\" (UID: \"6dec6b6c-058f-44ed-ab58-3b045b5d6acd\") " pod="openshift-marketplace/redhat-operators-gk7v6" Oct 03 14:48:16 crc kubenswrapper[4962]: I1003 14:48:16.300201 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gk7v6" Oct 03 14:48:16 crc kubenswrapper[4962]: I1003 14:48:16.793444 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gk7v6"] Oct 03 14:48:17 crc kubenswrapper[4962]: I1003 14:48:17.461812 4962 generic.go:334] "Generic (PLEG): container finished" podID="6dec6b6c-058f-44ed-ab58-3b045b5d6acd" containerID="e9a01000364ee0aa64dfeb032307221ea38494209fe3d92f58ba41da7f551da6" exitCode=0 Oct 03 14:48:17 crc kubenswrapper[4962]: I1003 14:48:17.462013 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gk7v6" event={"ID":"6dec6b6c-058f-44ed-ab58-3b045b5d6acd","Type":"ContainerDied","Data":"e9a01000364ee0aa64dfeb032307221ea38494209fe3d92f58ba41da7f551da6"} Oct 03 14:48:17 crc kubenswrapper[4962]: I1003 14:48:17.462144 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gk7v6" event={"ID":"6dec6b6c-058f-44ed-ab58-3b045b5d6acd","Type":"ContainerStarted","Data":"36b5054e7309319b43b5d9d6fecde5c7b85ecbc530d5c21d7ee83bdb1a837f58"} Oct 03 14:48:18 crc kubenswrapper[4962]: I1003 14:48:18.475152 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gk7v6" event={"ID":"6dec6b6c-058f-44ed-ab58-3b045b5d6acd","Type":"ContainerStarted","Data":"041d77369c586fc07eaa10d9690ae4f788f9bb94cb36a770800d7ff19b78addd"} Oct 03 14:48:23 crc kubenswrapper[4962]: I1003 14:48:23.533955 4962 generic.go:334] "Generic (PLEG): container finished" podID="6dec6b6c-058f-44ed-ab58-3b045b5d6acd" containerID="041d77369c586fc07eaa10d9690ae4f788f9bb94cb36a770800d7ff19b78addd" exitCode=0 Oct 03 14:48:23 crc kubenswrapper[4962]: I1003 14:48:23.534029 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gk7v6" event={"ID":"6dec6b6c-058f-44ed-ab58-3b045b5d6acd","Type":"ContainerDied","Data":"041d77369c586fc07eaa10d9690ae4f788f9bb94cb36a770800d7ff19b78addd"} Oct 03 14:48:24 crc kubenswrapper[4962]: I1003 14:48:24.046933 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-v748j"] Oct 03 14:48:24 crc kubenswrapper[4962]: I1003 14:48:24.057753 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-v748j"] Oct 03 14:48:24 crc kubenswrapper[4962]: I1003 14:48:24.242092 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2edecf6-a382-402a-ad10-8776c6c34c81" path="/var/lib/kubelet/pods/e2edecf6-a382-402a-ad10-8776c6c34c81/volumes" Oct 03 14:48:24 crc kubenswrapper[4962]: I1003 14:48:24.549268 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gk7v6" event={"ID":"6dec6b6c-058f-44ed-ab58-3b045b5d6acd","Type":"ContainerStarted","Data":"c5d41a1108b6b07783f4be56aedd42249c2f4d8423213fdc71e680c7befb38b1"} Oct 03 14:48:24 crc kubenswrapper[4962]: I1003 14:48:24.568086 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gk7v6" podStartSLOduration=3.032951083 podStartE2EDuration="9.568066359s" 
Oct 03 14:48:24 crc kubenswrapper[4962]: I1003 14:48:24.660274 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 14:48:24 crc kubenswrapper[4962]: I1003 14:48:24.660604 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 14:48:26 crc kubenswrapper[4962]: I1003 14:48:26.301155 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gk7v6"
Oct 03 14:48:26 crc kubenswrapper[4962]: I1003 14:48:26.303005 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gk7v6"
Oct 03 14:48:27 crc kubenswrapper[4962]: I1003 14:48:27.365721 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gk7v6" podUID="6dec6b6c-058f-44ed-ab58-3b045b5d6acd" containerName="registry-server" probeResult="failure" output=<
Oct 03 14:48:27 crc kubenswrapper[4962]: timeout: failed to connect service ":50051" within 1s
Oct 03 14:48:27 crc kubenswrapper[4962]: >
Oct 03 14:48:34 crc kubenswrapper[4962]: I1003 14:48:34.030862 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-3d7c-account-create-w8922"]
Oct 03 14:48:34 crc kubenswrapper[4962]: I1003 14:48:34.042194 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-3d7c-account-create-w8922"]
Oct 03 14:48:34 crc kubenswrapper[4962]: I1003 14:48:34.242827 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd068190-fdd3-4767-a4b6-873321e9117e" path="/var/lib/kubelet/pods/dd068190-fdd3-4767-a4b6-873321e9117e/volumes"
Oct 03 14:48:36 crc kubenswrapper[4962]: I1003 14:48:36.348433 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gk7v6"
Oct 03 14:48:36 crc kubenswrapper[4962]: I1003 14:48:36.397296 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gk7v6"
Oct 03 14:48:36 crc kubenswrapper[4962]: I1003 14:48:36.591030 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gk7v6"]
Oct 03 14:48:37 crc kubenswrapper[4962]: I1003 14:48:37.698714 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gk7v6" podUID="6dec6b6c-058f-44ed-ab58-3b045b5d6acd" containerName="registry-server" containerID="cri-o://c5d41a1108b6b07783f4be56aedd42249c2f4d8423213fdc71e680c7befb38b1" gracePeriod=2
Oct 03 14:48:38 crc kubenswrapper[4962]: I1003 14:48:38.200890 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gk7v6"
Need to start a new one" pod="openshift-marketplace/redhat-operators-gk7v6" Oct 03 14:48:38 crc kubenswrapper[4962]: I1003 14:48:38.351564 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8s76\" (UniqueName: \"kubernetes.io/projected/6dec6b6c-058f-44ed-ab58-3b045b5d6acd-kube-api-access-h8s76\") pod \"6dec6b6c-058f-44ed-ab58-3b045b5d6acd\" (UID: \"6dec6b6c-058f-44ed-ab58-3b045b5d6acd\") " Oct 03 14:48:38 crc kubenswrapper[4962]: I1003 14:48:38.351726 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dec6b6c-058f-44ed-ab58-3b045b5d6acd-catalog-content\") pod \"6dec6b6c-058f-44ed-ab58-3b045b5d6acd\" (UID: \"6dec6b6c-058f-44ed-ab58-3b045b5d6acd\") " Oct 03 14:48:38 crc kubenswrapper[4962]: I1003 14:48:38.351901 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dec6b6c-058f-44ed-ab58-3b045b5d6acd-utilities\") pod \"6dec6b6c-058f-44ed-ab58-3b045b5d6acd\" (UID: \"6dec6b6c-058f-44ed-ab58-3b045b5d6acd\") " Oct 03 14:48:38 crc kubenswrapper[4962]: I1003 14:48:38.352965 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dec6b6c-058f-44ed-ab58-3b045b5d6acd-utilities" (OuterVolumeSpecName: "utilities") pod "6dec6b6c-058f-44ed-ab58-3b045b5d6acd" (UID: "6dec6b6c-058f-44ed-ab58-3b045b5d6acd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:48:38 crc kubenswrapper[4962]: I1003 14:48:38.372991 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dec6b6c-058f-44ed-ab58-3b045b5d6acd-kube-api-access-h8s76" (OuterVolumeSpecName: "kube-api-access-h8s76") pod "6dec6b6c-058f-44ed-ab58-3b045b5d6acd" (UID: "6dec6b6c-058f-44ed-ab58-3b045b5d6acd"). InnerVolumeSpecName "kube-api-access-h8s76". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:48:38 crc kubenswrapper[4962]: I1003 14:48:38.433477 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dec6b6c-058f-44ed-ab58-3b045b5d6acd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6dec6b6c-058f-44ed-ab58-3b045b5d6acd" (UID: "6dec6b6c-058f-44ed-ab58-3b045b5d6acd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:48:38 crc kubenswrapper[4962]: I1003 14:48:38.459416 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dec6b6c-058f-44ed-ab58-3b045b5d6acd-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 14:48:38 crc kubenswrapper[4962]: I1003 14:48:38.459450 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8s76\" (UniqueName: \"kubernetes.io/projected/6dec6b6c-058f-44ed-ab58-3b045b5d6acd-kube-api-access-h8s76\") on node \"crc\" DevicePath \"\"" Oct 03 14:48:38 crc kubenswrapper[4962]: I1003 14:48:38.459464 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dec6b6c-058f-44ed-ab58-3b045b5d6acd-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 14:48:38 crc kubenswrapper[4962]: I1003 14:48:38.708915 4962 generic.go:334] "Generic (PLEG): container finished" podID="6dec6b6c-058f-44ed-ab58-3b045b5d6acd" containerID="c5d41a1108b6b07783f4be56aedd42249c2f4d8423213fdc71e680c7befb38b1" exitCode=0 Oct 03 14:48:38 crc kubenswrapper[4962]: I1003 14:48:38.709017 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gk7v6" Oct 03 14:48:38 crc kubenswrapper[4962]: I1003 14:48:38.709023 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gk7v6" event={"ID":"6dec6b6c-058f-44ed-ab58-3b045b5d6acd","Type":"ContainerDied","Data":"c5d41a1108b6b07783f4be56aedd42249c2f4d8423213fdc71e680c7befb38b1"} Oct 03 14:48:38 crc kubenswrapper[4962]: I1003 14:48:38.709343 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gk7v6" event={"ID":"6dec6b6c-058f-44ed-ab58-3b045b5d6acd","Type":"ContainerDied","Data":"36b5054e7309319b43b5d9d6fecde5c7b85ecbc530d5c21d7ee83bdb1a837f58"} Oct 03 14:48:38 crc kubenswrapper[4962]: I1003 14:48:38.709378 4962 scope.go:117] "RemoveContainer" containerID="c5d41a1108b6b07783f4be56aedd42249c2f4d8423213fdc71e680c7befb38b1" Oct 03 14:48:38 crc kubenswrapper[4962]: I1003 14:48:38.738702 4962 scope.go:117] "RemoveContainer" containerID="041d77369c586fc07eaa10d9690ae4f788f9bb94cb36a770800d7ff19b78addd" Oct 03 14:48:38 crc kubenswrapper[4962]: I1003 14:48:38.756229 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gk7v6"] Oct 03 14:48:38 crc kubenswrapper[4962]: I1003 14:48:38.763690 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gk7v6"] Oct 03 14:48:38 crc kubenswrapper[4962]: I1003 14:48:38.781146 4962 scope.go:117] "RemoveContainer" containerID="e9a01000364ee0aa64dfeb032307221ea38494209fe3d92f58ba41da7f551da6" Oct 03 14:48:38 crc kubenswrapper[4962]: I1003 14:48:38.827193 4962 scope.go:117] "RemoveContainer" containerID="c5d41a1108b6b07783f4be56aedd42249c2f4d8423213fdc71e680c7befb38b1" Oct 03 14:48:38 crc kubenswrapper[4962]: E1003 14:48:38.829131 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5d41a1108b6b07783f4be56aedd42249c2f4d8423213fdc71e680c7befb38b1\": container with ID starting with c5d41a1108b6b07783f4be56aedd42249c2f4d8423213fdc71e680c7befb38b1 not found: ID does not exist" containerID="c5d41a1108b6b07783f4be56aedd42249c2f4d8423213fdc71e680c7befb38b1" Oct 03 14:48:38 crc kubenswrapper[4962]: I1003 14:48:38.829176 4962 
Oct 03 14:48:38 crc kubenswrapper[4962]: I1003 14:48:38.829203 4962 scope.go:117] "RemoveContainer" containerID="041d77369c586fc07eaa10d9690ae4f788f9bb94cb36a770800d7ff19b78addd"
Oct 03 14:48:38 crc kubenswrapper[4962]: E1003 14:48:38.829588 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"041d77369c586fc07eaa10d9690ae4f788f9bb94cb36a770800d7ff19b78addd\": container with ID starting with 041d77369c586fc07eaa10d9690ae4f788f9bb94cb36a770800d7ff19b78addd not found: ID does not exist" containerID="041d77369c586fc07eaa10d9690ae4f788f9bb94cb36a770800d7ff19b78addd"
Oct 03 14:48:38 crc kubenswrapper[4962]: I1003 14:48:38.829618 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"041d77369c586fc07eaa10d9690ae4f788f9bb94cb36a770800d7ff19b78addd"} err="failed to get container status \"041d77369c586fc07eaa10d9690ae4f788f9bb94cb36a770800d7ff19b78addd\": rpc error: code = NotFound desc = could not find container \"041d77369c586fc07eaa10d9690ae4f788f9bb94cb36a770800d7ff19b78addd\": container with ID starting with 041d77369c586fc07eaa10d9690ae4f788f9bb94cb36a770800d7ff19b78addd not found: ID does not exist"
Oct 03 14:48:38 crc kubenswrapper[4962]: I1003 14:48:38.829750 4962 scope.go:117] "RemoveContainer" containerID="e9a01000364ee0aa64dfeb032307221ea38494209fe3d92f58ba41da7f551da6"
Oct 03 14:48:38 crc kubenswrapper[4962]: E1003 14:48:38.830145 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9a01000364ee0aa64dfeb032307221ea38494209fe3d92f58ba41da7f551da6\": container with ID starting with e9a01000364ee0aa64dfeb032307221ea38494209fe3d92f58ba41da7f551da6 not found: ID does not exist" containerID="e9a01000364ee0aa64dfeb032307221ea38494209fe3d92f58ba41da7f551da6"
Oct 03 14:48:38 crc kubenswrapper[4962]: I1003 14:48:38.830174 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9a01000364ee0aa64dfeb032307221ea38494209fe3d92f58ba41da7f551da6"} err="failed to get container status \"e9a01000364ee0aa64dfeb032307221ea38494209fe3d92f58ba41da7f551da6\": rpc error: code = NotFound desc = could not find container \"e9a01000364ee0aa64dfeb032307221ea38494209fe3d92f58ba41da7f551da6\": container with ID starting with e9a01000364ee0aa64dfeb032307221ea38494209fe3d92f58ba41da7f551da6 not found: ID does not exist"
Oct 03 14:48:40 crc kubenswrapper[4962]: I1003 14:48:40.243118 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dec6b6c-058f-44ed-ab58-3b045b5d6acd" path="/var/lib/kubelet/pods/6dec6b6c-058f-44ed-ab58-3b045b5d6acd/volumes"
Oct 03 14:48:49 crc kubenswrapper[4962]: I1003 14:48:49.040201 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-cmwqv"]
Oct 03 14:48:49 crc kubenswrapper[4962]: I1003 14:48:49.048877 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-cmwqv"]
pods=["openstack/aodh-db-sync-cmwqv"] Oct 03 14:48:50 crc kubenswrapper[4962]: I1003 14:48:50.242537 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2880ced7-28e6-476c-94cb-72a9d7a7f33e" path="/var/lib/kubelet/pods/2880ced7-28e6-476c-94cb-72a9d7a7f33e/volumes" Oct 03 14:48:54 crc kubenswrapper[4962]: I1003 14:48:54.659990 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:48:54 crc kubenswrapper[4962]: I1003 14:48:54.660614 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:49:12 crc kubenswrapper[4962]: I1003 14:49:12.045376 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-brhld"] Oct 03 14:49:12 crc kubenswrapper[4962]: I1003 14:49:12.057403 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-brhld"] Oct 03 14:49:12 crc kubenswrapper[4962]: I1003 14:49:12.243634 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="484190c6-e084-4d4a-9f78-a1d6a7c0d814" path="/var/lib/kubelet/pods/484190c6-e084-4d4a-9f78-a1d6a7c0d814/volumes" Oct 03 14:49:13 crc kubenswrapper[4962]: I1003 14:49:13.490263 4962 scope.go:117] "RemoveContainer" containerID="b6c9a3d3cfd5d26a7f7f5cc088e51968efbfbfd86ed2c6329733b1c6d2bba66a" Oct 03 14:49:13 crc kubenswrapper[4962]: I1003 14:49:13.526296 4962 scope.go:117] "RemoveContainer" containerID="d04390e033c630a90a7c81819158cf7efb26397c2099d41f2ebc31161f2e23f5" Oct 03 14:49:13 crc kubenswrapper[4962]: I1003 14:49:13.574549 4962 scope.go:117] "RemoveContainer" containerID="6b44bb465493cfe4b0f061bd45787e36399016e18348496648a91b2c858e3def" Oct 03 14:49:13 crc kubenswrapper[4962]: I1003 14:49:13.619832 4962 scope.go:117] "RemoveContainer" containerID="071c2a146cfa412b6979bbe0fdeb312fd8456a1c9ce3044aecbd9e45a12300c1" Oct 03 14:49:22 crc kubenswrapper[4962]: I1003 14:49:22.026590 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-7b9d-account-create-87dh8"] Oct 03 14:49:22 crc kubenswrapper[4962]: I1003 14:49:22.037889 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-7b9d-account-create-87dh8"] Oct 03 14:49:22 crc kubenswrapper[4962]: I1003 14:49:22.239992 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ac5b7b7-db12-4364-9220-05368465d505" path="/var/lib/kubelet/pods/0ac5b7b7-db12-4364-9220-05368465d505/volumes" Oct 03 14:49:24 crc kubenswrapper[4962]: I1003 14:49:24.659554 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:49:24 crc kubenswrapper[4962]: I1003 14:49:24.659940 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:49:24 crc kubenswrapper[4962]: I1003 14:49:24.659982 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-46vck" Oct 03 14:49:24 crc kubenswrapper[4962]: I1003 14:49:24.660731 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4a67ceef9e53181de3d922469b2925159c518928ed34b001eb82ca061d24874e"} pod="openshift-machine-config-operator/machine-config-daemon-46vck" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 14:49:24 crc kubenswrapper[4962]: I1003 14:49:24.660788 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" containerID="cri-o://4a67ceef9e53181de3d922469b2925159c518928ed34b001eb82ca061d24874e" gracePeriod=600 Oct 03 14:49:24 crc kubenswrapper[4962]: E1003 14:49:24.779630 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:49:25 crc kubenswrapper[4962]: I1003 14:49:25.139553 4962 generic.go:334] "Generic (PLEG): container finished" podID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerID="4a67ceef9e53181de3d922469b2925159c518928ed34b001eb82ca061d24874e" exitCode=0 Oct 03 14:49:25 crc kubenswrapper[4962]: I1003 14:49:25.139592 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerDied","Data":"4a67ceef9e53181de3d922469b2925159c518928ed34b001eb82ca061d24874e"} Oct 03 14:49:25 crc kubenswrapper[4962]: I1003 14:49:25.139620 4962 scope.go:117] "RemoveContainer" containerID="2456cc5807e33f67351b02ef43a646828ba891bf3f306896eda2c0ba865fddfd" Oct 03 14:49:25 crc kubenswrapper[4962]: I1003 14:49:25.140468 4962 scope.go:117] "RemoveContainer" containerID="4a67ceef9e53181de3d922469b2925159c518928ed34b001eb82ca061d24874e" Oct 03 14:49:25 crc kubenswrapper[4962]: E1003 14:49:25.140772 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:49:36 crc kubenswrapper[4962]: I1003 14:49:36.036376 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-sggtq"] Oct 03 14:49:36 crc kubenswrapper[4962]: I1003 14:49:36.047585 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-sggtq"] Oct 03 14:49:36 crc kubenswrapper[4962]: I1003 14:49:36.242971 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="b183145f-a6ac-4f76-ac84-5c237e065e37" path="/var/lib/kubelet/pods/b183145f-a6ac-4f76-ac84-5c237e065e37/volumes" Oct 03 14:49:38 crc kubenswrapper[4962]: I1003 14:49:38.228507 4962 scope.go:117] "RemoveContainer" containerID="4a67ceef9e53181de3d922469b2925159c518928ed34b001eb82ca061d24874e" Oct 03 14:49:38 crc kubenswrapper[4962]: E1003 14:49:38.229449 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:49:50 crc kubenswrapper[4962]: I1003 14:49:50.228760 4962 scope.go:117] "RemoveContainer" containerID="4a67ceef9e53181de3d922469b2925159c518928ed34b001eb82ca061d24874e" Oct 03 14:49:50 crc kubenswrapper[4962]: E1003 14:49:50.229555 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:50:02 crc kubenswrapper[4962]: I1003 14:50:02.233625 4962 scope.go:117] "RemoveContainer" containerID="4a67ceef9e53181de3d922469b2925159c518928ed34b001eb82ca061d24874e" Oct 03 14:50:02 crc kubenswrapper[4962]: E1003 14:50:02.234694 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:50:13 crc kubenswrapper[4962]: I1003 14:50:13.776600 4962 scope.go:117] "RemoveContainer" containerID="50365993e76f92599473046ebfd35ae09eecadcc6a3f03181b68f0072f91f5f7" Oct 03 14:50:13 crc kubenswrapper[4962]: I1003 14:50:13.799963 4962 scope.go:117] "RemoveContainer" containerID="e3c408a217b48e113de3b5b561824c4222df905dae4b8334dce68e1ff9cc44c2" Oct 03 14:50:15 crc kubenswrapper[4962]: I1003 14:50:15.228142 4962 scope.go:117] "RemoveContainer" containerID="4a67ceef9e53181de3d922469b2925159c518928ed34b001eb82ca061d24874e" Oct 03 14:50:15 crc kubenswrapper[4962]: E1003 14:50:15.229095 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:50:29 crc kubenswrapper[4962]: I1003 14:50:29.226896 4962 scope.go:117] "RemoveContainer" containerID="4a67ceef9e53181de3d922469b2925159c518928ed34b001eb82ca061d24874e" Oct 03 14:50:29 crc kubenswrapper[4962]: E1003 14:50:29.227955 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:50:42 crc kubenswrapper[4962]: I1003 14:50:42.233779 4962 scope.go:117] "RemoveContainer" containerID="4a67ceef9e53181de3d922469b2925159c518928ed34b001eb82ca061d24874e" Oct 03 14:50:42 crc kubenswrapper[4962]: E1003 14:50:42.234656 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:50:54 crc kubenswrapper[4962]: I1003 14:50:54.227267 4962 scope.go:117] "RemoveContainer" containerID="4a67ceef9e53181de3d922469b2925159c518928ed34b001eb82ca061d24874e" Oct 03 14:50:54 crc kubenswrapper[4962]: E1003 14:50:54.228406 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:51:08 crc kubenswrapper[4962]: I1003 14:51:08.227993 4962 scope.go:117] "RemoveContainer" containerID="4a67ceef9e53181de3d922469b2925159c518928ed34b001eb82ca061d24874e" Oct 03 14:51:08 crc kubenswrapper[4962]: E1003 14:51:08.228872 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:51:21 crc kubenswrapper[4962]: I1003 14:51:21.229182 4962 scope.go:117] "RemoveContainer" containerID="4a67ceef9e53181de3d922469b2925159c518928ed34b001eb82ca061d24874e" Oct 03 14:51:21 crc kubenswrapper[4962]: E1003 14:51:21.230611 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:51:32 crc kubenswrapper[4962]: I1003 14:51:32.233823 4962 scope.go:117] "RemoveContainer" containerID="4a67ceef9e53181de3d922469b2925159c518928ed34b001eb82ca061d24874e" Oct 03 14:51:32 crc kubenswrapper[4962]: E1003 14:51:32.234627 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:51:44 crc kubenswrapper[4962]: I1003 14:51:44.229847 4962 scope.go:117] "RemoveContainer" containerID="4a67ceef9e53181de3d922469b2925159c518928ed34b001eb82ca061d24874e" Oct 03 14:51:44 crc kubenswrapper[4962]: E1003 14:51:44.230756 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:51:56 crc kubenswrapper[4962]: I1003 14:51:56.228356 4962 scope.go:117] "RemoveContainer" containerID="4a67ceef9e53181de3d922469b2925159c518928ed34b001eb82ca061d24874e" Oct 03 14:51:56 crc kubenswrapper[4962]: E1003 14:51:56.229225 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:52:11 crc kubenswrapper[4962]: I1003 14:52:11.227125 4962 scope.go:117] "RemoveContainer" containerID="4a67ceef9e53181de3d922469b2925159c518928ed34b001eb82ca061d24874e" Oct 03 14:52:11 crc kubenswrapper[4962]: E1003 14:52:11.227981 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:52:23 crc kubenswrapper[4962]: I1003 14:52:23.227083 4962 scope.go:117] "RemoveContainer" containerID="4a67ceef9e53181de3d922469b2925159c518928ed34b001eb82ca061d24874e" Oct 03 14:52:23 crc kubenswrapper[4962]: E1003 14:52:23.227889 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:52:31 crc kubenswrapper[4962]: I1003 14:52:31.929511 4962 generic.go:334] "Generic (PLEG): container finished" podID="429b4a0d-c23b-4c71-8844-7ec3cd482a4f" containerID="c9c086dde3c6b8c570f2df74b05540a9e4c4dc12e43ea9bf9bfe9bdccd5248d7" exitCode=0 Oct 03 14:52:31 crc kubenswrapper[4962]: I1003 14:52:31.929582 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bblfh" event={"ID":"429b4a0d-c23b-4c71-8844-7ec3cd482a4f","Type":"ContainerDied","Data":"c9c086dde3c6b8c570f2df74b05540a9e4c4dc12e43ea9bf9bfe9bdccd5248d7"} Oct 03 14:52:32 crc kubenswrapper[4962]: E1003 14:52:32.114175 4962 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" 
err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod429b4a0d_c23b_4c71_8844_7ec3cd482a4f.slice/crio-c9c086dde3c6b8c570f2df74b05540a9e4c4dc12e43ea9bf9bfe9bdccd5248d7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod429b4a0d_c23b_4c71_8844_7ec3cd482a4f.slice/crio-conmon-c9c086dde3c6b8c570f2df74b05540a9e4c4dc12e43ea9bf9bfe9bdccd5248d7.scope\": RecentStats: unable to find data in memory cache]" Oct 03 14:52:33 crc kubenswrapper[4962]: I1003 14:52:33.372946 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bblfh" Oct 03 14:52:33 crc kubenswrapper[4962]: I1003 14:52:33.428970 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/429b4a0d-c23b-4c71-8844-7ec3cd482a4f-ceph\") pod \"429b4a0d-c23b-4c71-8844-7ec3cd482a4f\" (UID: \"429b4a0d-c23b-4c71-8844-7ec3cd482a4f\") " Oct 03 14:52:33 crc kubenswrapper[4962]: I1003 14:52:33.429083 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqsf8\" (UniqueName: \"kubernetes.io/projected/429b4a0d-c23b-4c71-8844-7ec3cd482a4f-kube-api-access-mqsf8\") pod \"429b4a0d-c23b-4c71-8844-7ec3cd482a4f\" (UID: \"429b4a0d-c23b-4c71-8844-7ec3cd482a4f\") " Oct 03 14:52:33 crc kubenswrapper[4962]: I1003 14:52:33.429182 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/429b4a0d-c23b-4c71-8844-7ec3cd482a4f-inventory\") pod \"429b4a0d-c23b-4c71-8844-7ec3cd482a4f\" (UID: \"429b4a0d-c23b-4c71-8844-7ec3cd482a4f\") " Oct 03 14:52:33 crc kubenswrapper[4962]: I1003 14:52:33.429208 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429b4a0d-c23b-4c71-8844-7ec3cd482a4f-tripleo-cleanup-combined-ca-bundle\") pod \"429b4a0d-c23b-4c71-8844-7ec3cd482a4f\" (UID: \"429b4a0d-c23b-4c71-8844-7ec3cd482a4f\") " Oct 03 14:52:33 crc kubenswrapper[4962]: I1003 14:52:33.429237 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/429b4a0d-c23b-4c71-8844-7ec3cd482a4f-ssh-key\") pod \"429b4a0d-c23b-4c71-8844-7ec3cd482a4f\" (UID: \"429b4a0d-c23b-4c71-8844-7ec3cd482a4f\") " Oct 03 14:52:33 crc kubenswrapper[4962]: I1003 14:52:33.435240 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/429b4a0d-c23b-4c71-8844-7ec3cd482a4f-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "429b4a0d-c23b-4c71-8844-7ec3cd482a4f" (UID: "429b4a0d-c23b-4c71-8844-7ec3cd482a4f"). InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:52:33 crc kubenswrapper[4962]: I1003 14:52:33.435983 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/429b4a0d-c23b-4c71-8844-7ec3cd482a4f-kube-api-access-mqsf8" (OuterVolumeSpecName: "kube-api-access-mqsf8") pod "429b4a0d-c23b-4c71-8844-7ec3cd482a4f" (UID: "429b4a0d-c23b-4c71-8844-7ec3cd482a4f"). InnerVolumeSpecName "kube-api-access-mqsf8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:52:33 crc kubenswrapper[4962]: I1003 14:52:33.440371 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/429b4a0d-c23b-4c71-8844-7ec3cd482a4f-ceph" (OuterVolumeSpecName: "ceph") pod "429b4a0d-c23b-4c71-8844-7ec3cd482a4f" (UID: "429b4a0d-c23b-4c71-8844-7ec3cd482a4f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:52:33 crc kubenswrapper[4962]: I1003 14:52:33.468176 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/429b4a0d-c23b-4c71-8844-7ec3cd482a4f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "429b4a0d-c23b-4c71-8844-7ec3cd482a4f" (UID: "429b4a0d-c23b-4c71-8844-7ec3cd482a4f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:52:33 crc kubenswrapper[4962]: I1003 14:52:33.469316 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/429b4a0d-c23b-4c71-8844-7ec3cd482a4f-inventory" (OuterVolumeSpecName: "inventory") pod "429b4a0d-c23b-4c71-8844-7ec3cd482a4f" (UID: "429b4a0d-c23b-4c71-8844-7ec3cd482a4f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:52:33 crc kubenswrapper[4962]: I1003 14:52:33.532396 4962 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/429b4a0d-c23b-4c71-8844-7ec3cd482a4f-ceph\") on node \"crc\" DevicePath \"\"" Oct 03 14:52:33 crc kubenswrapper[4962]: I1003 14:52:33.532806 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqsf8\" (UniqueName: \"kubernetes.io/projected/429b4a0d-c23b-4c71-8844-7ec3cd482a4f-kube-api-access-mqsf8\") on node \"crc\" DevicePath \"\"" Oct 03 14:52:33 crc kubenswrapper[4962]: I1003 14:52:33.532907 4962 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/429b4a0d-c23b-4c71-8844-7ec3cd482a4f-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 14:52:33 crc kubenswrapper[4962]: I1003 14:52:33.532990 4962 reconciler_common.go:293] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429b4a0d-c23b-4c71-8844-7ec3cd482a4f-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:52:33 crc kubenswrapper[4962]: I1003 14:52:33.533075 4962 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/429b4a0d-c23b-4c71-8844-7ec3cd482a4f-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 14:52:33 crc kubenswrapper[4962]: I1003 14:52:33.947074 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bblfh" event={"ID":"429b4a0d-c23b-4c71-8844-7ec3cd482a4f","Type":"ContainerDied","Data":"8c04daab3a33ebdb47eb4d52336c003ed13c86b47e86cc771a68adb9d5f2c3e1"} Oct 03 14:52:33 crc kubenswrapper[4962]: I1003 14:52:33.947378 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c04daab3a33ebdb47eb4d52336c003ed13c86b47e86cc771a68adb9d5f2c3e1" Oct 03 14:52:33 crc kubenswrapper[4962]: I1003 14:52:33.947108 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bblfh" Oct 03 14:52:38 crc kubenswrapper[4962]: I1003 14:52:38.227821 4962 scope.go:117] "RemoveContainer" containerID="4a67ceef9e53181de3d922469b2925159c518928ed34b001eb82ca061d24874e" Oct 03 14:52:38 crc kubenswrapper[4962]: E1003 14:52:38.228481 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:52:39 crc kubenswrapper[4962]: I1003 14:52:39.341532 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-wvgsq"] Oct 03 14:52:39 crc kubenswrapper[4962]: E1003 14:52:39.342507 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dec6b6c-058f-44ed-ab58-3b045b5d6acd" containerName="extract-content" Oct 03 14:52:39 crc kubenswrapper[4962]: I1003 14:52:39.342522 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dec6b6c-058f-44ed-ab58-3b045b5d6acd" containerName="extract-content" Oct 03 14:52:39 crc kubenswrapper[4962]: E1003 14:52:39.342541 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="429b4a0d-c23b-4c71-8844-7ec3cd482a4f" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Oct 03 14:52:39 crc kubenswrapper[4962]: I1003 14:52:39.342548 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="429b4a0d-c23b-4c71-8844-7ec3cd482a4f" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Oct 03 14:52:39 crc kubenswrapper[4962]: E1003 14:52:39.342569 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dec6b6c-058f-44ed-ab58-3b045b5d6acd" containerName="registry-server" Oct 03 14:52:39 crc kubenswrapper[4962]: I1003 14:52:39.342575 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dec6b6c-058f-44ed-ab58-3b045b5d6acd" containerName="registry-server" Oct 03 14:52:39 crc kubenswrapper[4962]: E1003 14:52:39.342598 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dec6b6c-058f-44ed-ab58-3b045b5d6acd" containerName="extract-utilities" Oct 03 14:52:39 crc kubenswrapper[4962]: I1003 14:52:39.342604 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dec6b6c-058f-44ed-ab58-3b045b5d6acd" containerName="extract-utilities" Oct 03 14:52:39 crc kubenswrapper[4962]: I1003 14:52:39.342825 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dec6b6c-058f-44ed-ab58-3b045b5d6acd" containerName="registry-server" Oct 03 14:52:39 crc kubenswrapper[4962]: I1003 14:52:39.342842 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="429b4a0d-c23b-4c71-8844-7ec3cd482a4f" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Oct 03 14:52:39 crc kubenswrapper[4962]: I1003 14:52:39.343556 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-wvgsq" Oct 03 14:52:39 crc kubenswrapper[4962]: I1003 14:52:39.346704 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 14:52:39 crc kubenswrapper[4962]: I1003 14:52:39.346789 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 03 14:52:39 crc kubenswrapper[4962]: I1003 14:52:39.346875 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 03 14:52:39 crc kubenswrapper[4962]: I1003 14:52:39.347699 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-98wnm" Oct 03 14:52:39 crc kubenswrapper[4962]: I1003 14:52:39.364401 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-wvgsq"] Oct 03 14:52:39 crc kubenswrapper[4962]: I1003 14:52:39.462039 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/497c3d70-7959-49ea-9acb-d8bd2f301d0a-ceph\") pod \"bootstrap-openstack-openstack-cell1-wvgsq\" (UID: \"497c3d70-7959-49ea-9acb-d8bd2f301d0a\") " pod="openstack/bootstrap-openstack-openstack-cell1-wvgsq" Oct 03 14:52:39 crc kubenswrapper[4962]: I1003 14:52:39.462578 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/497c3d70-7959-49ea-9acb-d8bd2f301d0a-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-wvgsq\" (UID: \"497c3d70-7959-49ea-9acb-d8bd2f301d0a\") " pod="openstack/bootstrap-openstack-openstack-cell1-wvgsq" Oct 03 14:52:39 crc kubenswrapper[4962]: I1003 14:52:39.462625 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pgnd\" (UniqueName: \"kubernetes.io/projected/497c3d70-7959-49ea-9acb-d8bd2f301d0a-kube-api-access-4pgnd\") pod \"bootstrap-openstack-openstack-cell1-wvgsq\" (UID: \"497c3d70-7959-49ea-9acb-d8bd2f301d0a\") " pod="openstack/bootstrap-openstack-openstack-cell1-wvgsq" Oct 03 14:52:39 crc kubenswrapper[4962]: I1003 14:52:39.462671 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/497c3d70-7959-49ea-9acb-d8bd2f301d0a-inventory\") pod \"bootstrap-openstack-openstack-cell1-wvgsq\" (UID: \"497c3d70-7959-49ea-9acb-d8bd2f301d0a\") " pod="openstack/bootstrap-openstack-openstack-cell1-wvgsq" Oct 03 14:52:39 crc kubenswrapper[4962]: I1003 14:52:39.462708 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/497c3d70-7959-49ea-9acb-d8bd2f301d0a-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-wvgsq\" (UID: \"497c3d70-7959-49ea-9acb-d8bd2f301d0a\") " pod="openstack/bootstrap-openstack-openstack-cell1-wvgsq" Oct 03 14:52:39 crc kubenswrapper[4962]: I1003 14:52:39.565021 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/497c3d70-7959-49ea-9acb-d8bd2f301d0a-ceph\") pod \"bootstrap-openstack-openstack-cell1-wvgsq\" (UID: \"497c3d70-7959-49ea-9acb-d8bd2f301d0a\") " pod="openstack/bootstrap-openstack-openstack-cell1-wvgsq" Oct 03 14:52:39 crc kubenswrapper[4962]: I1003 
14:52:39.565187 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/497c3d70-7959-49ea-9acb-d8bd2f301d0a-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-wvgsq\" (UID: \"497c3d70-7959-49ea-9acb-d8bd2f301d0a\") " pod="openstack/bootstrap-openstack-openstack-cell1-wvgsq" Oct 03 14:52:39 crc kubenswrapper[4962]: I1003 14:52:39.565231 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pgnd\" (UniqueName: \"kubernetes.io/projected/497c3d70-7959-49ea-9acb-d8bd2f301d0a-kube-api-access-4pgnd\") pod \"bootstrap-openstack-openstack-cell1-wvgsq\" (UID: \"497c3d70-7959-49ea-9acb-d8bd2f301d0a\") " pod="openstack/bootstrap-openstack-openstack-cell1-wvgsq" Oct 03 14:52:39 crc kubenswrapper[4962]: I1003 14:52:39.565250 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/497c3d70-7959-49ea-9acb-d8bd2f301d0a-inventory\") pod \"bootstrap-openstack-openstack-cell1-wvgsq\" (UID: \"497c3d70-7959-49ea-9acb-d8bd2f301d0a\") " pod="openstack/bootstrap-openstack-openstack-cell1-wvgsq" Oct 03 14:52:39 crc kubenswrapper[4962]: I1003 14:52:39.565282 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/497c3d70-7959-49ea-9acb-d8bd2f301d0a-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-wvgsq\" (UID: \"497c3d70-7959-49ea-9acb-d8bd2f301d0a\") " pod="openstack/bootstrap-openstack-openstack-cell1-wvgsq" Oct 03 14:52:39 crc kubenswrapper[4962]: I1003 14:52:39.571430 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/497c3d70-7959-49ea-9acb-d8bd2f301d0a-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-wvgsq\" (UID: \"497c3d70-7959-49ea-9acb-d8bd2f301d0a\") " pod="openstack/bootstrap-openstack-openstack-cell1-wvgsq" Oct 03 14:52:39 crc kubenswrapper[4962]: I1003 14:52:39.571547 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/497c3d70-7959-49ea-9acb-d8bd2f301d0a-inventory\") pod \"bootstrap-openstack-openstack-cell1-wvgsq\" (UID: \"497c3d70-7959-49ea-9acb-d8bd2f301d0a\") " pod="openstack/bootstrap-openstack-openstack-cell1-wvgsq" Oct 03 14:52:39 crc kubenswrapper[4962]: I1003 14:52:39.571891 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/497c3d70-7959-49ea-9acb-d8bd2f301d0a-ceph\") pod \"bootstrap-openstack-openstack-cell1-wvgsq\" (UID: \"497c3d70-7959-49ea-9acb-d8bd2f301d0a\") " pod="openstack/bootstrap-openstack-openstack-cell1-wvgsq" Oct 03 14:52:39 crc kubenswrapper[4962]: I1003 14:52:39.574060 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/497c3d70-7959-49ea-9acb-d8bd2f301d0a-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-wvgsq\" (UID: \"497c3d70-7959-49ea-9acb-d8bd2f301d0a\") " pod="openstack/bootstrap-openstack-openstack-cell1-wvgsq" Oct 03 14:52:39 crc kubenswrapper[4962]: I1003 14:52:39.580212 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pgnd\" (UniqueName: \"kubernetes.io/projected/497c3d70-7959-49ea-9acb-d8bd2f301d0a-kube-api-access-4pgnd\") pod \"bootstrap-openstack-openstack-cell1-wvgsq\" (UID: 
\"497c3d70-7959-49ea-9acb-d8bd2f301d0a\") " pod="openstack/bootstrap-openstack-openstack-cell1-wvgsq" Oct 03 14:52:39 crc kubenswrapper[4962]: I1003 14:52:39.665096 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-wvgsq" Oct 03 14:52:40 crc kubenswrapper[4962]: I1003 14:52:40.281723 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 14:52:40 crc kubenswrapper[4962]: I1003 14:52:40.296458 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-wvgsq"] Oct 03 14:52:41 crc kubenswrapper[4962]: I1003 14:52:41.048106 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-wvgsq" event={"ID":"497c3d70-7959-49ea-9acb-d8bd2f301d0a","Type":"ContainerStarted","Data":"68dc43899e84ff0a96072a3c2dd7f7a83b7fa02aecb48f39e709b5edd023e3ae"} Oct 03 14:52:41 crc kubenswrapper[4962]: I1003 14:52:41.048472 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-wvgsq" event={"ID":"497c3d70-7959-49ea-9acb-d8bd2f301d0a","Type":"ContainerStarted","Data":"09c155ea7f772050ae938aa8b97355cc816ac56b4ec24f88504bb88477188d2c"} Oct 03 14:52:41 crc kubenswrapper[4962]: I1003 14:52:41.086236 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-cell1-wvgsq" podStartSLOduration=1.883565983 podStartE2EDuration="2.086213778s" podCreationTimestamp="2025-10-03 14:52:39 +0000 UTC" firstStartedPulling="2025-10-03 14:52:40.278738072 +0000 UTC m=+7368.682635907" lastFinishedPulling="2025-10-03 14:52:40.481385867 +0000 UTC m=+7368.885283702" observedRunningTime="2025-10-03 14:52:41.077915377 +0000 UTC m=+7369.481813222" watchObservedRunningTime="2025-10-03 14:52:41.086213778 +0000 UTC m=+7369.490111613" Oct 03 14:52:49 crc kubenswrapper[4962]: I1003 14:52:49.228113 4962 scope.go:117] "RemoveContainer" containerID="4a67ceef9e53181de3d922469b2925159c518928ed34b001eb82ca061d24874e" Oct 03 14:52:49 crc kubenswrapper[4962]: E1003 14:52:49.229194 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:53:03 crc kubenswrapper[4962]: I1003 14:53:03.226843 4962 scope.go:117] "RemoveContainer" containerID="4a67ceef9e53181de3d922469b2925159c518928ed34b001eb82ca061d24874e" Oct 03 14:53:03 crc kubenswrapper[4962]: E1003 14:53:03.227675 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:53:16 crc kubenswrapper[4962]: I1003 14:53:16.228191 4962 scope.go:117] "RemoveContainer" containerID="4a67ceef9e53181de3d922469b2925159c518928ed34b001eb82ca061d24874e" Oct 03 14:53:16 crc kubenswrapper[4962]: E1003 14:53:16.229153 4962 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:53:29 crc kubenswrapper[4962]: I1003 14:53:29.227162 4962 scope.go:117] "RemoveContainer" containerID="4a67ceef9e53181de3d922469b2925159c518928ed34b001eb82ca061d24874e" Oct 03 14:53:29 crc kubenswrapper[4962]: E1003 14:53:29.227965 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:53:44 crc kubenswrapper[4962]: I1003 14:53:44.234614 4962 scope.go:117] "RemoveContainer" containerID="4a67ceef9e53181de3d922469b2925159c518928ed34b001eb82ca061d24874e" Oct 03 14:53:44 crc kubenswrapper[4962]: E1003 14:53:44.235270 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:53:59 crc kubenswrapper[4962]: I1003 14:53:59.227535 4962 scope.go:117] "RemoveContainer" containerID="4a67ceef9e53181de3d922469b2925159c518928ed34b001eb82ca061d24874e" Oct 03 14:53:59 crc kubenswrapper[4962]: E1003 14:53:59.228378 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:54:11 crc kubenswrapper[4962]: I1003 14:54:11.227795 4962 scope.go:117] "RemoveContainer" containerID="4a67ceef9e53181de3d922469b2925159c518928ed34b001eb82ca061d24874e" Oct 03 14:54:11 crc kubenswrapper[4962]: E1003 14:54:11.228866 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 14:54:25 crc kubenswrapper[4962]: I1003 14:54:25.227559 4962 scope.go:117] "RemoveContainer" containerID="4a67ceef9e53181de3d922469b2925159c518928ed34b001eb82ca061d24874e" Oct 03 14:54:26 crc kubenswrapper[4962]: I1003 14:54:26.070860 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" 
event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerStarted","Data":"4a92d28487ead87bf551d56d7eaa01560a202df6439181d7e00d2ba06e2d0351"} Oct 03 14:55:07 crc kubenswrapper[4962]: I1003 14:55:07.036915 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kg2sk"] Oct 03 14:55:07 crc kubenswrapper[4962]: I1003 14:55:07.040550 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kg2sk" Oct 03 14:55:07 crc kubenswrapper[4962]: I1003 14:55:07.046015 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kg2sk"] Oct 03 14:55:07 crc kubenswrapper[4962]: I1003 14:55:07.106979 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc6fm\" (UniqueName: \"kubernetes.io/projected/71cb91ac-95ba-464f-af62-94cdc23e255a-kube-api-access-vc6fm\") pod \"certified-operators-kg2sk\" (UID: \"71cb91ac-95ba-464f-af62-94cdc23e255a\") " pod="openshift-marketplace/certified-operators-kg2sk" Oct 03 14:55:07 crc kubenswrapper[4962]: I1003 14:55:07.107114 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71cb91ac-95ba-464f-af62-94cdc23e255a-catalog-content\") pod \"certified-operators-kg2sk\" (UID: \"71cb91ac-95ba-464f-af62-94cdc23e255a\") " pod="openshift-marketplace/certified-operators-kg2sk" Oct 03 14:55:07 crc kubenswrapper[4962]: I1003 14:55:07.107147 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71cb91ac-95ba-464f-af62-94cdc23e255a-utilities\") pod \"certified-operators-kg2sk\" (UID: \"71cb91ac-95ba-464f-af62-94cdc23e255a\") " pod="openshift-marketplace/certified-operators-kg2sk" Oct 03 14:55:07 crc kubenswrapper[4962]: I1003 14:55:07.209077 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71cb91ac-95ba-464f-af62-94cdc23e255a-catalog-content\") pod \"certified-operators-kg2sk\" (UID: \"71cb91ac-95ba-464f-af62-94cdc23e255a\") " pod="openshift-marketplace/certified-operators-kg2sk" Oct 03 14:55:07 crc kubenswrapper[4962]: I1003 14:55:07.209129 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71cb91ac-95ba-464f-af62-94cdc23e255a-utilities\") pod \"certified-operators-kg2sk\" (UID: \"71cb91ac-95ba-464f-af62-94cdc23e255a\") " pod="openshift-marketplace/certified-operators-kg2sk" Oct 03 14:55:07 crc kubenswrapper[4962]: I1003 14:55:07.209285 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vc6fm\" (UniqueName: \"kubernetes.io/projected/71cb91ac-95ba-464f-af62-94cdc23e255a-kube-api-access-vc6fm\") pod \"certified-operators-kg2sk\" (UID: \"71cb91ac-95ba-464f-af62-94cdc23e255a\") " pod="openshift-marketplace/certified-operators-kg2sk" Oct 03 14:55:07 crc kubenswrapper[4962]: I1003 14:55:07.209850 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71cb91ac-95ba-464f-af62-94cdc23e255a-catalog-content\") pod \"certified-operators-kg2sk\" (UID: \"71cb91ac-95ba-464f-af62-94cdc23e255a\") " pod="openshift-marketplace/certified-operators-kg2sk" Oct 03 14:55:07 crc 
kubenswrapper[4962]: I1003 14:55:07.210055 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71cb91ac-95ba-464f-af62-94cdc23e255a-utilities\") pod \"certified-operators-kg2sk\" (UID: \"71cb91ac-95ba-464f-af62-94cdc23e255a\") " pod="openshift-marketplace/certified-operators-kg2sk" Oct 03 14:55:07 crc kubenswrapper[4962]: I1003 14:55:07.233289 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc6fm\" (UniqueName: \"kubernetes.io/projected/71cb91ac-95ba-464f-af62-94cdc23e255a-kube-api-access-vc6fm\") pod \"certified-operators-kg2sk\" (UID: \"71cb91ac-95ba-464f-af62-94cdc23e255a\") " pod="openshift-marketplace/certified-operators-kg2sk" Oct 03 14:55:07 crc kubenswrapper[4962]: I1003 14:55:07.362879 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kg2sk" Oct 03 14:55:07 crc kubenswrapper[4962]: I1003 14:55:07.929082 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kg2sk"] Oct 03 14:55:08 crc kubenswrapper[4962]: I1003 14:55:08.503652 4962 generic.go:334] "Generic (PLEG): container finished" podID="71cb91ac-95ba-464f-af62-94cdc23e255a" containerID="65e7bb1a35b918a7a41d948d958ea006575a154caf195d1d41d761f76a857b8a" exitCode=0 Oct 03 14:55:08 crc kubenswrapper[4962]: I1003 14:55:08.503763 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kg2sk" event={"ID":"71cb91ac-95ba-464f-af62-94cdc23e255a","Type":"ContainerDied","Data":"65e7bb1a35b918a7a41d948d958ea006575a154caf195d1d41d761f76a857b8a"} Oct 03 14:55:08 crc kubenswrapper[4962]: I1003 14:55:08.503900 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kg2sk" event={"ID":"71cb91ac-95ba-464f-af62-94cdc23e255a","Type":"ContainerStarted","Data":"33837c40595c568382297e2db885e40eea4252ec154e7c8d492499e91c69b710"} Oct 03 14:55:09 crc kubenswrapper[4962]: I1003 14:55:09.513941 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kg2sk" event={"ID":"71cb91ac-95ba-464f-af62-94cdc23e255a","Type":"ContainerStarted","Data":"773a803eec754102027852167ddc13f688eece4d9da106a3ab22aaf62aac0c35"} Oct 03 14:55:10 crc kubenswrapper[4962]: I1003 14:55:10.525354 4962 generic.go:334] "Generic (PLEG): container finished" podID="71cb91ac-95ba-464f-af62-94cdc23e255a" containerID="773a803eec754102027852167ddc13f688eece4d9da106a3ab22aaf62aac0c35" exitCode=0 Oct 03 14:55:10 crc kubenswrapper[4962]: I1003 14:55:10.527680 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kg2sk" event={"ID":"71cb91ac-95ba-464f-af62-94cdc23e255a","Type":"ContainerDied","Data":"773a803eec754102027852167ddc13f688eece4d9da106a3ab22aaf62aac0c35"} Oct 03 14:55:11 crc kubenswrapper[4962]: I1003 14:55:11.539160 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kg2sk" event={"ID":"71cb91ac-95ba-464f-af62-94cdc23e255a","Type":"ContainerStarted","Data":"38fac42f9322a5b51fcf6a6a25b9cb3a916bcb42e4cf5cb71f1db72ee1a84ed8"} Oct 03 14:55:11 crc kubenswrapper[4962]: I1003 14:55:11.556880 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kg2sk" podStartSLOduration=2.103002158 podStartE2EDuration="4.556857734s" podCreationTimestamp="2025-10-03 14:55:07 +0000 
UTC" firstStartedPulling="2025-10-03 14:55:08.505494192 +0000 UTC m=+7516.909392027" lastFinishedPulling="2025-10-03 14:55:10.959349768 +0000 UTC m=+7519.363247603" observedRunningTime="2025-10-03 14:55:11.553991527 +0000 UTC m=+7519.957889362" watchObservedRunningTime="2025-10-03 14:55:11.556857734 +0000 UTC m=+7519.960755579" Oct 03 14:55:17 crc kubenswrapper[4962]: I1003 14:55:17.363588 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kg2sk" Oct 03 14:55:17 crc kubenswrapper[4962]: I1003 14:55:17.364216 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kg2sk" Oct 03 14:55:17 crc kubenswrapper[4962]: I1003 14:55:17.425750 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kg2sk" Oct 03 14:55:17 crc kubenswrapper[4962]: I1003 14:55:17.638265 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kg2sk" Oct 03 14:55:17 crc kubenswrapper[4962]: I1003 14:55:17.680576 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kg2sk"] Oct 03 14:55:19 crc kubenswrapper[4962]: I1003 14:55:19.609542 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kg2sk" podUID="71cb91ac-95ba-464f-af62-94cdc23e255a" containerName="registry-server" containerID="cri-o://38fac42f9322a5b51fcf6a6a25b9cb3a916bcb42e4cf5cb71f1db72ee1a84ed8" gracePeriod=2 Oct 03 14:55:20 crc kubenswrapper[4962]: I1003 14:55:20.118054 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kg2sk" Oct 03 14:55:20 crc kubenswrapper[4962]: I1003 14:55:20.209731 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vc6fm\" (UniqueName: \"kubernetes.io/projected/71cb91ac-95ba-464f-af62-94cdc23e255a-kube-api-access-vc6fm\") pod \"71cb91ac-95ba-464f-af62-94cdc23e255a\" (UID: \"71cb91ac-95ba-464f-af62-94cdc23e255a\") " Oct 03 14:55:20 crc kubenswrapper[4962]: I1003 14:55:20.210004 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71cb91ac-95ba-464f-af62-94cdc23e255a-catalog-content\") pod \"71cb91ac-95ba-464f-af62-94cdc23e255a\" (UID: \"71cb91ac-95ba-464f-af62-94cdc23e255a\") " Oct 03 14:55:20 crc kubenswrapper[4962]: I1003 14:55:20.210135 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71cb91ac-95ba-464f-af62-94cdc23e255a-utilities\") pod \"71cb91ac-95ba-464f-af62-94cdc23e255a\" (UID: \"71cb91ac-95ba-464f-af62-94cdc23e255a\") " Oct 03 14:55:20 crc kubenswrapper[4962]: I1003 14:55:20.210902 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71cb91ac-95ba-464f-af62-94cdc23e255a-utilities" (OuterVolumeSpecName: "utilities") pod "71cb91ac-95ba-464f-af62-94cdc23e255a" (UID: "71cb91ac-95ba-464f-af62-94cdc23e255a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:55:20 crc kubenswrapper[4962]: I1003 14:55:20.216513 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71cb91ac-95ba-464f-af62-94cdc23e255a-kube-api-access-vc6fm" (OuterVolumeSpecName: "kube-api-access-vc6fm") pod "71cb91ac-95ba-464f-af62-94cdc23e255a" (UID: "71cb91ac-95ba-464f-af62-94cdc23e255a"). InnerVolumeSpecName "kube-api-access-vc6fm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:55:20 crc kubenswrapper[4962]: I1003 14:55:20.254731 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71cb91ac-95ba-464f-af62-94cdc23e255a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "71cb91ac-95ba-464f-af62-94cdc23e255a" (UID: "71cb91ac-95ba-464f-af62-94cdc23e255a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:55:20 crc kubenswrapper[4962]: I1003 14:55:20.311680 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71cb91ac-95ba-464f-af62-94cdc23e255a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 14:55:20 crc kubenswrapper[4962]: I1003 14:55:20.311712 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71cb91ac-95ba-464f-af62-94cdc23e255a-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 14:55:20 crc kubenswrapper[4962]: I1003 14:55:20.311722 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vc6fm\" (UniqueName: \"kubernetes.io/projected/71cb91ac-95ba-464f-af62-94cdc23e255a-kube-api-access-vc6fm\") on node \"crc\" DevicePath \"\"" Oct 03 14:55:20 crc kubenswrapper[4962]: I1003 14:55:20.620450 4962 generic.go:334] "Generic (PLEG): container finished" podID="71cb91ac-95ba-464f-af62-94cdc23e255a" containerID="38fac42f9322a5b51fcf6a6a25b9cb3a916bcb42e4cf5cb71f1db72ee1a84ed8" exitCode=0 Oct 03 14:55:20 crc kubenswrapper[4962]: I1003 14:55:20.620524 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kg2sk" Oct 03 14:55:20 crc kubenswrapper[4962]: I1003 14:55:20.622383 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kg2sk" event={"ID":"71cb91ac-95ba-464f-af62-94cdc23e255a","Type":"ContainerDied","Data":"38fac42f9322a5b51fcf6a6a25b9cb3a916bcb42e4cf5cb71f1db72ee1a84ed8"} Oct 03 14:55:20 crc kubenswrapper[4962]: I1003 14:55:20.622567 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kg2sk" event={"ID":"71cb91ac-95ba-464f-af62-94cdc23e255a","Type":"ContainerDied","Data":"33837c40595c568382297e2db885e40eea4252ec154e7c8d492499e91c69b710"} Oct 03 14:55:20 crc kubenswrapper[4962]: I1003 14:55:20.622605 4962 scope.go:117] "RemoveContainer" containerID="38fac42f9322a5b51fcf6a6a25b9cb3a916bcb42e4cf5cb71f1db72ee1a84ed8" Oct 03 14:55:20 crc kubenswrapper[4962]: I1003 14:55:20.648824 4962 scope.go:117] "RemoveContainer" containerID="773a803eec754102027852167ddc13f688eece4d9da106a3ab22aaf62aac0c35" Oct 03 14:55:20 crc kubenswrapper[4962]: I1003 14:55:20.662926 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kg2sk"] Oct 03 14:55:20 crc kubenswrapper[4962]: I1003 14:55:20.672880 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kg2sk"] Oct 03 14:55:20 crc kubenswrapper[4962]: I1003 14:55:20.678577 4962 scope.go:117] "RemoveContainer" containerID="65e7bb1a35b918a7a41d948d958ea006575a154caf195d1d41d761f76a857b8a" Oct 03 14:55:20 crc kubenswrapper[4962]: I1003 14:55:20.713261 4962 scope.go:117] "RemoveContainer" containerID="38fac42f9322a5b51fcf6a6a25b9cb3a916bcb42e4cf5cb71f1db72ee1a84ed8" Oct 03 14:55:20 crc kubenswrapper[4962]: E1003 14:55:20.713731 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38fac42f9322a5b51fcf6a6a25b9cb3a916bcb42e4cf5cb71f1db72ee1a84ed8\": container with ID starting with 38fac42f9322a5b51fcf6a6a25b9cb3a916bcb42e4cf5cb71f1db72ee1a84ed8 not found: ID does not exist" containerID="38fac42f9322a5b51fcf6a6a25b9cb3a916bcb42e4cf5cb71f1db72ee1a84ed8" Oct 03 14:55:20 crc kubenswrapper[4962]: I1003 14:55:20.713856 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38fac42f9322a5b51fcf6a6a25b9cb3a916bcb42e4cf5cb71f1db72ee1a84ed8"} err="failed to get container status \"38fac42f9322a5b51fcf6a6a25b9cb3a916bcb42e4cf5cb71f1db72ee1a84ed8\": rpc error: code = NotFound desc = could not find container \"38fac42f9322a5b51fcf6a6a25b9cb3a916bcb42e4cf5cb71f1db72ee1a84ed8\": container with ID starting with 38fac42f9322a5b51fcf6a6a25b9cb3a916bcb42e4cf5cb71f1db72ee1a84ed8 not found: ID does not exist" Oct 03 14:55:20 crc kubenswrapper[4962]: I1003 14:55:20.713949 4962 scope.go:117] "RemoveContainer" containerID="773a803eec754102027852167ddc13f688eece4d9da106a3ab22aaf62aac0c35" Oct 03 14:55:20 crc kubenswrapper[4962]: E1003 14:55:20.714346 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"773a803eec754102027852167ddc13f688eece4d9da106a3ab22aaf62aac0c35\": container with ID starting with 773a803eec754102027852167ddc13f688eece4d9da106a3ab22aaf62aac0c35 not found: ID does not exist" containerID="773a803eec754102027852167ddc13f688eece4d9da106a3ab22aaf62aac0c35" Oct 03 14:55:20 crc kubenswrapper[4962]: I1003 14:55:20.714369 4962 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"773a803eec754102027852167ddc13f688eece4d9da106a3ab22aaf62aac0c35"} err="failed to get container status \"773a803eec754102027852167ddc13f688eece4d9da106a3ab22aaf62aac0c35\": rpc error: code = NotFound desc = could not find container \"773a803eec754102027852167ddc13f688eece4d9da106a3ab22aaf62aac0c35\": container with ID starting with 773a803eec754102027852167ddc13f688eece4d9da106a3ab22aaf62aac0c35 not found: ID does not exist" Oct 03 14:55:20 crc kubenswrapper[4962]: I1003 14:55:20.714383 4962 scope.go:117] "RemoveContainer" containerID="65e7bb1a35b918a7a41d948d958ea006575a154caf195d1d41d761f76a857b8a" Oct 03 14:55:20 crc kubenswrapper[4962]: E1003 14:55:20.714619 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65e7bb1a35b918a7a41d948d958ea006575a154caf195d1d41d761f76a857b8a\": container with ID starting with 65e7bb1a35b918a7a41d948d958ea006575a154caf195d1d41d761f76a857b8a not found: ID does not exist" containerID="65e7bb1a35b918a7a41d948d958ea006575a154caf195d1d41d761f76a857b8a" Oct 03 14:55:20 crc kubenswrapper[4962]: I1003 14:55:20.714731 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65e7bb1a35b918a7a41d948d958ea006575a154caf195d1d41d761f76a857b8a"} err="failed to get container status \"65e7bb1a35b918a7a41d948d958ea006575a154caf195d1d41d761f76a857b8a\": rpc error: code = NotFound desc = could not find container \"65e7bb1a35b918a7a41d948d958ea006575a154caf195d1d41d761f76a857b8a\": container with ID starting with 65e7bb1a35b918a7a41d948d958ea006575a154caf195d1d41d761f76a857b8a not found: ID does not exist" Oct 03 14:55:22 crc kubenswrapper[4962]: I1003 14:55:22.239874 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71cb91ac-95ba-464f-af62-94cdc23e255a" path="/var/lib/kubelet/pods/71cb91ac-95ba-464f-af62-94cdc23e255a/volumes" Oct 03 14:55:47 crc kubenswrapper[4962]: I1003 14:55:47.941365 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pgwdz"] Oct 03 14:55:47 crc kubenswrapper[4962]: E1003 14:55:47.942398 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71cb91ac-95ba-464f-af62-94cdc23e255a" containerName="registry-server" Oct 03 14:55:47 crc kubenswrapper[4962]: I1003 14:55:47.942413 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="71cb91ac-95ba-464f-af62-94cdc23e255a" containerName="registry-server" Oct 03 14:55:47 crc kubenswrapper[4962]: E1003 14:55:47.942453 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71cb91ac-95ba-464f-af62-94cdc23e255a" containerName="extract-content" Oct 03 14:55:47 crc kubenswrapper[4962]: I1003 14:55:47.942460 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="71cb91ac-95ba-464f-af62-94cdc23e255a" containerName="extract-content" Oct 03 14:55:47 crc kubenswrapper[4962]: E1003 14:55:47.942476 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71cb91ac-95ba-464f-af62-94cdc23e255a" containerName="extract-utilities" Oct 03 14:55:47 crc kubenswrapper[4962]: I1003 14:55:47.942482 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="71cb91ac-95ba-464f-af62-94cdc23e255a" containerName="extract-utilities" Oct 03 14:55:47 crc kubenswrapper[4962]: I1003 14:55:47.942712 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="71cb91ac-95ba-464f-af62-94cdc23e255a" 
containerName="registry-server" Oct 03 14:55:47 crc kubenswrapper[4962]: I1003 14:55:47.944351 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pgwdz" Oct 03 14:55:47 crc kubenswrapper[4962]: I1003 14:55:47.956718 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pgwdz"] Oct 03 14:55:48 crc kubenswrapper[4962]: I1003 14:55:48.055344 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f131e30-c2c9-411c-9bff-620970ba0a23-catalog-content\") pod \"community-operators-pgwdz\" (UID: \"0f131e30-c2c9-411c-9bff-620970ba0a23\") " pod="openshift-marketplace/community-operators-pgwdz" Oct 03 14:55:48 crc kubenswrapper[4962]: I1003 14:55:48.055439 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f131e30-c2c9-411c-9bff-620970ba0a23-utilities\") pod \"community-operators-pgwdz\" (UID: \"0f131e30-c2c9-411c-9bff-620970ba0a23\") " pod="openshift-marketplace/community-operators-pgwdz" Oct 03 14:55:48 crc kubenswrapper[4962]: I1003 14:55:48.055725 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv2h2\" (UniqueName: \"kubernetes.io/projected/0f131e30-c2c9-411c-9bff-620970ba0a23-kube-api-access-hv2h2\") pod \"community-operators-pgwdz\" (UID: \"0f131e30-c2c9-411c-9bff-620970ba0a23\") " pod="openshift-marketplace/community-operators-pgwdz" Oct 03 14:55:48 crc kubenswrapper[4962]: I1003 14:55:48.157810 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f131e30-c2c9-411c-9bff-620970ba0a23-utilities\") pod \"community-operators-pgwdz\" (UID: \"0f131e30-c2c9-411c-9bff-620970ba0a23\") " pod="openshift-marketplace/community-operators-pgwdz" Oct 03 14:55:48 crc kubenswrapper[4962]: I1003 14:55:48.157902 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv2h2\" (UniqueName: \"kubernetes.io/projected/0f131e30-c2c9-411c-9bff-620970ba0a23-kube-api-access-hv2h2\") pod \"community-operators-pgwdz\" (UID: \"0f131e30-c2c9-411c-9bff-620970ba0a23\") " pod="openshift-marketplace/community-operators-pgwdz" Oct 03 14:55:48 crc kubenswrapper[4962]: I1003 14:55:48.158063 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f131e30-c2c9-411c-9bff-620970ba0a23-catalog-content\") pod \"community-operators-pgwdz\" (UID: \"0f131e30-c2c9-411c-9bff-620970ba0a23\") " pod="openshift-marketplace/community-operators-pgwdz" Oct 03 14:55:48 crc kubenswrapper[4962]: I1003 14:55:48.158551 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f131e30-c2c9-411c-9bff-620970ba0a23-utilities\") pod \"community-operators-pgwdz\" (UID: \"0f131e30-c2c9-411c-9bff-620970ba0a23\") " pod="openshift-marketplace/community-operators-pgwdz" Oct 03 14:55:48 crc kubenswrapper[4962]: I1003 14:55:48.158573 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f131e30-c2c9-411c-9bff-620970ba0a23-catalog-content\") pod \"community-operators-pgwdz\" (UID: \"0f131e30-c2c9-411c-9bff-620970ba0a23\") " 
pod="openshift-marketplace/community-operators-pgwdz" Oct 03 14:55:48 crc kubenswrapper[4962]: I1003 14:55:48.176296 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv2h2\" (UniqueName: \"kubernetes.io/projected/0f131e30-c2c9-411c-9bff-620970ba0a23-kube-api-access-hv2h2\") pod \"community-operators-pgwdz\" (UID: \"0f131e30-c2c9-411c-9bff-620970ba0a23\") " pod="openshift-marketplace/community-operators-pgwdz" Oct 03 14:55:48 crc kubenswrapper[4962]: I1003 14:55:48.305798 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pgwdz" Oct 03 14:55:48 crc kubenswrapper[4962]: I1003 14:55:48.775243 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pgwdz"] Oct 03 14:55:48 crc kubenswrapper[4962]: I1003 14:55:48.904193 4962 generic.go:334] "Generic (PLEG): container finished" podID="497c3d70-7959-49ea-9acb-d8bd2f301d0a" containerID="68dc43899e84ff0a96072a3c2dd7f7a83b7fa02aecb48f39e709b5edd023e3ae" exitCode=0 Oct 03 14:55:48 crc kubenswrapper[4962]: I1003 14:55:48.904263 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-wvgsq" event={"ID":"497c3d70-7959-49ea-9acb-d8bd2f301d0a","Type":"ContainerDied","Data":"68dc43899e84ff0a96072a3c2dd7f7a83b7fa02aecb48f39e709b5edd023e3ae"} Oct 03 14:55:48 crc kubenswrapper[4962]: I1003 14:55:48.907922 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pgwdz" event={"ID":"0f131e30-c2c9-411c-9bff-620970ba0a23","Type":"ContainerStarted","Data":"be3089a2143ca98ca1fd3444624b21e596a41c7777f93f265ab00346e6918c58"} Oct 03 14:55:49 crc kubenswrapper[4962]: I1003 14:55:49.918577 4962 generic.go:334] "Generic (PLEG): container finished" podID="0f131e30-c2c9-411c-9bff-620970ba0a23" containerID="d6bddfdef4837b0d100c80147a6f659174f5db0259cd255e12c391d22aed1c09" exitCode=0 Oct 03 14:55:49 crc kubenswrapper[4962]: I1003 14:55:49.918664 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pgwdz" event={"ID":"0f131e30-c2c9-411c-9bff-620970ba0a23","Type":"ContainerDied","Data":"d6bddfdef4837b0d100c80147a6f659174f5db0259cd255e12c391d22aed1c09"} Oct 03 14:55:50 crc kubenswrapper[4962]: I1003 14:55:50.320806 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-wvgsq" Oct 03 14:55:50 crc kubenswrapper[4962]: I1003 14:55:50.415322 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/497c3d70-7959-49ea-9acb-d8bd2f301d0a-inventory\") pod \"497c3d70-7959-49ea-9acb-d8bd2f301d0a\" (UID: \"497c3d70-7959-49ea-9acb-d8bd2f301d0a\") " Oct 03 14:55:50 crc kubenswrapper[4962]: I1003 14:55:50.416035 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pgnd\" (UniqueName: \"kubernetes.io/projected/497c3d70-7959-49ea-9acb-d8bd2f301d0a-kube-api-access-4pgnd\") pod \"497c3d70-7959-49ea-9acb-d8bd2f301d0a\" (UID: \"497c3d70-7959-49ea-9acb-d8bd2f301d0a\") " Oct 03 14:55:50 crc kubenswrapper[4962]: I1003 14:55:50.416069 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/497c3d70-7959-49ea-9acb-d8bd2f301d0a-ceph\") pod \"497c3d70-7959-49ea-9acb-d8bd2f301d0a\" (UID: \"497c3d70-7959-49ea-9acb-d8bd2f301d0a\") " Oct 03 14:55:50 crc kubenswrapper[4962]: I1003 14:55:50.416199 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/497c3d70-7959-49ea-9acb-d8bd2f301d0a-ssh-key\") pod \"497c3d70-7959-49ea-9acb-d8bd2f301d0a\" (UID: \"497c3d70-7959-49ea-9acb-d8bd2f301d0a\") " Oct 03 14:55:50 crc kubenswrapper[4962]: I1003 14:55:50.416462 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/497c3d70-7959-49ea-9acb-d8bd2f301d0a-bootstrap-combined-ca-bundle\") pod \"497c3d70-7959-49ea-9acb-d8bd2f301d0a\" (UID: \"497c3d70-7959-49ea-9acb-d8bd2f301d0a\") " Oct 03 14:55:50 crc kubenswrapper[4962]: I1003 14:55:50.422175 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/497c3d70-7959-49ea-9acb-d8bd2f301d0a-ceph" (OuterVolumeSpecName: "ceph") pod "497c3d70-7959-49ea-9acb-d8bd2f301d0a" (UID: "497c3d70-7959-49ea-9acb-d8bd2f301d0a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:55:50 crc kubenswrapper[4962]: I1003 14:55:50.422305 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/497c3d70-7959-49ea-9acb-d8bd2f301d0a-kube-api-access-4pgnd" (OuterVolumeSpecName: "kube-api-access-4pgnd") pod "497c3d70-7959-49ea-9acb-d8bd2f301d0a" (UID: "497c3d70-7959-49ea-9acb-d8bd2f301d0a"). InnerVolumeSpecName "kube-api-access-4pgnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:55:50 crc kubenswrapper[4962]: I1003 14:55:50.423704 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/497c3d70-7959-49ea-9acb-d8bd2f301d0a-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "497c3d70-7959-49ea-9acb-d8bd2f301d0a" (UID: "497c3d70-7959-49ea-9acb-d8bd2f301d0a"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:55:50 crc kubenswrapper[4962]: I1003 14:55:50.445878 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/497c3d70-7959-49ea-9acb-d8bd2f301d0a-inventory" (OuterVolumeSpecName: "inventory") pod "497c3d70-7959-49ea-9acb-d8bd2f301d0a" (UID: "497c3d70-7959-49ea-9acb-d8bd2f301d0a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:55:50 crc kubenswrapper[4962]: I1003 14:55:50.447972 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/497c3d70-7959-49ea-9acb-d8bd2f301d0a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "497c3d70-7959-49ea-9acb-d8bd2f301d0a" (UID: "497c3d70-7959-49ea-9acb-d8bd2f301d0a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:55:50 crc kubenswrapper[4962]: I1003 14:55:50.519408 4962 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/497c3d70-7959-49ea-9acb-d8bd2f301d0a-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:55:50 crc kubenswrapper[4962]: I1003 14:55:50.519445 4962 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/497c3d70-7959-49ea-9acb-d8bd2f301d0a-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 14:55:50 crc kubenswrapper[4962]: I1003 14:55:50.519458 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pgnd\" (UniqueName: \"kubernetes.io/projected/497c3d70-7959-49ea-9acb-d8bd2f301d0a-kube-api-access-4pgnd\") on node \"crc\" DevicePath \"\"" Oct 03 14:55:50 crc kubenswrapper[4962]: I1003 14:55:50.519466 4962 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/497c3d70-7959-49ea-9acb-d8bd2f301d0a-ceph\") on node \"crc\" DevicePath \"\"" Oct 03 14:55:50 crc kubenswrapper[4962]: I1003 14:55:50.519474 4962 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/497c3d70-7959-49ea-9acb-d8bd2f301d0a-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 14:55:50 crc kubenswrapper[4962]: I1003 14:55:50.931004 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pgwdz" event={"ID":"0f131e30-c2c9-411c-9bff-620970ba0a23","Type":"ContainerStarted","Data":"a924bdfa46495b6eb7b91ed37749ea6b27bafdf3261cb340cac7df7c6ff706b0"} Oct 03 14:55:50 crc kubenswrapper[4962]: I1003 14:55:50.936584 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-wvgsq" event={"ID":"497c3d70-7959-49ea-9acb-d8bd2f301d0a","Type":"ContainerDied","Data":"09c155ea7f772050ae938aa8b97355cc816ac56b4ec24f88504bb88477188d2c"} Oct 03 14:55:50 crc kubenswrapper[4962]: I1003 14:55:50.936623 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09c155ea7f772050ae938aa8b97355cc816ac56b4ec24f88504bb88477188d2c" Oct 03 14:55:50 crc kubenswrapper[4962]: I1003 14:55:50.936707 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-wvgsq" Oct 03 14:55:51 crc kubenswrapper[4962]: I1003 14:55:51.017266 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-kv9sn"] Oct 03 14:55:51 crc kubenswrapper[4962]: E1003 14:55:51.018147 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="497c3d70-7959-49ea-9acb-d8bd2f301d0a" containerName="bootstrap-openstack-openstack-cell1" Oct 03 14:55:51 crc kubenswrapper[4962]: I1003 14:55:51.018165 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="497c3d70-7959-49ea-9acb-d8bd2f301d0a" containerName="bootstrap-openstack-openstack-cell1" Oct 03 14:55:51 crc kubenswrapper[4962]: I1003 14:55:51.018344 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="497c3d70-7959-49ea-9acb-d8bd2f301d0a" containerName="bootstrap-openstack-openstack-cell1" Oct 03 14:55:51 crc kubenswrapper[4962]: I1003 14:55:51.019707 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-kv9sn" Oct 03 14:55:51 crc kubenswrapper[4962]: I1003 14:55:51.023367 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 14:55:51 crc kubenswrapper[4962]: I1003 14:55:51.023780 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 03 14:55:51 crc kubenswrapper[4962]: I1003 14:55:51.024289 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 03 14:55:51 crc kubenswrapper[4962]: I1003 14:55:51.025286 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-98wnm" Oct 03 14:55:51 crc kubenswrapper[4962]: I1003 14:55:51.031201 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-kv9sn"] Oct 03 14:55:51 crc kubenswrapper[4962]: I1003 14:55:51.134383 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z9wj\" (UniqueName: \"kubernetes.io/projected/1ea37c03-872a-453b-8f55-42d377fa11ad-kube-api-access-7z9wj\") pod \"download-cache-openstack-openstack-cell1-kv9sn\" (UID: \"1ea37c03-872a-453b-8f55-42d377fa11ad\") " pod="openstack/download-cache-openstack-openstack-cell1-kv9sn" Oct 03 14:55:51 crc kubenswrapper[4962]: I1003 14:55:51.134511 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1ea37c03-872a-453b-8f55-42d377fa11ad-ssh-key\") pod \"download-cache-openstack-openstack-cell1-kv9sn\" (UID: \"1ea37c03-872a-453b-8f55-42d377fa11ad\") " pod="openstack/download-cache-openstack-openstack-cell1-kv9sn" Oct 03 14:55:51 crc kubenswrapper[4962]: I1003 14:55:51.134557 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1ea37c03-872a-453b-8f55-42d377fa11ad-ceph\") pod \"download-cache-openstack-openstack-cell1-kv9sn\" (UID: \"1ea37c03-872a-453b-8f55-42d377fa11ad\") " pod="openstack/download-cache-openstack-openstack-cell1-kv9sn" Oct 03 14:55:51 crc kubenswrapper[4962]: I1003 14:55:51.134727 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/1ea37c03-872a-453b-8f55-42d377fa11ad-inventory\") pod \"download-cache-openstack-openstack-cell1-kv9sn\" (UID: \"1ea37c03-872a-453b-8f55-42d377fa11ad\") " pod="openstack/download-cache-openstack-openstack-cell1-kv9sn" Oct 03 14:55:51 crc kubenswrapper[4962]: I1003 14:55:51.236615 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7z9wj\" (UniqueName: \"kubernetes.io/projected/1ea37c03-872a-453b-8f55-42d377fa11ad-kube-api-access-7z9wj\") pod \"download-cache-openstack-openstack-cell1-kv9sn\" (UID: \"1ea37c03-872a-453b-8f55-42d377fa11ad\") " pod="openstack/download-cache-openstack-openstack-cell1-kv9sn" Oct 03 14:55:51 crc kubenswrapper[4962]: I1003 14:55:51.236753 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1ea37c03-872a-453b-8f55-42d377fa11ad-ssh-key\") pod \"download-cache-openstack-openstack-cell1-kv9sn\" (UID: \"1ea37c03-872a-453b-8f55-42d377fa11ad\") " pod="openstack/download-cache-openstack-openstack-cell1-kv9sn" Oct 03 14:55:51 crc kubenswrapper[4962]: I1003 14:55:51.236797 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1ea37c03-872a-453b-8f55-42d377fa11ad-ceph\") pod \"download-cache-openstack-openstack-cell1-kv9sn\" (UID: \"1ea37c03-872a-453b-8f55-42d377fa11ad\") " pod="openstack/download-cache-openstack-openstack-cell1-kv9sn" Oct 03 14:55:51 crc kubenswrapper[4962]: I1003 14:55:51.236900 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ea37c03-872a-453b-8f55-42d377fa11ad-inventory\") pod \"download-cache-openstack-openstack-cell1-kv9sn\" (UID: \"1ea37c03-872a-453b-8f55-42d377fa11ad\") " pod="openstack/download-cache-openstack-openstack-cell1-kv9sn" Oct 03 14:55:51 crc kubenswrapper[4962]: I1003 14:55:51.241817 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1ea37c03-872a-453b-8f55-42d377fa11ad-ssh-key\") pod \"download-cache-openstack-openstack-cell1-kv9sn\" (UID: \"1ea37c03-872a-453b-8f55-42d377fa11ad\") " pod="openstack/download-cache-openstack-openstack-cell1-kv9sn" Oct 03 14:55:51 crc kubenswrapper[4962]: I1003 14:55:51.242180 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ea37c03-872a-453b-8f55-42d377fa11ad-inventory\") pod \"download-cache-openstack-openstack-cell1-kv9sn\" (UID: \"1ea37c03-872a-453b-8f55-42d377fa11ad\") " pod="openstack/download-cache-openstack-openstack-cell1-kv9sn" Oct 03 14:55:51 crc kubenswrapper[4962]: I1003 14:55:51.243276 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1ea37c03-872a-453b-8f55-42d377fa11ad-ceph\") pod \"download-cache-openstack-openstack-cell1-kv9sn\" (UID: \"1ea37c03-872a-453b-8f55-42d377fa11ad\") " pod="openstack/download-cache-openstack-openstack-cell1-kv9sn" Oct 03 14:55:51 crc kubenswrapper[4962]: I1003 14:55:51.253006 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z9wj\" (UniqueName: \"kubernetes.io/projected/1ea37c03-872a-453b-8f55-42d377fa11ad-kube-api-access-7z9wj\") pod \"download-cache-openstack-openstack-cell1-kv9sn\" (UID: \"1ea37c03-872a-453b-8f55-42d377fa11ad\") " pod="openstack/download-cache-openstack-openstack-cell1-kv9sn" Oct 03 14:55:51 crc 
kubenswrapper[4962]: I1003 14:55:51.338075 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-kv9sn" Oct 03 14:55:51 crc kubenswrapper[4962]: I1003 14:55:51.738357 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-b56jg"] Oct 03 14:55:51 crc kubenswrapper[4962]: I1003 14:55:51.741742 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b56jg" Oct 03 14:55:51 crc kubenswrapper[4962]: I1003 14:55:51.748582 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b56jg"] Oct 03 14:55:51 crc kubenswrapper[4962]: I1003 14:55:51.850630 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f9fe961-45db-48ac-a72e-33146c0079dd-catalog-content\") pod \"redhat-marketplace-b56jg\" (UID: \"1f9fe961-45db-48ac-a72e-33146c0079dd\") " pod="openshift-marketplace/redhat-marketplace-b56jg" Oct 03 14:55:51 crc kubenswrapper[4962]: I1003 14:55:51.850707 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f9fe961-45db-48ac-a72e-33146c0079dd-utilities\") pod \"redhat-marketplace-b56jg\" (UID: \"1f9fe961-45db-48ac-a72e-33146c0079dd\") " pod="openshift-marketplace/redhat-marketplace-b56jg" Oct 03 14:55:51 crc kubenswrapper[4962]: I1003 14:55:51.850730 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmtsx\" (UniqueName: \"kubernetes.io/projected/1f9fe961-45db-48ac-a72e-33146c0079dd-kube-api-access-wmtsx\") pod \"redhat-marketplace-b56jg\" (UID: \"1f9fe961-45db-48ac-a72e-33146c0079dd\") " pod="openshift-marketplace/redhat-marketplace-b56jg" Oct 03 14:55:51 crc kubenswrapper[4962]: I1003 14:55:51.899228 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-kv9sn"] Oct 03 14:55:51 crc kubenswrapper[4962]: W1003 14:55:51.908926 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ea37c03_872a_453b_8f55_42d377fa11ad.slice/crio-4e3fe90cb69aa7815e0533b47931a9346bfb4570f96938dce396cb97b9ae2bb9 WatchSource:0}: Error finding container 4e3fe90cb69aa7815e0533b47931a9346bfb4570f96938dce396cb97b9ae2bb9: Status 404 returned error can't find the container with id 4e3fe90cb69aa7815e0533b47931a9346bfb4570f96938dce396cb97b9ae2bb9 Oct 03 14:55:51 crc kubenswrapper[4962]: I1003 14:55:51.950335 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-kv9sn" event={"ID":"1ea37c03-872a-453b-8f55-42d377fa11ad","Type":"ContainerStarted","Data":"4e3fe90cb69aa7815e0533b47931a9346bfb4570f96938dce396cb97b9ae2bb9"} Oct 03 14:55:51 crc kubenswrapper[4962]: I1003 14:55:51.953889 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f9fe961-45db-48ac-a72e-33146c0079dd-catalog-content\") pod \"redhat-marketplace-b56jg\" (UID: \"1f9fe961-45db-48ac-a72e-33146c0079dd\") " pod="openshift-marketplace/redhat-marketplace-b56jg" Oct 03 14:55:51 crc kubenswrapper[4962]: I1003 14:55:51.954069 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f9fe961-45db-48ac-a72e-33146c0079dd-utilities\") pod \"redhat-marketplace-b56jg\" (UID: \"1f9fe961-45db-48ac-a72e-33146c0079dd\") " pod="openshift-marketplace/redhat-marketplace-b56jg" Oct 03 14:55:51 crc kubenswrapper[4962]: I1003 14:55:51.954147 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmtsx\" (UniqueName: \"kubernetes.io/projected/1f9fe961-45db-48ac-a72e-33146c0079dd-kube-api-access-wmtsx\") pod \"redhat-marketplace-b56jg\" (UID: \"1f9fe961-45db-48ac-a72e-33146c0079dd\") " pod="openshift-marketplace/redhat-marketplace-b56jg" Oct 03 14:55:51 crc kubenswrapper[4962]: I1003 14:55:51.954768 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f9fe961-45db-48ac-a72e-33146c0079dd-catalog-content\") pod \"redhat-marketplace-b56jg\" (UID: \"1f9fe961-45db-48ac-a72e-33146c0079dd\") " pod="openshift-marketplace/redhat-marketplace-b56jg" Oct 03 14:55:51 crc kubenswrapper[4962]: I1003 14:55:51.954860 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f9fe961-45db-48ac-a72e-33146c0079dd-utilities\") pod \"redhat-marketplace-b56jg\" (UID: \"1f9fe961-45db-48ac-a72e-33146c0079dd\") " pod="openshift-marketplace/redhat-marketplace-b56jg" Oct 03 14:55:51 crc kubenswrapper[4962]: I1003 14:55:51.976165 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmtsx\" (UniqueName: \"kubernetes.io/projected/1f9fe961-45db-48ac-a72e-33146c0079dd-kube-api-access-wmtsx\") pod \"redhat-marketplace-b56jg\" (UID: \"1f9fe961-45db-48ac-a72e-33146c0079dd\") " pod="openshift-marketplace/redhat-marketplace-b56jg" Oct 03 14:55:52 crc kubenswrapper[4962]: I1003 14:55:52.073280 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b56jg" Oct 03 14:55:52 crc kubenswrapper[4962]: I1003 14:55:52.596982 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b56jg"] Oct 03 14:55:52 crc kubenswrapper[4962]: W1003 14:55:52.603925 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f9fe961_45db_48ac_a72e_33146c0079dd.slice/crio-ec0e8ec0eff76ff3697656345ceb04c923b04e90f3a9a8cda6fa82ace09c4660 WatchSource:0}: Error finding container ec0e8ec0eff76ff3697656345ceb04c923b04e90f3a9a8cda6fa82ace09c4660: Status 404 returned error can't find the container with id ec0e8ec0eff76ff3697656345ceb04c923b04e90f3a9a8cda6fa82ace09c4660 Oct 03 14:55:52 crc kubenswrapper[4962]: I1003 14:55:52.962770 4962 generic.go:334] "Generic (PLEG): container finished" podID="0f131e30-c2c9-411c-9bff-620970ba0a23" containerID="a924bdfa46495b6eb7b91ed37749ea6b27bafdf3261cb340cac7df7c6ff706b0" exitCode=0 Oct 03 14:55:52 crc kubenswrapper[4962]: I1003 14:55:52.962814 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pgwdz" event={"ID":"0f131e30-c2c9-411c-9bff-620970ba0a23","Type":"ContainerDied","Data":"a924bdfa46495b6eb7b91ed37749ea6b27bafdf3261cb340cac7df7c6ff706b0"} Oct 03 14:55:52 crc kubenswrapper[4962]: I1003 14:55:52.969268 4962 generic.go:334] "Generic (PLEG): container finished" podID="1f9fe961-45db-48ac-a72e-33146c0079dd" containerID="411104a7d3072d1d614cad0a94c4d328e4c425a420fa53e05321676524fab7f2" exitCode=0 Oct 03 14:55:52 crc kubenswrapper[4962]: I1003 14:55:52.969927 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b56jg" event={"ID":"1f9fe961-45db-48ac-a72e-33146c0079dd","Type":"ContainerDied","Data":"411104a7d3072d1d614cad0a94c4d328e4c425a420fa53e05321676524fab7f2"} Oct 03 14:55:52 crc kubenswrapper[4962]: I1003 14:55:52.970589 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b56jg" event={"ID":"1f9fe961-45db-48ac-a72e-33146c0079dd","Type":"ContainerStarted","Data":"ec0e8ec0eff76ff3697656345ceb04c923b04e90f3a9a8cda6fa82ace09c4660"} Oct 03 14:55:52 crc kubenswrapper[4962]: I1003 14:55:52.972857 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-kv9sn" event={"ID":"1ea37c03-872a-453b-8f55-42d377fa11ad","Type":"ContainerStarted","Data":"2041378a49810d5b33e496a0479281b2879e31b50086a1e066e7baa2551ccf8b"} Oct 03 14:55:53 crc kubenswrapper[4962]: I1003 14:55:53.984454 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pgwdz" event={"ID":"0f131e30-c2c9-411c-9bff-620970ba0a23","Type":"ContainerStarted","Data":"366f32a3bb648e33e512de81f98c5d4b2dbc34c99e62aa7a1859c06146ab9390"} Oct 03 14:55:54 crc kubenswrapper[4962]: I1003 14:55:54.006198 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-openstack-openstack-cell1-kv9sn" podStartSLOduration=3.813822644 podStartE2EDuration="4.006175054s" podCreationTimestamp="2025-10-03 14:55:50 +0000 UTC" firstStartedPulling="2025-10-03 14:55:51.913166352 +0000 UTC m=+7560.317064187" lastFinishedPulling="2025-10-03 14:55:52.105518762 +0000 UTC m=+7560.509416597" observedRunningTime="2025-10-03 14:55:53.020787642 +0000 UTC m=+7561.424685497" watchObservedRunningTime="2025-10-03 14:55:54.006175054 +0000 UTC 
m=+7562.410072889" Oct 03 14:55:54 crc kubenswrapper[4962]: I1003 14:55:54.010259 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pgwdz" podStartSLOduration=3.55206607 podStartE2EDuration="7.010240942s" podCreationTimestamp="2025-10-03 14:55:47 +0000 UTC" firstStartedPulling="2025-10-03 14:55:49.921983785 +0000 UTC m=+7558.325881620" lastFinishedPulling="2025-10-03 14:55:53.380158657 +0000 UTC m=+7561.784056492" observedRunningTime="2025-10-03 14:55:54.003181864 +0000 UTC m=+7562.407079969" watchObservedRunningTime="2025-10-03 14:55:54.010240942 +0000 UTC m=+7562.414138777" Oct 03 14:55:54 crc kubenswrapper[4962]: I1003 14:55:54.995975 4962 generic.go:334] "Generic (PLEG): container finished" podID="1f9fe961-45db-48ac-a72e-33146c0079dd" containerID="14cb78cdb3d94c50aab9f68a454edde719742640eab4d9fcbc23c6412a0ff3d8" exitCode=0 Oct 03 14:55:54 crc kubenswrapper[4962]: I1003 14:55:54.996088 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b56jg" event={"ID":"1f9fe961-45db-48ac-a72e-33146c0079dd","Type":"ContainerDied","Data":"14cb78cdb3d94c50aab9f68a454edde719742640eab4d9fcbc23c6412a0ff3d8"} Oct 03 14:55:56 crc kubenswrapper[4962]: I1003 14:55:56.008091 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b56jg" event={"ID":"1f9fe961-45db-48ac-a72e-33146c0079dd","Type":"ContainerStarted","Data":"7841589ad43efafc83e82193b746032ff2eeded18b80b2a7a9da4991307863c9"} Oct 03 14:55:56 crc kubenswrapper[4962]: I1003 14:55:56.034210 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-b56jg" podStartSLOduration=2.540711989 podStartE2EDuration="5.034189103s" podCreationTimestamp="2025-10-03 14:55:51 +0000 UTC" firstStartedPulling="2025-10-03 14:55:52.971849467 +0000 UTC m=+7561.375747302" lastFinishedPulling="2025-10-03 14:55:55.465326581 +0000 UTC m=+7563.869224416" observedRunningTime="2025-10-03 14:55:56.025699636 +0000 UTC m=+7564.429597471" watchObservedRunningTime="2025-10-03 14:55:56.034189103 +0000 UTC m=+7564.438086938" Oct 03 14:55:58 crc kubenswrapper[4962]: I1003 14:55:58.306866 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pgwdz" Oct 03 14:55:58 crc kubenswrapper[4962]: I1003 14:55:58.308314 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pgwdz" Oct 03 14:55:58 crc kubenswrapper[4962]: I1003 14:55:58.366533 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pgwdz" Oct 03 14:55:59 crc kubenswrapper[4962]: I1003 14:55:59.080623 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pgwdz" Oct 03 14:56:00 crc kubenswrapper[4962]: I1003 14:56:00.933824 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pgwdz"] Oct 03 14:56:02 crc kubenswrapper[4962]: I1003 14:56:02.061452 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pgwdz" podUID="0f131e30-c2c9-411c-9bff-620970ba0a23" containerName="registry-server" containerID="cri-o://366f32a3bb648e33e512de81f98c5d4b2dbc34c99e62aa7a1859c06146ab9390" gracePeriod=2 Oct 03 14:56:02 crc kubenswrapper[4962]: I1003 14:56:02.073792 
Oct 03 14:56:02 crc kubenswrapper[4962]: I1003 14:56:02.073939 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-b56jg"
Oct 03 14:56:02 crc kubenswrapper[4962]: I1003 14:56:02.123382 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-b56jg"
Oct 03 14:56:02 crc kubenswrapper[4962]: I1003 14:56:02.576814 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pgwdz"
Oct 03 14:56:02 crc kubenswrapper[4962]: I1003 14:56:02.693158 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hv2h2\" (UniqueName: \"kubernetes.io/projected/0f131e30-c2c9-411c-9bff-620970ba0a23-kube-api-access-hv2h2\") pod \"0f131e30-c2c9-411c-9bff-620970ba0a23\" (UID: \"0f131e30-c2c9-411c-9bff-620970ba0a23\") "
Oct 03 14:56:02 crc kubenswrapper[4962]: I1003 14:56:02.693228 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f131e30-c2c9-411c-9bff-620970ba0a23-catalog-content\") pod \"0f131e30-c2c9-411c-9bff-620970ba0a23\" (UID: \"0f131e30-c2c9-411c-9bff-620970ba0a23\") "
Oct 03 14:56:02 crc kubenswrapper[4962]: I1003 14:56:02.693385 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f131e30-c2c9-411c-9bff-620970ba0a23-utilities\") pod \"0f131e30-c2c9-411c-9bff-620970ba0a23\" (UID: \"0f131e30-c2c9-411c-9bff-620970ba0a23\") "
Oct 03 14:56:02 crc kubenswrapper[4962]: I1003 14:56:02.694537 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f131e30-c2c9-411c-9bff-620970ba0a23-utilities" (OuterVolumeSpecName: "utilities") pod "0f131e30-c2c9-411c-9bff-620970ba0a23" (UID: "0f131e30-c2c9-411c-9bff-620970ba0a23"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:56:02 crc kubenswrapper[4962]: I1003 14:56:02.699334 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f131e30-c2c9-411c-9bff-620970ba0a23-kube-api-access-hv2h2" (OuterVolumeSpecName: "kube-api-access-hv2h2") pod "0f131e30-c2c9-411c-9bff-620970ba0a23" (UID: "0f131e30-c2c9-411c-9bff-620970ba0a23"). InnerVolumeSpecName "kube-api-access-hv2h2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:56:02 crc kubenswrapper[4962]: I1003 14:56:02.735743 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f131e30-c2c9-411c-9bff-620970ba0a23-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0f131e30-c2c9-411c-9bff-620970ba0a23" (UID: "0f131e30-c2c9-411c-9bff-620970ba0a23"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:56:02 crc kubenswrapper[4962]: I1003 14:56:02.796268 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hv2h2\" (UniqueName: \"kubernetes.io/projected/0f131e30-c2c9-411c-9bff-620970ba0a23-kube-api-access-hv2h2\") on node \"crc\" DevicePath \"\""
Oct 03 14:56:02 crc kubenswrapper[4962]: I1003 14:56:02.796305 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f131e30-c2c9-411c-9bff-620970ba0a23-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 03 14:56:02 crc kubenswrapper[4962]: I1003 14:56:02.796314 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f131e30-c2c9-411c-9bff-620970ba0a23-utilities\") on node \"crc\" DevicePath \"\""
Oct 03 14:56:03 crc kubenswrapper[4962]: I1003 14:56:03.078335 4962 generic.go:334] "Generic (PLEG): container finished" podID="0f131e30-c2c9-411c-9bff-620970ba0a23" containerID="366f32a3bb648e33e512de81f98c5d4b2dbc34c99e62aa7a1859c06146ab9390" exitCode=0
Oct 03 14:56:03 crc kubenswrapper[4962]: I1003 14:56:03.078423 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pgwdz"
Oct 03 14:56:03 crc kubenswrapper[4962]: I1003 14:56:03.078462 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pgwdz" event={"ID":"0f131e30-c2c9-411c-9bff-620970ba0a23","Type":"ContainerDied","Data":"366f32a3bb648e33e512de81f98c5d4b2dbc34c99e62aa7a1859c06146ab9390"}
Oct 03 14:56:03 crc kubenswrapper[4962]: I1003 14:56:03.079026 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pgwdz" event={"ID":"0f131e30-c2c9-411c-9bff-620970ba0a23","Type":"ContainerDied","Data":"be3089a2143ca98ca1fd3444624b21e596a41c7777f93f265ab00346e6918c58"}
Oct 03 14:56:03 crc kubenswrapper[4962]: I1003 14:56:03.079066 4962 scope.go:117] "RemoveContainer" containerID="366f32a3bb648e33e512de81f98c5d4b2dbc34c99e62aa7a1859c06146ab9390"
Oct 03 14:56:03 crc kubenswrapper[4962]: I1003 14:56:03.102615 4962 scope.go:117] "RemoveContainer" containerID="a924bdfa46495b6eb7b91ed37749ea6b27bafdf3261cb340cac7df7c6ff706b0"
Oct 03 14:56:03 crc kubenswrapper[4962]: I1003 14:56:03.125821 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pgwdz"]
Oct 03 14:56:03 crc kubenswrapper[4962]: I1003 14:56:03.135210 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-b56jg"
Oct 03 14:56:03 crc kubenswrapper[4962]: I1003 14:56:03.135584 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pgwdz"]
Oct 03 14:56:03 crc kubenswrapper[4962]: I1003 14:56:03.144404 4962 scope.go:117] "RemoveContainer" containerID="d6bddfdef4837b0d100c80147a6f659174f5db0259cd255e12c391d22aed1c09"
Oct 03 14:56:03 crc kubenswrapper[4962]: I1003 14:56:03.208281 4962 scope.go:117] "RemoveContainer" containerID="366f32a3bb648e33e512de81f98c5d4b2dbc34c99e62aa7a1859c06146ab9390"
Oct 03 14:56:03 crc kubenswrapper[4962]: E1003 14:56:03.209004 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"366f32a3bb648e33e512de81f98c5d4b2dbc34c99e62aa7a1859c06146ab9390\": container with ID starting with 366f32a3bb648e33e512de81f98c5d4b2dbc34c99e62aa7a1859c06146ab9390 not found: ID does not exist" containerID="366f32a3bb648e33e512de81f98c5d4b2dbc34c99e62aa7a1859c06146ab9390"
Oct 03 14:56:03 crc kubenswrapper[4962]: I1003 14:56:03.209060 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"366f32a3bb648e33e512de81f98c5d4b2dbc34c99e62aa7a1859c06146ab9390"} err="failed to get container status \"366f32a3bb648e33e512de81f98c5d4b2dbc34c99e62aa7a1859c06146ab9390\": rpc error: code = NotFound desc = could not find container \"366f32a3bb648e33e512de81f98c5d4b2dbc34c99e62aa7a1859c06146ab9390\": container with ID starting with 366f32a3bb648e33e512de81f98c5d4b2dbc34c99e62aa7a1859c06146ab9390 not found: ID does not exist"
Oct 03 14:56:03 crc kubenswrapper[4962]: I1003 14:56:03.209098 4962 scope.go:117] "RemoveContainer" containerID="a924bdfa46495b6eb7b91ed37749ea6b27bafdf3261cb340cac7df7c6ff706b0"
Oct 03 14:56:03 crc kubenswrapper[4962]: E1003 14:56:03.209547 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a924bdfa46495b6eb7b91ed37749ea6b27bafdf3261cb340cac7df7c6ff706b0\": container with ID starting with a924bdfa46495b6eb7b91ed37749ea6b27bafdf3261cb340cac7df7c6ff706b0 not found: ID does not exist" containerID="a924bdfa46495b6eb7b91ed37749ea6b27bafdf3261cb340cac7df7c6ff706b0"
Oct 03 14:56:03 crc kubenswrapper[4962]: I1003 14:56:03.209605 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a924bdfa46495b6eb7b91ed37749ea6b27bafdf3261cb340cac7df7c6ff706b0"} err="failed to get container status \"a924bdfa46495b6eb7b91ed37749ea6b27bafdf3261cb340cac7df7c6ff706b0\": rpc error: code = NotFound desc = could not find container \"a924bdfa46495b6eb7b91ed37749ea6b27bafdf3261cb340cac7df7c6ff706b0\": container with ID starting with a924bdfa46495b6eb7b91ed37749ea6b27bafdf3261cb340cac7df7c6ff706b0 not found: ID does not exist"
Oct 03 14:56:03 crc kubenswrapper[4962]: I1003 14:56:03.209661 4962 scope.go:117] "RemoveContainer" containerID="d6bddfdef4837b0d100c80147a6f659174f5db0259cd255e12c391d22aed1c09"
Oct 03 14:56:03 crc kubenswrapper[4962]: E1003 14:56:03.210257 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6bddfdef4837b0d100c80147a6f659174f5db0259cd255e12c391d22aed1c09\": container with ID starting with d6bddfdef4837b0d100c80147a6f659174f5db0259cd255e12c391d22aed1c09 not found: ID does not exist" containerID="d6bddfdef4837b0d100c80147a6f659174f5db0259cd255e12c391d22aed1c09"
Oct 03 14:56:03 crc kubenswrapper[4962]: I1003 14:56:03.210325 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6bddfdef4837b0d100c80147a6f659174f5db0259cd255e12c391d22aed1c09"} err="failed to get container status \"d6bddfdef4837b0d100c80147a6f659174f5db0259cd255e12c391d22aed1c09\": rpc error: code = NotFound desc = could not find container \"d6bddfdef4837b0d100c80147a6f659174f5db0259cd255e12c391d22aed1c09\": container with ID starting with d6bddfdef4837b0d100c80147a6f659174f5db0259cd255e12c391d22aed1c09 not found: ID does not exist"
Oct 03 14:56:04 crc kubenswrapper[4962]: I1003 14:56:04.243522 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f131e30-c2c9-411c-9bff-620970ba0a23" path="/var/lib/kubelet/pods/0f131e30-c2c9-411c-9bff-620970ba0a23/volumes"
Oct 03 14:56:04 crc kubenswrapper[4962]: I1003 14:56:04.727229 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b56jg"]
Oct 03 14:56:05 crc kubenswrapper[4962]: I1003 14:56:05.099527 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-b56jg" podUID="1f9fe961-45db-48ac-a72e-33146c0079dd" containerName="registry-server" containerID="cri-o://7841589ad43efafc83e82193b746032ff2eeded18b80b2a7a9da4991307863c9" gracePeriod=2
Oct 03 14:56:05 crc kubenswrapper[4962]: I1003 14:56:05.592426 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b56jg"
Oct 03 14:56:05 crc kubenswrapper[4962]: I1003 14:56:05.662932 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f9fe961-45db-48ac-a72e-33146c0079dd-catalog-content\") pod \"1f9fe961-45db-48ac-a72e-33146c0079dd\" (UID: \"1f9fe961-45db-48ac-a72e-33146c0079dd\") "
Oct 03 14:56:05 crc kubenswrapper[4962]: I1003 14:56:05.663454 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmtsx\" (UniqueName: \"kubernetes.io/projected/1f9fe961-45db-48ac-a72e-33146c0079dd-kube-api-access-wmtsx\") pod \"1f9fe961-45db-48ac-a72e-33146c0079dd\" (UID: \"1f9fe961-45db-48ac-a72e-33146c0079dd\") "
Oct 03 14:56:05 crc kubenswrapper[4962]: I1003 14:56:05.663627 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f9fe961-45db-48ac-a72e-33146c0079dd-utilities\") pod \"1f9fe961-45db-48ac-a72e-33146c0079dd\" (UID: \"1f9fe961-45db-48ac-a72e-33146c0079dd\") "
Oct 03 14:56:05 crc kubenswrapper[4962]: I1003 14:56:05.664191 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f9fe961-45db-48ac-a72e-33146c0079dd-utilities" (OuterVolumeSpecName: "utilities") pod "1f9fe961-45db-48ac-a72e-33146c0079dd" (UID: "1f9fe961-45db-48ac-a72e-33146c0079dd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:56:05 crc kubenswrapper[4962]: I1003 14:56:05.670627 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f9fe961-45db-48ac-a72e-33146c0079dd-kube-api-access-wmtsx" (OuterVolumeSpecName: "kube-api-access-wmtsx") pod "1f9fe961-45db-48ac-a72e-33146c0079dd" (UID: "1f9fe961-45db-48ac-a72e-33146c0079dd"). InnerVolumeSpecName "kube-api-access-wmtsx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:56:05 crc kubenswrapper[4962]: I1003 14:56:05.677190 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f9fe961-45db-48ac-a72e-33146c0079dd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1f9fe961-45db-48ac-a72e-33146c0079dd" (UID: "1f9fe961-45db-48ac-a72e-33146c0079dd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:56:05 crc kubenswrapper[4962]: I1003 14:56:05.765587 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f9fe961-45db-48ac-a72e-33146c0079dd-utilities\") on node \"crc\" DevicePath \"\""
Oct 03 14:56:05 crc kubenswrapper[4962]: I1003 14:56:05.765618 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f9fe961-45db-48ac-a72e-33146c0079dd-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 03 14:56:05 crc kubenswrapper[4962]: I1003 14:56:05.765631 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmtsx\" (UniqueName: \"kubernetes.io/projected/1f9fe961-45db-48ac-a72e-33146c0079dd-kube-api-access-wmtsx\") on node \"crc\" DevicePath \"\""
Oct 03 14:56:06 crc kubenswrapper[4962]: I1003 14:56:06.116470 4962 generic.go:334] "Generic (PLEG): container finished" podID="1f9fe961-45db-48ac-a72e-33146c0079dd" containerID="7841589ad43efafc83e82193b746032ff2eeded18b80b2a7a9da4991307863c9" exitCode=0
Oct 03 14:56:06 crc kubenswrapper[4962]: I1003 14:56:06.116518 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b56jg" event={"ID":"1f9fe961-45db-48ac-a72e-33146c0079dd","Type":"ContainerDied","Data":"7841589ad43efafc83e82193b746032ff2eeded18b80b2a7a9da4991307863c9"}
Oct 03 14:56:06 crc kubenswrapper[4962]: I1003 14:56:06.116551 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b56jg" event={"ID":"1f9fe961-45db-48ac-a72e-33146c0079dd","Type":"ContainerDied","Data":"ec0e8ec0eff76ff3697656345ceb04c923b04e90f3a9a8cda6fa82ace09c4660"}
Oct 03 14:56:06 crc kubenswrapper[4962]: I1003 14:56:06.116562 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b56jg"
Oct 03 14:56:06 crc kubenswrapper[4962]: I1003 14:56:06.116572 4962 scope.go:117] "RemoveContainer" containerID="7841589ad43efafc83e82193b746032ff2eeded18b80b2a7a9da4991307863c9"
Oct 03 14:56:06 crc kubenswrapper[4962]: I1003 14:56:06.145512 4962 scope.go:117] "RemoveContainer" containerID="14cb78cdb3d94c50aab9f68a454edde719742640eab4d9fcbc23c6412a0ff3d8"
Oct 03 14:56:06 crc kubenswrapper[4962]: I1003 14:56:06.186409 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b56jg"]
Oct 03 14:56:06 crc kubenswrapper[4962]: I1003 14:56:06.196579 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-b56jg"]
Oct 03 14:56:06 crc kubenswrapper[4962]: I1003 14:56:06.197230 4962 scope.go:117] "RemoveContainer" containerID="411104a7d3072d1d614cad0a94c4d328e4c425a420fa53e05321676524fab7f2"
Oct 03 14:56:06 crc kubenswrapper[4962]: I1003 14:56:06.241989 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f9fe961-45db-48ac-a72e-33146c0079dd" path="/var/lib/kubelet/pods/1f9fe961-45db-48ac-a72e-33146c0079dd/volumes"
Oct 03 14:56:06 crc kubenswrapper[4962]: I1003 14:56:06.254543 4962 scope.go:117] "RemoveContainer" containerID="7841589ad43efafc83e82193b746032ff2eeded18b80b2a7a9da4991307863c9"
Oct 03 14:56:06 crc kubenswrapper[4962]: E1003 14:56:06.255189 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7841589ad43efafc83e82193b746032ff2eeded18b80b2a7a9da4991307863c9\": container with ID starting with 7841589ad43efafc83e82193b746032ff2eeded18b80b2a7a9da4991307863c9 not found: ID does not exist" containerID="7841589ad43efafc83e82193b746032ff2eeded18b80b2a7a9da4991307863c9"
Oct 03 14:56:06 crc kubenswrapper[4962]: I1003 14:56:06.255302 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7841589ad43efafc83e82193b746032ff2eeded18b80b2a7a9da4991307863c9"} err="failed to get container status \"7841589ad43efafc83e82193b746032ff2eeded18b80b2a7a9da4991307863c9\": rpc error: code = NotFound desc = could not find container \"7841589ad43efafc83e82193b746032ff2eeded18b80b2a7a9da4991307863c9\": container with ID starting with 7841589ad43efafc83e82193b746032ff2eeded18b80b2a7a9da4991307863c9 not found: ID does not exist"
Oct 03 14:56:06 crc kubenswrapper[4962]: I1003 14:56:06.255386 4962 scope.go:117] "RemoveContainer" containerID="14cb78cdb3d94c50aab9f68a454edde719742640eab4d9fcbc23c6412a0ff3d8"
Oct 03 14:56:06 crc kubenswrapper[4962]: E1003 14:56:06.255873 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14cb78cdb3d94c50aab9f68a454edde719742640eab4d9fcbc23c6412a0ff3d8\": container with ID starting with 14cb78cdb3d94c50aab9f68a454edde719742640eab4d9fcbc23c6412a0ff3d8 not found: ID does not exist" containerID="14cb78cdb3d94c50aab9f68a454edde719742640eab4d9fcbc23c6412a0ff3d8"
Oct 03 14:56:06 crc kubenswrapper[4962]: I1003 14:56:06.255969 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14cb78cdb3d94c50aab9f68a454edde719742640eab4d9fcbc23c6412a0ff3d8"} err="failed to get container status \"14cb78cdb3d94c50aab9f68a454edde719742640eab4d9fcbc23c6412a0ff3d8\": rpc error: code = NotFound desc = could not find container \"14cb78cdb3d94c50aab9f68a454edde719742640eab4d9fcbc23c6412a0ff3d8\": container with ID starting with 14cb78cdb3d94c50aab9f68a454edde719742640eab4d9fcbc23c6412a0ff3d8 not found: ID does not exist"
\"14cb78cdb3d94c50aab9f68a454edde719742640eab4d9fcbc23c6412a0ff3d8\": container with ID starting with 14cb78cdb3d94c50aab9f68a454edde719742640eab4d9fcbc23c6412a0ff3d8 not found: ID does not exist" Oct 03 14:56:06 crc kubenswrapper[4962]: I1003 14:56:06.256043 4962 scope.go:117] "RemoveContainer" containerID="411104a7d3072d1d614cad0a94c4d328e4c425a420fa53e05321676524fab7f2" Oct 03 14:56:06 crc kubenswrapper[4962]: E1003 14:56:06.256861 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"411104a7d3072d1d614cad0a94c4d328e4c425a420fa53e05321676524fab7f2\": container with ID starting with 411104a7d3072d1d614cad0a94c4d328e4c425a420fa53e05321676524fab7f2 not found: ID does not exist" containerID="411104a7d3072d1d614cad0a94c4d328e4c425a420fa53e05321676524fab7f2" Oct 03 14:56:06 crc kubenswrapper[4962]: I1003 14:56:06.256889 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"411104a7d3072d1d614cad0a94c4d328e4c425a420fa53e05321676524fab7f2"} err="failed to get container status \"411104a7d3072d1d614cad0a94c4d328e4c425a420fa53e05321676524fab7f2\": rpc error: code = NotFound desc = could not find container \"411104a7d3072d1d614cad0a94c4d328e4c425a420fa53e05321676524fab7f2\": container with ID starting with 411104a7d3072d1d614cad0a94c4d328e4c425a420fa53e05321676524fab7f2 not found: ID does not exist" Oct 03 14:56:54 crc kubenswrapper[4962]: I1003 14:56:54.660616 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:56:54 crc kubenswrapper[4962]: I1003 14:56:54.661390 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:57:19 crc kubenswrapper[4962]: I1003 14:57:19.916494 4962 generic.go:334] "Generic (PLEG): container finished" podID="1ea37c03-872a-453b-8f55-42d377fa11ad" containerID="2041378a49810d5b33e496a0479281b2879e31b50086a1e066e7baa2551ccf8b" exitCode=0 Oct 03 14:57:19 crc kubenswrapper[4962]: I1003 14:57:19.916559 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-kv9sn" event={"ID":"1ea37c03-872a-453b-8f55-42d377fa11ad","Type":"ContainerDied","Data":"2041378a49810d5b33e496a0479281b2879e31b50086a1e066e7baa2551ccf8b"} Oct 03 14:57:21 crc kubenswrapper[4962]: I1003 14:57:21.483438 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-kv9sn" Oct 03 14:57:21 crc kubenswrapper[4962]: I1003 14:57:21.630205 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7z9wj\" (UniqueName: \"kubernetes.io/projected/1ea37c03-872a-453b-8f55-42d377fa11ad-kube-api-access-7z9wj\") pod \"1ea37c03-872a-453b-8f55-42d377fa11ad\" (UID: \"1ea37c03-872a-453b-8f55-42d377fa11ad\") " Oct 03 14:57:21 crc kubenswrapper[4962]: I1003 14:57:21.630264 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1ea37c03-872a-453b-8f55-42d377fa11ad-ceph\") pod \"1ea37c03-872a-453b-8f55-42d377fa11ad\" (UID: \"1ea37c03-872a-453b-8f55-42d377fa11ad\") " Oct 03 14:57:21 crc kubenswrapper[4962]: I1003 14:57:21.630411 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ea37c03-872a-453b-8f55-42d377fa11ad-inventory\") pod \"1ea37c03-872a-453b-8f55-42d377fa11ad\" (UID: \"1ea37c03-872a-453b-8f55-42d377fa11ad\") " Oct 03 14:57:21 crc kubenswrapper[4962]: I1003 14:57:21.630589 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1ea37c03-872a-453b-8f55-42d377fa11ad-ssh-key\") pod \"1ea37c03-872a-453b-8f55-42d377fa11ad\" (UID: \"1ea37c03-872a-453b-8f55-42d377fa11ad\") " Oct 03 14:57:21 crc kubenswrapper[4962]: I1003 14:57:21.637006 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ea37c03-872a-453b-8f55-42d377fa11ad-ceph" (OuterVolumeSpecName: "ceph") pod "1ea37c03-872a-453b-8f55-42d377fa11ad" (UID: "1ea37c03-872a-453b-8f55-42d377fa11ad"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:57:21 crc kubenswrapper[4962]: I1003 14:57:21.637153 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ea37c03-872a-453b-8f55-42d377fa11ad-kube-api-access-7z9wj" (OuterVolumeSpecName: "kube-api-access-7z9wj") pod "1ea37c03-872a-453b-8f55-42d377fa11ad" (UID: "1ea37c03-872a-453b-8f55-42d377fa11ad"). InnerVolumeSpecName "kube-api-access-7z9wj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:57:21 crc kubenswrapper[4962]: I1003 14:57:21.664944 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ea37c03-872a-453b-8f55-42d377fa11ad-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1ea37c03-872a-453b-8f55-42d377fa11ad" (UID: "1ea37c03-872a-453b-8f55-42d377fa11ad"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:57:21 crc kubenswrapper[4962]: I1003 14:57:21.673217 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ea37c03-872a-453b-8f55-42d377fa11ad-inventory" (OuterVolumeSpecName: "inventory") pod "1ea37c03-872a-453b-8f55-42d377fa11ad" (UID: "1ea37c03-872a-453b-8f55-42d377fa11ad"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:57:21 crc kubenswrapper[4962]: I1003 14:57:21.734209 4962 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1ea37c03-872a-453b-8f55-42d377fa11ad-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 14:57:21 crc kubenswrapper[4962]: I1003 14:57:21.734399 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7z9wj\" (UniqueName: \"kubernetes.io/projected/1ea37c03-872a-453b-8f55-42d377fa11ad-kube-api-access-7z9wj\") on node \"crc\" DevicePath \"\"" Oct 03 14:57:21 crc kubenswrapper[4962]: I1003 14:57:21.734490 4962 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1ea37c03-872a-453b-8f55-42d377fa11ad-ceph\") on node \"crc\" DevicePath \"\"" Oct 03 14:57:21 crc kubenswrapper[4962]: I1003 14:57:21.734674 4962 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ea37c03-872a-453b-8f55-42d377fa11ad-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 14:57:21 crc kubenswrapper[4962]: I1003 14:57:21.934846 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-kv9sn" event={"ID":"1ea37c03-872a-453b-8f55-42d377fa11ad","Type":"ContainerDied","Data":"4e3fe90cb69aa7815e0533b47931a9346bfb4570f96938dce396cb97b9ae2bb9"} Oct 03 14:57:21 crc kubenswrapper[4962]: I1003 14:57:21.934884 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-kv9sn" Oct 03 14:57:21 crc kubenswrapper[4962]: I1003 14:57:21.934897 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e3fe90cb69aa7815e0533b47931a9346bfb4570f96938dce396cb97b9ae2bb9" Oct 03 14:57:22 crc kubenswrapper[4962]: I1003 14:57:22.029650 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-xf5xb"] Oct 03 14:57:22 crc kubenswrapper[4962]: E1003 14:57:22.030051 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f131e30-c2c9-411c-9bff-620970ba0a23" containerName="extract-utilities" Oct 03 14:57:22 crc kubenswrapper[4962]: I1003 14:57:22.030067 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f131e30-c2c9-411c-9bff-620970ba0a23" containerName="extract-utilities" Oct 03 14:57:22 crc kubenswrapper[4962]: E1003 14:57:22.030090 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f9fe961-45db-48ac-a72e-33146c0079dd" containerName="registry-server" Oct 03 14:57:22 crc kubenswrapper[4962]: I1003 14:57:22.030095 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f9fe961-45db-48ac-a72e-33146c0079dd" containerName="registry-server" Oct 03 14:57:22 crc kubenswrapper[4962]: E1003 14:57:22.030109 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f131e30-c2c9-411c-9bff-620970ba0a23" containerName="extract-content" Oct 03 14:57:22 crc kubenswrapper[4962]: I1003 14:57:22.030115 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f131e30-c2c9-411c-9bff-620970ba0a23" containerName="extract-content" Oct 03 14:57:22 crc kubenswrapper[4962]: E1003 14:57:22.030123 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f9fe961-45db-48ac-a72e-33146c0079dd" containerName="extract-content" Oct 03 14:57:22 crc kubenswrapper[4962]: I1003 14:57:22.030128 4962 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1f9fe961-45db-48ac-a72e-33146c0079dd" containerName="extract-content" Oct 03 14:57:22 crc kubenswrapper[4962]: E1003 14:57:22.030150 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f9fe961-45db-48ac-a72e-33146c0079dd" containerName="extract-utilities" Oct 03 14:57:22 crc kubenswrapper[4962]: I1003 14:57:22.030156 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f9fe961-45db-48ac-a72e-33146c0079dd" containerName="extract-utilities" Oct 03 14:57:22 crc kubenswrapper[4962]: E1003 14:57:22.030173 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f131e30-c2c9-411c-9bff-620970ba0a23" containerName="registry-server" Oct 03 14:57:22 crc kubenswrapper[4962]: I1003 14:57:22.030178 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f131e30-c2c9-411c-9bff-620970ba0a23" containerName="registry-server" Oct 03 14:57:22 crc kubenswrapper[4962]: E1003 14:57:22.030194 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ea37c03-872a-453b-8f55-42d377fa11ad" containerName="download-cache-openstack-openstack-cell1" Oct 03 14:57:22 crc kubenswrapper[4962]: I1003 14:57:22.030199 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ea37c03-872a-453b-8f55-42d377fa11ad" containerName="download-cache-openstack-openstack-cell1" Oct 03 14:57:22 crc kubenswrapper[4962]: I1003 14:57:22.030391 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f131e30-c2c9-411c-9bff-620970ba0a23" containerName="registry-server" Oct 03 14:57:22 crc kubenswrapper[4962]: I1003 14:57:22.030406 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ea37c03-872a-453b-8f55-42d377fa11ad" containerName="download-cache-openstack-openstack-cell1" Oct 03 14:57:22 crc kubenswrapper[4962]: I1003 14:57:22.030419 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f9fe961-45db-48ac-a72e-33146c0079dd" containerName="registry-server" Oct 03 14:57:22 crc kubenswrapper[4962]: I1003 14:57:22.031322 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-xf5xb" Oct 03 14:57:22 crc kubenswrapper[4962]: I1003 14:57:22.033199 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 14:57:22 crc kubenswrapper[4962]: I1003 14:57:22.033386 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 03 14:57:22 crc kubenswrapper[4962]: I1003 14:57:22.033668 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 03 14:57:22 crc kubenswrapper[4962]: I1003 14:57:22.033917 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-98wnm" Oct 03 14:57:22 crc kubenswrapper[4962]: I1003 14:57:22.050737 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-xf5xb"] Oct 03 14:57:22 crc kubenswrapper[4962]: I1003 14:57:22.142581 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3fe030b0-f58c-4b14-b76f-93c6724a902b-ceph\") pod \"configure-network-openstack-openstack-cell1-xf5xb\" (UID: \"3fe030b0-f58c-4b14-b76f-93c6724a902b\") " pod="openstack/configure-network-openstack-openstack-cell1-xf5xb" Oct 03 14:57:22 crc kubenswrapper[4962]: I1003 14:57:22.142655 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3fe030b0-f58c-4b14-b76f-93c6724a902b-ssh-key\") pod \"configure-network-openstack-openstack-cell1-xf5xb\" (UID: \"3fe030b0-f58c-4b14-b76f-93c6724a902b\") " pod="openstack/configure-network-openstack-openstack-cell1-xf5xb" Oct 03 14:57:22 crc kubenswrapper[4962]: I1003 14:57:22.142816 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3fe030b0-f58c-4b14-b76f-93c6724a902b-inventory\") pod \"configure-network-openstack-openstack-cell1-xf5xb\" (UID: \"3fe030b0-f58c-4b14-b76f-93c6724a902b\") " pod="openstack/configure-network-openstack-openstack-cell1-xf5xb" Oct 03 14:57:22 crc kubenswrapper[4962]: I1003 14:57:22.142970 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whpmd\" (UniqueName: \"kubernetes.io/projected/3fe030b0-f58c-4b14-b76f-93c6724a902b-kube-api-access-whpmd\") pod \"configure-network-openstack-openstack-cell1-xf5xb\" (UID: \"3fe030b0-f58c-4b14-b76f-93c6724a902b\") " pod="openstack/configure-network-openstack-openstack-cell1-xf5xb" Oct 03 14:57:22 crc kubenswrapper[4962]: I1003 14:57:22.244481 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3fe030b0-f58c-4b14-b76f-93c6724a902b-ceph\") pod \"configure-network-openstack-openstack-cell1-xf5xb\" (UID: \"3fe030b0-f58c-4b14-b76f-93c6724a902b\") " pod="openstack/configure-network-openstack-openstack-cell1-xf5xb" Oct 03 14:57:22 crc kubenswrapper[4962]: I1003 14:57:22.244555 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3fe030b0-f58c-4b14-b76f-93c6724a902b-ssh-key\") pod \"configure-network-openstack-openstack-cell1-xf5xb\" (UID: \"3fe030b0-f58c-4b14-b76f-93c6724a902b\") " pod="openstack/configure-network-openstack-openstack-cell1-xf5xb" 
Oct 03 14:57:22 crc kubenswrapper[4962]: I1003 14:57:22.244605 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3fe030b0-f58c-4b14-b76f-93c6724a902b-inventory\") pod \"configure-network-openstack-openstack-cell1-xf5xb\" (UID: \"3fe030b0-f58c-4b14-b76f-93c6724a902b\") " pod="openstack/configure-network-openstack-openstack-cell1-xf5xb" Oct 03 14:57:22 crc kubenswrapper[4962]: I1003 14:57:22.244680 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whpmd\" (UniqueName: \"kubernetes.io/projected/3fe030b0-f58c-4b14-b76f-93c6724a902b-kube-api-access-whpmd\") pod \"configure-network-openstack-openstack-cell1-xf5xb\" (UID: \"3fe030b0-f58c-4b14-b76f-93c6724a902b\") " pod="openstack/configure-network-openstack-openstack-cell1-xf5xb" Oct 03 14:57:22 crc kubenswrapper[4962]: I1003 14:57:22.254477 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3fe030b0-f58c-4b14-b76f-93c6724a902b-ssh-key\") pod \"configure-network-openstack-openstack-cell1-xf5xb\" (UID: \"3fe030b0-f58c-4b14-b76f-93c6724a902b\") " pod="openstack/configure-network-openstack-openstack-cell1-xf5xb" Oct 03 14:57:22 crc kubenswrapper[4962]: I1003 14:57:22.254515 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3fe030b0-f58c-4b14-b76f-93c6724a902b-ceph\") pod \"configure-network-openstack-openstack-cell1-xf5xb\" (UID: \"3fe030b0-f58c-4b14-b76f-93c6724a902b\") " pod="openstack/configure-network-openstack-openstack-cell1-xf5xb" Oct 03 14:57:22 crc kubenswrapper[4962]: I1003 14:57:22.259274 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3fe030b0-f58c-4b14-b76f-93c6724a902b-inventory\") pod \"configure-network-openstack-openstack-cell1-xf5xb\" (UID: \"3fe030b0-f58c-4b14-b76f-93c6724a902b\") " pod="openstack/configure-network-openstack-openstack-cell1-xf5xb" Oct 03 14:57:22 crc kubenswrapper[4962]: I1003 14:57:22.260545 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whpmd\" (UniqueName: \"kubernetes.io/projected/3fe030b0-f58c-4b14-b76f-93c6724a902b-kube-api-access-whpmd\") pod \"configure-network-openstack-openstack-cell1-xf5xb\" (UID: \"3fe030b0-f58c-4b14-b76f-93c6724a902b\") " pod="openstack/configure-network-openstack-openstack-cell1-xf5xb" Oct 03 14:57:22 crc kubenswrapper[4962]: I1003 14:57:22.347096 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-xf5xb" Oct 03 14:57:22 crc kubenswrapper[4962]: I1003 14:57:22.920943 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-xf5xb"] Oct 03 14:57:22 crc kubenswrapper[4962]: I1003 14:57:22.946348 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-xf5xb" event={"ID":"3fe030b0-f58c-4b14-b76f-93c6724a902b","Type":"ContainerStarted","Data":"49c7f7b91858b0f8b2c3c1580291987cb637e784d4ad01964ee104c46eb93c75"} Oct 03 14:57:23 crc kubenswrapper[4962]: I1003 14:57:23.956700 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-xf5xb" event={"ID":"3fe030b0-f58c-4b14-b76f-93c6724a902b","Type":"ContainerStarted","Data":"b811696b16ed447817447455604066465d44222a38f7405fa2a50d5182cc4428"} Oct 03 14:57:24 crc kubenswrapper[4962]: I1003 14:57:24.659873 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:57:24 crc kubenswrapper[4962]: I1003 14:57:24.659970 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:57:54 crc kubenswrapper[4962]: I1003 14:57:54.659743 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:57:54 crc kubenswrapper[4962]: I1003 14:57:54.660394 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:57:54 crc kubenswrapper[4962]: I1003 14:57:54.660514 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-46vck" Oct 03 14:57:54 crc kubenswrapper[4962]: I1003 14:57:54.661502 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4a92d28487ead87bf551d56d7eaa01560a202df6439181d7e00d2ba06e2d0351"} pod="openshift-machine-config-operator/machine-config-daemon-46vck" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 14:57:54 crc kubenswrapper[4962]: I1003 14:57:54.661582 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" containerID="cri-o://4a92d28487ead87bf551d56d7eaa01560a202df6439181d7e00d2ba06e2d0351" gracePeriod=600 Oct 03 14:57:55 crc kubenswrapper[4962]: I1003 14:57:55.300772 4962 
generic.go:334] "Generic (PLEG): container finished" podID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerID="4a92d28487ead87bf551d56d7eaa01560a202df6439181d7e00d2ba06e2d0351" exitCode=0 Oct 03 14:57:55 crc kubenswrapper[4962]: I1003 14:57:55.300853 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerDied","Data":"4a92d28487ead87bf551d56d7eaa01560a202df6439181d7e00d2ba06e2d0351"} Oct 03 14:57:55 crc kubenswrapper[4962]: I1003 14:57:55.301754 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerStarted","Data":"9e969c5b4663b9266408ae81a93dc573a565f6f23975eb2be7d5867d2c0fb330"} Oct 03 14:57:55 crc kubenswrapper[4962]: I1003 14:57:55.301792 4962 scope.go:117] "RemoveContainer" containerID="4a67ceef9e53181de3d922469b2925159c518928ed34b001eb82ca061d24874e" Oct 03 14:57:55 crc kubenswrapper[4962]: I1003 14:57:55.322731 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-openstack-openstack-cell1-xf5xb" podStartSLOduration=33.141529798 podStartE2EDuration="33.32271374s" podCreationTimestamp="2025-10-03 14:57:22 +0000 UTC" firstStartedPulling="2025-10-03 14:57:22.927479658 +0000 UTC m=+7651.331377503" lastFinishedPulling="2025-10-03 14:57:23.10866361 +0000 UTC m=+7651.512561445" observedRunningTime="2025-10-03 14:57:23.972588382 +0000 UTC m=+7652.376486237" watchObservedRunningTime="2025-10-03 14:57:55.32271374 +0000 UTC m=+7683.726611575" Oct 03 14:58:42 crc kubenswrapper[4962]: I1003 14:58:42.771495 4962 generic.go:334] "Generic (PLEG): container finished" podID="3fe030b0-f58c-4b14-b76f-93c6724a902b" containerID="b811696b16ed447817447455604066465d44222a38f7405fa2a50d5182cc4428" exitCode=0 Oct 03 14:58:42 crc kubenswrapper[4962]: I1003 14:58:42.772507 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-xf5xb" event={"ID":"3fe030b0-f58c-4b14-b76f-93c6724a902b","Type":"ContainerDied","Data":"b811696b16ed447817447455604066465d44222a38f7405fa2a50d5182cc4428"} Oct 03 14:58:44 crc kubenswrapper[4962]: I1003 14:58:44.320685 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-xf5xb" Oct 03 14:58:44 crc kubenswrapper[4962]: I1003 14:58:44.444451 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whpmd\" (UniqueName: \"kubernetes.io/projected/3fe030b0-f58c-4b14-b76f-93c6724a902b-kube-api-access-whpmd\") pod \"3fe030b0-f58c-4b14-b76f-93c6724a902b\" (UID: \"3fe030b0-f58c-4b14-b76f-93c6724a902b\") " Oct 03 14:58:44 crc kubenswrapper[4962]: I1003 14:58:44.444535 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3fe030b0-f58c-4b14-b76f-93c6724a902b-ceph\") pod \"3fe030b0-f58c-4b14-b76f-93c6724a902b\" (UID: \"3fe030b0-f58c-4b14-b76f-93c6724a902b\") " Oct 03 14:58:44 crc kubenswrapper[4962]: I1003 14:58:44.444677 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3fe030b0-f58c-4b14-b76f-93c6724a902b-ssh-key\") pod \"3fe030b0-f58c-4b14-b76f-93c6724a902b\" (UID: \"3fe030b0-f58c-4b14-b76f-93c6724a902b\") " Oct 03 14:58:44 crc kubenswrapper[4962]: I1003 14:58:44.444740 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3fe030b0-f58c-4b14-b76f-93c6724a902b-inventory\") pod \"3fe030b0-f58c-4b14-b76f-93c6724a902b\" (UID: \"3fe030b0-f58c-4b14-b76f-93c6724a902b\") " Oct 03 14:58:44 crc kubenswrapper[4962]: I1003 14:58:44.451872 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fe030b0-f58c-4b14-b76f-93c6724a902b-ceph" (OuterVolumeSpecName: "ceph") pod "3fe030b0-f58c-4b14-b76f-93c6724a902b" (UID: "3fe030b0-f58c-4b14-b76f-93c6724a902b"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:58:44 crc kubenswrapper[4962]: I1003 14:58:44.453446 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fe030b0-f58c-4b14-b76f-93c6724a902b-kube-api-access-whpmd" (OuterVolumeSpecName: "kube-api-access-whpmd") pod "3fe030b0-f58c-4b14-b76f-93c6724a902b" (UID: "3fe030b0-f58c-4b14-b76f-93c6724a902b"). InnerVolumeSpecName "kube-api-access-whpmd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:58:44 crc kubenswrapper[4962]: I1003 14:58:44.479994 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fe030b0-f58c-4b14-b76f-93c6724a902b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3fe030b0-f58c-4b14-b76f-93c6724a902b" (UID: "3fe030b0-f58c-4b14-b76f-93c6724a902b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:58:44 crc kubenswrapper[4962]: I1003 14:58:44.480080 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fe030b0-f58c-4b14-b76f-93c6724a902b-inventory" (OuterVolumeSpecName: "inventory") pod "3fe030b0-f58c-4b14-b76f-93c6724a902b" (UID: "3fe030b0-f58c-4b14-b76f-93c6724a902b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:58:44 crc kubenswrapper[4962]: I1003 14:58:44.547667 4962 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3fe030b0-f58c-4b14-b76f-93c6724a902b-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 14:58:44 crc kubenswrapper[4962]: I1003 14:58:44.547702 4962 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3fe030b0-f58c-4b14-b76f-93c6724a902b-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 14:58:44 crc kubenswrapper[4962]: I1003 14:58:44.547712 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whpmd\" (UniqueName: \"kubernetes.io/projected/3fe030b0-f58c-4b14-b76f-93c6724a902b-kube-api-access-whpmd\") on node \"crc\" DevicePath \"\"" Oct 03 14:58:44 crc kubenswrapper[4962]: I1003 14:58:44.547722 4962 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3fe030b0-f58c-4b14-b76f-93c6724a902b-ceph\") on node \"crc\" DevicePath \"\"" Oct 03 14:58:44 crc kubenswrapper[4962]: I1003 14:58:44.797893 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-xf5xb" event={"ID":"3fe030b0-f58c-4b14-b76f-93c6724a902b","Type":"ContainerDied","Data":"49c7f7b91858b0f8b2c3c1580291987cb637e784d4ad01964ee104c46eb93c75"} Oct 03 14:58:44 crc kubenswrapper[4962]: I1003 14:58:44.798226 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49c7f7b91858b0f8b2c3c1580291987cb637e784d4ad01964ee104c46eb93c75" Oct 03 14:58:44 crc kubenswrapper[4962]: I1003 14:58:44.798089 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-xf5xb" Oct 03 14:58:44 crc kubenswrapper[4962]: I1003 14:58:44.885319 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-6668s"] Oct 03 14:58:44 crc kubenswrapper[4962]: E1003 14:58:44.885829 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fe030b0-f58c-4b14-b76f-93c6724a902b" containerName="configure-network-openstack-openstack-cell1" Oct 03 14:58:44 crc kubenswrapper[4962]: I1003 14:58:44.885850 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fe030b0-f58c-4b14-b76f-93c6724a902b" containerName="configure-network-openstack-openstack-cell1" Oct 03 14:58:44 crc kubenswrapper[4962]: I1003 14:58:44.886055 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fe030b0-f58c-4b14-b76f-93c6724a902b" containerName="configure-network-openstack-openstack-cell1" Oct 03 14:58:44 crc kubenswrapper[4962]: I1003 14:58:44.886915 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-6668s" Oct 03 14:58:44 crc kubenswrapper[4962]: I1003 14:58:44.892086 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 14:58:44 crc kubenswrapper[4962]: I1003 14:58:44.892185 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 03 14:58:44 crc kubenswrapper[4962]: I1003 14:58:44.892256 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-98wnm" Oct 03 14:58:44 crc kubenswrapper[4962]: I1003 14:58:44.892964 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 03 14:58:44 crc kubenswrapper[4962]: I1003 14:58:44.911379 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-6668s"] Oct 03 14:58:45 crc kubenswrapper[4962]: I1003 14:58:45.057111 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae303f19-3403-4fc8-93fd-fcb48745e42f-inventory\") pod \"validate-network-openstack-openstack-cell1-6668s\" (UID: \"ae303f19-3403-4fc8-93fd-fcb48745e42f\") " pod="openstack/validate-network-openstack-openstack-cell1-6668s" Oct 03 14:58:45 crc kubenswrapper[4962]: I1003 14:58:45.057250 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82xnd\" (UniqueName: \"kubernetes.io/projected/ae303f19-3403-4fc8-93fd-fcb48745e42f-kube-api-access-82xnd\") pod \"validate-network-openstack-openstack-cell1-6668s\" (UID: \"ae303f19-3403-4fc8-93fd-fcb48745e42f\") " pod="openstack/validate-network-openstack-openstack-cell1-6668s" Oct 03 14:58:45 crc kubenswrapper[4962]: I1003 14:58:45.057718 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ae303f19-3403-4fc8-93fd-fcb48745e42f-ceph\") pod \"validate-network-openstack-openstack-cell1-6668s\" (UID: \"ae303f19-3403-4fc8-93fd-fcb48745e42f\") " pod="openstack/validate-network-openstack-openstack-cell1-6668s" Oct 03 14:58:45 crc kubenswrapper[4962]: I1003 14:58:45.058145 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ae303f19-3403-4fc8-93fd-fcb48745e42f-ssh-key\") pod \"validate-network-openstack-openstack-cell1-6668s\" (UID: \"ae303f19-3403-4fc8-93fd-fcb48745e42f\") " pod="openstack/validate-network-openstack-openstack-cell1-6668s" Oct 03 14:58:45 crc kubenswrapper[4962]: I1003 14:58:45.160193 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae303f19-3403-4fc8-93fd-fcb48745e42f-inventory\") pod \"validate-network-openstack-openstack-cell1-6668s\" (UID: \"ae303f19-3403-4fc8-93fd-fcb48745e42f\") " pod="openstack/validate-network-openstack-openstack-cell1-6668s" Oct 03 14:58:45 crc kubenswrapper[4962]: I1003 14:58:45.160269 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82xnd\" (UniqueName: \"kubernetes.io/projected/ae303f19-3403-4fc8-93fd-fcb48745e42f-kube-api-access-82xnd\") pod \"validate-network-openstack-openstack-cell1-6668s\" (UID: \"ae303f19-3403-4fc8-93fd-fcb48745e42f\") " 
pod="openstack/validate-network-openstack-openstack-cell1-6668s" Oct 03 14:58:45 crc kubenswrapper[4962]: I1003 14:58:45.160322 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ae303f19-3403-4fc8-93fd-fcb48745e42f-ceph\") pod \"validate-network-openstack-openstack-cell1-6668s\" (UID: \"ae303f19-3403-4fc8-93fd-fcb48745e42f\") " pod="openstack/validate-network-openstack-openstack-cell1-6668s" Oct 03 14:58:45 crc kubenswrapper[4962]: I1003 14:58:45.160383 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ae303f19-3403-4fc8-93fd-fcb48745e42f-ssh-key\") pod \"validate-network-openstack-openstack-cell1-6668s\" (UID: \"ae303f19-3403-4fc8-93fd-fcb48745e42f\") " pod="openstack/validate-network-openstack-openstack-cell1-6668s" Oct 03 14:58:45 crc kubenswrapper[4962]: I1003 14:58:45.169119 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ae303f19-3403-4fc8-93fd-fcb48745e42f-ceph\") pod \"validate-network-openstack-openstack-cell1-6668s\" (UID: \"ae303f19-3403-4fc8-93fd-fcb48745e42f\") " pod="openstack/validate-network-openstack-openstack-cell1-6668s" Oct 03 14:58:45 crc kubenswrapper[4962]: I1003 14:58:45.169208 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ae303f19-3403-4fc8-93fd-fcb48745e42f-ssh-key\") pod \"validate-network-openstack-openstack-cell1-6668s\" (UID: \"ae303f19-3403-4fc8-93fd-fcb48745e42f\") " pod="openstack/validate-network-openstack-openstack-cell1-6668s" Oct 03 14:58:45 crc kubenswrapper[4962]: I1003 14:58:45.171314 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae303f19-3403-4fc8-93fd-fcb48745e42f-inventory\") pod \"validate-network-openstack-openstack-cell1-6668s\" (UID: \"ae303f19-3403-4fc8-93fd-fcb48745e42f\") " pod="openstack/validate-network-openstack-openstack-cell1-6668s" Oct 03 14:58:45 crc kubenswrapper[4962]: I1003 14:58:45.180502 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82xnd\" (UniqueName: \"kubernetes.io/projected/ae303f19-3403-4fc8-93fd-fcb48745e42f-kube-api-access-82xnd\") pod \"validate-network-openstack-openstack-cell1-6668s\" (UID: \"ae303f19-3403-4fc8-93fd-fcb48745e42f\") " pod="openstack/validate-network-openstack-openstack-cell1-6668s" Oct 03 14:58:45 crc kubenswrapper[4962]: I1003 14:58:45.205385 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-6668s" Oct 03 14:58:45 crc kubenswrapper[4962]: I1003 14:58:45.771408 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-6668s"] Oct 03 14:58:45 crc kubenswrapper[4962]: I1003 14:58:45.781026 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 14:58:45 crc kubenswrapper[4962]: I1003 14:58:45.810572 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-6668s" event={"ID":"ae303f19-3403-4fc8-93fd-fcb48745e42f","Type":"ContainerStarted","Data":"c2099d16a65964b0a3e7809c30bfffa64b961156a46ac770988b06938ff87e09"} Oct 03 14:58:46 crc kubenswrapper[4962]: I1003 14:58:46.820887 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-6668s" event={"ID":"ae303f19-3403-4fc8-93fd-fcb48745e42f","Type":"ContainerStarted","Data":"2361374ef9b9b31e11c555c2fa566522cf91608eee930fee0a37a5254e51ec05"} Oct 03 14:58:46 crc kubenswrapper[4962]: I1003 14:58:46.846754 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-openstack-openstack-cell1-6668s" podStartSLOduration=2.689572638 podStartE2EDuration="2.84672177s" podCreationTimestamp="2025-10-03 14:58:44 +0000 UTC" firstStartedPulling="2025-10-03 14:58:45.780819111 +0000 UTC m=+7734.184716936" lastFinishedPulling="2025-10-03 14:58:45.937968233 +0000 UTC m=+7734.341866068" observedRunningTime="2025-10-03 14:58:46.835382698 +0000 UTC m=+7735.239280543" watchObservedRunningTime="2025-10-03 14:58:46.84672177 +0000 UTC m=+7735.250619615" Oct 03 14:58:50 crc kubenswrapper[4962]: I1003 14:58:50.855909 4962 generic.go:334] "Generic (PLEG): container finished" podID="ae303f19-3403-4fc8-93fd-fcb48745e42f" containerID="2361374ef9b9b31e11c555c2fa566522cf91608eee930fee0a37a5254e51ec05" exitCode=0 Oct 03 14:58:50 crc kubenswrapper[4962]: I1003 14:58:50.856051 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-6668s" event={"ID":"ae303f19-3403-4fc8-93fd-fcb48745e42f","Type":"ContainerDied","Data":"2361374ef9b9b31e11c555c2fa566522cf91608eee930fee0a37a5254e51ec05"} Oct 03 14:58:52 crc kubenswrapper[4962]: I1003 14:58:52.318569 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-6668s" Oct 03 14:58:52 crc kubenswrapper[4962]: I1003 14:58:52.415348 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82xnd\" (UniqueName: \"kubernetes.io/projected/ae303f19-3403-4fc8-93fd-fcb48745e42f-kube-api-access-82xnd\") pod \"ae303f19-3403-4fc8-93fd-fcb48745e42f\" (UID: \"ae303f19-3403-4fc8-93fd-fcb48745e42f\") " Oct 03 14:58:52 crc kubenswrapper[4962]: I1003 14:58:52.415574 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ae303f19-3403-4fc8-93fd-fcb48745e42f-ceph\") pod \"ae303f19-3403-4fc8-93fd-fcb48745e42f\" (UID: \"ae303f19-3403-4fc8-93fd-fcb48745e42f\") " Oct 03 14:58:52 crc kubenswrapper[4962]: I1003 14:58:52.415625 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ae303f19-3403-4fc8-93fd-fcb48745e42f-ssh-key\") pod \"ae303f19-3403-4fc8-93fd-fcb48745e42f\" (UID: \"ae303f19-3403-4fc8-93fd-fcb48745e42f\") " Oct 03 14:58:52 crc kubenswrapper[4962]: I1003 14:58:52.415802 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae303f19-3403-4fc8-93fd-fcb48745e42f-inventory\") pod \"ae303f19-3403-4fc8-93fd-fcb48745e42f\" (UID: \"ae303f19-3403-4fc8-93fd-fcb48745e42f\") " Oct 03 14:58:52 crc kubenswrapper[4962]: I1003 14:58:52.421017 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae303f19-3403-4fc8-93fd-fcb48745e42f-kube-api-access-82xnd" (OuterVolumeSpecName: "kube-api-access-82xnd") pod "ae303f19-3403-4fc8-93fd-fcb48745e42f" (UID: "ae303f19-3403-4fc8-93fd-fcb48745e42f"). InnerVolumeSpecName "kube-api-access-82xnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:58:52 crc kubenswrapper[4962]: I1003 14:58:52.422533 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae303f19-3403-4fc8-93fd-fcb48745e42f-ceph" (OuterVolumeSpecName: "ceph") pod "ae303f19-3403-4fc8-93fd-fcb48745e42f" (UID: "ae303f19-3403-4fc8-93fd-fcb48745e42f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:58:52 crc kubenswrapper[4962]: I1003 14:58:52.449615 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae303f19-3403-4fc8-93fd-fcb48745e42f-inventory" (OuterVolumeSpecName: "inventory") pod "ae303f19-3403-4fc8-93fd-fcb48745e42f" (UID: "ae303f19-3403-4fc8-93fd-fcb48745e42f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:58:52 crc kubenswrapper[4962]: I1003 14:58:52.450866 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae303f19-3403-4fc8-93fd-fcb48745e42f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ae303f19-3403-4fc8-93fd-fcb48745e42f" (UID: "ae303f19-3403-4fc8-93fd-fcb48745e42f"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:58:52 crc kubenswrapper[4962]: I1003 14:58:52.518257 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82xnd\" (UniqueName: \"kubernetes.io/projected/ae303f19-3403-4fc8-93fd-fcb48745e42f-kube-api-access-82xnd\") on node \"crc\" DevicePath \"\"" Oct 03 14:58:52 crc kubenswrapper[4962]: I1003 14:58:52.518331 4962 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ae303f19-3403-4fc8-93fd-fcb48745e42f-ceph\") on node \"crc\" DevicePath \"\"" Oct 03 14:58:52 crc kubenswrapper[4962]: I1003 14:58:52.518353 4962 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ae303f19-3403-4fc8-93fd-fcb48745e42f-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 14:58:52 crc kubenswrapper[4962]: I1003 14:58:52.518361 4962 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae303f19-3403-4fc8-93fd-fcb48745e42f-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 14:58:52 crc kubenswrapper[4962]: I1003 14:58:52.874957 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-6668s" event={"ID":"ae303f19-3403-4fc8-93fd-fcb48745e42f","Type":"ContainerDied","Data":"c2099d16a65964b0a3e7809c30bfffa64b961156a46ac770988b06938ff87e09"} Oct 03 14:58:52 crc kubenswrapper[4962]: I1003 14:58:52.875416 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2099d16a65964b0a3e7809c30bfffa64b961156a46ac770988b06938ff87e09" Oct 03 14:58:52 crc kubenswrapper[4962]: I1003 14:58:52.875037 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-6668s" Oct 03 14:58:52 crc kubenswrapper[4962]: I1003 14:58:52.938097 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-openstack-openstack-cell1-6svh8"] Oct 03 14:58:52 crc kubenswrapper[4962]: E1003 14:58:52.938497 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae303f19-3403-4fc8-93fd-fcb48745e42f" containerName="validate-network-openstack-openstack-cell1" Oct 03 14:58:52 crc kubenswrapper[4962]: I1003 14:58:52.938514 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae303f19-3403-4fc8-93fd-fcb48745e42f" containerName="validate-network-openstack-openstack-cell1" Oct 03 14:58:52 crc kubenswrapper[4962]: I1003 14:58:52.938759 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae303f19-3403-4fc8-93fd-fcb48745e42f" containerName="validate-network-openstack-openstack-cell1" Oct 03 14:58:52 crc kubenswrapper[4962]: I1003 14:58:52.939488 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-6svh8" Oct 03 14:58:52 crc kubenswrapper[4962]: I1003 14:58:52.942762 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 14:58:52 crc kubenswrapper[4962]: I1003 14:58:52.942875 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-98wnm" Oct 03 14:58:52 crc kubenswrapper[4962]: I1003 14:58:52.942970 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 03 14:58:52 crc kubenswrapper[4962]: I1003 14:58:52.942773 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 03 14:58:52 crc kubenswrapper[4962]: I1003 14:58:52.947192 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-6svh8"] Oct 03 14:58:53 crc kubenswrapper[4962]: I1003 14:58:53.026869 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc9g2\" (UniqueName: \"kubernetes.io/projected/025f3d9b-acea-44a1-9923-c0a471aedba2-kube-api-access-sc9g2\") pod \"install-os-openstack-openstack-cell1-6svh8\" (UID: \"025f3d9b-acea-44a1-9923-c0a471aedba2\") " pod="openstack/install-os-openstack-openstack-cell1-6svh8" Oct 03 14:58:53 crc kubenswrapper[4962]: I1003 14:58:53.026925 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/025f3d9b-acea-44a1-9923-c0a471aedba2-inventory\") pod \"install-os-openstack-openstack-cell1-6svh8\" (UID: \"025f3d9b-acea-44a1-9923-c0a471aedba2\") " pod="openstack/install-os-openstack-openstack-cell1-6svh8" Oct 03 14:58:53 crc kubenswrapper[4962]: I1003 14:58:53.026957 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/025f3d9b-acea-44a1-9923-c0a471aedba2-ceph\") pod \"install-os-openstack-openstack-cell1-6svh8\" (UID: \"025f3d9b-acea-44a1-9923-c0a471aedba2\") " pod="openstack/install-os-openstack-openstack-cell1-6svh8" Oct 03 14:58:53 crc kubenswrapper[4962]: I1003 14:58:53.027036 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/025f3d9b-acea-44a1-9923-c0a471aedba2-ssh-key\") pod \"install-os-openstack-openstack-cell1-6svh8\" (UID: \"025f3d9b-acea-44a1-9923-c0a471aedba2\") " pod="openstack/install-os-openstack-openstack-cell1-6svh8" Oct 03 14:58:53 crc kubenswrapper[4962]: I1003 14:58:53.128966 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/025f3d9b-acea-44a1-9923-c0a471aedba2-inventory\") pod \"install-os-openstack-openstack-cell1-6svh8\" (UID: \"025f3d9b-acea-44a1-9923-c0a471aedba2\") " pod="openstack/install-os-openstack-openstack-cell1-6svh8" Oct 03 14:58:53 crc kubenswrapper[4962]: I1003 14:58:53.129024 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/025f3d9b-acea-44a1-9923-c0a471aedba2-ceph\") pod \"install-os-openstack-openstack-cell1-6svh8\" (UID: \"025f3d9b-acea-44a1-9923-c0a471aedba2\") " pod="openstack/install-os-openstack-openstack-cell1-6svh8" Oct 03 14:58:53 crc kubenswrapper[4962]: I1003 14:58:53.129133 4962 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/025f3d9b-acea-44a1-9923-c0a471aedba2-ssh-key\") pod \"install-os-openstack-openstack-cell1-6svh8\" (UID: \"025f3d9b-acea-44a1-9923-c0a471aedba2\") " pod="openstack/install-os-openstack-openstack-cell1-6svh8" Oct 03 14:58:53 crc kubenswrapper[4962]: I1003 14:58:53.129258 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc9g2\" (UniqueName: \"kubernetes.io/projected/025f3d9b-acea-44a1-9923-c0a471aedba2-kube-api-access-sc9g2\") pod \"install-os-openstack-openstack-cell1-6svh8\" (UID: \"025f3d9b-acea-44a1-9923-c0a471aedba2\") " pod="openstack/install-os-openstack-openstack-cell1-6svh8" Oct 03 14:58:53 crc kubenswrapper[4962]: I1003 14:58:53.133710 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/025f3d9b-acea-44a1-9923-c0a471aedba2-ceph\") pod \"install-os-openstack-openstack-cell1-6svh8\" (UID: \"025f3d9b-acea-44a1-9923-c0a471aedba2\") " pod="openstack/install-os-openstack-openstack-cell1-6svh8" Oct 03 14:58:53 crc kubenswrapper[4962]: I1003 14:58:53.134034 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/025f3d9b-acea-44a1-9923-c0a471aedba2-inventory\") pod \"install-os-openstack-openstack-cell1-6svh8\" (UID: \"025f3d9b-acea-44a1-9923-c0a471aedba2\") " pod="openstack/install-os-openstack-openstack-cell1-6svh8" Oct 03 14:58:53 crc kubenswrapper[4962]: I1003 14:58:53.135781 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/025f3d9b-acea-44a1-9923-c0a471aedba2-ssh-key\") pod \"install-os-openstack-openstack-cell1-6svh8\" (UID: \"025f3d9b-acea-44a1-9923-c0a471aedba2\") " pod="openstack/install-os-openstack-openstack-cell1-6svh8" Oct 03 14:58:53 crc kubenswrapper[4962]: I1003 14:58:53.145845 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc9g2\" (UniqueName: \"kubernetes.io/projected/025f3d9b-acea-44a1-9923-c0a471aedba2-kube-api-access-sc9g2\") pod \"install-os-openstack-openstack-cell1-6svh8\" (UID: \"025f3d9b-acea-44a1-9923-c0a471aedba2\") " pod="openstack/install-os-openstack-openstack-cell1-6svh8" Oct 03 14:58:53 crc kubenswrapper[4962]: I1003 14:58:53.255118 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-6svh8" Oct 03 14:58:53 crc kubenswrapper[4962]: I1003 14:58:53.822688 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-6svh8"] Oct 03 14:58:53 crc kubenswrapper[4962]: I1003 14:58:53.889353 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-6svh8" event={"ID":"025f3d9b-acea-44a1-9923-c0a471aedba2","Type":"ContainerStarted","Data":"ee64731c8c460b9a46ca0268feaf842ac9826fba55c7e75cf8239536740f652f"} Oct 03 14:58:54 crc kubenswrapper[4962]: I1003 14:58:54.899755 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-6svh8" event={"ID":"025f3d9b-acea-44a1-9923-c0a471aedba2","Type":"ContainerStarted","Data":"515b172eeccc14fdaca25a46cfb4112c3cdecb2a30a01418665a0b809166e5de"} Oct 03 14:58:54 crc kubenswrapper[4962]: I1003 14:58:54.960844 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-openstack-openstack-cell1-6svh8" podStartSLOduration=2.817610339 podStartE2EDuration="2.960823899s" podCreationTimestamp="2025-10-03 14:58:52 +0000 UTC" firstStartedPulling="2025-10-03 14:58:53.8289136 +0000 UTC m=+7742.232811435" lastFinishedPulling="2025-10-03 14:58:53.97212716 +0000 UTC m=+7742.376024995" observedRunningTime="2025-10-03 14:58:54.927418228 +0000 UTC m=+7743.331316083" watchObservedRunningTime="2025-10-03 14:58:54.960823899 +0000 UTC m=+7743.364721734" Oct 03 14:59:36 crc kubenswrapper[4962]: I1003 14:59:36.293701 4962 generic.go:334] "Generic (PLEG): container finished" podID="025f3d9b-acea-44a1-9923-c0a471aedba2" containerID="515b172eeccc14fdaca25a46cfb4112c3cdecb2a30a01418665a0b809166e5de" exitCode=0 Oct 03 14:59:36 crc kubenswrapper[4962]: I1003 14:59:36.293816 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-6svh8" event={"ID":"025f3d9b-acea-44a1-9923-c0a471aedba2","Type":"ContainerDied","Data":"515b172eeccc14fdaca25a46cfb4112c3cdecb2a30a01418665a0b809166e5de"} Oct 03 14:59:37 crc kubenswrapper[4962]: I1003 14:59:37.758392 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-6svh8" Oct 03 14:59:37 crc kubenswrapper[4962]: I1003 14:59:37.803628 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/025f3d9b-acea-44a1-9923-c0a471aedba2-ceph\") pod \"025f3d9b-acea-44a1-9923-c0a471aedba2\" (UID: \"025f3d9b-acea-44a1-9923-c0a471aedba2\") " Oct 03 14:59:37 crc kubenswrapper[4962]: I1003 14:59:37.803805 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/025f3d9b-acea-44a1-9923-c0a471aedba2-inventory\") pod \"025f3d9b-acea-44a1-9923-c0a471aedba2\" (UID: \"025f3d9b-acea-44a1-9923-c0a471aedba2\") " Oct 03 14:59:37 crc kubenswrapper[4962]: I1003 14:59:37.804003 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sc9g2\" (UniqueName: \"kubernetes.io/projected/025f3d9b-acea-44a1-9923-c0a471aedba2-kube-api-access-sc9g2\") pod \"025f3d9b-acea-44a1-9923-c0a471aedba2\" (UID: \"025f3d9b-acea-44a1-9923-c0a471aedba2\") " Oct 03 14:59:37 crc kubenswrapper[4962]: I1003 14:59:37.804061 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/025f3d9b-acea-44a1-9923-c0a471aedba2-ssh-key\") pod \"025f3d9b-acea-44a1-9923-c0a471aedba2\" (UID: \"025f3d9b-acea-44a1-9923-c0a471aedba2\") " Oct 03 14:59:37 crc kubenswrapper[4962]: I1003 14:59:37.809401 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/025f3d9b-acea-44a1-9923-c0a471aedba2-ceph" (OuterVolumeSpecName: "ceph") pod "025f3d9b-acea-44a1-9923-c0a471aedba2" (UID: "025f3d9b-acea-44a1-9923-c0a471aedba2"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:59:37 crc kubenswrapper[4962]: I1003 14:59:37.810574 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/025f3d9b-acea-44a1-9923-c0a471aedba2-kube-api-access-sc9g2" (OuterVolumeSpecName: "kube-api-access-sc9g2") pod "025f3d9b-acea-44a1-9923-c0a471aedba2" (UID: "025f3d9b-acea-44a1-9923-c0a471aedba2"). InnerVolumeSpecName "kube-api-access-sc9g2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:59:37 crc kubenswrapper[4962]: I1003 14:59:37.836199 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/025f3d9b-acea-44a1-9923-c0a471aedba2-inventory" (OuterVolumeSpecName: "inventory") pod "025f3d9b-acea-44a1-9923-c0a471aedba2" (UID: "025f3d9b-acea-44a1-9923-c0a471aedba2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:59:37 crc kubenswrapper[4962]: I1003 14:59:37.838483 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/025f3d9b-acea-44a1-9923-c0a471aedba2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "025f3d9b-acea-44a1-9923-c0a471aedba2" (UID: "025f3d9b-acea-44a1-9923-c0a471aedba2"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:59:37 crc kubenswrapper[4962]: I1003 14:59:37.907545 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sc9g2\" (UniqueName: \"kubernetes.io/projected/025f3d9b-acea-44a1-9923-c0a471aedba2-kube-api-access-sc9g2\") on node \"crc\" DevicePath \"\"" Oct 03 14:59:37 crc kubenswrapper[4962]: I1003 14:59:37.907577 4962 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/025f3d9b-acea-44a1-9923-c0a471aedba2-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 14:59:37 crc kubenswrapper[4962]: I1003 14:59:37.907587 4962 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/025f3d9b-acea-44a1-9923-c0a471aedba2-ceph\") on node \"crc\" DevicePath \"\"" Oct 03 14:59:37 crc kubenswrapper[4962]: I1003 14:59:37.907595 4962 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/025f3d9b-acea-44a1-9923-c0a471aedba2-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 14:59:38 crc kubenswrapper[4962]: I1003 14:59:38.318500 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-6svh8" event={"ID":"025f3d9b-acea-44a1-9923-c0a471aedba2","Type":"ContainerDied","Data":"ee64731c8c460b9a46ca0268feaf842ac9826fba55c7e75cf8239536740f652f"} Oct 03 14:59:38 crc kubenswrapper[4962]: I1003 14:59:38.318849 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee64731c8c460b9a46ca0268feaf842ac9826fba55c7e75cf8239536740f652f" Oct 03 14:59:38 crc kubenswrapper[4962]: I1003 14:59:38.318563 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-6svh8" Oct 03 14:59:38 crc kubenswrapper[4962]: I1003 14:59:38.396316 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-ftl9x"] Oct 03 14:59:38 crc kubenswrapper[4962]: E1003 14:59:38.396826 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="025f3d9b-acea-44a1-9923-c0a471aedba2" containerName="install-os-openstack-openstack-cell1" Oct 03 14:59:38 crc kubenswrapper[4962]: I1003 14:59:38.396846 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="025f3d9b-acea-44a1-9923-c0a471aedba2" containerName="install-os-openstack-openstack-cell1" Oct 03 14:59:38 crc kubenswrapper[4962]: I1003 14:59:38.397063 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="025f3d9b-acea-44a1-9923-c0a471aedba2" containerName="install-os-openstack-openstack-cell1" Oct 03 14:59:38 crc kubenswrapper[4962]: I1003 14:59:38.398125 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-ftl9x" Oct 03 14:59:38 crc kubenswrapper[4962]: I1003 14:59:38.400861 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 03 14:59:38 crc kubenswrapper[4962]: I1003 14:59:38.401536 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 14:59:38 crc kubenswrapper[4962]: I1003 14:59:38.401694 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-98wnm" Oct 03 14:59:38 crc kubenswrapper[4962]: I1003 14:59:38.411726 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-ftl9x"] Oct 03 14:59:38 crc kubenswrapper[4962]: I1003 14:59:38.414699 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 03 14:59:38 crc kubenswrapper[4962]: I1003 14:59:38.415497 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7rlf\" (UniqueName: \"kubernetes.io/projected/bdffe428-c92a-4343-9f35-fd522846891a-kube-api-access-c7rlf\") pod \"configure-os-openstack-openstack-cell1-ftl9x\" (UID: \"bdffe428-c92a-4343-9f35-fd522846891a\") " pod="openstack/configure-os-openstack-openstack-cell1-ftl9x" Oct 03 14:59:38 crc kubenswrapper[4962]: I1003 14:59:38.415727 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bdffe428-c92a-4343-9f35-fd522846891a-ceph\") pod \"configure-os-openstack-openstack-cell1-ftl9x\" (UID: \"bdffe428-c92a-4343-9f35-fd522846891a\") " pod="openstack/configure-os-openstack-openstack-cell1-ftl9x" Oct 03 14:59:38 crc kubenswrapper[4962]: I1003 14:59:38.415867 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bdffe428-c92a-4343-9f35-fd522846891a-ssh-key\") pod \"configure-os-openstack-openstack-cell1-ftl9x\" (UID: \"bdffe428-c92a-4343-9f35-fd522846891a\") " pod="openstack/configure-os-openstack-openstack-cell1-ftl9x" Oct 03 14:59:38 crc kubenswrapper[4962]: I1003 14:59:38.416122 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bdffe428-c92a-4343-9f35-fd522846891a-inventory\") pod \"configure-os-openstack-openstack-cell1-ftl9x\" (UID: \"bdffe428-c92a-4343-9f35-fd522846891a\") " pod="openstack/configure-os-openstack-openstack-cell1-ftl9x" Oct 03 14:59:38 crc kubenswrapper[4962]: I1003 14:59:38.518077 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7rlf\" (UniqueName: \"kubernetes.io/projected/bdffe428-c92a-4343-9f35-fd522846891a-kube-api-access-c7rlf\") pod \"configure-os-openstack-openstack-cell1-ftl9x\" (UID: \"bdffe428-c92a-4343-9f35-fd522846891a\") " pod="openstack/configure-os-openstack-openstack-cell1-ftl9x" Oct 03 14:59:38 crc kubenswrapper[4962]: I1003 14:59:38.518153 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bdffe428-c92a-4343-9f35-fd522846891a-ceph\") pod \"configure-os-openstack-openstack-cell1-ftl9x\" (UID: \"bdffe428-c92a-4343-9f35-fd522846891a\") " pod="openstack/configure-os-openstack-openstack-cell1-ftl9x" Oct 03 14:59:38 crc 
kubenswrapper[4962]: I1003 14:59:38.518220 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bdffe428-c92a-4343-9f35-fd522846891a-ssh-key\") pod \"configure-os-openstack-openstack-cell1-ftl9x\" (UID: \"bdffe428-c92a-4343-9f35-fd522846891a\") " pod="openstack/configure-os-openstack-openstack-cell1-ftl9x" Oct 03 14:59:38 crc kubenswrapper[4962]: I1003 14:59:38.518333 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bdffe428-c92a-4343-9f35-fd522846891a-inventory\") pod \"configure-os-openstack-openstack-cell1-ftl9x\" (UID: \"bdffe428-c92a-4343-9f35-fd522846891a\") " pod="openstack/configure-os-openstack-openstack-cell1-ftl9x" Oct 03 14:59:38 crc kubenswrapper[4962]: I1003 14:59:38.524169 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bdffe428-c92a-4343-9f35-fd522846891a-ssh-key\") pod \"configure-os-openstack-openstack-cell1-ftl9x\" (UID: \"bdffe428-c92a-4343-9f35-fd522846891a\") " pod="openstack/configure-os-openstack-openstack-cell1-ftl9x" Oct 03 14:59:38 crc kubenswrapper[4962]: I1003 14:59:38.526951 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bdffe428-c92a-4343-9f35-fd522846891a-ceph\") pod \"configure-os-openstack-openstack-cell1-ftl9x\" (UID: \"bdffe428-c92a-4343-9f35-fd522846891a\") " pod="openstack/configure-os-openstack-openstack-cell1-ftl9x" Oct 03 14:59:38 crc kubenswrapper[4962]: I1003 14:59:38.536186 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7rlf\" (UniqueName: \"kubernetes.io/projected/bdffe428-c92a-4343-9f35-fd522846891a-kube-api-access-c7rlf\") pod \"configure-os-openstack-openstack-cell1-ftl9x\" (UID: \"bdffe428-c92a-4343-9f35-fd522846891a\") " pod="openstack/configure-os-openstack-openstack-cell1-ftl9x" Oct 03 14:59:38 crc kubenswrapper[4962]: I1003 14:59:38.542768 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bdffe428-c92a-4343-9f35-fd522846891a-inventory\") pod \"configure-os-openstack-openstack-cell1-ftl9x\" (UID: \"bdffe428-c92a-4343-9f35-fd522846891a\") " pod="openstack/configure-os-openstack-openstack-cell1-ftl9x" Oct 03 14:59:38 crc kubenswrapper[4962]: I1003 14:59:38.776698 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-ftl9x" Oct 03 14:59:39 crc kubenswrapper[4962]: I1003 14:59:39.351525 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-ftl9x"] Oct 03 14:59:40 crc kubenswrapper[4962]: I1003 14:59:40.339401 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-ftl9x" event={"ID":"bdffe428-c92a-4343-9f35-fd522846891a","Type":"ContainerStarted","Data":"7b54ccc6db7a824fc47d12af1660a3cb066a04b9281f11d632dee162a4396fbc"} Oct 03 14:59:40 crc kubenswrapper[4962]: I1003 14:59:40.340676 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-ftl9x" event={"ID":"bdffe428-c92a-4343-9f35-fd522846891a","Type":"ContainerStarted","Data":"97c5049f474647925ab4a790e086e624bc46cf0ba97c3fdefb588333a970824f"} Oct 03 14:59:40 crc kubenswrapper[4962]: I1003 14:59:40.364903 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-openstack-openstack-cell1-ftl9x" podStartSLOduration=2.191281576 podStartE2EDuration="2.364888707s" podCreationTimestamp="2025-10-03 14:59:38 +0000 UTC" firstStartedPulling="2025-10-03 14:59:39.349083904 +0000 UTC m=+7787.752981739" lastFinishedPulling="2025-10-03 14:59:39.522691035 +0000 UTC m=+7787.926588870" observedRunningTime="2025-10-03 14:59:40.355665211 +0000 UTC m=+7788.759563076" watchObservedRunningTime="2025-10-03 14:59:40.364888707 +0000 UTC m=+7788.768786542" Oct 03 14:59:49 crc kubenswrapper[4962]: I1003 14:59:49.153823 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4bdnx"] Oct 03 14:59:49 crc kubenswrapper[4962]: I1003 14:59:49.157168 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4bdnx" Oct 03 14:59:49 crc kubenswrapper[4962]: I1003 14:59:49.167008 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4bdnx"] Oct 03 14:59:49 crc kubenswrapper[4962]: I1003 14:59:49.241570 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9-utilities\") pod \"redhat-operators-4bdnx\" (UID: \"d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9\") " pod="openshift-marketplace/redhat-operators-4bdnx" Oct 03 14:59:49 crc kubenswrapper[4962]: I1003 14:59:49.241600 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9-catalog-content\") pod \"redhat-operators-4bdnx\" (UID: \"d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9\") " pod="openshift-marketplace/redhat-operators-4bdnx" Oct 03 14:59:49 crc kubenswrapper[4962]: I1003 14:59:49.241654 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq7gn\" (UniqueName: \"kubernetes.io/projected/d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9-kube-api-access-mq7gn\") pod \"redhat-operators-4bdnx\" (UID: \"d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9\") " pod="openshift-marketplace/redhat-operators-4bdnx" Oct 03 14:59:49 crc kubenswrapper[4962]: I1003 14:59:49.343191 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9-utilities\") pod \"redhat-operators-4bdnx\" (UID: \"d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9\") " pod="openshift-marketplace/redhat-operators-4bdnx" Oct 03 14:59:49 crc kubenswrapper[4962]: I1003 14:59:49.343229 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9-catalog-content\") pod \"redhat-operators-4bdnx\" (UID: \"d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9\") " pod="openshift-marketplace/redhat-operators-4bdnx" Oct 03 14:59:49 crc kubenswrapper[4962]: I1003 14:59:49.343264 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mq7gn\" (UniqueName: \"kubernetes.io/projected/d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9-kube-api-access-mq7gn\") pod \"redhat-operators-4bdnx\" (UID: \"d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9\") " pod="openshift-marketplace/redhat-operators-4bdnx" Oct 03 14:59:49 crc kubenswrapper[4962]: I1003 14:59:49.344108 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9-catalog-content\") pod \"redhat-operators-4bdnx\" (UID: \"d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9\") " pod="openshift-marketplace/redhat-operators-4bdnx" Oct 03 14:59:49 crc kubenswrapper[4962]: I1003 14:59:49.344306 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9-utilities\") pod \"redhat-operators-4bdnx\" (UID: \"d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9\") " pod="openshift-marketplace/redhat-operators-4bdnx" Oct 03 14:59:49 crc kubenswrapper[4962]: I1003 14:59:49.363234 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-mq7gn\" (UniqueName: \"kubernetes.io/projected/d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9-kube-api-access-mq7gn\") pod \"redhat-operators-4bdnx\" (UID: \"d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9\") " pod="openshift-marketplace/redhat-operators-4bdnx" Oct 03 14:59:49 crc kubenswrapper[4962]: I1003 14:59:49.480625 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4bdnx" Oct 03 14:59:49 crc kubenswrapper[4962]: I1003 14:59:49.997962 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4bdnx"] Oct 03 14:59:50 crc kubenswrapper[4962]: I1003 14:59:50.421260 4962 generic.go:334] "Generic (PLEG): container finished" podID="d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9" containerID="1d1084054e5cb805f5b7ad0ffda29c9cfd21d030ef8213c23a5db3bbb1a6fda9" exitCode=0 Oct 03 14:59:50 crc kubenswrapper[4962]: I1003 14:59:50.421314 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4bdnx" event={"ID":"d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9","Type":"ContainerDied","Data":"1d1084054e5cb805f5b7ad0ffda29c9cfd21d030ef8213c23a5db3bbb1a6fda9"} Oct 03 14:59:50 crc kubenswrapper[4962]: I1003 14:59:50.421575 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4bdnx" event={"ID":"d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9","Type":"ContainerStarted","Data":"757d0f8088f4388ec68f1cc32315256d05775a6466bb90549d44ffd8634c90f6"} Oct 03 14:59:52 crc kubenswrapper[4962]: I1003 14:59:52.441388 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4bdnx" event={"ID":"d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9","Type":"ContainerStarted","Data":"c8b26296764a8bc41af483596d180e2938ecb034c79dd9fe3dd3a29b8720ee35"} Oct 03 14:59:54 crc kubenswrapper[4962]: I1003 14:59:54.470183 4962 generic.go:334] "Generic (PLEG): container finished" podID="d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9" containerID="c8b26296764a8bc41af483596d180e2938ecb034c79dd9fe3dd3a29b8720ee35" exitCode=0 Oct 03 14:59:54 crc kubenswrapper[4962]: I1003 14:59:54.470583 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4bdnx" event={"ID":"d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9","Type":"ContainerDied","Data":"c8b26296764a8bc41af483596d180e2938ecb034c79dd9fe3dd3a29b8720ee35"} Oct 03 14:59:55 crc kubenswrapper[4962]: I1003 14:59:55.481518 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4bdnx" event={"ID":"d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9","Type":"ContainerStarted","Data":"395a619a7903afc4601ca409ba9db79a2035dfd4c2624a2ae5cb2e482654c4f6"} Oct 03 14:59:55 crc kubenswrapper[4962]: I1003 14:59:55.508471 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4bdnx" podStartSLOduration=1.6682039629999998 podStartE2EDuration="6.508448367s" podCreationTimestamp="2025-10-03 14:59:49 +0000 UTC" firstStartedPulling="2025-10-03 14:59:50.423485207 +0000 UTC m=+7798.827383042" lastFinishedPulling="2025-10-03 14:59:55.263729611 +0000 UTC m=+7803.667627446" observedRunningTime="2025-10-03 14:59:55.498593044 +0000 UTC m=+7803.902490879" watchObservedRunningTime="2025-10-03 14:59:55.508448367 +0000 UTC m=+7803.912346192" Oct 03 14:59:59 crc kubenswrapper[4962]: I1003 14:59:59.482739 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-4bdnx" Oct 03 14:59:59 crc kubenswrapper[4962]: I1003 14:59:59.483340 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4bdnx" Oct 03 15:00:00 crc kubenswrapper[4962]: I1003 15:00:00.134818 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325060-q8nnj"] Oct 03 15:00:00 crc kubenswrapper[4962]: I1003 15:00:00.137531 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325060-q8nnj" Oct 03 15:00:00 crc kubenswrapper[4962]: I1003 15:00:00.139545 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 15:00:00 crc kubenswrapper[4962]: I1003 15:00:00.140740 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 15:00:00 crc kubenswrapper[4962]: I1003 15:00:00.145931 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325060-q8nnj"] Oct 03 15:00:00 crc kubenswrapper[4962]: I1003 15:00:00.227249 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4488223b-0056-4042-b91b-56e1ac0c9283-config-volume\") pod \"collect-profiles-29325060-q8nnj\" (UID: \"4488223b-0056-4042-b91b-56e1ac0c9283\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325060-q8nnj" Oct 03 15:00:00 crc kubenswrapper[4962]: I1003 15:00:00.227305 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxl2v\" (UniqueName: \"kubernetes.io/projected/4488223b-0056-4042-b91b-56e1ac0c9283-kube-api-access-gxl2v\") pod \"collect-profiles-29325060-q8nnj\" (UID: \"4488223b-0056-4042-b91b-56e1ac0c9283\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325060-q8nnj" Oct 03 15:00:00 crc kubenswrapper[4962]: I1003 15:00:00.227420 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4488223b-0056-4042-b91b-56e1ac0c9283-secret-volume\") pod \"collect-profiles-29325060-q8nnj\" (UID: \"4488223b-0056-4042-b91b-56e1ac0c9283\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325060-q8nnj" Oct 03 15:00:00 crc kubenswrapper[4962]: I1003 15:00:00.328657 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4488223b-0056-4042-b91b-56e1ac0c9283-config-volume\") pod \"collect-profiles-29325060-q8nnj\" (UID: \"4488223b-0056-4042-b91b-56e1ac0c9283\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325060-q8nnj" Oct 03 15:00:00 crc kubenswrapper[4962]: I1003 15:00:00.328707 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxl2v\" (UniqueName: \"kubernetes.io/projected/4488223b-0056-4042-b91b-56e1ac0c9283-kube-api-access-gxl2v\") pod \"collect-profiles-29325060-q8nnj\" (UID: \"4488223b-0056-4042-b91b-56e1ac0c9283\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325060-q8nnj" Oct 03 15:00:00 crc kubenswrapper[4962]: I1003 15:00:00.328779 4962 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4488223b-0056-4042-b91b-56e1ac0c9283-secret-volume\") pod \"collect-profiles-29325060-q8nnj\" (UID: \"4488223b-0056-4042-b91b-56e1ac0c9283\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325060-q8nnj" Oct 03 15:00:00 crc kubenswrapper[4962]: I1003 15:00:00.329557 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4488223b-0056-4042-b91b-56e1ac0c9283-config-volume\") pod \"collect-profiles-29325060-q8nnj\" (UID: \"4488223b-0056-4042-b91b-56e1ac0c9283\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325060-q8nnj" Oct 03 15:00:00 crc kubenswrapper[4962]: I1003 15:00:00.336213 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4488223b-0056-4042-b91b-56e1ac0c9283-secret-volume\") pod \"collect-profiles-29325060-q8nnj\" (UID: \"4488223b-0056-4042-b91b-56e1ac0c9283\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325060-q8nnj" Oct 03 15:00:00 crc kubenswrapper[4962]: I1003 15:00:00.346749 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxl2v\" (UniqueName: \"kubernetes.io/projected/4488223b-0056-4042-b91b-56e1ac0c9283-kube-api-access-gxl2v\") pod \"collect-profiles-29325060-q8nnj\" (UID: \"4488223b-0056-4042-b91b-56e1ac0c9283\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325060-q8nnj" Oct 03 15:00:00 crc kubenswrapper[4962]: I1003 15:00:00.462735 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325060-q8nnj" Oct 03 15:00:00 crc kubenswrapper[4962]: I1003 15:00:00.530171 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4bdnx" podUID="d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9" containerName="registry-server" probeResult="failure" output=< Oct 03 15:00:00 crc kubenswrapper[4962]: timeout: failed to connect service ":50051" within 1s Oct 03 15:00:00 crc kubenswrapper[4962]: > Oct 03 15:00:00 crc kubenswrapper[4962]: I1003 15:00:00.906378 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325060-q8nnj"] Oct 03 15:00:00 crc kubenswrapper[4962]: W1003 15:00:00.910957 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4488223b_0056_4042_b91b_56e1ac0c9283.slice/crio-db6d661093c151eda433c95b5920d7dc05e1f4b9717e341424e7b27b0d51bb45 WatchSource:0}: Error finding container db6d661093c151eda433c95b5920d7dc05e1f4b9717e341424e7b27b0d51bb45: Status 404 returned error can't find the container with id db6d661093c151eda433c95b5920d7dc05e1f4b9717e341424e7b27b0d51bb45 Oct 03 15:00:01 crc kubenswrapper[4962]: I1003 15:00:01.539334 4962 generic.go:334] "Generic (PLEG): container finished" podID="4488223b-0056-4042-b91b-56e1ac0c9283" containerID="50cdcc3eac96e0d3b5b74eefb65fb214e942be2f4c80bdd3a5f54da5e7af5249" exitCode=0 Oct 03 15:00:01 crc kubenswrapper[4962]: I1003 15:00:01.539494 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325060-q8nnj" event={"ID":"4488223b-0056-4042-b91b-56e1ac0c9283","Type":"ContainerDied","Data":"50cdcc3eac96e0d3b5b74eefb65fb214e942be2f4c80bdd3a5f54da5e7af5249"} Oct 03 15:00:01 crc kubenswrapper[4962]: I1003 
Oct 03 15:00:02 crc kubenswrapper[4962]: I1003 15:00:02.918960 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325060-q8nnj"
Oct 03 15:00:02 crc kubenswrapper[4962]: I1003 15:00:02.997921 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxl2v\" (UniqueName: \"kubernetes.io/projected/4488223b-0056-4042-b91b-56e1ac0c9283-kube-api-access-gxl2v\") pod \"4488223b-0056-4042-b91b-56e1ac0c9283\" (UID: \"4488223b-0056-4042-b91b-56e1ac0c9283\") "
Oct 03 15:00:02 crc kubenswrapper[4962]: I1003 15:00:02.997993 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4488223b-0056-4042-b91b-56e1ac0c9283-config-volume\") pod \"4488223b-0056-4042-b91b-56e1ac0c9283\" (UID: \"4488223b-0056-4042-b91b-56e1ac0c9283\") "
Oct 03 15:00:02 crc kubenswrapper[4962]: I1003 15:00:02.998020 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4488223b-0056-4042-b91b-56e1ac0c9283-secret-volume\") pod \"4488223b-0056-4042-b91b-56e1ac0c9283\" (UID: \"4488223b-0056-4042-b91b-56e1ac0c9283\") "
Oct 03 15:00:02 crc kubenswrapper[4962]: I1003 15:00:02.999018 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4488223b-0056-4042-b91b-56e1ac0c9283-config-volume" (OuterVolumeSpecName: "config-volume") pod "4488223b-0056-4042-b91b-56e1ac0c9283" (UID: "4488223b-0056-4042-b91b-56e1ac0c9283"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 15:00:03 crc kubenswrapper[4962]: I1003 15:00:03.004373 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4488223b-0056-4042-b91b-56e1ac0c9283-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4488223b-0056-4042-b91b-56e1ac0c9283" (UID: "4488223b-0056-4042-b91b-56e1ac0c9283"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 15:00:03 crc kubenswrapper[4962]: I1003 15:00:03.005039 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4488223b-0056-4042-b91b-56e1ac0c9283-kube-api-access-gxl2v" (OuterVolumeSpecName: "kube-api-access-gxl2v") pod "4488223b-0056-4042-b91b-56e1ac0c9283" (UID: "4488223b-0056-4042-b91b-56e1ac0c9283"). InnerVolumeSpecName "kube-api-access-gxl2v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 15:00:03 crc kubenswrapper[4962]: I1003 15:00:03.100363 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxl2v\" (UniqueName: \"kubernetes.io/projected/4488223b-0056-4042-b91b-56e1ac0c9283-kube-api-access-gxl2v\") on node \"crc\" DevicePath \"\""
Oct 03 15:00:03 crc kubenswrapper[4962]: I1003 15:00:03.100401 4962 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4488223b-0056-4042-b91b-56e1ac0c9283-config-volume\") on node \"crc\" DevicePath \"\""
Oct 03 15:00:03 crc kubenswrapper[4962]: I1003 15:00:03.100410 4962 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4488223b-0056-4042-b91b-56e1ac0c9283-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 03 15:00:03 crc kubenswrapper[4962]: I1003 15:00:03.557171 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325060-q8nnj" event={"ID":"4488223b-0056-4042-b91b-56e1ac0c9283","Type":"ContainerDied","Data":"db6d661093c151eda433c95b5920d7dc05e1f4b9717e341424e7b27b0d51bb45"}
Oct 03 15:00:03 crc kubenswrapper[4962]: I1003 15:00:03.557494 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db6d661093c151eda433c95b5920d7dc05e1f4b9717e341424e7b27b0d51bb45"
Oct 03 15:00:03 crc kubenswrapper[4962]: I1003 15:00:03.557253 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325060-q8nnj"
Oct 03 15:00:03 crc kubenswrapper[4962]: I1003 15:00:03.990804 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325015-phqfq"]
Oct 03 15:00:04 crc kubenswrapper[4962]: I1003 15:00:04.002627 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325015-phqfq"]
Oct 03 15:00:04 crc kubenswrapper[4962]: I1003 15:00:04.242223 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="808fc01e-573f-4a85-bae4-c1864fb86c1f" path="/var/lib/kubelet/pods/808fc01e-573f-4a85-bae4-c1864fb86c1f/volumes"
Oct 03 15:00:09 crc kubenswrapper[4962]: I1003 15:00:09.524352 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4bdnx"
Oct 03 15:00:09 crc kubenswrapper[4962]: I1003 15:00:09.588704 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4bdnx"
Oct 03 15:00:09 crc kubenswrapper[4962]: I1003 15:00:09.763693 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4bdnx"]
Oct 03 15:00:10 crc kubenswrapper[4962]: I1003 15:00:10.632875 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4bdnx" podUID="d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9" containerName="registry-server" containerID="cri-o://395a619a7903afc4601ca409ba9db79a2035dfd4c2624a2ae5cb2e482654c4f6" gracePeriod=2
Oct 03 15:00:11 crc kubenswrapper[4962]: I1003 15:00:11.129443 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4bdnx"
Need to start a new one" pod="openshift-marketplace/redhat-operators-4bdnx" Oct 03 15:00:11 crc kubenswrapper[4962]: I1003 15:00:11.175083 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mq7gn\" (UniqueName: \"kubernetes.io/projected/d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9-kube-api-access-mq7gn\") pod \"d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9\" (UID: \"d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9\") " Oct 03 15:00:11 crc kubenswrapper[4962]: I1003 15:00:11.175157 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9-catalog-content\") pod \"d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9\" (UID: \"d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9\") " Oct 03 15:00:11 crc kubenswrapper[4962]: I1003 15:00:11.175263 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9-utilities\") pod \"d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9\" (UID: \"d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9\") " Oct 03 15:00:11 crc kubenswrapper[4962]: I1003 15:00:11.176168 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9-utilities" (OuterVolumeSpecName: "utilities") pod "d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9" (UID: "d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:00:11 crc kubenswrapper[4962]: I1003 15:00:11.181047 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9-kube-api-access-mq7gn" (OuterVolumeSpecName: "kube-api-access-mq7gn") pod "d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9" (UID: "d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9"). InnerVolumeSpecName "kube-api-access-mq7gn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:00:11 crc kubenswrapper[4962]: I1003 15:00:11.266801 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9" (UID: "d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:00:11 crc kubenswrapper[4962]: I1003 15:00:11.278209 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mq7gn\" (UniqueName: \"kubernetes.io/projected/d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9-kube-api-access-mq7gn\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:11 crc kubenswrapper[4962]: I1003 15:00:11.278241 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:11 crc kubenswrapper[4962]: I1003 15:00:11.278252 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:11 crc kubenswrapper[4962]: I1003 15:00:11.642824 4962 generic.go:334] "Generic (PLEG): container finished" podID="d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9" containerID="395a619a7903afc4601ca409ba9db79a2035dfd4c2624a2ae5cb2e482654c4f6" exitCode=0 Oct 03 15:00:11 crc kubenswrapper[4962]: I1003 15:00:11.642875 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4bdnx" event={"ID":"d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9","Type":"ContainerDied","Data":"395a619a7903afc4601ca409ba9db79a2035dfd4c2624a2ae5cb2e482654c4f6"} Oct 03 15:00:11 crc kubenswrapper[4962]: I1003 15:00:11.642903 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4bdnx" event={"ID":"d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9","Type":"ContainerDied","Data":"757d0f8088f4388ec68f1cc32315256d05775a6466bb90549d44ffd8634c90f6"} Oct 03 15:00:11 crc kubenswrapper[4962]: I1003 15:00:11.642921 4962 scope.go:117] "RemoveContainer" containerID="395a619a7903afc4601ca409ba9db79a2035dfd4c2624a2ae5cb2e482654c4f6" Oct 03 15:00:11 crc kubenswrapper[4962]: I1003 15:00:11.643069 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4bdnx" Oct 03 15:00:11 crc kubenswrapper[4962]: I1003 15:00:11.676545 4962 scope.go:117] "RemoveContainer" containerID="c8b26296764a8bc41af483596d180e2938ecb034c79dd9fe3dd3a29b8720ee35" Oct 03 15:00:11 crc kubenswrapper[4962]: I1003 15:00:11.683987 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4bdnx"] Oct 03 15:00:11 crc kubenswrapper[4962]: I1003 15:00:11.695136 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4bdnx"] Oct 03 15:00:11 crc kubenswrapper[4962]: I1003 15:00:11.702469 4962 scope.go:117] "RemoveContainer" containerID="1d1084054e5cb805f5b7ad0ffda29c9cfd21d030ef8213c23a5db3bbb1a6fda9" Oct 03 15:00:11 crc kubenswrapper[4962]: I1003 15:00:11.764681 4962 scope.go:117] "RemoveContainer" containerID="395a619a7903afc4601ca409ba9db79a2035dfd4c2624a2ae5cb2e482654c4f6" Oct 03 15:00:11 crc kubenswrapper[4962]: E1003 15:00:11.770377 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"395a619a7903afc4601ca409ba9db79a2035dfd4c2624a2ae5cb2e482654c4f6\": container with ID starting with 395a619a7903afc4601ca409ba9db79a2035dfd4c2624a2ae5cb2e482654c4f6 not found: ID does not exist" containerID="395a619a7903afc4601ca409ba9db79a2035dfd4c2624a2ae5cb2e482654c4f6" Oct 03 15:00:11 crc kubenswrapper[4962]: I1003 15:00:11.770436 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"395a619a7903afc4601ca409ba9db79a2035dfd4c2624a2ae5cb2e482654c4f6"} err="failed to get container status \"395a619a7903afc4601ca409ba9db79a2035dfd4c2624a2ae5cb2e482654c4f6\": rpc error: code = NotFound desc = could not find container \"395a619a7903afc4601ca409ba9db79a2035dfd4c2624a2ae5cb2e482654c4f6\": container with ID starting with 395a619a7903afc4601ca409ba9db79a2035dfd4c2624a2ae5cb2e482654c4f6 not found: ID does not exist" Oct 03 15:00:11 crc kubenswrapper[4962]: I1003 15:00:11.770481 4962 scope.go:117] "RemoveContainer" containerID="c8b26296764a8bc41af483596d180e2938ecb034c79dd9fe3dd3a29b8720ee35" Oct 03 15:00:11 crc kubenswrapper[4962]: E1003 15:00:11.770906 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8b26296764a8bc41af483596d180e2938ecb034c79dd9fe3dd3a29b8720ee35\": container with ID starting with c8b26296764a8bc41af483596d180e2938ecb034c79dd9fe3dd3a29b8720ee35 not found: ID does not exist" containerID="c8b26296764a8bc41af483596d180e2938ecb034c79dd9fe3dd3a29b8720ee35" Oct 03 15:00:11 crc kubenswrapper[4962]: I1003 15:00:11.770968 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8b26296764a8bc41af483596d180e2938ecb034c79dd9fe3dd3a29b8720ee35"} err="failed to get container status \"c8b26296764a8bc41af483596d180e2938ecb034c79dd9fe3dd3a29b8720ee35\": rpc error: code = NotFound desc = could not find container \"c8b26296764a8bc41af483596d180e2938ecb034c79dd9fe3dd3a29b8720ee35\": container with ID starting with c8b26296764a8bc41af483596d180e2938ecb034c79dd9fe3dd3a29b8720ee35 not found: ID does not exist" Oct 03 15:00:11 crc kubenswrapper[4962]: I1003 15:00:11.771001 4962 scope.go:117] "RemoveContainer" containerID="1d1084054e5cb805f5b7ad0ffda29c9cfd21d030ef8213c23a5db3bbb1a6fda9" Oct 03 15:00:11 crc kubenswrapper[4962]: E1003 15:00:11.771317 4962 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"1d1084054e5cb805f5b7ad0ffda29c9cfd21d030ef8213c23a5db3bbb1a6fda9\": container with ID starting with 1d1084054e5cb805f5b7ad0ffda29c9cfd21d030ef8213c23a5db3bbb1a6fda9 not found: ID does not exist" containerID="1d1084054e5cb805f5b7ad0ffda29c9cfd21d030ef8213c23a5db3bbb1a6fda9" Oct 03 15:00:11 crc kubenswrapper[4962]: I1003 15:00:11.771352 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d1084054e5cb805f5b7ad0ffda29c9cfd21d030ef8213c23a5db3bbb1a6fda9"} err="failed to get container status \"1d1084054e5cb805f5b7ad0ffda29c9cfd21d030ef8213c23a5db3bbb1a6fda9\": rpc error: code = NotFound desc = could not find container \"1d1084054e5cb805f5b7ad0ffda29c9cfd21d030ef8213c23a5db3bbb1a6fda9\": container with ID starting with 1d1084054e5cb805f5b7ad0ffda29c9cfd21d030ef8213c23a5db3bbb1a6fda9 not found: ID does not exist" Oct 03 15:00:12 crc kubenswrapper[4962]: I1003 15:00:12.237742 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9" path="/var/lib/kubelet/pods/d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9/volumes" Oct 03 15:00:14 crc kubenswrapper[4962]: I1003 15:00:14.139719 4962 scope.go:117] "RemoveContainer" containerID="d7d37d335d55989528a55713d86452cb79ca318050db3a8beb9285cf671ab367" Oct 03 15:00:21 crc kubenswrapper[4962]: I1003 15:00:21.737709 4962 generic.go:334] "Generic (PLEG): container finished" podID="bdffe428-c92a-4343-9f35-fd522846891a" containerID="7b54ccc6db7a824fc47d12af1660a3cb066a04b9281f11d632dee162a4396fbc" exitCode=0 Oct 03 15:00:21 crc kubenswrapper[4962]: I1003 15:00:21.737808 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-ftl9x" event={"ID":"bdffe428-c92a-4343-9f35-fd522846891a","Type":"ContainerDied","Data":"7b54ccc6db7a824fc47d12af1660a3cb066a04b9281f11d632dee162a4396fbc"} Oct 03 15:00:23 crc kubenswrapper[4962]: I1003 15:00:23.210936 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-ftl9x" Oct 03 15:00:23 crc kubenswrapper[4962]: I1003 15:00:23.327295 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bdffe428-c92a-4343-9f35-fd522846891a-ssh-key\") pod \"bdffe428-c92a-4343-9f35-fd522846891a\" (UID: \"bdffe428-c92a-4343-9f35-fd522846891a\") " Oct 03 15:00:23 crc kubenswrapper[4962]: I1003 15:00:23.327394 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bdffe428-c92a-4343-9f35-fd522846891a-ceph\") pod \"bdffe428-c92a-4343-9f35-fd522846891a\" (UID: \"bdffe428-c92a-4343-9f35-fd522846891a\") " Oct 03 15:00:23 crc kubenswrapper[4962]: I1003 15:00:23.327591 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7rlf\" (UniqueName: \"kubernetes.io/projected/bdffe428-c92a-4343-9f35-fd522846891a-kube-api-access-c7rlf\") pod \"bdffe428-c92a-4343-9f35-fd522846891a\" (UID: \"bdffe428-c92a-4343-9f35-fd522846891a\") " Oct 03 15:00:23 crc kubenswrapper[4962]: I1003 15:00:23.327626 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bdffe428-c92a-4343-9f35-fd522846891a-inventory\") pod \"bdffe428-c92a-4343-9f35-fd522846891a\" (UID: \"bdffe428-c92a-4343-9f35-fd522846891a\") " Oct 03 15:00:23 crc kubenswrapper[4962]: I1003 15:00:23.335819 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdffe428-c92a-4343-9f35-fd522846891a-ceph" (OuterVolumeSpecName: "ceph") pod "bdffe428-c92a-4343-9f35-fd522846891a" (UID: "bdffe428-c92a-4343-9f35-fd522846891a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:00:23 crc kubenswrapper[4962]: I1003 15:00:23.337287 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdffe428-c92a-4343-9f35-fd522846891a-kube-api-access-c7rlf" (OuterVolumeSpecName: "kube-api-access-c7rlf") pod "bdffe428-c92a-4343-9f35-fd522846891a" (UID: "bdffe428-c92a-4343-9f35-fd522846891a"). InnerVolumeSpecName "kube-api-access-c7rlf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:00:23 crc kubenswrapper[4962]: I1003 15:00:23.362836 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdffe428-c92a-4343-9f35-fd522846891a-inventory" (OuterVolumeSpecName: "inventory") pod "bdffe428-c92a-4343-9f35-fd522846891a" (UID: "bdffe428-c92a-4343-9f35-fd522846891a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:00:23 crc kubenswrapper[4962]: I1003 15:00:23.368243 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdffe428-c92a-4343-9f35-fd522846891a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "bdffe428-c92a-4343-9f35-fd522846891a" (UID: "bdffe428-c92a-4343-9f35-fd522846891a"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:00:23 crc kubenswrapper[4962]: I1003 15:00:23.430404 4962 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bdffe428-c92a-4343-9f35-fd522846891a-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:23 crc kubenswrapper[4962]: I1003 15:00:23.430456 4962 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bdffe428-c92a-4343-9f35-fd522846891a-ceph\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:23 crc kubenswrapper[4962]: I1003 15:00:23.430467 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7rlf\" (UniqueName: \"kubernetes.io/projected/bdffe428-c92a-4343-9f35-fd522846891a-kube-api-access-c7rlf\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:23 crc kubenswrapper[4962]: I1003 15:00:23.430478 4962 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bdffe428-c92a-4343-9f35-fd522846891a-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:23 crc kubenswrapper[4962]: I1003 15:00:23.758604 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-ftl9x" event={"ID":"bdffe428-c92a-4343-9f35-fd522846891a","Type":"ContainerDied","Data":"97c5049f474647925ab4a790e086e624bc46cf0ba97c3fdefb588333a970824f"} Oct 03 15:00:23 crc kubenswrapper[4962]: I1003 15:00:23.758660 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-ftl9x" Oct 03 15:00:23 crc kubenswrapper[4962]: I1003 15:00:23.758672 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97c5049f474647925ab4a790e086e624bc46cf0ba97c3fdefb588333a970824f" Oct 03 15:00:23 crc kubenswrapper[4962]: I1003 15:00:23.825793 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-openstack-jm55c"] Oct 03 15:00:23 crc kubenswrapper[4962]: E1003 15:00:23.826306 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdffe428-c92a-4343-9f35-fd522846891a" containerName="configure-os-openstack-openstack-cell1" Oct 03 15:00:23 crc kubenswrapper[4962]: I1003 15:00:23.826327 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdffe428-c92a-4343-9f35-fd522846891a" containerName="configure-os-openstack-openstack-cell1" Oct 03 15:00:23 crc kubenswrapper[4962]: E1003 15:00:23.826356 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9" containerName="extract-utilities" Oct 03 15:00:23 crc kubenswrapper[4962]: I1003 15:00:23.826365 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9" containerName="extract-utilities" Oct 03 15:00:23 crc kubenswrapper[4962]: E1003 15:00:23.826389 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9" containerName="extract-content" Oct 03 15:00:23 crc kubenswrapper[4962]: I1003 15:00:23.826394 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9" containerName="extract-content" Oct 03 15:00:23 crc kubenswrapper[4962]: E1003 15:00:23.826413 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9" containerName="registry-server" Oct 03 15:00:23 crc kubenswrapper[4962]: I1003 15:00:23.826419 4962 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9" containerName="registry-server" Oct 03 15:00:23 crc kubenswrapper[4962]: E1003 15:00:23.826429 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4488223b-0056-4042-b91b-56e1ac0c9283" containerName="collect-profiles" Oct 03 15:00:23 crc kubenswrapper[4962]: I1003 15:00:23.826435 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="4488223b-0056-4042-b91b-56e1ac0c9283" containerName="collect-profiles" Oct 03 15:00:23 crc kubenswrapper[4962]: I1003 15:00:23.826627 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9b80b1e-faaf-4f52-bfd0-b43c4408c6e9" containerName="registry-server" Oct 03 15:00:23 crc kubenswrapper[4962]: I1003 15:00:23.826747 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="4488223b-0056-4042-b91b-56e1ac0c9283" containerName="collect-profiles" Oct 03 15:00:23 crc kubenswrapper[4962]: I1003 15:00:23.826757 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdffe428-c92a-4343-9f35-fd522846891a" containerName="configure-os-openstack-openstack-cell1" Oct 03 15:00:23 crc kubenswrapper[4962]: I1003 15:00:23.827487 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-jm55c" Oct 03 15:00:23 crc kubenswrapper[4962]: I1003 15:00:23.832323 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 15:00:23 crc kubenswrapper[4962]: I1003 15:00:23.832958 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-98wnm" Oct 03 15:00:23 crc kubenswrapper[4962]: I1003 15:00:23.839788 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 03 15:00:23 crc kubenswrapper[4962]: I1003 15:00:23.839870 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 03 15:00:23 crc kubenswrapper[4962]: I1003 15:00:23.849289 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-jm55c"] Oct 03 15:00:23 crc kubenswrapper[4962]: I1003 15:00:23.938940 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bd52771d-d70b-4fe5-a1f2-b4347bbf5c15-ceph\") pod \"ssh-known-hosts-openstack-jm55c\" (UID: \"bd52771d-d70b-4fe5-a1f2-b4347bbf5c15\") " pod="openstack/ssh-known-hosts-openstack-jm55c" Oct 03 15:00:23 crc kubenswrapper[4962]: I1003 15:00:23.939359 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/bd52771d-d70b-4fe5-a1f2-b4347bbf5c15-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-jm55c\" (UID: \"bd52771d-d70b-4fe5-a1f2-b4347bbf5c15\") " pod="openstack/ssh-known-hosts-openstack-jm55c" Oct 03 15:00:23 crc kubenswrapper[4962]: I1003 15:00:23.939391 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59gwv\" (UniqueName: \"kubernetes.io/projected/bd52771d-d70b-4fe5-a1f2-b4347bbf5c15-kube-api-access-59gwv\") pod \"ssh-known-hosts-openstack-jm55c\" (UID: \"bd52771d-d70b-4fe5-a1f2-b4347bbf5c15\") " pod="openstack/ssh-known-hosts-openstack-jm55c" Oct 03 15:00:23 crc kubenswrapper[4962]: I1003 15:00:23.939423 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/bd52771d-d70b-4fe5-a1f2-b4347bbf5c15-inventory-0\") pod \"ssh-known-hosts-openstack-jm55c\" (UID: \"bd52771d-d70b-4fe5-a1f2-b4347bbf5c15\") " pod="openstack/ssh-known-hosts-openstack-jm55c" Oct 03 15:00:24 crc kubenswrapper[4962]: I1003 15:00:24.042463 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/bd52771d-d70b-4fe5-a1f2-b4347bbf5c15-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-jm55c\" (UID: \"bd52771d-d70b-4fe5-a1f2-b4347bbf5c15\") " pod="openstack/ssh-known-hosts-openstack-jm55c" Oct 03 15:00:24 crc kubenswrapper[4962]: I1003 15:00:24.042541 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59gwv\" (UniqueName: \"kubernetes.io/projected/bd52771d-d70b-4fe5-a1f2-b4347bbf5c15-kube-api-access-59gwv\") pod \"ssh-known-hosts-openstack-jm55c\" (UID: \"bd52771d-d70b-4fe5-a1f2-b4347bbf5c15\") " pod="openstack/ssh-known-hosts-openstack-jm55c" Oct 03 15:00:24 crc kubenswrapper[4962]: I1003 15:00:24.042611 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/bd52771d-d70b-4fe5-a1f2-b4347bbf5c15-inventory-0\") pod \"ssh-known-hosts-openstack-jm55c\" (UID: \"bd52771d-d70b-4fe5-a1f2-b4347bbf5c15\") " pod="openstack/ssh-known-hosts-openstack-jm55c" Oct 03 15:00:24 crc kubenswrapper[4962]: I1003 15:00:24.042868 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bd52771d-d70b-4fe5-a1f2-b4347bbf5c15-ceph\") pod \"ssh-known-hosts-openstack-jm55c\" (UID: \"bd52771d-d70b-4fe5-a1f2-b4347bbf5c15\") " pod="openstack/ssh-known-hosts-openstack-jm55c" Oct 03 15:00:24 crc kubenswrapper[4962]: I1003 15:00:24.047858 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/bd52771d-d70b-4fe5-a1f2-b4347bbf5c15-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-jm55c\" (UID: \"bd52771d-d70b-4fe5-a1f2-b4347bbf5c15\") " pod="openstack/ssh-known-hosts-openstack-jm55c" Oct 03 15:00:24 crc kubenswrapper[4962]: I1003 15:00:24.048976 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bd52771d-d70b-4fe5-a1f2-b4347bbf5c15-ceph\") pod \"ssh-known-hosts-openstack-jm55c\" (UID: \"bd52771d-d70b-4fe5-a1f2-b4347bbf5c15\") " pod="openstack/ssh-known-hosts-openstack-jm55c" Oct 03 15:00:24 crc kubenswrapper[4962]: I1003 15:00:24.051139 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/bd52771d-d70b-4fe5-a1f2-b4347bbf5c15-inventory-0\") pod \"ssh-known-hosts-openstack-jm55c\" (UID: \"bd52771d-d70b-4fe5-a1f2-b4347bbf5c15\") " pod="openstack/ssh-known-hosts-openstack-jm55c" Oct 03 15:00:24 crc kubenswrapper[4962]: I1003 15:00:24.059052 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59gwv\" (UniqueName: \"kubernetes.io/projected/bd52771d-d70b-4fe5-a1f2-b4347bbf5c15-kube-api-access-59gwv\") pod \"ssh-known-hosts-openstack-jm55c\" (UID: \"bd52771d-d70b-4fe5-a1f2-b4347bbf5c15\") " pod="openstack/ssh-known-hosts-openstack-jm55c" Oct 03 15:00:24 crc kubenswrapper[4962]: I1003 15:00:24.142689 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-jm55c" Oct 03 15:00:24 crc kubenswrapper[4962]: I1003 15:00:24.660256 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 15:00:24 crc kubenswrapper[4962]: I1003 15:00:24.660674 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 15:00:24 crc kubenswrapper[4962]: I1003 15:00:24.695688 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-jm55c"] Oct 03 15:00:24 crc kubenswrapper[4962]: I1003 15:00:24.772617 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-jm55c" event={"ID":"bd52771d-d70b-4fe5-a1f2-b4347bbf5c15","Type":"ContainerStarted","Data":"a2e4c15884f60d012a1f144a9c7339480d0c7706f63ad564c29fc5c57e651d39"} Oct 03 15:00:25 crc kubenswrapper[4962]: I1003 15:00:25.784113 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-jm55c" event={"ID":"bd52771d-d70b-4fe5-a1f2-b4347bbf5c15","Type":"ContainerStarted","Data":"59d7824c0c3bfe03d06bd3af3dc1a1c8ae53ab968d9d4274afa3ac281168c4fb"} Oct 03 15:00:25 crc kubenswrapper[4962]: I1003 15:00:25.807059 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-openstack-jm55c" podStartSLOduration=2.633606826 podStartE2EDuration="2.807041331s" podCreationTimestamp="2025-10-03 15:00:23 +0000 UTC" firstStartedPulling="2025-10-03 15:00:24.731346772 +0000 UTC m=+7833.135244607" lastFinishedPulling="2025-10-03 15:00:24.904781277 +0000 UTC m=+7833.308679112" observedRunningTime="2025-10-03 15:00:25.798333859 +0000 UTC m=+7834.202231694" watchObservedRunningTime="2025-10-03 15:00:25.807041331 +0000 UTC m=+7834.210939166" Oct 03 15:00:33 crc kubenswrapper[4962]: I1003 15:00:33.865742 4962 generic.go:334] "Generic (PLEG): container finished" podID="bd52771d-d70b-4fe5-a1f2-b4347bbf5c15" containerID="59d7824c0c3bfe03d06bd3af3dc1a1c8ae53ab968d9d4274afa3ac281168c4fb" exitCode=0 Oct 03 15:00:33 crc kubenswrapper[4962]: I1003 15:00:33.865792 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-jm55c" event={"ID":"bd52771d-d70b-4fe5-a1f2-b4347bbf5c15","Type":"ContainerDied","Data":"59d7824c0c3bfe03d06bd3af3dc1a1c8ae53ab968d9d4274afa3ac281168c4fb"} Oct 03 15:00:35 crc kubenswrapper[4962]: I1003 15:00:35.388297 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-jm55c" Oct 03 15:00:35 crc kubenswrapper[4962]: I1003 15:00:35.495957 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/bd52771d-d70b-4fe5-a1f2-b4347bbf5c15-inventory-0\") pod \"bd52771d-d70b-4fe5-a1f2-b4347bbf5c15\" (UID: \"bd52771d-d70b-4fe5-a1f2-b4347bbf5c15\") " Oct 03 15:00:35 crc kubenswrapper[4962]: I1003 15:00:35.496322 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bd52771d-d70b-4fe5-a1f2-b4347bbf5c15-ceph\") pod \"bd52771d-d70b-4fe5-a1f2-b4347bbf5c15\" (UID: \"bd52771d-d70b-4fe5-a1f2-b4347bbf5c15\") " Oct 03 15:00:35 crc kubenswrapper[4962]: I1003 15:00:35.496389 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/bd52771d-d70b-4fe5-a1f2-b4347bbf5c15-ssh-key-openstack-cell1\") pod \"bd52771d-d70b-4fe5-a1f2-b4347bbf5c15\" (UID: \"bd52771d-d70b-4fe5-a1f2-b4347bbf5c15\") " Oct 03 15:00:35 crc kubenswrapper[4962]: I1003 15:00:35.496510 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59gwv\" (UniqueName: \"kubernetes.io/projected/bd52771d-d70b-4fe5-a1f2-b4347bbf5c15-kube-api-access-59gwv\") pod \"bd52771d-d70b-4fe5-a1f2-b4347bbf5c15\" (UID: \"bd52771d-d70b-4fe5-a1f2-b4347bbf5c15\") " Oct 03 15:00:35 crc kubenswrapper[4962]: I1003 15:00:35.506510 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd52771d-d70b-4fe5-a1f2-b4347bbf5c15-ceph" (OuterVolumeSpecName: "ceph") pod "bd52771d-d70b-4fe5-a1f2-b4347bbf5c15" (UID: "bd52771d-d70b-4fe5-a1f2-b4347bbf5c15"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:00:35 crc kubenswrapper[4962]: I1003 15:00:35.507023 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd52771d-d70b-4fe5-a1f2-b4347bbf5c15-kube-api-access-59gwv" (OuterVolumeSpecName: "kube-api-access-59gwv") pod "bd52771d-d70b-4fe5-a1f2-b4347bbf5c15" (UID: "bd52771d-d70b-4fe5-a1f2-b4347bbf5c15"). InnerVolumeSpecName "kube-api-access-59gwv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:00:35 crc kubenswrapper[4962]: I1003 15:00:35.535325 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd52771d-d70b-4fe5-a1f2-b4347bbf5c15-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "bd52771d-d70b-4fe5-a1f2-b4347bbf5c15" (UID: "bd52771d-d70b-4fe5-a1f2-b4347bbf5c15"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:00:35 crc kubenswrapper[4962]: I1003 15:00:35.536501 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd52771d-d70b-4fe5-a1f2-b4347bbf5c15-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "bd52771d-d70b-4fe5-a1f2-b4347bbf5c15" (UID: "bd52771d-d70b-4fe5-a1f2-b4347bbf5c15"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:00:35 crc kubenswrapper[4962]: I1003 15:00:35.599236 4962 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bd52771d-d70b-4fe5-a1f2-b4347bbf5c15-ceph\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:35 crc kubenswrapper[4962]: I1003 15:00:35.599269 4962 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/bd52771d-d70b-4fe5-a1f2-b4347bbf5c15-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:35 crc kubenswrapper[4962]: I1003 15:00:35.599279 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59gwv\" (UniqueName: \"kubernetes.io/projected/bd52771d-d70b-4fe5-a1f2-b4347bbf5c15-kube-api-access-59gwv\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:35 crc kubenswrapper[4962]: I1003 15:00:35.599287 4962 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/bd52771d-d70b-4fe5-a1f2-b4347bbf5c15-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:35 crc kubenswrapper[4962]: I1003 15:00:35.887823 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-jm55c" event={"ID":"bd52771d-d70b-4fe5-a1f2-b4347bbf5c15","Type":"ContainerDied","Data":"a2e4c15884f60d012a1f144a9c7339480d0c7706f63ad564c29fc5c57e651d39"} Oct 03 15:00:35 crc kubenswrapper[4962]: I1003 15:00:35.888234 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2e4c15884f60d012a1f144a9c7339480d0c7706f63ad564c29fc5c57e651d39" Oct 03 15:00:35 crc kubenswrapper[4962]: I1003 15:00:35.888299 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-jm55c" Oct 03 15:00:36 crc kubenswrapper[4962]: I1003 15:00:36.010044 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-openstack-openstack-cell1-44p8j"] Oct 03 15:00:36 crc kubenswrapper[4962]: E1003 15:00:36.010567 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd52771d-d70b-4fe5-a1f2-b4347bbf5c15" containerName="ssh-known-hosts-openstack" Oct 03 15:00:36 crc kubenswrapper[4962]: I1003 15:00:36.010591 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd52771d-d70b-4fe5-a1f2-b4347bbf5c15" containerName="ssh-known-hosts-openstack" Oct 03 15:00:36 crc kubenswrapper[4962]: I1003 15:00:36.010859 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd52771d-d70b-4fe5-a1f2-b4347bbf5c15" containerName="ssh-known-hosts-openstack" Oct 03 15:00:36 crc kubenswrapper[4962]: I1003 15:00:36.011789 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-44p8j" Oct 03 15:00:36 crc kubenswrapper[4962]: I1003 15:00:36.014590 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 03 15:00:36 crc kubenswrapper[4962]: I1003 15:00:36.014588 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 03 15:00:36 crc kubenswrapper[4962]: I1003 15:00:36.015436 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-98wnm" Oct 03 15:00:36 crc kubenswrapper[4962]: I1003 15:00:36.015770 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 15:00:36 crc kubenswrapper[4962]: I1003 15:00:36.025988 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-44p8j"] Oct 03 15:00:36 crc kubenswrapper[4962]: I1003 15:00:36.109022 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e32ac65-a775-42ea-9693-20cdd36084cd-inventory\") pod \"run-os-openstack-openstack-cell1-44p8j\" (UID: \"7e32ac65-a775-42ea-9693-20cdd36084cd\") " pod="openstack/run-os-openstack-openstack-cell1-44p8j" Oct 03 15:00:36 crc kubenswrapper[4962]: I1003 15:00:36.109360 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7e32ac65-a775-42ea-9693-20cdd36084cd-ssh-key\") pod \"run-os-openstack-openstack-cell1-44p8j\" (UID: \"7e32ac65-a775-42ea-9693-20cdd36084cd\") " pod="openstack/run-os-openstack-openstack-cell1-44p8j" Oct 03 15:00:36 crc kubenswrapper[4962]: I1003 15:00:36.109415 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqhc2\" (UniqueName: \"kubernetes.io/projected/7e32ac65-a775-42ea-9693-20cdd36084cd-kube-api-access-mqhc2\") pod \"run-os-openstack-openstack-cell1-44p8j\" (UID: \"7e32ac65-a775-42ea-9693-20cdd36084cd\") " pod="openstack/run-os-openstack-openstack-cell1-44p8j" Oct 03 15:00:36 crc kubenswrapper[4962]: I1003 15:00:36.109582 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7e32ac65-a775-42ea-9693-20cdd36084cd-ceph\") pod \"run-os-openstack-openstack-cell1-44p8j\" (UID: \"7e32ac65-a775-42ea-9693-20cdd36084cd\") " pod="openstack/run-os-openstack-openstack-cell1-44p8j" Oct 03 15:00:36 crc kubenswrapper[4962]: I1003 15:00:36.211658 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7e32ac65-a775-42ea-9693-20cdd36084cd-ssh-key\") pod \"run-os-openstack-openstack-cell1-44p8j\" (UID: \"7e32ac65-a775-42ea-9693-20cdd36084cd\") " pod="openstack/run-os-openstack-openstack-cell1-44p8j" Oct 03 15:00:36 crc kubenswrapper[4962]: I1003 15:00:36.211711 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqhc2\" (UniqueName: \"kubernetes.io/projected/7e32ac65-a775-42ea-9693-20cdd36084cd-kube-api-access-mqhc2\") pod \"run-os-openstack-openstack-cell1-44p8j\" (UID: \"7e32ac65-a775-42ea-9693-20cdd36084cd\") " pod="openstack/run-os-openstack-openstack-cell1-44p8j" Oct 03 15:00:36 crc kubenswrapper[4962]: I1003 15:00:36.211833 4962 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7e32ac65-a775-42ea-9693-20cdd36084cd-ceph\") pod \"run-os-openstack-openstack-cell1-44p8j\" (UID: \"7e32ac65-a775-42ea-9693-20cdd36084cd\") " pod="openstack/run-os-openstack-openstack-cell1-44p8j" Oct 03 15:00:36 crc kubenswrapper[4962]: I1003 15:00:36.211894 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e32ac65-a775-42ea-9693-20cdd36084cd-inventory\") pod \"run-os-openstack-openstack-cell1-44p8j\" (UID: \"7e32ac65-a775-42ea-9693-20cdd36084cd\") " pod="openstack/run-os-openstack-openstack-cell1-44p8j" Oct 03 15:00:36 crc kubenswrapper[4962]: I1003 15:00:36.218316 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7e32ac65-a775-42ea-9693-20cdd36084cd-ssh-key\") pod \"run-os-openstack-openstack-cell1-44p8j\" (UID: \"7e32ac65-a775-42ea-9693-20cdd36084cd\") " pod="openstack/run-os-openstack-openstack-cell1-44p8j" Oct 03 15:00:36 crc kubenswrapper[4962]: I1003 15:00:36.219616 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e32ac65-a775-42ea-9693-20cdd36084cd-inventory\") pod \"run-os-openstack-openstack-cell1-44p8j\" (UID: \"7e32ac65-a775-42ea-9693-20cdd36084cd\") " pod="openstack/run-os-openstack-openstack-cell1-44p8j" Oct 03 15:00:36 crc kubenswrapper[4962]: I1003 15:00:36.221208 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7e32ac65-a775-42ea-9693-20cdd36084cd-ceph\") pod \"run-os-openstack-openstack-cell1-44p8j\" (UID: \"7e32ac65-a775-42ea-9693-20cdd36084cd\") " pod="openstack/run-os-openstack-openstack-cell1-44p8j" Oct 03 15:00:36 crc kubenswrapper[4962]: I1003 15:00:36.237716 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqhc2\" (UniqueName: \"kubernetes.io/projected/7e32ac65-a775-42ea-9693-20cdd36084cd-kube-api-access-mqhc2\") pod \"run-os-openstack-openstack-cell1-44p8j\" (UID: \"7e32ac65-a775-42ea-9693-20cdd36084cd\") " pod="openstack/run-os-openstack-openstack-cell1-44p8j" Oct 03 15:00:36 crc kubenswrapper[4962]: I1003 15:00:36.338894 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-44p8j" Oct 03 15:00:36 crc kubenswrapper[4962]: I1003 15:00:36.880281 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-44p8j"] Oct 03 15:00:36 crc kubenswrapper[4962]: I1003 15:00:36.898487 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-44p8j" event={"ID":"7e32ac65-a775-42ea-9693-20cdd36084cd","Type":"ContainerStarted","Data":"4c3decb56c5df0a48eeec8546fc361f60dbdd67763c80923bf312e573633ec05"} Oct 03 15:00:37 crc kubenswrapper[4962]: I1003 15:00:37.908268 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-44p8j" event={"ID":"7e32ac65-a775-42ea-9693-20cdd36084cd","Type":"ContainerStarted","Data":"ae6fa03b879e99e48b18a064aa230a16a86ea663d9a1e7995b6dba1c101e5845"} Oct 03 15:00:37 crc kubenswrapper[4962]: I1003 15:00:37.929238 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-openstack-openstack-cell1-44p8j" podStartSLOduration=2.766402275 podStartE2EDuration="2.929223288s" podCreationTimestamp="2025-10-03 15:00:35 +0000 UTC" firstStartedPulling="2025-10-03 15:00:36.892917569 +0000 UTC m=+7845.296815424" lastFinishedPulling="2025-10-03 15:00:37.055738602 +0000 UTC m=+7845.459636437" observedRunningTime="2025-10-03 15:00:37.920984019 +0000 UTC m=+7846.324881854" watchObservedRunningTime="2025-10-03 15:00:37.929223288 +0000 UTC m=+7846.333121123" Oct 03 15:00:44 crc kubenswrapper[4962]: I1003 15:00:44.974590 4962 generic.go:334] "Generic (PLEG): container finished" podID="7e32ac65-a775-42ea-9693-20cdd36084cd" containerID="ae6fa03b879e99e48b18a064aa230a16a86ea663d9a1e7995b6dba1c101e5845" exitCode=0 Oct 03 15:00:44 crc kubenswrapper[4962]: I1003 15:00:44.974718 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-44p8j" event={"ID":"7e32ac65-a775-42ea-9693-20cdd36084cd","Type":"ContainerDied","Data":"ae6fa03b879e99e48b18a064aa230a16a86ea663d9a1e7995b6dba1c101e5845"} Oct 03 15:00:46 crc kubenswrapper[4962]: I1003 15:00:46.994155 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-44p8j" event={"ID":"7e32ac65-a775-42ea-9693-20cdd36084cd","Type":"ContainerDied","Data":"4c3decb56c5df0a48eeec8546fc361f60dbdd67763c80923bf312e573633ec05"} Oct 03 15:00:46 crc kubenswrapper[4962]: I1003 15:00:46.994754 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c3decb56c5df0a48eeec8546fc361f60dbdd67763c80923bf312e573633ec05" Oct 03 15:00:47 crc kubenswrapper[4962]: I1003 15:00:47.042873 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-44p8j" Oct 03 15:00:47 crc kubenswrapper[4962]: I1003 15:00:47.212951 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e32ac65-a775-42ea-9693-20cdd36084cd-inventory\") pod \"7e32ac65-a775-42ea-9693-20cdd36084cd\" (UID: \"7e32ac65-a775-42ea-9693-20cdd36084cd\") " Oct 03 15:00:47 crc kubenswrapper[4962]: I1003 15:00:47.213483 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7e32ac65-a775-42ea-9693-20cdd36084cd-ceph\") pod \"7e32ac65-a775-42ea-9693-20cdd36084cd\" (UID: \"7e32ac65-a775-42ea-9693-20cdd36084cd\") " Oct 03 15:00:47 crc kubenswrapper[4962]: I1003 15:00:47.213599 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqhc2\" (UniqueName: \"kubernetes.io/projected/7e32ac65-a775-42ea-9693-20cdd36084cd-kube-api-access-mqhc2\") pod \"7e32ac65-a775-42ea-9693-20cdd36084cd\" (UID: \"7e32ac65-a775-42ea-9693-20cdd36084cd\") " Oct 03 15:00:47 crc kubenswrapper[4962]: I1003 15:00:47.215863 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7e32ac65-a775-42ea-9693-20cdd36084cd-ssh-key\") pod \"7e32ac65-a775-42ea-9693-20cdd36084cd\" (UID: \"7e32ac65-a775-42ea-9693-20cdd36084cd\") " Oct 03 15:00:47 crc kubenswrapper[4962]: I1003 15:00:47.218558 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e32ac65-a775-42ea-9693-20cdd36084cd-kube-api-access-mqhc2" (OuterVolumeSpecName: "kube-api-access-mqhc2") pod "7e32ac65-a775-42ea-9693-20cdd36084cd" (UID: "7e32ac65-a775-42ea-9693-20cdd36084cd"). InnerVolumeSpecName "kube-api-access-mqhc2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:00:47 crc kubenswrapper[4962]: I1003 15:00:47.228269 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e32ac65-a775-42ea-9693-20cdd36084cd-ceph" (OuterVolumeSpecName: "ceph") pod "7e32ac65-a775-42ea-9693-20cdd36084cd" (UID: "7e32ac65-a775-42ea-9693-20cdd36084cd"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:00:47 crc kubenswrapper[4962]: I1003 15:00:47.249623 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e32ac65-a775-42ea-9693-20cdd36084cd-inventory" (OuterVolumeSpecName: "inventory") pod "7e32ac65-a775-42ea-9693-20cdd36084cd" (UID: "7e32ac65-a775-42ea-9693-20cdd36084cd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:00:47 crc kubenswrapper[4962]: I1003 15:00:47.250700 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e32ac65-a775-42ea-9693-20cdd36084cd-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7e32ac65-a775-42ea-9693-20cdd36084cd" (UID: "7e32ac65-a775-42ea-9693-20cdd36084cd"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:00:47 crc kubenswrapper[4962]: I1003 15:00:47.317987 4962 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e32ac65-a775-42ea-9693-20cdd36084cd-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:47 crc kubenswrapper[4962]: I1003 15:00:47.318021 4962 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7e32ac65-a775-42ea-9693-20cdd36084cd-ceph\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:47 crc kubenswrapper[4962]: I1003 15:00:47.318057 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqhc2\" (UniqueName: \"kubernetes.io/projected/7e32ac65-a775-42ea-9693-20cdd36084cd-kube-api-access-mqhc2\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:47 crc kubenswrapper[4962]: I1003 15:00:47.318069 4962 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7e32ac65-a775-42ea-9693-20cdd36084cd-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:48 crc kubenswrapper[4962]: I1003 15:00:48.001042 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-44p8j" Oct 03 15:00:48 crc kubenswrapper[4962]: I1003 15:00:48.125424 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-mvkcw"] Oct 03 15:00:48 crc kubenswrapper[4962]: E1003 15:00:48.125846 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e32ac65-a775-42ea-9693-20cdd36084cd" containerName="run-os-openstack-openstack-cell1" Oct 03 15:00:48 crc kubenswrapper[4962]: I1003 15:00:48.125861 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e32ac65-a775-42ea-9693-20cdd36084cd" containerName="run-os-openstack-openstack-cell1" Oct 03 15:00:48 crc kubenswrapper[4962]: I1003 15:00:48.126065 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e32ac65-a775-42ea-9693-20cdd36084cd" containerName="run-os-openstack-openstack-cell1" Oct 03 15:00:48 crc kubenswrapper[4962]: I1003 15:00:48.126757 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-mvkcw" Oct 03 15:00:48 crc kubenswrapper[4962]: I1003 15:00:48.129598 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 15:00:48 crc kubenswrapper[4962]: I1003 15:00:48.129695 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-98wnm" Oct 03 15:00:48 crc kubenswrapper[4962]: I1003 15:00:48.129853 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 03 15:00:48 crc kubenswrapper[4962]: I1003 15:00:48.129904 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 03 15:00:48 crc kubenswrapper[4962]: I1003 15:00:48.143295 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-mvkcw"] Oct 03 15:00:48 crc kubenswrapper[4962]: I1003 15:00:48.148184 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pffr\" (UniqueName: \"kubernetes.io/projected/11c2a64a-42ac-44e7-a6f9-9af9798ce53b-kube-api-access-6pffr\") pod \"reboot-os-openstack-openstack-cell1-mvkcw\" (UID: \"11c2a64a-42ac-44e7-a6f9-9af9798ce53b\") " pod="openstack/reboot-os-openstack-openstack-cell1-mvkcw" Oct 03 15:00:48 crc kubenswrapper[4962]: I1003 15:00:48.148433 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11c2a64a-42ac-44e7-a6f9-9af9798ce53b-inventory\") pod \"reboot-os-openstack-openstack-cell1-mvkcw\" (UID: \"11c2a64a-42ac-44e7-a6f9-9af9798ce53b\") " pod="openstack/reboot-os-openstack-openstack-cell1-mvkcw" Oct 03 15:00:48 crc kubenswrapper[4962]: I1003 15:00:48.148480 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/11c2a64a-42ac-44e7-a6f9-9af9798ce53b-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-mvkcw\" (UID: \"11c2a64a-42ac-44e7-a6f9-9af9798ce53b\") " pod="openstack/reboot-os-openstack-openstack-cell1-mvkcw" Oct 03 15:00:48 crc kubenswrapper[4962]: I1003 15:00:48.148555 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/11c2a64a-42ac-44e7-a6f9-9af9798ce53b-ceph\") pod \"reboot-os-openstack-openstack-cell1-mvkcw\" (UID: \"11c2a64a-42ac-44e7-a6f9-9af9798ce53b\") " pod="openstack/reboot-os-openstack-openstack-cell1-mvkcw" Oct 03 15:00:48 crc kubenswrapper[4962]: I1003 15:00:48.251785 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pffr\" (UniqueName: \"kubernetes.io/projected/11c2a64a-42ac-44e7-a6f9-9af9798ce53b-kube-api-access-6pffr\") pod \"reboot-os-openstack-openstack-cell1-mvkcw\" (UID: \"11c2a64a-42ac-44e7-a6f9-9af9798ce53b\") " pod="openstack/reboot-os-openstack-openstack-cell1-mvkcw" Oct 03 15:00:48 crc kubenswrapper[4962]: I1003 15:00:48.252096 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11c2a64a-42ac-44e7-a6f9-9af9798ce53b-inventory\") pod \"reboot-os-openstack-openstack-cell1-mvkcw\" (UID: \"11c2a64a-42ac-44e7-a6f9-9af9798ce53b\") " pod="openstack/reboot-os-openstack-openstack-cell1-mvkcw" Oct 03 15:00:48 crc kubenswrapper[4962]: I1003 15:00:48.252142 4962 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/11c2a64a-42ac-44e7-a6f9-9af9798ce53b-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-mvkcw\" (UID: \"11c2a64a-42ac-44e7-a6f9-9af9798ce53b\") " pod="openstack/reboot-os-openstack-openstack-cell1-mvkcw" Oct 03 15:00:48 crc kubenswrapper[4962]: I1003 15:00:48.252242 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/11c2a64a-42ac-44e7-a6f9-9af9798ce53b-ceph\") pod \"reboot-os-openstack-openstack-cell1-mvkcw\" (UID: \"11c2a64a-42ac-44e7-a6f9-9af9798ce53b\") " pod="openstack/reboot-os-openstack-openstack-cell1-mvkcw" Oct 03 15:00:48 crc kubenswrapper[4962]: I1003 15:00:48.256976 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11c2a64a-42ac-44e7-a6f9-9af9798ce53b-inventory\") pod \"reboot-os-openstack-openstack-cell1-mvkcw\" (UID: \"11c2a64a-42ac-44e7-a6f9-9af9798ce53b\") " pod="openstack/reboot-os-openstack-openstack-cell1-mvkcw" Oct 03 15:00:48 crc kubenswrapper[4962]: I1003 15:00:48.269251 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/11c2a64a-42ac-44e7-a6f9-9af9798ce53b-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-mvkcw\" (UID: \"11c2a64a-42ac-44e7-a6f9-9af9798ce53b\") " pod="openstack/reboot-os-openstack-openstack-cell1-mvkcw" Oct 03 15:00:48 crc kubenswrapper[4962]: I1003 15:00:48.276987 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/11c2a64a-42ac-44e7-a6f9-9af9798ce53b-ceph\") pod \"reboot-os-openstack-openstack-cell1-mvkcw\" (UID: \"11c2a64a-42ac-44e7-a6f9-9af9798ce53b\") " pod="openstack/reboot-os-openstack-openstack-cell1-mvkcw" Oct 03 15:00:48 crc kubenswrapper[4962]: I1003 15:00:48.281391 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pffr\" (UniqueName: \"kubernetes.io/projected/11c2a64a-42ac-44e7-a6f9-9af9798ce53b-kube-api-access-6pffr\") pod \"reboot-os-openstack-openstack-cell1-mvkcw\" (UID: \"11c2a64a-42ac-44e7-a6f9-9af9798ce53b\") " pod="openstack/reboot-os-openstack-openstack-cell1-mvkcw" Oct 03 15:00:48 crc kubenswrapper[4962]: I1003 15:00:48.447471 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-mvkcw" Oct 03 15:00:48 crc kubenswrapper[4962]: I1003 15:00:48.991047 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-mvkcw"] Oct 03 15:00:49 crc kubenswrapper[4962]: I1003 15:00:49.010101 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-mvkcw" event={"ID":"11c2a64a-42ac-44e7-a6f9-9af9798ce53b","Type":"ContainerStarted","Data":"ecb4209cdee56e2913ae85d21067c3f9ddb9961b190d28a81efff8c5087b7157"} Oct 03 15:00:50 crc kubenswrapper[4962]: I1003 15:00:50.019974 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-mvkcw" event={"ID":"11c2a64a-42ac-44e7-a6f9-9af9798ce53b","Type":"ContainerStarted","Data":"6bb3462ffa36a9a79c41ea5a69201ed9a868b973e5f5f8319e9cd7ad80908b5d"} Oct 03 15:00:50 crc kubenswrapper[4962]: I1003 15:00:50.051826 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-openstack-openstack-cell1-mvkcw" podStartSLOduration=1.904267151 podStartE2EDuration="2.051804696s" podCreationTimestamp="2025-10-03 15:00:48 +0000 UTC" firstStartedPulling="2025-10-03 15:00:49.001924965 +0000 UTC m=+7857.405822800" lastFinishedPulling="2025-10-03 15:00:49.14946251 +0000 UTC m=+7857.553360345" observedRunningTime="2025-10-03 15:00:50.042811516 +0000 UTC m=+7858.446709381" watchObservedRunningTime="2025-10-03 15:00:50.051804696 +0000 UTC m=+7858.455702531" Oct 03 15:00:54 crc kubenswrapper[4962]: I1003 15:00:54.659580 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 15:00:54 crc kubenswrapper[4962]: I1003 15:00:54.660249 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 15:01:00 crc kubenswrapper[4962]: I1003 15:01:00.138219 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29325061-brkz8"] Oct 03 15:01:00 crc kubenswrapper[4962]: I1003 15:01:00.140834 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29325061-brkz8" Oct 03 15:01:00 crc kubenswrapper[4962]: I1003 15:01:00.154221 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29325061-brkz8"] Oct 03 15:01:00 crc kubenswrapper[4962]: I1003 15:01:00.243271 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ec33b45-7142-49fb-9f47-719b72891dc1-config-data\") pod \"keystone-cron-29325061-brkz8\" (UID: \"4ec33b45-7142-49fb-9f47-719b72891dc1\") " pod="openstack/keystone-cron-29325061-brkz8" Oct 03 15:01:00 crc kubenswrapper[4962]: I1003 15:01:00.243347 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4ec33b45-7142-49fb-9f47-719b72891dc1-fernet-keys\") pod \"keystone-cron-29325061-brkz8\" (UID: \"4ec33b45-7142-49fb-9f47-719b72891dc1\") " pod="openstack/keystone-cron-29325061-brkz8" Oct 03 15:01:00 crc kubenswrapper[4962]: I1003 15:01:00.243409 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6bbp\" (UniqueName: \"kubernetes.io/projected/4ec33b45-7142-49fb-9f47-719b72891dc1-kube-api-access-h6bbp\") pod \"keystone-cron-29325061-brkz8\" (UID: \"4ec33b45-7142-49fb-9f47-719b72891dc1\") " pod="openstack/keystone-cron-29325061-brkz8" Oct 03 15:01:00 crc kubenswrapper[4962]: I1003 15:01:00.243721 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ec33b45-7142-49fb-9f47-719b72891dc1-combined-ca-bundle\") pod \"keystone-cron-29325061-brkz8\" (UID: \"4ec33b45-7142-49fb-9f47-719b72891dc1\") " pod="openstack/keystone-cron-29325061-brkz8" Oct 03 15:01:00 crc kubenswrapper[4962]: I1003 15:01:00.347289 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ec33b45-7142-49fb-9f47-719b72891dc1-combined-ca-bundle\") pod \"keystone-cron-29325061-brkz8\" (UID: \"4ec33b45-7142-49fb-9f47-719b72891dc1\") " pod="openstack/keystone-cron-29325061-brkz8" Oct 03 15:01:00 crc kubenswrapper[4962]: I1003 15:01:00.347452 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ec33b45-7142-49fb-9f47-719b72891dc1-config-data\") pod \"keystone-cron-29325061-brkz8\" (UID: \"4ec33b45-7142-49fb-9f47-719b72891dc1\") " pod="openstack/keystone-cron-29325061-brkz8" Oct 03 15:01:00 crc kubenswrapper[4962]: I1003 15:01:00.347515 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4ec33b45-7142-49fb-9f47-719b72891dc1-fernet-keys\") pod \"keystone-cron-29325061-brkz8\" (UID: \"4ec33b45-7142-49fb-9f47-719b72891dc1\") " pod="openstack/keystone-cron-29325061-brkz8" Oct 03 15:01:00 crc kubenswrapper[4962]: I1003 15:01:00.347606 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6bbp\" (UniqueName: \"kubernetes.io/projected/4ec33b45-7142-49fb-9f47-719b72891dc1-kube-api-access-h6bbp\") pod \"keystone-cron-29325061-brkz8\" (UID: \"4ec33b45-7142-49fb-9f47-719b72891dc1\") " pod="openstack/keystone-cron-29325061-brkz8" Oct 03 15:01:00 crc kubenswrapper[4962]: I1003 15:01:00.353703 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4ec33b45-7142-49fb-9f47-719b72891dc1-fernet-keys\") pod \"keystone-cron-29325061-brkz8\" (UID: \"4ec33b45-7142-49fb-9f47-719b72891dc1\") " pod="openstack/keystone-cron-29325061-brkz8" Oct 03 15:01:00 crc kubenswrapper[4962]: I1003 15:01:00.353703 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ec33b45-7142-49fb-9f47-719b72891dc1-combined-ca-bundle\") pod \"keystone-cron-29325061-brkz8\" (UID: \"4ec33b45-7142-49fb-9f47-719b72891dc1\") " pod="openstack/keystone-cron-29325061-brkz8" Oct 03 15:01:00 crc kubenswrapper[4962]: I1003 15:01:00.355162 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ec33b45-7142-49fb-9f47-719b72891dc1-config-data\") pod \"keystone-cron-29325061-brkz8\" (UID: \"4ec33b45-7142-49fb-9f47-719b72891dc1\") " pod="openstack/keystone-cron-29325061-brkz8" Oct 03 15:01:00 crc kubenswrapper[4962]: I1003 15:01:00.368288 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6bbp\" (UniqueName: \"kubernetes.io/projected/4ec33b45-7142-49fb-9f47-719b72891dc1-kube-api-access-h6bbp\") pod \"keystone-cron-29325061-brkz8\" (UID: \"4ec33b45-7142-49fb-9f47-719b72891dc1\") " pod="openstack/keystone-cron-29325061-brkz8" Oct 03 15:01:00 crc kubenswrapper[4962]: I1003 15:01:00.467358 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29325061-brkz8" Oct 03 15:01:00 crc kubenswrapper[4962]: I1003 15:01:00.937707 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29325061-brkz8"] Oct 03 15:01:00 crc kubenswrapper[4962]: W1003 15:01:00.944946 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ec33b45_7142_49fb_9f47_719b72891dc1.slice/crio-ce44e6901e4489e1862e1bb1b197ca9eb95016e4b11696f72ac34c65f4cfd8a2 WatchSource:0}: Error finding container ce44e6901e4489e1862e1bb1b197ca9eb95016e4b11696f72ac34c65f4cfd8a2: Status 404 returned error can't find the container with id ce44e6901e4489e1862e1bb1b197ca9eb95016e4b11696f72ac34c65f4cfd8a2 Oct 03 15:01:01 crc kubenswrapper[4962]: I1003 15:01:01.134600 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29325061-brkz8" event={"ID":"4ec33b45-7142-49fb-9f47-719b72891dc1","Type":"ContainerStarted","Data":"ce44e6901e4489e1862e1bb1b197ca9eb95016e4b11696f72ac34c65f4cfd8a2"} Oct 03 15:01:02 crc kubenswrapper[4962]: I1003 15:01:02.151508 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29325061-brkz8" event={"ID":"4ec33b45-7142-49fb-9f47-719b72891dc1","Type":"ContainerStarted","Data":"44d3b69636369f06c32e8de34b86d87c55cada1bb5ce514893f5272c8dfda319"} Oct 03 15:01:02 crc kubenswrapper[4962]: I1003 15:01:02.179372 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29325061-brkz8" podStartSLOduration=2.179351807 podStartE2EDuration="2.179351807s" podCreationTimestamp="2025-10-03 15:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:01:02.176584423 +0000 UTC m=+7870.580482258" watchObservedRunningTime="2025-10-03 15:01:02.179351807 +0000 UTC m=+7870.583249642" Oct 03 15:01:04 crc kubenswrapper[4962]: I1003 15:01:04.194614 4962 
generic.go:334] "Generic (PLEG): container finished" podID="4ec33b45-7142-49fb-9f47-719b72891dc1" containerID="44d3b69636369f06c32e8de34b86d87c55cada1bb5ce514893f5272c8dfda319" exitCode=0 Oct 03 15:01:04 crc kubenswrapper[4962]: I1003 15:01:04.194716 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29325061-brkz8" event={"ID":"4ec33b45-7142-49fb-9f47-719b72891dc1","Type":"ContainerDied","Data":"44d3b69636369f06c32e8de34b86d87c55cada1bb5ce514893f5272c8dfda319"} Oct 03 15:01:05 crc kubenswrapper[4962]: I1003 15:01:05.590207 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29325061-brkz8" Oct 03 15:01:05 crc kubenswrapper[4962]: I1003 15:01:05.668057 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6bbp\" (UniqueName: \"kubernetes.io/projected/4ec33b45-7142-49fb-9f47-719b72891dc1-kube-api-access-h6bbp\") pod \"4ec33b45-7142-49fb-9f47-719b72891dc1\" (UID: \"4ec33b45-7142-49fb-9f47-719b72891dc1\") " Oct 03 15:01:05 crc kubenswrapper[4962]: I1003 15:01:05.668262 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4ec33b45-7142-49fb-9f47-719b72891dc1-fernet-keys\") pod \"4ec33b45-7142-49fb-9f47-719b72891dc1\" (UID: \"4ec33b45-7142-49fb-9f47-719b72891dc1\") " Oct 03 15:01:05 crc kubenswrapper[4962]: I1003 15:01:05.668775 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ec33b45-7142-49fb-9f47-719b72891dc1-config-data\") pod \"4ec33b45-7142-49fb-9f47-719b72891dc1\" (UID: \"4ec33b45-7142-49fb-9f47-719b72891dc1\") " Oct 03 15:01:05 crc kubenswrapper[4962]: I1003 15:01:05.669018 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ec33b45-7142-49fb-9f47-719b72891dc1-combined-ca-bundle\") pod \"4ec33b45-7142-49fb-9f47-719b72891dc1\" (UID: \"4ec33b45-7142-49fb-9f47-719b72891dc1\") " Oct 03 15:01:05 crc kubenswrapper[4962]: I1003 15:01:05.674108 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ec33b45-7142-49fb-9f47-719b72891dc1-kube-api-access-h6bbp" (OuterVolumeSpecName: "kube-api-access-h6bbp") pod "4ec33b45-7142-49fb-9f47-719b72891dc1" (UID: "4ec33b45-7142-49fb-9f47-719b72891dc1"). InnerVolumeSpecName "kube-api-access-h6bbp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:01:05 crc kubenswrapper[4962]: I1003 15:01:05.674132 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ec33b45-7142-49fb-9f47-719b72891dc1-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "4ec33b45-7142-49fb-9f47-719b72891dc1" (UID: "4ec33b45-7142-49fb-9f47-719b72891dc1"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:05 crc kubenswrapper[4962]: I1003 15:01:05.712482 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ec33b45-7142-49fb-9f47-719b72891dc1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4ec33b45-7142-49fb-9f47-719b72891dc1" (UID: "4ec33b45-7142-49fb-9f47-719b72891dc1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:05 crc kubenswrapper[4962]: I1003 15:01:05.720776 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ec33b45-7142-49fb-9f47-719b72891dc1-config-data" (OuterVolumeSpecName: "config-data") pod "4ec33b45-7142-49fb-9f47-719b72891dc1" (UID: "4ec33b45-7142-49fb-9f47-719b72891dc1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:05 crc kubenswrapper[4962]: I1003 15:01:05.771883 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ec33b45-7142-49fb-9f47-719b72891dc1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:05 crc kubenswrapper[4962]: I1003 15:01:05.771922 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6bbp\" (UniqueName: \"kubernetes.io/projected/4ec33b45-7142-49fb-9f47-719b72891dc1-kube-api-access-h6bbp\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:05 crc kubenswrapper[4962]: I1003 15:01:05.771934 4962 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4ec33b45-7142-49fb-9f47-719b72891dc1-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:05 crc kubenswrapper[4962]: I1003 15:01:05.771942 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ec33b45-7142-49fb-9f47-719b72891dc1-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:06 crc kubenswrapper[4962]: I1003 15:01:06.213583 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29325061-brkz8" event={"ID":"4ec33b45-7142-49fb-9f47-719b72891dc1","Type":"ContainerDied","Data":"ce44e6901e4489e1862e1bb1b197ca9eb95016e4b11696f72ac34c65f4cfd8a2"} Oct 03 15:01:06 crc kubenswrapper[4962]: I1003 15:01:06.213617 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29325061-brkz8" Oct 03 15:01:06 crc kubenswrapper[4962]: I1003 15:01:06.213623 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce44e6901e4489e1862e1bb1b197ca9eb95016e4b11696f72ac34c65f4cfd8a2" Oct 03 15:01:14 crc kubenswrapper[4962]: I1003 15:01:14.302270 4962 generic.go:334] "Generic (PLEG): container finished" podID="11c2a64a-42ac-44e7-a6f9-9af9798ce53b" containerID="6bb3462ffa36a9a79c41ea5a69201ed9a868b973e5f5f8319e9cd7ad80908b5d" exitCode=0 Oct 03 15:01:14 crc kubenswrapper[4962]: I1003 15:01:14.302998 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-mvkcw" event={"ID":"11c2a64a-42ac-44e7-a6f9-9af9798ce53b","Type":"ContainerDied","Data":"6bb3462ffa36a9a79c41ea5a69201ed9a868b973e5f5f8319e9cd7ad80908b5d"} Oct 03 15:01:15 crc kubenswrapper[4962]: I1003 15:01:15.832855 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-mvkcw" Oct 03 15:01:15 crc kubenswrapper[4962]: I1003 15:01:15.906621 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11c2a64a-42ac-44e7-a6f9-9af9798ce53b-inventory\") pod \"11c2a64a-42ac-44e7-a6f9-9af9798ce53b\" (UID: \"11c2a64a-42ac-44e7-a6f9-9af9798ce53b\") " Oct 03 15:01:15 crc kubenswrapper[4962]: I1003 15:01:15.906918 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/11c2a64a-42ac-44e7-a6f9-9af9798ce53b-ceph\") pod \"11c2a64a-42ac-44e7-a6f9-9af9798ce53b\" (UID: \"11c2a64a-42ac-44e7-a6f9-9af9798ce53b\") " Oct 03 15:01:15 crc kubenswrapper[4962]: I1003 15:01:15.907020 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/11c2a64a-42ac-44e7-a6f9-9af9798ce53b-ssh-key\") pod \"11c2a64a-42ac-44e7-a6f9-9af9798ce53b\" (UID: \"11c2a64a-42ac-44e7-a6f9-9af9798ce53b\") " Oct 03 15:01:15 crc kubenswrapper[4962]: I1003 15:01:15.907152 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pffr\" (UniqueName: \"kubernetes.io/projected/11c2a64a-42ac-44e7-a6f9-9af9798ce53b-kube-api-access-6pffr\") pod \"11c2a64a-42ac-44e7-a6f9-9af9798ce53b\" (UID: \"11c2a64a-42ac-44e7-a6f9-9af9798ce53b\") " Oct 03 15:01:15 crc kubenswrapper[4962]: I1003 15:01:15.914586 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11c2a64a-42ac-44e7-a6f9-9af9798ce53b-kube-api-access-6pffr" (OuterVolumeSpecName: "kube-api-access-6pffr") pod "11c2a64a-42ac-44e7-a6f9-9af9798ce53b" (UID: "11c2a64a-42ac-44e7-a6f9-9af9798ce53b"). InnerVolumeSpecName "kube-api-access-6pffr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:01:15 crc kubenswrapper[4962]: I1003 15:01:15.915096 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11c2a64a-42ac-44e7-a6f9-9af9798ce53b-ceph" (OuterVolumeSpecName: "ceph") pod "11c2a64a-42ac-44e7-a6f9-9af9798ce53b" (UID: "11c2a64a-42ac-44e7-a6f9-9af9798ce53b"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:15 crc kubenswrapper[4962]: I1003 15:01:15.945371 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11c2a64a-42ac-44e7-a6f9-9af9798ce53b-inventory" (OuterVolumeSpecName: "inventory") pod "11c2a64a-42ac-44e7-a6f9-9af9798ce53b" (UID: "11c2a64a-42ac-44e7-a6f9-9af9798ce53b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:15 crc kubenswrapper[4962]: I1003 15:01:15.970736 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11c2a64a-42ac-44e7-a6f9-9af9798ce53b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "11c2a64a-42ac-44e7-a6f9-9af9798ce53b" (UID: "11c2a64a-42ac-44e7-a6f9-9af9798ce53b"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.009692 4962 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11c2a64a-42ac-44e7-a6f9-9af9798ce53b-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.009721 4962 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/11c2a64a-42ac-44e7-a6f9-9af9798ce53b-ceph\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.009730 4962 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/11c2a64a-42ac-44e7-a6f9-9af9798ce53b-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.009740 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pffr\" (UniqueName: \"kubernetes.io/projected/11c2a64a-42ac-44e7-a6f9-9af9798ce53b-kube-api-access-6pffr\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.331566 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-mvkcw" event={"ID":"11c2a64a-42ac-44e7-a6f9-9af9798ce53b","Type":"ContainerDied","Data":"ecb4209cdee56e2913ae85d21067c3f9ddb9961b190d28a81efff8c5087b7157"} Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.331613 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ecb4209cdee56e2913ae85d21067c3f9ddb9961b190d28a81efff8c5087b7157" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.331763 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-mvkcw" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.427380 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-ls7cn"] Oct 03 15:01:16 crc kubenswrapper[4962]: E1003 15:01:16.428767 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11c2a64a-42ac-44e7-a6f9-9af9798ce53b" containerName="reboot-os-openstack-openstack-cell1" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.428899 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="11c2a64a-42ac-44e7-a6f9-9af9798ce53b" containerName="reboot-os-openstack-openstack-cell1" Oct 03 15:01:16 crc kubenswrapper[4962]: E1003 15:01:16.429055 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ec33b45-7142-49fb-9f47-719b72891dc1" containerName="keystone-cron" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.429174 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ec33b45-7142-49fb-9f47-719b72891dc1" containerName="keystone-cron" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.429731 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ec33b45-7142-49fb-9f47-719b72891dc1" containerName="keystone-cron" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.429885 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="11c2a64a-42ac-44e7-a6f9-9af9798ce53b" containerName="reboot-os-openstack-openstack-cell1" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.431746 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-ls7cn" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.434627 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.434868 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-98wnm" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.434955 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.435083 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.440432 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-ls7cn"] Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.521163 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-inventory\") pod \"install-certs-openstack-openstack-cell1-ls7cn\" (UID: \"eb12d174-0920-408e-aa2f-ff2f07a1e005\") " pod="openstack/install-certs-openstack-openstack-cell1-ls7cn" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.521243 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-ls7cn\" (UID: \"eb12d174-0920-408e-aa2f-ff2f07a1e005\") " pod="openstack/install-certs-openstack-openstack-cell1-ls7cn" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.521343 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-ls7cn\" (UID: \"eb12d174-0920-408e-aa2f-ff2f07a1e005\") " pod="openstack/install-certs-openstack-openstack-cell1-ls7cn" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.521416 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-ls7cn\" (UID: \"eb12d174-0920-408e-aa2f-ff2f07a1e005\") " pod="openstack/install-certs-openstack-openstack-cell1-ls7cn" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.521442 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-ls7cn\" (UID: \"eb12d174-0920-408e-aa2f-ff2f07a1e005\") " pod="openstack/install-certs-openstack-openstack-cell1-ls7cn" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.521554 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-ls7cn\" (UID: \"eb12d174-0920-408e-aa2f-ff2f07a1e005\") " pod="openstack/install-certs-openstack-openstack-cell1-ls7cn" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.521583 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-ls7cn\" (UID: \"eb12d174-0920-408e-aa2f-ff2f07a1e005\") " pod="openstack/install-certs-openstack-openstack-cell1-ls7cn" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.521659 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-ceph\") pod \"install-certs-openstack-openstack-cell1-ls7cn\" (UID: \"eb12d174-0920-408e-aa2f-ff2f07a1e005\") " pod="openstack/install-certs-openstack-openstack-cell1-ls7cn" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.521713 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-ssh-key\") pod \"install-certs-openstack-openstack-cell1-ls7cn\" (UID: \"eb12d174-0920-408e-aa2f-ff2f07a1e005\") " pod="openstack/install-certs-openstack-openstack-cell1-ls7cn" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.521809 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-ls7cn\" (UID: \"eb12d174-0920-408e-aa2f-ff2f07a1e005\") " pod="openstack/install-certs-openstack-openstack-cell1-ls7cn" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.521882 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnv57\" (UniqueName: \"kubernetes.io/projected/eb12d174-0920-408e-aa2f-ff2f07a1e005-kube-api-access-tnv57\") pod \"install-certs-openstack-openstack-cell1-ls7cn\" (UID: \"eb12d174-0920-408e-aa2f-ff2f07a1e005\") " pod="openstack/install-certs-openstack-openstack-cell1-ls7cn" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.521933 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-ls7cn\" (UID: \"eb12d174-0920-408e-aa2f-ff2f07a1e005\") " pod="openstack/install-certs-openstack-openstack-cell1-ls7cn" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.624221 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-inventory\") pod \"install-certs-openstack-openstack-cell1-ls7cn\" (UID: \"eb12d174-0920-408e-aa2f-ff2f07a1e005\") " pod="openstack/install-certs-openstack-openstack-cell1-ls7cn" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.624275 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-ls7cn\" (UID: \"eb12d174-0920-408e-aa2f-ff2f07a1e005\") " pod="openstack/install-certs-openstack-openstack-cell1-ls7cn" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.624353 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-ls7cn\" (UID: \"eb12d174-0920-408e-aa2f-ff2f07a1e005\") " pod="openstack/install-certs-openstack-openstack-cell1-ls7cn" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.624422 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-ls7cn\" (UID: \"eb12d174-0920-408e-aa2f-ff2f07a1e005\") " pod="openstack/install-certs-openstack-openstack-cell1-ls7cn" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.624452 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-ls7cn\" (UID: \"eb12d174-0920-408e-aa2f-ff2f07a1e005\") " pod="openstack/install-certs-openstack-openstack-cell1-ls7cn" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.624515 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-ls7cn\" (UID: \"eb12d174-0920-408e-aa2f-ff2f07a1e005\") " pod="openstack/install-certs-openstack-openstack-cell1-ls7cn" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.624552 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-ls7cn\" (UID: \"eb12d174-0920-408e-aa2f-ff2f07a1e005\") " pod="openstack/install-certs-openstack-openstack-cell1-ls7cn" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.624615 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-ceph\") pod \"install-certs-openstack-openstack-cell1-ls7cn\" (UID: \"eb12d174-0920-408e-aa2f-ff2f07a1e005\") " pod="openstack/install-certs-openstack-openstack-cell1-ls7cn" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.624723 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-ssh-key\") pod \"install-certs-openstack-openstack-cell1-ls7cn\" (UID: \"eb12d174-0920-408e-aa2f-ff2f07a1e005\") " pod="openstack/install-certs-openstack-openstack-cell1-ls7cn" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.624769 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-ls7cn\" (UID: \"eb12d174-0920-408e-aa2f-ff2f07a1e005\") " pod="openstack/install-certs-openstack-openstack-cell1-ls7cn" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.624824 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnv57\" (UniqueName: \"kubernetes.io/projected/eb12d174-0920-408e-aa2f-ff2f07a1e005-kube-api-access-tnv57\") pod \"install-certs-openstack-openstack-cell1-ls7cn\" (UID: \"eb12d174-0920-408e-aa2f-ff2f07a1e005\") " pod="openstack/install-certs-openstack-openstack-cell1-ls7cn" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.624868 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-ls7cn\" (UID: \"eb12d174-0920-408e-aa2f-ff2f07a1e005\") " pod="openstack/install-certs-openstack-openstack-cell1-ls7cn" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.629451 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-ls7cn\" (UID: \"eb12d174-0920-408e-aa2f-ff2f07a1e005\") " pod="openstack/install-certs-openstack-openstack-cell1-ls7cn" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.629544 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-ls7cn\" (UID: \"eb12d174-0920-408e-aa2f-ff2f07a1e005\") " pod="openstack/install-certs-openstack-openstack-cell1-ls7cn" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.629620 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-inventory\") pod \"install-certs-openstack-openstack-cell1-ls7cn\" (UID: \"eb12d174-0920-408e-aa2f-ff2f07a1e005\") " pod="openstack/install-certs-openstack-openstack-cell1-ls7cn" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.630442 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-ls7cn\" (UID: \"eb12d174-0920-408e-aa2f-ff2f07a1e005\") " pod="openstack/install-certs-openstack-openstack-cell1-ls7cn" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.630452 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-ls7cn\" (UID: \"eb12d174-0920-408e-aa2f-ff2f07a1e005\") " pod="openstack/install-certs-openstack-openstack-cell1-ls7cn" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.630826 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-ls7cn\" (UID: \"eb12d174-0920-408e-aa2f-ff2f07a1e005\") " pod="openstack/install-certs-openstack-openstack-cell1-ls7cn" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.633840 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-ls7cn\" (UID: \"eb12d174-0920-408e-aa2f-ff2f07a1e005\") " pod="openstack/install-certs-openstack-openstack-cell1-ls7cn" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.633993 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-ceph\") pod \"install-certs-openstack-openstack-cell1-ls7cn\" (UID: \"eb12d174-0920-408e-aa2f-ff2f07a1e005\") " pod="openstack/install-certs-openstack-openstack-cell1-ls7cn" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.640750 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-ls7cn\" (UID: \"eb12d174-0920-408e-aa2f-ff2f07a1e005\") " pod="openstack/install-certs-openstack-openstack-cell1-ls7cn" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.640988 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-ls7cn\" (UID: \"eb12d174-0920-408e-aa2f-ff2f07a1e005\") " pod="openstack/install-certs-openstack-openstack-cell1-ls7cn" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.643020 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-ssh-key\") pod \"install-certs-openstack-openstack-cell1-ls7cn\" (UID: \"eb12d174-0920-408e-aa2f-ff2f07a1e005\") " pod="openstack/install-certs-openstack-openstack-cell1-ls7cn" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.646471 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnv57\" (UniqueName: \"kubernetes.io/projected/eb12d174-0920-408e-aa2f-ff2f07a1e005-kube-api-access-tnv57\") pod \"install-certs-openstack-openstack-cell1-ls7cn\" (UID: \"eb12d174-0920-408e-aa2f-ff2f07a1e005\") " pod="openstack/install-certs-openstack-openstack-cell1-ls7cn" Oct 03 15:01:16 crc kubenswrapper[4962]: I1003 15:01:16.752961 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-ls7cn" Oct 03 15:01:17 crc kubenswrapper[4962]: I1003 15:01:17.274117 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-ls7cn"] Oct 03 15:01:17 crc kubenswrapper[4962]: I1003 15:01:17.340528 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-ls7cn" event={"ID":"eb12d174-0920-408e-aa2f-ff2f07a1e005","Type":"ContainerStarted","Data":"568b83821c27b26323d74937b13668a5f34309e91053d134b045cf3ff2d983f3"} Oct 03 15:01:18 crc kubenswrapper[4962]: I1003 15:01:18.352526 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-ls7cn" event={"ID":"eb12d174-0920-408e-aa2f-ff2f07a1e005","Type":"ContainerStarted","Data":"ff53b2005a27156b2d46670e75a2ed208ab4b13a8181756db71e7967b237b41e"} Oct 03 15:01:18 crc kubenswrapper[4962]: I1003 15:01:18.378142 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-openstack-openstack-cell1-ls7cn" podStartSLOduration=2.212310735 podStartE2EDuration="2.378121478s" podCreationTimestamp="2025-10-03 15:01:16 +0000 UTC" firstStartedPulling="2025-10-03 15:01:17.274692569 +0000 UTC m=+7885.678590414" lastFinishedPulling="2025-10-03 15:01:17.440503322 +0000 UTC m=+7885.844401157" observedRunningTime="2025-10-03 15:01:18.374319107 +0000 UTC m=+7886.778216962" watchObservedRunningTime="2025-10-03 15:01:18.378121478 +0000 UTC m=+7886.782019333" Oct 03 15:01:24 crc kubenswrapper[4962]: I1003 15:01:24.660085 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 15:01:24 crc kubenswrapper[4962]: I1003 15:01:24.660923 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 15:01:24 crc kubenswrapper[4962]: I1003 15:01:24.660980 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-46vck" Oct 03 15:01:24 crc kubenswrapper[4962]: I1003 15:01:24.661969 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9e969c5b4663b9266408ae81a93dc573a565f6f23975eb2be7d5867d2c0fb330"} pod="openshift-machine-config-operator/machine-config-daemon-46vck" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 15:01:24 crc kubenswrapper[4962]: I1003 15:01:24.662029 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" containerID="cri-o://9e969c5b4663b9266408ae81a93dc573a565f6f23975eb2be7d5867d2c0fb330" gracePeriod=600 Oct 03 15:01:24 crc kubenswrapper[4962]: E1003 15:01:24.809194 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 15:01:25 crc kubenswrapper[4962]: I1003 15:01:25.417779 4962 generic.go:334] "Generic (PLEG): container finished" podID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerID="9e969c5b4663b9266408ae81a93dc573a565f6f23975eb2be7d5867d2c0fb330" exitCode=0 Oct 03 15:01:25 crc kubenswrapper[4962]: I1003 15:01:25.417835 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerDied","Data":"9e969c5b4663b9266408ae81a93dc573a565f6f23975eb2be7d5867d2c0fb330"} Oct 03 15:01:25 crc kubenswrapper[4962]: I1003 15:01:25.418022 4962 scope.go:117] "RemoveContainer" containerID="4a92d28487ead87bf551d56d7eaa01560a202df6439181d7e00d2ba06e2d0351" Oct 03 15:01:25 crc kubenswrapper[4962]: I1003 15:01:25.418946 4962 scope.go:117] "RemoveContainer" containerID="9e969c5b4663b9266408ae81a93dc573a565f6f23975eb2be7d5867d2c0fb330" Oct 03 15:01:25 crc kubenswrapper[4962]: E1003 15:01:25.419467 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 15:01:35 crc kubenswrapper[4962]: I1003 15:01:35.514169 4962 generic.go:334] "Generic (PLEG): container finished" podID="eb12d174-0920-408e-aa2f-ff2f07a1e005" containerID="ff53b2005a27156b2d46670e75a2ed208ab4b13a8181756db71e7967b237b41e" exitCode=0 Oct 03 15:01:35 crc kubenswrapper[4962]: I1003 15:01:35.514700 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-ls7cn" event={"ID":"eb12d174-0920-408e-aa2f-ff2f07a1e005","Type":"ContainerDied","Data":"ff53b2005a27156b2d46670e75a2ed208ab4b13a8181756db71e7967b237b41e"} Oct 03 15:01:36 crc kubenswrapper[4962]: I1003 15:01:36.944125 4962 util.go:48] "No ready sandbox for pod can be found. 
Oct 03 15:01:36 crc kubenswrapper[4962]: I1003 15:01:36.997757 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-libvirt-combined-ca-bundle\") pod \"eb12d174-0920-408e-aa2f-ff2f07a1e005\" (UID: \"eb12d174-0920-408e-aa2f-ff2f07a1e005\") "
Oct 03 15:01:36 crc kubenswrapper[4962]: I1003 15:01:36.997846 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-telemetry-combined-ca-bundle\") pod \"eb12d174-0920-408e-aa2f-ff2f07a1e005\" (UID: \"eb12d174-0920-408e-aa2f-ff2f07a1e005\") "
Oct 03 15:01:36 crc kubenswrapper[4962]: I1003 15:01:36.997872 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-neutron-metadata-combined-ca-bundle\") pod \"eb12d174-0920-408e-aa2f-ff2f07a1e005\" (UID: \"eb12d174-0920-408e-aa2f-ff2f07a1e005\") "
Oct 03 15:01:36 crc kubenswrapper[4962]: I1003 15:01:36.997907 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-ovn-combined-ca-bundle\") pod \"eb12d174-0920-408e-aa2f-ff2f07a1e005\" (UID: \"eb12d174-0920-408e-aa2f-ff2f07a1e005\") "
Oct 03 15:01:36 crc kubenswrapper[4962]: I1003 15:01:36.997949 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-ssh-key\") pod \"eb12d174-0920-408e-aa2f-ff2f07a1e005\" (UID: \"eb12d174-0920-408e-aa2f-ff2f07a1e005\") "
Oct 03 15:01:36 crc kubenswrapper[4962]: I1003 15:01:36.997984 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnv57\" (UniqueName: \"kubernetes.io/projected/eb12d174-0920-408e-aa2f-ff2f07a1e005-kube-api-access-tnv57\") pod \"eb12d174-0920-408e-aa2f-ff2f07a1e005\" (UID: \"eb12d174-0920-408e-aa2f-ff2f07a1e005\") "
Oct 03 15:01:36 crc kubenswrapper[4962]: I1003 15:01:36.998015 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-inventory\") pod \"eb12d174-0920-408e-aa2f-ff2f07a1e005\" (UID: \"eb12d174-0920-408e-aa2f-ff2f07a1e005\") "
Oct 03 15:01:36 crc kubenswrapper[4962]: I1003 15:01:36.998055 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-bootstrap-combined-ca-bundle\") pod \"eb12d174-0920-408e-aa2f-ff2f07a1e005\" (UID: \"eb12d174-0920-408e-aa2f-ff2f07a1e005\") "
Oct 03 15:01:36 crc kubenswrapper[4962]: I1003 15:01:36.998093 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-nova-combined-ca-bundle\") pod \"eb12d174-0920-408e-aa2f-ff2f07a1e005\" (UID: \"eb12d174-0920-408e-aa2f-ff2f07a1e005\") "
Oct 03 15:01:36 crc kubenswrapper[4962]: I1003 15:01:36.998112 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-neutron-dhcp-combined-ca-bundle\") pod \"eb12d174-0920-408e-aa2f-ff2f07a1e005\" (UID: \"eb12d174-0920-408e-aa2f-ff2f07a1e005\") "
Oct 03 15:01:36 crc kubenswrapper[4962]: I1003 15:01:36.998141 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-neutron-sriov-combined-ca-bundle\") pod \"eb12d174-0920-408e-aa2f-ff2f07a1e005\" (UID: \"eb12d174-0920-408e-aa2f-ff2f07a1e005\") "
Oct 03 15:01:36 crc kubenswrapper[4962]: I1003 15:01:36.998194 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-ceph\") pod \"eb12d174-0920-408e-aa2f-ff2f07a1e005\" (UID: \"eb12d174-0920-408e-aa2f-ff2f07a1e005\") "
Oct 03 15:01:37 crc kubenswrapper[4962]: I1003 15:01:37.004358 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "eb12d174-0920-408e-aa2f-ff2f07a1e005" (UID: "eb12d174-0920-408e-aa2f-ff2f07a1e005"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 15:01:37 crc kubenswrapper[4962]: I1003 15:01:37.004818 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "eb12d174-0920-408e-aa2f-ff2f07a1e005" (UID: "eb12d174-0920-408e-aa2f-ff2f07a1e005"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 15:01:37 crc kubenswrapper[4962]: I1003 15:01:37.004852 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "eb12d174-0920-408e-aa2f-ff2f07a1e005" (UID: "eb12d174-0920-408e-aa2f-ff2f07a1e005"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 15:01:37 crc kubenswrapper[4962]: I1003 15:01:37.004963 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "eb12d174-0920-408e-aa2f-ff2f07a1e005" (UID: "eb12d174-0920-408e-aa2f-ff2f07a1e005"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 15:01:37 crc kubenswrapper[4962]: I1003 15:01:37.005226 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "eb12d174-0920-408e-aa2f-ff2f07a1e005" (UID: "eb12d174-0920-408e-aa2f-ff2f07a1e005"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 15:01:37 crc kubenswrapper[4962]: I1003 15:01:37.005787 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb12d174-0920-408e-aa2f-ff2f07a1e005-kube-api-access-tnv57" (OuterVolumeSpecName: "kube-api-access-tnv57") pod "eb12d174-0920-408e-aa2f-ff2f07a1e005" (UID: "eb12d174-0920-408e-aa2f-ff2f07a1e005"). InnerVolumeSpecName "kube-api-access-tnv57". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 15:01:37 crc kubenswrapper[4962]: I1003 15:01:37.005895 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-ceph" (OuterVolumeSpecName: "ceph") pod "eb12d174-0920-408e-aa2f-ff2f07a1e005" (UID: "eb12d174-0920-408e-aa2f-ff2f07a1e005"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 15:01:37 crc kubenswrapper[4962]: I1003 15:01:37.006032 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "eb12d174-0920-408e-aa2f-ff2f07a1e005" (UID: "eb12d174-0920-408e-aa2f-ff2f07a1e005"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 15:01:37 crc kubenswrapper[4962]: I1003 15:01:37.006968 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "eb12d174-0920-408e-aa2f-ff2f07a1e005" (UID: "eb12d174-0920-408e-aa2f-ff2f07a1e005"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 15:01:37 crc kubenswrapper[4962]: I1003 15:01:37.007877 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "eb12d174-0920-408e-aa2f-ff2f07a1e005" (UID: "eb12d174-0920-408e-aa2f-ff2f07a1e005"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 15:01:37 crc kubenswrapper[4962]: I1003 15:01:37.033565 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "eb12d174-0920-408e-aa2f-ff2f07a1e005" (UID: "eb12d174-0920-408e-aa2f-ff2f07a1e005"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 15:01:37 crc kubenswrapper[4962]: I1003 15:01:37.034415 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-inventory" (OuterVolumeSpecName: "inventory") pod "eb12d174-0920-408e-aa2f-ff2f07a1e005" (UID: "eb12d174-0920-408e-aa2f-ff2f07a1e005"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 15:01:37 crc kubenswrapper[4962]: I1003 15:01:37.101545 4962 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 15:01:37 crc kubenswrapper[4962]: I1003 15:01:37.101576 4962 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 15:01:37 crc kubenswrapper[4962]: I1003 15:01:37.101587 4962 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 15:01:37 crc kubenswrapper[4962]: I1003 15:01:37.101597 4962 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-ceph\") on node \"crc\" DevicePath \"\""
Oct 03 15:01:37 crc kubenswrapper[4962]: I1003 15:01:37.101606 4962 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 15:01:37 crc kubenswrapper[4962]: I1003 15:01:37.101616 4962 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 15:01:37 crc kubenswrapper[4962]: I1003 15:01:37.101624 4962 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 15:01:37 crc kubenswrapper[4962]: I1003 15:01:37.101644 4962 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 15:01:37 crc kubenswrapper[4962]: I1003 15:01:37.101652 4962 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 03 15:01:37 crc kubenswrapper[4962]: I1003 15:01:37.101662 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnv57\" (UniqueName: \"kubernetes.io/projected/eb12d174-0920-408e-aa2f-ff2f07a1e005-kube-api-access-tnv57\") on node \"crc\" DevicePath \"\""
Oct 03 15:01:37 crc kubenswrapper[4962]: I1003 15:01:37.101670 4962 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-inventory\") on node \"crc\" DevicePath \"\""
Oct 03 15:01:37 crc kubenswrapper[4962]: I1003 15:01:37.101680 4962 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb12d174-0920-408e-aa2f-ff2f07a1e005-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 15:01:37 crc kubenswrapper[4962]: I1003 15:01:37.226860 4962 scope.go:117] "RemoveContainer" containerID="9e969c5b4663b9266408ae81a93dc573a565f6f23975eb2be7d5867d2c0fb330"
containerID="9e969c5b4663b9266408ae81a93dc573a565f6f23975eb2be7d5867d2c0fb330" Oct 03 15:01:37 crc kubenswrapper[4962]: E1003 15:01:37.227385 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 15:01:37 crc kubenswrapper[4962]: I1003 15:01:37.536740 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-ls7cn" event={"ID":"eb12d174-0920-408e-aa2f-ff2f07a1e005","Type":"ContainerDied","Data":"568b83821c27b26323d74937b13668a5f34309e91053d134b045cf3ff2d983f3"} Oct 03 15:01:37 crc kubenswrapper[4962]: I1003 15:01:37.536789 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-ls7cn" Oct 03 15:01:37 crc kubenswrapper[4962]: I1003 15:01:37.536804 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="568b83821c27b26323d74937b13668a5f34309e91053d134b045cf3ff2d983f3" Oct 03 15:01:37 crc kubenswrapper[4962]: I1003 15:01:37.633893 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-xr76n"] Oct 03 15:01:37 crc kubenswrapper[4962]: E1003 15:01:37.634270 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb12d174-0920-408e-aa2f-ff2f07a1e005" containerName="install-certs-openstack-openstack-cell1" Oct 03 15:01:37 crc kubenswrapper[4962]: I1003 15:01:37.634287 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb12d174-0920-408e-aa2f-ff2f07a1e005" containerName="install-certs-openstack-openstack-cell1" Oct 03 15:01:37 crc kubenswrapper[4962]: I1003 15:01:37.634506 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb12d174-0920-408e-aa2f-ff2f07a1e005" containerName="install-certs-openstack-openstack-cell1" Oct 03 15:01:37 crc kubenswrapper[4962]: I1003 15:01:37.635222 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-xr76n" Oct 03 15:01:37 crc kubenswrapper[4962]: I1003 15:01:37.637322 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 15:01:37 crc kubenswrapper[4962]: I1003 15:01:37.637384 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 03 15:01:37 crc kubenswrapper[4962]: I1003 15:01:37.637607 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-98wnm" Oct 03 15:01:37 crc kubenswrapper[4962]: I1003 15:01:37.638179 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 03 15:01:37 crc kubenswrapper[4962]: I1003 15:01:37.657297 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-xr76n"] Oct 03 15:01:37 crc kubenswrapper[4962]: I1003 15:01:37.713735 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9bdd019a-3501-490b-96cb-67a841f833c1-ceph\") pod \"ceph-client-openstack-openstack-cell1-xr76n\" (UID: \"9bdd019a-3501-490b-96cb-67a841f833c1\") " pod="openstack/ceph-client-openstack-openstack-cell1-xr76n" Oct 03 15:01:37 crc kubenswrapper[4962]: I1003 15:01:37.713794 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dv9v\" (UniqueName: \"kubernetes.io/projected/9bdd019a-3501-490b-96cb-67a841f833c1-kube-api-access-2dv9v\") pod \"ceph-client-openstack-openstack-cell1-xr76n\" (UID: \"9bdd019a-3501-490b-96cb-67a841f833c1\") " pod="openstack/ceph-client-openstack-openstack-cell1-xr76n" Oct 03 15:01:37 crc kubenswrapper[4962]: I1003 15:01:37.714036 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9bdd019a-3501-490b-96cb-67a841f833c1-inventory\") pod \"ceph-client-openstack-openstack-cell1-xr76n\" (UID: \"9bdd019a-3501-490b-96cb-67a841f833c1\") " pod="openstack/ceph-client-openstack-openstack-cell1-xr76n" Oct 03 15:01:37 crc kubenswrapper[4962]: I1003 15:01:37.714132 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9bdd019a-3501-490b-96cb-67a841f833c1-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-xr76n\" (UID: \"9bdd019a-3501-490b-96cb-67a841f833c1\") " pod="openstack/ceph-client-openstack-openstack-cell1-xr76n" Oct 03 15:01:37 crc kubenswrapper[4962]: I1003 15:01:37.816784 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9bdd019a-3501-490b-96cb-67a841f833c1-ceph\") pod \"ceph-client-openstack-openstack-cell1-xr76n\" (UID: \"9bdd019a-3501-490b-96cb-67a841f833c1\") " pod="openstack/ceph-client-openstack-openstack-cell1-xr76n" Oct 03 15:01:37 crc kubenswrapper[4962]: I1003 15:01:37.816858 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dv9v\" (UniqueName: \"kubernetes.io/projected/9bdd019a-3501-490b-96cb-67a841f833c1-kube-api-access-2dv9v\") pod \"ceph-client-openstack-openstack-cell1-xr76n\" (UID: \"9bdd019a-3501-490b-96cb-67a841f833c1\") " pod="openstack/ceph-client-openstack-openstack-cell1-xr76n" Oct 03 15:01:37 crc kubenswrapper[4962]: I1003 
15:01:37.816961 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9bdd019a-3501-490b-96cb-67a841f833c1-inventory\") pod \"ceph-client-openstack-openstack-cell1-xr76n\" (UID: \"9bdd019a-3501-490b-96cb-67a841f833c1\") " pod="openstack/ceph-client-openstack-openstack-cell1-xr76n" Oct 03 15:01:37 crc kubenswrapper[4962]: I1003 15:01:37.817014 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9bdd019a-3501-490b-96cb-67a841f833c1-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-xr76n\" (UID: \"9bdd019a-3501-490b-96cb-67a841f833c1\") " pod="openstack/ceph-client-openstack-openstack-cell1-xr76n" Oct 03 15:01:37 crc kubenswrapper[4962]: I1003 15:01:37.822049 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9bdd019a-3501-490b-96cb-67a841f833c1-ceph\") pod \"ceph-client-openstack-openstack-cell1-xr76n\" (UID: \"9bdd019a-3501-490b-96cb-67a841f833c1\") " pod="openstack/ceph-client-openstack-openstack-cell1-xr76n" Oct 03 15:01:37 crc kubenswrapper[4962]: I1003 15:01:37.822180 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9bdd019a-3501-490b-96cb-67a841f833c1-inventory\") pod \"ceph-client-openstack-openstack-cell1-xr76n\" (UID: \"9bdd019a-3501-490b-96cb-67a841f833c1\") " pod="openstack/ceph-client-openstack-openstack-cell1-xr76n" Oct 03 15:01:37 crc kubenswrapper[4962]: I1003 15:01:37.822282 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9bdd019a-3501-490b-96cb-67a841f833c1-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-xr76n\" (UID: \"9bdd019a-3501-490b-96cb-67a841f833c1\") " pod="openstack/ceph-client-openstack-openstack-cell1-xr76n" Oct 03 15:01:37 crc kubenswrapper[4962]: I1003 15:01:37.838527 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dv9v\" (UniqueName: \"kubernetes.io/projected/9bdd019a-3501-490b-96cb-67a841f833c1-kube-api-access-2dv9v\") pod \"ceph-client-openstack-openstack-cell1-xr76n\" (UID: \"9bdd019a-3501-490b-96cb-67a841f833c1\") " pod="openstack/ceph-client-openstack-openstack-cell1-xr76n" Oct 03 15:01:37 crc kubenswrapper[4962]: I1003 15:01:37.959798 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-xr76n" Oct 03 15:01:38 crc kubenswrapper[4962]: I1003 15:01:38.486058 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-xr76n"] Oct 03 15:01:38 crc kubenswrapper[4962]: I1003 15:01:38.545576 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-xr76n" event={"ID":"9bdd019a-3501-490b-96cb-67a841f833c1","Type":"ContainerStarted","Data":"99a732c47968ce92e3cb1c63928f2227226241187748f37a01915e0aa6e5a8dc"} Oct 03 15:01:39 crc kubenswrapper[4962]: I1003 15:01:39.554944 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-xr76n" event={"ID":"9bdd019a-3501-490b-96cb-67a841f833c1","Type":"ContainerStarted","Data":"40e8b96dc1040164213493aba7bd8176597c22614899ee2bd5eb93cf671605d6"} Oct 03 15:01:39 crc kubenswrapper[4962]: I1003 15:01:39.572209 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-openstack-openstack-cell1-xr76n" podStartSLOduration=2.391281124 podStartE2EDuration="2.57219591s" podCreationTimestamp="2025-10-03 15:01:37 +0000 UTC" firstStartedPulling="2025-10-03 15:01:38.487799268 +0000 UTC m=+7906.891697103" lastFinishedPulling="2025-10-03 15:01:38.668714054 +0000 UTC m=+7907.072611889" observedRunningTime="2025-10-03 15:01:39.569230091 +0000 UTC m=+7907.973127936" watchObservedRunningTime="2025-10-03 15:01:39.57219591 +0000 UTC m=+7907.976093745" Oct 03 15:01:43 crc kubenswrapper[4962]: I1003 15:01:43.589991 4962 generic.go:334] "Generic (PLEG): container finished" podID="9bdd019a-3501-490b-96cb-67a841f833c1" containerID="40e8b96dc1040164213493aba7bd8176597c22614899ee2bd5eb93cf671605d6" exitCode=0 Oct 03 15:01:43 crc kubenswrapper[4962]: I1003 15:01:43.590186 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-xr76n" event={"ID":"9bdd019a-3501-490b-96cb-67a841f833c1","Type":"ContainerDied","Data":"40e8b96dc1040164213493aba7bd8176597c22614899ee2bd5eb93cf671605d6"} Oct 03 15:01:45 crc kubenswrapper[4962]: I1003 15:01:45.084115 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-xr76n" Oct 03 15:01:45 crc kubenswrapper[4962]: I1003 15:01:45.276045 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dv9v\" (UniqueName: \"kubernetes.io/projected/9bdd019a-3501-490b-96cb-67a841f833c1-kube-api-access-2dv9v\") pod \"9bdd019a-3501-490b-96cb-67a841f833c1\" (UID: \"9bdd019a-3501-490b-96cb-67a841f833c1\") " Oct 03 15:01:45 crc kubenswrapper[4962]: I1003 15:01:45.276149 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9bdd019a-3501-490b-96cb-67a841f833c1-ssh-key\") pod \"9bdd019a-3501-490b-96cb-67a841f833c1\" (UID: \"9bdd019a-3501-490b-96cb-67a841f833c1\") " Oct 03 15:01:45 crc kubenswrapper[4962]: I1003 15:01:45.276417 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9bdd019a-3501-490b-96cb-67a841f833c1-ceph\") pod \"9bdd019a-3501-490b-96cb-67a841f833c1\" (UID: \"9bdd019a-3501-490b-96cb-67a841f833c1\") " Oct 03 15:01:45 crc kubenswrapper[4962]: I1003 15:01:45.276740 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9bdd019a-3501-490b-96cb-67a841f833c1-inventory\") pod \"9bdd019a-3501-490b-96cb-67a841f833c1\" (UID: \"9bdd019a-3501-490b-96cb-67a841f833c1\") " Oct 03 15:01:45 crc kubenswrapper[4962]: I1003 15:01:45.282439 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bdd019a-3501-490b-96cb-67a841f833c1-ceph" (OuterVolumeSpecName: "ceph") pod "9bdd019a-3501-490b-96cb-67a841f833c1" (UID: "9bdd019a-3501-490b-96cb-67a841f833c1"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:45 crc kubenswrapper[4962]: I1003 15:01:45.282879 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bdd019a-3501-490b-96cb-67a841f833c1-kube-api-access-2dv9v" (OuterVolumeSpecName: "kube-api-access-2dv9v") pod "9bdd019a-3501-490b-96cb-67a841f833c1" (UID: "9bdd019a-3501-490b-96cb-67a841f833c1"). InnerVolumeSpecName "kube-api-access-2dv9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:01:45 crc kubenswrapper[4962]: I1003 15:01:45.309705 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bdd019a-3501-490b-96cb-67a841f833c1-inventory" (OuterVolumeSpecName: "inventory") pod "9bdd019a-3501-490b-96cb-67a841f833c1" (UID: "9bdd019a-3501-490b-96cb-67a841f833c1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:45 crc kubenswrapper[4962]: I1003 15:01:45.318551 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bdd019a-3501-490b-96cb-67a841f833c1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9bdd019a-3501-490b-96cb-67a841f833c1" (UID: "9bdd019a-3501-490b-96cb-67a841f833c1"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:01:45 crc kubenswrapper[4962]: I1003 15:01:45.379618 4962 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9bdd019a-3501-490b-96cb-67a841f833c1-ceph\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:45 crc kubenswrapper[4962]: I1003 15:01:45.379670 4962 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9bdd019a-3501-490b-96cb-67a841f833c1-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:45 crc kubenswrapper[4962]: I1003 15:01:45.379685 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dv9v\" (UniqueName: \"kubernetes.io/projected/9bdd019a-3501-490b-96cb-67a841f833c1-kube-api-access-2dv9v\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:45 crc kubenswrapper[4962]: I1003 15:01:45.379727 4962 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9bdd019a-3501-490b-96cb-67a841f833c1-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 15:01:45 crc kubenswrapper[4962]: I1003 15:01:45.612777 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-xr76n" event={"ID":"9bdd019a-3501-490b-96cb-67a841f833c1","Type":"ContainerDied","Data":"99a732c47968ce92e3cb1c63928f2227226241187748f37a01915e0aa6e5a8dc"} Oct 03 15:01:45 crc kubenswrapper[4962]: I1003 15:01:45.612834 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-xr76n" Oct 03 15:01:45 crc kubenswrapper[4962]: I1003 15:01:45.612839 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99a732c47968ce92e3cb1c63928f2227226241187748f37a01915e0aa6e5a8dc" Oct 03 15:01:45 crc kubenswrapper[4962]: I1003 15:01:45.674904 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-cell1-7w5ds"] Oct 03 15:01:45 crc kubenswrapper[4962]: E1003 15:01:45.675677 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bdd019a-3501-490b-96cb-67a841f833c1" containerName="ceph-client-openstack-openstack-cell1" Oct 03 15:01:45 crc kubenswrapper[4962]: I1003 15:01:45.675709 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bdd019a-3501-490b-96cb-67a841f833c1" containerName="ceph-client-openstack-openstack-cell1" Oct 03 15:01:45 crc kubenswrapper[4962]: I1003 15:01:45.676108 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bdd019a-3501-490b-96cb-67a841f833c1" containerName="ceph-client-openstack-openstack-cell1" Oct 03 15:01:45 crc kubenswrapper[4962]: I1003 15:01:45.677497 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-7w5ds" Oct 03 15:01:45 crc kubenswrapper[4962]: I1003 15:01:45.680879 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 03 15:01:45 crc kubenswrapper[4962]: I1003 15:01:45.681073 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 15:01:45 crc kubenswrapper[4962]: I1003 15:01:45.681194 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 03 15:01:45 crc kubenswrapper[4962]: I1003 15:01:45.681333 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-98wnm" Oct 03 15:01:45 crc kubenswrapper[4962]: I1003 15:01:45.685412 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-7w5ds"] Oct 03 15:01:45 crc kubenswrapper[4962]: I1003 15:01:45.688604 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 03 15:01:45 crc kubenswrapper[4962]: I1003 15:01:45.787801 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0b1c4f80-f94d-401d-8b25-e89a06e993b4-ceph\") pod \"ovn-openstack-openstack-cell1-7w5ds\" (UID: \"0b1c4f80-f94d-401d-8b25-e89a06e993b4\") " pod="openstack/ovn-openstack-openstack-cell1-7w5ds" Oct 03 15:01:45 crc kubenswrapper[4962]: I1003 15:01:45.787842 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0b1c4f80-f94d-401d-8b25-e89a06e993b4-ssh-key\") pod \"ovn-openstack-openstack-cell1-7w5ds\" (UID: \"0b1c4f80-f94d-401d-8b25-e89a06e993b4\") " pod="openstack/ovn-openstack-openstack-cell1-7w5ds" Oct 03 15:01:45 crc kubenswrapper[4962]: I1003 15:01:45.787942 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsdg9\" (UniqueName: \"kubernetes.io/projected/0b1c4f80-f94d-401d-8b25-e89a06e993b4-kube-api-access-tsdg9\") pod \"ovn-openstack-openstack-cell1-7w5ds\" (UID: \"0b1c4f80-f94d-401d-8b25-e89a06e993b4\") " pod="openstack/ovn-openstack-openstack-cell1-7w5ds" Oct 03 15:01:45 crc kubenswrapper[4962]: I1003 15:01:45.788016 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b1c4f80-f94d-401d-8b25-e89a06e993b4-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-7w5ds\" (UID: \"0b1c4f80-f94d-401d-8b25-e89a06e993b4\") " pod="openstack/ovn-openstack-openstack-cell1-7w5ds" Oct 03 15:01:45 crc kubenswrapper[4962]: I1003 15:01:45.788246 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b1c4f80-f94d-401d-8b25-e89a06e993b4-inventory\") pod \"ovn-openstack-openstack-cell1-7w5ds\" (UID: \"0b1c4f80-f94d-401d-8b25-e89a06e993b4\") " pod="openstack/ovn-openstack-openstack-cell1-7w5ds" Oct 03 15:01:45 crc kubenswrapper[4962]: I1003 15:01:45.788449 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/0b1c4f80-f94d-401d-8b25-e89a06e993b4-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-7w5ds\" (UID: 
\"0b1c4f80-f94d-401d-8b25-e89a06e993b4\") " pod="openstack/ovn-openstack-openstack-cell1-7w5ds" Oct 03 15:01:45 crc kubenswrapper[4962]: I1003 15:01:45.890518 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0b1c4f80-f94d-401d-8b25-e89a06e993b4-ceph\") pod \"ovn-openstack-openstack-cell1-7w5ds\" (UID: \"0b1c4f80-f94d-401d-8b25-e89a06e993b4\") " pod="openstack/ovn-openstack-openstack-cell1-7w5ds" Oct 03 15:01:45 crc kubenswrapper[4962]: I1003 15:01:45.890571 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0b1c4f80-f94d-401d-8b25-e89a06e993b4-ssh-key\") pod \"ovn-openstack-openstack-cell1-7w5ds\" (UID: \"0b1c4f80-f94d-401d-8b25-e89a06e993b4\") " pod="openstack/ovn-openstack-openstack-cell1-7w5ds" Oct 03 15:01:45 crc kubenswrapper[4962]: I1003 15:01:45.890725 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsdg9\" (UniqueName: \"kubernetes.io/projected/0b1c4f80-f94d-401d-8b25-e89a06e993b4-kube-api-access-tsdg9\") pod \"ovn-openstack-openstack-cell1-7w5ds\" (UID: \"0b1c4f80-f94d-401d-8b25-e89a06e993b4\") " pod="openstack/ovn-openstack-openstack-cell1-7w5ds" Oct 03 15:01:45 crc kubenswrapper[4962]: I1003 15:01:45.890818 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b1c4f80-f94d-401d-8b25-e89a06e993b4-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-7w5ds\" (UID: \"0b1c4f80-f94d-401d-8b25-e89a06e993b4\") " pod="openstack/ovn-openstack-openstack-cell1-7w5ds" Oct 03 15:01:45 crc kubenswrapper[4962]: I1003 15:01:45.890900 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b1c4f80-f94d-401d-8b25-e89a06e993b4-inventory\") pod \"ovn-openstack-openstack-cell1-7w5ds\" (UID: \"0b1c4f80-f94d-401d-8b25-e89a06e993b4\") " pod="openstack/ovn-openstack-openstack-cell1-7w5ds" Oct 03 15:01:45 crc kubenswrapper[4962]: I1003 15:01:45.891766 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/0b1c4f80-f94d-401d-8b25-e89a06e993b4-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-7w5ds\" (UID: \"0b1c4f80-f94d-401d-8b25-e89a06e993b4\") " pod="openstack/ovn-openstack-openstack-cell1-7w5ds" Oct 03 15:01:45 crc kubenswrapper[4962]: I1003 15:01:45.892505 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/0b1c4f80-f94d-401d-8b25-e89a06e993b4-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-7w5ds\" (UID: \"0b1c4f80-f94d-401d-8b25-e89a06e993b4\") " pod="openstack/ovn-openstack-openstack-cell1-7w5ds" Oct 03 15:01:45 crc kubenswrapper[4962]: I1003 15:01:45.894492 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0b1c4f80-f94d-401d-8b25-e89a06e993b4-ceph\") pod \"ovn-openstack-openstack-cell1-7w5ds\" (UID: \"0b1c4f80-f94d-401d-8b25-e89a06e993b4\") " pod="openstack/ovn-openstack-openstack-cell1-7w5ds" Oct 03 15:01:45 crc kubenswrapper[4962]: I1003 15:01:45.896056 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0b1c4f80-f94d-401d-8b25-e89a06e993b4-ssh-key\") pod 
\"ovn-openstack-openstack-cell1-7w5ds\" (UID: \"0b1c4f80-f94d-401d-8b25-e89a06e993b4\") " pod="openstack/ovn-openstack-openstack-cell1-7w5ds" Oct 03 15:01:45 crc kubenswrapper[4962]: I1003 15:01:45.897425 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b1c4f80-f94d-401d-8b25-e89a06e993b4-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-7w5ds\" (UID: \"0b1c4f80-f94d-401d-8b25-e89a06e993b4\") " pod="openstack/ovn-openstack-openstack-cell1-7w5ds" Oct 03 15:01:45 crc kubenswrapper[4962]: I1003 15:01:45.906180 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b1c4f80-f94d-401d-8b25-e89a06e993b4-inventory\") pod \"ovn-openstack-openstack-cell1-7w5ds\" (UID: \"0b1c4f80-f94d-401d-8b25-e89a06e993b4\") " pod="openstack/ovn-openstack-openstack-cell1-7w5ds" Oct 03 15:01:45 crc kubenswrapper[4962]: I1003 15:01:45.910973 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsdg9\" (UniqueName: \"kubernetes.io/projected/0b1c4f80-f94d-401d-8b25-e89a06e993b4-kube-api-access-tsdg9\") pod \"ovn-openstack-openstack-cell1-7w5ds\" (UID: \"0b1c4f80-f94d-401d-8b25-e89a06e993b4\") " pod="openstack/ovn-openstack-openstack-cell1-7w5ds" Oct 03 15:01:45 crc kubenswrapper[4962]: I1003 15:01:45.997169 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-7w5ds" Oct 03 15:01:46 crc kubenswrapper[4962]: I1003 15:01:46.536946 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-7w5ds"] Oct 03 15:01:46 crc kubenswrapper[4962]: I1003 15:01:46.622372 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-7w5ds" event={"ID":"0b1c4f80-f94d-401d-8b25-e89a06e993b4","Type":"ContainerStarted","Data":"3baccce7ab2968536ab5ecf76e5627e8c6a7d1339d19e0ea726955145be48212"} Oct 03 15:01:47 crc kubenswrapper[4962]: I1003 15:01:47.644335 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-7w5ds" event={"ID":"0b1c4f80-f94d-401d-8b25-e89a06e993b4","Type":"ContainerStarted","Data":"b673a6b264a0a5d7fa1b795b2f8d6c2fbfe07e22bb821438bb0891d23c725d3a"} Oct 03 15:01:49 crc kubenswrapper[4962]: I1003 15:01:49.227332 4962 scope.go:117] "RemoveContainer" containerID="9e969c5b4663b9266408ae81a93dc573a565f6f23975eb2be7d5867d2c0fb330" Oct 03 15:01:49 crc kubenswrapper[4962]: E1003 15:01:49.228227 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 15:02:04 crc kubenswrapper[4962]: I1003 15:02:04.228556 4962 scope.go:117] "RemoveContainer" containerID="9e969c5b4663b9266408ae81a93dc573a565f6f23975eb2be7d5867d2c0fb330" Oct 03 15:02:04 crc kubenswrapper[4962]: E1003 15:02:04.230747 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
Oct 03 15:02:15 crc kubenswrapper[4962]: I1003 15:02:15.227972 4962 scope.go:117] "RemoveContainer" containerID="9e969c5b4663b9266408ae81a93dc573a565f6f23975eb2be7d5867d2c0fb330"
Oct 03 15:02:15 crc kubenswrapper[4962]: E1003 15:02:15.229136 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 15:02:17 crc kubenswrapper[4962]: I1003 15:02:17.957209 4962 generic.go:334] "Generic (PLEG): container finished" podID="0b1c4f80-f94d-401d-8b25-e89a06e993b4" containerID="b673a6b264a0a5d7fa1b795b2f8d6c2fbfe07e22bb821438bb0891d23c725d3a" exitCode=2
Oct 03 15:02:17 crc kubenswrapper[4962]: I1003 15:02:17.957374 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-7w5ds" event={"ID":"0b1c4f80-f94d-401d-8b25-e89a06e993b4","Type":"ContainerDied","Data":"b673a6b264a0a5d7fa1b795b2f8d6c2fbfe07e22bb821438bb0891d23c725d3a"}
Oct 03 15:02:19 crc kubenswrapper[4962]: I1003 15:02:19.598917 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-7w5ds"
Oct 03 15:02:19 crc kubenswrapper[4962]: I1003 15:02:19.694209 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b1c4f80-f94d-401d-8b25-e89a06e993b4-inventory\") pod \"0b1c4f80-f94d-401d-8b25-e89a06e993b4\" (UID: \"0b1c4f80-f94d-401d-8b25-e89a06e993b4\") "
Oct 03 15:02:19 crc kubenswrapper[4962]: I1003 15:02:19.695186 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/0b1c4f80-f94d-401d-8b25-e89a06e993b4-ovncontroller-config-0\") pod \"0b1c4f80-f94d-401d-8b25-e89a06e993b4\" (UID: \"0b1c4f80-f94d-401d-8b25-e89a06e993b4\") "
Oct 03 15:02:19 crc kubenswrapper[4962]: I1003 15:02:19.695286 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0b1c4f80-f94d-401d-8b25-e89a06e993b4-ssh-key\") pod \"0b1c4f80-f94d-401d-8b25-e89a06e993b4\" (UID: \"0b1c4f80-f94d-401d-8b25-e89a06e993b4\") "
Oct 03 15:02:19 crc kubenswrapper[4962]: I1003 15:02:19.695330 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsdg9\" (UniqueName: \"kubernetes.io/projected/0b1c4f80-f94d-401d-8b25-e89a06e993b4-kube-api-access-tsdg9\") pod \"0b1c4f80-f94d-401d-8b25-e89a06e993b4\" (UID: \"0b1c4f80-f94d-401d-8b25-e89a06e993b4\") "
Oct 03 15:02:19 crc kubenswrapper[4962]: I1003 15:02:19.695437 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0b1c4f80-f94d-401d-8b25-e89a06e993b4-ceph\") pod \"0b1c4f80-f94d-401d-8b25-e89a06e993b4\" (UID: \"0b1c4f80-f94d-401d-8b25-e89a06e993b4\") "
Oct 03 15:02:19 crc kubenswrapper[4962]: I1003 15:02:19.695535 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b1c4f80-f94d-401d-8b25-e89a06e993b4-ovn-combined-ca-bundle\") pod \"0b1c4f80-f94d-401d-8b25-e89a06e993b4\" (UID: \"0b1c4f80-f94d-401d-8b25-e89a06e993b4\") "
Oct 03 15:02:19 crc kubenswrapper[4962]: I1003 15:02:19.700788 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b1c4f80-f94d-401d-8b25-e89a06e993b4-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "0b1c4f80-f94d-401d-8b25-e89a06e993b4" (UID: "0b1c4f80-f94d-401d-8b25-e89a06e993b4"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 15:02:19 crc kubenswrapper[4962]: I1003 15:02:19.700877 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b1c4f80-f94d-401d-8b25-e89a06e993b4-ceph" (OuterVolumeSpecName: "ceph") pod "0b1c4f80-f94d-401d-8b25-e89a06e993b4" (UID: "0b1c4f80-f94d-401d-8b25-e89a06e993b4"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 15:02:19 crc kubenswrapper[4962]: I1003 15:02:19.706140 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b1c4f80-f94d-401d-8b25-e89a06e993b4-kube-api-access-tsdg9" (OuterVolumeSpecName: "kube-api-access-tsdg9") pod "0b1c4f80-f94d-401d-8b25-e89a06e993b4" (UID: "0b1c4f80-f94d-401d-8b25-e89a06e993b4"). InnerVolumeSpecName "kube-api-access-tsdg9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 15:02:19 crc kubenswrapper[4962]: I1003 15:02:19.725272 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b1c4f80-f94d-401d-8b25-e89a06e993b4-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "0b1c4f80-f94d-401d-8b25-e89a06e993b4" (UID: "0b1c4f80-f94d-401d-8b25-e89a06e993b4"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 15:02:19 crc kubenswrapper[4962]: I1003 15:02:19.728102 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b1c4f80-f94d-401d-8b25-e89a06e993b4-inventory" (OuterVolumeSpecName: "inventory") pod "0b1c4f80-f94d-401d-8b25-e89a06e993b4" (UID: "0b1c4f80-f94d-401d-8b25-e89a06e993b4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 15:02:19 crc kubenswrapper[4962]: I1003 15:02:19.728155 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b1c4f80-f94d-401d-8b25-e89a06e993b4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0b1c4f80-f94d-401d-8b25-e89a06e993b4" (UID: "0b1c4f80-f94d-401d-8b25-e89a06e993b4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 15:02:19 crc kubenswrapper[4962]: I1003 15:02:19.798225 4962 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b1c4f80-f94d-401d-8b25-e89a06e993b4-inventory\") on node \"crc\" DevicePath \"\""
Oct 03 15:02:19 crc kubenswrapper[4962]: I1003 15:02:19.798263 4962 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/0b1c4f80-f94d-401d-8b25-e89a06e993b4-ovncontroller-config-0\") on node \"crc\" DevicePath \"\""
Oct 03 15:02:19 crc kubenswrapper[4962]: I1003 15:02:19.798274 4962 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0b1c4f80-f94d-401d-8b25-e89a06e993b4-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 03 15:02:19 crc kubenswrapper[4962]: I1003 15:02:19.798283 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsdg9\" (UniqueName: \"kubernetes.io/projected/0b1c4f80-f94d-401d-8b25-e89a06e993b4-kube-api-access-tsdg9\") on node \"crc\" DevicePath \"\""
Oct 03 15:02:19 crc kubenswrapper[4962]: I1003 15:02:19.798293 4962 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0b1c4f80-f94d-401d-8b25-e89a06e993b4-ceph\") on node \"crc\" DevicePath \"\""
Oct 03 15:02:19 crc kubenswrapper[4962]: I1003 15:02:19.798301 4962 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b1c4f80-f94d-401d-8b25-e89a06e993b4-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 15:02:19 crc kubenswrapper[4962]: I1003 15:02:19.985110 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-7w5ds" event={"ID":"0b1c4f80-f94d-401d-8b25-e89a06e993b4","Type":"ContainerDied","Data":"3baccce7ab2968536ab5ecf76e5627e8c6a7d1339d19e0ea726955145be48212"}
Oct 03 15:02:19 crc kubenswrapper[4962]: I1003 15:02:19.985152 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3baccce7ab2968536ab5ecf76e5627e8c6a7d1339d19e0ea726955145be48212"
Oct 03 15:02:19 crc kubenswrapper[4962]: I1003 15:02:19.985211 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-7w5ds"
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-7w5ds" Oct 03 15:02:26 crc kubenswrapper[4962]: I1003 15:02:26.227947 4962 scope.go:117] "RemoveContainer" containerID="9e969c5b4663b9266408ae81a93dc573a565f6f23975eb2be7d5867d2c0fb330" Oct 03 15:02:26 crc kubenswrapper[4962]: E1003 15:02:26.229326 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 15:02:27 crc kubenswrapper[4962]: I1003 15:02:27.030839 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-cell1-vrkcj"] Oct 03 15:02:27 crc kubenswrapper[4962]: E1003 15:02:27.031725 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b1c4f80-f94d-401d-8b25-e89a06e993b4" containerName="ovn-openstack-openstack-cell1" Oct 03 15:02:27 crc kubenswrapper[4962]: I1003 15:02:27.031748 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b1c4f80-f94d-401d-8b25-e89a06e993b4" containerName="ovn-openstack-openstack-cell1" Oct 03 15:02:27 crc kubenswrapper[4962]: I1003 15:02:27.032001 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b1c4f80-f94d-401d-8b25-e89a06e993b4" containerName="ovn-openstack-openstack-cell1" Oct 03 15:02:27 crc kubenswrapper[4962]: I1003 15:02:27.034311 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-vrkcj" Oct 03 15:02:27 crc kubenswrapper[4962]: I1003 15:02:27.037076 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 15:02:27 crc kubenswrapper[4962]: I1003 15:02:27.037102 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 03 15:02:27 crc kubenswrapper[4962]: I1003 15:02:27.040470 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 03 15:02:27 crc kubenswrapper[4962]: I1003 15:02:27.040574 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 03 15:02:27 crc kubenswrapper[4962]: I1003 15:02:27.040471 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-98wnm" Oct 03 15:02:27 crc kubenswrapper[4962]: I1003 15:02:27.053356 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-vrkcj"] Oct 03 15:02:27 crc kubenswrapper[4962]: I1003 15:02:27.152735 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1a534445-267b-424d-bc4e-67a17558f09a-ceph\") pod \"ovn-openstack-openstack-cell1-vrkcj\" (UID: \"1a534445-267b-424d-bc4e-67a17558f09a\") " pod="openstack/ovn-openstack-openstack-cell1-vrkcj" Oct 03 15:02:27 crc kubenswrapper[4962]: I1003 15:02:27.152894 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8xg8\" (UniqueName: \"kubernetes.io/projected/1a534445-267b-424d-bc4e-67a17558f09a-kube-api-access-g8xg8\") pod \"ovn-openstack-openstack-cell1-vrkcj\" (UID: 
\"1a534445-267b-424d-bc4e-67a17558f09a\") " pod="openstack/ovn-openstack-openstack-cell1-vrkcj" Oct 03 15:02:27 crc kubenswrapper[4962]: I1003 15:02:27.152981 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1a534445-267b-424d-bc4e-67a17558f09a-ssh-key\") pod \"ovn-openstack-openstack-cell1-vrkcj\" (UID: \"1a534445-267b-424d-bc4e-67a17558f09a\") " pod="openstack/ovn-openstack-openstack-cell1-vrkcj" Oct 03 15:02:27 crc kubenswrapper[4962]: I1003 15:02:27.153088 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1a534445-267b-424d-bc4e-67a17558f09a-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-vrkcj\" (UID: \"1a534445-267b-424d-bc4e-67a17558f09a\") " pod="openstack/ovn-openstack-openstack-cell1-vrkcj" Oct 03 15:02:27 crc kubenswrapper[4962]: I1003 15:02:27.153242 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a534445-267b-424d-bc4e-67a17558f09a-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-vrkcj\" (UID: \"1a534445-267b-424d-bc4e-67a17558f09a\") " pod="openstack/ovn-openstack-openstack-cell1-vrkcj" Oct 03 15:02:27 crc kubenswrapper[4962]: I1003 15:02:27.153449 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1a534445-267b-424d-bc4e-67a17558f09a-inventory\") pod \"ovn-openstack-openstack-cell1-vrkcj\" (UID: \"1a534445-267b-424d-bc4e-67a17558f09a\") " pod="openstack/ovn-openstack-openstack-cell1-vrkcj" Oct 03 15:02:27 crc kubenswrapper[4962]: I1003 15:02:27.255458 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1a534445-267b-424d-bc4e-67a17558f09a-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-vrkcj\" (UID: \"1a534445-267b-424d-bc4e-67a17558f09a\") " pod="openstack/ovn-openstack-openstack-cell1-vrkcj" Oct 03 15:02:27 crc kubenswrapper[4962]: I1003 15:02:27.256182 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a534445-267b-424d-bc4e-67a17558f09a-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-vrkcj\" (UID: \"1a534445-267b-424d-bc4e-67a17558f09a\") " pod="openstack/ovn-openstack-openstack-cell1-vrkcj" Oct 03 15:02:27 crc kubenswrapper[4962]: I1003 15:02:27.256260 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1a534445-267b-424d-bc4e-67a17558f09a-inventory\") pod \"ovn-openstack-openstack-cell1-vrkcj\" (UID: \"1a534445-267b-424d-bc4e-67a17558f09a\") " pod="openstack/ovn-openstack-openstack-cell1-vrkcj" Oct 03 15:02:27 crc kubenswrapper[4962]: I1003 15:02:27.256363 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1a534445-267b-424d-bc4e-67a17558f09a-ceph\") pod \"ovn-openstack-openstack-cell1-vrkcj\" (UID: \"1a534445-267b-424d-bc4e-67a17558f09a\") " pod="openstack/ovn-openstack-openstack-cell1-vrkcj" Oct 03 15:02:27 crc kubenswrapper[4962]: I1003 15:02:27.256478 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-g8xg8\" (UniqueName: \"kubernetes.io/projected/1a534445-267b-424d-bc4e-67a17558f09a-kube-api-access-g8xg8\") pod \"ovn-openstack-openstack-cell1-vrkcj\" (UID: \"1a534445-267b-424d-bc4e-67a17558f09a\") " pod="openstack/ovn-openstack-openstack-cell1-vrkcj" Oct 03 15:02:27 crc kubenswrapper[4962]: I1003 15:02:27.256574 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1a534445-267b-424d-bc4e-67a17558f09a-ssh-key\") pod \"ovn-openstack-openstack-cell1-vrkcj\" (UID: \"1a534445-267b-424d-bc4e-67a17558f09a\") " pod="openstack/ovn-openstack-openstack-cell1-vrkcj" Oct 03 15:02:27 crc kubenswrapper[4962]: I1003 15:02:27.261376 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1a534445-267b-424d-bc4e-67a17558f09a-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-vrkcj\" (UID: \"1a534445-267b-424d-bc4e-67a17558f09a\") " pod="openstack/ovn-openstack-openstack-cell1-vrkcj" Oct 03 15:02:27 crc kubenswrapper[4962]: I1003 15:02:27.262795 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1a534445-267b-424d-bc4e-67a17558f09a-ssh-key\") pod \"ovn-openstack-openstack-cell1-vrkcj\" (UID: \"1a534445-267b-424d-bc4e-67a17558f09a\") " pod="openstack/ovn-openstack-openstack-cell1-vrkcj" Oct 03 15:02:27 crc kubenswrapper[4962]: I1003 15:02:27.269191 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1a534445-267b-424d-bc4e-67a17558f09a-ceph\") pod \"ovn-openstack-openstack-cell1-vrkcj\" (UID: \"1a534445-267b-424d-bc4e-67a17558f09a\") " pod="openstack/ovn-openstack-openstack-cell1-vrkcj" Oct 03 15:02:27 crc kubenswrapper[4962]: I1003 15:02:27.269740 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1a534445-267b-424d-bc4e-67a17558f09a-inventory\") pod \"ovn-openstack-openstack-cell1-vrkcj\" (UID: \"1a534445-267b-424d-bc4e-67a17558f09a\") " pod="openstack/ovn-openstack-openstack-cell1-vrkcj" Oct 03 15:02:27 crc kubenswrapper[4962]: I1003 15:02:27.270894 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a534445-267b-424d-bc4e-67a17558f09a-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-vrkcj\" (UID: \"1a534445-267b-424d-bc4e-67a17558f09a\") " pod="openstack/ovn-openstack-openstack-cell1-vrkcj" Oct 03 15:02:27 crc kubenswrapper[4962]: I1003 15:02:27.291578 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8xg8\" (UniqueName: \"kubernetes.io/projected/1a534445-267b-424d-bc4e-67a17558f09a-kube-api-access-g8xg8\") pod \"ovn-openstack-openstack-cell1-vrkcj\" (UID: \"1a534445-267b-424d-bc4e-67a17558f09a\") " pod="openstack/ovn-openstack-openstack-cell1-vrkcj" Oct 03 15:02:27 crc kubenswrapper[4962]: I1003 15:02:27.368067 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-vrkcj" Oct 03 15:02:27 crc kubenswrapper[4962]: I1003 15:02:27.943794 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-vrkcj"] Oct 03 15:02:28 crc kubenswrapper[4962]: I1003 15:02:28.067335 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-vrkcj" event={"ID":"1a534445-267b-424d-bc4e-67a17558f09a","Type":"ContainerStarted","Data":"41dd456c33d428fe480d38c2c0421ee90360852647870a0ee55ff65709743efb"} Oct 03 15:02:29 crc kubenswrapper[4962]: I1003 15:02:29.077443 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-vrkcj" event={"ID":"1a534445-267b-424d-bc4e-67a17558f09a","Type":"ContainerStarted","Data":"4ba1b4d9ed11ec9d2948e7b4290bb0a5eb166f6afb366449bab1d95e2f173161"} Oct 03 15:02:29 crc kubenswrapper[4962]: I1003 15:02:29.104977 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-openstack-openstack-cell1-vrkcj" podStartSLOduration=1.908287426 podStartE2EDuration="2.104947661s" podCreationTimestamp="2025-10-03 15:02:27 +0000 UTC" firstStartedPulling="2025-10-03 15:02:27.95157098 +0000 UTC m=+7956.355468815" lastFinishedPulling="2025-10-03 15:02:28.148231215 +0000 UTC m=+7956.552129050" observedRunningTime="2025-10-03 15:02:29.098037237 +0000 UTC m=+7957.501935072" watchObservedRunningTime="2025-10-03 15:02:29.104947661 +0000 UTC m=+7957.508845516" Oct 03 15:02:37 crc kubenswrapper[4962]: I1003 15:02:37.228523 4962 scope.go:117] "RemoveContainer" containerID="9e969c5b4663b9266408ae81a93dc573a565f6f23975eb2be7d5867d2c0fb330" Oct 03 15:02:37 crc kubenswrapper[4962]: E1003 15:02:37.231447 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 15:02:48 crc kubenswrapper[4962]: I1003 15:02:48.228050 4962 scope.go:117] "RemoveContainer" containerID="9e969c5b4663b9266408ae81a93dc573a565f6f23975eb2be7d5867d2c0fb330" Oct 03 15:02:48 crc kubenswrapper[4962]: E1003 15:02:48.229098 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 15:02:59 crc kubenswrapper[4962]: I1003 15:02:59.377973 4962 generic.go:334] "Generic (PLEG): container finished" podID="1a534445-267b-424d-bc4e-67a17558f09a" containerID="4ba1b4d9ed11ec9d2948e7b4290bb0a5eb166f6afb366449bab1d95e2f173161" exitCode=2 Oct 03 15:02:59 crc kubenswrapper[4962]: I1003 15:02:59.378418 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-vrkcj" event={"ID":"1a534445-267b-424d-bc4e-67a17558f09a","Type":"ContainerDied","Data":"4ba1b4d9ed11ec9d2948e7b4290bb0a5eb166f6afb366449bab1d95e2f173161"} Oct 03 15:03:00 crc kubenswrapper[4962]: I1003 15:03:00.863681 4962 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-vrkcj" Oct 03 15:03:00 crc kubenswrapper[4962]: I1003 15:03:00.984499 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1a534445-267b-424d-bc4e-67a17558f09a-ssh-key\") pod \"1a534445-267b-424d-bc4e-67a17558f09a\" (UID: \"1a534445-267b-424d-bc4e-67a17558f09a\") " Oct 03 15:03:00 crc kubenswrapper[4962]: I1003 15:03:00.984833 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1a534445-267b-424d-bc4e-67a17558f09a-ceph\") pod \"1a534445-267b-424d-bc4e-67a17558f09a\" (UID: \"1a534445-267b-424d-bc4e-67a17558f09a\") " Oct 03 15:03:00 crc kubenswrapper[4962]: I1003 15:03:00.984958 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a534445-267b-424d-bc4e-67a17558f09a-ovn-combined-ca-bundle\") pod \"1a534445-267b-424d-bc4e-67a17558f09a\" (UID: \"1a534445-267b-424d-bc4e-67a17558f09a\") " Oct 03 15:03:00 crc kubenswrapper[4962]: I1003 15:03:00.985008 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8xg8\" (UniqueName: \"kubernetes.io/projected/1a534445-267b-424d-bc4e-67a17558f09a-kube-api-access-g8xg8\") pod \"1a534445-267b-424d-bc4e-67a17558f09a\" (UID: \"1a534445-267b-424d-bc4e-67a17558f09a\") " Oct 03 15:03:00 crc kubenswrapper[4962]: I1003 15:03:00.985081 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1a534445-267b-424d-bc4e-67a17558f09a-inventory\") pod \"1a534445-267b-424d-bc4e-67a17558f09a\" (UID: \"1a534445-267b-424d-bc4e-67a17558f09a\") " Oct 03 15:03:00 crc kubenswrapper[4962]: I1003 15:03:00.985138 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1a534445-267b-424d-bc4e-67a17558f09a-ovncontroller-config-0\") pod \"1a534445-267b-424d-bc4e-67a17558f09a\" (UID: \"1a534445-267b-424d-bc4e-67a17558f09a\") " Oct 03 15:03:00 crc kubenswrapper[4962]: I1003 15:03:00.990161 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a534445-267b-424d-bc4e-67a17558f09a-kube-api-access-g8xg8" (OuterVolumeSpecName: "kube-api-access-g8xg8") pod "1a534445-267b-424d-bc4e-67a17558f09a" (UID: "1a534445-267b-424d-bc4e-67a17558f09a"). InnerVolumeSpecName "kube-api-access-g8xg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:03:00 crc kubenswrapper[4962]: I1003 15:03:00.990233 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a534445-267b-424d-bc4e-67a17558f09a-ceph" (OuterVolumeSpecName: "ceph") pod "1a534445-267b-424d-bc4e-67a17558f09a" (UID: "1a534445-267b-424d-bc4e-67a17558f09a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:03:00 crc kubenswrapper[4962]: I1003 15:03:00.992771 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a534445-267b-424d-bc4e-67a17558f09a-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "1a534445-267b-424d-bc4e-67a17558f09a" (UID: "1a534445-267b-424d-bc4e-67a17558f09a"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:03:01 crc kubenswrapper[4962]: I1003 15:03:01.013016 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a534445-267b-424d-bc4e-67a17558f09a-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "1a534445-267b-424d-bc4e-67a17558f09a" (UID: "1a534445-267b-424d-bc4e-67a17558f09a"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:03:01 crc kubenswrapper[4962]: I1003 15:03:01.018782 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a534445-267b-424d-bc4e-67a17558f09a-inventory" (OuterVolumeSpecName: "inventory") pod "1a534445-267b-424d-bc4e-67a17558f09a" (UID: "1a534445-267b-424d-bc4e-67a17558f09a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:03:01 crc kubenswrapper[4962]: I1003 15:03:01.019584 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a534445-267b-424d-bc4e-67a17558f09a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1a534445-267b-424d-bc4e-67a17558f09a" (UID: "1a534445-267b-424d-bc4e-67a17558f09a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:03:01 crc kubenswrapper[4962]: I1003 15:03:01.087718 4962 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1a534445-267b-424d-bc4e-67a17558f09a-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:01 crc kubenswrapper[4962]: I1003 15:03:01.087764 4962 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1a534445-267b-424d-bc4e-67a17558f09a-ceph\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:01 crc kubenswrapper[4962]: I1003 15:03:01.087778 4962 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a534445-267b-424d-bc4e-67a17558f09a-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:01 crc kubenswrapper[4962]: I1003 15:03:01.087796 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8xg8\" (UniqueName: \"kubernetes.io/projected/1a534445-267b-424d-bc4e-67a17558f09a-kube-api-access-g8xg8\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:01 crc kubenswrapper[4962]: I1003 15:03:01.087807 4962 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1a534445-267b-424d-bc4e-67a17558f09a-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:01 crc kubenswrapper[4962]: I1003 15:03:01.087820 4962 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1a534445-267b-424d-bc4e-67a17558f09a-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:01 crc kubenswrapper[4962]: I1003 15:03:01.227028 4962 scope.go:117] "RemoveContainer" containerID="9e969c5b4663b9266408ae81a93dc573a565f6f23975eb2be7d5867d2c0fb330" Oct 03 15:03:01 crc kubenswrapper[4962]: E1003 15:03:01.227355 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 15:03:01 crc kubenswrapper[4962]: I1003 15:03:01.404852 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-vrkcj" event={"ID":"1a534445-267b-424d-bc4e-67a17558f09a","Type":"ContainerDied","Data":"41dd456c33d428fe480d38c2c0421ee90360852647870a0ee55ff65709743efb"} Oct 03 15:03:01 crc kubenswrapper[4962]: I1003 15:03:01.405219 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41dd456c33d428fe480d38c2c0421ee90360852647870a0ee55ff65709743efb" Oct 03 15:03:01 crc kubenswrapper[4962]: I1003 15:03:01.404904 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-vrkcj" Oct 03 15:03:12 crc kubenswrapper[4962]: I1003 15:03:12.235243 4962 scope.go:117] "RemoveContainer" containerID="9e969c5b4663b9266408ae81a93dc573a565f6f23975eb2be7d5867d2c0fb330" Oct 03 15:03:12 crc kubenswrapper[4962]: E1003 15:03:12.236243 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 15:03:19 crc kubenswrapper[4962]: I1003 15:03:19.077578 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-cell1-m76lg"] Oct 03 15:03:19 crc kubenswrapper[4962]: E1003 15:03:19.078581 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a534445-267b-424d-bc4e-67a17558f09a" containerName="ovn-openstack-openstack-cell1" Oct 03 15:03:19 crc kubenswrapper[4962]: I1003 15:03:19.078592 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a534445-267b-424d-bc4e-67a17558f09a" containerName="ovn-openstack-openstack-cell1" Oct 03 15:03:19 crc kubenswrapper[4962]: I1003 15:03:19.078865 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a534445-267b-424d-bc4e-67a17558f09a" containerName="ovn-openstack-openstack-cell1" Oct 03 15:03:19 crc kubenswrapper[4962]: I1003 15:03:19.079749 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-m76lg" Oct 03 15:03:19 crc kubenswrapper[4962]: I1003 15:03:19.085312 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-98wnm" Oct 03 15:03:19 crc kubenswrapper[4962]: I1003 15:03:19.085614 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 03 15:03:19 crc kubenswrapper[4962]: I1003 15:03:19.085782 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 03 15:03:19 crc kubenswrapper[4962]: I1003 15:03:19.086180 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 15:03:19 crc kubenswrapper[4962]: I1003 15:03:19.087897 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 03 15:03:19 crc kubenswrapper[4962]: I1003 15:03:19.090337 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-m76lg"] Oct 03 15:03:19 crc kubenswrapper[4962]: I1003 15:03:19.131730 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/811b7680-d11b-4c89-a8bd-d217aa5226ac-inventory\") pod \"ovn-openstack-openstack-cell1-m76lg\" (UID: \"811b7680-d11b-4c89-a8bd-d217aa5226ac\") " pod="openstack/ovn-openstack-openstack-cell1-m76lg" Oct 03 15:03:19 crc kubenswrapper[4962]: I1003 15:03:19.131774 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/811b7680-d11b-4c89-a8bd-d217aa5226ac-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-m76lg\" (UID: \"811b7680-d11b-4c89-a8bd-d217aa5226ac\") " pod="openstack/ovn-openstack-openstack-cell1-m76lg" Oct 03 15:03:19 crc kubenswrapper[4962]: I1003 15:03:19.131807 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/811b7680-d11b-4c89-a8bd-d217aa5226ac-ceph\") pod \"ovn-openstack-openstack-cell1-m76lg\" (UID: \"811b7680-d11b-4c89-a8bd-d217aa5226ac\") " pod="openstack/ovn-openstack-openstack-cell1-m76lg" Oct 03 15:03:19 crc kubenswrapper[4962]: I1003 15:03:19.131879 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/811b7680-d11b-4c89-a8bd-d217aa5226ac-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-m76lg\" (UID: \"811b7680-d11b-4c89-a8bd-d217aa5226ac\") " pod="openstack/ovn-openstack-openstack-cell1-m76lg" Oct 03 15:03:19 crc kubenswrapper[4962]: I1003 15:03:19.131898 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj64b\" (UniqueName: \"kubernetes.io/projected/811b7680-d11b-4c89-a8bd-d217aa5226ac-kube-api-access-nj64b\") pod \"ovn-openstack-openstack-cell1-m76lg\" (UID: \"811b7680-d11b-4c89-a8bd-d217aa5226ac\") " pod="openstack/ovn-openstack-openstack-cell1-m76lg" Oct 03 15:03:19 crc kubenswrapper[4962]: I1003 15:03:19.131986 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/811b7680-d11b-4c89-a8bd-d217aa5226ac-ssh-key\") pod \"ovn-openstack-openstack-cell1-m76lg\" (UID: 
\"811b7680-d11b-4c89-a8bd-d217aa5226ac\") " pod="openstack/ovn-openstack-openstack-cell1-m76lg" Oct 03 15:03:19 crc kubenswrapper[4962]: I1003 15:03:19.234621 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/811b7680-d11b-4c89-a8bd-d217aa5226ac-ssh-key\") pod \"ovn-openstack-openstack-cell1-m76lg\" (UID: \"811b7680-d11b-4c89-a8bd-d217aa5226ac\") " pod="openstack/ovn-openstack-openstack-cell1-m76lg" Oct 03 15:03:19 crc kubenswrapper[4962]: I1003 15:03:19.234706 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/811b7680-d11b-4c89-a8bd-d217aa5226ac-inventory\") pod \"ovn-openstack-openstack-cell1-m76lg\" (UID: \"811b7680-d11b-4c89-a8bd-d217aa5226ac\") " pod="openstack/ovn-openstack-openstack-cell1-m76lg" Oct 03 15:03:19 crc kubenswrapper[4962]: I1003 15:03:19.234744 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/811b7680-d11b-4c89-a8bd-d217aa5226ac-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-m76lg\" (UID: \"811b7680-d11b-4c89-a8bd-d217aa5226ac\") " pod="openstack/ovn-openstack-openstack-cell1-m76lg" Oct 03 15:03:19 crc kubenswrapper[4962]: I1003 15:03:19.234787 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/811b7680-d11b-4c89-a8bd-d217aa5226ac-ceph\") pod \"ovn-openstack-openstack-cell1-m76lg\" (UID: \"811b7680-d11b-4c89-a8bd-d217aa5226ac\") " pod="openstack/ovn-openstack-openstack-cell1-m76lg" Oct 03 15:03:19 crc kubenswrapper[4962]: I1003 15:03:19.234898 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/811b7680-d11b-4c89-a8bd-d217aa5226ac-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-m76lg\" (UID: \"811b7680-d11b-4c89-a8bd-d217aa5226ac\") " pod="openstack/ovn-openstack-openstack-cell1-m76lg" Oct 03 15:03:19 crc kubenswrapper[4962]: I1003 15:03:19.234924 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nj64b\" (UniqueName: \"kubernetes.io/projected/811b7680-d11b-4c89-a8bd-d217aa5226ac-kube-api-access-nj64b\") pod \"ovn-openstack-openstack-cell1-m76lg\" (UID: \"811b7680-d11b-4c89-a8bd-d217aa5226ac\") " pod="openstack/ovn-openstack-openstack-cell1-m76lg" Oct 03 15:03:19 crc kubenswrapper[4962]: I1003 15:03:19.235941 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/811b7680-d11b-4c89-a8bd-d217aa5226ac-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-m76lg\" (UID: \"811b7680-d11b-4c89-a8bd-d217aa5226ac\") " pod="openstack/ovn-openstack-openstack-cell1-m76lg" Oct 03 15:03:19 crc kubenswrapper[4962]: I1003 15:03:19.242190 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/811b7680-d11b-4c89-a8bd-d217aa5226ac-ssh-key\") pod \"ovn-openstack-openstack-cell1-m76lg\" (UID: \"811b7680-d11b-4c89-a8bd-d217aa5226ac\") " pod="openstack/ovn-openstack-openstack-cell1-m76lg" Oct 03 15:03:19 crc kubenswrapper[4962]: I1003 15:03:19.242622 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/811b7680-d11b-4c89-a8bd-d217aa5226ac-inventory\") pod 
\"ovn-openstack-openstack-cell1-m76lg\" (UID: \"811b7680-d11b-4c89-a8bd-d217aa5226ac\") " pod="openstack/ovn-openstack-openstack-cell1-m76lg" Oct 03 15:03:19 crc kubenswrapper[4962]: I1003 15:03:19.242977 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/811b7680-d11b-4c89-a8bd-d217aa5226ac-ceph\") pod \"ovn-openstack-openstack-cell1-m76lg\" (UID: \"811b7680-d11b-4c89-a8bd-d217aa5226ac\") " pod="openstack/ovn-openstack-openstack-cell1-m76lg" Oct 03 15:03:19 crc kubenswrapper[4962]: I1003 15:03:19.243797 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/811b7680-d11b-4c89-a8bd-d217aa5226ac-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-m76lg\" (UID: \"811b7680-d11b-4c89-a8bd-d217aa5226ac\") " pod="openstack/ovn-openstack-openstack-cell1-m76lg" Oct 03 15:03:19 crc kubenswrapper[4962]: I1003 15:03:19.254110 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj64b\" (UniqueName: \"kubernetes.io/projected/811b7680-d11b-4c89-a8bd-d217aa5226ac-kube-api-access-nj64b\") pod \"ovn-openstack-openstack-cell1-m76lg\" (UID: \"811b7680-d11b-4c89-a8bd-d217aa5226ac\") " pod="openstack/ovn-openstack-openstack-cell1-m76lg" Oct 03 15:03:19 crc kubenswrapper[4962]: I1003 15:03:19.415962 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-m76lg" Oct 03 15:03:20 crc kubenswrapper[4962]: I1003 15:03:20.034602 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-m76lg"] Oct 03 15:03:20 crc kubenswrapper[4962]: I1003 15:03:20.627213 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-m76lg" event={"ID":"811b7680-d11b-4c89-a8bd-d217aa5226ac","Type":"ContainerStarted","Data":"41fffbf1e2711f6cceb756c9c25b7cedbba6311bcce489476ccef99b31dd95e1"} Oct 03 15:03:20 crc kubenswrapper[4962]: I1003 15:03:20.627959 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-m76lg" event={"ID":"811b7680-d11b-4c89-a8bd-d217aa5226ac","Type":"ContainerStarted","Data":"6ba4454140687805cd62832273ec992d2c6a9e506f9deeaa3e1566ecb58c25b2"} Oct 03 15:03:20 crc kubenswrapper[4962]: I1003 15:03:20.645686 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-openstack-openstack-cell1-m76lg" podStartSLOduration=1.436136464 podStartE2EDuration="1.645670543s" podCreationTimestamp="2025-10-03 15:03:19 +0000 UTC" firstStartedPulling="2025-10-03 15:03:20.050976132 +0000 UTC m=+8008.454874007" lastFinishedPulling="2025-10-03 15:03:20.260510231 +0000 UTC m=+8008.664408086" observedRunningTime="2025-10-03 15:03:20.644610745 +0000 UTC m=+8009.048508580" watchObservedRunningTime="2025-10-03 15:03:20.645670543 +0000 UTC m=+8009.049568378" Oct 03 15:03:27 crc kubenswrapper[4962]: I1003 15:03:27.227726 4962 scope.go:117] "RemoveContainer" containerID="9e969c5b4663b9266408ae81a93dc573a565f6f23975eb2be7d5867d2c0fb330" Oct 03 15:03:27 crc kubenswrapper[4962]: E1003 15:03:27.228253 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 15:03:41 crc kubenswrapper[4962]: I1003 15:03:41.227565 4962 scope.go:117] "RemoveContainer" containerID="9e969c5b4663b9266408ae81a93dc573a565f6f23975eb2be7d5867d2c0fb330" Oct 03 15:03:41 crc kubenswrapper[4962]: E1003 15:03:41.228265 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 15:03:48 crc kubenswrapper[4962]: I1003 15:03:48.939525 4962 generic.go:334] "Generic (PLEG): container finished" podID="811b7680-d11b-4c89-a8bd-d217aa5226ac" containerID="41fffbf1e2711f6cceb756c9c25b7cedbba6311bcce489476ccef99b31dd95e1" exitCode=2 Oct 03 15:03:48 crc kubenswrapper[4962]: I1003 15:03:48.939605 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-m76lg" event={"ID":"811b7680-d11b-4c89-a8bd-d217aa5226ac","Type":"ContainerDied","Data":"41fffbf1e2711f6cceb756c9c25b7cedbba6311bcce489476ccef99b31dd95e1"} Oct 03 15:03:50 crc kubenswrapper[4962]: I1003 15:03:50.413554 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-m76lg" Oct 03 15:03:50 crc kubenswrapper[4962]: I1003 15:03:50.507739 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/811b7680-d11b-4c89-a8bd-d217aa5226ac-ovn-combined-ca-bundle\") pod \"811b7680-d11b-4c89-a8bd-d217aa5226ac\" (UID: \"811b7680-d11b-4c89-a8bd-d217aa5226ac\") " Oct 03 15:03:50 crc kubenswrapper[4962]: I1003 15:03:50.507805 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nj64b\" (UniqueName: \"kubernetes.io/projected/811b7680-d11b-4c89-a8bd-d217aa5226ac-kube-api-access-nj64b\") pod \"811b7680-d11b-4c89-a8bd-d217aa5226ac\" (UID: \"811b7680-d11b-4c89-a8bd-d217aa5226ac\") " Oct 03 15:03:50 crc kubenswrapper[4962]: I1003 15:03:50.507888 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/811b7680-d11b-4c89-a8bd-d217aa5226ac-ovncontroller-config-0\") pod \"811b7680-d11b-4c89-a8bd-d217aa5226ac\" (UID: \"811b7680-d11b-4c89-a8bd-d217aa5226ac\") " Oct 03 15:03:50 crc kubenswrapper[4962]: I1003 15:03:50.508005 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/811b7680-d11b-4c89-a8bd-d217aa5226ac-ceph\") pod \"811b7680-d11b-4c89-a8bd-d217aa5226ac\" (UID: \"811b7680-d11b-4c89-a8bd-d217aa5226ac\") " Oct 03 15:03:50 crc kubenswrapper[4962]: I1003 15:03:50.508128 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/811b7680-d11b-4c89-a8bd-d217aa5226ac-inventory\") pod \"811b7680-d11b-4c89-a8bd-d217aa5226ac\" (UID: \"811b7680-d11b-4c89-a8bd-d217aa5226ac\") " Oct 03 15:03:50 crc kubenswrapper[4962]: I1003 15:03:50.508218 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/811b7680-d11b-4c89-a8bd-d217aa5226ac-ssh-key\") pod \"811b7680-d11b-4c89-a8bd-d217aa5226ac\" (UID: \"811b7680-d11b-4c89-a8bd-d217aa5226ac\") " Oct 03 15:03:50 crc kubenswrapper[4962]: I1003 15:03:50.513854 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/811b7680-d11b-4c89-a8bd-d217aa5226ac-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "811b7680-d11b-4c89-a8bd-d217aa5226ac" (UID: "811b7680-d11b-4c89-a8bd-d217aa5226ac"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:03:50 crc kubenswrapper[4962]: I1003 15:03:50.514058 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/811b7680-d11b-4c89-a8bd-d217aa5226ac-kube-api-access-nj64b" (OuterVolumeSpecName: "kube-api-access-nj64b") pod "811b7680-d11b-4c89-a8bd-d217aa5226ac" (UID: "811b7680-d11b-4c89-a8bd-d217aa5226ac"). InnerVolumeSpecName "kube-api-access-nj64b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:03:50 crc kubenswrapper[4962]: I1003 15:03:50.517523 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/811b7680-d11b-4c89-a8bd-d217aa5226ac-ceph" (OuterVolumeSpecName: "ceph") pod "811b7680-d11b-4c89-a8bd-d217aa5226ac" (UID: "811b7680-d11b-4c89-a8bd-d217aa5226ac"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:03:50 crc kubenswrapper[4962]: I1003 15:03:50.533892 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/811b7680-d11b-4c89-a8bd-d217aa5226ac-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "811b7680-d11b-4c89-a8bd-d217aa5226ac" (UID: "811b7680-d11b-4c89-a8bd-d217aa5226ac"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:03:50 crc kubenswrapper[4962]: I1003 15:03:50.541925 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/811b7680-d11b-4c89-a8bd-d217aa5226ac-inventory" (OuterVolumeSpecName: "inventory") pod "811b7680-d11b-4c89-a8bd-d217aa5226ac" (UID: "811b7680-d11b-4c89-a8bd-d217aa5226ac"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:03:50 crc kubenswrapper[4962]: I1003 15:03:50.545550 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/811b7680-d11b-4c89-a8bd-d217aa5226ac-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "811b7680-d11b-4c89-a8bd-d217aa5226ac" (UID: "811b7680-d11b-4c89-a8bd-d217aa5226ac"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:03:50 crc kubenswrapper[4962]: I1003 15:03:50.610249 4962 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/811b7680-d11b-4c89-a8bd-d217aa5226ac-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:50 crc kubenswrapper[4962]: I1003 15:03:50.610286 4962 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/811b7680-d11b-4c89-a8bd-d217aa5226ac-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:50 crc kubenswrapper[4962]: I1003 15:03:50.610301 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nj64b\" (UniqueName: \"kubernetes.io/projected/811b7680-d11b-4c89-a8bd-d217aa5226ac-kube-api-access-nj64b\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:50 crc kubenswrapper[4962]: I1003 15:03:50.610312 4962 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/811b7680-d11b-4c89-a8bd-d217aa5226ac-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:50 crc kubenswrapper[4962]: I1003 15:03:50.610323 4962 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/811b7680-d11b-4c89-a8bd-d217aa5226ac-ceph\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:50 crc kubenswrapper[4962]: I1003 15:03:50.610332 4962 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/811b7680-d11b-4c89-a8bd-d217aa5226ac-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 15:03:50 crc kubenswrapper[4962]: I1003 15:03:50.958920 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-m76lg" event={"ID":"811b7680-d11b-4c89-a8bd-d217aa5226ac","Type":"ContainerDied","Data":"6ba4454140687805cd62832273ec992d2c6a9e506f9deeaa3e1566ecb58c25b2"} Oct 03 15:03:50 crc kubenswrapper[4962]: I1003 15:03:50.958971 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ba4454140687805cd62832273ec992d2c6a9e506f9deeaa3e1566ecb58c25b2" Oct 03 15:03:50 crc kubenswrapper[4962]: I1003 15:03:50.958974 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-m76lg" Oct 03 15:03:52 crc kubenswrapper[4962]: I1003 15:03:52.234619 4962 scope.go:117] "RemoveContainer" containerID="9e969c5b4663b9266408ae81a93dc573a565f6f23975eb2be7d5867d2c0fb330" Oct 03 15:03:52 crc kubenswrapper[4962]: E1003 15:03:52.235377 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 15:04:06 crc kubenswrapper[4962]: I1003 15:04:06.227886 4962 scope.go:117] "RemoveContainer" containerID="9e969c5b4663b9266408ae81a93dc573a565f6f23975eb2be7d5867d2c0fb330" Oct 03 15:04:06 crc kubenswrapper[4962]: E1003 15:04:06.228778 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 15:04:19 crc kubenswrapper[4962]: I1003 15:04:19.227783 4962 scope.go:117] "RemoveContainer" containerID="9e969c5b4663b9266408ae81a93dc573a565f6f23975eb2be7d5867d2c0fb330" Oct 03 15:04:19 crc kubenswrapper[4962]: E1003 15:04:19.228818 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 15:04:28 crc kubenswrapper[4962]: I1003 15:04:28.033714 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-cell1-rh2r4"] Oct 03 15:04:28 crc kubenswrapper[4962]: E1003 15:04:28.034732 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="811b7680-d11b-4c89-a8bd-d217aa5226ac" containerName="ovn-openstack-openstack-cell1" Oct 03 15:04:28 crc kubenswrapper[4962]: I1003 15:04:28.034749 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="811b7680-d11b-4c89-a8bd-d217aa5226ac" containerName="ovn-openstack-openstack-cell1" Oct 03 15:04:28 crc kubenswrapper[4962]: I1003 15:04:28.035037 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="811b7680-d11b-4c89-a8bd-d217aa5226ac" containerName="ovn-openstack-openstack-cell1" Oct 03 15:04:28 crc kubenswrapper[4962]: I1003 15:04:28.036025 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-rh2r4" Oct 03 15:04:28 crc kubenswrapper[4962]: I1003 15:04:28.039551 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 03 15:04:28 crc kubenswrapper[4962]: I1003 15:04:28.039749 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 15:04:28 crc kubenswrapper[4962]: I1003 15:04:28.039600 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-98wnm" Oct 03 15:04:28 crc kubenswrapper[4962]: I1003 15:04:28.040435 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 03 15:04:28 crc kubenswrapper[4962]: I1003 15:04:28.040626 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 03 15:04:28 crc kubenswrapper[4962]: I1003 15:04:28.047562 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-rh2r4"] Oct 03 15:04:28 crc kubenswrapper[4962]: I1003 15:04:28.148858 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b6f674e6-6785-465e-b8ba-85c0efe46fe5-ceph\") pod \"ovn-openstack-openstack-cell1-rh2r4\" (UID: \"b6f674e6-6785-465e-b8ba-85c0efe46fe5\") " pod="openstack/ovn-openstack-openstack-cell1-rh2r4" Oct 03 15:04:28 crc kubenswrapper[4962]: I1003 15:04:28.149193 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b6f674e6-6785-465e-b8ba-85c0efe46fe5-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-rh2r4\" (UID: \"b6f674e6-6785-465e-b8ba-85c0efe46fe5\") " pod="openstack/ovn-openstack-openstack-cell1-rh2r4" Oct 03 15:04:28 crc kubenswrapper[4962]: I1003 15:04:28.149406 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6f674e6-6785-465e-b8ba-85c0efe46fe5-inventory\") pod \"ovn-openstack-openstack-cell1-rh2r4\" (UID: \"b6f674e6-6785-465e-b8ba-85c0efe46fe5\") " pod="openstack/ovn-openstack-openstack-cell1-rh2r4" Oct 03 15:04:28 crc kubenswrapper[4962]: I1003 15:04:28.149592 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6f674e6-6785-465e-b8ba-85c0efe46fe5-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-rh2r4\" (UID: \"b6f674e6-6785-465e-b8ba-85c0efe46fe5\") " pod="openstack/ovn-openstack-openstack-cell1-rh2r4" Oct 03 15:04:28 crc kubenswrapper[4962]: I1003 15:04:28.149846 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b6f674e6-6785-465e-b8ba-85c0efe46fe5-ssh-key\") pod \"ovn-openstack-openstack-cell1-rh2r4\" (UID: \"b6f674e6-6785-465e-b8ba-85c0efe46fe5\") " pod="openstack/ovn-openstack-openstack-cell1-rh2r4" Oct 03 15:04:28 crc kubenswrapper[4962]: I1003 15:04:28.150049 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqjqt\" (UniqueName: \"kubernetes.io/projected/b6f674e6-6785-465e-b8ba-85c0efe46fe5-kube-api-access-tqjqt\") pod \"ovn-openstack-openstack-cell1-rh2r4\" (UID: 
\"b6f674e6-6785-465e-b8ba-85c0efe46fe5\") " pod="openstack/ovn-openstack-openstack-cell1-rh2r4" Oct 03 15:04:28 crc kubenswrapper[4962]: I1003 15:04:28.252129 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6f674e6-6785-465e-b8ba-85c0efe46fe5-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-rh2r4\" (UID: \"b6f674e6-6785-465e-b8ba-85c0efe46fe5\") " pod="openstack/ovn-openstack-openstack-cell1-rh2r4" Oct 03 15:04:28 crc kubenswrapper[4962]: I1003 15:04:28.252201 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b6f674e6-6785-465e-b8ba-85c0efe46fe5-ssh-key\") pod \"ovn-openstack-openstack-cell1-rh2r4\" (UID: \"b6f674e6-6785-465e-b8ba-85c0efe46fe5\") " pod="openstack/ovn-openstack-openstack-cell1-rh2r4" Oct 03 15:04:28 crc kubenswrapper[4962]: I1003 15:04:28.252255 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqjqt\" (UniqueName: \"kubernetes.io/projected/b6f674e6-6785-465e-b8ba-85c0efe46fe5-kube-api-access-tqjqt\") pod \"ovn-openstack-openstack-cell1-rh2r4\" (UID: \"b6f674e6-6785-465e-b8ba-85c0efe46fe5\") " pod="openstack/ovn-openstack-openstack-cell1-rh2r4" Oct 03 15:04:28 crc kubenswrapper[4962]: I1003 15:04:28.252307 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b6f674e6-6785-465e-b8ba-85c0efe46fe5-ceph\") pod \"ovn-openstack-openstack-cell1-rh2r4\" (UID: \"b6f674e6-6785-465e-b8ba-85c0efe46fe5\") " pod="openstack/ovn-openstack-openstack-cell1-rh2r4" Oct 03 15:04:28 crc kubenswrapper[4962]: I1003 15:04:28.252350 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b6f674e6-6785-465e-b8ba-85c0efe46fe5-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-rh2r4\" (UID: \"b6f674e6-6785-465e-b8ba-85c0efe46fe5\") " pod="openstack/ovn-openstack-openstack-cell1-rh2r4" Oct 03 15:04:28 crc kubenswrapper[4962]: I1003 15:04:28.252384 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6f674e6-6785-465e-b8ba-85c0efe46fe5-inventory\") pod \"ovn-openstack-openstack-cell1-rh2r4\" (UID: \"b6f674e6-6785-465e-b8ba-85c0efe46fe5\") " pod="openstack/ovn-openstack-openstack-cell1-rh2r4" Oct 03 15:04:28 crc kubenswrapper[4962]: I1003 15:04:28.257473 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b6f674e6-6785-465e-b8ba-85c0efe46fe5-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-rh2r4\" (UID: \"b6f674e6-6785-465e-b8ba-85c0efe46fe5\") " pod="openstack/ovn-openstack-openstack-cell1-rh2r4" Oct 03 15:04:28 crc kubenswrapper[4962]: I1003 15:04:28.266696 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b6f674e6-6785-465e-b8ba-85c0efe46fe5-ssh-key\") pod \"ovn-openstack-openstack-cell1-rh2r4\" (UID: \"b6f674e6-6785-465e-b8ba-85c0efe46fe5\") " pod="openstack/ovn-openstack-openstack-cell1-rh2r4" Oct 03 15:04:28 crc kubenswrapper[4962]: I1003 15:04:28.281259 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6f674e6-6785-465e-b8ba-85c0efe46fe5-inventory\") pod 
\"ovn-openstack-openstack-cell1-rh2r4\" (UID: \"b6f674e6-6785-465e-b8ba-85c0efe46fe5\") " pod="openstack/ovn-openstack-openstack-cell1-rh2r4" Oct 03 15:04:28 crc kubenswrapper[4962]: I1003 15:04:28.290311 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b6f674e6-6785-465e-b8ba-85c0efe46fe5-ceph\") pod \"ovn-openstack-openstack-cell1-rh2r4\" (UID: \"b6f674e6-6785-465e-b8ba-85c0efe46fe5\") " pod="openstack/ovn-openstack-openstack-cell1-rh2r4" Oct 03 15:04:28 crc kubenswrapper[4962]: I1003 15:04:28.291104 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6f674e6-6785-465e-b8ba-85c0efe46fe5-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-rh2r4\" (UID: \"b6f674e6-6785-465e-b8ba-85c0efe46fe5\") " pod="openstack/ovn-openstack-openstack-cell1-rh2r4" Oct 03 15:04:28 crc kubenswrapper[4962]: I1003 15:04:28.291469 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqjqt\" (UniqueName: \"kubernetes.io/projected/b6f674e6-6785-465e-b8ba-85c0efe46fe5-kube-api-access-tqjqt\") pod \"ovn-openstack-openstack-cell1-rh2r4\" (UID: \"b6f674e6-6785-465e-b8ba-85c0efe46fe5\") " pod="openstack/ovn-openstack-openstack-cell1-rh2r4" Oct 03 15:04:28 crc kubenswrapper[4962]: I1003 15:04:28.354355 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-rh2r4" Oct 03 15:04:28 crc kubenswrapper[4962]: I1003 15:04:28.857992 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-rh2r4"] Oct 03 15:04:28 crc kubenswrapper[4962]: I1003 15:04:28.858629 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 15:04:29 crc kubenswrapper[4962]: I1003 15:04:29.340437 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-rh2r4" event={"ID":"b6f674e6-6785-465e-b8ba-85c0efe46fe5","Type":"ContainerStarted","Data":"ec31e6f874c38bbb1568866effc59ce607788d4a457ba828c72145d0dba053cc"} Oct 03 15:04:29 crc kubenswrapper[4962]: I1003 15:04:29.349838 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-rh2r4" event={"ID":"b6f674e6-6785-465e-b8ba-85c0efe46fe5","Type":"ContainerStarted","Data":"f2cd0e5357fc16873091de57e9ff68bd1775bc415f0471280ec3a3fd2593c19e"} Oct 03 15:04:29 crc kubenswrapper[4962]: I1003 15:04:29.377267 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-openstack-openstack-cell1-rh2r4" podStartSLOduration=1.197185558 podStartE2EDuration="1.37724312s" podCreationTimestamp="2025-10-03 15:04:28 +0000 UTC" firstStartedPulling="2025-10-03 15:04:28.858421312 +0000 UTC m=+8077.262319147" lastFinishedPulling="2025-10-03 15:04:29.038478874 +0000 UTC m=+8077.442376709" observedRunningTime="2025-10-03 15:04:29.367225112 +0000 UTC m=+8077.771122947" watchObservedRunningTime="2025-10-03 15:04:29.37724312 +0000 UTC m=+8077.781140955" Oct 03 15:04:33 crc kubenswrapper[4962]: I1003 15:04:33.227342 4962 scope.go:117] "RemoveContainer" containerID="9e969c5b4663b9266408ae81a93dc573a565f6f23975eb2be7d5867d2c0fb330" Oct 03 15:04:33 crc kubenswrapper[4962]: E1003 15:04:33.228120 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 15:04:46 crc kubenswrapper[4962]: I1003 15:04:46.227471 4962 scope.go:117] "RemoveContainer" containerID="9e969c5b4663b9266408ae81a93dc573a565f6f23975eb2be7d5867d2c0fb330" Oct 03 15:04:46 crc kubenswrapper[4962]: E1003 15:04:46.228203 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 15:04:59 crc kubenswrapper[4962]: I1003 15:04:59.646627 4962 generic.go:334] "Generic (PLEG): container finished" podID="b6f674e6-6785-465e-b8ba-85c0efe46fe5" containerID="ec31e6f874c38bbb1568866effc59ce607788d4a457ba828c72145d0dba053cc" exitCode=2 Oct 03 15:04:59 crc kubenswrapper[4962]: I1003 15:04:59.646683 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-rh2r4" event={"ID":"b6f674e6-6785-465e-b8ba-85c0efe46fe5","Type":"ContainerDied","Data":"ec31e6f874c38bbb1568866effc59ce607788d4a457ba828c72145d0dba053cc"} Oct 03 15:05:01 crc kubenswrapper[4962]: I1003 15:05:01.093116 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-rh2r4" Oct 03 15:05:01 crc kubenswrapper[4962]: I1003 15:05:01.162894 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b6f674e6-6785-465e-b8ba-85c0efe46fe5-ovncontroller-config-0\") pod \"b6f674e6-6785-465e-b8ba-85c0efe46fe5\" (UID: \"b6f674e6-6785-465e-b8ba-85c0efe46fe5\") " Oct 03 15:05:01 crc kubenswrapper[4962]: I1003 15:05:01.163053 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b6f674e6-6785-465e-b8ba-85c0efe46fe5-ssh-key\") pod \"b6f674e6-6785-465e-b8ba-85c0efe46fe5\" (UID: \"b6f674e6-6785-465e-b8ba-85c0efe46fe5\") " Oct 03 15:05:01 crc kubenswrapper[4962]: I1003 15:05:01.163096 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6f674e6-6785-465e-b8ba-85c0efe46fe5-inventory\") pod \"b6f674e6-6785-465e-b8ba-85c0efe46fe5\" (UID: \"b6f674e6-6785-465e-b8ba-85c0efe46fe5\") " Oct 03 15:05:01 crc kubenswrapper[4962]: I1003 15:05:01.163212 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b6f674e6-6785-465e-b8ba-85c0efe46fe5-ceph\") pod \"b6f674e6-6785-465e-b8ba-85c0efe46fe5\" (UID: \"b6f674e6-6785-465e-b8ba-85c0efe46fe5\") " Oct 03 15:05:01 crc kubenswrapper[4962]: I1003 15:05:01.163419 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqjqt\" (UniqueName: \"kubernetes.io/projected/b6f674e6-6785-465e-b8ba-85c0efe46fe5-kube-api-access-tqjqt\") pod \"b6f674e6-6785-465e-b8ba-85c0efe46fe5\" (UID: \"b6f674e6-6785-465e-b8ba-85c0efe46fe5\") " Oct 03 15:05:01 crc kubenswrapper[4962]: I1003 15:05:01.163830 4962 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6f674e6-6785-465e-b8ba-85c0efe46fe5-ovn-combined-ca-bundle\") pod \"b6f674e6-6785-465e-b8ba-85c0efe46fe5\" (UID: \"b6f674e6-6785-465e-b8ba-85c0efe46fe5\") " Oct 03 15:05:01 crc kubenswrapper[4962]: I1003 15:05:01.169978 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6f674e6-6785-465e-b8ba-85c0efe46fe5-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "b6f674e6-6785-465e-b8ba-85c0efe46fe5" (UID: "b6f674e6-6785-465e-b8ba-85c0efe46fe5"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:05:01 crc kubenswrapper[4962]: I1003 15:05:01.170188 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6f674e6-6785-465e-b8ba-85c0efe46fe5-ceph" (OuterVolumeSpecName: "ceph") pod "b6f674e6-6785-465e-b8ba-85c0efe46fe5" (UID: "b6f674e6-6785-465e-b8ba-85c0efe46fe5"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:05:01 crc kubenswrapper[4962]: I1003 15:05:01.173381 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6f674e6-6785-465e-b8ba-85c0efe46fe5-kube-api-access-tqjqt" (OuterVolumeSpecName: "kube-api-access-tqjqt") pod "b6f674e6-6785-465e-b8ba-85c0efe46fe5" (UID: "b6f674e6-6785-465e-b8ba-85c0efe46fe5"). InnerVolumeSpecName "kube-api-access-tqjqt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:05:01 crc kubenswrapper[4962]: I1003 15:05:01.197237 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6f674e6-6785-465e-b8ba-85c0efe46fe5-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "b6f674e6-6785-465e-b8ba-85c0efe46fe5" (UID: "b6f674e6-6785-465e-b8ba-85c0efe46fe5"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:05:01 crc kubenswrapper[4962]: I1003 15:05:01.199594 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6f674e6-6785-465e-b8ba-85c0efe46fe5-inventory" (OuterVolumeSpecName: "inventory") pod "b6f674e6-6785-465e-b8ba-85c0efe46fe5" (UID: "b6f674e6-6785-465e-b8ba-85c0efe46fe5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:05:01 crc kubenswrapper[4962]: I1003 15:05:01.204947 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6f674e6-6785-465e-b8ba-85c0efe46fe5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b6f674e6-6785-465e-b8ba-85c0efe46fe5" (UID: "b6f674e6-6785-465e-b8ba-85c0efe46fe5"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:05:01 crc kubenswrapper[4962]: I1003 15:05:01.229186 4962 scope.go:117] "RemoveContainer" containerID="9e969c5b4663b9266408ae81a93dc573a565f6f23975eb2be7d5867d2c0fb330" Oct 03 15:05:01 crc kubenswrapper[4962]: E1003 15:05:01.229707 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 15:05:01 crc kubenswrapper[4962]: I1003 15:05:01.266336 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqjqt\" (UniqueName: \"kubernetes.io/projected/b6f674e6-6785-465e-b8ba-85c0efe46fe5-kube-api-access-tqjqt\") on node \"crc\" DevicePath \"\"" Oct 03 15:05:01 crc kubenswrapper[4962]: I1003 15:05:01.266522 4962 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6f674e6-6785-465e-b8ba-85c0efe46fe5-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 15:05:01 crc kubenswrapper[4962]: I1003 15:05:01.266604 4962 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b6f674e6-6785-465e-b8ba-85c0efe46fe5-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Oct 03 15:05:01 crc kubenswrapper[4962]: I1003 15:05:01.266682 4962 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b6f674e6-6785-465e-b8ba-85c0efe46fe5-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 15:05:01 crc kubenswrapper[4962]: I1003 15:05:01.266740 4962 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6f674e6-6785-465e-b8ba-85c0efe46fe5-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 15:05:01 crc kubenswrapper[4962]: I1003 15:05:01.266791 4962 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b6f674e6-6785-465e-b8ba-85c0efe46fe5-ceph\") on node \"crc\" DevicePath \"\"" Oct 03 15:05:01 crc kubenswrapper[4962]: I1003 15:05:01.667015 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-rh2r4" event={"ID":"b6f674e6-6785-465e-b8ba-85c0efe46fe5","Type":"ContainerDied","Data":"f2cd0e5357fc16873091de57e9ff68bd1775bc415f0471280ec3a3fd2593c19e"} Oct 03 15:05:01 crc kubenswrapper[4962]: I1003 15:05:01.667069 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2cd0e5357fc16873091de57e9ff68bd1775bc415f0471280ec3a3fd2593c19e" Oct 03 15:05:01 crc kubenswrapper[4962]: I1003 15:05:01.667148 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-rh2r4" Oct 03 15:05:13 crc kubenswrapper[4962]: I1003 15:05:13.227084 4962 scope.go:117] "RemoveContainer" containerID="9e969c5b4663b9266408ae81a93dc573a565f6f23975eb2be7d5867d2c0fb330" Oct 03 15:05:13 crc kubenswrapper[4962]: E1003 15:05:13.227976 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 15:05:27 crc kubenswrapper[4962]: I1003 15:05:27.227910 4962 scope.go:117] "RemoveContainer" containerID="9e969c5b4663b9266408ae81a93dc573a565f6f23975eb2be7d5867d2c0fb330" Oct 03 15:05:27 crc kubenswrapper[4962]: E1003 15:05:27.228783 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 15:05:41 crc kubenswrapper[4962]: I1003 15:05:41.228840 4962 scope.go:117] "RemoveContainer" containerID="9e969c5b4663b9266408ae81a93dc573a565f6f23975eb2be7d5867d2c0fb330" Oct 03 15:05:41 crc kubenswrapper[4962]: E1003 15:05:41.229865 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 15:05:53 crc kubenswrapper[4962]: I1003 15:05:53.227782 4962 scope.go:117] "RemoveContainer" containerID="9e969c5b4663b9266408ae81a93dc573a565f6f23975eb2be7d5867d2c0fb330" Oct 03 15:05:53 crc kubenswrapper[4962]: E1003 15:05:53.228806 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 15:06:00 crc kubenswrapper[4962]: I1003 15:06:00.557402 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qgjqc/must-gather-dwknh"] Oct 03 15:06:00 crc kubenswrapper[4962]: E1003 15:06:00.560546 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6f674e6-6785-465e-b8ba-85c0efe46fe5" containerName="ovn-openstack-openstack-cell1" Oct 03 15:06:00 crc kubenswrapper[4962]: I1003 15:06:00.560704 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6f674e6-6785-465e-b8ba-85c0efe46fe5" containerName="ovn-openstack-openstack-cell1" Oct 03 15:06:00 crc kubenswrapper[4962]: I1003 15:06:00.561050 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6f674e6-6785-465e-b8ba-85c0efe46fe5" 
containerName="ovn-openstack-openstack-cell1" Oct 03 15:06:00 crc kubenswrapper[4962]: I1003 15:06:00.563251 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qgjqc/must-gather-dwknh" Oct 03 15:06:00 crc kubenswrapper[4962]: I1003 15:06:00.573926 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-qgjqc"/"default-dockercfg-rdfpw" Oct 03 15:06:00 crc kubenswrapper[4962]: I1003 15:06:00.574570 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-qgjqc"/"openshift-service-ca.crt" Oct 03 15:06:00 crc kubenswrapper[4962]: I1003 15:06:00.575745 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-qgjqc"/"kube-root-ca.crt" Oct 03 15:06:00 crc kubenswrapper[4962]: I1003 15:06:00.578776 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qgjqc/must-gather-dwknh"] Oct 03 15:06:00 crc kubenswrapper[4962]: I1003 15:06:00.707802 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/12468d83-dfda-49d1-8184-efe2d4226e3c-must-gather-output\") pod \"must-gather-dwknh\" (UID: \"12468d83-dfda-49d1-8184-efe2d4226e3c\") " pod="openshift-must-gather-qgjqc/must-gather-dwknh" Oct 03 15:06:00 crc kubenswrapper[4962]: I1003 15:06:00.708406 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjm9s\" (UniqueName: \"kubernetes.io/projected/12468d83-dfda-49d1-8184-efe2d4226e3c-kube-api-access-xjm9s\") pod \"must-gather-dwknh\" (UID: \"12468d83-dfda-49d1-8184-efe2d4226e3c\") " pod="openshift-must-gather-qgjqc/must-gather-dwknh" Oct 03 15:06:00 crc kubenswrapper[4962]: I1003 15:06:00.810939 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjm9s\" (UniqueName: \"kubernetes.io/projected/12468d83-dfda-49d1-8184-efe2d4226e3c-kube-api-access-xjm9s\") pod \"must-gather-dwknh\" (UID: \"12468d83-dfda-49d1-8184-efe2d4226e3c\") " pod="openshift-must-gather-qgjqc/must-gather-dwknh" Oct 03 15:06:00 crc kubenswrapper[4962]: I1003 15:06:00.811100 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/12468d83-dfda-49d1-8184-efe2d4226e3c-must-gather-output\") pod \"must-gather-dwknh\" (UID: \"12468d83-dfda-49d1-8184-efe2d4226e3c\") " pod="openshift-must-gather-qgjqc/must-gather-dwknh" Oct 03 15:06:00 crc kubenswrapper[4962]: I1003 15:06:00.811571 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/12468d83-dfda-49d1-8184-efe2d4226e3c-must-gather-output\") pod \"must-gather-dwknh\" (UID: \"12468d83-dfda-49d1-8184-efe2d4226e3c\") " pod="openshift-must-gather-qgjqc/must-gather-dwknh" Oct 03 15:06:00 crc kubenswrapper[4962]: I1003 15:06:00.843485 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjm9s\" (UniqueName: \"kubernetes.io/projected/12468d83-dfda-49d1-8184-efe2d4226e3c-kube-api-access-xjm9s\") pod \"must-gather-dwknh\" (UID: \"12468d83-dfda-49d1-8184-efe2d4226e3c\") " pod="openshift-must-gather-qgjqc/must-gather-dwknh" Oct 03 15:06:00 crc kubenswrapper[4962]: I1003 15:06:00.892239 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qgjqc/must-gather-dwknh" Oct 03 15:06:01 crc kubenswrapper[4962]: I1003 15:06:01.383899 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qgjqc/must-gather-dwknh"] Oct 03 15:06:02 crc kubenswrapper[4962]: I1003 15:06:02.286107 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qgjqc/must-gather-dwknh" event={"ID":"12468d83-dfda-49d1-8184-efe2d4226e3c","Type":"ContainerStarted","Data":"bbfefae4c2a5823da68a5be5113cf108c8f42d5115d1e1040c435b1d1333626d"} Oct 03 15:06:05 crc kubenswrapper[4962]: I1003 15:06:05.227893 4962 scope.go:117] "RemoveContainer" containerID="9e969c5b4663b9266408ae81a93dc573a565f6f23975eb2be7d5867d2c0fb330" Oct 03 15:06:05 crc kubenswrapper[4962]: E1003 15:06:05.228774 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 15:06:06 crc kubenswrapper[4962]: I1003 15:06:06.329662 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qgjqc/must-gather-dwknh" event={"ID":"12468d83-dfda-49d1-8184-efe2d4226e3c","Type":"ContainerStarted","Data":"607feffd5a7d9a839972dff48c5b551e2eb7e2208e8defe1aa2e77561542fa3c"} Oct 03 15:06:06 crc kubenswrapper[4962]: I1003 15:06:06.330235 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qgjqc/must-gather-dwknh" event={"ID":"12468d83-dfda-49d1-8184-efe2d4226e3c","Type":"ContainerStarted","Data":"abfcb4f36a3b40eed9acff09a594fd5f516a83ee9e1685f5eb70fb8759af9786"} Oct 03 15:06:08 crc kubenswrapper[4962]: I1003 15:06:08.838270 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qgjqc/must-gather-dwknh" podStartSLOduration=5.148126205 podStartE2EDuration="8.83823904s" podCreationTimestamp="2025-10-03 15:06:00 +0000 UTC" firstStartedPulling="2025-10-03 15:06:01.391722064 +0000 UTC m=+8169.795619899" lastFinishedPulling="2025-10-03 15:06:05.081834899 +0000 UTC m=+8173.485732734" observedRunningTime="2025-10-03 15:06:06.344305375 +0000 UTC m=+8174.748203220" watchObservedRunningTime="2025-10-03 15:06:08.83823904 +0000 UTC m=+8177.242136875" Oct 03 15:06:08 crc kubenswrapper[4962]: I1003 15:06:08.840262 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-42rd4"] Oct 03 15:06:08 crc kubenswrapper[4962]: I1003 15:06:08.843143 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-42rd4" Oct 03 15:06:08 crc kubenswrapper[4962]: I1003 15:06:08.850564 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-42rd4"] Oct 03 15:06:08 crc kubenswrapper[4962]: I1003 15:06:08.988939 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/656d8f62-9bad-43c6-950e-29614affab2b-catalog-content\") pod \"certified-operators-42rd4\" (UID: \"656d8f62-9bad-43c6-950e-29614affab2b\") " pod="openshift-marketplace/certified-operators-42rd4" Oct 03 15:06:08 crc kubenswrapper[4962]: I1003 15:06:08.989340 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sl86\" (UniqueName: \"kubernetes.io/projected/656d8f62-9bad-43c6-950e-29614affab2b-kube-api-access-2sl86\") pod \"certified-operators-42rd4\" (UID: \"656d8f62-9bad-43c6-950e-29614affab2b\") " pod="openshift-marketplace/certified-operators-42rd4" Oct 03 15:06:08 crc kubenswrapper[4962]: I1003 15:06:08.989821 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/656d8f62-9bad-43c6-950e-29614affab2b-utilities\") pod \"certified-operators-42rd4\" (UID: \"656d8f62-9bad-43c6-950e-29614affab2b\") " pod="openshift-marketplace/certified-operators-42rd4" Oct 03 15:06:09 crc kubenswrapper[4962]: I1003 15:06:09.091661 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/656d8f62-9bad-43c6-950e-29614affab2b-catalog-content\") pod \"certified-operators-42rd4\" (UID: \"656d8f62-9bad-43c6-950e-29614affab2b\") " pod="openshift-marketplace/certified-operators-42rd4" Oct 03 15:06:09 crc kubenswrapper[4962]: I1003 15:06:09.091720 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sl86\" (UniqueName: \"kubernetes.io/projected/656d8f62-9bad-43c6-950e-29614affab2b-kube-api-access-2sl86\") pod \"certified-operators-42rd4\" (UID: \"656d8f62-9bad-43c6-950e-29614affab2b\") " pod="openshift-marketplace/certified-operators-42rd4" Oct 03 15:06:09 crc kubenswrapper[4962]: I1003 15:06:09.091827 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/656d8f62-9bad-43c6-950e-29614affab2b-utilities\") pod \"certified-operators-42rd4\" (UID: \"656d8f62-9bad-43c6-950e-29614affab2b\") " pod="openshift-marketplace/certified-operators-42rd4" Oct 03 15:06:09 crc kubenswrapper[4962]: I1003 15:06:09.092163 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/656d8f62-9bad-43c6-950e-29614affab2b-catalog-content\") pod \"certified-operators-42rd4\" (UID: \"656d8f62-9bad-43c6-950e-29614affab2b\") " pod="openshift-marketplace/certified-operators-42rd4" Oct 03 15:06:09 crc kubenswrapper[4962]: I1003 15:06:09.092193 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/656d8f62-9bad-43c6-950e-29614affab2b-utilities\") pod \"certified-operators-42rd4\" (UID: \"656d8f62-9bad-43c6-950e-29614affab2b\") " pod="openshift-marketplace/certified-operators-42rd4" Oct 03 15:06:09 crc kubenswrapper[4962]: I1003 15:06:09.122075 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-2sl86\" (UniqueName: \"kubernetes.io/projected/656d8f62-9bad-43c6-950e-29614affab2b-kube-api-access-2sl86\") pod \"certified-operators-42rd4\" (UID: \"656d8f62-9bad-43c6-950e-29614affab2b\") " pod="openshift-marketplace/certified-operators-42rd4" Oct 03 15:06:09 crc kubenswrapper[4962]: I1003 15:06:09.170199 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-42rd4" Oct 03 15:06:09 crc kubenswrapper[4962]: I1003 15:06:09.751111 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-42rd4"] Oct 03 15:06:10 crc kubenswrapper[4962]: I1003 15:06:10.374113 4962 generic.go:334] "Generic (PLEG): container finished" podID="656d8f62-9bad-43c6-950e-29614affab2b" containerID="140f38c9eae199c690b714e375992337d943331f688ecc92b6d631567ec24fcb" exitCode=0 Oct 03 15:06:10 crc kubenswrapper[4962]: I1003 15:06:10.374158 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-42rd4" event={"ID":"656d8f62-9bad-43c6-950e-29614affab2b","Type":"ContainerDied","Data":"140f38c9eae199c690b714e375992337d943331f688ecc92b6d631567ec24fcb"} Oct 03 15:06:10 crc kubenswrapper[4962]: I1003 15:06:10.374708 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-42rd4" event={"ID":"656d8f62-9bad-43c6-950e-29614affab2b","Type":"ContainerStarted","Data":"d3f2d71b42e12b193b73b6d85988f661480bd7734743d234c35de2308d629221"} Oct 03 15:06:11 crc kubenswrapper[4962]: I1003 15:06:11.400617 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-42rd4" event={"ID":"656d8f62-9bad-43c6-950e-29614affab2b","Type":"ContainerStarted","Data":"bd5b48b366f02d21b41095be84ceb8107dce6181b25e1c86a9a2d3cd727cb7ba"} Oct 03 15:06:11 crc kubenswrapper[4962]: I1003 15:06:11.459050 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qgjqc/crc-debug-mll9k"] Oct 03 15:06:11 crc kubenswrapper[4962]: I1003 15:06:11.460751 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qgjqc/crc-debug-mll9k" Oct 03 15:06:11 crc kubenswrapper[4962]: I1003 15:06:11.544726 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0c6f6f8e-52f8-49b3-be6b-1fbef7eb5944-host\") pod \"crc-debug-mll9k\" (UID: \"0c6f6f8e-52f8-49b3-be6b-1fbef7eb5944\") " pod="openshift-must-gather-qgjqc/crc-debug-mll9k" Oct 03 15:06:11 crc kubenswrapper[4962]: I1003 15:06:11.545142 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58wqs\" (UniqueName: \"kubernetes.io/projected/0c6f6f8e-52f8-49b3-be6b-1fbef7eb5944-kube-api-access-58wqs\") pod \"crc-debug-mll9k\" (UID: \"0c6f6f8e-52f8-49b3-be6b-1fbef7eb5944\") " pod="openshift-must-gather-qgjqc/crc-debug-mll9k" Oct 03 15:06:11 crc kubenswrapper[4962]: I1003 15:06:11.647851 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0c6f6f8e-52f8-49b3-be6b-1fbef7eb5944-host\") pod \"crc-debug-mll9k\" (UID: \"0c6f6f8e-52f8-49b3-be6b-1fbef7eb5944\") " pod="openshift-must-gather-qgjqc/crc-debug-mll9k" Oct 03 15:06:11 crc kubenswrapper[4962]: I1003 15:06:11.647912 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58wqs\" (UniqueName: \"kubernetes.io/projected/0c6f6f8e-52f8-49b3-be6b-1fbef7eb5944-kube-api-access-58wqs\") pod \"crc-debug-mll9k\" (UID: \"0c6f6f8e-52f8-49b3-be6b-1fbef7eb5944\") " pod="openshift-must-gather-qgjqc/crc-debug-mll9k" Oct 03 15:06:11 crc kubenswrapper[4962]: I1003 15:06:11.648066 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0c6f6f8e-52f8-49b3-be6b-1fbef7eb5944-host\") pod \"crc-debug-mll9k\" (UID: \"0c6f6f8e-52f8-49b3-be6b-1fbef7eb5944\") " pod="openshift-must-gather-qgjqc/crc-debug-mll9k" Oct 03 15:06:11 crc kubenswrapper[4962]: I1003 15:06:11.715667 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58wqs\" (UniqueName: \"kubernetes.io/projected/0c6f6f8e-52f8-49b3-be6b-1fbef7eb5944-kube-api-access-58wqs\") pod \"crc-debug-mll9k\" (UID: \"0c6f6f8e-52f8-49b3-be6b-1fbef7eb5944\") " pod="openshift-must-gather-qgjqc/crc-debug-mll9k" Oct 03 15:06:11 crc kubenswrapper[4962]: I1003 15:06:11.825234 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qgjqc/crc-debug-mll9k" Oct 03 15:06:12 crc kubenswrapper[4962]: I1003 15:06:12.410833 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qgjqc/crc-debug-mll9k" event={"ID":"0c6f6f8e-52f8-49b3-be6b-1fbef7eb5944","Type":"ContainerStarted","Data":"3584439fe8bce882041ddeb4b799b4a3f447f33d2c685c898782cbcaa3f65f2f"} Oct 03 15:06:15 crc kubenswrapper[4962]: I1003 15:06:15.440322 4962 generic.go:334] "Generic (PLEG): container finished" podID="656d8f62-9bad-43c6-950e-29614affab2b" containerID="bd5b48b366f02d21b41095be84ceb8107dce6181b25e1c86a9a2d3cd727cb7ba" exitCode=0 Oct 03 15:06:15 crc kubenswrapper[4962]: I1003 15:06:15.440390 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-42rd4" event={"ID":"656d8f62-9bad-43c6-950e-29614affab2b","Type":"ContainerDied","Data":"bd5b48b366f02d21b41095be84ceb8107dce6181b25e1c86a9a2d3cd727cb7ba"} Oct 03 15:06:16 crc kubenswrapper[4962]: I1003 15:06:16.451340 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-42rd4" event={"ID":"656d8f62-9bad-43c6-950e-29614affab2b","Type":"ContainerStarted","Data":"69fb1119eb8798349bc9f5f913437b53b76a5a411b7ae3f169f5ceb82dd166df"} Oct 03 15:06:16 crc kubenswrapper[4962]: I1003 15:06:16.497571 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-42rd4" podStartSLOduration=2.934307866 podStartE2EDuration="8.497551076s" podCreationTimestamp="2025-10-03 15:06:08 +0000 UTC" firstStartedPulling="2025-10-03 15:06:10.376052182 +0000 UTC m=+8178.779950007" lastFinishedPulling="2025-10-03 15:06:15.939295382 +0000 UTC m=+8184.343193217" observedRunningTime="2025-10-03 15:06:16.489490681 +0000 UTC m=+8184.893388516" watchObservedRunningTime="2025-10-03 15:06:16.497551076 +0000 UTC m=+8184.901448911" Oct 03 15:06:18 crc kubenswrapper[4962]: I1003 15:06:18.226951 4962 scope.go:117] "RemoveContainer" containerID="9e969c5b4663b9266408ae81a93dc573a565f6f23975eb2be7d5867d2c0fb330" Oct 03 15:06:18 crc kubenswrapper[4962]: E1003 15:06:18.227742 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" Oct 03 15:06:19 crc kubenswrapper[4962]: I1003 15:06:19.171372 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-42rd4" Oct 03 15:06:19 crc kubenswrapper[4962]: I1003 15:06:19.171426 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-42rd4" Oct 03 15:06:19 crc kubenswrapper[4962]: I1003 15:06:19.229691 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-42rd4" Oct 03 15:06:26 crc kubenswrapper[4962]: I1003 15:06:26.552739 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qgjqc/crc-debug-mll9k" event={"ID":"0c6f6f8e-52f8-49b3-be6b-1fbef7eb5944","Type":"ContainerStarted","Data":"603a053ac2208b0c97c5b4e82974ebc5c17ba4834399cb88065d571d500a909b"} Oct 03 15:06:26 crc kubenswrapper[4962]: I1003 
15:06:26.587524 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qgjqc/crc-debug-mll9k" podStartSLOduration=1.536044959 podStartE2EDuration="15.587503959s" podCreationTimestamp="2025-10-03 15:06:11 +0000 UTC" firstStartedPulling="2025-10-03 15:06:11.865654989 +0000 UTC m=+8180.269552824" lastFinishedPulling="2025-10-03 15:06:25.917113989 +0000 UTC m=+8194.321011824" observedRunningTime="2025-10-03 15:06:26.575449818 +0000 UTC m=+8194.979347643" watchObservedRunningTime="2025-10-03 15:06:26.587503959 +0000 UTC m=+8194.991401794" Oct 03 15:06:29 crc kubenswrapper[4962]: I1003 15:06:29.249325 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-42rd4" Oct 03 15:06:29 crc kubenswrapper[4962]: I1003 15:06:29.304497 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-42rd4"] Oct 03 15:06:29 crc kubenswrapper[4962]: I1003 15:06:29.579974 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-42rd4" podUID="656d8f62-9bad-43c6-950e-29614affab2b" containerName="registry-server" containerID="cri-o://69fb1119eb8798349bc9f5f913437b53b76a5a411b7ae3f169f5ceb82dd166df" gracePeriod=2 Oct 03 15:06:30 crc kubenswrapper[4962]: I1003 15:06:30.171919 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-42rd4" Oct 03 15:06:30 crc kubenswrapper[4962]: I1003 15:06:30.336699 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/656d8f62-9bad-43c6-950e-29614affab2b-catalog-content\") pod \"656d8f62-9bad-43c6-950e-29614affab2b\" (UID: \"656d8f62-9bad-43c6-950e-29614affab2b\") " Oct 03 15:06:30 crc kubenswrapper[4962]: I1003 15:06:30.337310 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/656d8f62-9bad-43c6-950e-29614affab2b-utilities\") pod \"656d8f62-9bad-43c6-950e-29614affab2b\" (UID: \"656d8f62-9bad-43c6-950e-29614affab2b\") " Oct 03 15:06:30 crc kubenswrapper[4962]: I1003 15:06:30.337363 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2sl86\" (UniqueName: \"kubernetes.io/projected/656d8f62-9bad-43c6-950e-29614affab2b-kube-api-access-2sl86\") pod \"656d8f62-9bad-43c6-950e-29614affab2b\" (UID: \"656d8f62-9bad-43c6-950e-29614affab2b\") " Oct 03 15:06:30 crc kubenswrapper[4962]: I1003 15:06:30.337736 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/656d8f62-9bad-43c6-950e-29614affab2b-utilities" (OuterVolumeSpecName: "utilities") pod "656d8f62-9bad-43c6-950e-29614affab2b" (UID: "656d8f62-9bad-43c6-950e-29614affab2b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:06:30 crc kubenswrapper[4962]: I1003 15:06:30.338091 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/656d8f62-9bad-43c6-950e-29614affab2b-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 15:06:30 crc kubenswrapper[4962]: I1003 15:06:30.344071 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/656d8f62-9bad-43c6-950e-29614affab2b-kube-api-access-2sl86" (OuterVolumeSpecName: "kube-api-access-2sl86") pod "656d8f62-9bad-43c6-950e-29614affab2b" (UID: "656d8f62-9bad-43c6-950e-29614affab2b"). InnerVolumeSpecName "kube-api-access-2sl86". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:06:30 crc kubenswrapper[4962]: I1003 15:06:30.394315 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/656d8f62-9bad-43c6-950e-29614affab2b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "656d8f62-9bad-43c6-950e-29614affab2b" (UID: "656d8f62-9bad-43c6-950e-29614affab2b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:06:30 crc kubenswrapper[4962]: I1003 15:06:30.440968 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/656d8f62-9bad-43c6-950e-29614affab2b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 15:06:30 crc kubenswrapper[4962]: I1003 15:06:30.441007 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2sl86\" (UniqueName: \"kubernetes.io/projected/656d8f62-9bad-43c6-950e-29614affab2b-kube-api-access-2sl86\") on node \"crc\" DevicePath \"\"" Oct 03 15:06:30 crc kubenswrapper[4962]: I1003 15:06:30.592759 4962 generic.go:334] "Generic (PLEG): container finished" podID="656d8f62-9bad-43c6-950e-29614affab2b" containerID="69fb1119eb8798349bc9f5f913437b53b76a5a411b7ae3f169f5ceb82dd166df" exitCode=0 Oct 03 15:06:30 crc kubenswrapper[4962]: I1003 15:06:30.592807 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-42rd4" event={"ID":"656d8f62-9bad-43c6-950e-29614affab2b","Type":"ContainerDied","Data":"69fb1119eb8798349bc9f5f913437b53b76a5a411b7ae3f169f5ceb82dd166df"} Oct 03 15:06:30 crc kubenswrapper[4962]: I1003 15:06:30.592835 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-42rd4" Oct 03 15:06:30 crc kubenswrapper[4962]: I1003 15:06:30.592855 4962 scope.go:117] "RemoveContainer" containerID="69fb1119eb8798349bc9f5f913437b53b76a5a411b7ae3f169f5ceb82dd166df" Oct 03 15:06:30 crc kubenswrapper[4962]: I1003 15:06:30.592840 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-42rd4" event={"ID":"656d8f62-9bad-43c6-950e-29614affab2b","Type":"ContainerDied","Data":"d3f2d71b42e12b193b73b6d85988f661480bd7734743d234c35de2308d629221"} Oct 03 15:06:30 crc kubenswrapper[4962]: I1003 15:06:30.621918 4962 scope.go:117] "RemoveContainer" containerID="bd5b48b366f02d21b41095be84ceb8107dce6181b25e1c86a9a2d3cd727cb7ba" Oct 03 15:06:30 crc kubenswrapper[4962]: I1003 15:06:30.630051 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-42rd4"] Oct 03 15:06:30 crc kubenswrapper[4962]: I1003 15:06:30.647953 4962 scope.go:117] "RemoveContainer" containerID="140f38c9eae199c690b714e375992337d943331f688ecc92b6d631567ec24fcb" Oct 03 15:06:30 crc kubenswrapper[4962]: I1003 15:06:30.648348 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-42rd4"] Oct 03 15:06:30 crc kubenswrapper[4962]: I1003 15:06:30.696477 4962 scope.go:117] "RemoveContainer" containerID="69fb1119eb8798349bc9f5f913437b53b76a5a411b7ae3f169f5ceb82dd166df" Oct 03 15:06:30 crc kubenswrapper[4962]: E1003 15:06:30.697067 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69fb1119eb8798349bc9f5f913437b53b76a5a411b7ae3f169f5ceb82dd166df\": container with ID starting with 69fb1119eb8798349bc9f5f913437b53b76a5a411b7ae3f169f5ceb82dd166df not found: ID does not exist" containerID="69fb1119eb8798349bc9f5f913437b53b76a5a411b7ae3f169f5ceb82dd166df" Oct 03 15:06:30 crc kubenswrapper[4962]: I1003 15:06:30.697106 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69fb1119eb8798349bc9f5f913437b53b76a5a411b7ae3f169f5ceb82dd166df"} err="failed to get container status \"69fb1119eb8798349bc9f5f913437b53b76a5a411b7ae3f169f5ceb82dd166df\": rpc error: code = NotFound desc = could not find container \"69fb1119eb8798349bc9f5f913437b53b76a5a411b7ae3f169f5ceb82dd166df\": container with ID starting with 69fb1119eb8798349bc9f5f913437b53b76a5a411b7ae3f169f5ceb82dd166df not found: ID does not exist" Oct 03 15:06:30 crc kubenswrapper[4962]: I1003 15:06:30.697133 4962 scope.go:117] "RemoveContainer" containerID="bd5b48b366f02d21b41095be84ceb8107dce6181b25e1c86a9a2d3cd727cb7ba" Oct 03 15:06:30 crc kubenswrapper[4962]: E1003 15:06:30.697512 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd5b48b366f02d21b41095be84ceb8107dce6181b25e1c86a9a2d3cd727cb7ba\": container with ID starting with bd5b48b366f02d21b41095be84ceb8107dce6181b25e1c86a9a2d3cd727cb7ba not found: ID does not exist" containerID="bd5b48b366f02d21b41095be84ceb8107dce6181b25e1c86a9a2d3cd727cb7ba" Oct 03 15:06:30 crc kubenswrapper[4962]: I1003 15:06:30.697546 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd5b48b366f02d21b41095be84ceb8107dce6181b25e1c86a9a2d3cd727cb7ba"} err="failed to get container status \"bd5b48b366f02d21b41095be84ceb8107dce6181b25e1c86a9a2d3cd727cb7ba\": rpc error: code = NotFound desc = could not find 
container \"bd5b48b366f02d21b41095be84ceb8107dce6181b25e1c86a9a2d3cd727cb7ba\": container with ID starting with bd5b48b366f02d21b41095be84ceb8107dce6181b25e1c86a9a2d3cd727cb7ba not found: ID does not exist" Oct 03 15:06:30 crc kubenswrapper[4962]: I1003 15:06:30.697565 4962 scope.go:117] "RemoveContainer" containerID="140f38c9eae199c690b714e375992337d943331f688ecc92b6d631567ec24fcb" Oct 03 15:06:30 crc kubenswrapper[4962]: E1003 15:06:30.698041 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"140f38c9eae199c690b714e375992337d943331f688ecc92b6d631567ec24fcb\": container with ID starting with 140f38c9eae199c690b714e375992337d943331f688ecc92b6d631567ec24fcb not found: ID does not exist" containerID="140f38c9eae199c690b714e375992337d943331f688ecc92b6d631567ec24fcb" Oct 03 15:06:30 crc kubenswrapper[4962]: I1003 15:06:30.698087 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"140f38c9eae199c690b714e375992337d943331f688ecc92b6d631567ec24fcb"} err="failed to get container status \"140f38c9eae199c690b714e375992337d943331f688ecc92b6d631567ec24fcb\": rpc error: code = NotFound desc = could not find container \"140f38c9eae199c690b714e375992337d943331f688ecc92b6d631567ec24fcb\": container with ID starting with 140f38c9eae199c690b714e375992337d943331f688ecc92b6d631567ec24fcb not found: ID does not exist" Oct 03 15:06:31 crc kubenswrapper[4962]: I1003 15:06:31.227255 4962 scope.go:117] "RemoveContainer" containerID="9e969c5b4663b9266408ae81a93dc573a565f6f23975eb2be7d5867d2c0fb330" Oct 03 15:06:31 crc kubenswrapper[4962]: I1003 15:06:31.492739 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6v8nn"] Oct 03 15:06:31 crc kubenswrapper[4962]: E1003 15:06:31.495507 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="656d8f62-9bad-43c6-950e-29614affab2b" containerName="registry-server" Oct 03 15:06:31 crc kubenswrapper[4962]: I1003 15:06:31.495607 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="656d8f62-9bad-43c6-950e-29614affab2b" containerName="registry-server" Oct 03 15:06:31 crc kubenswrapper[4962]: E1003 15:06:31.495698 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="656d8f62-9bad-43c6-950e-29614affab2b" containerName="extract-content" Oct 03 15:06:31 crc kubenswrapper[4962]: I1003 15:06:31.495756 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="656d8f62-9bad-43c6-950e-29614affab2b" containerName="extract-content" Oct 03 15:06:31 crc kubenswrapper[4962]: E1003 15:06:31.495847 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="656d8f62-9bad-43c6-950e-29614affab2b" containerName="extract-utilities" Oct 03 15:06:31 crc kubenswrapper[4962]: I1003 15:06:31.495905 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="656d8f62-9bad-43c6-950e-29614affab2b" containerName="extract-utilities" Oct 03 15:06:31 crc kubenswrapper[4962]: I1003 15:06:31.496159 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="656d8f62-9bad-43c6-950e-29614affab2b" containerName="registry-server" Oct 03 15:06:31 crc kubenswrapper[4962]: I1003 15:06:31.498107 4962 util.go:30] "No sandbox for pod can be found. 
Oct 03 15:06:31 crc kubenswrapper[4962]: I1003 15:06:31.505482 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6v8nn"]
Oct 03 15:06:31 crc kubenswrapper[4962]: I1003 15:06:31.608567 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerStarted","Data":"aa48c1476389c502f3e31fce1f2bbf3e558dfcb2d299b8314600dcf50b479976"}
Oct 03 15:06:31 crc kubenswrapper[4962]: I1003 15:06:31.667687 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6281a74-70fb-4a33-aed8-0e5c6564b5d2-utilities\") pod \"community-operators-6v8nn\" (UID: \"e6281a74-70fb-4a33-aed8-0e5c6564b5d2\") " pod="openshift-marketplace/community-operators-6v8nn"
Oct 03 15:06:31 crc kubenswrapper[4962]: I1003 15:06:31.667755 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x8kj\" (UniqueName: \"kubernetes.io/projected/e6281a74-70fb-4a33-aed8-0e5c6564b5d2-kube-api-access-8x8kj\") pod \"community-operators-6v8nn\" (UID: \"e6281a74-70fb-4a33-aed8-0e5c6564b5d2\") " pod="openshift-marketplace/community-operators-6v8nn"
Oct 03 15:06:31 crc kubenswrapper[4962]: I1003 15:06:31.667787 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6281a74-70fb-4a33-aed8-0e5c6564b5d2-catalog-content\") pod \"community-operators-6v8nn\" (UID: \"e6281a74-70fb-4a33-aed8-0e5c6564b5d2\") " pod="openshift-marketplace/community-operators-6v8nn"
Oct 03 15:06:31 crc kubenswrapper[4962]: I1003 15:06:31.769246 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6281a74-70fb-4a33-aed8-0e5c6564b5d2-utilities\") pod \"community-operators-6v8nn\" (UID: \"e6281a74-70fb-4a33-aed8-0e5c6564b5d2\") " pod="openshift-marketplace/community-operators-6v8nn"
Oct 03 15:06:31 crc kubenswrapper[4962]: I1003 15:06:31.769573 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x8kj\" (UniqueName: \"kubernetes.io/projected/e6281a74-70fb-4a33-aed8-0e5c6564b5d2-kube-api-access-8x8kj\") pod \"community-operators-6v8nn\" (UID: \"e6281a74-70fb-4a33-aed8-0e5c6564b5d2\") " pod="openshift-marketplace/community-operators-6v8nn"
Oct 03 15:06:31 crc kubenswrapper[4962]: I1003 15:06:31.769692 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6281a74-70fb-4a33-aed8-0e5c6564b5d2-catalog-content\") pod \"community-operators-6v8nn\" (UID: \"e6281a74-70fb-4a33-aed8-0e5c6564b5d2\") " pod="openshift-marketplace/community-operators-6v8nn"
Oct 03 15:06:31 crc kubenswrapper[4962]: I1003 15:06:31.769849 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6281a74-70fb-4a33-aed8-0e5c6564b5d2-utilities\") pod \"community-operators-6v8nn\" (UID: \"e6281a74-70fb-4a33-aed8-0e5c6564b5d2\") " pod="openshift-marketplace/community-operators-6v8nn"
Oct 03 15:06:31 crc kubenswrapper[4962]: I1003 15:06:31.770155 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6281a74-70fb-4a33-aed8-0e5c6564b5d2-catalog-content\") pod \"community-operators-6v8nn\" (UID: \"e6281a74-70fb-4a33-aed8-0e5c6564b5d2\") " pod="openshift-marketplace/community-operators-6v8nn"
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6281a74-70fb-4a33-aed8-0e5c6564b5d2-catalog-content\") pod \"community-operators-6v8nn\" (UID: \"e6281a74-70fb-4a33-aed8-0e5c6564b5d2\") " pod="openshift-marketplace/community-operators-6v8nn" Oct 03 15:06:31 crc kubenswrapper[4962]: I1003 15:06:31.793974 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x8kj\" (UniqueName: \"kubernetes.io/projected/e6281a74-70fb-4a33-aed8-0e5c6564b5d2-kube-api-access-8x8kj\") pod \"community-operators-6v8nn\" (UID: \"e6281a74-70fb-4a33-aed8-0e5c6564b5d2\") " pod="openshift-marketplace/community-operators-6v8nn" Oct 03 15:06:31 crc kubenswrapper[4962]: I1003 15:06:31.820432 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6v8nn" Oct 03 15:06:32 crc kubenswrapper[4962]: I1003 15:06:32.245453 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="656d8f62-9bad-43c6-950e-29614affab2b" path="/var/lib/kubelet/pods/656d8f62-9bad-43c6-950e-29614affab2b/volumes" Oct 03 15:06:34 crc kubenswrapper[4962]: I1003 15:06:34.936442 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6v8nn"] Oct 03 15:06:34 crc kubenswrapper[4962]: W1003 15:06:34.936817 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6281a74_70fb_4a33_aed8_0e5c6564b5d2.slice/crio-8775e9350324e2a43356886dafe0d827d98299359ae70ae3b6facbbd96771a34 WatchSource:0}: Error finding container 8775e9350324e2a43356886dafe0d827d98299359ae70ae3b6facbbd96771a34: Status 404 returned error can't find the container with id 8775e9350324e2a43356886dafe0d827d98299359ae70ae3b6facbbd96771a34 Oct 03 15:06:35 crc kubenswrapper[4962]: I1003 15:06:35.663317 4962 generic.go:334] "Generic (PLEG): container finished" podID="e6281a74-70fb-4a33-aed8-0e5c6564b5d2" containerID="333be3467edf3ca6ed2cc806f278552ff63531b6d4b0e2a1f3e5eeec0805b328" exitCode=0 Oct 03 15:06:35 crc kubenswrapper[4962]: I1003 15:06:35.663402 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6v8nn" event={"ID":"e6281a74-70fb-4a33-aed8-0e5c6564b5d2","Type":"ContainerDied","Data":"333be3467edf3ca6ed2cc806f278552ff63531b6d4b0e2a1f3e5eeec0805b328"} Oct 03 15:06:35 crc kubenswrapper[4962]: I1003 15:06:35.663552 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6v8nn" event={"ID":"e6281a74-70fb-4a33-aed8-0e5c6564b5d2","Type":"ContainerStarted","Data":"8775e9350324e2a43356886dafe0d827d98299359ae70ae3b6facbbd96771a34"} Oct 03 15:06:37 crc kubenswrapper[4962]: I1003 15:06:37.686005 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6v8nn" event={"ID":"e6281a74-70fb-4a33-aed8-0e5c6564b5d2","Type":"ContainerStarted","Data":"fb9196e3eb05a05e7df12d6976551ea5962a4177ea14b1a794879cb35321fa2d"} Oct 03 15:06:39 crc kubenswrapper[4962]: I1003 15:06:39.706142 4962 generic.go:334] "Generic (PLEG): container finished" podID="e6281a74-70fb-4a33-aed8-0e5c6564b5d2" containerID="fb9196e3eb05a05e7df12d6976551ea5962a4177ea14b1a794879cb35321fa2d" exitCode=0 Oct 03 15:06:39 crc kubenswrapper[4962]: I1003 15:06:39.707961 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6v8nn" 
event={"ID":"e6281a74-70fb-4a33-aed8-0e5c6564b5d2","Type":"ContainerDied","Data":"fb9196e3eb05a05e7df12d6976551ea5962a4177ea14b1a794879cb35321fa2d"} Oct 03 15:06:42 crc kubenswrapper[4962]: I1003 15:06:42.737838 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6v8nn" event={"ID":"e6281a74-70fb-4a33-aed8-0e5c6564b5d2","Type":"ContainerStarted","Data":"6806b8481af7b1ffc79f1dd31f8e212e44834f26190e463ad800e85c36da3a2b"} Oct 03 15:06:42 crc kubenswrapper[4962]: I1003 15:06:42.759563 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6v8nn" podStartSLOduration=7.267831015 podStartE2EDuration="11.759546597s" podCreationTimestamp="2025-10-03 15:06:31 +0000 UTC" firstStartedPulling="2025-10-03 15:06:35.665505262 +0000 UTC m=+8204.069403097" lastFinishedPulling="2025-10-03 15:06:40.157220844 +0000 UTC m=+8208.561118679" observedRunningTime="2025-10-03 15:06:42.751310248 +0000 UTC m=+8211.155208083" watchObservedRunningTime="2025-10-03 15:06:42.759546597 +0000 UTC m=+8211.163444422" Oct 03 15:06:51 crc kubenswrapper[4962]: I1003 15:06:51.821217 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6v8nn" Oct 03 15:06:51 crc kubenswrapper[4962]: I1003 15:06:51.821773 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6v8nn" Oct 03 15:06:51 crc kubenswrapper[4962]: I1003 15:06:51.876560 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6v8nn" Oct 03 15:06:51 crc kubenswrapper[4962]: I1003 15:06:51.949105 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6v8nn" Oct 03 15:06:52 crc kubenswrapper[4962]: I1003 15:06:52.120178 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6v8nn"] Oct 03 15:06:53 crc kubenswrapper[4962]: I1003 15:06:53.870691 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6v8nn" podUID="e6281a74-70fb-4a33-aed8-0e5c6564b5d2" containerName="registry-server" containerID="cri-o://6806b8481af7b1ffc79f1dd31f8e212e44834f26190e463ad800e85c36da3a2b" gracePeriod=2 Oct 03 15:06:54 crc kubenswrapper[4962]: I1003 15:06:54.500064 4962 util.go:48] "No ready sandbox for pod can be found. 
Oct 03 15:06:54 crc kubenswrapper[4962]: I1003 15:06:54.544967 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8x8kj\" (UniqueName: \"kubernetes.io/projected/e6281a74-70fb-4a33-aed8-0e5c6564b5d2-kube-api-access-8x8kj\") pod \"e6281a74-70fb-4a33-aed8-0e5c6564b5d2\" (UID: \"e6281a74-70fb-4a33-aed8-0e5c6564b5d2\") "
Oct 03 15:06:54 crc kubenswrapper[4962]: I1003 15:06:54.545081 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6281a74-70fb-4a33-aed8-0e5c6564b5d2-catalog-content\") pod \"e6281a74-70fb-4a33-aed8-0e5c6564b5d2\" (UID: \"e6281a74-70fb-4a33-aed8-0e5c6564b5d2\") "
Oct 03 15:06:54 crc kubenswrapper[4962]: I1003 15:06:54.545181 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6281a74-70fb-4a33-aed8-0e5c6564b5d2-utilities\") pod \"e6281a74-70fb-4a33-aed8-0e5c6564b5d2\" (UID: \"e6281a74-70fb-4a33-aed8-0e5c6564b5d2\") "
Oct 03 15:06:54 crc kubenswrapper[4962]: I1003 15:06:54.546346 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6281a74-70fb-4a33-aed8-0e5c6564b5d2-utilities" (OuterVolumeSpecName: "utilities") pod "e6281a74-70fb-4a33-aed8-0e5c6564b5d2" (UID: "e6281a74-70fb-4a33-aed8-0e5c6564b5d2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 15:06:54 crc kubenswrapper[4962]: I1003 15:06:54.552057 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6281a74-70fb-4a33-aed8-0e5c6564b5d2-kube-api-access-8x8kj" (OuterVolumeSpecName: "kube-api-access-8x8kj") pod "e6281a74-70fb-4a33-aed8-0e5c6564b5d2" (UID: "e6281a74-70fb-4a33-aed8-0e5c6564b5d2"). InnerVolumeSpecName "kube-api-access-8x8kj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 15:06:54 crc kubenswrapper[4962]: I1003 15:06:54.605266 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6281a74-70fb-4a33-aed8-0e5c6564b5d2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e6281a74-70fb-4a33-aed8-0e5c6564b5d2" (UID: "e6281a74-70fb-4a33-aed8-0e5c6564b5d2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:06:54 crc kubenswrapper[4962]: I1003 15:06:54.648148 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8x8kj\" (UniqueName: \"kubernetes.io/projected/e6281a74-70fb-4a33-aed8-0e5c6564b5d2-kube-api-access-8x8kj\") on node \"crc\" DevicePath \"\"" Oct 03 15:06:54 crc kubenswrapper[4962]: I1003 15:06:54.648182 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6281a74-70fb-4a33-aed8-0e5c6564b5d2-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 15:06:54 crc kubenswrapper[4962]: I1003 15:06:54.648191 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6281a74-70fb-4a33-aed8-0e5c6564b5d2-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 15:06:54 crc kubenswrapper[4962]: I1003 15:06:54.882271 4962 generic.go:334] "Generic (PLEG): container finished" podID="e6281a74-70fb-4a33-aed8-0e5c6564b5d2" containerID="6806b8481af7b1ffc79f1dd31f8e212e44834f26190e463ad800e85c36da3a2b" exitCode=0 Oct 03 15:06:54 crc kubenswrapper[4962]: I1003 15:06:54.882313 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6v8nn" event={"ID":"e6281a74-70fb-4a33-aed8-0e5c6564b5d2","Type":"ContainerDied","Data":"6806b8481af7b1ffc79f1dd31f8e212e44834f26190e463ad800e85c36da3a2b"} Oct 03 15:06:54 crc kubenswrapper[4962]: I1003 15:06:54.882339 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6v8nn" event={"ID":"e6281a74-70fb-4a33-aed8-0e5c6564b5d2","Type":"ContainerDied","Data":"8775e9350324e2a43356886dafe0d827d98299359ae70ae3b6facbbd96771a34"} Oct 03 15:06:54 crc kubenswrapper[4962]: I1003 15:06:54.882356 4962 scope.go:117] "RemoveContainer" containerID="6806b8481af7b1ffc79f1dd31f8e212e44834f26190e463ad800e85c36da3a2b" Oct 03 15:06:54 crc kubenswrapper[4962]: I1003 15:06:54.882482 4962 util.go:48] "No ready sandbox for pod can be found. 
Oct 03 15:06:54 crc kubenswrapper[4962]: I1003 15:06:54.918988 4962 scope.go:117] "RemoveContainer" containerID="fb9196e3eb05a05e7df12d6976551ea5962a4177ea14b1a794879cb35321fa2d"
Oct 03 15:06:54 crc kubenswrapper[4962]: I1003 15:06:54.923567 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6v8nn"]
Oct 03 15:06:54 crc kubenswrapper[4962]: I1003 15:06:54.933446 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6v8nn"]
Oct 03 15:06:54 crc kubenswrapper[4962]: I1003 15:06:54.987846 4962 scope.go:117] "RemoveContainer" containerID="333be3467edf3ca6ed2cc806f278552ff63531b6d4b0e2a1f3e5eeec0805b328"
Oct 03 15:06:55 crc kubenswrapper[4962]: I1003 15:06:55.012717 4962 scope.go:117] "RemoveContainer" containerID="6806b8481af7b1ffc79f1dd31f8e212e44834f26190e463ad800e85c36da3a2b"
Oct 03 15:06:55 crc kubenswrapper[4962]: E1003 15:06:55.013264 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6806b8481af7b1ffc79f1dd31f8e212e44834f26190e463ad800e85c36da3a2b\": container with ID starting with 6806b8481af7b1ffc79f1dd31f8e212e44834f26190e463ad800e85c36da3a2b not found: ID does not exist" containerID="6806b8481af7b1ffc79f1dd31f8e212e44834f26190e463ad800e85c36da3a2b"
Oct 03 15:06:55 crc kubenswrapper[4962]: I1003 15:06:55.013296 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6806b8481af7b1ffc79f1dd31f8e212e44834f26190e463ad800e85c36da3a2b"} err="failed to get container status \"6806b8481af7b1ffc79f1dd31f8e212e44834f26190e463ad800e85c36da3a2b\": rpc error: code = NotFound desc = could not find container \"6806b8481af7b1ffc79f1dd31f8e212e44834f26190e463ad800e85c36da3a2b\": container with ID starting with 6806b8481af7b1ffc79f1dd31f8e212e44834f26190e463ad800e85c36da3a2b not found: ID does not exist"
Oct 03 15:06:55 crc kubenswrapper[4962]: I1003 15:06:55.013318 4962 scope.go:117] "RemoveContainer" containerID="fb9196e3eb05a05e7df12d6976551ea5962a4177ea14b1a794879cb35321fa2d"
Oct 03 15:06:55 crc kubenswrapper[4962]: E1003 15:06:55.013548 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb9196e3eb05a05e7df12d6976551ea5962a4177ea14b1a794879cb35321fa2d\": container with ID starting with fb9196e3eb05a05e7df12d6976551ea5962a4177ea14b1a794879cb35321fa2d not found: ID does not exist" containerID="fb9196e3eb05a05e7df12d6976551ea5962a4177ea14b1a794879cb35321fa2d"
Oct 03 15:06:55 crc kubenswrapper[4962]: I1003 15:06:55.013566 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb9196e3eb05a05e7df12d6976551ea5962a4177ea14b1a794879cb35321fa2d"} err="failed to get container status \"fb9196e3eb05a05e7df12d6976551ea5962a4177ea14b1a794879cb35321fa2d\": rpc error: code = NotFound desc = could not find container \"fb9196e3eb05a05e7df12d6976551ea5962a4177ea14b1a794879cb35321fa2d\": container with ID starting with fb9196e3eb05a05e7df12d6976551ea5962a4177ea14b1a794879cb35321fa2d not found: ID does not exist"
Oct 03 15:06:55 crc kubenswrapper[4962]: I1003 15:06:55.013581 4962 scope.go:117] "RemoveContainer" containerID="333be3467edf3ca6ed2cc806f278552ff63531b6d4b0e2a1f3e5eeec0805b328"
Oct 03 15:06:55 crc kubenswrapper[4962]: E1003 15:06:55.013907 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"333be3467edf3ca6ed2cc806f278552ff63531b6d4b0e2a1f3e5eeec0805b328\": container with ID starting with 333be3467edf3ca6ed2cc806f278552ff63531b6d4b0e2a1f3e5eeec0805b328 not found: ID does not exist" containerID="333be3467edf3ca6ed2cc806f278552ff63531b6d4b0e2a1f3e5eeec0805b328"
failed" err="rpc error: code = NotFound desc = could not find container \"333be3467edf3ca6ed2cc806f278552ff63531b6d4b0e2a1f3e5eeec0805b328\": container with ID starting with 333be3467edf3ca6ed2cc806f278552ff63531b6d4b0e2a1f3e5eeec0805b328 not found: ID does not exist" containerID="333be3467edf3ca6ed2cc806f278552ff63531b6d4b0e2a1f3e5eeec0805b328" Oct 03 15:06:55 crc kubenswrapper[4962]: I1003 15:06:55.013926 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"333be3467edf3ca6ed2cc806f278552ff63531b6d4b0e2a1f3e5eeec0805b328"} err="failed to get container status \"333be3467edf3ca6ed2cc806f278552ff63531b6d4b0e2a1f3e5eeec0805b328\": rpc error: code = NotFound desc = could not find container \"333be3467edf3ca6ed2cc806f278552ff63531b6d4b0e2a1f3e5eeec0805b328\": container with ID starting with 333be3467edf3ca6ed2cc806f278552ff63531b6d4b0e2a1f3e5eeec0805b328 not found: ID does not exist" Oct 03 15:06:56 crc kubenswrapper[4962]: I1003 15:06:56.243279 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6281a74-70fb-4a33-aed8-0e5c6564b5d2" path="/var/lib/kubelet/pods/e6281a74-70fb-4a33-aed8-0e5c6564b5d2/volumes" Oct 03 15:07:35 crc kubenswrapper[4962]: I1003 15:07:35.762864 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_147c89e8-3065-48f5-ae75-cb029cd4f447/init-config-reloader/0.log" Oct 03 15:07:35 crc kubenswrapper[4962]: I1003 15:07:35.955225 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_147c89e8-3065-48f5-ae75-cb029cd4f447/init-config-reloader/0.log" Oct 03 15:07:36 crc kubenswrapper[4962]: I1003 15:07:36.040821 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_147c89e8-3065-48f5-ae75-cb029cd4f447/alertmanager/0.log" Oct 03 15:07:36 crc kubenswrapper[4962]: I1003 15:07:36.138434 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_147c89e8-3065-48f5-ae75-cb029cd4f447/config-reloader/0.log" Oct 03 15:07:36 crc kubenswrapper[4962]: I1003 15:07:36.373786 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_57e8140d-8b35-48b0-a27c-4f1279c29f5c/aodh-api/0.log" Oct 03 15:07:36 crc kubenswrapper[4962]: I1003 15:07:36.530787 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_57e8140d-8b35-48b0-a27c-4f1279c29f5c/aodh-evaluator/0.log" Oct 03 15:07:36 crc kubenswrapper[4962]: I1003 15:07:36.588431 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_57e8140d-8b35-48b0-a27c-4f1279c29f5c/aodh-listener/0.log" Oct 03 15:07:36 crc kubenswrapper[4962]: I1003 15:07:36.737693 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_57e8140d-8b35-48b0-a27c-4f1279c29f5c/aodh-notifier/0.log" Oct 03 15:07:36 crc kubenswrapper[4962]: I1003 15:07:36.919523 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6dfcdc79bb-5d7lt_bb24c6d5-37e0-46ab-9a4b-1a19f3c77110/barbican-api/0.log" Oct 03 15:07:37 crc kubenswrapper[4962]: I1003 15:07:37.080751 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6dfcdc79bb-5d7lt_bb24c6d5-37e0-46ab-9a4b-1a19f3c77110/barbican-api-log/0.log" Oct 03 15:07:37 crc kubenswrapper[4962]: I1003 15:07:37.265770 4962 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-keystone-listener-56b6f686d6-9ntzw_cf3b7d67-1f8c-48f6-b2bb-d5652a5129cf/barbican-keystone-listener/0.log" Oct 03 15:07:37 crc kubenswrapper[4962]: I1003 15:07:37.386165 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-56b6f686d6-9ntzw_cf3b7d67-1f8c-48f6-b2bb-d5652a5129cf/barbican-keystone-listener-log/0.log" Oct 03 15:07:37 crc kubenswrapper[4962]: I1003 15:07:37.578026 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-798f8c695f-slvjr_7084aec7-12fb-401b-b866-4066ebe0e546/barbican-worker/0.log" Oct 03 15:07:37 crc kubenswrapper[4962]: I1003 15:07:37.729047 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-798f8c695f-slvjr_7084aec7-12fb-401b-b866-4066ebe0e546/barbican-worker-log/0.log" Oct 03 15:07:37 crc kubenswrapper[4962]: I1003 15:07:37.987741 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-openstack-openstack-cell1-wvgsq_497c3d70-7959-49ea-9acb-d8bd2f301d0a/bootstrap-openstack-openstack-cell1/0.log" Oct 03 15:07:38 crc kubenswrapper[4962]: I1003 15:07:38.112745 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e4905874-9e48-44e5-9c3d-e5e10844a4b2/ceilometer-central-agent/0.log" Oct 03 15:07:38 crc kubenswrapper[4962]: I1003 15:07:38.238407 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e4905874-9e48-44e5-9c3d-e5e10844a4b2/ceilometer-notification-agent/0.log" Oct 03 15:07:38 crc kubenswrapper[4962]: I1003 15:07:38.284841 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e4905874-9e48-44e5-9c3d-e5e10844a4b2/proxy-httpd/0.log" Oct 03 15:07:38 crc kubenswrapper[4962]: I1003 15:07:38.365929 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e4905874-9e48-44e5-9c3d-e5e10844a4b2/sg-core/0.log" Oct 03 15:07:38 crc kubenswrapper[4962]: I1003 15:07:38.490842 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-openstack-openstack-cell1-xr76n_9bdd019a-3501-490b-96cb-67a841f833c1/ceph-client-openstack-openstack-cell1/0.log" Oct 03 15:07:38 crc kubenswrapper[4962]: I1003 15:07:38.983856 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_63a4bfb1-9ada-4b7f-9ebb-a77a028ec821/cinder-api/0.log" Oct 03 15:07:39 crc kubenswrapper[4962]: I1003 15:07:39.037889 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_63a4bfb1-9ada-4b7f-9ebb-a77a028ec821/cinder-api-log/0.log" Oct 03 15:07:39 crc kubenswrapper[4962]: I1003 15:07:39.356389 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_95cd6540-175f-480c-98b9-4864c792528f/probe/0.log" Oct 03 15:07:39 crc kubenswrapper[4962]: I1003 15:07:39.389056 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_95cd6540-175f-480c-98b9-4864c792528f/cinder-backup/0.log" Oct 03 15:07:39 crc kubenswrapper[4962]: I1003 15:07:39.601779 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_74f65f90-9944-4bb5-a7dd-4f8fd8781be1/cinder-scheduler/0.log" Oct 03 15:07:39 crc kubenswrapper[4962]: I1003 15:07:39.669163 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_74f65f90-9944-4bb5-a7dd-4f8fd8781be1/probe/0.log" Oct 03 15:07:39 crc kubenswrapper[4962]: I1003 15:07:39.806137 
Oct 03 15:07:39 crc kubenswrapper[4962]: I1003 15:07:39.917663 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_fef1b0d4-c55c-4ee8-8e5a-e7da5696c273/probe/0.log"
Oct 03 15:07:40 crc kubenswrapper[4962]: I1003 15:07:40.023854 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-openstack-openstack-cell1-xf5xb_3fe030b0-f58c-4b14-b76f-93c6724a902b/configure-network-openstack-openstack-cell1/0.log"
Oct 03 15:07:40 crc kubenswrapper[4962]: I1003 15:07:40.175144 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-openstack-openstack-cell1-ftl9x_bdffe428-c92a-4343-9f35-fd522846891a/configure-os-openstack-openstack-cell1/0.log"
Oct 03 15:07:40 crc kubenswrapper[4962]: I1003 15:07:40.346043 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5b449d676c-2nhlt_57a2d47f-f545-4930-8fa1-4833e0b03da3/init/0.log"
Oct 03 15:07:40 crc kubenswrapper[4962]: I1003 15:07:40.496461 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5b449d676c-2nhlt_57a2d47f-f545-4930-8fa1-4833e0b03da3/init/0.log"
Oct 03 15:07:40 crc kubenswrapper[4962]: I1003 15:07:40.540075 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5b449d676c-2nhlt_57a2d47f-f545-4930-8fa1-4833e0b03da3/dnsmasq-dns/0.log"
Oct 03 15:07:40 crc kubenswrapper[4962]: I1003 15:07:40.709781 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-openstack-openstack-cell1-kv9sn_1ea37c03-872a-453b-8f55-42d377fa11ad/download-cache-openstack-openstack-cell1/0.log"
Oct 03 15:07:40 crc kubenswrapper[4962]: I1003 15:07:40.821669 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_43335121-1384-48d5-b2e8-ba845364de87/glance-httpd/0.log"
Oct 03 15:07:40 crc kubenswrapper[4962]: I1003 15:07:40.865169 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_43335121-1384-48d5-b2e8-ba845364de87/glance-log/0.log"
Oct 03 15:07:41 crc kubenswrapper[4962]: I1003 15:07:41.037102 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_1afb0a2b-e861-4676-a5a6-f762c65ac044/glance-log/0.log"
Oct 03 15:07:41 crc kubenswrapper[4962]: I1003 15:07:41.037358 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_1afb0a2b-e861-4676-a5a6-f762c65ac044/glance-httpd/0.log"
Oct 03 15:07:41 crc kubenswrapper[4962]: I1003 15:07:41.276481 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-69556c5889-9hbl8_c42e9aa3-5ec6-4434-8e77-14ed49101590/heat-api/0.log"
Oct 03 15:07:41 crc kubenswrapper[4962]: I1003 15:07:41.458028 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-dc84b6f66-2tzcd_43f66771-9c85-4e51-81c8-ff54001c8702/heat-cfnapi/0.log"
Oct 03 15:07:41 crc kubenswrapper[4962]: I1003 15:07:41.556416 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-77d94b4755-qkln9_89db9039-38fa-408a-b07b-7fdd4f55c6bd/heat-engine/0.log"
Oct 03 15:07:41 crc kubenswrapper[4962]: I1003 15:07:41.771873 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-675fb576f7-5q7qf_9eeb086e-a213-41d6-af7c-592d92f6feec/horizon/0.log"
path="/var/log/pods/openstack_horizon-675fb576f7-5q7qf_9eeb086e-a213-41d6-af7c-592d92f6feec/horizon/0.log" Oct 03 15:07:41 crc kubenswrapper[4962]: I1003 15:07:41.841461 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-675fb576f7-5q7qf_9eeb086e-a213-41d6-af7c-592d92f6feec/horizon-log/0.log" Oct 03 15:07:41 crc kubenswrapper[4962]: I1003 15:07:41.961890 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-openstack-openstack-cell1-ls7cn_eb12d174-0920-408e-aa2f-ff2f07a1e005/install-certs-openstack-openstack-cell1/0.log" Oct 03 15:07:42 crc kubenswrapper[4962]: I1003 15:07:42.153445 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-openstack-openstack-cell1-6svh8_025f3d9b-acea-44a1-9923-c0a471aedba2/install-os-openstack-openstack-cell1/0.log" Oct 03 15:07:42 crc kubenswrapper[4962]: I1003 15:07:42.405897 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29325061-brkz8_4ec33b45-7142-49fb-9f47-719b72891dc1/keystone-cron/0.log" Oct 03 15:07:42 crc kubenswrapper[4962]: I1003 15:07:42.422157 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-5bcc869749-z5gsq_4d25d167-3bc9-436d-86ad-36da4b9a2a88/keystone-api/0.log" Oct 03 15:07:42 crc kubenswrapper[4962]: I1003 15:07:42.634775 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_0444f80d-1176-4ee9-963d-b3d29af93dea/kube-state-metrics/0.log" Oct 03 15:07:42 crc kubenswrapper[4962]: I1003 15:07:42.826478 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_46631f63-74e6-4de8-9b05-38d7e110e604/manila-api-log/0.log" Oct 03 15:07:42 crc kubenswrapper[4962]: I1003 15:07:42.945243 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_46631f63-74e6-4de8-9b05-38d7e110e604/manila-api/0.log" Oct 03 15:07:43 crc kubenswrapper[4962]: I1003 15:07:43.105355 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_1c57bd1b-33d0-464b-ac4e-204e90cebd48/manila-scheduler/0.log" Oct 03 15:07:43 crc kubenswrapper[4962]: I1003 15:07:43.152652 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_1c57bd1b-33d0-464b-ac4e-204e90cebd48/probe/0.log" Oct 03 15:07:43 crc kubenswrapper[4962]: I1003 15:07:43.341132 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_1b5b9fea-e568-4955-ae40-e3fb2cc52743/probe/0.log" Oct 03 15:07:43 crc kubenswrapper[4962]: I1003 15:07:43.367557 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_1b5b9fea-e568-4955-ae40-e3fb2cc52743/manila-share/0.log" Oct 03 15:07:43 crc kubenswrapper[4962]: I1003 15:07:43.513090 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-copy-data_0454ec44-3669-4a32-b308-104a30554a7d/adoption/0.log" Oct 03 15:07:43 crc kubenswrapper[4962]: I1003 15:07:43.939312 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6bf5469677-7cw9c_fd0baaee-01a4-4b74-8b39-cbcc5aaba7c3/neutron-api/0.log" Oct 03 15:07:44 crc kubenswrapper[4962]: I1003 15:07:44.027746 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6bf5469677-7cw9c_fd0baaee-01a4-4b74-8b39-cbcc5aaba7c3/neutron-httpd/0.log" Oct 03 15:07:44 crc kubenswrapper[4962]: I1003 15:07:44.362734 4962 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-api-0_ef39afc8-f9ff-4a08-8839-4271da194934/nova-api-api/0.log" Oct 03 15:07:44 crc kubenswrapper[4962]: I1003 15:07:44.539115 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_ef39afc8-f9ff-4a08-8839-4271da194934/nova-api-log/0.log" Oct 03 15:07:44 crc kubenswrapper[4962]: I1003 15:07:44.843911 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_a1914699-3c0e-42ac-b63e-7df14da9703d/nova-cell0-conductor-conductor/0.log" Oct 03 15:07:45 crc kubenswrapper[4962]: I1003 15:07:45.205025 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_690ab65f-bd9a-48d4-90f7-245ddeafc5da/nova-cell1-conductor-conductor/0.log" Oct 03 15:07:45 crc kubenswrapper[4962]: I1003 15:07:45.357694 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_971e90e7-4cf4-4259-96f9-4ddd7aaec693/memcached/0.log" Oct 03 15:07:45 crc kubenswrapper[4962]: I1003 15:07:45.437164 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_6c3dd24d-75e3-422a-ac9a-efeee6404f74/nova-cell1-novncproxy-novncproxy/0.log" Oct 03 15:07:45 crc kubenswrapper[4962]: I1003 15:07:45.584891 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_3a699aa9-c62b-4255-ac30-3d9e57fa9905/nova-metadata-log/0.log" Oct 03 15:07:45 crc kubenswrapper[4962]: I1003 15:07:45.698907 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_3a699aa9-c62b-4255-ac30-3d9e57fa9905/nova-metadata-metadata/0.log" Oct 03 15:07:45 crc kubenswrapper[4962]: I1003 15:07:45.812928 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_002ef092-a0a8-4a68-92aa-36a23acb13ee/nova-scheduler-scheduler/0.log" Oct 03 15:07:45 crc kubenswrapper[4962]: I1003 15:07:45.935988 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-6f6fb44c8b-xbxhh_685d50a6-2ad8-4462-9526-193412684ac5/init/0.log" Oct 03 15:07:46 crc kubenswrapper[4962]: I1003 15:07:46.082130 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-6f6fb44c8b-xbxhh_685d50a6-2ad8-4462-9526-193412684ac5/init/0.log" Oct 03 15:07:46 crc kubenswrapper[4962]: I1003 15:07:46.187368 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-6f6fb44c8b-xbxhh_685d50a6-2ad8-4462-9526-193412684ac5/octavia-api-provider-agent/0.log" Oct 03 15:07:46 crc kubenswrapper[4962]: I1003 15:07:46.258355 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-6f6fb44c8b-xbxhh_685d50a6-2ad8-4462-9526-193412684ac5/octavia-api/0.log" Oct 03 15:07:46 crc kubenswrapper[4962]: I1003 15:07:46.379480 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-t4b68_e1d69655-2430-4062-a2d8-da19a77ebd4a/init/0.log" Oct 03 15:07:46 crc kubenswrapper[4962]: I1003 15:07:46.590321 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-t4b68_e1d69655-2430-4062-a2d8-da19a77ebd4a/init/0.log" Oct 03 15:07:46 crc kubenswrapper[4962]: I1003 15:07:46.613938 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-t4b68_e1d69655-2430-4062-a2d8-da19a77ebd4a/octavia-healthmanager/0.log" Oct 03 15:07:46 crc kubenswrapper[4962]: I1003 15:07:46.778287 4962 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_octavia-housekeeping-kkdhs_cc005e9c-8257-4def-8acb-ef953aa375c4/init/0.log" Oct 03 15:07:46 crc kubenswrapper[4962]: I1003 15:07:46.911506 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-kkdhs_cc005e9c-8257-4def-8acb-ef953aa375c4/octavia-housekeeping/0.log" Oct 03 15:07:46 crc kubenswrapper[4962]: I1003 15:07:46.942285 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-kkdhs_cc005e9c-8257-4def-8acb-ef953aa375c4/init/0.log" Oct 03 15:07:47 crc kubenswrapper[4962]: I1003 15:07:47.098095 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-flpcl_a90613f1-d29b-41eb-b925-f28918fbcd2b/init/0.log" Oct 03 15:07:47 crc kubenswrapper[4962]: I1003 15:07:47.246439 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-flpcl_a90613f1-d29b-41eb-b925-f28918fbcd2b/init/0.log" Oct 03 15:07:47 crc kubenswrapper[4962]: I1003 15:07:47.253770 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-flpcl_a90613f1-d29b-41eb-b925-f28918fbcd2b/octavia-rsyslog/0.log" Oct 03 15:07:47 crc kubenswrapper[4962]: I1003 15:07:47.405792 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-dk769_dec89ffb-653b-42d8-a160-fc87029742f7/init/0.log" Oct 03 15:07:47 crc kubenswrapper[4962]: I1003 15:07:47.574118 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-dk769_dec89ffb-653b-42d8-a160-fc87029742f7/init/0.log" Oct 03 15:07:47 crc kubenswrapper[4962]: I1003 15:07:47.755716 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-dk769_dec89ffb-653b-42d8-a160-fc87029742f7/octavia-worker/0.log" Oct 03 15:07:47 crc kubenswrapper[4962]: I1003 15:07:47.787994 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_19d8d7bb-0299-4cc9-95d6-956eec32d04a/mysql-bootstrap/0.log" Oct 03 15:07:47 crc kubenswrapper[4962]: I1003 15:07:47.966576 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_19d8d7bb-0299-4cc9-95d6-956eec32d04a/mysql-bootstrap/0.log" Oct 03 15:07:47 crc kubenswrapper[4962]: I1003 15:07:47.980377 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_19d8d7bb-0299-4cc9-95d6-956eec32d04a/galera/0.log" Oct 03 15:07:48 crc kubenswrapper[4962]: I1003 15:07:48.228449 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_f3e7cb42-0fc6-4aac-aada-41b2f760b5e8/mysql-bootstrap/0.log" Oct 03 15:07:48 crc kubenswrapper[4962]: I1003 15:07:48.339747 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_f3e7cb42-0fc6-4aac-aada-41b2f760b5e8/mysql-bootstrap/0.log" Oct 03 15:07:48 crc kubenswrapper[4962]: I1003 15:07:48.347688 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_f3e7cb42-0fc6-4aac-aada-41b2f760b5e8/galera/0.log" Oct 03 15:07:48 crc kubenswrapper[4962]: I1003 15:07:48.511922 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_4f6ca686-77e2-4a7b-a8c7-bdc02f5f8812/openstackclient/0.log" Oct 03 15:07:48 crc kubenswrapper[4962]: I1003 15:07:48.615603 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-bbqrf_1c5fddb0-c21c-499d-b4c4-61a807e0c392/ovn-controller/0.log" Oct 03 
Oct 03 15:07:48 crc kubenswrapper[4962]: I1003 15:07:48.789292 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-s5l2q_fec5fe67-1a3c-4be8-bd47-7fe5597e9399/openstack-network-exporter/0.log"
Oct 03 15:07:48 crc kubenswrapper[4962]: I1003 15:07:48.908179 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-p7pqn_fbeda2b8-4f84-4dc1-b2d6-d59d604ffaed/ovsdb-server-init/0.log"
Oct 03 15:07:49 crc kubenswrapper[4962]: I1003 15:07:49.051189 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-p7pqn_fbeda2b8-4f84-4dc1-b2d6-d59d604ffaed/ovsdb-server-init/0.log"
Oct 03 15:07:49 crc kubenswrapper[4962]: I1003 15:07:49.073993 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-p7pqn_fbeda2b8-4f84-4dc1-b2d6-d59d604ffaed/ovs-vswitchd/0.log"
Oct 03 15:07:49 crc kubenswrapper[4962]: I1003 15:07:49.084898 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-p7pqn_fbeda2b8-4f84-4dc1-b2d6-d59d604ffaed/ovsdb-server/0.log"
Oct 03 15:07:49 crc kubenswrapper[4962]: I1003 15:07:49.222793 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-copy-data_ffd6efcb-6de8-451d-be3a-b5af7aa5f986/adoption/0.log"
Oct 03 15:07:49 crc kubenswrapper[4962]: I1003 15:07:49.394076 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_abc499fc-1616-4f38-b181-7c26bb38b71a/openstack-network-exporter/0.log"
Oct 03 15:07:49 crc kubenswrapper[4962]: I1003 15:07:49.466506 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_abc499fc-1616-4f38-b181-7c26bb38b71a/ovn-northd/0.log"
Oct 03 15:07:49 crc kubenswrapper[4962]: I1003 15:07:49.600945 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-cell1-7w5ds_0b1c4f80-f94d-401d-8b25-e89a06e993b4/ovn-openstack-openstack-cell1/0.log"
Oct 03 15:07:49 crc kubenswrapper[4962]: I1003 15:07:49.745506 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-cell1-m76lg_811b7680-d11b-4c89-a8bd-d217aa5226ac/ovn-openstack-openstack-cell1/0.log"
Oct 03 15:07:49 crc kubenswrapper[4962]: I1003 15:07:49.946447 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-cell1-rh2r4_b6f674e6-6785-465e-b8ba-85c0efe46fe5/ovn-openstack-openstack-cell1/0.log"
Oct 03 15:07:50 crc kubenswrapper[4962]: I1003 15:07:50.237198 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-cell1-vrkcj_1a534445-267b-424d-bc4e-67a17558f09a/ovn-openstack-openstack-cell1/0.log"
Oct 03 15:07:50 crc kubenswrapper[4962]: I1003 15:07:50.348044 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_8c1870ea-e7e7-4eb0-9ce9-d6ec1b9be43c/openstack-network-exporter/0.log"
Oct 03 15:07:50 crc kubenswrapper[4962]: I1003 15:07:50.449919 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_8c1870ea-e7e7-4eb0-9ce9-d6ec1b9be43c/ovsdbserver-nb/0.log"
Oct 03 15:07:50 crc kubenswrapper[4962]: I1003 15:07:50.528508 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_00d0685e-721c-4362-8758-bb6f4d558db1/openstack-network-exporter/0.log"
Oct 03 15:07:50 crc kubenswrapper[4962]: I1003 15:07:50.620744 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_00d0685e-721c-4362-8758-bb6f4d558db1/ovsdbserver-nb/0.log"
path="/var/log/pods/openstack_ovsdbserver-nb-1_00d0685e-721c-4362-8758-bb6f4d558db1/ovsdbserver-nb/0.log" Oct 03 15:07:50 crc kubenswrapper[4962]: I1003 15:07:50.758194 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_e678ef37-68d9-467f-81ec-bcd62272c6b2/openstack-network-exporter/0.log" Oct 03 15:07:50 crc kubenswrapper[4962]: I1003 15:07:50.788909 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_e678ef37-68d9-467f-81ec-bcd62272c6b2/ovsdbserver-nb/0.log" Oct 03 15:07:50 crc kubenswrapper[4962]: I1003 15:07:50.944921 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e2639e05-a11a-4b8b-8042-462df3d59df7/openstack-network-exporter/0.log" Oct 03 15:07:50 crc kubenswrapper[4962]: I1003 15:07:50.958580 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e2639e05-a11a-4b8b-8042-462df3d59df7/ovsdbserver-sb/0.log" Oct 03 15:07:51 crc kubenswrapper[4962]: I1003 15:07:51.121190 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_5cef6c4f-828f-4438-82a5-c1d42a7624a8/openstack-network-exporter/0.log" Oct 03 15:07:51 crc kubenswrapper[4962]: I1003 15:07:51.163863 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_5cef6c4f-828f-4438-82a5-c1d42a7624a8/ovsdbserver-sb/0.log" Oct 03 15:07:51 crc kubenswrapper[4962]: I1003 15:07:51.329045 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_b64491d9-7298-4635-883b-0e20686dd5a4/openstack-network-exporter/0.log" Oct 03 15:07:51 crc kubenswrapper[4962]: I1003 15:07:51.357762 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_b64491d9-7298-4635-883b-0e20686dd5a4/ovsdbserver-sb/0.log" Oct 03 15:07:51 crc kubenswrapper[4962]: I1003 15:07:51.567777 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-58c9cc65fb-rvbds_6c9b090e-aaa5-4d35-a951-6f1628b6c018/placement-api/0.log" Oct 03 15:07:51 crc kubenswrapper[4962]: I1003 15:07:51.585181 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-58c9cc65fb-rvbds_6c9b090e-aaa5-4d35-a951-6f1628b6c018/placement-log/0.log" Oct 03 15:07:51 crc kubenswrapper[4962]: I1003 15:07:51.730657 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-openstack-c6zxnw_7565767b-b366-48c0-a4ed-5244d16e32a1/pre-adoption-validation-openstack-pre-adoption-openstack-cell1/0.log" Oct 03 15:07:51 crc kubenswrapper[4962]: I1003 15:07:51.920081 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_44378bf5-5a0a-4e9f-98c9-7bd8b42387da/init-config-reloader/0.log" Oct 03 15:07:52 crc kubenswrapper[4962]: I1003 15:07:52.094423 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_44378bf5-5a0a-4e9f-98c9-7bd8b42387da/init-config-reloader/0.log" Oct 03 15:07:52 crc kubenswrapper[4962]: I1003 15:07:52.095453 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_44378bf5-5a0a-4e9f-98c9-7bd8b42387da/config-reloader/0.log" Oct 03 15:07:52 crc kubenswrapper[4962]: I1003 15:07:52.128025 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_44378bf5-5a0a-4e9f-98c9-7bd8b42387da/prometheus/0.log" Oct 03 15:07:52 crc 
Oct 03 15:07:52 crc kubenswrapper[4962]: I1003 15:07:52.323676 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_44378bf5-5a0a-4e9f-98c9-7bd8b42387da/thanos-sidecar/0.log"
Oct 03 15:07:52 crc kubenswrapper[4962]: I1003 15:07:52.373739 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_748862b5-56ab-4601-bdd3-2c7d825f6c96/setup-container/0.log"
Oct 03 15:07:52 crc kubenswrapper[4962]: I1003 15:07:52.524683 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_748862b5-56ab-4601-bdd3-2c7d825f6c96/setup-container/0.log"
Oct 03 15:07:52 crc kubenswrapper[4962]: I1003 15:07:52.524913 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_748862b5-56ab-4601-bdd3-2c7d825f6c96/rabbitmq/0.log"
Oct 03 15:07:52 crc kubenswrapper[4962]: I1003 15:07:52.706613 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_9b27973f-0013-497d-8e30-0f94ffb4e651/setup-container/0.log"
Oct 03 15:07:52 crc kubenswrapper[4962]: I1003 15:07:52.857547 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_9b27973f-0013-497d-8e30-0f94ffb4e651/setup-container/0.log"
Oct 03 15:07:53 crc kubenswrapper[4962]: I1003 15:07:53.084596 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-openstack-openstack-cell1-mvkcw_11c2a64a-42ac-44e7-a6f9-9af9798ce53b/reboot-os-openstack-openstack-cell1/0.log"
Oct 03 15:07:53 crc kubenswrapper[4962]: I1003 15:07:53.325875 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-openstack-openstack-cell1-44p8j_7e32ac65-a775-42ea-9693-20cdd36084cd/run-os-openstack-openstack-cell1/0.log"
Oct 03 15:07:53 crc kubenswrapper[4962]: I1003 15:07:53.577249 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-openstack-jm55c_bd52771d-d70b-4fe5-a1f2-b4347bbf5c15/ssh-known-hosts-openstack/0.log"
Oct 03 15:07:53 crc kubenswrapper[4962]: I1003 15:07:53.954284 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-cell1-bblfh_429b4a0d-c23b-4c71-8844-7ec3cd482a4f/tripleo-cleanup-tripleo-cleanup-openstack-cell1/0.log"
Oct 03 15:07:54 crc kubenswrapper[4962]: I1003 15:07:54.232481 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-openstack-openstack-cell1-6668s_ae303f19-3403-4fc8-93fd-fcb48745e42f/validate-network-openstack-openstack-cell1/0.log"
Oct 03 15:07:54 crc kubenswrapper[4962]: I1003 15:07:54.326963 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_9b27973f-0013-497d-8e30-0f94ffb4e651/rabbitmq/0.log"
Oct 03 15:08:41 crc kubenswrapper[4962]: I1003 15:08:41.045477 4962 generic.go:334] "Generic (PLEG): container finished" podID="0c6f6f8e-52f8-49b3-be6b-1fbef7eb5944" containerID="603a053ac2208b0c97c5b4e82974ebc5c17ba4834399cb88065d571d500a909b" exitCode=0
Oct 03 15:08:41 crc kubenswrapper[4962]: I1003 15:08:41.045588 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qgjqc/crc-debug-mll9k" event={"ID":"0c6f6f8e-52f8-49b3-be6b-1fbef7eb5944","Type":"ContainerDied","Data":"603a053ac2208b0c97c5b4e82974ebc5c17ba4834399cb88065d571d500a909b"}
Oct 03 15:08:42 crc kubenswrapper[4962]: I1003 15:08:42.183623 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qgjqc/crc-debug-mll9k"
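The long "Finished parsing log file" run above walks the per-pod log tree, whose paths encode `<namespace>_<pod>_<uid>/<container>/<restart>.log`. The files themselves hold one CRI-format record per line: an RFC3339Nano timestamp, the stream, a tag (F for a full line, P for a partial one), then the message. A small Go sketch of that framing; the sample line is invented for illustration, only its shape follows the CRI format:

```go
package main

import (
	"fmt"
	"strings"
	"time"
)

// parseCRILine splits one /var/log/pods record into its four CRI fields:
// timestamp, stream (stdout/stderr), tag (F=full, P=partial), message.
func parseCRILine(line string) (ts time.Time, stream, tag, msg string, err error) {
	parts := strings.SplitN(line, " ", 4)
	if len(parts) != 4 {
		return ts, "", "", "", fmt.Errorf("malformed CRI log line: %q", line)
	}
	ts, err = time.Parse(time.RFC3339Nano, parts[0])
	return ts, parts[1], parts[2], parts[3], err
}

func main() {
	// Hypothetical sample in the shape of a /var/log/pods entry.
	line := "2025-10-03T15:07:35.762864000Z stdout F starting config-reloader"
	ts, stream, tag, msg, err := parseCRILine(line)
	if err != nil {
		panic(err)
	}
	fmt.Println(ts.Format(time.RFC3339), stream, tag, msg)
}
```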
Oct 03 15:08:42 crc kubenswrapper[4962]: I1003 15:08:42.221458 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qgjqc/crc-debug-mll9k"]
Oct 03 15:08:42 crc kubenswrapper[4962]: I1003 15:08:42.241953 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qgjqc/crc-debug-mll9k"]
Oct 03 15:08:42 crc kubenswrapper[4962]: I1003 15:08:42.262737 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0c6f6f8e-52f8-49b3-be6b-1fbef7eb5944-host\") pod \"0c6f6f8e-52f8-49b3-be6b-1fbef7eb5944\" (UID: \"0c6f6f8e-52f8-49b3-be6b-1fbef7eb5944\") "
Oct 03 15:08:42 crc kubenswrapper[4962]: I1003 15:08:42.262935 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0c6f6f8e-52f8-49b3-be6b-1fbef7eb5944-host" (OuterVolumeSpecName: "host") pod "0c6f6f8e-52f8-49b3-be6b-1fbef7eb5944" (UID: "0c6f6f8e-52f8-49b3-be6b-1fbef7eb5944"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 03 15:08:42 crc kubenswrapper[4962]: I1003 15:08:42.263064 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58wqs\" (UniqueName: \"kubernetes.io/projected/0c6f6f8e-52f8-49b3-be6b-1fbef7eb5944-kube-api-access-58wqs\") pod \"0c6f6f8e-52f8-49b3-be6b-1fbef7eb5944\" (UID: \"0c6f6f8e-52f8-49b3-be6b-1fbef7eb5944\") "
Oct 03 15:08:42 crc kubenswrapper[4962]: I1003 15:08:42.263584 4962 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0c6f6f8e-52f8-49b3-be6b-1fbef7eb5944-host\") on node \"crc\" DevicePath \"\""
Oct 03 15:08:42 crc kubenswrapper[4962]: I1003 15:08:42.271825 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c6f6f8e-52f8-49b3-be6b-1fbef7eb5944-kube-api-access-58wqs" (OuterVolumeSpecName: "kube-api-access-58wqs") pod "0c6f6f8e-52f8-49b3-be6b-1fbef7eb5944" (UID: "0c6f6f8e-52f8-49b3-be6b-1fbef7eb5944"). InnerVolumeSpecName "kube-api-access-58wqs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 15:08:42 crc kubenswrapper[4962]: I1003 15:08:42.365462 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58wqs\" (UniqueName: \"kubernetes.io/projected/0c6f6f8e-52f8-49b3-be6b-1fbef7eb5944-kube-api-access-58wqs\") on node \"crc\" DevicePath \"\""
Oct 03 15:08:43 crc kubenswrapper[4962]: I1003 15:08:43.064460 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3584439fe8bce882041ddeb4b799b4a3f447f33d2c685c898782cbcaa3f65f2f"
Oct 03 15:08:43 crc kubenswrapper[4962]: I1003 15:08:43.064506 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qgjqc/crc-debug-mll9k"
Oct 03 15:08:43 crc kubenswrapper[4962]: I1003 15:08:43.384356 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qgjqc/crc-debug-htbmh"]
Oct 03 15:08:43 crc kubenswrapper[4962]: E1003 15:08:43.384791 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6281a74-70fb-4a33-aed8-0e5c6564b5d2" containerName="extract-content"
Oct 03 15:08:43 crc kubenswrapper[4962]: I1003 15:08:43.384804 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6281a74-70fb-4a33-aed8-0e5c6564b5d2" containerName="extract-content"
Oct 03 15:08:43 crc kubenswrapper[4962]: E1003 15:08:43.384832 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c6f6f8e-52f8-49b3-be6b-1fbef7eb5944" containerName="container-00"
Oct 03 15:08:43 crc kubenswrapper[4962]: I1003 15:08:43.384837 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c6f6f8e-52f8-49b3-be6b-1fbef7eb5944" containerName="container-00"
Oct 03 15:08:43 crc kubenswrapper[4962]: E1003 15:08:43.384857 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6281a74-70fb-4a33-aed8-0e5c6564b5d2" containerName="registry-server"
Oct 03 15:08:43 crc kubenswrapper[4962]: I1003 15:08:43.384863 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6281a74-70fb-4a33-aed8-0e5c6564b5d2" containerName="registry-server"
Oct 03 15:08:43 crc kubenswrapper[4962]: E1003 15:08:43.384873 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6281a74-70fb-4a33-aed8-0e5c6564b5d2" containerName="extract-utilities"
Oct 03 15:08:43 crc kubenswrapper[4962]: I1003 15:08:43.384879 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6281a74-70fb-4a33-aed8-0e5c6564b5d2" containerName="extract-utilities"
Oct 03 15:08:43 crc kubenswrapper[4962]: I1003 15:08:43.385078 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c6f6f8e-52f8-49b3-be6b-1fbef7eb5944" containerName="container-00"
Oct 03 15:08:43 crc kubenswrapper[4962]: I1003 15:08:43.385098 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6281a74-70fb-4a33-aed8-0e5c6564b5d2" containerName="registry-server"
Oct 03 15:08:43 crc kubenswrapper[4962]: I1003 15:08:43.385895 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qgjqc/crc-debug-htbmh"
Need to start a new one" pod="openshift-must-gather-qgjqc/crc-debug-htbmh" Oct 03 15:08:43 crc kubenswrapper[4962]: I1003 15:08:43.493302 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78v7h\" (UniqueName: \"kubernetes.io/projected/e6031b6b-fd7f-4d98-81cf-1156c8546191-kube-api-access-78v7h\") pod \"crc-debug-htbmh\" (UID: \"e6031b6b-fd7f-4d98-81cf-1156c8546191\") " pod="openshift-must-gather-qgjqc/crc-debug-htbmh" Oct 03 15:08:43 crc kubenswrapper[4962]: I1003 15:08:43.493797 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e6031b6b-fd7f-4d98-81cf-1156c8546191-host\") pod \"crc-debug-htbmh\" (UID: \"e6031b6b-fd7f-4d98-81cf-1156c8546191\") " pod="openshift-must-gather-qgjqc/crc-debug-htbmh" Oct 03 15:08:43 crc kubenswrapper[4962]: I1003 15:08:43.595729 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78v7h\" (UniqueName: \"kubernetes.io/projected/e6031b6b-fd7f-4d98-81cf-1156c8546191-kube-api-access-78v7h\") pod \"crc-debug-htbmh\" (UID: \"e6031b6b-fd7f-4d98-81cf-1156c8546191\") " pod="openshift-must-gather-qgjqc/crc-debug-htbmh" Oct 03 15:08:43 crc kubenswrapper[4962]: I1003 15:08:43.595865 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e6031b6b-fd7f-4d98-81cf-1156c8546191-host\") pod \"crc-debug-htbmh\" (UID: \"e6031b6b-fd7f-4d98-81cf-1156c8546191\") " pod="openshift-must-gather-qgjqc/crc-debug-htbmh" Oct 03 15:08:43 crc kubenswrapper[4962]: I1003 15:08:43.596022 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e6031b6b-fd7f-4d98-81cf-1156c8546191-host\") pod \"crc-debug-htbmh\" (UID: \"e6031b6b-fd7f-4d98-81cf-1156c8546191\") " pod="openshift-must-gather-qgjqc/crc-debug-htbmh" Oct 03 15:08:43 crc kubenswrapper[4962]: I1003 15:08:43.612146 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78v7h\" (UniqueName: \"kubernetes.io/projected/e6031b6b-fd7f-4d98-81cf-1156c8546191-kube-api-access-78v7h\") pod \"crc-debug-htbmh\" (UID: \"e6031b6b-fd7f-4d98-81cf-1156c8546191\") " pod="openshift-must-gather-qgjqc/crc-debug-htbmh" Oct 03 15:08:43 crc kubenswrapper[4962]: I1003 15:08:43.711554 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qgjqc/crc-debug-htbmh" Oct 03 15:08:44 crc kubenswrapper[4962]: I1003 15:08:44.078733 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qgjqc/crc-debug-htbmh" event={"ID":"e6031b6b-fd7f-4d98-81cf-1156c8546191","Type":"ContainerStarted","Data":"f1f4a896cbb5208d12921af80262779d700c1599ac51ba345ba19548c6866232"} Oct 03 15:08:44 crc kubenswrapper[4962]: I1003 15:08:44.079205 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qgjqc/crc-debug-htbmh" event={"ID":"e6031b6b-fd7f-4d98-81cf-1156c8546191","Type":"ContainerStarted","Data":"60c30d08dd40cf0ab49aa4f0799e0bf6796e404c8f68a863a265c1911d258b4e"} Oct 03 15:08:44 crc kubenswrapper[4962]: I1003 15:08:44.099076 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qgjqc/crc-debug-htbmh" podStartSLOduration=1.099051341 podStartE2EDuration="1.099051341s" podCreationTimestamp="2025-10-03 15:08:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:08:44.092663191 +0000 UTC m=+8332.496561026" watchObservedRunningTime="2025-10-03 15:08:44.099051341 +0000 UTC m=+8332.502949176" Oct 03 15:08:44 crc kubenswrapper[4962]: I1003 15:08:44.250352 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c6f6f8e-52f8-49b3-be6b-1fbef7eb5944" path="/var/lib/kubelet/pods/0c6f6f8e-52f8-49b3-be6b-1fbef7eb5944/volumes" Oct 03 15:08:45 crc kubenswrapper[4962]: I1003 15:08:45.088537 4962 generic.go:334] "Generic (PLEG): container finished" podID="e6031b6b-fd7f-4d98-81cf-1156c8546191" containerID="f1f4a896cbb5208d12921af80262779d700c1599ac51ba345ba19548c6866232" exitCode=0 Oct 03 15:08:45 crc kubenswrapper[4962]: I1003 15:08:45.088588 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qgjqc/crc-debug-htbmh" event={"ID":"e6031b6b-fd7f-4d98-81cf-1156c8546191","Type":"ContainerDied","Data":"f1f4a896cbb5208d12921af80262779d700c1599ac51ba345ba19548c6866232"} Oct 03 15:08:46 crc kubenswrapper[4962]: I1003 15:08:46.210500 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qgjqc/crc-debug-htbmh" Oct 03 15:08:46 crc kubenswrapper[4962]: I1003 15:08:46.241647 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78v7h\" (UniqueName: \"kubernetes.io/projected/e6031b6b-fd7f-4d98-81cf-1156c8546191-kube-api-access-78v7h\") pod \"e6031b6b-fd7f-4d98-81cf-1156c8546191\" (UID: \"e6031b6b-fd7f-4d98-81cf-1156c8546191\") " Oct 03 15:08:46 crc kubenswrapper[4962]: I1003 15:08:46.241988 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e6031b6b-fd7f-4d98-81cf-1156c8546191-host\") pod \"e6031b6b-fd7f-4d98-81cf-1156c8546191\" (UID: \"e6031b6b-fd7f-4d98-81cf-1156c8546191\") " Oct 03 15:08:46 crc kubenswrapper[4962]: I1003 15:08:46.242456 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e6031b6b-fd7f-4d98-81cf-1156c8546191-host" (OuterVolumeSpecName: "host") pod "e6031b6b-fd7f-4d98-81cf-1156c8546191" (UID: "e6031b6b-fd7f-4d98-81cf-1156c8546191"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 15:08:46 crc kubenswrapper[4962]: I1003 15:08:46.270055 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6031b6b-fd7f-4d98-81cf-1156c8546191-kube-api-access-78v7h" (OuterVolumeSpecName: "kube-api-access-78v7h") pod "e6031b6b-fd7f-4d98-81cf-1156c8546191" (UID: "e6031b6b-fd7f-4d98-81cf-1156c8546191"). InnerVolumeSpecName "kube-api-access-78v7h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:08:46 crc kubenswrapper[4962]: I1003 15:08:46.344903 4962 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e6031b6b-fd7f-4d98-81cf-1156c8546191-host\") on node \"crc\" DevicePath \"\"" Oct 03 15:08:46 crc kubenswrapper[4962]: I1003 15:08:46.344932 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78v7h\" (UniqueName: \"kubernetes.io/projected/e6031b6b-fd7f-4d98-81cf-1156c8546191-kube-api-access-78v7h\") on node \"crc\" DevicePath \"\"" Oct 03 15:08:47 crc kubenswrapper[4962]: I1003 15:08:47.108723 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qgjqc/crc-debug-htbmh" event={"ID":"e6031b6b-fd7f-4d98-81cf-1156c8546191","Type":"ContainerDied","Data":"60c30d08dd40cf0ab49aa4f0799e0bf6796e404c8f68a863a265c1911d258b4e"} Oct 03 15:08:47 crc kubenswrapper[4962]: I1003 15:08:47.108764 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60c30d08dd40cf0ab49aa4f0799e0bf6796e404c8f68a863a265c1911d258b4e" Oct 03 15:08:47 crc kubenswrapper[4962]: I1003 15:08:47.108794 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qgjqc/crc-debug-htbmh" Oct 03 15:08:53 crc kubenswrapper[4962]: I1003 15:08:53.820132 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qgjqc/crc-debug-htbmh"] Oct 03 15:08:53 crc kubenswrapper[4962]: I1003 15:08:53.831925 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qgjqc/crc-debug-htbmh"] Oct 03 15:08:54 crc kubenswrapper[4962]: I1003 15:08:54.240232 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6031b6b-fd7f-4d98-81cf-1156c8546191" path="/var/lib/kubelet/pods/e6031b6b-fd7f-4d98-81cf-1156c8546191/volumes" Oct 03 15:08:54 crc kubenswrapper[4962]: I1003 15:08:54.660214 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 15:08:54 crc kubenswrapper[4962]: I1003 15:08:54.660507 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 15:08:55 crc kubenswrapper[4962]: I1003 15:08:55.001242 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qgjqc/crc-debug-dcvdq"] Oct 03 15:08:55 crc kubenswrapper[4962]: E1003 15:08:55.001702 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6031b6b-fd7f-4d98-81cf-1156c8546191" containerName="container-00" Oct 03 15:08:55 crc kubenswrapper[4962]: I1003 15:08:55.001714 4962 
state_mem.go:107] "Deleted CPUSet assignment" podUID="e6031b6b-fd7f-4d98-81cf-1156c8546191" containerName="container-00" Oct 03 15:08:55 crc kubenswrapper[4962]: I1003 15:08:55.001923 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6031b6b-fd7f-4d98-81cf-1156c8546191" containerName="container-00" Oct 03 15:08:55 crc kubenswrapper[4962]: I1003 15:08:55.002622 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qgjqc/crc-debug-dcvdq" Oct 03 15:08:55 crc kubenswrapper[4962]: I1003 15:08:55.124488 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5b7c3d77-aa59-40b3-9a83-4469787e8f5b-host\") pod \"crc-debug-dcvdq\" (UID: \"5b7c3d77-aa59-40b3-9a83-4469787e8f5b\") " pod="openshift-must-gather-qgjqc/crc-debug-dcvdq" Oct 03 15:08:55 crc kubenswrapper[4962]: I1003 15:08:55.125243 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c4bg\" (UniqueName: \"kubernetes.io/projected/5b7c3d77-aa59-40b3-9a83-4469787e8f5b-kube-api-access-7c4bg\") pod \"crc-debug-dcvdq\" (UID: \"5b7c3d77-aa59-40b3-9a83-4469787e8f5b\") " pod="openshift-must-gather-qgjqc/crc-debug-dcvdq" Oct 03 15:08:55 crc kubenswrapper[4962]: I1003 15:08:55.227941 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7c4bg\" (UniqueName: \"kubernetes.io/projected/5b7c3d77-aa59-40b3-9a83-4469787e8f5b-kube-api-access-7c4bg\") pod \"crc-debug-dcvdq\" (UID: \"5b7c3d77-aa59-40b3-9a83-4469787e8f5b\") " pod="openshift-must-gather-qgjqc/crc-debug-dcvdq" Oct 03 15:08:55 crc kubenswrapper[4962]: I1003 15:08:55.228103 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5b7c3d77-aa59-40b3-9a83-4469787e8f5b-host\") pod \"crc-debug-dcvdq\" (UID: \"5b7c3d77-aa59-40b3-9a83-4469787e8f5b\") " pod="openshift-must-gather-qgjqc/crc-debug-dcvdq" Oct 03 15:08:55 crc kubenswrapper[4962]: I1003 15:08:55.228229 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5b7c3d77-aa59-40b3-9a83-4469787e8f5b-host\") pod \"crc-debug-dcvdq\" (UID: \"5b7c3d77-aa59-40b3-9a83-4469787e8f5b\") " pod="openshift-must-gather-qgjqc/crc-debug-dcvdq" Oct 03 15:08:55 crc kubenswrapper[4962]: I1003 15:08:55.251490 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c4bg\" (UniqueName: \"kubernetes.io/projected/5b7c3d77-aa59-40b3-9a83-4469787e8f5b-kube-api-access-7c4bg\") pod \"crc-debug-dcvdq\" (UID: \"5b7c3d77-aa59-40b3-9a83-4469787e8f5b\") " pod="openshift-must-gather-qgjqc/crc-debug-dcvdq" Oct 03 15:08:55 crc kubenswrapper[4962]: I1003 15:08:55.322335 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qgjqc/crc-debug-dcvdq" Oct 03 15:08:56 crc kubenswrapper[4962]: I1003 15:08:56.188985 4962 generic.go:334] "Generic (PLEG): container finished" podID="5b7c3d77-aa59-40b3-9a83-4469787e8f5b" containerID="5a06cb0fe031f724133baa744a317a1e995c543a9edc030422ce18e1868a6c67" exitCode=0 Oct 03 15:08:56 crc kubenswrapper[4962]: I1003 15:08:56.189104 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qgjqc/crc-debug-dcvdq" event={"ID":"5b7c3d77-aa59-40b3-9a83-4469787e8f5b","Type":"ContainerDied","Data":"5a06cb0fe031f724133baa744a317a1e995c543a9edc030422ce18e1868a6c67"} Oct 03 15:08:56 crc kubenswrapper[4962]: I1003 15:08:56.189339 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qgjqc/crc-debug-dcvdq" event={"ID":"5b7c3d77-aa59-40b3-9a83-4469787e8f5b","Type":"ContainerStarted","Data":"76b8d89e03efe87fcb516c1930323c45f0f1f4fdd6378b9af8dc69b60d51574e"} Oct 03 15:08:56 crc kubenswrapper[4962]: I1003 15:08:56.249357 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qgjqc/crc-debug-dcvdq"] Oct 03 15:08:56 crc kubenswrapper[4962]: I1003 15:08:56.273967 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qgjqc/crc-debug-dcvdq"] Oct 03 15:08:57 crc kubenswrapper[4962]: I1003 15:08:57.298273 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qgjqc/crc-debug-dcvdq" Oct 03 15:08:57 crc kubenswrapper[4962]: I1003 15:08:57.376702 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4bg\" (UniqueName: \"kubernetes.io/projected/5b7c3d77-aa59-40b3-9a83-4469787e8f5b-kube-api-access-7c4bg\") pod \"5b7c3d77-aa59-40b3-9a83-4469787e8f5b\" (UID: \"5b7c3d77-aa59-40b3-9a83-4469787e8f5b\") " Oct 03 15:08:57 crc kubenswrapper[4962]: I1003 15:08:57.376992 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5b7c3d77-aa59-40b3-9a83-4469787e8f5b-host\") pod \"5b7c3d77-aa59-40b3-9a83-4469787e8f5b\" (UID: \"5b7c3d77-aa59-40b3-9a83-4469787e8f5b\") " Oct 03 15:08:57 crc kubenswrapper[4962]: I1003 15:08:57.377136 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5b7c3d77-aa59-40b3-9a83-4469787e8f5b-host" (OuterVolumeSpecName: "host") pod "5b7c3d77-aa59-40b3-9a83-4469787e8f5b" (UID: "5b7c3d77-aa59-40b3-9a83-4469787e8f5b"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 15:08:57 crc kubenswrapper[4962]: I1003 15:08:57.377603 4962 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5b7c3d77-aa59-40b3-9a83-4469787e8f5b-host\") on node \"crc\" DevicePath \"\"" Oct 03 15:08:57 crc kubenswrapper[4962]: I1003 15:08:57.382720 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b7c3d77-aa59-40b3-9a83-4469787e8f5b-kube-api-access-7c4bg" (OuterVolumeSpecName: "kube-api-access-7c4bg") pod "5b7c3d77-aa59-40b3-9a83-4469787e8f5b" (UID: "5b7c3d77-aa59-40b3-9a83-4469787e8f5b"). InnerVolumeSpecName "kube-api-access-7c4bg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:08:57 crc kubenswrapper[4962]: I1003 15:08:57.479492 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4bg\" (UniqueName: \"kubernetes.io/projected/5b7c3d77-aa59-40b3-9a83-4469787e8f5b-kube-api-access-7c4bg\") on node \"crc\" DevicePath \"\"" Oct 03 15:08:58 crc kubenswrapper[4962]: I1003 15:08:58.213525 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76b8d89e03efe87fcb516c1930323c45f0f1f4fdd6378b9af8dc69b60d51574e" Oct 03 15:08:58 crc kubenswrapper[4962]: I1003 15:08:58.213564 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qgjqc/crc-debug-dcvdq" Oct 03 15:08:58 crc kubenswrapper[4962]: I1003 15:08:58.253377 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b7c3d77-aa59-40b3-9a83-4469787e8f5b" path="/var/lib/kubelet/pods/5b7c3d77-aa59-40b3-9a83-4469787e8f5b/volumes" Oct 03 15:09:01 crc kubenswrapper[4962]: I1003 15:09:01.934551 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6866606fa3a289e0b44cd13ac7038d6356f0a6aa62e0445808c76e969akjxg7_963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb/util/0.log" Oct 03 15:09:02 crc kubenswrapper[4962]: I1003 15:09:02.100933 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6866606fa3a289e0b44cd13ac7038d6356f0a6aa62e0445808c76e969akjxg7_963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb/util/0.log" Oct 03 15:09:02 crc kubenswrapper[4962]: I1003 15:09:02.150456 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6866606fa3a289e0b44cd13ac7038d6356f0a6aa62e0445808c76e969akjxg7_963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb/pull/0.log" Oct 03 15:09:02 crc kubenswrapper[4962]: I1003 15:09:02.221390 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6866606fa3a289e0b44cd13ac7038d6356f0a6aa62e0445808c76e969akjxg7_963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb/pull/0.log" Oct 03 15:09:02 crc kubenswrapper[4962]: I1003 15:09:02.366457 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6866606fa3a289e0b44cd13ac7038d6356f0a6aa62e0445808c76e969akjxg7_963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb/util/0.log" Oct 03 15:09:02 crc kubenswrapper[4962]: I1003 15:09:02.373974 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6866606fa3a289e0b44cd13ac7038d6356f0a6aa62e0445808c76e969akjxg7_963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb/extract/0.log" Oct 03 15:09:02 crc kubenswrapper[4962]: I1003 15:09:02.426665 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6866606fa3a289e0b44cd13ac7038d6356f0a6aa62e0445808c76e969akjxg7_963f86ec-e5fa-43b7-b46a-d9c1e6ad9cdb/pull/0.log" Oct 03 15:09:02 crc kubenswrapper[4962]: I1003 15:09:02.580828 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6c675fb79f-np2sg_141599a7-33ff-4db6-b265-1f0e3407fdf5/kube-rbac-proxy/0.log" Oct 03 15:09:02 crc kubenswrapper[4962]: I1003 15:09:02.698261 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-79d68d6c85-hmld4_e78401d0-e96e-41ff-b4f6-97d72553280b/kube-rbac-proxy/0.log" Oct 03 15:09:02 crc kubenswrapper[4962]: I1003 15:09:02.723102 4962 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6c675fb79f-np2sg_141599a7-33ff-4db6-b265-1f0e3407fdf5/manager/0.log" Oct 03 15:09:02 crc kubenswrapper[4962]: I1003 15:09:02.886510 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-79d68d6c85-hmld4_e78401d0-e96e-41ff-b4f6-97d72553280b/manager/0.log" Oct 03 15:09:02 crc kubenswrapper[4962]: I1003 15:09:02.964959 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-7gwjc_9e4441f1-c997-4e2e-b2b9-ff6e05718dfd/kube-rbac-proxy/0.log" Oct 03 15:09:03 crc kubenswrapper[4962]: I1003 15:09:03.022471 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-7gwjc_9e4441f1-c997-4e2e-b2b9-ff6e05718dfd/manager/0.log" Oct 03 15:09:03 crc kubenswrapper[4962]: I1003 15:09:03.215529 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-846dff85b5-pf6wd_b7991083-22d5-42c8-8df0-6e93acee716b/kube-rbac-proxy/0.log" Oct 03 15:09:03 crc kubenswrapper[4962]: I1003 15:09:03.332078 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-846dff85b5-pf6wd_b7991083-22d5-42c8-8df0-6e93acee716b/manager/0.log" Oct 03 15:09:03 crc kubenswrapper[4962]: I1003 15:09:03.378397 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-599898f689-82zrm_2cf61ca5-fa4e-4f11-bfed-0a81aa7140a6/kube-rbac-proxy/0.log" Oct 03 15:09:03 crc kubenswrapper[4962]: I1003 15:09:03.524144 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-599898f689-82zrm_2cf61ca5-fa4e-4f11-bfed-0a81aa7140a6/manager/0.log" Oct 03 15:09:03 crc kubenswrapper[4962]: I1003 15:09:03.562430 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6769b867d9-6tjz7_df253268-0372-4cff-a9a0-ff9d4d8eac7b/kube-rbac-proxy/0.log" Oct 03 15:09:03 crc kubenswrapper[4962]: I1003 15:09:03.610240 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6769b867d9-6tjz7_df253268-0372-4cff-a9a0-ff9d4d8eac7b/manager/0.log" Oct 03 15:09:03 crc kubenswrapper[4962]: I1003 15:09:03.873030 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5fbf469cd7-n2xlz_e7bd048c-266e-45eb-8755-1bac673b02cc/kube-rbac-proxy/0.log" Oct 03 15:09:04 crc kubenswrapper[4962]: I1003 15:09:04.069212 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-84bc9db6cc-tnwbc_51054362-5c8b-4866-bb58-c0cff476b726/kube-rbac-proxy/0.log" Oct 03 15:09:04 crc kubenswrapper[4962]: I1003 15:09:04.101797 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-84bc9db6cc-tnwbc_51054362-5c8b-4866-bb58-c0cff476b726/manager/0.log" Oct 03 15:09:04 crc kubenswrapper[4962]: I1003 15:09:04.105518 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5fbf469cd7-n2xlz_e7bd048c-266e-45eb-8755-1bac673b02cc/manager/0.log" Oct 03 15:09:04 crc kubenswrapper[4962]: I1003 15:09:04.267544 4962 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7f55849f88-nhcsm_564d1ea9-035c-4ac3-8692-907bf54a2d01/kube-rbac-proxy/0.log" Oct 03 15:09:04 crc kubenswrapper[4962]: I1003 15:09:04.386957 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6fd6854b49-9lpjm_592a09ab-92d9-447f-9978-e9da27cc4df9/kube-rbac-proxy/0.log" Oct 03 15:09:04 crc kubenswrapper[4962]: I1003 15:09:04.436988 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7f55849f88-nhcsm_564d1ea9-035c-4ac3-8692-907bf54a2d01/manager/0.log" Oct 03 15:09:04 crc kubenswrapper[4962]: I1003 15:09:04.521789 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6fd6854b49-9lpjm_592a09ab-92d9-447f-9978-e9da27cc4df9/manager/0.log" Oct 03 15:09:04 crc kubenswrapper[4962]: I1003 15:09:04.649941 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5c468bf4d4-gm9lt_fbda4a74-6131-47cb-9098-f23870f67916/kube-rbac-proxy/0.log" Oct 03 15:09:04 crc kubenswrapper[4962]: I1003 15:09:04.690815 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5c468bf4d4-gm9lt_fbda4a74-6131-47cb-9098-f23870f67916/manager/0.log" Oct 03 15:09:04 crc kubenswrapper[4962]: I1003 15:09:04.833705 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6574bf987d-7f4r6_a062596b-8e37-4963-bf83-e37ed388bf83/kube-rbac-proxy/0.log" Oct 03 15:09:04 crc kubenswrapper[4962]: I1003 15:09:04.939539 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6574bf987d-7f4r6_a062596b-8e37-4963-bf83-e37ed388bf83/manager/0.log" Oct 03 15:09:04 crc kubenswrapper[4962]: I1003 15:09:04.980012 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-555c7456bd-r52nf_49eb417d-aaa0-4aab-b107-becad30c4185/kube-rbac-proxy/0.log" Oct 03 15:09:05 crc kubenswrapper[4962]: I1003 15:09:05.185243 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-59d6cfdf45-rdflf_9a84d9fa-a8e7-4280-b500-d4f68500e13e/kube-rbac-proxy/0.log" Oct 03 15:09:05 crc kubenswrapper[4962]: I1003 15:09:05.223129 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-555c7456bd-r52nf_49eb417d-aaa0-4aab-b107-becad30c4185/manager/0.log" Oct 03 15:09:05 crc kubenswrapper[4962]: I1003 15:09:05.298192 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-59d6cfdf45-rdflf_9a84d9fa-a8e7-4280-b500-d4f68500e13e/manager/0.log" Oct 03 15:09:05 crc kubenswrapper[4962]: I1003 15:09:05.427116 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6f64c4d678jlpx7_c0193fb9-bb01-4730-a986-7f03c3b61887/manager/0.log" Oct 03 15:09:05 crc kubenswrapper[4962]: I1003 15:09:05.476172 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6f64c4d678jlpx7_c0193fb9-bb01-4730-a986-7f03c3b61887/kube-rbac-proxy/0.log" Oct 03 15:09:05 crc kubenswrapper[4962]: 
I1003 15:09:05.871285 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5c4446bf96-p5f6q_0786c124-4dcc-437d-8f1b-80021feb3553/kube-rbac-proxy/0.log" Oct 03 15:09:05 crc kubenswrapper[4962]: I1003 15:09:05.919204 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-764f84468b-skw2k_1b037b6e-0364-4f71-852b-b10f28b37915/kube-rbac-proxy/0.log" Oct 03 15:09:06 crc kubenswrapper[4962]: I1003 15:09:06.128891 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-764f84468b-skw2k_1b037b6e-0364-4f71-852b-b10f28b37915/operator/0.log" Oct 03 15:09:06 crc kubenswrapper[4962]: I1003 15:09:06.134557 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-mbh5p_703db32b-fa73-48ac-834d-addaf46d6293/registry-server/0.log" Oct 03 15:09:06 crc kubenswrapper[4962]: I1003 15:09:06.404663 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-688db7b6c7-q95dg_7ada00d0-e4a6-46a9-872b-e554866a03c6/kube-rbac-proxy/0.log" Oct 03 15:09:06 crc kubenswrapper[4962]: I1003 15:09:06.462041 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-7d8bb7f44c-jq2cl_22736612-9f52-42a5-a773-4389ae1473d0/kube-rbac-proxy/0.log" Oct 03 15:09:06 crc kubenswrapper[4962]: I1003 15:09:06.502943 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-688db7b6c7-q95dg_7ada00d0-e4a6-46a9-872b-e554866a03c6/manager/0.log" Oct 03 15:09:06 crc kubenswrapper[4962]: I1003 15:09:06.602260 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-7d8bb7f44c-jq2cl_22736612-9f52-42a5-a773-4389ae1473d0/manager/0.log" Oct 03 15:09:06 crc kubenswrapper[4962]: I1003 15:09:06.694025 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-j758z_908550fb-fd0f-4363-aaab-3434aae03751/operator/0.log" Oct 03 15:09:06 crc kubenswrapper[4962]: I1003 15:09:06.820731 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-tgrxn_da3763da-ed79-40cc-bf74-10729759437e/kube-rbac-proxy/0.log" Oct 03 15:09:06 crc kubenswrapper[4962]: I1003 15:09:06.994262 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-tgrxn_da3763da-ed79-40cc-bf74-10729759437e/manager/0.log" Oct 03 15:09:07 crc kubenswrapper[4962]: I1003 15:09:07.067602 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5db5cf686f-pgf8s_74fa1f6e-ee12-4c7c-97f0-23586ec9983c/kube-rbac-proxy/0.log" Oct 03 15:09:07 crc kubenswrapper[4962]: I1003 15:09:07.263696 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5db5cf686f-pgf8s_74fa1f6e-ee12-4c7c-97f0-23586ec9983c/manager/0.log" Oct 03 15:09:07 crc kubenswrapper[4962]: I1003 15:09:07.267875 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-2k5gq_0708bcfe-3c0e-44d8-b849-fa4464ea3387/kube-rbac-proxy/0.log" Oct 03 15:09:07 crc 
kubenswrapper[4962]: I1003 15:09:07.333004 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-2k5gq_0708bcfe-3c0e-44d8-b849-fa4464ea3387/manager/0.log" Oct 03 15:09:07 crc kubenswrapper[4962]: I1003 15:09:07.492587 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-fcd7d9895-2fvpg_0f094302-eaab-4160-b549-530131588472/kube-rbac-proxy/0.log" Oct 03 15:09:07 crc kubenswrapper[4962]: I1003 15:09:07.535388 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-fcd7d9895-2fvpg_0f094302-eaab-4160-b549-530131588472/manager/0.log" Oct 03 15:09:08 crc kubenswrapper[4962]: I1003 15:09:08.020158 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5c4446bf96-p5f6q_0786c124-4dcc-437d-8f1b-80021feb3553/manager/0.log" Oct 03 15:09:22 crc kubenswrapper[4962]: I1003 15:09:22.858482 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-pbd2f_ec276030-49bd-4751-93f2-456157bd157d/control-plane-machine-set-operator/0.log" Oct 03 15:09:22 crc kubenswrapper[4962]: I1003 15:09:22.952449 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-5zs6v_201d1b81-b0bd-4584-9406-77ac0888ae49/kube-rbac-proxy/0.log" Oct 03 15:09:23 crc kubenswrapper[4962]: I1003 15:09:23.043277 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-5zs6v_201d1b81-b0bd-4584-9406-77ac0888ae49/machine-api-operator/0.log" Oct 03 15:09:24 crc kubenswrapper[4962]: I1003 15:09:24.659305 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 15:09:24 crc kubenswrapper[4962]: I1003 15:09:24.659584 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 15:09:33 crc kubenswrapper[4962]: I1003 15:09:33.871417 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-7d4cc89fcb-pkf2x_32c9743f-647d-44e9-8aa4-5e88b24f6452/cert-manager-controller/0.log" Oct 03 15:09:33 crc kubenswrapper[4962]: I1003 15:09:33.979240 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7d9f95dbf-q25jk_fcf33643-9678-4f5e-958a-a1df723d0497/cert-manager-cainjector/0.log" Oct 03 15:09:34 crc kubenswrapper[4962]: I1003 15:09:34.061358 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-d969966f-sfwbb_3429db32-6f0c-4b9c-8c3f-fd257b45d362/cert-manager-webhook/0.log" Oct 03 15:09:44 crc kubenswrapper[4962]: I1003 15:09:44.616985 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-fzscf_f87d275d-37cc-496a-bb07-d50b2905a494/nmstate-console-plugin/0.log" Oct 03 15:09:44 crc kubenswrapper[4962]: I1003 
15:09:44.858883 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-rrlzt_08ef24b1-9d00-4fa5-92e7-b28d3c1795c8/kube-rbac-proxy/0.log" Oct 03 15:09:44 crc kubenswrapper[4962]: I1003 15:09:44.867007 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-7wnjl_ce2302c0-a558-4b5e-bf69-40c48e9c4e31/nmstate-handler/0.log" Oct 03 15:09:44 crc kubenswrapper[4962]: I1003 15:09:44.917122 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-rrlzt_08ef24b1-9d00-4fa5-92e7-b28d3c1795c8/nmstate-metrics/0.log" Oct 03 15:09:45 crc kubenswrapper[4962]: I1003 15:09:45.040489 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-klzjc_608a97cf-8aef-44fd-aca4-57c6a896b7c8/nmstate-operator/0.log" Oct 03 15:09:45 crc kubenswrapper[4962]: I1003 15:09:45.129849 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-xmr88_b7a87281-61a2-478d-8ddf-faa13fcd21e4/nmstate-webhook/0.log" Oct 03 15:09:54 crc kubenswrapper[4962]: I1003 15:09:54.660147 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 15:09:54 crc kubenswrapper[4962]: I1003 15:09:54.660739 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 15:09:54 crc kubenswrapper[4962]: I1003 15:09:54.660791 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-46vck" Oct 03 15:09:54 crc kubenswrapper[4962]: I1003 15:09:54.661651 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"aa48c1476389c502f3e31fce1f2bbf3e558dfcb2d299b8314600dcf50b479976"} pod="openshift-machine-config-operator/machine-config-daemon-46vck" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 15:09:54 crc kubenswrapper[4962]: I1003 15:09:54.661720 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" containerID="cri-o://aa48c1476389c502f3e31fce1f2bbf3e558dfcb2d299b8314600dcf50b479976" gracePeriod=600 Oct 03 15:09:55 crc kubenswrapper[4962]: I1003 15:09:55.768075 4962 generic.go:334] "Generic (PLEG): container finished" podID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerID="aa48c1476389c502f3e31fce1f2bbf3e558dfcb2d299b8314600dcf50b479976" exitCode=0 Oct 03 15:09:55 crc kubenswrapper[4962]: I1003 15:09:55.768717 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerDied","Data":"aa48c1476389c502f3e31fce1f2bbf3e558dfcb2d299b8314600dcf50b479976"} Oct 03 15:09:55 crc kubenswrapper[4962]: I1003 
15:09:55.768746 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerStarted","Data":"9f15ec2ae1359afce828df84095ed43d5bdf9f8fbc6133a7315f8997b8a760e5"} Oct 03 15:09:55 crc kubenswrapper[4962]: I1003 15:09:55.768763 4962 scope.go:117] "RemoveContainer" containerID="9e969c5b4663b9266408ae81a93dc573a565f6f23975eb2be7d5867d2c0fb330" Oct 03 15:09:59 crc kubenswrapper[4962]: I1003 15:09:59.413796 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-n7d8j_25aa504c-2ffc-462d-b138-c89c0f3083ce/kube-rbac-proxy/0.log" Oct 03 15:09:59 crc kubenswrapper[4962]: I1003 15:09:59.710250 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ftmjr_ae7d2439-9f77-42b9-8b22-ca0980f9650d/cp-frr-files/0.log" Oct 03 15:09:59 crc kubenswrapper[4962]: I1003 15:09:59.923504 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-n7d8j_25aa504c-2ffc-462d-b138-c89c0f3083ce/controller/0.log" Oct 03 15:09:59 crc kubenswrapper[4962]: I1003 15:09:59.945930 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ftmjr_ae7d2439-9f77-42b9-8b22-ca0980f9650d/cp-frr-files/0.log" Oct 03 15:09:59 crc kubenswrapper[4962]: I1003 15:09:59.994895 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ftmjr_ae7d2439-9f77-42b9-8b22-ca0980f9650d/cp-reloader/0.log" Oct 03 15:10:00 crc kubenswrapper[4962]: I1003 15:10:00.017408 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ftmjr_ae7d2439-9f77-42b9-8b22-ca0980f9650d/cp-metrics/0.log" Oct 03 15:10:00 crc kubenswrapper[4962]: I1003 15:10:00.131509 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ftmjr_ae7d2439-9f77-42b9-8b22-ca0980f9650d/cp-reloader/0.log" Oct 03 15:10:00 crc kubenswrapper[4962]: I1003 15:10:00.272341 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ftmjr_ae7d2439-9f77-42b9-8b22-ca0980f9650d/cp-reloader/0.log" Oct 03 15:10:00 crc kubenswrapper[4962]: I1003 15:10:00.291630 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ftmjr_ae7d2439-9f77-42b9-8b22-ca0980f9650d/cp-metrics/0.log" Oct 03 15:10:00 crc kubenswrapper[4962]: I1003 15:10:00.322808 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ftmjr_ae7d2439-9f77-42b9-8b22-ca0980f9650d/cp-frr-files/0.log" Oct 03 15:10:00 crc kubenswrapper[4962]: I1003 15:10:00.368314 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ftmjr_ae7d2439-9f77-42b9-8b22-ca0980f9650d/cp-metrics/0.log" Oct 03 15:10:00 crc kubenswrapper[4962]: I1003 15:10:00.522322 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ftmjr_ae7d2439-9f77-42b9-8b22-ca0980f9650d/cp-frr-files/0.log" Oct 03 15:10:00 crc kubenswrapper[4962]: I1003 15:10:00.526811 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ftmjr_ae7d2439-9f77-42b9-8b22-ca0980f9650d/cp-reloader/0.log" Oct 03 15:10:00 crc kubenswrapper[4962]: I1003 15:10:00.541175 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ftmjr_ae7d2439-9f77-42b9-8b22-ca0980f9650d/cp-metrics/0.log" Oct 03 15:10:00 crc kubenswrapper[4962]: I1003 15:10:00.595512 4962 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ftmjr_ae7d2439-9f77-42b9-8b22-ca0980f9650d/controller/0.log" Oct 03 15:10:00 crc kubenswrapper[4962]: I1003 15:10:00.769611 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ftmjr_ae7d2439-9f77-42b9-8b22-ca0980f9650d/kube-rbac-proxy/0.log" Oct 03 15:10:00 crc kubenswrapper[4962]: I1003 15:10:00.787665 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ftmjr_ae7d2439-9f77-42b9-8b22-ca0980f9650d/frr-metrics/0.log" Oct 03 15:10:00 crc kubenswrapper[4962]: I1003 15:10:00.805781 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ftmjr_ae7d2439-9f77-42b9-8b22-ca0980f9650d/kube-rbac-proxy-frr/0.log" Oct 03 15:10:01 crc kubenswrapper[4962]: I1003 15:10:01.071466 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-7t4l5_aa1a4478-7484-4891-9130-3454b87cb614/frr-k8s-webhook-server/0.log" Oct 03 15:10:01 crc kubenswrapper[4962]: I1003 15:10:01.082629 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ftmjr_ae7d2439-9f77-42b9-8b22-ca0980f9650d/reloader/0.log" Oct 03 15:10:01 crc kubenswrapper[4962]: I1003 15:10:01.334835 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6b44475b5f-pk5nl_513e7194-e391-4d8d-bb6a-520346bc5aa1/manager/0.log" Oct 03 15:10:01 crc kubenswrapper[4962]: I1003 15:10:01.490035 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-69d7579867-lwlnc_c9f2a4e9-25bc-4236-a860-96878c0dddd3/webhook-server/0.log" Oct 03 15:10:01 crc kubenswrapper[4962]: I1003 15:10:01.618703 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-t5hhv_b4f8f5c2-fdb9-4905-8173-c6c709d8565f/kube-rbac-proxy/0.log" Oct 03 15:10:02 crc kubenswrapper[4962]: I1003 15:10:02.571713 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-t5hhv_b4f8f5c2-fdb9-4905-8173-c6c709d8565f/speaker/0.log" Oct 03 15:10:03 crc kubenswrapper[4962]: I1003 15:10:03.664952 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ftmjr_ae7d2439-9f77-42b9-8b22-ca0980f9650d/frr/0.log" Oct 03 15:10:14 crc kubenswrapper[4962]: I1003 15:10:14.762931 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69tq6zz_1559ba35-102a-4157-8499-10dcbc2241d8/util/0.log" Oct 03 15:10:14 crc kubenswrapper[4962]: I1003 15:10:14.909342 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69tq6zz_1559ba35-102a-4157-8499-10dcbc2241d8/pull/0.log" Oct 03 15:10:14 crc kubenswrapper[4962]: I1003 15:10:14.949462 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69tq6zz_1559ba35-102a-4157-8499-10dcbc2241d8/util/0.log" Oct 03 15:10:14 crc kubenswrapper[4962]: I1003 15:10:14.954782 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69tq6zz_1559ba35-102a-4157-8499-10dcbc2241d8/pull/0.log" Oct 03 15:10:15 crc kubenswrapper[4962]: I1003 15:10:15.071573 4962 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69tq6zz_1559ba35-102a-4157-8499-10dcbc2241d8/util/0.log" Oct 03 15:10:15 crc kubenswrapper[4962]: I1003 15:10:15.109618 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69tq6zz_1559ba35-102a-4157-8499-10dcbc2241d8/pull/0.log" Oct 03 15:10:15 crc kubenswrapper[4962]: I1003 15:10:15.157973 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69tq6zz_1559ba35-102a-4157-8499-10dcbc2241d8/extract/0.log" Oct 03 15:10:15 crc kubenswrapper[4962]: I1003 15:10:15.238795 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2twpl4_eba30427-49e1-469f-8d80-bacb89f94b5a/util/0.log" Oct 03 15:10:15 crc kubenswrapper[4962]: I1003 15:10:15.386993 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2twpl4_eba30427-49e1-469f-8d80-bacb89f94b5a/util/0.log" Oct 03 15:10:15 crc kubenswrapper[4962]: I1003 15:10:15.418055 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2twpl4_eba30427-49e1-469f-8d80-bacb89f94b5a/pull/0.log" Oct 03 15:10:15 crc kubenswrapper[4962]: I1003 15:10:15.470367 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2twpl4_eba30427-49e1-469f-8d80-bacb89f94b5a/pull/0.log" Oct 03 15:10:15 crc kubenswrapper[4962]: I1003 15:10:15.587362 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2twpl4_eba30427-49e1-469f-8d80-bacb89f94b5a/util/0.log" Oct 03 15:10:15 crc kubenswrapper[4962]: I1003 15:10:15.594612 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2twpl4_eba30427-49e1-469f-8d80-bacb89f94b5a/pull/0.log" Oct 03 15:10:15 crc kubenswrapper[4962]: I1003 15:10:15.602121 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2twpl4_eba30427-49e1-469f-8d80-bacb89f94b5a/extract/0.log" Oct 03 15:10:15 crc kubenswrapper[4962]: I1003 15:10:15.754079 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dj2k5b_96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f/util/0.log" Oct 03 15:10:15 crc kubenswrapper[4962]: I1003 15:10:15.948751 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dj2k5b_96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f/pull/0.log" Oct 03 15:10:15 crc kubenswrapper[4962]: I1003 15:10:15.948829 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dj2k5b_96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f/pull/0.log" Oct 03 15:10:15 crc kubenswrapper[4962]: I1003 15:10:15.985089 4962 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dj2k5b_96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f/util/0.log" Oct 03 15:10:16 crc kubenswrapper[4962]: I1003 15:10:16.114573 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dj2k5b_96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f/util/0.log" Oct 03 15:10:16 crc kubenswrapper[4962]: I1003 15:10:16.142702 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dj2k5b_96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f/pull/0.log" Oct 03 15:10:16 crc kubenswrapper[4962]: I1003 15:10:16.159864 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dj2k5b_96ddfe55-8a74-4f1b-96d6-5da6bfeb8e6f/extract/0.log" Oct 03 15:10:16 crc kubenswrapper[4962]: I1003 15:10:16.297301 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9zccd_d8fdbf9f-0043-4acd-9221-4aafad299c55/extract-utilities/0.log" Oct 03 15:10:16 crc kubenswrapper[4962]: I1003 15:10:16.467631 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9zccd_d8fdbf9f-0043-4acd-9221-4aafad299c55/extract-content/0.log" Oct 03 15:10:16 crc kubenswrapper[4962]: I1003 15:10:16.471832 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9zccd_d8fdbf9f-0043-4acd-9221-4aafad299c55/extract-utilities/0.log" Oct 03 15:10:16 crc kubenswrapper[4962]: I1003 15:10:16.493028 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9zccd_d8fdbf9f-0043-4acd-9221-4aafad299c55/extract-content/0.log" Oct 03 15:10:16 crc kubenswrapper[4962]: I1003 15:10:16.715200 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9zccd_d8fdbf9f-0043-4acd-9221-4aafad299c55/extract-utilities/0.log" Oct 03 15:10:16 crc kubenswrapper[4962]: I1003 15:10:16.751519 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9zccd_d8fdbf9f-0043-4acd-9221-4aafad299c55/extract-content/0.log" Oct 03 15:10:16 crc kubenswrapper[4962]: I1003 15:10:16.958331 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bz646_e82e33da-9e3c-4236-864b-ef04b7998a89/extract-utilities/0.log" Oct 03 15:10:17 crc kubenswrapper[4962]: I1003 15:10:17.202952 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bz646_e82e33da-9e3c-4236-864b-ef04b7998a89/extract-utilities/0.log" Oct 03 15:10:17 crc kubenswrapper[4962]: I1003 15:10:17.229032 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bz646_e82e33da-9e3c-4236-864b-ef04b7998a89/extract-content/0.log" Oct 03 15:10:17 crc kubenswrapper[4962]: I1003 15:10:17.263006 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bz646_e82e33da-9e3c-4236-864b-ef04b7998a89/extract-content/0.log" Oct 03 15:10:17 crc kubenswrapper[4962]: I1003 15:10:17.511412 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bz646_e82e33da-9e3c-4236-864b-ef04b7998a89/extract-utilities/0.log" Oct 03 
15:10:17 crc kubenswrapper[4962]: I1003 15:10:17.516771 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bz646_e82e33da-9e3c-4236-864b-ef04b7998a89/extract-content/0.log"
Oct 03 15:10:17 crc kubenswrapper[4962]: I1003 15:10:17.715279 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9zccd_d8fdbf9f-0043-4acd-9221-4aafad299c55/registry-server/0.log"
Oct 03 15:10:17 crc kubenswrapper[4962]: I1003 15:10:17.763447 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjv8jt_f4861596-c6c9-4979-b4d8-fe9858724265/util/0.log"
Oct 03 15:10:18 crc kubenswrapper[4962]: I1003 15:10:18.070649 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjv8jt_f4861596-c6c9-4979-b4d8-fe9858724265/util/0.log"
Oct 03 15:10:18 crc kubenswrapper[4962]: I1003 15:10:18.210969 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjv8jt_f4861596-c6c9-4979-b4d8-fe9858724265/pull/0.log"
Oct 03 15:10:18 crc kubenswrapper[4962]: I1003 15:10:18.263180 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjv8jt_f4861596-c6c9-4979-b4d8-fe9858724265/pull/0.log"
Oct 03 15:10:18 crc kubenswrapper[4962]: I1003 15:10:18.448727 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjv8jt_f4861596-c6c9-4979-b4d8-fe9858724265/pull/0.log"
Oct 03 15:10:18 crc kubenswrapper[4962]: I1003 15:10:18.477190 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjv8jt_f4861596-c6c9-4979-b4d8-fe9858724265/util/0.log"
Oct 03 15:10:18 crc kubenswrapper[4962]: I1003 15:10:18.523224 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjv8jt_f4861596-c6c9-4979-b4d8-fe9858724265/extract/0.log"
Oct 03 15:10:18 crc kubenswrapper[4962]: I1003 15:10:18.698886 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bz646_e82e33da-9e3c-4236-864b-ef04b7998a89/registry-server/0.log"
Oct 03 15:10:18 crc kubenswrapper[4962]: I1003 15:10:18.724391 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-snkcz_9908d2b4-0cf3-4635-855c-39eb963de62c/marketplace-operator/0.log"
Oct 03 15:10:18 crc kubenswrapper[4962]: I1003 15:10:18.742475 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zx6zr_aa2a143e-ac08-45ee-9c99-bf61bd19a9e5/extract-utilities/0.log"
Oct 03 15:10:18 crc kubenswrapper[4962]: I1003 15:10:18.915495 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zx6zr_aa2a143e-ac08-45ee-9c99-bf61bd19a9e5/extract-content/0.log"
Oct 03 15:10:18 crc kubenswrapper[4962]: I1003 15:10:18.931645 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zx6zr_aa2a143e-ac08-45ee-9c99-bf61bd19a9e5/extract-content/0.log"
Oct 03 15:10:18 crc kubenswrapper[4962]: I1003 15:10:18.950740 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zx6zr_aa2a143e-ac08-45ee-9c99-bf61bd19a9e5/extract-utilities/0.log"
Oct 03 15:10:19 crc kubenswrapper[4962]: I1003 15:10:19.137728 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zx6zr_aa2a143e-ac08-45ee-9c99-bf61bd19a9e5/extract-content/0.log"
Oct 03 15:10:19 crc kubenswrapper[4962]: I1003 15:10:19.143227 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zx6zr_aa2a143e-ac08-45ee-9c99-bf61bd19a9e5/extract-utilities/0.log"
Oct 03 15:10:19 crc kubenswrapper[4962]: I1003 15:10:19.167692 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zxxpl_5e76b6ee-0f7c-4c25-9a2e-20e210d5ec55/extract-utilities/0.log"
Oct 03 15:10:19 crc kubenswrapper[4962]: I1003 15:10:19.492697 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zxxpl_5e76b6ee-0f7c-4c25-9a2e-20e210d5ec55/extract-utilities/0.log"
Oct 03 15:10:19 crc kubenswrapper[4962]: I1003 15:10:19.494273 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zx6zr_aa2a143e-ac08-45ee-9c99-bf61bd19a9e5/registry-server/0.log"
Oct 03 15:10:19 crc kubenswrapper[4962]: I1003 15:10:19.511757 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zxxpl_5e76b6ee-0f7c-4c25-9a2e-20e210d5ec55/extract-content/0.log"
Oct 03 15:10:19 crc kubenswrapper[4962]: I1003 15:10:19.520208 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zxxpl_5e76b6ee-0f7c-4c25-9a2e-20e210d5ec55/extract-content/0.log"
Oct 03 15:10:19 crc kubenswrapper[4962]: I1003 15:10:19.691365 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zxxpl_5e76b6ee-0f7c-4c25-9a2e-20e210d5ec55/extract-utilities/0.log"
Oct 03 15:10:19 crc kubenswrapper[4962]: I1003 15:10:19.727247 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zxxpl_5e76b6ee-0f7c-4c25-9a2e-20e210d5ec55/extract-content/0.log"
Oct 03 15:10:20 crc kubenswrapper[4962]: I1003 15:10:20.571956 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zxxpl_5e76b6ee-0f7c-4c25-9a2e-20e210d5ec55/registry-server/0.log"
Oct 03 15:10:31 crc kubenswrapper[4962]: I1003 15:10:31.104687 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-f4nnk_266013b3-c058-4cc7-aa50-b02f13810284/prometheus-operator/0.log"
Oct 03 15:10:31 crc kubenswrapper[4962]: I1003 15:10:31.263391 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5996f7f648-p7khz_7fa44c96-655f-4152-8681-0c8090139b68/prometheus-operator-admission-webhook/0.log"
Oct 03 15:10:31 crc kubenswrapper[4962]: I1003 15:10:31.337902 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5996f7f648-wslh7_1985af21-3784-432c-9a82-0a71b8c5f830/prometheus-operator-admission-webhook/0.log"
Oct 03 15:10:31 crc kubenswrapper[4962]: I1003 15:10:31.448833 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-kmgdq_f5832b59-1c49-493e-a996-b83f81d1e279/operator/0.log"
Oct 03 15:10:31 crc kubenswrapper[4962]: I1003 15:10:31.533873 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-6zl9z_02a8ad14-f79e-44dc-8c54-31e3702b287c/perses-operator/0.log"
Oct 03 15:10:55 crc kubenswrapper[4962]: E1003 15:10:55.932715 4962 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.129.56.172:37720->38.129.56.172:43051: write tcp 38.129.56.172:37720->38.129.56.172:43051: write: broken pipe
Oct 03 15:11:04 crc kubenswrapper[4962]: I1003 15:11:04.850862 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dp4f5"]
Oct 03 15:11:04 crc kubenswrapper[4962]: E1003 15:11:04.851942 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b7c3d77-aa59-40b3-9a83-4469787e8f5b" containerName="container-00"
Oct 03 15:11:04 crc kubenswrapper[4962]: I1003 15:11:04.851958 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b7c3d77-aa59-40b3-9a83-4469787e8f5b" containerName="container-00"
Oct 03 15:11:04 crc kubenswrapper[4962]: I1003 15:11:04.852211 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b7c3d77-aa59-40b3-9a83-4469787e8f5b" containerName="container-00"
Oct 03 15:11:04 crc kubenswrapper[4962]: I1003 15:11:04.854177 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dp4f5"
Oct 03 15:11:04 crc kubenswrapper[4962]: I1003 15:11:04.863070 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dp4f5"]
Oct 03 15:11:04 crc kubenswrapper[4962]: I1003 15:11:04.984449 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c9114c6-38f7-449a-a4e9-44cd399cdaac-catalog-content\") pod \"redhat-operators-dp4f5\" (UID: \"0c9114c6-38f7-449a-a4e9-44cd399cdaac\") " pod="openshift-marketplace/redhat-operators-dp4f5"
Oct 03 15:11:04 crc kubenswrapper[4962]: I1003 15:11:04.984721 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c9114c6-38f7-449a-a4e9-44cd399cdaac-utilities\") pod \"redhat-operators-dp4f5\" (UID: \"0c9114c6-38f7-449a-a4e9-44cd399cdaac\") " pod="openshift-marketplace/redhat-operators-dp4f5"
Oct 03 15:11:04 crc kubenswrapper[4962]: I1003 15:11:04.984905 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87fjc\" (UniqueName: \"kubernetes.io/projected/0c9114c6-38f7-449a-a4e9-44cd399cdaac-kube-api-access-87fjc\") pod \"redhat-operators-dp4f5\" (UID: \"0c9114c6-38f7-449a-a4e9-44cd399cdaac\") " pod="openshift-marketplace/redhat-operators-dp4f5"
Oct 03 15:11:05 crc kubenswrapper[4962]: I1003 15:11:05.086781 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87fjc\" (UniqueName: \"kubernetes.io/projected/0c9114c6-38f7-449a-a4e9-44cd399cdaac-kube-api-access-87fjc\") pod \"redhat-operators-dp4f5\" (UID: \"0c9114c6-38f7-449a-a4e9-44cd399cdaac\") " pod="openshift-marketplace/redhat-operators-dp4f5"
Oct 03 15:11:05 crc kubenswrapper[4962]: I1003 15:11:05.086903 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c9114c6-38f7-449a-a4e9-44cd399cdaac-catalog-content\") pod \"redhat-operators-dp4f5\" (UID: \"0c9114c6-38f7-449a-a4e9-44cd399cdaac\") " pod="openshift-marketplace/redhat-operators-dp4f5"
Oct 03 15:11:05 crc kubenswrapper[4962]: I1003 15:11:05.087042 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c9114c6-38f7-449a-a4e9-44cd399cdaac-utilities\") pod \"redhat-operators-dp4f5\" (UID: \"0c9114c6-38f7-449a-a4e9-44cd399cdaac\") " pod="openshift-marketplace/redhat-operators-dp4f5"
Oct 03 15:11:05 crc kubenswrapper[4962]: I1003 15:11:05.087453 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c9114c6-38f7-449a-a4e9-44cd399cdaac-catalog-content\") pod \"redhat-operators-dp4f5\" (UID: \"0c9114c6-38f7-449a-a4e9-44cd399cdaac\") " pod="openshift-marketplace/redhat-operators-dp4f5"
Oct 03 15:11:05 crc kubenswrapper[4962]: I1003 15:11:05.087548 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c9114c6-38f7-449a-a4e9-44cd399cdaac-utilities\") pod \"redhat-operators-dp4f5\" (UID: \"0c9114c6-38f7-449a-a4e9-44cd399cdaac\") " pod="openshift-marketplace/redhat-operators-dp4f5"
Oct 03 15:11:05 crc kubenswrapper[4962]: I1003 15:11:05.110747 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87fjc\" (UniqueName: \"kubernetes.io/projected/0c9114c6-38f7-449a-a4e9-44cd399cdaac-kube-api-access-87fjc\") pod \"redhat-operators-dp4f5\" (UID: \"0c9114c6-38f7-449a-a4e9-44cd399cdaac\") " pod="openshift-marketplace/redhat-operators-dp4f5"
Oct 03 15:11:05 crc kubenswrapper[4962]: I1003 15:11:05.181758 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dp4f5"
Oct 03 15:11:05 crc kubenswrapper[4962]: I1003 15:11:05.794305 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dp4f5"]
Oct 03 15:11:06 crc kubenswrapper[4962]: I1003 15:11:06.545937 4962 generic.go:334] "Generic (PLEG): container finished" podID="0c9114c6-38f7-449a-a4e9-44cd399cdaac" containerID="9017a2c12d4d548b3b91b90e02818e0b2791641c3527fbb804b8672ccf1ebe47" exitCode=0
Oct 03 15:11:06 crc kubenswrapper[4962]: I1003 15:11:06.546072 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dp4f5" event={"ID":"0c9114c6-38f7-449a-a4e9-44cd399cdaac","Type":"ContainerDied","Data":"9017a2c12d4d548b3b91b90e02818e0b2791641c3527fbb804b8672ccf1ebe47"}
Oct 03 15:11:06 crc kubenswrapper[4962]: I1003 15:11:06.546237 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dp4f5" event={"ID":"0c9114c6-38f7-449a-a4e9-44cd399cdaac","Type":"ContainerStarted","Data":"2b614cb038392a33b2a0a67f0a6a66062df206b2a39a7e04b117178b5a5ebac4"}
Oct 03 15:11:06 crc kubenswrapper[4962]: I1003 15:11:06.548853 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 03 15:11:08 crc kubenswrapper[4962]: I1003 15:11:08.564072 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dp4f5" event={"ID":"0c9114c6-38f7-449a-a4e9-44cd399cdaac","Type":"ContainerStarted","Data":"3555fc12a9a28e5618f28b44a14f4758a7bde75df7b4582e015cfa4e4d9a27e5"}
Oct 03 15:11:12 crc kubenswrapper[4962]: I1003 15:11:12.602721 4962 generic.go:334] "Generic (PLEG): container finished" podID="0c9114c6-38f7-449a-a4e9-44cd399cdaac" containerID="3555fc12a9a28e5618f28b44a14f4758a7bde75df7b4582e015cfa4e4d9a27e5" exitCode=0
Oct 03 15:11:12 crc kubenswrapper[4962]: I1003 15:11:12.602913 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dp4f5" event={"ID":"0c9114c6-38f7-449a-a4e9-44cd399cdaac","Type":"ContainerDied","Data":"3555fc12a9a28e5618f28b44a14f4758a7bde75df7b4582e015cfa4e4d9a27e5"}
Oct 03 15:11:13 crc kubenswrapper[4962]: I1003 15:11:13.615378 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dp4f5" event={"ID":"0c9114c6-38f7-449a-a4e9-44cd399cdaac","Type":"ContainerStarted","Data":"8db08b9e673f02eb4cb1c762be773340abcf37cb87942b9660ad730e4637b92a"}
Oct 03 15:11:13 crc kubenswrapper[4962]: I1003 15:11:13.640546 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dp4f5" podStartSLOduration=3.051064919 podStartE2EDuration="9.640513s" podCreationTimestamp="2025-10-03 15:11:04 +0000 UTC" firstStartedPulling="2025-10-03 15:11:06.548606512 +0000 UTC m=+8474.952504347" lastFinishedPulling="2025-10-03 15:11:13.138054603 +0000 UTC m=+8481.541952428" observedRunningTime="2025-10-03 15:11:13.629805055 +0000 UTC m=+8482.033702890" watchObservedRunningTime="2025-10-03 15:11:13.640513 +0000 UTC m=+8482.044410835"
Oct 03 15:11:15 crc kubenswrapper[4962]: I1003 15:11:15.183032 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dp4f5"
Oct 03 15:11:15 crc kubenswrapper[4962]: I1003 15:11:15.183343 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dp4f5"
Oct 03 15:11:16 crc kubenswrapper[4962]: I1003 15:11:16.264550 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dp4f5" podUID="0c9114c6-38f7-449a-a4e9-44cd399cdaac" containerName="registry-server" probeResult="failure" output=<
Oct 03 15:11:16 crc kubenswrapper[4962]: timeout: failed to connect service ":50051" within 1s
Oct 03 15:11:16 crc kubenswrapper[4962]: >
Oct 03 15:11:19 crc kubenswrapper[4962]: I1003 15:11:19.931107 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ftsst"]
Oct 03 15:11:19 crc kubenswrapper[4962]: I1003 15:11:19.934240 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ftsst"
Oct 03 15:11:19 crc kubenswrapper[4962]: I1003 15:11:19.945257 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ftsst"]
Oct 03 15:11:20 crc kubenswrapper[4962]: I1003 15:11:20.032206 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93ab4273-9bb0-4395-bb9b-f17637594e1d-utilities\") pod \"redhat-marketplace-ftsst\" (UID: \"93ab4273-9bb0-4395-bb9b-f17637594e1d\") " pod="openshift-marketplace/redhat-marketplace-ftsst"
Oct 03 15:11:20 crc kubenswrapper[4962]: I1003 15:11:20.032450 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93ab4273-9bb0-4395-bb9b-f17637594e1d-catalog-content\") pod \"redhat-marketplace-ftsst\" (UID: \"93ab4273-9bb0-4395-bb9b-f17637594e1d\") " pod="openshift-marketplace/redhat-marketplace-ftsst"
Oct 03 15:11:20 crc kubenswrapper[4962]: I1003 15:11:20.032496 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5462\" (UniqueName: \"kubernetes.io/projected/93ab4273-9bb0-4395-bb9b-f17637594e1d-kube-api-access-j5462\") pod \"redhat-marketplace-ftsst\" (UID: \"93ab4273-9bb0-4395-bb9b-f17637594e1d\") " pod="openshift-marketplace/redhat-marketplace-ftsst"
Oct 03 15:11:20 crc kubenswrapper[4962]: I1003 15:11:20.133951 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93ab4273-9bb0-4395-bb9b-f17637594e1d-utilities\") pod \"redhat-marketplace-ftsst\" (UID: \"93ab4273-9bb0-4395-bb9b-f17637594e1d\") " pod="openshift-marketplace/redhat-marketplace-ftsst"
Oct 03 15:11:20 crc kubenswrapper[4962]: I1003 15:11:20.134104 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93ab4273-9bb0-4395-bb9b-f17637594e1d-catalog-content\") pod \"redhat-marketplace-ftsst\" (UID: \"93ab4273-9bb0-4395-bb9b-f17637594e1d\") " pod="openshift-marketplace/redhat-marketplace-ftsst"
Oct 03 15:11:20 crc kubenswrapper[4962]: I1003 15:11:20.134140 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5462\" (UniqueName: \"kubernetes.io/projected/93ab4273-9bb0-4395-bb9b-f17637594e1d-kube-api-access-j5462\") pod \"redhat-marketplace-ftsst\" (UID: \"93ab4273-9bb0-4395-bb9b-f17637594e1d\") " pod="openshift-marketplace/redhat-marketplace-ftsst"
Oct 03 15:11:20 crc kubenswrapper[4962]: I1003 15:11:20.134590 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93ab4273-9bb0-4395-bb9b-f17637594e1d-utilities\") pod \"redhat-marketplace-ftsst\" (UID: \"93ab4273-9bb0-4395-bb9b-f17637594e1d\") " pod="openshift-marketplace/redhat-marketplace-ftsst"
Oct 03 15:11:20 crc kubenswrapper[4962]: I1003 15:11:20.134628 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93ab4273-9bb0-4395-bb9b-f17637594e1d-catalog-content\") pod \"redhat-marketplace-ftsst\" (UID: \"93ab4273-9bb0-4395-bb9b-f17637594e1d\") " pod="openshift-marketplace/redhat-marketplace-ftsst"
Oct 03 15:11:20 crc kubenswrapper[4962]: I1003 15:11:20.159126 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5462\" (UniqueName: \"kubernetes.io/projected/93ab4273-9bb0-4395-bb9b-f17637594e1d-kube-api-access-j5462\") pod \"redhat-marketplace-ftsst\" (UID: \"93ab4273-9bb0-4395-bb9b-f17637594e1d\") " pod="openshift-marketplace/redhat-marketplace-ftsst"
Oct 03 15:11:20 crc kubenswrapper[4962]: I1003 15:11:20.268403 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ftsst"
Oct 03 15:11:20 crc kubenswrapper[4962]: I1003 15:11:20.716200 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ftsst"]
Oct 03 15:11:20 crc kubenswrapper[4962]: W1003 15:11:20.718304 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93ab4273_9bb0_4395_bb9b_f17637594e1d.slice/crio-374d26747560ee46ba54322ea3ed343fb27d11e032c9c887f9a1a79ccd5f59a2 WatchSource:0}: Error finding container 374d26747560ee46ba54322ea3ed343fb27d11e032c9c887f9a1a79ccd5f59a2: Status 404 returned error can't find the container with id 374d26747560ee46ba54322ea3ed343fb27d11e032c9c887f9a1a79ccd5f59a2
Oct 03 15:11:21 crc kubenswrapper[4962]: I1003 15:11:21.694692 4962 generic.go:334] "Generic (PLEG): container finished" podID="93ab4273-9bb0-4395-bb9b-f17637594e1d" containerID="849b53761fc7c7c27f2e10413d0e6e4b2777ee35f9bf00335ebaf267f572ee2e" exitCode=0
Oct 03 15:11:21 crc kubenswrapper[4962]: I1003 15:11:21.694757 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ftsst" event={"ID":"93ab4273-9bb0-4395-bb9b-f17637594e1d","Type":"ContainerDied","Data":"849b53761fc7c7c27f2e10413d0e6e4b2777ee35f9bf00335ebaf267f572ee2e"}
Oct 03 15:11:21 crc kubenswrapper[4962]: I1003 15:11:21.695010 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ftsst" event={"ID":"93ab4273-9bb0-4395-bb9b-f17637594e1d","Type":"ContainerStarted","Data":"374d26747560ee46ba54322ea3ed343fb27d11e032c9c887f9a1a79ccd5f59a2"}
Oct 03 15:11:23 crc kubenswrapper[4962]: I1003 15:11:23.713583 4962 generic.go:334] "Generic (PLEG): container finished" podID="93ab4273-9bb0-4395-bb9b-f17637594e1d" containerID="58639704a6d343addb99e19724d608ff734fc62ff1ac5108e8d3f6ca31633282" exitCode=0
Oct 03 15:11:23 crc kubenswrapper[4962]: I1003 15:11:23.713716 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ftsst" event={"ID":"93ab4273-9bb0-4395-bb9b-f17637594e1d","Type":"ContainerDied","Data":"58639704a6d343addb99e19724d608ff734fc62ff1ac5108e8d3f6ca31633282"}
Oct 03 15:11:24 crc kubenswrapper[4962]: I1003 15:11:24.724455 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ftsst" event={"ID":"93ab4273-9bb0-4395-bb9b-f17637594e1d","Type":"ContainerStarted","Data":"27df065d86c0b54aa771bc6b8d315113d5f29c2caa85505d5e0ef539654ba361"}
Oct 03 15:11:24 crc kubenswrapper[4962]: I1003 15:11:24.743139 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ftsst" podStartSLOduration=3.257299635 podStartE2EDuration="5.743123424s" podCreationTimestamp="2025-10-03 15:11:19 +0000 UTC" firstStartedPulling="2025-10-03 15:11:21.696561415 +0000 UTC m=+8490.100459250" lastFinishedPulling="2025-10-03 15:11:24.182385204 +0000 UTC m=+8492.586283039" observedRunningTime="2025-10-03 15:11:24.741731277 +0000 UTC m=+8493.145629112" watchObservedRunningTime="2025-10-03 15:11:24.743123424 +0000 UTC m=+8493.147021259"
Oct 03 15:11:25 crc kubenswrapper[4962]: I1003 15:11:25.245051 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dp4f5"
Oct 03 15:11:25 crc kubenswrapper[4962]: I1003 15:11:25.310853 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dp4f5"
Oct 03 15:11:29 crc kubenswrapper[4962]: I1003 15:11:29.520986 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dp4f5"]
Oct 03 15:11:29 crc kubenswrapper[4962]: I1003 15:11:29.521587 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dp4f5" podUID="0c9114c6-38f7-449a-a4e9-44cd399cdaac" containerName="registry-server" containerID="cri-o://8db08b9e673f02eb4cb1c762be773340abcf37cb87942b9660ad730e4637b92a" gracePeriod=2
Oct 03 15:11:29 crc kubenswrapper[4962]: I1003 15:11:29.772113 4962 generic.go:334] "Generic (PLEG): container finished" podID="0c9114c6-38f7-449a-a4e9-44cd399cdaac" containerID="8db08b9e673f02eb4cb1c762be773340abcf37cb87942b9660ad730e4637b92a" exitCode=0
Oct 03 15:11:29 crc kubenswrapper[4962]: I1003 15:11:29.772155 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dp4f5" event={"ID":"0c9114c6-38f7-449a-a4e9-44cd399cdaac","Type":"ContainerDied","Data":"8db08b9e673f02eb4cb1c762be773340abcf37cb87942b9660ad730e4637b92a"}
Oct 03 15:11:30 crc kubenswrapper[4962]: I1003 15:11:30.075822 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dp4f5"
Oct 03 15:11:30 crc kubenswrapper[4962]: I1003 15:11:30.252572 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c9114c6-38f7-449a-a4e9-44cd399cdaac-utilities\") pod \"0c9114c6-38f7-449a-a4e9-44cd399cdaac\" (UID: \"0c9114c6-38f7-449a-a4e9-44cd399cdaac\") "
Oct 03 15:11:30 crc kubenswrapper[4962]: I1003 15:11:30.252839 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c9114c6-38f7-449a-a4e9-44cd399cdaac-catalog-content\") pod \"0c9114c6-38f7-449a-a4e9-44cd399cdaac\" (UID: \"0c9114c6-38f7-449a-a4e9-44cd399cdaac\") "
Oct 03 15:11:30 crc kubenswrapper[4962]: I1003 15:11:30.253073 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87fjc\" (UniqueName: \"kubernetes.io/projected/0c9114c6-38f7-449a-a4e9-44cd399cdaac-kube-api-access-87fjc\") pod \"0c9114c6-38f7-449a-a4e9-44cd399cdaac\" (UID: \"0c9114c6-38f7-449a-a4e9-44cd399cdaac\") "
Oct 03 15:11:30 crc kubenswrapper[4962]: I1003 15:11:30.253628 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c9114c6-38f7-449a-a4e9-44cd399cdaac-utilities" (OuterVolumeSpecName: "utilities") pod "0c9114c6-38f7-449a-a4e9-44cd399cdaac" (UID: "0c9114c6-38f7-449a-a4e9-44cd399cdaac"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 15:11:30 crc kubenswrapper[4962]: I1003 15:11:30.262192 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c9114c6-38f7-449a-a4e9-44cd399cdaac-kube-api-access-87fjc" (OuterVolumeSpecName: "kube-api-access-87fjc") pod "0c9114c6-38f7-449a-a4e9-44cd399cdaac" (UID: "0c9114c6-38f7-449a-a4e9-44cd399cdaac"). InnerVolumeSpecName "kube-api-access-87fjc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 15:11:30 crc kubenswrapper[4962]: I1003 15:11:30.269039 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ftsst"
Oct 03 15:11:30 crc kubenswrapper[4962]: I1003 15:11:30.269116 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ftsst"
Oct 03 15:11:30 crc kubenswrapper[4962]: I1003 15:11:30.332952 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c9114c6-38f7-449a-a4e9-44cd399cdaac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0c9114c6-38f7-449a-a4e9-44cd399cdaac" (UID: "0c9114c6-38f7-449a-a4e9-44cd399cdaac"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 15:11:30 crc kubenswrapper[4962]: I1003 15:11:30.336137 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ftsst"
Oct 03 15:11:30 crc kubenswrapper[4962]: I1003 15:11:30.357997 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c9114c6-38f7-449a-a4e9-44cd399cdaac-utilities\") on node \"crc\" DevicePath \"\""
Oct 03 15:11:30 crc kubenswrapper[4962]: I1003 15:11:30.358040 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c9114c6-38f7-449a-a4e9-44cd399cdaac-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 03 15:11:30 crc kubenswrapper[4962]: I1003 15:11:30.358056 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87fjc\" (UniqueName: \"kubernetes.io/projected/0c9114c6-38f7-449a-a4e9-44cd399cdaac-kube-api-access-87fjc\") on node \"crc\" DevicePath \"\""
Oct 03 15:11:30 crc kubenswrapper[4962]: I1003 15:11:30.804836 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dp4f5"
Oct 03 15:11:30 crc kubenswrapper[4962]: I1003 15:11:30.804890 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dp4f5" event={"ID":"0c9114c6-38f7-449a-a4e9-44cd399cdaac","Type":"ContainerDied","Data":"2b614cb038392a33b2a0a67f0a6a66062df206b2a39a7e04b117178b5a5ebac4"}
Oct 03 15:11:30 crc kubenswrapper[4962]: I1003 15:11:30.804934 4962 scope.go:117] "RemoveContainer" containerID="8db08b9e673f02eb4cb1c762be773340abcf37cb87942b9660ad730e4637b92a"
Oct 03 15:11:30 crc kubenswrapper[4962]: I1003 15:11:30.837341 4962 scope.go:117] "RemoveContainer" containerID="3555fc12a9a28e5618f28b44a14f4758a7bde75df7b4582e015cfa4e4d9a27e5"
Oct 03 15:11:30 crc kubenswrapper[4962]: I1003 15:11:30.843943 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dp4f5"]
Oct 03 15:11:30 crc kubenswrapper[4962]: I1003 15:11:30.852813 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dp4f5"]
Oct 03 15:11:30 crc kubenswrapper[4962]: I1003 15:11:30.858258 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ftsst"
Oct 03 15:11:30 crc kubenswrapper[4962]: I1003 15:11:30.862806 4962 scope.go:117] "RemoveContainer" containerID="9017a2c12d4d548b3b91b90e02818e0b2791641c3527fbb804b8672ccf1ebe47"
Oct 03 15:11:32 crc kubenswrapper[4962]: I1003 15:11:32.239192 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c9114c6-38f7-449a-a4e9-44cd399cdaac" path="/var/lib/kubelet/pods/0c9114c6-38f7-449a-a4e9-44cd399cdaac/volumes"
Oct 03 15:11:32 crc kubenswrapper[4962]: I1003 15:11:32.917945 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ftsst"]
Oct 03 15:11:32 crc kubenswrapper[4962]: I1003 15:11:32.918460 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ftsst" podUID="93ab4273-9bb0-4395-bb9b-f17637594e1d" containerName="registry-server" containerID="cri-o://27df065d86c0b54aa771bc6b8d315113d5f29c2caa85505d5e0ef539654ba361" gracePeriod=2
Oct 03 15:11:33 crc kubenswrapper[4962]: I1003 15:11:33.379829 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ftsst"
Oct 03 15:11:33 crc kubenswrapper[4962]: I1003 15:11:33.524361 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93ab4273-9bb0-4395-bb9b-f17637594e1d-catalog-content\") pod \"93ab4273-9bb0-4395-bb9b-f17637594e1d\" (UID: \"93ab4273-9bb0-4395-bb9b-f17637594e1d\") "
Oct 03 15:11:33 crc kubenswrapper[4962]: I1003 15:11:33.524465 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5462\" (UniqueName: \"kubernetes.io/projected/93ab4273-9bb0-4395-bb9b-f17637594e1d-kube-api-access-j5462\") pod \"93ab4273-9bb0-4395-bb9b-f17637594e1d\" (UID: \"93ab4273-9bb0-4395-bb9b-f17637594e1d\") "
Oct 03 15:11:33 crc kubenswrapper[4962]: I1003 15:11:33.524689 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93ab4273-9bb0-4395-bb9b-f17637594e1d-utilities\") pod \"93ab4273-9bb0-4395-bb9b-f17637594e1d\" (UID: \"93ab4273-9bb0-4395-bb9b-f17637594e1d\") "
Oct 03 15:11:33 crc kubenswrapper[4962]: I1003 15:11:33.526225 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93ab4273-9bb0-4395-bb9b-f17637594e1d-utilities" (OuterVolumeSpecName: "utilities") pod "93ab4273-9bb0-4395-bb9b-f17637594e1d" (UID: "93ab4273-9bb0-4395-bb9b-f17637594e1d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 15:11:33 crc kubenswrapper[4962]: I1003 15:11:33.536862 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93ab4273-9bb0-4395-bb9b-f17637594e1d-kube-api-access-j5462" (OuterVolumeSpecName: "kube-api-access-j5462") pod "93ab4273-9bb0-4395-bb9b-f17637594e1d" (UID: "93ab4273-9bb0-4395-bb9b-f17637594e1d"). InnerVolumeSpecName "kube-api-access-j5462". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 15:11:33 crc kubenswrapper[4962]: I1003 15:11:33.539709 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93ab4273-9bb0-4395-bb9b-f17637594e1d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "93ab4273-9bb0-4395-bb9b-f17637594e1d" (UID: "93ab4273-9bb0-4395-bb9b-f17637594e1d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 15:11:33 crc kubenswrapper[4962]: I1003 15:11:33.627438 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93ab4273-9bb0-4395-bb9b-f17637594e1d-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 03 15:11:33 crc kubenswrapper[4962]: I1003 15:11:33.627480 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5462\" (UniqueName: \"kubernetes.io/projected/93ab4273-9bb0-4395-bb9b-f17637594e1d-kube-api-access-j5462\") on node \"crc\" DevicePath \"\""
Oct 03 15:11:33 crc kubenswrapper[4962]: I1003 15:11:33.627491 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93ab4273-9bb0-4395-bb9b-f17637594e1d-utilities\") on node \"crc\" DevicePath \"\""
Oct 03 15:11:33 crc kubenswrapper[4962]: I1003 15:11:33.855511 4962 generic.go:334] "Generic (PLEG): container finished" podID="93ab4273-9bb0-4395-bb9b-f17637594e1d" containerID="27df065d86c0b54aa771bc6b8d315113d5f29c2caa85505d5e0ef539654ba361" exitCode=0
Oct 03 15:11:33 crc kubenswrapper[4962]: I1003 15:11:33.855589 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ftsst" event={"ID":"93ab4273-9bb0-4395-bb9b-f17637594e1d","Type":"ContainerDied","Data":"27df065d86c0b54aa771bc6b8d315113d5f29c2caa85505d5e0ef539654ba361"}
Oct 03 15:11:33 crc kubenswrapper[4962]: I1003 15:11:33.856097 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ftsst" event={"ID":"93ab4273-9bb0-4395-bb9b-f17637594e1d","Type":"ContainerDied","Data":"374d26747560ee46ba54322ea3ed343fb27d11e032c9c887f9a1a79ccd5f59a2"}
Oct 03 15:11:33 crc kubenswrapper[4962]: I1003 15:11:33.855613 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ftsst"
Oct 03 15:11:33 crc kubenswrapper[4962]: I1003 15:11:33.856173 4962 scope.go:117] "RemoveContainer" containerID="27df065d86c0b54aa771bc6b8d315113d5f29c2caa85505d5e0ef539654ba361"
Oct 03 15:11:33 crc kubenswrapper[4962]: I1003 15:11:33.878817 4962 scope.go:117] "RemoveContainer" containerID="58639704a6d343addb99e19724d608ff734fc62ff1ac5108e8d3f6ca31633282"
Oct 03 15:11:33 crc kubenswrapper[4962]: I1003 15:11:33.897858 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ftsst"]
Oct 03 15:11:33 crc kubenswrapper[4962]: I1003 15:11:33.910135 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ftsst"]
Oct 03 15:11:33 crc kubenswrapper[4962]: I1003 15:11:33.920416 4962 scope.go:117] "RemoveContainer" containerID="849b53761fc7c7c27f2e10413d0e6e4b2777ee35f9bf00335ebaf267f572ee2e"
Oct 03 15:11:33 crc kubenswrapper[4962]: I1003 15:11:33.967848 4962 scope.go:117] "RemoveContainer" containerID="27df065d86c0b54aa771bc6b8d315113d5f29c2caa85505d5e0ef539654ba361"
Oct 03 15:11:33 crc kubenswrapper[4962]: E1003 15:11:33.968340 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27df065d86c0b54aa771bc6b8d315113d5f29c2caa85505d5e0ef539654ba361\": container with ID starting with 27df065d86c0b54aa771bc6b8d315113d5f29c2caa85505d5e0ef539654ba361 not found: ID does not exist" containerID="27df065d86c0b54aa771bc6b8d315113d5f29c2caa85505d5e0ef539654ba361"
Oct 03 15:11:33 crc kubenswrapper[4962]: I1003 15:11:33.968386 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27df065d86c0b54aa771bc6b8d315113d5f29c2caa85505d5e0ef539654ba361"} err="failed to get container status \"27df065d86c0b54aa771bc6b8d315113d5f29c2caa85505d5e0ef539654ba361\": rpc error: code = NotFound desc = could not find container \"27df065d86c0b54aa771bc6b8d315113d5f29c2caa85505d5e0ef539654ba361\": container with ID starting with 27df065d86c0b54aa771bc6b8d315113d5f29c2caa85505d5e0ef539654ba361 not found: ID does not exist"
Oct 03 15:11:33 crc kubenswrapper[4962]: I1003 15:11:33.968438 4962 scope.go:117] "RemoveContainer" containerID="58639704a6d343addb99e19724d608ff734fc62ff1ac5108e8d3f6ca31633282"
Oct 03 15:11:33 crc kubenswrapper[4962]: E1003 15:11:33.968942 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58639704a6d343addb99e19724d608ff734fc62ff1ac5108e8d3f6ca31633282\": container with ID starting with 58639704a6d343addb99e19724d608ff734fc62ff1ac5108e8d3f6ca31633282 not found: ID does not exist" containerID="58639704a6d343addb99e19724d608ff734fc62ff1ac5108e8d3f6ca31633282"
Oct 03 15:11:33 crc kubenswrapper[4962]: I1003 15:11:33.968972 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58639704a6d343addb99e19724d608ff734fc62ff1ac5108e8d3f6ca31633282"} err="failed to get container status \"58639704a6d343addb99e19724d608ff734fc62ff1ac5108e8d3f6ca31633282\": rpc error: code = NotFound desc = could not find container \"58639704a6d343addb99e19724d608ff734fc62ff1ac5108e8d3f6ca31633282\": container with ID starting with 58639704a6d343addb99e19724d608ff734fc62ff1ac5108e8d3f6ca31633282 not found: ID does not exist"
Oct 03 15:11:33 crc kubenswrapper[4962]: I1003 15:11:33.968988 4962 scope.go:117] "RemoveContainer" containerID="849b53761fc7c7c27f2e10413d0e6e4b2777ee35f9bf00335ebaf267f572ee2e"
Oct 03 15:11:33 crc kubenswrapper[4962]: E1003 15:11:33.969493 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"849b53761fc7c7c27f2e10413d0e6e4b2777ee35f9bf00335ebaf267f572ee2e\": container with ID starting with 849b53761fc7c7c27f2e10413d0e6e4b2777ee35f9bf00335ebaf267f572ee2e not found: ID does not exist" containerID="849b53761fc7c7c27f2e10413d0e6e4b2777ee35f9bf00335ebaf267f572ee2e"
Oct 03 15:11:33 crc kubenswrapper[4962]: I1003 15:11:33.969522 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"849b53761fc7c7c27f2e10413d0e6e4b2777ee35f9bf00335ebaf267f572ee2e"} err="failed to get container status \"849b53761fc7c7c27f2e10413d0e6e4b2777ee35f9bf00335ebaf267f572ee2e\": rpc error: code = NotFound desc = could not find container \"849b53761fc7c7c27f2e10413d0e6e4b2777ee35f9bf00335ebaf267f572ee2e\": container with ID starting with 849b53761fc7c7c27f2e10413d0e6e4b2777ee35f9bf00335ebaf267f572ee2e not found: ID does not exist"
Oct 03 15:11:34 crc kubenswrapper[4962]: I1003 15:11:34.239415 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93ab4273-9bb0-4395-bb9b-f17637594e1d" path="/var/lib/kubelet/pods/93ab4273-9bb0-4395-bb9b-f17637594e1d/volumes"
Oct 03 15:11:54 crc kubenswrapper[4962]: I1003 15:11:54.659970 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 15:11:54 crc kubenswrapper[4962]: I1003 15:11:54.660512 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 15:12:24 crc kubenswrapper[4962]: I1003 15:12:24.660412 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 15:12:24 crc kubenswrapper[4962]: I1003 15:12:24.661043 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 15:12:54 crc kubenswrapper[4962]: I1003 15:12:54.659581 4962 patch_prober.go:28] interesting pod/machine-config-daemon-46vck container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 15:12:54 crc kubenswrapper[4962]: I1003 15:12:54.660627 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 15:12:54 crc kubenswrapper[4962]: I1003 15:12:54.660809 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-46vck"
Oct 03 15:12:54 crc kubenswrapper[4962]: I1003 15:12:54.662590 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9f15ec2ae1359afce828df84095ed43d5bdf9f8fbc6133a7315f8997b8a760e5"} pod="openshift-machine-config-operator/machine-config-daemon-46vck" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 03 15:12:54 crc kubenswrapper[4962]: I1003 15:12:54.662770 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerName="machine-config-daemon" containerID="cri-o://9f15ec2ae1359afce828df84095ed43d5bdf9f8fbc6133a7315f8997b8a760e5" gracePeriod=600
Oct 03 15:12:54 crc kubenswrapper[4962]: E1003 15:12:54.793415 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 15:12:55 crc kubenswrapper[4962]: I1003 15:12:55.719244 4962 generic.go:334] "Generic (PLEG): container finished" podID="e40a27aa-682e-4b25-a198-8054ba9f2477" containerID="9f15ec2ae1359afce828df84095ed43d5bdf9f8fbc6133a7315f8997b8a760e5" exitCode=0
Oct 03 15:12:55 crc kubenswrapper[4962]: I1003 15:12:55.719313 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-46vck" event={"ID":"e40a27aa-682e-4b25-a198-8054ba9f2477","Type":"ContainerDied","Data":"9f15ec2ae1359afce828df84095ed43d5bdf9f8fbc6133a7315f8997b8a760e5"}
Oct 03 15:12:55 crc kubenswrapper[4962]: I1003 15:12:55.719628 4962 scope.go:117] "RemoveContainer" containerID="aa48c1476389c502f3e31fce1f2bbf3e558dfcb2d299b8314600dcf50b479976"
Oct 03 15:12:55 crc kubenswrapper[4962]: I1003 15:12:55.720458 4962 scope.go:117] "RemoveContainer" containerID="9f15ec2ae1359afce828df84095ed43d5bdf9f8fbc6133a7315f8997b8a760e5"
Oct 03 15:12:55 crc kubenswrapper[4962]: E1003 15:12:55.720814 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 15:13:00 crc kubenswrapper[4962]: I1003 15:13:00.773037 4962 generic.go:334] "Generic (PLEG): container finished" podID="12468d83-dfda-49d1-8184-efe2d4226e3c" containerID="abfcb4f36a3b40eed9acff09a594fd5f516a83ee9e1685f5eb70fb8759af9786" exitCode=0
Oct 03 15:13:00 crc kubenswrapper[4962]: I1003 15:13:00.773140 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qgjqc/must-gather-dwknh" event={"ID":"12468d83-dfda-49d1-8184-efe2d4226e3c","Type":"ContainerDied","Data":"abfcb4f36a3b40eed9acff09a594fd5f516a83ee9e1685f5eb70fb8759af9786"}
Oct 03 15:13:00 crc kubenswrapper[4962]: I1003 15:13:00.774154 4962 scope.go:117] "RemoveContainer" containerID="abfcb4f36a3b40eed9acff09a594fd5f516a83ee9e1685f5eb70fb8759af9786"
Oct 03 15:13:01 crc kubenswrapper[4962]: I1003 15:13:01.619139 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qgjqc_must-gather-dwknh_12468d83-dfda-49d1-8184-efe2d4226e3c/gather/0.log"
Oct 03 15:13:10 crc kubenswrapper[4962]: I1003 15:13:10.304437 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qgjqc/must-gather-dwknh"]
Oct 03 15:13:10 crc kubenswrapper[4962]: I1003 15:13:10.305596 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-qgjqc/must-gather-dwknh" podUID="12468d83-dfda-49d1-8184-efe2d4226e3c" containerName="copy" containerID="cri-o://607feffd5a7d9a839972dff48c5b551e2eb7e2208e8defe1aa2e77561542fa3c" gracePeriod=2
Oct 03 15:13:10 crc kubenswrapper[4962]: I1003 15:13:10.318118 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qgjqc/must-gather-dwknh"]
Oct 03 15:13:10 crc kubenswrapper[4962]: I1003 15:13:10.886759 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qgjqc_must-gather-dwknh_12468d83-dfda-49d1-8184-efe2d4226e3c/copy/0.log"
Oct 03 15:13:10 crc kubenswrapper[4962]: I1003 15:13:10.889884 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qgjqc/must-gather-dwknh"
Oct 03 15:13:10 crc kubenswrapper[4962]: I1003 15:13:10.913924 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qgjqc_must-gather-dwknh_12468d83-dfda-49d1-8184-efe2d4226e3c/copy/0.log"
Oct 03 15:13:10 crc kubenswrapper[4962]: I1003 15:13:10.917182 4962 generic.go:334] "Generic (PLEG): container finished" podID="12468d83-dfda-49d1-8184-efe2d4226e3c" containerID="607feffd5a7d9a839972dff48c5b551e2eb7e2208e8defe1aa2e77561542fa3c" exitCode=143
Oct 03 15:13:10 crc kubenswrapper[4962]: I1003 15:13:10.917352 4962 scope.go:117] "RemoveContainer" containerID="607feffd5a7d9a839972dff48c5b551e2eb7e2208e8defe1aa2e77561542fa3c"
Oct 03 15:13:10 crc kubenswrapper[4962]: I1003 15:13:10.917614 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qgjqc/must-gather-dwknh"
Oct 03 15:13:10 crc kubenswrapper[4962]: I1003 15:13:10.972076 4962 scope.go:117] "RemoveContainer" containerID="abfcb4f36a3b40eed9acff09a594fd5f516a83ee9e1685f5eb70fb8759af9786"
Oct 03 15:13:11 crc kubenswrapper[4962]: I1003 15:13:11.000965 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjm9s\" (UniqueName: \"kubernetes.io/projected/12468d83-dfda-49d1-8184-efe2d4226e3c-kube-api-access-xjm9s\") pod \"12468d83-dfda-49d1-8184-efe2d4226e3c\" (UID: \"12468d83-dfda-49d1-8184-efe2d4226e3c\") "
Oct 03 15:13:11 crc kubenswrapper[4962]: I1003 15:13:11.001034 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/12468d83-dfda-49d1-8184-efe2d4226e3c-must-gather-output\") pod \"12468d83-dfda-49d1-8184-efe2d4226e3c\" (UID: \"12468d83-dfda-49d1-8184-efe2d4226e3c\") "
Oct 03 15:13:11 crc kubenswrapper[4962]: I1003 15:13:11.007838 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12468d83-dfda-49d1-8184-efe2d4226e3c-kube-api-access-xjm9s" (OuterVolumeSpecName: "kube-api-access-xjm9s") pod "12468d83-dfda-49d1-8184-efe2d4226e3c" (UID: "12468d83-dfda-49d1-8184-efe2d4226e3c"). InnerVolumeSpecName "kube-api-access-xjm9s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 15:13:11 crc kubenswrapper[4962]: I1003 15:13:11.083080 4962 scope.go:117] "RemoveContainer" containerID="607feffd5a7d9a839972dff48c5b551e2eb7e2208e8defe1aa2e77561542fa3c"
Oct 03 15:13:11 crc kubenswrapper[4962]: E1003 15:13:11.087204 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"607feffd5a7d9a839972dff48c5b551e2eb7e2208e8defe1aa2e77561542fa3c\": container with ID starting with 607feffd5a7d9a839972dff48c5b551e2eb7e2208e8defe1aa2e77561542fa3c not found: ID does not exist" containerID="607feffd5a7d9a839972dff48c5b551e2eb7e2208e8defe1aa2e77561542fa3c"
Oct 03 15:13:11 crc kubenswrapper[4962]: I1003 15:13:11.087757 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"607feffd5a7d9a839972dff48c5b551e2eb7e2208e8defe1aa2e77561542fa3c"} err="failed to get container status \"607feffd5a7d9a839972dff48c5b551e2eb7e2208e8defe1aa2e77561542fa3c\": rpc error: code = NotFound desc = could not find container \"607feffd5a7d9a839972dff48c5b551e2eb7e2208e8defe1aa2e77561542fa3c\": container with ID starting with 607feffd5a7d9a839972dff48c5b551e2eb7e2208e8defe1aa2e77561542fa3c not found: ID does not exist"
Oct 03 15:13:11 crc kubenswrapper[4962]: I1003 15:13:11.087854 4962 scope.go:117] "RemoveContainer" containerID="abfcb4f36a3b40eed9acff09a594fd5f516a83ee9e1685f5eb70fb8759af9786"
Oct 03 15:13:11 crc kubenswrapper[4962]: E1003 15:13:11.088262 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abfcb4f36a3b40eed9acff09a594fd5f516a83ee9e1685f5eb70fb8759af9786\": container with ID starting with abfcb4f36a3b40eed9acff09a594fd5f516a83ee9e1685f5eb70fb8759af9786 not found: ID does not exist" containerID="abfcb4f36a3b40eed9acff09a594fd5f516a83ee9e1685f5eb70fb8759af9786"
Oct 03 15:13:11 crc kubenswrapper[4962]: I1003 15:13:11.088381 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abfcb4f36a3b40eed9acff09a594fd5f516a83ee9e1685f5eb70fb8759af9786"} err="failed to get container status \"abfcb4f36a3b40eed9acff09a594fd5f516a83ee9e1685f5eb70fb8759af9786\": rpc error: code = NotFound desc = could not find container \"abfcb4f36a3b40eed9acff09a594fd5f516a83ee9e1685f5eb70fb8759af9786\": container with ID starting with abfcb4f36a3b40eed9acff09a594fd5f516a83ee9e1685f5eb70fb8759af9786 not found: ID does not exist"
Oct 03 15:13:11 crc kubenswrapper[4962]: I1003 15:13:11.103494 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjm9s\" (UniqueName: \"kubernetes.io/projected/12468d83-dfda-49d1-8184-efe2d4226e3c-kube-api-access-xjm9s\") on node \"crc\" DevicePath \"\""
Oct 03 15:13:11 crc kubenswrapper[4962]: I1003 15:13:11.226773 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12468d83-dfda-49d1-8184-efe2d4226e3c-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "12468d83-dfda-49d1-8184-efe2d4226e3c" (UID: "12468d83-dfda-49d1-8184-efe2d4226e3c"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 15:13:11 crc kubenswrapper[4962]: I1003 15:13:11.228859 4962 scope.go:117] "RemoveContainer" containerID="9f15ec2ae1359afce828df84095ed43d5bdf9f8fbc6133a7315f8997b8a760e5"
Oct 03 15:13:11 crc kubenswrapper[4962]: E1003 15:13:11.229474 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 15:13:11 crc kubenswrapper[4962]: I1003 15:13:11.310826 4962 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/12468d83-dfda-49d1-8184-efe2d4226e3c-must-gather-output\") on node \"crc\" DevicePath \"\""
Oct 03 15:13:12 crc kubenswrapper[4962]: I1003 15:13:12.240996 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12468d83-dfda-49d1-8184-efe2d4226e3c" path="/var/lib/kubelet/pods/12468d83-dfda-49d1-8184-efe2d4226e3c/volumes"
Oct 03 15:13:14 crc kubenswrapper[4962]: I1003 15:13:14.598786 4962 scope.go:117] "RemoveContainer" containerID="603a053ac2208b0c97c5b4e82974ebc5c17ba4834399cb88065d571d500a909b"
Oct 03 15:13:26 crc kubenswrapper[4962]: I1003 15:13:26.233222 4962 scope.go:117] "RemoveContainer" containerID="9f15ec2ae1359afce828df84095ed43d5bdf9f8fbc6133a7315f8997b8a760e5"
Oct 03 15:13:26 crc kubenswrapper[4962]: E1003 15:13:26.234034 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 15:13:40 crc kubenswrapper[4962]: I1003 15:13:40.227485 4962 scope.go:117] "RemoveContainer" containerID="9f15ec2ae1359afce828df84095ed43d5bdf9f8fbc6133a7315f8997b8a760e5"
Oct 03 15:13:40 crc kubenswrapper[4962]: E1003 15:13:40.228314 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 15:13:54 crc kubenswrapper[4962]: I1003 15:13:54.229551 4962 scope.go:117] "RemoveContainer" containerID="9f15ec2ae1359afce828df84095ed43d5bdf9f8fbc6133a7315f8997b8a760e5"
Oct 03 15:13:54 crc kubenswrapper[4962]: E1003 15:13:54.230495 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 15:14:09 crc kubenswrapper[4962]: I1003 15:14:09.227287 4962 scope.go:117] "RemoveContainer" containerID="9f15ec2ae1359afce828df84095ed43d5bdf9f8fbc6133a7315f8997b8a760e5"
Oct 03 15:14:09 crc kubenswrapper[4962]: E1003 15:14:09.228355 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 15:14:20 crc kubenswrapper[4962]: I1003 15:14:20.228009 4962 scope.go:117] "RemoveContainer" containerID="9f15ec2ae1359afce828df84095ed43d5bdf9f8fbc6133a7315f8997b8a760e5"
Oct 03 15:14:20 crc kubenswrapper[4962]: E1003 15:14:20.229383 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 15:14:34 crc kubenswrapper[4962]: I1003 15:14:34.229927 4962 scope.go:117] "RemoveContainer" containerID="9f15ec2ae1359afce828df84095ed43d5bdf9f8fbc6133a7315f8997b8a760e5"
Oct 03 15:14:34 crc kubenswrapper[4962]: E1003 15:14:34.230853 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 15:14:46 crc kubenswrapper[4962]: I1003 15:14:46.228364 4962 scope.go:117] "RemoveContainer" containerID="9f15ec2ae1359afce828df84095ed43d5bdf9f8fbc6133a7315f8997b8a760e5"
Oct 03 15:14:46 crc kubenswrapper[4962]: E1003 15:14:46.229330 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"
Oct 03 15:15:00 crc kubenswrapper[4962]: I1003 15:15:00.167266 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325075-278dc"]
Oct 03 15:15:00 crc kubenswrapper[4962]: E1003 15:15:00.168473 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c9114c6-38f7-449a-a4e9-44cd399cdaac" containerName="extract-utilities"
Oct 03 15:15:00 crc kubenswrapper[4962]: I1003 15:15:00.168491 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c9114c6-38f7-449a-a4e9-44cd399cdaac" containerName="extract-utilities"
Oct 03 15:15:00 crc kubenswrapper[4962]: E1003 15:15:00.168516 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93ab4273-9bb0-4395-bb9b-f17637594e1d" containerName="extract-content"
Oct 03 15:15:00 crc kubenswrapper[4962]: I1003 15:15:00.168550 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="93ab4273-9bb0-4395-bb9b-f17637594e1d" containerName="extract-content"
Oct 03 15:15:00 crc kubenswrapper[4962]: E1003 15:15:00.168586 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c9114c6-38f7-449a-a4e9-44cd399cdaac" containerName="registry-server"
Oct 03 15:15:00 crc kubenswrapper[4962]: I1003 15:15:00.168595 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c9114c6-38f7-449a-a4e9-44cd399cdaac" containerName="registry-server"
Oct 03 15:15:00 crc kubenswrapper[4962]: E1003 15:15:00.168609 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12468d83-dfda-49d1-8184-efe2d4226e3c" containerName="copy"
Oct 03 15:15:00 crc kubenswrapper[4962]: I1003 15:15:00.168616 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="12468d83-dfda-49d1-8184-efe2d4226e3c" containerName="copy"
Oct 03 15:15:00 crc kubenswrapper[4962]: E1003 15:15:00.168653 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93ab4273-9bb0-4395-bb9b-f17637594e1d" containerName="registry-server"
Oct 03 15:15:00 crc kubenswrapper[4962]: I1003 15:15:00.168662 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="93ab4273-9bb0-4395-bb9b-f17637594e1d" containerName="registry-server"
Oct 03 15:15:00 crc kubenswrapper[4962]: E1003 15:15:00.168682 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12468d83-dfda-49d1-8184-efe2d4226e3c" containerName="gather"
Oct 03 15:15:00 crc kubenswrapper[4962]: I1003 15:15:00.168691 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="12468d83-dfda-49d1-8184-efe2d4226e3c" containerName="gather"
Oct 03 15:15:00 crc kubenswrapper[4962]: E1003 15:15:00.168706 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93ab4273-9bb0-4395-bb9b-f17637594e1d" containerName="extract-utilities"
Oct 03 15:15:00 crc kubenswrapper[4962]: I1003 15:15:00.168713 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="93ab4273-9bb0-4395-bb9b-f17637594e1d" containerName="extract-utilities"
Oct 03 15:15:00 crc kubenswrapper[4962]: E1003 15:15:00.168727 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c9114c6-38f7-449a-a4e9-44cd399cdaac" containerName="extract-content"
Oct 03 15:15:00 crc kubenswrapper[4962]: I1003 15:15:00.168735 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c9114c6-38f7-449a-a4e9-44cd399cdaac" containerName="extract-content"
Oct 03 15:15:00 crc kubenswrapper[4962]: I1003 15:15:00.169015 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="12468d83-dfda-49d1-8184-efe2d4226e3c" containerName="copy"
Oct 03 15:15:00 crc kubenswrapper[4962]: I1003 15:15:00.169032 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c9114c6-38f7-449a-a4e9-44cd399cdaac" containerName="registry-server"
Oct 03 15:15:00 crc kubenswrapper[4962]: I1003 15:15:00.169042 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="93ab4273-9bb0-4395-bb9b-f17637594e1d" containerName="registry-server"
Oct 03 15:15:00 crc kubenswrapper[4962]: I1003 15:15:00.169056 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="12468d83-dfda-49d1-8184-efe2d4226e3c" containerName="gather"
Oct 03 15:15:00 crc kubenswrapper[4962]: I1003 15:15:00.169980 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325075-278dc"
Oct 03 15:15:00 crc kubenswrapper[4962]: I1003 15:15:00.172712 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 03 15:15:00 crc kubenswrapper[4962]: I1003 15:15:00.172824 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 03 15:15:00 crc kubenswrapper[4962]: I1003 15:15:00.181215 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325075-278dc"]
Oct 03 15:15:00 crc kubenswrapper[4962]: I1003 15:15:00.278789 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbfff\" (UniqueName: \"kubernetes.io/projected/0bd05fc3-2236-4135-b452-e8700b53e0fc-kube-api-access-cbfff\") pod \"collect-profiles-29325075-278dc\" (UID: \"0bd05fc3-2236-4135-b452-e8700b53e0fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325075-278dc"
Oct 03 15:15:00 crc kubenswrapper[4962]: I1003 15:15:00.279948 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0bd05fc3-2236-4135-b452-e8700b53e0fc-secret-volume\") pod \"collect-profiles-29325075-278dc\" (UID: \"0bd05fc3-2236-4135-b452-e8700b53e0fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325075-278dc"
Oct 03 15:15:00 crc kubenswrapper[4962]: I1003 15:15:00.280147 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0bd05fc3-2236-4135-b452-e8700b53e0fc-config-volume\") pod \"collect-profiles-29325075-278dc\" (UID: \"0bd05fc3-2236-4135-b452-e8700b53e0fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325075-278dc"
Oct 03 15:15:00 crc kubenswrapper[4962]: I1003 15:15:00.382544 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0bd05fc3-2236-4135-b452-e8700b53e0fc-config-volume\") pod \"collect-profiles-29325075-278dc\" (UID: \"0bd05fc3-2236-4135-b452-e8700b53e0fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325075-278dc"
Oct 03 15:15:00 crc kubenswrapper[4962]: I1003 15:15:00.382701 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbfff\" (UniqueName: \"kubernetes.io/projected/0bd05fc3-2236-4135-b452-e8700b53e0fc-kube-api-access-cbfff\") pod \"collect-profiles-29325075-278dc\" (UID: \"0bd05fc3-2236-4135-b452-e8700b53e0fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325075-278dc"
Oct 03 15:15:00 crc kubenswrapper[4962]: I1003 15:15:00.382828 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0bd05fc3-2236-4135-b452-e8700b53e0fc-secret-volume\") pod \"collect-profiles-29325075-278dc\" (UID: \"0bd05fc3-2236-4135-b452-e8700b53e0fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325075-278dc"
Oct 03 15:15:00 crc kubenswrapper[4962]: I1003 15:15:00.383831 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0bd05fc3-2236-4135-b452-e8700b53e0fc-config-volume\") pod \"collect-profiles-29325075-278dc\" (UID: \"0bd05fc3-2236-4135-b452-e8700b53e0fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325075-278dc"
Oct 03 15:15:00 crc kubenswrapper[4962]: I1003 15:15:00.391343 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0bd05fc3-2236-4135-b452-e8700b53e0fc-secret-volume\") pod \"collect-profiles-29325075-278dc\" (UID: \"0bd05fc3-2236-4135-b452-e8700b53e0fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325075-278dc"
Oct 03 15:15:00 crc kubenswrapper[4962]: I1003 15:15:00.398140 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbfff\" (UniqueName: \"kubernetes.io/projected/0bd05fc3-2236-4135-b452-e8700b53e0fc-kube-api-access-cbfff\") pod \"collect-profiles-29325075-278dc\" (UID: \"0bd05fc3-2236-4135-b452-e8700b53e0fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325075-278dc"
Oct 03 15:15:00 crc kubenswrapper[4962]: I1003 15:15:00.502278 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325075-278dc"
Oct 03 15:15:00 crc kubenswrapper[4962]: I1003 15:15:00.954619 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325075-278dc"]
Oct 03 15:15:01 crc kubenswrapper[4962]: I1003 15:15:01.083106 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325075-278dc" event={"ID":"0bd05fc3-2236-4135-b452-e8700b53e0fc","Type":"ContainerStarted","Data":"16a7840e0a500cff54a9d3c28b8bc4a555e308286bf73d69cbee6771ad3f6a07"}
Oct 03 15:15:01 crc kubenswrapper[4962]: I1003 15:15:01.227795 4962 scope.go:117] "RemoveContainer" containerID="9f15ec2ae1359afce828df84095ed43d5bdf9f8fbc6133a7315f8997b8a760e5"
Oct 03 15:15:01 crc kubenswrapper[4962]: E1003 15:15:01.228060 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-46vck_openshift-machine-config-operator(e40a27aa-682e-4b25-a198-8054ba9f2477)\"" pod="openshift-machine-config-operator/machine-config-daemon-46vck" podUID="e40a27aa-682e-4b25-a198-8054ba9f2477"